WorldWideScience

Sample records for risk model version

  1. MATILDA Version-2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part II

    Science.gov (United States)

    2017-07-28

risk assessment for “unsafe” scenarios. Recently, attention in the DoD has turned to Probabilistic Risk Assessment (PRA) models [5,6] as an...corresponding to the CRA undershoot boundary. The magenta-coloured line represents the portion of the C-RX(U) circle that would contribute to the...Tertiary Precaution Surface. Undershoot-related laser firing restrictions within the green-coloured C-RX(U) can be ignored.

  2. History-Dependent Risk Attitude, Second Version

    OpenAIRE

    David Dillenberger; Kareen Rozen

    2011-01-01

    We propose a model of history-dependent risk attitude, allowing a decision maker’s risk attitude to be affected by his history of disappointments and elations. The decision maker recursively evaluates compound risks, classifying realizations as disappointing or elating using a threshold rule. We establish equivalence between the model and two cognitive biases: risk attitudes are reinforced by experiences (one is more risk averse after disappointment than after elation) and there is a primacy ...
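    A minimal numerical sketch of the threshold idea described above, under assumed parameters: realizations are classified as elating or disappointing relative to a threshold (taken here, for illustration only, as the lottery's expected value), and a risk-aversion coefficient is reinforced accordingly. The functions and numbers are hypothetical and are not the authors' specification.

    ```python
    # Toy sketch of a history-dependent risk attitude with a threshold rule.
    # Threshold choice, adjustment step, and lottery are illustrative assumptions.
    import random

    def classify(outcome, threshold):
        """Threshold rule: realizations above the threshold are elating, below are disappointing."""
        return "elation" if outcome >= threshold else "disappointment"

    def update_risk_aversion(gamma, event, step=0.1):
        """Reinforcement: more risk averse after a disappointment, less after an elation."""
        return gamma + step if event == "disappointment" else max(0.0, gamma - step)

    def simulate(lottery, periods=10, gamma=1.5, seed=0):
        """Draw from a simple lottery repeatedly and track how risk aversion evolves."""
        rng = random.Random(seed)
        outcomes, probs = zip(*lottery)
        threshold = sum(o * p for o, p in lottery)   # assumed threshold: the expected value
        history = []
        for _ in range(periods):
            x = rng.choices(outcomes, weights=probs)[0]
            event = classify(x, threshold)
            gamma = update_risk_aversion(gamma, event)
            history.append((x, event, round(gamma, 2)))
        return history

    if __name__ == "__main__":
        # A 50/50 lottery paying 50 or 150.
        for row in simulate([(50, 0.5), (150, 0.5)]):
            print(row)
    ```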

  3. Modelling Risk to US Military Populations from Stopping Blanket Mandatory Polio Vaccination (Open Access Publisher’s Version)

    Science.gov (United States)

    2017-09-14

    2014. [24] “United Nations, Department of Economic and Social Affairs, Population Division, World Population Prospects, the 2015 Revision,” http...Research Article Modelling Risk to US Military Populations from Stopping Blanket Mandatory Polio Vaccination Colleen Burgess,1,2 Andrew Burgess,2 and...for polio transmission within military populations interacting with locals in a polio-endemic region to evaluate changes in vaccination policy

  4. The Unified Extensional Versioning Model

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred; Christensen, H. B.

    1999-01-01

Versioning of components in a system is a well-researched field where various adequate techniques have already been established. In this paper, we look at how versioning can be extended to also cover the structural aspects of a system. There exist two basic techniques for versioning - intentional...

  5. Versions of the Waste Reduction Model (WARM)

    Science.gov (United States)

    This page provides a brief chronology of changes made to EPA’s Waste Reduction Model (WARM), organized by WARM version number. The page includes brief summaries of changes and updates since the previous version.

  6. MATILDA Version 2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part I

    Science.gov (United States)

    2017-03-13

support of airborne laser designator use during test and training exercises on military ranges. The initial MATILDA tool, MATILDA PRO Version-1.6.1...2]. The use of the ALARP principle in UK hazard assessment arises from the provisions of the UK Health and Safety at Work Act of 1974 [18]. Given...The product of the probabilistic fault/failure laser hazard analysis is the expectation value: the likelihood that an unprotected observer outside

  7. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

In this paper we present a model-based version management system. A version management system (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMS systems are file-based and treat software systems as sets of text files. File-based VMS systems are not adequate for performing software configuration management activities, such as version control, on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise when models are used as the central artifact. The goal of this work is to present a generic model-based VMS framework which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)
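    To make the contrast with file-based versioning concrete, here is a small illustrative sketch of element-wise model differencing and conflict detection, in which model elements are addressed by identity rather than by line position. The dictionary-based model representation and function names are assumptions for illustration, not the framework proposed in the paper.

    ```python
    # Minimal sketch of model-level (element-wise) differencing, as opposed to text/line diffs.
    # The {element_id: properties} representation and operation names are illustrative assumptions.

    def diff_models(base, revised):
        """Compare two models given as {element_id: properties} dicts."""
        ops = []
        for eid in revised.keys() - base.keys():
            ops.append(("add", eid, revised[eid]))
        for eid in base.keys() - revised.keys():
            ops.append(("delete", eid, None))
        for eid in base.keys() & revised.keys():
            if base[eid] != revised[eid]:
                ops.append(("update", eid, revised[eid]))
        return ops

    def detect_conflicts(ops_a, ops_b):
        """Two branches conflict when they touch the same element in different ways."""
        touched_a = {eid: (op, val) for op, eid, val in ops_a}
        return [eid for op, eid, val in ops_b
                if eid in touched_a and touched_a[eid] != (op, val)]

    if __name__ == "__main__":
        base  = {"Class:Order": {"attrs": ["id"]},          "Class:Item": {"attrs": ["sku"]}}
        alice = {"Class:Order": {"attrs": ["id", "date"]},  "Class:Item": {"attrs": ["sku"]}}
        bob   = {"Class:Order": {"attrs": ["id", "total"]}, "Class:Item": {"attrs": ["sku"]}}
        print(diff_models(base, alice))
        print(detect_conflicts(diff_models(base, alice), diff_models(base, bob)))
    ```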

  8. Modeling report of DYMOND code (DUPIC version)

    International Nuclear Information System (INIS)

    Park, Joo Hwan; Yacout, Abdellatif M.

    2003-04-01

The DYMOND code employs the ITHINK dynamic modeling platform to assess 100-year dynamic evolution scenarios for postulated global nuclear energy parks. DYMOND was originally developed by ANL (Argonne National Laboratory) to perform fuel cycle analysis of LWR once-through and mixed LWR-FBR plants. Because wider application of DYMOND was requested, the first version was modified to accommodate the DUPIC, MSR and RTF fuel cycles. The DYMOND code is composed of three parts - the source language platform, input supply and output - although these parts are not clearly distinguished. This report describes all the equations modeled in the modified DYMOND code (referred to as the DYMOND-DUPIC version). It is divided into five parts. Part A covers the reactor history model, including the amounts of requested fuel and spent fuel. Part B describes the fuel cycle model, covering fuel flow from the beginning to the end of the fuel cycle. Part C covers the reprocessing model, including recovery of burned uranium, plutonium, minor actinides and fission products, as well as the amount of spent fuel in storage and disposal. Part D covers other fuel cycle models, considering the thorium fuel cycle for the MSR and RTF reactors. Part E covers the economics model, which gives cost information such as uranium mining cost, reactor operating cost, fuel cost, etc.

  9. Modeling report of DYMOND code (DUPIC version)

    Energy Technology Data Exchange (ETDEWEB)

Park, Joo Hwan [KAERI, Taejon (Korea, Republic of)]; Yacout, Abdellatif M. [Argonne National Laboratory, Illinois (United States)]

    2003-04-01

The DYMOND code employs the ITHINK dynamic modeling platform to assess 100-year dynamic evolution scenarios for postulated global nuclear energy parks. DYMOND was originally developed by ANL (Argonne National Laboratory) to perform fuel cycle analysis of LWR once-through and mixed LWR-FBR plants. Because wider application of DYMOND was requested, the first version was modified to accommodate the DUPIC, MSR and RTF fuel cycles. The DYMOND code is composed of three parts - the source language platform, input supply and output - although these parts are not clearly distinguished. This report describes all the equations modeled in the modified DYMOND code (referred to as the DYMOND-DUPIC version). It is divided into five parts. Part A covers the reactor history model, including the amounts of requested fuel and spent fuel. Part B describes the fuel cycle model, covering fuel flow from the beginning to the end of the fuel cycle. Part C covers the reprocessing model, including recovery of burned uranium, plutonium, minor actinides and fission products, as well as the amount of spent fuel in storage and disposal. Part D covers other fuel cycle models, considering the thorium fuel cycle for the MSR and RTF reactors. Part E covers the economics model, which gives cost information such as uranium mining cost, reactor operating cost, fuel cost, etc.

  10. Forsmark - site descriptive model version 0

    International Nuclear Information System (INIS)

    2002-10-01

    During 2002, the Swedish Nuclear Fuel and Waste Management Company (SKB) is starting investigations at two potential sites for a deep repository in the Precambrian basement of the Fennoscandian Shield. The present report concerns one of those sites, Forsmark, which lies in the municipality of Oesthammar, on the east coast of Sweden, about 150 kilometres north of Stockholm. The site description should present all collected data and interpreted parameters of importance for the overall scientific understanding of the site, for the technical design and environmental impact assessment of the deep repository, and for the assessment of long-term safety. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. The site descriptive models are devised and stepwise updated as the site investigations proceed. The point of departure for this process is the regional site descriptive model, version 0, which is the subject of the present report. Version 0 is developed out of the information available at the start of the site investigation. This information, with the exception of data from tunnels and drill holes at the sites of the Forsmark nuclear reactors and the underground low-middle active radioactive waste storage facility, SFR, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. For this reason, the Forsmark site descriptive model, version 0, as detailed in the present report, has been developed at a regional scale. It covers a rectangular area, 15 km in a southwest-northeast and 11 km in a northwest-southeast direction, around the

  11. Forsmark - site descriptive model version 0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-10-01

    During 2002, the Swedish Nuclear Fuel and Waste Management Company (SKB) is starting investigations at two potential sites for a deep repository in the Precambrian basement of the Fennoscandian Shield. The present report concerns one of those sites, Forsmark, which lies in the municipality of Oesthammar, on the east coast of Sweden, about 150 kilometres north of Stockholm. The site description should present all collected data and interpreted parameters of importance for the overall scientific understanding of the site, for the technical design and environmental impact assessment of the deep repository, and for the assessment of long-term safety. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. The site descriptive models are devised and stepwise updated as the site investigations proceed. The point of departure for this process is the regional site descriptive model, version 0, which is the subject of the present report. Version 0 is developed out of the information available at the start of the site investigation. This information, with the exception of data from tunnels and drill holes at the sites of the Forsmark nuclear reactors and the underground low-middle active radioactive waste storage facility, SFR, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. For this reason, the Forsmark site descriptive model, version 0, as detailed in the present report, has been developed at a regional scale. It covers a rectangular area, 15 km in a southwest-northeast and 11 km in a northwest-southeast direction, around the

  12. Simpevarp - site descriptive model version 0

    International Nuclear Information System (INIS)

    2002-11-01

    During 2002, SKB is starting detailed investigations at two potential sites for a deep repository in the Precambrian rocks of the Fennoscandian Shield. The present report concerns one of those sites, Simpevarp, which lies in the municipality of Oskarshamn, on the southeast coast of Sweden, about 250 kilometres south of Stockholm. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. SKB maintains two main databases at the present time, a site characterisation database called SICADA and a geographic information system called SKB GIS. The site descriptive model will be developed and presented with the aid of the SKB GIS capabilities, and with SKBs Rock Visualisation System (RVS), which is also linked to SICADA. The version 0 model forms an important framework for subsequent model versions, which are developed successively, as new information from the site investigations becomes available. Version 0 is developed out of the information available at the start of the site investigation. In the case of Simpevarp, this is essentially the information which was compiled for the Oskarshamn feasibility study, which led to the choice of that area as a favourable object for further study, together with information collected since its completion. This information, with the exception of the extensive data base from the nearby Aespoe Hard Rock Laboratory, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. Against this background, the present report consists of the following components: an overview of the present content of the databases

  13. Tier I Rice Model - Version 1.0 - Guidance for Estimating Pesticide Concentrations in Rice Paddies

    Science.gov (United States)

    Describes a Tier I Rice Model (Version 1.0) for estimating surface water exposure from the use of pesticides in rice paddies. The concentration calculated can be used for aquatic ecological risk and drinking water exposure assessments.

  14. Simpevarp - site descriptive model version 0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-11-01

    During 2002, SKB is starting detailed investigations at two potential sites for a deep repository in the Precambrian rocks of the Fennoscandian Shield. The present report concerns one of those sites, Simpevarp, which lies in the municipality of Oskarshamn, on the southeast coast of Sweden, about 250 kilometres south of Stockholm. The site description will have two main components: a written synthesis of the site, summarising the current state of knowledge, as documented in the databases containing the primary data from the site investigations, and one or several site descriptive models, in which the collected information is interpreted and presented in a form which can be used in numerical models for rock engineering, environmental impact and long-term safety assessments. SKB maintains two main databases at the present time, a site characterisation database called SICADA and a geographic information system called SKB GIS. The site descriptive model will be developed and presented with the aid of the SKB GIS capabilities, and with SKBs Rock Visualisation System (RVS), which is also linked to SICADA. The version 0 model forms an important framework for subsequent model versions, which are developed successively, as new information from the site investigations becomes available. Version 0 is developed out of the information available at the start of the site investigation. In the case of Simpevarp, this is essentially the information which was compiled for the Oskarshamn feasibility study, which led to the choice of that area as a favourable object for further study, together with information collected since its completion. This information, with the exception of the extensive data base from the nearby Aespoe Hard Rock Laboratory, is mainly 2D in nature (surface data), and is general and regional, rather than site-specific, in content. Against this background, the present report consists of the following components: an overview of the present content of the databases

  15. Land-Use Portfolio Modeler, Version 1.0

    Science.gov (United States)

    Taketa, Richard; Hong, Makiko

    2010-01-01

-on-investment. The portfolio model, now known as the Land-Use Portfolio Model (LUPM), provided the framework for the development of the Land-Use Portfolio Modeler, Version 1.0 software (LUPM v1.0). The software provides a geographic information system (GIS)-based modeling tool for evaluating alternative risk-reduction mitigation strategies for specific natural-hazard events. The modeler uses information about a specific natural-hazard event and the features exposed to that event within the targeted study region to derive a measure of a given mitigation strategy's effectiveness. Harnessing the spatial capabilities of a GIS enables the tool to provide a rich, interactive mapping environment in which users can create, analyze, visualize, and compare different

  16. Version control of pathway models using XML patches.

    Science.gov (United States)

    Saffrey, Peter; Orton, Richard

    2009-03-17

Computational modelling has become an important tool in understanding biological systems such as signalling pathways. With an increase in the size and complexity of models comes a need for techniques to manage model versions and their relationships to one another. Model version control for pathway models shares some of the features of software version control but has a number of differences that warrant a specific solution. We present a model version control method, along with a prototype implementation, based on XML patches. We show its application to the EGF/RAS/RAF pathway. Our method allows quick and convenient storage of a wide range of model variations and enables a thorough explanation of these variations. Trying to produce these results without such methods leads to slow and cumbersome development that is prone to frustration and human error.
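    The general idea of storing model variants as a base document plus patches can be sketched as follows. This is a simplified illustration using Python's xml.etree.ElementTree and an assumed attribute-level patch format; it is not the authors' patch representation or prototype.

    ```python
    # Sketch of storing pathway-model variants as a base XML document plus small patches.
    # The patch format (element id + attribute change) is an assumption for illustration.
    import xml.etree.ElementTree as ET

    BASE_MODEL = """
    <model name="EGF_pathway">
      <reaction id="r1" rate="0.1"/>
      <reaction id="r2" rate="0.05"/>
    </model>
    """

    # Each patch records only what changed relative to the base version.
    PATCHES = {
        "v2": [("r1", "rate", "0.2")],                          # faster r1
        "v3": [("r1", "rate", "0.2"), ("r2", "rate", "0.01")],  # v2 plus a slower r2
    }

    def build_version(version):
        """Reconstruct a model version by applying its patch to the base document."""
        root = ET.fromstring(BASE_MODEL)
        for reaction_id, attr, value in PATCHES.get(version, []):
            root.find(f".//reaction[@id='{reaction_id}']").set(attr, value)
        return root

    if __name__ == "__main__":
        for v in ("v1", "v2", "v3"):
            rates = {r.get("id"): r.get("rate") for r in build_version(v).iter("reaction")}
            print(v, rates)
    ```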

  17. Environmental modeling and health risk analysis (ACTS/RISK)

    National Research Council Canada - National Science Library

    Aral, M. M

    2010-01-01

    ... presents a review of the topics of exposure and health risk analysis. The Analytical Contaminant Transport Analysis System (ACTS) and Health RISK Analysis (RISK) software tools are an integral part of the book and provide computational platforms for all the models discussed herein. The most recent versions of these two softwa...

  18. Integrated Reliability and Risk Analysis System (IRRAS), Version 2.5: Reference manual

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1991-03-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.5 and is the subject of this Reference Manual. Version 2.5 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 7 refs., 348 figs
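    For readers unfamiliar with the quantification step mentioned above, the sketch below shows the generic cut-set arithmetic a PRA tool of this kind performs: the probability of each minimal cut set, a minimal cut set upper bound for the top event, and a Fussell-Vesely style importance measure. The basic events, probabilities, and cut sets are invented, and this is not IRRAS's actual algorithm or data.

    ```python
    # Generic cut-set quantification arithmetic, simplified for illustration.
    from math import prod

    BASIC_EVENTS = {"PUMP_A_FAILS": 1e-3, "PUMP_B_FAILS": 1e-3, "VALVE_STUCK": 5e-4, "CCF_PUMPS": 1e-5}

    # Each minimal cut set is a conjunction of basic events whose joint occurrence fails the system.
    CUT_SETS = [
        {"PUMP_A_FAILS", "PUMP_B_FAILS"},
        {"CCF_PUMPS"},
        {"PUMP_A_FAILS", "VALVE_STUCK"},
    ]

    def cut_set_prob(cut_set):
        """Probability of one cut set, assuming independent basic events."""
        return prod(BASIC_EVENTS[e] for e in cut_set)

    def top_event_mcub(cut_sets):
        """Minimal cut set upper bound: 1 - prod(1 - P(cut set))."""
        q = 1.0
        for cs in cut_sets:
            q *= 1.0 - cut_set_prob(cs)
        return 1.0 - q

    def fussell_vesely(event, cut_sets):
        """Approximate F-V importance: share of top-event probability from cut sets containing the event."""
        containing = [cs for cs in cut_sets if event in cs]
        return top_event_mcub(containing) / top_event_mcub(cut_sets)

    if __name__ == "__main__":
        print(f"Top event probability (MCUB): {top_event_mcub(CUT_SETS):.3e}")
        print(f"F-V importance of CCF_PUMPS: {fussell_vesely('CCF_PUMPS', CUT_SETS):.2f}")
    ```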

  19. Integrated Reliability and Risk Analysis System (IRRAS) Version 2.0 user's guide

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1990-06-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Also provided in the system is an integrated full-screen editor for use when interfacing with remote mainframe computer systems. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 2.0 and is the subject of this user's guide. Version 2.0 of IRRAS provides all of the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance. 9 refs., 292 figs., 4 tabs

  20. Model Adequacy Analysis of Matching Record Versions in Nosql Databases

    Directory of Open Access Journals (Sweden)

    E. V. Tsviashchenko

    2015-01-01

The article investigates a model of matching record versions; the goal of this work is to analyse the adequacy of that model. The model allows estimating the distribution of a user's processing time for record versions and the distribution of the record version count. The second variant of the model was used, in which the time a client needs to process record versions depends explicitly on the number of updates performed by other users between the sequential updates performed by the current client. To test the model's adequacy, a real experiment was conducted in a cloud cluster of 10 virtual nodes provided by DigitalOcean; Ubuntu Server 14.04 was used as the operating system (OS), and the NoSQL system Riak was chosen for the experiments. Riak versions 2.0 and later provide the "dotted version vectors" (DVV) option, an extension of the classic vector clock. Its use guarantees that the number of versions stored simultaneously in the database will not exceed the number of clients operating on a record in parallel, which is very important for the experiments. The application was developed with the Java library provided by Riak, and the processes run directly on the nodes. Two records were used in the experiment: Z, the record whose versions are handled by the clients, and RZ, a service record containing record-update counters. The application algorithm can be briefly described as follows: every client reads the versions of record Z, processes its updates using the RZ counters, and saves the processed record in the database while old versions are deleted from the DB. Then the client rereads the RZ record and increments the update counters for the other clients. After that, the client rereads the Z record, saves the necessary statistics, and evaluates the results of processing. In the case of a conflict arising from simultaneous updates of the RZ record, the client obtains all versions of that
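    The client algorithm summarized above can be sketched against a generic in-memory multi-version store, which avoids inventing calls to the real Riak Java client. Record names Z and RZ follow the abstract; the store class, its methods, and the toy "merge" step are assumptions for illustration.

    ```python
    # Simplified sketch of the experiment's client loop against an in-memory multi-version store.
    import threading

    class MultiVersionStore:
        """Keeps sibling versions per key, as a NoSQL store without automatic conflict resolution would."""
        def __init__(self):
            self._data = {}          # key -> list of sibling versions
            self._lock = threading.Lock()

        def get(self, key):
            with self._lock:
                return list(self._data.get(key, []))

        def put_resolved(self, key, value):
            """Write a value that supersedes all currently stored siblings."""
            with self._lock:
                self._data[key] = [value]

        def append_sibling(self, key, value):
            with self._lock:
                self._data.setdefault(key, []).append(value)

    def client(store, client_id, rounds):
        for _ in range(rounds):
            siblings = store.get("Z")                  # read all versions of Z
            merged = max(siblings, default=0)          # "process" the versions (toy merge)
            store.put_resolved("Z", merged + 1)        # save, discarding old versions
            store.append_sibling("RZ", client_id)      # bump this client's update counter

    if __name__ == "__main__":
        store = MultiVersionStore()
        store.put_resolved("Z", 0)
        threads = [threading.Thread(target=client, args=(store, i, 100)) for i in range(4)]
        for t in threads: t.start()
        for t in threads: t.join()
        print("final Z versions:", store.get("Z"))
        print("updates per client:", {i: store.get("RZ").count(i) for i in range(4)})
    ```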

  1. Solar Advisor Model User Guide for Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Gilman, P.; Blair, N.; Mehos, M.; Christensen, C.; Janzou, S.; Cameron, C.

    2008-08-01

    The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.

  2. External Cephalic Version-Related Risks A Meta-analysis

    NARCIS (Netherlands)

    Grootscholten, Kim; Kok, Marjolein; Oei, S. Guid; Mol, Ben W. J.; van der Post, Joris A.

    2008-01-01

    OBJECTIVE: To systematically review the literature on external cephalic version-related complications and to assess if the outcome of a version attempt is related to complications. DATA SOURCES: In March 2007 we searched MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials.

  3. External cephalic version-related risks: a meta-analysis.

    Science.gov (United States)

    Grootscholten, Kim; Kok, Marjolein; Oei, S Guid; Mol, Ben W J; van der Post, Joris A

    2008-11-01

To systematically review the literature on external cephalic version-related complications and to assess whether the outcome of a version attempt is related to complications. In March 2007 we searched MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials. Studies reporting on complications from an external cephalic version attempt for singleton breech pregnancies after 36 weeks of pregnancy were selected. We calculated odds ratios (ORs) from studies that reported both on complications and on the position of the fetus immediately after the procedure. We found 84 studies, reporting on 12,955 version attempts, that addressed external cephalic version-related complications. The pooled complication rate was 6.1% (95% confidence interval [CI] 4.7-7.8), with 0.24% for serious complications (95% CI 0.17-0.34) and 0.35% for emergency cesarean deliveries (95% CI 0.26-0.47). Complications were not related to external cephalic version outcome (OR 1.2, 95% CI 0.93-1.7). External cephalic version is a safe procedure. Complications are not related to the fetal position after external cephalic version.
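    As background on the summary statistics quoted above, the sketch below shows how an event proportion with a 95% CI and an odds ratio from a 2x2 table are computed using simple normal approximations. The counts are fabricated, and this is not the meta-analysis's actual pooling method, which would typically weight the individual studies.

    ```python
    # Illustrative proportion CI and odds ratio CI from made-up counts.
    from math import exp, log, sqrt

    def proportion_ci(events, total, z=1.96):
        """Normal-approximation 95% CI for an event proportion."""
        p = events / total
        se = sqrt(p * (1 - p) / total)
        return p, p - z * se, p + z * se

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """OR and 95% CI for a 2x2 table [[a, b], [c, d]] (exposed/unexposed x event/no event)."""
        or_ = (a * d) / (b * c)
        se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        return or_, exp(log(or_) - z * se_log), exp(log(or_) + z * se_log)

    if __name__ == "__main__":
        p, lo, hi = proportion_ci(events=790, total=12955)       # hypothetical counts
        print(f"complication rate {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
        or_, lo, hi = odds_ratio_ci(a=40, b=460, c=35, d=465)    # hypothetical 2x2 table
        print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```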

  4. The ONKALO area model. Version 1

    International Nuclear Information System (INIS)

    Kemppainen, K.; Ahokas, T.; Ahokas, H.; Paulamaeki, S.; Paananen, M.; Gehoer, S.; Front, K.

    2007-11-01

The geological model of the ONKALO area consists of three submodels: the lithological model, the brittle deformation model and the alteration model. The lithological model gives the properties of distinct rock units that can be defined on the basis of migmatite structures, textures and modal compositions. The brittle deformation model describes the results of brittle deformation, to which geophysical and hydrogeological results are added. The alteration model describes the occurrence of different alteration types and their possible effects. The rocks of Olkiluoto can be divided into two major classes: (1) supracrustal high-grade metamorphic rocks including various migmatitic gneisses, tonalitic-granodioritic-granitic gneisses, mica gneisses, quartz gneisses and mafic gneisses, and (2) igneous rocks including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite structure: veined gneisses, stromatic gneisses and diatexitic gneisses. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subject to polyphase ductile deformation comprising five stages. In 3D modelling of the lithological units, an assumption has been made, on the basis of measurements in outcrops, investigation trenches and drill cores, that the pervasive, composite foliation produced as a result of polyphase ductile deformation has a rather constant attitude in the ONKALO area. Consequently, the strike and dip of the foliation have been used as a tool through which the lithologies have been correlated between the drillholes and from the surface to the drillholes. The bedrock at the Olkiluoto site has been subject to extensive hydrothermal alteration, which has taken place at reasonably low temperature conditions, the estimated temperature interval being from slightly over 300 deg C to less than 100 deg C. Two types of alteration can be observed: (1) pervasive (disseminated

  5. Micro dosimetry model. An extended version

    International Nuclear Information System (INIS)

    Vroegindewey, C.

    1994-07-01

In an earlier study, a relatively simple mathematical model was constructed to simulate energy transfer on a cellular scale and thus gain insight into the fundamental processes of BNCT. Based on this work, a more realistic micro dosimetry model has been developed. The new facets of the model are: the treatment of proton recoil, the calculation of the distribution of energy depositions, and the determination of the number of particles crossing the target nucleus, subdivided by place of origin. Besides these extensions, new stopping power tables for the emitted particles are generated and biased Monte Carlo techniques are used to reduce computer time. (orig.)

  6. [German Language Version and Validation of the Risk-Taking Behaviour Scale (RBS-K) for High-Risk Sports].

    Science.gov (United States)

    Frühauf, Anika; Niedermeier, Martin; Ruedl, Gerhard; Barlow, Matthew; Woodman, Tim; Kopp, Martin

    2017-11-23

    Background  High-risk sports, particularly climbing, kayaking and extreme skiing, have become increasingly popular. The most widely used psychological survey instrument with regard to risk behaviour in sports is the Sensation Seeking Model, mostly assessed by the Sensation Seeking Scale (SSS-V). Until recently, the literature discussed risk behaviour solely through this model. However, this scale does not measure risk-taking behaviours. In contrast, the Risk-Taking Behaviour Scale (RBS-K) is a three-item scale that measures risk behaviour in high-risk sports. This study aimed to validate a German language version of the RBS-K. Methods  The RBS-K was translated and back-translated between English and German. High-risk sports participants (n = 2399) completed the German version of the RBS-K. Of those participants, 820 completed the RBS-K in person as part of a field survey and 1579 participated in an online survey. To validate the questionnaire, the SSS-V, accident involvement, age and sex were evaluated. The RBS-K divides the sample into deliberate risk takers (mean + standard deviation) and risk-averse persons (mean - standard deviation). We tested for internal consistency and correlations with SSS-V, age, sex and accident involvement. Group differences were calculated between deliberate risk takers and risk-averse persons. Results  For internal consistency, we obtained a Cronbach's alpha of 0.56 and a McDonald's omega of 0.63. Significant correlations were shown between RBS-K and SSS-V as well as age and sex. Compared to risk-averse persons (n = 643, 26.8 %), deliberate risk takers (n = 319, 13.3 %) scored significantly higher in sensation seeking, were significantly younger and primarily male and had a significantly higher accident involvement. Conclusion  The RBS-K discriminates well for age, sex and accident involvement. Also, correlations between the RBS-K and the well-established SSS-V are acceptable. With regard to the results and its

  7. Smart Grid Interoperability Maturity Model Beta Version

    Energy Technology Data Exchange (ETDEWEB)

Widergren, Steven E.; Drummond, R.; Giroti, Tony; Houseman, Doug; Knight, Mark; Levinson, Alex; Longcore, Wayne; Lowe, Randy; Mater, J.; Oliver, Terry V.; Slack, Phil; Tolk, Andreas; Montgomery, Austin

    2011-12-02

    The GridWise Architecture Council was formed by the U.S. Department of Energy to promote and enable interoperability among the many entities that interact with the electric power system. This balanced team of industry representatives proposes principles for the development of interoperability concepts and standards. The Council provides industry guidance and tools that make it an available resource for smart grid implementations. In the spirit of advancing interoperability of an ecosystem of smart grid devices and systems, this document presents a model for evaluating the maturity of the artifacts and processes that specify the agreement of parties to collaborate across an information exchange interface. You are expected to have a solid understanding of large, complex system integration concepts and experience in dealing with software component interoperation. Those without this technical background should read the Executive Summary for a description of the purpose and contents of the document. Other documents, such as checklists, guides, and whitepapers, exist for targeted purposes and audiences. Please see the www.gridwiseac.org website for more products of the Council that may be of interest to you.

  8. Using the Violence Risk Scale-Sexual Offense version in sexual violence risk assessments: Updated risk categories and recidivism estimates from a multisite sample of treated sexual offenders.

    Science.gov (United States)

    Olver, Mark E; Mundt, James C; Thornton, David; Beggs Christofferson, Sarah M; Kingston, Drew A; Sowden, Justina N; Nicholaichuk, Terry P; Gordon, Audrey; Wong, Stephen C P

    2018-04-30

    The present study sought to develop updated risk categories and recidivism estimates for the Violence Risk Scale-Sexual Offense version (VRS-SO; Wong, Olver, Nicholaichuk, & Gordon, 2003-2017), a sexual offender risk assessment and treatment planning tool. The overarching purpose was to increase the clarity and accuracy of communicating risk assessment information that includes a systematic incorporation of new information (i.e., change) to modify risk estimates. Four treated samples of sexual offenders with VRS-SO pretreatment, posttreatment, and Static-99R ratings were combined with a minimum follow-up period of 10-years postrelease (N = 913). Logistic regression was used to model 5- and 10-year sexual and violent (including sexual) recidivism estimates across 6 different regression models employing specific risk and change score information from the VRS-SO and/or Static-99R. A rationale is presented for clinical applications of select models and the necessity of controlling for baseline risk when utilizing change information across repeated assessments. Information concerning relative risk (percentiles) and absolute risk (recidivism estimates) is integrated with common risk assessment language guidelines to generate new risk categories for the VRS-SO. Guidelines for model selection and forensic clinical application of the risk estimates are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
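    The general shape of such logistic models, in which a change score modifies a risk estimate while the pretreatment (baseline) score is retained as its own term, can be illustrated as follows. The intercept and coefficients are invented for illustration and are not the published VRS-SO estimates.

    ```python
    # Illustrative logistic recidivism model with a baseline risk score and a change score.
    from math import exp

    def recidivism_probability(pretreatment_score, change_score,
                               intercept=-3.0, b_risk=0.08, b_change=-0.10):
        """logit(p) = intercept + b_risk*pretreatment + b_change*change.
        Keeping the pretreatment score as its own term controls change effects for baseline risk."""
        logit = intercept + b_risk * pretreatment_score + b_change * change_score
        return 1.0 / (1.0 + exp(-logit))

    if __name__ == "__main__":
        # The same amount of treatment change evaluated at two different baseline risk levels.
        for baseline in (30, 50):
            for change in (0, 6):
                p = recidivism_probability(baseline, change)
                print(f"pretreatment={baseline:2d} change={change} -> estimated probability {p:.1%}")
    ```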

  9. IDC Use Case Model Survey Version 1.1.

    Energy Technology Data Exchange (ETDEWEB)

    Harris, James Mark [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carr, Dorthe B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

This document contains the brief descriptions for the actors and use cases contained in the IDC Use Case Model. Revision history: V1.0 (12/2014), SNL IDC Reengineering Project Team, initial delivery, authorized by M. Harris; V1.1 (2/2015), SNL IDC Reengineering Project Team, Iteration I2 review comments, authorized by M. Harris.

  10. IDC Use Case Model Survey Version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Carr, Dorthe B.; Harris, James M.

    2014-12-01

This document contains the brief descriptions for the actors and use cases contained in the IDC Use Case Model Survey. Revision history: V1.0 (12/2014), IDC Re-engineering Project Team, initial delivery, authorized by M. Harris.

  11. A conceptual model specification language (CMSL Version 2)

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    1992-01-01

Version 2 of a language (CMSL) to specify conceptual models is defined. CMSL consists of two parts, the value specification language VSL and the object specification language OSL. There is a formal semantics and an inference system for CMSL, but research on this still continues. A method for

  12. External cephalic version-related risks: a meta-analysis

    NARCIS (Netherlands)

    Grootscholten, K.; Kok, M.; Oei, S.G.; Mol, B.W.J.; Post, van der J.A.

    2008-01-01

    OBJECTIVE: To systematically review the literature on external cephalic version–related complications and to assess if the outcome of a version attempt is related to complications. DATA SOURCES: In March 2007 we searched MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials.

  13. Fiscal impacts model documentation. Version 1.0

    International Nuclear Information System (INIS)

    Beck, S.L.; Scott, M.J.

    1986-05-01

    The Fiscal Impacts (FI) Model, Version 1.0 was developed under Pacific Northwest Laboratory's Monitored Retrievable Storage (MRS) Program to aid in development of the MRS Reference Site Environmental Document (PNL 5476). It computes estimates of 182 fiscal items for state and local government jurisdictions, using input data from the US Census Bureau's 1981 Survey of Governments and local population forecasts. The model can be adapted for any county or group of counties in the United States

  14. Prevalence, outcome, and women's experiences of external cephalic version in a low-risk population.

    Science.gov (United States)

    Rijnders, Marlies; Offerhaus, Pien; van Dommelen, Paula; Wiegers, Therese; Buitendijk, Simone

    2010-06-01

Until recently, external cephalic version to prevent breech presentation at birth was not widely accepted. The objective of our study was to assess the prevalence, outcomes, and women's experiences of external cephalic version to improve the implementation of the procedure in the Netherlands. A prospective cohort study was conducted between June 2007 and January 2008 of 167 women under the care of a midwife with a confirmed breech presentation at a gestational age of 33 completed weeks or more, all of whom were offered an external cephalic version. Of this group, 123 women (73.7%, 95% CI: 65.5-80.5) subsequently received the version. These women had about a ninefold increased probability of a cephalic presentation at birth compared with women who did not undergo a version (relative risk [RR]: 8.8, 95% CI: 2.2-34.8). The chance of a vaginal birth after an external cephalic version was almost threefold (RR: 2.7, 95% CI: 1.5-5.0). The success rate was 39 percent, although considerable differences existed by region and parity. Ninety-four percent of women with a successful version rated it as a good experience compared with 71 percent of women who had a failed version (p = 0.015). Significant pain during the version was experienced by 34 percent of women, of whom 18 percent also experienced fear during the version, compared with none of the women who reported little or no pain (p = 0.006). Women who reported significant pain or fear during the version experienced the version more negatively (OR: 6.0, 95% CI: 3.3-12.2 and OR: 2.7, 95% CI: 1.1-6.0, respectively). One in every four women with a breech presentation in independent midwifery care did not receive an external cephalic version. Of the women who received a version, one third experienced significant pain during the procedure. Considerable regional variation in success rate existed.

  15. Towards Validating Risk Indicators Based on Measurement Theory (Extended version)

    NARCIS (Netherlands)

    Morali, A.; Wieringa, Roelf J.

    Due to the lack of quantitative information and for cost-efficiency, most risk assessment methods use partially ordered values (e.g. high, medium, low) as risk indicators. In practice it is common to validate risk indicators by asking stakeholders whether they make sense. This way of validation is

  16. Timing of delivery after external cephalic version and the risk for cesarean delivery.

    Science.gov (United States)

    Kabiri, Doron; Elram, Tamar; Aboo-Dia, Mushira; Elami-Suzin, Matan; Elchalal, Uriel; Ezra, Yossef

    2011-08-01

    To estimate the association between time of delivery after external cephalic version at term and the risk for cesarean delivery. This retrospective cohort study included all successful external cephalic versions performed in a tertiary center between January 1997 and January 2010. Stepwise logistic regression was used to calculate the odds ratio (OR) for cesarean delivery. We included 483 external cephalic versions in this study, representing 53.1% of all external cephalic version attempts. The incidence of cesarean delivery for 139 women (29%) who gave birth less than 96 hours from external cephalic version was 16.5%; for 344 women (71%) who gave birth greater than 96 hours from external cephalic version, the incidence of cesarean delivery was 7.8% (P = .004). The adjusted OR for cesarean delivery was 2.541 (95% confidence interval 1.36-4.72). When stratified by parity, the risk for cesarean delivery when delivery occurred less than 96 hours after external cephalic version was 2.97 and 2.28 for nulliparous and multiparous women, respectively. Delivery at less than 96 hours after successful external cephalic version was associated with an increased risk for cesarean delivery. III.

  17. Decommissioning economic risk advisor: DERAD Version 1.0 user's manual. Final report

    International Nuclear Information System (INIS)

    Gjerde, A.R.; Qian, M.; Govil, P.; Balson, W.E.

    1995-04-01

DERAD - Decommissioning Economic and Risk ADvisor - is a decision support tool designed to help utility decision makers analyze the economics and financial risk of decommissioning nuclear power plants. Your current copy of DERAD, Version 1.0, is customized for PWR-configured plants. DERAD has been developed by Decision Focus Incorporated under EPRI sponsorship. If you have ideas or recommendations for how we can improve and enhance future versions of DERAD, we would like to hear from you.

  18. ONKALO rock mechanics model (RMM). Version 2.3

    Energy Technology Data Exchange (ETDEWEB)

Haekkinen, T.; Merjama, S.; Moenkkoenen, H. [WSP Finland, Helsinki (Finland)]

    2014-07-15

    The Rock Mechanics Model of the ONKALO rock volume includes the most important rock mechanics features and parameters at the Olkiluoto site. The main objective of the model is to be a tool to predict rock properties, rock quality and hence provide an estimate for the rock stability of the potential repository at Olkiluoto. The model includes a database of rock mechanics raw data and a block model in which the rock mechanics parameters are estimated through block volumes based on spatial rock mechanics raw data. In this version 2.3, special emphasis was placed on refining the estimation of the block model. The model was divided into rock mechanics domains which were used as constraints during the block model estimation. During the modelling process, a display profile and toolbar were developed for the GEOVIA Surpac software to improve visualisation and access to the rock mechanics data for the Olkiluoto area. (orig.)

  19. On Modeling Risk Shocks

    OpenAIRE

    Dorofeenko, Victor; Lee, Gabriel; Salyer, Kevin; Strobel, Johannes

    2016-01-01

Within the context of a financial accelerator model, we model time-varying uncertainty (i.e. risk shocks) through the use of a mixture Normal model with time variation in the weights applied to the underlying distributions characterizing entrepreneur productivity. Specifically, we model capital producers (i.e. the entrepreneurs) as either low-risk (relatively small second moment for productivity) or high-risk (relatively large second moment for productivity), and the fraction of both types is...
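    A small sketch of the mixture-Normal idea, with illustrative parameters only: low-risk and high-risk entrepreneur types share a mean productivity but differ in dispersion, and a time-varying mixing weight acts as the risk shock.

    ```python
    # Two-component Normal mixture with a time-varying mixing weight; all parameters are assumptions.
    import numpy as np

    def draw_productivity(n, weight_high, mu=1.0, sigma_low=0.1, sigma_high=0.4, rng=None):
        """Sample n productivity draws from the mixture with P(high-risk type) = weight_high."""
        rng = rng or np.random.default_rng(0)
        is_high = rng.random(n) < weight_high
        sigma = np.where(is_high, sigma_high, sigma_low)
        return rng.normal(mu, sigma)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        for t, w in enumerate([0.1, 0.1, 0.5, 0.9]):   # a "risk shock" hits at t = 2
            draws = draw_productivity(100_000, w, rng=rng)
            print(f"t={t} weight_high={w:.1f} cross-sectional std={draws.std():.3f}")
    ```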

  20. Latest NASA Instrument Cost Model (NICM): Version VI

    Science.gov (United States)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

The NASA Instrument Cost Model, NICM, is a suite of tools which allow for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, which is applicable for instruments flying on Explorer-like class missions; 2) the new cluster analysis ability which, alongside the results of the parametric cost estimation for the user's instrument, also provides a visualization of the user's instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.

  1. Solid Waste Projection Model: Database (Version 1.3)

    International Nuclear Information System (INIS)

    Blackburn, C.L.

    1991-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software and data structures utilized in developing the SWPM Version 1.3 Database. This document is intended for use by experienced database specialists and supports database maintenance, utility development, and database enhancement

  2. Some Remarks on Stochastic Versions of the Ramsey Growth Model

    Czech Academy of Sciences Publication Activity Database

    Sladký, Karel

    2012-01-01

Vol. 19, No. 29 (2012), pp. 139-152. ISSN 1212-074X. R&D Projects: GA ČR GAP402/10/1610; GA ČR GAP402/10/0956; GA ČR GAP402/11/0150. Institutional support: RVO:67985556. Keywords: Economic dynamics * Ramsey growth model with disturbance * stochastic dynamic programming * multistage stochastic programs. Subject RIV: BB - Applied Statistics, Operational Research. http://library.utia.cas.cz/separaty/2013/E/sladky-some remarks on stochastic versions of the ramsey growth model.pdf

  3. H2A Production Model, Version 2 User Guide

    Energy Technology Data Exchange (ETDEWEB)

    Steward, D.; Ramsden, T.; Zuboy, J.

    2008-09-01

    The H2A Production Model analyzes the technical and economic aspects of central and forecourt hydrogen production technologies. Using a standard discounted cash flow rate of return methodology, it determines the minimum hydrogen selling price, including a specified after-tax internal rate of return from the production technology. Users have the option of accepting default technology input values--such as capital costs, operating costs, and capacity factor--from established H2A production technology cases or entering custom values. Users can also modify the model's financial inputs. This new version of the H2A Production Model features enhanced usability and functionality. Input fields are consolidated and simplified. New capabilities include performing sensitivity analyses and scaling analyses to various plant sizes. This User Guide helps users already familiar with the basic tenets of H2A hydrogen production cost analysis get started using the new version of the model. It introduces the basic elements of the model then describes the function and use of each of its worksheets.
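    The core discounted-cash-flow logic described above can be sketched as a search for the hydrogen price at which after-tax NPV is zero at the target rate of return. The plant numbers and the flat-tax treatment below are assumptions for illustration; the actual H2A spreadsheet model is considerably more detailed.

    ```python
    # Simplified minimum-selling-price calculation via NPV bisection; all inputs are illustrative.

    def npv(price, capital, fixed_om, var_om_per_kg, kg_per_year,
            tax_rate=0.35, discount_rate=0.10, years=20):
        """After-tax NPV of the plant for a given hydrogen selling price ($/kg)."""
        value = -capital
        for year in range(1, years + 1):
            revenue = price * kg_per_year
            pretax = revenue - fixed_om - var_om_per_kg * kg_per_year
            after_tax = pretax * (1 - tax_rate)
            value += after_tax / (1 + discount_rate) ** year
        return value

    def minimum_selling_price(**plant):
        """Bisect on price until NPV is (approximately) zero at the target discount rate."""
        lo, hi = 0.0, 100.0
        while hi - lo > 1e-6:
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if npv(mid, **plant) < 0 else (lo, mid)
        return (lo + hi) / 2

    if __name__ == "__main__":
        plant = dict(capital=400e6, fixed_om=15e6, var_om_per_kg=0.60, kg_per_year=140e6)
        print(f"Minimum hydrogen selling price: ${minimum_selling_price(**plant):.2f}/kg")
    ```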

  4. Integrated Farm System Model Version 4.3 and Dairy Gas Emissions Model Version 3.3 Software development and distribution

    Science.gov (United States)

    Modeling routines of the Integrated Farm System Model (IFSM version 4.2) and Dairy Gas Emission Model (DairyGEM version 3.2), two whole-farm simulation models developed and maintained by USDA-ARS, were revised with new components for: (1) simulation of ammonia (NH3) and greenhouse gas emissions gene...

  5. NASA Risk Management Handbook. Version 1.0

    Science.gov (United States)

Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Maggio, Gaspare; Stamatelatos, Michael; Youngblood, Robert; Guarro, Sergio; Rutledge, Peter; Sherrard, James; Smith, Curtis

    2011-01-01

The purpose of this handbook is to provide guidance for implementing the Risk Management (RM) requirements of NASA Procedural Requirements (NPR) document NPR 8000.4A, Agency Risk Management Procedural Requirements [1], with a specific focus on programs and projects, and applying to each level of the NASA organizational hierarchy as requirements flow down. This handbook supports RM application within the NASA systems engineering process, and is a complement to the guidance contained in NASA/SP-2007-6105, NASA Systems Engineering Handbook [2]. Specifically, this handbook provides guidance that is applicable to the common technical processes of Technical Risk Management and Decision Analysis established by NPR 7123.1A, NASA Systems Engineering Process and Requirements [3]. These processes are part of the "Systems Engineering Engine" (Figure 1) that is used to drive the development of the system and associated work products to satisfy stakeholder expectations in all mission execution domains, including safety, technical, cost, and schedule. Like NPR 7123.1A, NPR 8000.4A is a discipline-oriented NPR that intersects with product-oriented NPRs such as NPR 7120.5D, NASA Space Flight Program and Project Management Requirements [4]; NPR 7120.7, NASA Information Technology and Institutional Infrastructure Program and Project Management Requirements [5]; and NPR 7120.8, NASA Research and Technology Program and Project Management Requirements [6]. In much the same way that the NASA Systems Engineering Handbook is intended to provide guidance on the implementation of NPR 7123.1A, this handbook is intended to provide guidance on the implementation of NPR 8000.4A. Scope and depth: this handbook provides guidance for conducting RM in the context of NASA program and project life cycles, which produce derived requirements in accordance with existing systems engineering practices that flow down through the NASA organizational hierarchy. The guidance in this handbook is not meant

  6. Break model comparison in different RELAP5 versions

    International Nuclear Information System (INIS)

    Parzer, I.

    2003-01-01

The presented work focuses on break flow prediction in the RELAP5/MOD3 code, which is crucial for predicting core uncovering and heatup during Small Break Loss-of-Coolant Accidents (SB LOCA). The code predictions have been compared to the IAEA-SPE-4 experiments conducted on the PMK-2 integral test facility in Hungary. The simulations have been performed with the MOD3.2.2 Beta, MOD3.2.2 Gamma, MOD3.3 Beta and MOD3.3 frozen code versions. In the present work we have compared the Ransom-Trapp and Henry-Fauske break model predictions. Additionally, each model's predictions have been compared against itself when used as the main modeling option and when used as an alternative code option, via the so-called 'secret developmental options' on input card no. 1. (author)

  7. GLEAM version 3: Global Land Evaporation Datasets and Model

    Science.gov (United States)

    Martens, B.; Miralles, D. G.; Lievens, H.; van der Schalie, R.; de Jeu, R.; Fernandez-Prieto, D.; Verhoest, N.

    2015-12-01

Terrestrial evaporation links energy, water and carbon cycles over land and is therefore a key variable of the climate system. However, the global-scale magnitude and variability of the flux, and the sensitivity of the underlying physical process to changes in environmental factors, are still poorly understood due to limitations in in situ measurements. As a result, several methods have arisen to estimate global patterns of land evaporation from satellite observations. However, these algorithms generally differ in their approach to model evaporation, resulting in large differences in their estimates. One of these methods is GLEAM, the Global Land Evaporation Amsterdam Methodology. GLEAM estimates terrestrial evaporation based on daily satellite observations of meteorological variables, vegetation characteristics and soil moisture. Since the publication of the first version of the algorithm (2011), the model has been widely applied to analyse trends in the water cycle and land-atmospheric feedbacks during extreme hydrometeorological events. A third version of the GLEAM global datasets is foreseen by the end of 2015. Given the relevance of having a continuous and reliable record of global-scale evaporation estimates for climate and hydrological research, the establishment of an online data portal to host these data for the public is also foreseen. In this new release of the GLEAM datasets, different components of the model have been updated, with the most significant change being the revision of the data assimilation algorithm. In this presentation, we will highlight the most important changes of the methodology and present three new GLEAM datasets and their validation against in situ observations and an alternative dataset of terrestrial evaporation (ERA-Land). Results of the validation exercise indicate that the magnitude and the spatiotemporal variability of the modelled evaporation agree reasonably well with the estimates of ERA-Land and the in situ

  8. Solid Waste Projection Model: Database (Version 1.4)

    International Nuclear Information System (INIS)

    Blackburn, C.; Cillan, T.

    1993-09-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software and data structures utilized in developing the SWPM Version 1.4 Database. This document is intended for use by experienced database specialists and supports database maintenance, utility development, and database enhancement. Those interested in using the SWPM database should refer to the SWPM Database User's Guide. This document is available from the PNL Task M Project Manager (D. L. Stiles, 509-372-4358), the PNL Task L Project Manager (L. L. Armacost, 509-372-4304), the WHC Restoration Projects Section Manager (509-372-1443), or the WHC Waste Characterization Manager (509-372-1193)

  9. Melanoma Risk Prediction Models

    Science.gov (United States)

Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. System Analysis and Risk Assessment System (SARA), Version 4.0

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Skinner, N.L.; Stewart, H.D.; Wood, S.T.

    1992-02-01

    This NUREG is the reference manual for the System Analysis and Risk Assessment (SARA) System Version 4.0, a microcomputer-based system used to analyze the safety issues of a family [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. The SARA data base contains PRA data for the dominant accident sequences of a family and descriptive information about the family including event trees, fault trees, and system model diagrams. The number of facility data bases that can be accessed is limited only by the amount of disk storage available. To simulate changes to family systems, SARA users change the failure rates of initiating and basic events and/or modify the structure of the cut sets that make up the event trees, fault trees, and systems. The user then evaluates the effects of these changes through the recalculation of the resultant accident sequence probabilities and importance measures. The results are displayed in tables and graphs

  11. Credit Risk Modeling

    DEFF Research Database (Denmark)

    Lando, David

Credit risk is today one of the most intensely studied topics in quantitative finance. This book provides an introduction and overview for readers who seek an up-to-date reference to the central problems of the field and to the tools currently used to analyze them. The book is aimed at researchers... and students in finance, at quantitative analysts in banks and other financial institutions, and at regulators interested in the modeling aspects of credit risk. David Lando considers the two broad approaches to credit risk analysis: that based on classical option pricing models on the one hand...

  12. Validity and Reliability of Persian Version of Johns Hopkins Fall Risk Assessment Tool among Aged People

    Directory of Open Access Journals (Sweden)

    hadi hojati

    2018-04-01

    Background & Aim: It is crucial to identify aged patients at risk of falls in clinical settings. The Johns Hopkins Fall Risk Assessment Tool (JHFRAT) is one of the most widely applied international instruments for assessing elderly patients' risk of falls. The aim of this study was to evaluate the validity, reliability and internal consistency of the Persian version of the JHFRAT. Methods & Materials: In this cross-sectional study, the WHO standard translation-back translation protocol was applied to produce the Persian version of the tool. Face and content validity were confirmed by ten expert faculty members for applicability in clinical settings. Inclusion criteria for this pilot study were age 60 years or older, hospitalization within the 8 hours prior to assessment, and adequate cognitive status as assessed by the MMSE. The subjects were 70 elderly patients newly hospitalized in Shahroud Emam Hossein Hospital. Data were analyzed using SPSS software (version 16), and internal consistency was calculated with Cronbach's alpha. Results: The Persian version of the JHFRAT was a valid tool for application in clinical settings, with a Cronbach's alpha of 0.733. Conclusion: Based on the findings of the current study, the Persian version of the JHFRAT is a valid and reliable tool for assessing elderly patients on admission in clinical settings.
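
    For readers unfamiliar with the internal-consistency statistic reported above, the following minimal sketch computes Cronbach's alpha for a respondents-by-items score matrix. The simulated scores are purely illustrative and are not the study data.

```python
# Minimal sketch of Cronbach's alpha; the scores below are simulated, not study data.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
trait = rng.normal(size=(70, 1))                           # latent tendency (hypothetical)
fake_scores = trait + rng.normal(scale=0.8, size=(70, 7))  # 7 correlated items, 70 patients
print(round(cronbach_alpha(fake_scores), 3))
```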

  13. Risk factors for cesarean section and instrumental vaginal delivery after successful external cephalic version.

    Science.gov (United States)

    de Hundt, Marcella; Vlemmix, Floortje; Bais, Joke M J; de Groot, Christianne J; Mol, Ben Willem; Kok, Marjolein

    2016-01-01

    The aim of this article is to examine whether factors can be identified that predict cesarean section and instrumental vaginal delivery in women who had a successful external cephalic version. We used data from a previous randomized trial among 25 hospitals and their referring midwife practices in the Netherlands. With the data of this trial, we performed a cohort study among women attempting vaginal delivery after successful ECV. We evaluated whether maternal age, gestational age, parity, time interval between ECV and delivery, birth weight, neonatal gender, and induction of labor were predictive of a vaginal delivery on the one hand or a cesarean section or instrumental vaginal delivery on the other. Unadjusted and adjusted odds ratios were calculated with univariate and multivariate logistic regression analysis. Among 301 women who attempted vaginal delivery after a successful external cephalic version attempt, the cesarean section rate was 13% and the instrumental vaginal delivery rate 6%, resulting in a combined instrumental delivery rate of 19%. Nulliparity increased the risk of cesarean section (OR 2.7, 95% CI 1.2-6.1) and instrumental delivery (OR 4.2, 95% CI 2.1-8.6). Maternal age, gestational age at delivery, time interval between external cephalic version and delivery, birth weight and neonatal gender did not contribute to the prediction of failed spontaneous vaginal delivery. In our cohort of 301 women with a successful external cephalic version, nulliparity was the only one of seven factors that predicted the risk of cesarean section and instrumental vaginal delivery.

  14. Prediction models for successful external cephalic version: a systematic review.

    Science.gov (United States)

    Velzel, Joost; de Hundt, Marcella; Mulder, Frederique M; Molkenboer, Jan F M; Van der Post, Joris A M; Mol, Ben W; Kok, Marjolein

    2015-12-01

    To provide an overview of existing prediction models for successful ECV, and to assess their quality, development and performance. We searched MEDLINE, EMBASE and the Cochrane Library to identify all articles reporting on prediction models for successful ECV published from inception to January 2015. We extracted information on study design, sample size, model-building strategies and validation. We evaluated the phases of model development and summarized their performance in terms of discrimination, calibration and clinical usefulness. We collected the different predictor variables together with their reported significance, in order to identify important predictor variables for successful ECV. We identified eight articles reporting on seven prediction models. All models were subjected to internal validation. Only one model was also validated in an external cohort. Two prediction models had a low overall risk of bias, of which only one showed promising predictive performance at internal validation. This model also completed the phase of external validation. For none of the models was their impact on clinical practice evaluated. The most important predictor variables for successful ECV described in the selected articles were parity, placental location, breech engagement and whether the fetal head was palpable. One model was assessed for both discrimination and calibration using internal (AUC 0.71) and external validation (AUC 0.64), while two other models were assessed for discrimination or calibration only. We found one prediction model for breech presentation that was validated in an external cohort and had acceptable predictive performance. This model should be used to counsel women considering ECV. Copyright © 2015. Published by Elsevier Ireland Ltd.

  15. Breast cancer risks and risk prediction models.

    Science.gov (United States)

    Engel, Christoph; Fischer, Christine

    2015-02-01

    BRCA1/2 mutation carriers have a considerably increased risk of developing breast and ovarian cancer. The personalized clinical management of carriers and other at-risk individuals depends on precise knowledge of the cancer risks. In this report, we give an overview of the present literature on empirical cancer risks, and we describe risk prediction models that are currently used for individual risk assessment in clinical practice. Cancer risks show large variability between studies. Breast cancer risks are 40-87% for BRCA1 mutation carriers and 18-88% for BRCA2 mutation carriers. For ovarian cancer, the risk estimates are in the range of 22-65% for BRCA1 and 10-35% for BRCA2. The contralateral breast cancer risk is high (10-year risk after the first cancer of 27% for BRCA1 and 19% for BRCA2). Risk prediction models have been proposed to provide more individualized risk estimates, using additional knowledge on family history, mode of inheritance of major genes, and other genetic and non-genetic risk factors. User-friendly software tools have been developed that serve as a basis for decision-making in family counseling units. In conclusion, further assessment of cancer risks and model validation is needed, ideally based on prospective cohort studies. To obtain such data, clinical management of carriers and other at-risk individuals should always be accompanied by standardized scientific documentation.

  16. Metric properties of the "timed get up and go- modified version" test, in risk assessment of falls in active women.

    Science.gov (United States)

    Alfonso Mora, Margareth Lorena

    2017-03-30

    To analyse the metric properties of the Timed Get Up and Go-Modified Version test (TGUGM) for fall risk assessment in a group of physically active women. A sample of 202 women over 55 years of age was assessed in a cross-sectional study. The TGUGM was applied to assess fall risk, and the test was analysed by comparing its qualitative and quantitative information and by factor analysis. A logistic regression model was developed to explain the risk of falls according to the test components. The TGUGM was useful for assessing the risk of falls in the studied group. The test revealed two factors: the Get Up and the Gait with dual task. Scores below twelve points or completion times above 35 seconds were associated with a high risk of falling. More than 35 seconds on the test indicated a fall risk probability greater than 0.50, and scores below 12 points were associated with a delay of 7 seconds more in the execution of the test (p = 0.0016). Factor analysis of the TGUGM revealed two dimensions that can be independent predictors of the risk of falling: the Get Up, which explains between 64% and 87% of the risk of falling, and the Gait with dual task, which explains between 77% and 95% of the risk of falling.

  17. BehavePlus fire modeling system, version 5.0: Variables

    Science.gov (United States)

    Patricia L. Andrews

    2009-01-01

    This publication has been revised to reflect updates to version 4.0 of the BehavePlus software. It was originally published as The BehavePlus fire modeling system, version 4.0: Variables in July 2008. The BehavePlus fire modeling system is a computer program based on mathematical models that describe wildland fire behavior and effects and the...

  18. Probabilistic Model for Integrated Assessment of the Behavior at the T.D.P. Version 2

    International Nuclear Information System (INIS)

    Hurtado, A.; Eguilior, S.; Recreo, F

    2015-01-01

    This report documents the completion of the first phase of the implementation of the ABACO2G methodology (Bayes Application to Geological Storage of CO2) and the final version of the ABACO2G probabilistic model for the injection phase, prior to its future validation in the experimental field of the Technology Development Plant in Hontomín (Burgos). The model, which determines the probabilistic risk component of a geological storage of CO2 using the formalism of Bayesian networks and Monte Carlo simulation, yields quantitative probability functions for the total CO2 storage system and for each of its subsystems (the storage subsystem and primary seal, the secondary containment subsystem, and the tertiary or dispersion subsystem). It implements the stochastic time evolution of the CO2 plume during the injection period, the stochastic time evolution of the drying front, and the probabilistic evolution of the pressure front, decoupled from the CO2 plume progress front, together with submodels and leakage probability functions for the major leakage risk elements (fractures/faults and wells/deep boreholes), which together define the space of events used to estimate the risks associated with the CO2 geological storage system. The work reported here replaces the qualitative estimation submodels of the former ABACO2G version, developed during Phase I of project ALM-10-017, with analytical, semi-analytical or numerical submodels for the main risk elements (wells and fractures), in order to obtain an integrated probabilistic model of a CO2 storage complex in carbonate formations that meets the needs of the integrated behavior evaluation of the Technology Development Plant in Hontomín
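
    As a loose illustration of the probabilistic approach described above, the sketch below combines two hypothetical leakage pathways (wells and fractures/faults) in a Monte Carlo estimate of a total leakage probability. All distributions and parameter values are invented; this is not the ABACO2G model.

```python
# Schematic Monte Carlo combination of leakage pathways. Everything numeric
# here is made up for illustration only.
import random

def leaks_through_wells():
    # Hypothetical: per-well leakage probability sampled from a wide range.
    n_wells = 3
    p_well = random.uniform(1e-4, 1e-2)
    return any(random.random() < p_well for _ in range(n_wells))

def leaks_through_fractures():
    # Hypothetical: leakage requires an uncertain transmissive fracture
    # intersecting the plume.
    p_transmissive = random.uniform(0.0, 0.05)
    return random.random() < p_transmissive

N = 100_000
leaks = sum(leaks_through_wells() or leaks_through_fractures() for _ in range(N))
print("estimated total leakage probability:", leaks / N)
```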

  19. NETPATH-WIN: an interactive user version of the mass-balance model, NETPATH

    Science.gov (United States)

    El-Kadi, A. I.; Plummer, Niel; Aggarwal, P.

    2011-01-01

    NETPATH-WIN is an interactive user version of NETPATH, an inverse geochemical modeling code used to find mass-balance reaction models that are consistent with the observed chemical and isotopic composition of waters from aquatic systems. NETPATH-WIN was constructed to migrate NETPATH applications into the Microsoft WINDOWS® environment. The new version facilitates model utilization by eliminating difficulties in data preparation and results analysis of the DOS version of NETPATH, while preserving all of the capabilities of the original version. Through example applications, the note describes some of the features of NETPATH-WIN as applied to adjustment of radiocarbon data for geochemical reactions in groundwater systems.

  20. Computerized transportation model for the NRC Physical Protection Project. Versions I and II

    International Nuclear Information System (INIS)

    Anderson, G.M.

    1978-01-01

    Details on two versions of a computerized model for the transportation system of the NRC Physical Protection Project are presented. The Version I model permits scheduling of all types of transport units associated with a truck fleet, including truck trailers, truck tractors, escort vehicles and crews. A fixed-fleet itinerary construction process is used in which iterations on fleet size are required until the service requirements are satisfied. The Version II model adds an aircraft mode capability and provides for a more efficient non-fixed-fleet itinerary generation process. Test results using both versions are included

  1. System Analysis and Risk Assessment system (SARA) Version 4.0

    International Nuclear Information System (INIS)

    Sattison, M.B.; Russell, K.D.; Skinner, N.L.

    1992-01-01

    This NUREG is the tutorial for the System Analysis and Risk Assessment System (SARA) Version 4.0, a microcomputer-based system used to analyze the safety issues of a family [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. A series of lessons are provided that walk the user through some basic steps common to most analyses performed with SARA. The example problems presented in the lessons build on one another, and in combination, lead the user through all aspects of SARA sensitivity analysis

  2. Models of Credit Risk Measurement

    OpenAIRE

    Hagiu Alina

    2011-01-01

    Credit risk is defined as the risk of financial loss caused by the failure of a counterparty. For financial institutions, credit risk is much more important than market risk; insufficient diversification of credit risk is the main cause of bank failures. Only recently has the banking industry begun to measure credit risk in a portfolio context, alongside the development of risk management based on value-at-risk (VaR) models. Once measured, credit risk can be diversif...

  3. Risk Analysis and Decision-Making Software Package (1997 Version) User Manual

    Energy Technology Data Exchange (ETDEWEB)

    Chung, F.T.H.

    1999-02-11

    This manual provides instructions for using the U.S. Department of Energy's (DOE) risk analysis and decision making software (1997 version) developed at BDM Petroleum Technologies by BDM-Oklahoma, Inc. for DOE, under contract No. DE-AC22-94PC91OO8. This software provides petroleum producers with a simple, handy tool for exploration and production risk analysis and decision-making. It collects useful risk analysis tools in one package so that users do not have to use several programs separately. The software is simple to use, but still provides many functions. The 1997 version of the software package includes the following tools: (1) Investment risk (Gambler's ruin) analysis; (2) Monte Carlo simulation; (3) Best fit for distribution functions; (4) Sample and rank correlation; (5) Enhanced oil recovery method screening; and (6) artificial neural network. This software package is subject to change. Suggestions and comments from users are welcome and will be considered for future modifications and enhancements of the software. Please check the opening screen of the software for the current contact information. In the future, more tools will be added to this software package. This manual includes instructions on how to use the software but does not attempt to fully explain the theory and algorithms used to create it.
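
    As an illustration of the investment-risk (gambler's ruin) analysis listed above, the sketch below estimates by Monte Carlo the probability that an exploration programme exhausts its capital before completing a fixed number of wells. All parameter values are hypothetical; this is not the DOE package's code.

```python
# Monte Carlo gambler's-ruin sketch for exploration budgeting (hypothetical numbers).
import random

def ruin_probability(capital, well_cost, success_payoff, p_success, max_wells, trials=50_000):
    ruined = 0
    for _ in range(trials):
        cash = capital
        for _ in range(max_wells):
            if cash < well_cost:      # cannot afford the next well: ruin
                ruined += 1
                break
            cash -= well_cost
            if random.random() < p_success:
                cash += success_payoff
    return ruined / trials

print(ruin_probability(capital=10.0, well_cost=2.0, success_payoff=8.0,
                       p_success=0.2, max_wells=20))
```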

  4. ANLECIS-1: Version of ANLECIS Program for Calculations with the Asymetric Rotational Model

    International Nuclear Information System (INIS)

    Lopez Mendez, R.; Garcia Moruarte, F.

    1986-01-01

    A new modified version of the ANLECIS Code is reported. This version allows to fit simultaneously the cross section of the direct process by the asymetric rotational model, and the cross section of the compound nucleus process by the Hauser-Feshbach formalism with the modern statistical corrections. The calculations based in this version show a dependence of the compound nucleus cross section with respect to the asymetric parameter γ. (author). 19 refs

  5. CENTURY: Modeling Ecosystem Responses to Climate Change, Version 4 (VEMAP 1995)

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: The CENTURY model, Version 4, is a general model of plant-soil nutrient cycling that is being used to simulate carbon and nutrient dynamics for different...

  6. CENTURY: Modeling Ecosystem Responses to Climate Change, Version 4 (VEMAP 1995)

    Data.gov (United States)

    National Aeronautics and Space Administration — The CENTURY model, Version 4, is a general model of plant-soil nutrient cycling that is being used to simulate carbon and nutrient dynamics for different types of...

  7. A Constrained and Versioned Data Model for TEAM Data

    Science.gov (United States)

    Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.

    2009-04-01

    The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites, with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. At each TEAM Site, data are gathered as defined by the protocols and according to a predefined sampling schedule. The TEAM data are organized and stored in a database based on the TEAM spatio-temporal data model. This data model is at the core of the TEAM Information System - it consumes and executes spatio-temporal queries and analytical functions performed on TEAM data, and defines the object data types, relationships and operations that maintain database integrity. The TEAM data model contains object types including types for observation objects (e.g. birds, butterflies and trees), sampling unit, person, role, protocol and site, and the relationships between these object types. Each observation data record is a set of attribute values of an observation object and is always associated with a sampling unit, an observation timestamp or time interval, a versioned protocol and data collectors. The operations on the TEAM data model can be classified as read operations, insert operations and update operations. Following are some typical operations: The operation get(site, protocol, [sampling unit block, sampling unit,] start time, end time) returns all data records using the specified protocol and collected at the specified site, block
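
    The read operation described above can be pictured with the following sketch, which applies a get-style query to an in-memory list of observation records. The field names and record layout are assumptions for illustration; the actual TEAM system executes such queries against its spatio-temporal database.

```python
# Rough sketch of a get(site, protocol, start, end) read operation over
# in-memory records; field names are assumed, not TEAM's actual schema.
from datetime import datetime

def get(records, site, protocol, start_time, end_time, sampling_unit=None):
    """Return observation records for a site/protocol within a time window."""
    hits = []
    for rec in records:
        if rec["site"] != site or rec["protocol"] != protocol:
            continue
        if sampling_unit is not None and rec["sampling_unit"] != sampling_unit:
            continue
        if start_time <= rec["timestamp"] <= end_time:
            hits.append(rec)
    return hits

records = [
    {"site": "Manaus", "protocol": "Terrestrial Vertebrate", "sampling_unit": "CT-01",
     "timestamp": datetime(2009, 1, 15), "observation": {"species": "Panthera onca"}},
]
print(get(records, "Manaus", "Terrestrial Vertebrate",
          datetime(2009, 1, 1), datetime(2009, 2, 1)))
```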

  8. RiskREP: Risk-Based Security Requirements Elicitation and Prioritization (extended version)

    NARCIS (Netherlands)

    Herrmann, Andrea; Morali, A.

    2010-01-01

    Today, companies are required to be in control of the security of their IT assets. This is especially challenging in the presence of limited budgets and conflicting requirements. Here, we present Risk-Based Requirements Elicitation and Prioritization (RiskREP), a method for managing IT security

  9. A hybrid version of swan for fast and efficient practical wave modelling

    NARCIS (Netherlands)

    M. Genseberger (Menno); J. Donners

    2016-01-01

    In the Netherlands, for coastal and inland water applications, wave modelling with SWAN has become a main ingredient. However, computational times are relatively high. Therefore we investigated the parallel efficiency of the current MPI and OpenMP versions of SWAN. The MPI version is

  10. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0: Integrated Reliability and Risk Analysis System (IRRAS) reference manual. Volume 2

    International Nuclear Information System (INIS)

    Russell, K.D.; Kvarfordt, K.J.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs), primarily for nuclear power plants. The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification to report generation. Version 1.0 of the IRRAS program was released in February of 1987. Since then, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version has been designated IRRAS 5.0 and is the subject of this Reference Manual. Version 5.0 of IRRAS provides the same capabilities as earlier versions and adds the ability to perform location transformations and seismic analysis, and provides enhancements to the user interface as well as improved algorithm performance. Additionally, Version 5.0 contains new alphanumeric fault tree and event tree capabilities used for event tree rules, recovery rules, and end state partitioning
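
    To illustrate the kind of cut set quantification and importance results such PRA tools report, the sketch below computes Fussell-Vesely importance from minimal cut sets under the rare-event approximation. The cut sets and probabilities are invented; this is not SAPHIRE/IRRAS code.

```python
# Fussell-Vesely importance from minimal cut sets (rare-event approximation).
# Cut sets and basic-event probabilities are hypothetical.
import math

def cut_set_sum(cut_sets, prob):
    return sum(math.prod(prob[e] for e in cs) for cs in cut_sets)

def fussell_vesely(event, cut_sets, prob):
    """Fraction of the top-event probability contributed by cut sets containing event."""
    total = cut_set_sum(cut_sets, prob)
    containing = [cs for cs in cut_sets if event in cs]
    return cut_set_sum(containing, prob) / total

cut_sets = [{"A", "B"}, {"C"}, {"A", "C"}]
prob = {"A": 1e-2, "B": 1e-3, "C": 1e-4}
for e in ("A", "B", "C"):
    print(e, round(fussell_vesely(e, cut_sets, prob), 4))
```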

  11. A comparison of modified versions of the Static-99 and the Sex Offender Risk Appraisal Guide.

    Science.gov (United States)

    Nunes, Kevin L; Firestone, Philip; Bradford, John M; Greenberg, David M; Broom, Ian

    2002-07-01

    The predictive validity of 2 risk assessment instruments for sex offenders, modified versions of the Static-99 and the Sex Offender Risk Appraisal Guide, was examined and compared in a sample of 258 adult male sex offenders. In addition, the independent contributions to the prediction of recidivism made by each instrument and by various phallometric indices were explored. Both instruments demonstrated moderate levels of predictive accuracy for sexual and violent (including sexual) recidivism. They were not significantly different in terms of their predictive accuracy for sexual or violent recidivism, nor did they contribute independently to the prediction of sexual or violent recidivism. Of the phallometric indices examined, only the pedophile index added significantly to the prediction of sexual recidivism, but not violent recidivism, above the Static-99 alone.

  12. A Systems Engineering Capability Maturity Model, Version 1.1,

    Science.gov (United States)

    1995-11-01

    [Fragmentary extract from the SE-CMM document: a glossary entry defining "process" (... of a sequence of actions to be taken to perform a given task; 1. a set of activities (ISO 12207); 2. a set of practices that address the...), a note that one of the design goals of the SE-CMM effort was to capture the salient concepts from emerging standards and initiatives (e.g., ISO 9001), and part of the version history for the SE-CMM (Release 1: architecture rationale, Process Areas, ISO (SPICE) BPG 0.05 summary).]

  13. Integrated Medical Model (IMM) Optimization Version 4.0 Functional Improvements

    Science.gov (United States)

    Arellano, John; Young, M.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

    The IMM's ability to assess mission outcome risk levels relative to available resources provides a unique capability to offer guidance on optimal operational medical kit and vehicle resources. Post-processing optimization allows IMM to optimize essential resources to improve a specific model outcome, such as maximization of the Crew Health Index (CHI) or minimization of the probability of evacuation (EVAC) or of loss of crew life (LOCL). Mass and/or volume constrain the optimized resource set. The IMM's probabilistic simulation uses input data on one hundred medical conditions to simulate medical events that may occur in spaceflight, the resources required to treat those events, and the resulting impact to the mission based on specific crew and mission characteristics. Because IMM version 4.0 provides for partial treatment of medical events, IMM Optimization 4.0 scores resources at the individual resource unit increment level, as opposed to the full condition-specific treatment set level used in version 3.0. This allows the inclusion of as many resources as possible in the event that an entire set of resources called out for treatment cannot satisfy the constraints. IMM Optimization version 4.0 adds capabilities that increase efficiency by creating multiple resource sets based on differing constraints and priorities (CHI, EVAC, or LOCL). It also provides sets of resources that improve mission-related IMM v4.0 outputs with improved performance compared to the prior optimization. The new optimization represents much improved fidelity that will increase the utility of IMM 4.0 for decision support.
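
    A much-simplified picture of the resource-set optimization described above is sketched below: resource units are selected under a mass budget to maximize a benefit score standing in for CHI improvement or EVAC/LOCL reduction. The greedy heuristic and all numbers are illustrative; the actual IMM optimization is more sophisticated.

```python
# Toy knapsack-style selection of resource units under a mass constraint.
# Resource names, masses and benefit scores are hypothetical.
resources = [  # (name, mass_kg, benefit_score)
    ("analgesic", 0.2, 5.0),
    ("iv_fluid_bag", 1.1, 8.0),
    ("splint", 0.6, 3.0),
    ("antibiotic", 0.3, 6.0),
]

def greedy_pack(resources, mass_budget):
    chosen, used = [], 0.0
    # Take items in order of benefit per kilogram while the budget allows.
    for name, mass, benefit in sorted(resources, key=lambda r: r[2] / r[1], reverse=True):
        if used + mass <= mass_budget:
            chosen.append(name)
            used += mass
    return chosen, used

print(greedy_pack(resources, mass_budget=1.5))
```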

  14. The air emissions risk assessment model (AERAM)

    International Nuclear Information System (INIS)

    Gratt, L.B.

    1991-01-01

    AERAM is an environmental analysis and power generation station investment decision support tool. AERAM calculates the public health risk (in terms of lifetime cancers) in the nearby population from pollutants released into the air. AERAM consists of four main subroutines: Emissions, Air, Exposure and Risk. The Emissions subroutine uses power plant parameters to calculate the expected release of the pollutants; coal-fired and oil-fired power plant models are currently available, and a gas-fired plant model is under preparation. The release of the pollutants into the air is followed by their dispersal in the environment. The dispersion in the Air subroutine uses the Environmental Protection Agency's model, Industrial Source Complex - Long Term. Additional dispersion models (Industrial Source Complex - Short Term and Cooling Tower Drift) are being implemented for future AERAM versions. The Exposure subroutine uses the ambient concentrations to compute population exposures for the pollutants of concern. The exposures are used with the corresponding dose-response models in the Risk subroutine to estimate both the total population risk and individual risk. The risk at the receptor-population centroid with the maximum concentration is also calculated for regulatory purposes. In addition, automated interfaces with AirTox (an air risk decision model) have been implemented to extend AERAM's steady-state single solution to the decision-under-uncertainty domain. AERAM has been used to assess public health risks, to support investment decisions on additional pollution control systems based on health risk reductions, and to analyse the economics of fuel versus health risk tradeoffs. AERAM provides a state-of-the-art capability for evaluating the public health impact of airborne toxic substances in response to regulations and public concern
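
    The emissions-to-risk chain described above can be illustrated with a back-of-the-envelope calculation. The dispersion factor, unit risk factor and population size below are invented; AERAM itself uses the ISC-LT dispersion model and pollutant-specific dose-response data.

```python
# Back-of-the-envelope emissions -> concentration -> exposure -> risk chain.
# All parameter values are hypothetical.
emission_rate_g_per_s = 0.5     # stack emission of a toxic pollutant
dispersion_factor = 2.0e-6      # receptor concentration (g/m3) per unit emission (g/s)
unit_risk = 1.0e-5              # lifetime cancer risk per ug/m3 of exposure
population = 50_000             # people represented by the receptor

concentration_ug_m3 = emission_rate_g_per_s * dispersion_factor * 1e6
individual_risk = concentration_ug_m3 * unit_risk
expected_cases = individual_risk * population

print(f"concentration: {concentration_ug_m3:.3f} ug/m3")
print(f"individual lifetime risk: {individual_risk:.2e}")
print(f"expected lifetime cancers in population: {expected_cases:.2f}")
```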

  15. Estimating Parameters for the PVsyst Version 6 Photovoltaic Module Performance Model

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    We present an algorithm to determine parameters for the photovoltaic module performance model encoded in the software package PVsyst(TM) version 6. Our method operates on current-voltage (I-V) curves measured over a range of irradiance and temperature conditions. We describe the method and illustrate its steps using data for a 36-cell crystalline silicon module. We qualitatively compare our method with one other technique for estimating parameters for the PVsyst(TM) version 6 model.
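
    As a hedged illustration of fitting module parameters to measured I-V points, the sketch below performs a least-squares fit of a one-diode equation with series and shunt resistance neglected. The real PVsyst version 6 model, and the report's method, include those effects and their irradiance and temperature dependence; all numbers here are synthetic.

```python
# Simplified one-diode fit to synthetic I-V data (Rs and Rsh neglected).
import numpy as np
from scipy.optimize import curve_fit

N_CELLS, VT = 36, 0.0257  # cells in series, thermal voltage near 25 C

def diode_current(v, i_ph, log10_i0, n):
    i0 = 10.0 ** log10_i0
    return i_ph - i0 * (np.exp(v / (n * N_CELLS * VT)) - 1.0)

# Synthetic "measured" curve generated from known parameters plus noise.
true = dict(i_ph=5.0, log10_i0=-8.0, n=1.3)
v = np.linspace(0.0, 21.0, 40)
i_meas = diode_current(v, **true) + np.random.default_rng(1).normal(0, 0.01, v.size)

popt, _ = curve_fit(diode_current, v, i_meas, p0=[4.0, -7.0, 1.5], maxfev=10000)
print(dict(zip(["i_ph", "log10_i0", "n"], np.round(popt, 3))))
```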

  16. The outcomes and risk factors of fetal bradycardia associated with external cephalic version.

    Science.gov (United States)

    Suyama, Fumio; Ogawa, Kohei; Tazaki, Yukiko; Miwa, Terumi; Taniguchi, Kosuke; Nakamura, Noriyuki; Tanaka, Satomi; Tanigaki, Shinji; Sago, Haruhiko

    2017-11-02

    The objective of this study is to assess the outcomes and risk factors of fetal bradycardia after external cephalic version (ECV). We performed a retrospective study of women who underwent ECV after 35 weeks of gestation in 2010-2016. We assessed the birth outcomes, including umbilical cord artery pH, according to the duration of fetal bradycardia and the risk factors for bradycardia. Among 390 cases, 189 (48.5%) cases showed fetal bradycardia during or immediately after ECV. The duration of fetal bradycardia was 10 min occurred in three cases; emergency cesarean section was performed in each case, with delivery after 12-4 min of bradycardia. Two of three cases showed low Apgar scores at 5 min, with an umbilical cord arterial pH of 10 min after ECV was a risk factor for asphyxia. Thus, delivery should be completed within 10 min after bradycardia. A low maternal BMI and a prolonged ECV procedure were risk factors for bradycardia after ECV.

  17. Incremental Validity Analyses of the Violence Risk Appraisal Guide and the Psychopathy Checklist: Screening Version in a Civil Psychiatric Sample

    Science.gov (United States)

    Edens, John F.; Skeem, Jennifer L.; Douglas, Kevin S.

    2006-01-01

    This study compares two instruments frequently used to assess risk for violence, the Violence Risk Appraisal Guide (VRAG) and the Psychopathy Checklist: Screening Version (PCL:SV), in a large sample of civil psychiatric patients. Despite a strong bivariate relationship with community violence, the VRAG could not improve on the predictive validity…

  18. Custom v. Standardized Risk Models

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-05-01

    We discuss when and why custom multi-factor risk models are warranted and give source code for computing some risk factors. Pension/mutual funds do not require customization but standardization. However, using standardized risk models in quant trading with much shorter holding horizons is suboptimal: (1) longer horizon risk factors (value, growth, etc.) increase noise trades and trading costs; (2) arbitrary risk factors can neutralize alpha; (3) “standardized” industries are artificial and insufficiently granular; (4) normalization of style risk factors is lost for the trading universe; (5) diversifying risk models lowers P&L correlations, reduces turnover and market impact, and increases capacity. We discuss various aspects of custom risk model building.

  19. Model Risk in Portfolio Optimization

    Directory of Open Access Journals (Sweden)

    David Stefanovits

    2014-08-01

    We consider a one-period portfolio optimization problem under model uncertainty. For this purpose, we introduce a measure of model risk. We derive analytical results for this measure of model risk in the mean-variance problem assuming we have observations drawn from a normal variance mixture model. This model allows for heavy tails, tail dependence and leptokurtosis of marginals. The results show that mean-variance optimization is seriously compromised by model uncertainty, in particular, for non-Gaussian data and small sample sizes. To mitigate these shortcomings, we propose a method to adjust the sample covariance matrix in order to reduce model risk.
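
    A small numerical illustration of this kind of model risk is sketched below: minimum-variance weights computed from a noisy sample covariance are evaluated under the data-generating covariance and compared with the truly optimal weights. The variance ratio shown is a simplification of the paper's measure, and the Gaussian returns are simulated.

```python
# Model risk illustration: portfolio weights from estimated vs. true covariance.
import numpy as np

rng = np.random.default_rng(42)
mu_true = np.array([0.05, 0.04, 0.06])
cov_true = np.array([[0.04, 0.01, 0.00],
                     [0.01, 0.03, 0.01],
                     [0.00, 0.01, 0.05]])

def min_var_weights(cov):
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

def variance(w):
    return float(w @ cov_true @ w)   # evaluated under the "true" model

# Small sample -> noisy covariance estimate -> suboptimal weights.
sample = rng.multivariate_normal(mu_true, cov_true, size=30)
w_est = min_var_weights(np.cov(sample, rowvar=False))
w_opt = min_var_weights(cov_true)

print("variance with estimated model:", round(variance(w_est), 5))
print("variance with true model     :", round(variance(w_opt), 5))
print("model-risk ratio             :", round(variance(w_est) / variance(w_opt), 3))
```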

  20. Integrated Biosphere Simulator Model (IBIS), Version 2.5

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: The Integrated Biosphere Simulator (or IBIS) is designed to be a comprehensive model of the terrestrial biosphere. Tthe model represents a wide range of...

  1. Integrated Biosphere Simulator Model (IBIS), Version 2.5

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Biosphere Simulator (or IBIS) is designed to be a comprehensive model of the terrestrial biosphere. Tthe model represents a wide range of processes,...

  2. Risk based modelling

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Baker, A.E.

    1993-01-01

    Risk based analysis is a tool becoming available to both engineers and managers to aid decision making concerning plant matters such as In-Service Inspection (ISI). In order to develop a risk based method, some form of Structural Reliability Risk Assessment (SRRA) needs to be performed to provide a probability of failure ranking for all sites around the plant. A Probabilistic Risk Assessment (PRA) can then be carried out to combine these possible events with the capability of plant safety systems and procedures, to establish the consequences of failure for the sites. In this way the probabilities of failure are converted into a risk based ranking which can be used to assist the process of deciding which sites should be included in an ISI programme. This paper reviews the technique and typical results of a risk based ranking assessment carried out for nuclear power plant pipework. (author)
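
    The ranking idea is simple enough to sketch: multiply a structural-reliability failure probability by a consequence measure and sort candidate sites by the product. The sites and numbers below are hypothetical.

```python
# Toy risk-based ranking of candidate inspection sites (hypothetical data).
sites = [  # (site id, probability of failure per year, relative consequence)
    ("weld-101", 2e-5, 50.0),
    ("weld-207", 8e-6, 400.0),
    ("elbow-033", 1e-4, 5.0),
]

ranked = sorted(sites, key=lambda s: s[1] * s[2], reverse=True)
for site, pof, conseq in ranked:
    print(f"{site:10s}  risk = {pof * conseq:.2e}")
```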

  3. The MiniBIOS model (version 1A4) at the RIVM

    NARCIS (Netherlands)

    Uijt de Haag PAM; Laheij GMH

    1993-01-01

    This report is the user's guide of the MiniBIOS model, version 1A4. The model is operational at the Laboratory of Radiation Research of the RIVM. MiniBIOS is a simulation model for calculating the transport of radionuclides in the biosphere and the consequential radiation dose to humans. The

  4. Integrated Baseline System (IBS) Version 2.0: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  5. Microsoft Repository Version 2 and the Open Information Model.

    Science.gov (United States)

    Bernstein, Philip A.; Bergstraesser, Thomas; Carlson, Jason; Pal, Shankar; Sanders, Paul; Shutt, David

    1999-01-01

    Describes the programming interface and implementation of the repository engine and the Open Information Model for Microsoft Repository, an object-oriented meta-data management facility that ships in Microsoft Visual Studio and Microsoft SQL Server. Discusses Microsoft's component object model, object manipulation, queries, and information…

  6. Prediction models for successful external cephalic version: a systematic review

    NARCIS (Netherlands)

    Velzel, Joost; de Hundt, Marcella; Mulder, Frederique M.; Molkenboer, Jan F. M.; van der Post, Joris A. M.; Mol, Ben W.; Kok, Marjolein

    2015-01-01

    To provide an overview of existing prediction models for successful ECV, and to assess their quality, development and performance. We searched MEDLINE, EMBASE and the Cochrane Library to identify all articles reporting on prediction models for successful ECV published from inception to January 2015.

  7. Efficient Modelling and Generation of Markov Automata (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    2012-01-01

    This paper introduces a framework for the efficient modelling and generation of Markov automata. It consists of (1) the data-rich process-algebraic language MAPA, allowing concise modelling of systems with nondeterminism, probability and Markovian timing; (2) a restricted form of the language, the

  8. STORM WATER MANAGEMENT MODEL USER'S MANUAL VERSION 5.0

    Science.gov (United States)

    The EPA Storm Water Management Model (SWMM) is a dynamic rainfall-runoff simulation model used for single event or long-term (continuous) simulation of runoff quantity and quality from primarily urban areas. SWMM was first developed in 1971 and has undergone several major upgrade...

  9. Integrated Baseline System (IBS) Version 1.03: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1993-01-01

    The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. It provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

  10. Flipped version of the supersymmetric strongly coupled preon model

    Energy Technology Data Exchange (ETDEWEB)

    Fajfer, S. (Institut za Fiziku, University of Sarajevo, Sarajevo, (Yugoslavia)); Milekovic, M.; Tadic, D. (Zavod za Teorijsku Fiziku, Prirodoslovno-Matematicki Fakultet, University of Zagreb, Croatia, (Yugoslavia))

    1989-12-01

    In the supersymmetric SU(5) (SUSY SU(5)) composite model (which was described in an earlier paper) the fermion mass terms can be easily constructed. The SUSY SU(5)⊗U(1), i.e., flipped, composite model possesses a completely analogous composite-particle spectrum. However, in that model one cannot construct a renormalizable superpotential which would generate fermion mass terms. This contrasts with the standard noncomposite grand unified theories (GUTs) in which both the Georgi-Glashow electrical charge embedding and its flipped counterpart lead to renormalizable theories.

  11. Radarsat Antarctic Mapping Project Digital Elevation Model, Version 2

    Data.gov (United States)

    National Aeronautics and Space Administration — The high-resolution Radarsat Antarctic Mapping Project (RAMP) Digital Elevation Model (DEM) combines topographic data from a variety of sources to provide consistent...

  12. U.S. Coastal Relief Model - Southern California Version 2

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NGDC's U.S. Coastal Relief Model (CRM) provides a comprehensive view of the U.S. coastal zone integrating offshore bathymetry with land topography into a seamless...

  13. ONKALO rock mechanics model (RMM) - Version 2.0

    International Nuclear Information System (INIS)

    Moenkkoenen, H.; Hakala, M.; Paananen, M.; Laine, E.

    2012-02-01

    The Rock Mechanics Model of the ONKALO rock volume is a description of the significant features and parameters related to rock mechanics. The main objective is to develop a tool to predict the rock properties, quality and hence the potential for stress failure which can then be used for continuing design of the ONKALO and the repository. This is the second implementation of the Rock Mechanics Model and it includes sub-models of the intact rock strength, in situ stress, thermal properties, rock mass quality and properties of the brittle deformation zones. Because of the varying quantities of available data for the different parameters, the types of presentations also vary: some data sets can be presented in the style of a 3D block model but, in other cases, a single distribution represents the whole rock volume hosting the ONKALO. (orig.)

  14. Geological model of the ONKALO area version 0

    International Nuclear Information System (INIS)

    Paananen, M.; Paulamaeki, S.; Gehoer, S.; Kaerki, A.

    2006-03-01

    The geological model of the ONKALO area is composed of four submodels: the ductile deformation model, the lithological model, the brittle deformation model and the alteration model. The ductile deformation model describes and models the products of polyphase ductile deformation, which facilitates the definition of the dimensions and geometrical properties of the individual lithological units determined in the lithological model. The lithological model describes the properties of rock units that can be defined on the basis of the migmatite structures, textures and modal compositions. The brittle deformation model describes the products of multiple phases of brittle deformation, and the alteration model describes the types, occurrence and effects of the hydrothermal alteration. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subject to five stages of ductile deformation. This resulted in a pervasive, composite foliation which shows a rather constant attitude in the ONKALO area. Based on observations in outcrops, investigation trenches and drill cores, 3D modelling of the lithological units is carried out assuming that the contacts are quasiconcordant. Using this assumption, the strike and dip of the foliation has been used as a tool to correlate the lithologies between the drillholes, and from surface and tunnel outcrops to drillholes. The rocks at Olkiluoto can be divided into two major groups: (1) supracrustal high-grade metamorphic rocks, including various migmatitic gneisses, homogeneous tonalitic-granodioritic-granitic gneisses, mica gneisses, quartzitic gneisses and mafic gneisses, and (2) igneous rocks, including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite

  15. The Oak Ridge Competitive Electricity Dispatch (ORCED) Model Version 9

    Energy Technology Data Exchange (ETDEWEB)

    Hadley, Stanton W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Baek, Young Sun [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-11-01

    The Oak Ridge Competitive Electricity Dispatch (ORCED) model dispatches power plants in a region to meet the electricity demands for any single given year up to 2030. It uses publicly available sources of data describing electric power units such as the National Energy Modeling System and hourly demands from utility submittals to the Federal Energy Regulatory Commission that are projected to a future year. The model simulates a single region of the country for a given year, matching generation to demands and predefined net exports from the region, assuming no transmission constraints within the region. ORCED can calculate a number of key financial and operating parameters for generating units and regional market outputs including average and marginal prices, air emissions, and generation adequacy. By running the model with and without changes such as generation plants, fuel prices, emission costs, plug-in hybrid electric vehicles, distributed generation, or demand response, the marginal impact of these changes can be found.
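
    A toy merit-order dispatch in the spirit of the description above is sketched below: units are stacked by marginal cost until each hour's demand is met, and the marginal price is the cost of the last unit dispatched. The unit data and demands are invented; ORCED itself models many additional effects.

```python
# Toy merit-order dispatch; unit data and hourly demands are hypothetical.
units = [  # (name, capacity MW, marginal cost $/MWh)
    ("nuclear", 1000, 10.0),
    ("coal", 800, 25.0),
    ("gas_cc", 600, 40.0),
    ("gas_ct", 300, 90.0),
]

def dispatch(demand_mw):
    remaining, cost, price = demand_mw, 0.0, 0.0
    for name, cap, mc in sorted(units, key=lambda u: u[2]):  # cheapest first
        gen = min(cap, remaining)
        if gen > 0:
            cost += gen * mc
            price = mc          # marginal price set by the last unit dispatched
            remaining -= gen
    return cost, price, remaining  # remaining > 0 means unserved energy

for hour_demand in (900, 1700, 2500):
    total_cost, marginal_price, unserved = dispatch(hour_demand)
    print(hour_demand, round(total_cost), marginal_price, unserved)
```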

  16. Due Regard Encounter Model Version 1.0

    Science.gov (United States)

    2013-08-19

    Note that no existing model covers encounters between two IFR aircraft in oceanic airspace. The reason for this is that one cannot observe encounters...encounters between instrument flight rules (IFR) and non-IFR traffic beyond 12 NM. Table 1 (Encounter model categories) lists, for each aircraft-of-interest location and flight rule, the category assigned to each intruder type (IFR, VFR, Noncooperative Conventional, Noncooperative Unconventional): CONUS IFR - C, C, U, X; CONUS VFR - C, U, U, X; Offshore IFR - C, C, U, X; Offshore VFR - C, U, U, X.

  17. Wildfire Risk Main Model

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire regime...

  18. Transport Risk Index of Physiologic Stability, version II (TRIPS-II): a simple and practical neonatal illness severity score.

    Science.gov (United States)

    Lee, Shoo K; Aziz, Khalid; Dunn, Michael; Clarke, Maxine; Kovacs, Lajos; Ojah, Cecil; Ye, Xiang Y

    2013-05-01

    Derive and validate a practical assessment of infant illness severity at admission to neonatal intensive care units (NICUs). Prospective study involving 17,075 infants admitted to 15 NICUs in 2006 to 2008. Logistic regression was used to derive a prediction model for mortality comprising four empirically weighted items (temperature, blood pressure, respiratory status, response to noxious stimuli). This Transport Risk Index of Physiologic Stability, version II (TRIPS-II) was then validated for prediction of 7-day and total NICU mortality. TRIPS-II discriminated 7-day (receiver operating characteristic [ROC] curve area, 0.90) and total NICU mortality (ROC area, 0.87) from survival. Furthermore, there was a direct association between changes in TRIPS-II at 12 and 24 hours and mortality. There was good calibration across the full range of TRIPS-II scores and gestational ages at birth, and the addition of TRIPS-II improved the performance of prediction models that use gestational age and baseline population risk variables. TRIPS-II is a validated benchmarking tool for assessing infant illness severity at admission and for up to 24 hours after.
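
    Schematically, an empirically weighted admission score of this kind maps to a mortality probability through a logistic model, as in the sketch below. The item weights and logistic coefficients are placeholders, not the published TRIPS-II values.

```python
# Schematic weighted-score-to-probability mapping; weights and coefficients
# are placeholders, NOT the published TRIPS-II model.
import math

ITEM_WEIGHTS = {  # hypothetical points per item category
    "temperature": {"normal": 0, "mild_derangement": 8, "severe_derangement": 15},
    "blood_pressure": {"normal": 0, "low": 10, "very_low": 20},
    "respiratory_status": {"none": 0, "oxygen": 10, "ventilated": 25},
    "response_to_stimuli": {"vigorous": 0, "lethargic": 10, "unresponsive": 30},
}

def illness_score(findings):
    return sum(ITEM_WEIGHTS[item][level] for item, level in findings.items())

def mortality_probability(score, intercept=-5.0, slope=0.05):  # placeholder fit
    return 1.0 / (1.0 + math.exp(-(intercept + slope * score)))

infant = {"temperature": "mild_derangement", "blood_pressure": "low",
          "respiratory_status": "ventilated", "response_to_stimuli": "lethargic"}
s = illness_score(infant)
print(s, round(mortality_probability(s), 3))
```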

  19. Geological model of the Olkiluoto site Version O

    International Nuclear Information System (INIS)

    Paulamaeki, S.; Paananen, M.; Gehoer, S.

    2006-05-01

    The geological model of the Olkiluoto site consists of four submodels: the lithological model, the ductile deformation model, the brittle deformation model and the alteration model. The lithological model gives the properties of the rock units that can be defined on the basis of the migmatite structures, textures and modal compositions. The ductile deformation model describes and models the products of polyphase ductile deformation, which enables the definition of the dimensions and geometrical properties of the individual lithological units determined in the lithological model. The brittle deformation model describes the products of multiple phases of brittle deformation. The alteration model describes the types, occurrence and effects of the hydrothermal alteration. The rocks of Olkiluoto can be divided into two major classes: (1) supracrustal high-grade metamorphic rocks, including various migmatitic gneisses, tonalitic-granodioritic-granitic gneisses, mica gneisses, quartz gneisses and mafic gneisses, and (2) igneous rocks, including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite structure: veined gneisses, stromatic gneisses and diatexitic gneisses. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subject to polyphase ductile deformation comprising five stages. In 3D modelling of the lithological units, an assumption has been made, on the basis of measurements in outcrops, investigation trenches and drill cores, that the pervasive, composite foliation produced as a result of polyphase ductile deformation has a rather constant attitude in the ONKALO area. Consequently, the strike and dip of the foliation has been used as a tool through which the lithologies have been correlated between the drillholes and from the surface to the drillholes. The bedrock at the Olkiluoto site has been subject to extensive hydrothermal alteration

  20. Institutional Transformation Version 2.5 Modeling and Planning.

    Energy Technology Data Exchange (ETDEWEB)

    Villa, Daniel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mizner, Jack H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Passell, Howard D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gallegos, Gerald R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Peplinski, William John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vetter, Douglas W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Christopher A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Malczynski, Leonard A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Addison, Marlin [Arizona State Univ., Mesa, AZ (United States); Schaffer, Matthew A. [Bridgers and Paxton Engineering Firm, Albuquerque, NM (United States); Higgins, Matthew W. [Vibrantcy, Albuquerque, NM (United States)

    2017-02-01

    Reducing the resource consumption and emissions of large institutions is an important step toward a sustainable future. The vision of Sandia National Laboratories' (SNL) Institutional Transformation (IX) project is to provide tools that enable planners to make well-informed decisions concerning sustainability, resource conservation, and emissions reduction across multiple sectors. The building sector has been the primary focus so far because it is the largest consumer of resources for SNL. The IX building module allows users to define the evolution of many buildings over time. The module has been created so that it can be applied to any set of DOE-2 (http://doe2.com) building models that have been altered to include the parameters and expressions required by energy conservation measures (ECMs). Once building models have been appropriately prepared, they are checked into a Microsoft Access database. Each building can be represented by many models, which makes it possible to keep a continuous record of past models that are replaced as changes occur to the building. In addition, the building module can apply climate scenarios by using different weather files for each simulation year. Once the database has been configured, a user interface in Microsoft Excel is used to create scenarios with one or more ECMs. The capability to include central utility buildings (CUBs) that service more than one building with chilled water has been developed: a utility joins multiple building models into a single model, after which several manual steps are required to complete the process. Once a CUB model has been created, the individual contributions of each building are still tracked through meters. Currently, 120 building models from SNL's New Mexico and California campuses have been created. This includes all buildings at SNL greater than 10,000 sq. ft

  1. Mars Global Reference Atmospheric Model 2010 Version: Users Guide

    Science.gov (United States)

    Justh, H. L.

    2014-01-01

    This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.

  2. Red Storm usage model :Version 1.12.

    Energy Technology Data Exchange (ETDEWEB)

    Jefferson, Karen L.; Sturtevant, Judith E.

    2005-12-01

    Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.

  3. Zig-zag version of the Frenkel-Kontorova model

    DEFF Research Database (Denmark)

    Christiansen, Peter Leth; Savin, A.V.; Zolotaryuk, Alexander

    1996-01-01

    We study a generalization of the Frenkel-Kontorova model which describes a zig-zag chain of particles coupled by both the first- and second-neighbor harmonic forces and subjected to a planar substrate with a commensurate potential relief. The particles are supposed to have two degrees of freedom...

  4. Modelling allergenic risk

    DEFF Research Database (Denmark)

    Birot, Sophie

    combines second order Monte-Carlo simulations with Bayesian inferences [13]. An alternative method using second order Monte-Carlo simulations was proposed to take into account the uncertainty from the inputs. The uncertainty propagation from the inputs to the risk of allergic reaction was also evaluated...... countries is proposed. Thus, the allergen risk assessment can be performed cross-nationally and for the correct food group. Then the two probabilistic risk assessment methods usually used were reviewed and compared. First order Monte-Carlo simulations are used in one method [14], whereas the other one......Up to 20 million Europeans suffer from food allergies. Due to the lack of knowledge about why food allergies developed or how to protect allergic consumers from the offending food, food allergy management is mainly based on food allergens avoidance. The iFAAM project (Integrated approaches to Food...

  5. The "KILDER" air pollution modelling system, version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Gram, F.

    1996-12-31

    This report describes the KILDER Air Pollution Modelling System, which is a system of small PC programs for calculating long-term emission, dispersion, concentration and exposure from different source categories. The system consists of three parts: (1) the dispersion models POI-KILD and ARE-KILD for point and area sources, respectively; (2) the meteorological programs WINDFREC, STABFREC and METFREC; (3) supporting programs for calculating emissions and exposure and for operating on binary data fields. The file structure is based on binary files with data fields. The data fields are matrices with different types of values and may be read into the computer or be calculated in other programs. 19 refs., 22 figs., 3 tabs.

  6. Implementation of a parallel version of a regional climate model

    Energy Technology Data Exchange (ETDEWEB)

    Gerstengarbe, F.W. [ed.; Kuecken, M. [Potsdam-Institut fuer Klimafolgenforschung (PIK), Potsdam (Germany); Schaettler, U. [Deutscher Wetterdienst, Offenbach am Main (Germany). Geschaeftsbereich Forschung und Entwicklung

    1997-10-01

    A regional climate model developed by the Max Planck Institute for Meteorology and the German Climate Computing Centre in Hamburg, based on the 'Europa' and 'Deutschland' models of the German Weather Service, has been parallelized and implemented on the IBM RS/6000 SP computer system of the Potsdam Institute for Climate Impact Research, including parallel input/output processing, the explicit Eulerian time-step, the semi-implicit corrections, the normal-mode initialization and the physical parameterizations of the German Weather Service. The implementation utilizes Fortran 90 and the Message Passing Interface. The parallelization strategy used is a 2D domain decomposition. This report describes the parallelization strategy, the parallel I/O organization, the influence of different domain decomposition approaches on static and dynamic load imbalances, and first numerical results. (orig.)

  7. External Validation of a Prediction Model for Successful External Cephalic Version

    NARCIS (Netherlands)

    de Hundt, Marcella; Vlemmix, Floortje; Kok, Marjolein; van der Steeg, Jan W.; Bais, Joke M.; Mol, Ben W.; van der Post, Joris A.

    2012-01-01

    We sought external validation of a prediction model for the probability of a successful external cephalic version (ECV). We evaluated the performance of the prediction model with calibration and discrimination. For clinical practice, we developed a score chart to calculate the probability of a

  8. Regularized integrable version of the one-dimensional quantum sine-Gordon model

    International Nuclear Information System (INIS)

    Japaridze, G.I.; Nersesyan, A.A.; Wiegmann, P.B.

    1983-01-01

    The authors derive a regularized exactly solvable version of the one-dimensional quantum sine-Gordon model proceeding from the exact solution of the U(1)-symmetric Thirring model. The ground state and the excitation spectrum are obtained in the region ν² < 8π. (Auth.)

  9. Connected Equipment Maturity Model Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Butzbaugh, Joshua B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mayhorn, Ebony T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Sullivan, Greg [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Whalen, Scott A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-05-01

    The Connected Equipment Maturity Model (CEMM) evaluates the high-level functionality and characteristics that enable equipment to provide the four categories of energy-related services through communication with other entities (e.g., equipment, third parties, utilities, and users). The CEMM will help the U.S. Department of Energy, industry, energy efficiency organizations, and research institutions benchmark the current state of connected equipment and identify capabilities that may be attained to reach a more advanced, future state.

  10. System cost model user's manual, version 1.2

    International Nuclear Information System (INIS)

    Shropshire, D.

    1995-06-01

    The System Cost Model (SCM) was developed by Lockheed Martin Idaho Technologies in Idaho Falls, Idaho and MK-Environmental Services in San Francisco, California to support the Baseline Environmental Management Report sensitivity analysis for the U.S. Department of Energy (DOE). The SCM serves the needs of the entire DOE complex for treatment, storage, and disposal (TSD) of mixed low-level, low-level, and transuranic waste. The model can be used to evaluate total complex costs based on various configuration options or to evaluate site-specific options. The site-specific cost estimates are based on generic assumptions such as waste loads and densities, treatment processing schemes, existing facilities capacities and functions, storage and disposal requirements, schedules, and cost factors. The SCM allows customization of the data for detailed site-specific estimates. There are approximately forty TSD module designs that have been further customized to account for design differences for nonalpha, alpha, remote-handled, and transuranic wastes. The SCM generates cost profiles based on the model default parameters or customized user-defined input and also generates costs for transporting waste from generators to TSD sites

  11. Geological Model of the Olkiluoto Site. Version 2.0

    International Nuclear Information System (INIS)

    Aaltonen, I.

    2010-10-01

    The rocks of Olkiluoto can be divided into two major classes: 1) supracrustal high-grade metamorphic rocks including various migmatitic gneisses, tonalitic-granodioritic-granitic gneisses, mica gneisses, quartz gneisses and mafic gneisses, and 2) igneous rocks including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite structure: veined gneisses, stromatic gneisses and diatexitic gneisses. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subjected to polyphased ductile deformation, consisting of five stages, the D2 being locally the most intensive phase, producing thrust-related folding, strong migmatisation and pervasive foliation. In 3D modelling of the lithological units, an assumption has been made, on the basis of measurements in the outcrops, investigation trenches and drill cores, that the pervasive, composite foliation produced as a result of polyphase ductile deformation has a rather constant attitude in the ONKALO area. Consequently, the strike and dip of the foliation has been used as a tool, through which the lithologies have been correlated between the drillholes and from the surface to the drillholes. In addition, the largest ductile deformation zones and tectonic units are described in the 3D model. The bedrock at the Olkiluoto site has been subjected to extensive hydrothermal alteration, which has taken place at reasonably low temperature conditions, the estimated temperature interval being from slightly over 300°C to less than 100°C. Two types of alteration can be observed: firstly, pervasive alteration and secondly, fracture-controlled alteration. Clay mineralisation and sulphidisation are the most prominent alteration events in the site area. Sulphides are located in the uppermost part of the model volume following roughly the foliation and lithological trend. Kaolinite is also mainly located in the

  12. A magnetic version of the Smilansky-Solomyak model

    Czech Academy of Sciences Publication Activity Database

    Barseghyan, Diana; Exner, Pavel

    2017-01-01

    Vol. 50, No. 48 (2017), Article No. 485203. ISSN 1751-8113 R&D Projects: GA ČR GA17-01706S Institutional support: RVO:61389005 Keywords: Smilansky-Solomyak model * spectral transition * homogeneous magnetic field * discrete spectrum * essential spectrum Subject RIV: BE - Theoretical Physics OECD field: Atomic, molecular and chemical physics (physics of atoms and molecules including collision, interaction with radiation, magnetic resonances, Mössbauer effect) Impact factor: 1.857, year: 2016

  13. Psychometric validation of the Chinese version of the Johns Hopkins Fall Risk Assessment Tool for older Chinese inpatients.

    Science.gov (United States)

    Zhang, Junhong; Wang, Min; Liu, Yu

    2016-10-01

    To culturally adapt and evaluate the reliability and validity of the Chinese version of the Johns Hopkins Fall Risk Assessment Tool among older inpatients in the mainland of China. Patient falls are an important safety consideration within hospitals among older inpatients. Nurses need specific risk assessment tools for older inpatients to reliably identify at-risk populations and guide interventions that highlight fixable risk factors for falls and consequent injuries. In China, a few tools have been developed to measure fall risk. However, they lack the solid psychometric development necessary to establish their validity and reliability, and they are not widely used for elderly inpatients. A cross-sectional study. Convenience sampling was used to recruit 201 older inpatients from two tertiary-level hospitals in Beijing and Xiamen, China. The Johns Hopkins Fall Risk Assessment Tool was translated using forward and backward translation procedures and was administered to these 201 older inpatients. Reliability of the tool was calculated by inter-rater reliability and Cronbach's alpha. Validity was analysed through content validity index and construct validity. The inter-rater reliability of the Chinese version of the Johns Hopkins Fall Risk Assessment Tool was 97·14% agreement, with a Cohen's kappa of 0·903. Cronbach's α was 0·703. The content validity index was 0·833. Two factors, representing intrinsic and extrinsic risk factors, were explored that together explained 58·89% of the variance. This study provided evidence that the Johns Hopkins Fall Risk Assessment Tool is an acceptable, valid and reliable tool to identify older inpatients at risk of falls and falls with injury. Further psychometric testing on criterion validity and evaluation of its advanced utility in geriatric clinical settings are warranted. The Chinese version of Johns Hopkins Fall Risk Assessment Tool may be useful for health care personnel to identify older Chinese inpatients at risk of falls and falls

  14. Shortened version of the work ability index to identify workers at risk of long-term sickness absence

    NARCIS (Netherlands)

    Schouten, Lianne S.; Bultmann, Ute; Heymans, Martijn W.; Joling, Catelijne I.; Twisk, Jos W. R.; Roelen, Corne A. M.

    2016-01-01

    Background: The Work Ability Index (WAI) identifies non-sicklisted workers at risk of future long-term sickness absence (LTSA). The WAI is a complicated instrument and inconvenient for use in large-scale surveys. We investigated whether shortened versions of the WAI identify non-sicklisted workers

  15. PUMA Version 6 Multiplatform with Facilities to be coupled with other Simulation Models

    International Nuclear Information System (INIS)

    Grant, Carlos

    2013-01-01

    PUMA is a code for nuclear reactor calculation used in all nuclear installations in Argentina for the simulation of fuel management, power cycles and transient events by means of spatial kinetic diffusion theory in 3D. The versions used up to now ran on the WINDOWS platform with very good results. Nowadays PUMA must work in different operating systems, LINUX among others, and must also have facilities to be coupled with other models. For this reason this new version was reprogrammed in ADA, a language oriented to safe programming and available in any operating system. In former versions PUMA was executed through macro instructions written in LOGO. For this version it is also possible to use PYTHON, which also makes it possible to access the internal data of PUMA at execution time. The use of PYTHON provides an easy way to couple PUMA with other codes. The possibilities of this new version of PUMA are shown by means of examples of input data and process control using PYTHON and LOGO. The implementation of this methodology in other codes to be coupled with PUMA is discussed for versions run in WINDOWS and LINUX. (author)

  16. Geological model of the Olkiluoto site. Version 1.0

    International Nuclear Information System (INIS)

    Mattila, J.; Aaltonen, I.; Kemppainen, K.

    2008-01-01

    The rocks of Olkiluoto can be divided into two major classes: (1) supracrustal high-grade metamorphic rocks including various migmatitic gneisses, tonalitic-granodioritic-granitic gneisses, mica gneisses, quartz gneisses and mafic gneisses, and (2) igneous rocks including pegmatitic granites and diabase dykes. The migmatitic gneisses can further be divided into three subgroups in terms of the type of migmatite structure: veined gneisses, stromatic gneisses and diatexitic gneisses. On the basis of refolding and crosscutting relationships, the metamorphic supracrustal rocks have been subjected to polyphased ductile deformation, consisting of five stages, the D2 being locally the most intensive phase, producing thrust-related folding, strong migmatisation and pervasive foliation. In 3D modelling of the lithological units, an assumption has been made, on the basis of measurements in the outcrops, investigation trenches and drill cores, that the pervasive, composite foliation produced as a result of polyphase ductile deformation has a rather constant attitude in the ONKALO area. Consequently, the strike and dip of the foliation has been used as a tool, through which the lithologies have been correlated between the drillholes and from the surface to the drillholes. The bedrock at the Olkiluoto site has been subjected to extensive hydrothermal alteration, which has taken place at reasonably low temperature conditions, the estimated temperature interval being from slightly over 300°C to less than 100°C. Two types of alteration can be observed: (1) pervasive (disseminated) alteration and (2) fracture-controlled (veinlet) alteration. Kaolinisation and sulphidisation are the most prominent alteration events in the site area. Sulphides are located in the uppermost part of the model volume following roughly the lithological trend (slightly dipping to the SE). Kaolinite is also located in the uppermost part, but the orientation is opposite to the main lithological trend

  17. [Risk of developmental dysplasia of the hip in patients subjected to the external cephalic version].

    Science.gov (United States)

    Sarmiento Carrera, Nerea; González Colmenero, Eva; Vázquez Castelo, José Luis; Concheiro Guisán, Ana; Couceiro Naveira, Emilio; Fernández Lorenzo, José Ramón

    2018-03-01

    Developmental dysplasia of the hip (DDH) refers to the spectrum of abnormalities of maturation and development of the hip. Breech presentation is associated with DDH. This risk factor can be modified by external cephalic version (ECV). The aim of this study is to evaluate the incidence of DDH in patients who successfully underwent ECV, as well as to evaluate the need for these children (breech for a period during gestation) to be included in the DDH screening protocol. A prospective cohort study was conducted in the Hospital Universitario de Vigo from January 1, 2015 to December 31, 2015. It included children born in cephalic presentation after a successful ECV, as well as children born in breech presentation. They were all screened for DDH by ultrasound examination of the hip. Out of a total of 122 newborns included in the study, ECV was attempted on 67 (54.9%), of which 35 (52.2%) were successful. Out of the 14 children diagnosed with DDH, 3 of those born in cephalic presentation after a successful ECV were found to be normal on physical examination. Successful ECV is associated with a lower incidence of DDH compared with breech presentation. However, these patients should be included in the DDH screening protocol for the early detection of this disorder. Copyright © 2017 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.

  18. The breech presentation and the vertex presentation following an external version represent risk factors for neonatal hip instability.

    Science.gov (United States)

    Andersson, J E; Odén, A

    2001-08-01

    The aim of this study was to evaluate the frequency and type of hip-joint instability and the frequency of hip dislocation requiring treatment in neonates who had been lying in the breech presentation and were delivered vaginally after an external version or by caesarean section, and to compare them with neonates who were naturally in the vertex presentation. Breech presentations without ongoing labour were subjected to an attempted external version and, in cases where this proved unsuccessful or where labour had started, to delivery by caesarean section. None of the breech presentations was vaginally delivered. The anterior-dynamic ultrasound method was used to assess the hip-joint status of the neonates. Out of 6,571 foetuses, 257 were in breech presentation after 36 wk of pregnancy. Sixty-two were vaginally delivered following an external version to vertex presentation and 195 were delivered by caesarean section, 75 of these following unsuccessful attempts to perform a version. Treatment for congenital hip-joint dislocation was performed on 0.2%. Out of the breech presentations, 1.0% of those delivered by caesarean section were treated, while in those with vaginal delivery following an external version the treatment frequency was 3.2%. No case of late diagnosed hip dislocation was recorded. Significant differences in frequency of hip-joint instability and treatment were found between (i) neonates delivered in breech presentation and those delivered with vertex presentation, (ii) infants delivered in vertex presentation, naturally or after successful version, and (iii) those delivered by caesarean section with or without attempted external version and those delivered with vertex presentation. Breech presentation predisposes to increased hip instability. The instability is present prior to delivery and is certainly not a primary result of delivery forces. Both breech and vertex presentations following an external or spontaneous version should be considered as risk

  19. Response Surface Modeling Tool Suite, Version 1.x

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-05

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
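    The response-surface workflow described in this record (Latin Hypercube Sampling of the parameter space, an expensive simulation at each sample, and a Gaussian-process fit) can be sketched as follows. The drag function below is a made-up stand-in for the TPMC simulation, and the parameter ranges are illustrative; only the general pipeline is shown.

```python
# Sketch of an RSM pipeline: Latin Hypercube Sampling, a toy drag-coefficient
# evaluation in place of the TPMC simulation, and a Gaussian-process response
# surface. The "physics" below is invented purely for illustration.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# 1. Latin Hypercube Sample over two inputs: speed ratio and surface temperature
sampler = qmc.LatinHypercube(d=2, seed=0)
unit_sample = sampler.random(n=200)
lower, upper = [2.0, 300.0], [10.0, 1500.0]
X = qmc.scale(unit_sample, lower, upper)

def toy_drag_coefficient(x):
    """Hypothetical smooth response standing in for a TPMC drag calculation."""
    s, t_surf = x[:, 0], x[:, 1]
    return 2.2 + 1.5 / s + 1e-4 * t_surf + 0.02 * rng.standard_normal(len(s))

y = toy_drag_coefficient(X)

# 2. Fit a Gaussian-process response surface to the sampled drag coefficients
kernel = RBF(length_scale=[1.0, 100.0]) + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# 3. Interpolate the drag coefficient at a new condition, with uncertainty
x_new = np.array([[6.0, 900.0]])
cd_mean, cd_std = gp.predict(x_new, return_std=True)
print(f"predicted Cd = {cd_mean[0]:.3f} +/- {cd_std[0]:.3f}")
```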

  20. Multilevel joint competing risk models

    Science.gov (United States)

    Karunarathna, G. H. S.; Sooriyarachchi, M. R.

    2017-09-01

    Joint modeling approaches are often encountered for combined competing-risk time-to-event and count outcomes in many biomedical and epidemiological studies in the presence of a cluster effect. Hospital length of stay (LOS) has been a widely used outcome measure of hospital utilization, as it serves as a benchmark measurement for multiple terminations such as discharge, transfer, death and patients who have not completed the event of interest at the end of the follow-up period (censored) during hospitalization. Competing risk models provide a method of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple events. In this study, the concept of joint modeling has been applied to the dengue epidemiology in Sri Lanka, 2006-2008, to assess the relationship between different outcomes of LOS and the platelet count of dengue patients with the district cluster effect. Two key approaches have been applied to build up the joint scenario. In the first approach, each competing risk is modeled separately using the binary logistic model, treating all other events as censored under the multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results compared to fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).

  1. Prevalence, Outcome, and Women's Experiences of External Cephalic Version in a Low-Risk Population

    NARCIS (Netherlands)

    Rijnders, Marlies; Offerhaus, Pien; van Dommelen, Paula; Wiegers, Therese; Buitendijk, Simone

    2010-01-01

    Background: Until recently, external cephalic version to prevent breech presentation at birth was not widely accepted. The objective of our study was to assess the prevalence, outcomes, and women's experiences of external cephalic version to improve the implementation of the procedure in the

  2. Prevalence, outcome, and women's experiences of external cephalic version in a low-risk population

    NARCIS (Netherlands)

    Rijnders, M.; Offerhaus, P.; Dommelen, P. van; Wiegers, T.; Buitendijk, S.

    2010-01-01

    Background: Until recently, external cephalic version to prevent breech presentation at birth was not widely accepted. The objective of our study was to assess the prevalence, outcomes, and women's experiences of external cephalic version to improve the implementation of the procedure in the

  3. Prostate Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  4. Colorectal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Esophageal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Bladder Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  7. Lung Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  8. Breast Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  9. Pancreatic Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  10. Ovarian Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  11. Liver Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  12. Testicular Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing testicular cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  13. Cervical Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  14. Risk modelling in portfolio optimization

    Science.gov (United States)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize the investment risk. The objective of the mean-variance model is to minimize the portfolio risk and achieve the target rate of return. Variance is used as the risk measure in the mean-variance model. The purpose of this study is to compare the portfolio composition as well as performance between the optimal portfolio of the mean-variance model and an equally weighted portfolio. An equally weighted portfolio means that the proportions invested in each asset are equal. The results show that the portfolio composition of the mean-variance optimal portfolio and the equally weighted portfolio are different. Besides that, the mean-variance optimal portfolio gives better performance because it gives a higher performance ratio than the equally weighted portfolio.
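    A compact way to see the comparison described in this record is to compute closed-form minimum-variance weights and contrast them with equal weights on the same return data. The sketch below uses simulated returns and drops the target-return and no-short constraints of the full mean-variance model to keep the solution closed-form; it is an illustration, not the study's data or exact formulation.

```python
# Sketch comparing a minimum-variance portfolio with an equally weighted one.
# Returns are simulated for illustration; this is not real market data.
import numpy as np

rng = np.random.default_rng(7)

n_assets, n_obs = 5, 120
true_mean = np.array([0.010, 0.008, 0.012, 0.006, 0.009])
A = rng.normal(0, 0.03, (n_assets, n_assets))
true_cov = A @ A.T + np.diag(np.full(n_assets, 1e-4))
returns = rng.multivariate_normal(true_mean, true_cov, size=n_obs)

mu = returns.mean(axis=0)
sigma = np.cov(returns, rowvar=False)

# Minimum-variance weights subject only to full investment (weights sum to 1):
# w = Sigma^-1 1 / (1' Sigma^-1 1)
ones = np.ones(n_assets)
inv_sigma = np.linalg.inv(sigma)
w_mv = inv_sigma @ ones / (ones @ inv_sigma @ ones)
w_eq = ones / n_assets

def perf_ratio(w):
    """Mean return divided by standard deviation (a simple performance ratio)."""
    return (w @ mu) / np.sqrt(w @ sigma @ w)

print("minimum-variance weights:", np.round(w_mv, 3))
print("performance ratio (MV):", round(perf_ratio(w_mv), 3))
print("performance ratio (EQ):", round(perf_ratio(w_eq), 3))
```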

  15. Models for Pesticide Risk Assessment

    Science.gov (United States)

    EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environment may be exposed in risk assessment. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.
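    At its simplest, combining a modelled concentration with exposure and toxicity factors gives a risk quotient, as in the minimal sketch below. All numbers are illustrative assumptions, not regulatory values.

```python
# Minimal sketch of an exposure/toxicity comparison: a predicted pesticide
# concentration is turned into a daily dose and compared with a reference dose
# as a risk quotient. All numbers are illustrative assumptions.
predicted_conc_mg_per_l = 0.002        # modelled concentration in drinking water
water_intake_l_per_day = 2.0
body_weight_kg = 70.0
reference_dose_mg_per_kg_day = 0.0001  # hypothetical toxicity benchmark

daily_dose = predicted_conc_mg_per_l * water_intake_l_per_day / body_weight_kg
risk_quotient = daily_dose / reference_dose_mg_per_kg_day

print(f"estimated dose: {daily_dose:.2e} mg/kg/day")
print(f"risk quotient:  {risk_quotient:.2f} (values above 1 flag potential concern)")
```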

  16. Technical Note: Description and assessment of a nudged version of the new dynamics Unified Model

    Directory of Open Access Journals (Sweden)

    O. Morgenstern

    2008-03-01

    We present a "nudged" version of the Met Office general circulation model, the Unified Model. We constrain this global climate model using ERA-40 re-analysis data with the aim of reproducing the observed "weather" over a year from September 1999. Quantitative assessments are made of its performance, focusing on dynamical aspects of nudging and demonstrating that the "weather" is well simulated.
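    Nudging is Newtonian relaxation of a model state toward analysis data, which the toy single-variable sketch below illustrates. The dynamics, relaxation time scale and "analysis" series are invented for illustration; this is not the Unified Model's implementation.

```python
# Sketch of Newtonian relaxation ("nudging") of a model state toward reanalysis.
# The toy physics and the relaxation time scale are illustrative assumptions.
import numpy as np

dt = 3600.0          # time step [s]
tau = 6 * 3600.0     # relaxation (nudging) time scale [s]
n_steps = 240

rng = np.random.default_rng(1)
x_model = 280.0                                                   # model state, e.g. temperature [K]
x_analysis = 285.0 + 2.0 * np.sin(np.arange(n_steps) * 2 * np.pi / 24)  # stand-in "reanalysis"

for k in range(n_steps):
    physics_tendency = 0.1 * rng.standard_normal() / dt   # K/s, stand-in for model physics
    nudging_tendency = -(x_model - x_analysis[k]) / tau    # K/s, relaxation toward analysis
    x_model += dt * (physics_tendency + nudging_tendency)

print(f"final model state: {x_model:.2f} K, final analysis: {x_analysis[-1]:.2f} K")
```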

  17. Site investigation SFR. Hydrogeological modelling of SFR. Model version 0.2

    Energy Technology Data Exchange (ETDEWEB)

    Oehman, Johan (Golder Associates AB (Sweden)); Follin, Sven (SF GeoLogic (Sweden))

    2010-01-15

    The Swedish Nuclear Fuel and Waste Management Company (SKB) has conducted site investigations for a planned extension of the existing final repository for short-lived radioactive waste (SFR). A hydrogeological model is developed in three model versions, which will be used for safety assessment and design analyses. This report presents a data analysis of the currently available hydrogeological data from the ongoing Site Investigation SFR (KFR27, KFR101, KFR102A, KFR102B, KFR103, KFR104, and KFR105). The purpose of this work is to develop a preliminary hydrogeological Discrete Fracture Network model (hydro-DFN) parameterisation that can be applied in regional-scale modelling. During this work, the Geologic model had not yet been updated for the new data set. Therefore, all analyses were made to the rock mass outside Possible Deformation Zones, according to Single Hole Interpretation. Owing to this circumstance, it was decided not to perform a complete hydro-DFN calibration at this stage. Instead focus was re-directed to preparatory test cases and conceptual questions with the aim to provide a sound strategy for developing the hydrogeological model SFR v. 1.0. The presented preliminary hydro-DFN consists of five fracture sets and three depth domains. A statistical/geometrical approach (connectivity analysis /Follin et al. 2005/) was performed to estimate the size (i.e. fracture radius) distribution of fractures that are interpreted as Open in geologic mapping of core data. Transmissivity relations were established based on an assumption of a correlation between the size and evaluated specific capacity of geologic features coupled to inflows measured by the Posiva Flow Log device (PFL-f data). The preliminary hydro-DFN was applied in flow simulations in order to test its performance and to explore the role of PFL-f data. Several insights were gained and a few model technical issues were raised. These are summarised in Table 5-1

  18. Site investigation SFR. Hydrogeological modelling of SFR. Model version 0.2

    International Nuclear Information System (INIS)

    Oehman, Johan; Follin, Sven

    2010-01-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) has conducted site investigations for a planned extension of the existing final repository for short-lived radioactive waste (SFR). A hydrogeological model is developed in three model versions, which will be used for safety assessment and design analyses. This report presents a data analysis of the currently available hydrogeological data from the ongoing Site Investigation SFR (KFR27, KFR101, KFR102A, KFR102B, KFR103, KFR104, and KFR105). The purpose of this work is to develop a preliminary hydrogeological Discrete Fracture Network model (hydro-DFN) parameterisation that can be applied in regional-scale modelling. During this work, the Geologic model had not yet been updated for the new data set. Therefore, all analyses were made to the rock mass outside Possible Deformation Zones, according to Single Hole Interpretation. Owing to this circumstance, it was decided not to perform a complete hydro-DFN calibration at this stage. Instead focus was re-directed to preparatory test cases and conceptual questions with the aim to provide a sound strategy for developing the hydrogeological model SFR v. 1.0. The presented preliminary hydro-DFN consists of five fracture sets and three depth domains. A statistical/geometrical approach (connectivity analysis /Follin et al. 2005/) was performed to estimate the size (i.e. fracture radius) distribution of fractures that are interpreted as Open in geologic mapping of core data. Transmissivity relations were established based on an assumption of a correlation between the size and evaluated specific capacity of geologic features coupled to inflows measured by the Posiva Flow Log device (PFL-f data). The preliminary hydro-DFN was applied in flow simulations in order to test its performance and to explore the role of PFL-f data. Several insights were gained and a few model technical issues were raised. These are summarised in Table 5-1
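    Two ingredients of the hydro-DFN parameterisation described in these records, a power-law fracture size distribution and a size-correlated transmissivity relation, can be sketched as below. The exponents, coefficients and scatter are illustrative assumptions, not the calibrated SFR parameters.

```python
# Sketch of two hydro-DFN ingredients: power-law (Pareto) fracture radii and a
# size-correlated transmissivity model T = a * r^b with lognormal scatter.
# All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)

# Power-law fracture radii: P(R > r) = (r0 / r)^k_r for r >= r0
r0, k_r, n_fractures = 0.5, 2.6, 5000           # minimum radius [m], shape, count
u = rng.random(n_fractures)
radii = r0 * u ** (-1.0 / k_r)                   # inverse-CDF sampling

# Size-correlated transmissivity with lognormal scatter around the trend
a, b = 1e-9, 1.2                                 # coefficient [m^2/s], exponent
scatter = rng.lognormal(mean=0.0, sigma=0.5, size=n_fractures)
transmissivity = a * radii ** b * scatter        # [m^2/s]

print(f"median radius: {np.median(radii):.2f} m")
print(f"geometric-mean T: {np.exp(np.mean(np.log(transmissivity))):.2e} m^2/s")
```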

  19. A new version of code Java for 3D simulation of the CCA model

    Science.gov (United States)

    Zhang, Kebo; Xiong, Hailing; Li, Chao

    2016-07-01

    In this paper we present a new version of the program for the CCA model. In order to benefit from the advantages of the latest technologies, we migrated the running environment from JDK1.6 to JDK1.7. The old program was also optimized into a new framework, which promotes extendibility.

  20. Inter-rater reliability of the German version of the Nurses' Global Assessment of Suicide Risk scale.

    Science.gov (United States)

    Kozel, Bernd; Grieser, Manuela; Abderhalden, Christoph; Cutcliffe, John R

    2016-10-01

    In comparison to the general population, the suicide rates of psychiatric inpatient populations in Germany and Switzerland are very high. An important preventive contribution to the lowering of the suicide rates in mental health care is to ensure that the risk of suicide of psychiatric inpatients is assessed as accurately as possible. While risk-assessment instruments can serve an important function in determining such risk, very few have been translated into German. Therefore, in the present study, we report on the German version of the Nurses' Global Assessment of Suicide Risk (NGASR) scale. After translating the original instrument into German and pretesting the German version, we tested the inter-rater reliability of the instrument. Twelve video case studies were evaluated by 13 raters with the NGASR scale in a 'laboratory' trial. In each case, the observers' agreement was calculated for the single items, the overall scale, the risk levels, and the sum scores. The statistical data analysis was conducted with kappa and AC1 statistics for dichotomous (items, scale) scales. A high-to-very high observers' agreement (AC1: 0.62-1.00, kappa: 0.00-1.00) was determined for 16 items of the German version of the NGASR scale. We conclude that the German version of the NGASR scale is a reliable instrument for evaluating risk factors for suicide. A reliable application in clinical practice appears to be enhanced by training in the use of the instrument and the right implementation instructions. © 2016 Australian College of Mental Health Nurses Inc.
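    The core agreement statistics used in this record, percentage agreement and Cohen's kappa for dichotomous items, can be computed as in the sketch below. The ratings are made-up illustration data, not the study's results.

```python
# Minimal sketch of per-item percentage agreement and Cohen's kappa for two
# raters scoring dichotomous items. The ratings below are illustrative only.
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters and binary (0/1) ratings."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                       # observed agreement
    p1, p2 = r1.mean(), r2.mean()                # marginal "1" proportions
    pe = p1 * p2 + (1 - p1) * (1 - p2)           # chance agreement
    return (po - pe) / (1 - pe) if pe < 1 else 1.0

# Hypothetical ratings of one item across 12 video case studies by two raters
rater_a = [1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1]
rater_b = [1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0, 1]

agreement = np.mean(np.array(rater_a) == np.array(rater_b))
print(f"percentage agreement: {agreement:.1%}")
print(f"Cohen's kappa:        {cohens_kappa(rater_a, rater_b):.3f}")
```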

  1. RAMS Model for Terrestrial Pathways Version 3. 0 (for microcomputers). Model-Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Niebla, E.

    1989-01-01

    The RAMS Model for Terrestrial Pathways is a computer program for calculation of numeric criteria for land application and distribution and marketing of sludges under the sewage-sludge regulations at 40 CFR Part 503. The risk-assessment models covered assume that municipal sludge with specified characteristics is spread across a defined area of ground at a known rate once each year for a given number of years. Risks associated with direct land application and with sludge applied after distribution and marketing are both calculated. The computer program calculates the maximum annual loading of contaminants that can be land applied and still meet the risk criteria specified as input. Software Description: The program is written in the Turbo/Basic programming language for implementation on IBM PC/AT or compatible machines using the DOS 3.0 or higher operating system. Minimum core storage is 512K.
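    A heavily simplified back-of-the-envelope version of the kind of calculation such a program automates is sketched below: the annual contaminant loading that keeps the cumulative soil concentration under a risk-based limit over the application period. All values are hypothetical and are not taken from 40 CFR Part 503 or from RAMS.

```python
# Simplified land-application loading sketch: find the annual loading that keeps
# cumulative soil concentration below a risk-based limit (losses/decay ignored).
# All values are hypothetical assumptions.
soil_limit_mg_per_kg = 40.0       # assumed risk-based soil concentration criterion
background_mg_per_kg = 5.0        # existing soil concentration
mixing_depth_m = 0.15             # plough layer
bulk_density_kg_per_m3 = 1300.0
application_years = 20

# Mass of soil per hectare in the mixing zone
soil_mass_kg_per_ha = mixing_depth_m * 10_000.0 * bulk_density_kg_per_m3

# Allowable cumulative and annual contaminant loadings
allowable_cumulative_kg_per_ha = (
    (soil_limit_mg_per_kg - background_mg_per_kg) * soil_mass_kg_per_ha / 1e6
)
allowable_annual_kg_per_ha = allowable_cumulative_kg_per_ha / application_years

print(f"allowable cumulative loading: {allowable_cumulative_kg_per_ha:.1f} kg/ha")
print(f"allowable annual loading:     {allowable_annual_kg_per_ha:.2f} kg/ha/yr")
```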

  2. Modeling renewable energy company risk

    International Nuclear Information System (INIS)

    Sadorsky, Perry

    2012-01-01

    The renewable energy sector is one of the fastest growing components of the energy industry and along with this increased demand for renewable energy there has been an increase in investing and financing activities. The tradeoff between risk and return in the renewable energy sector is, however, precarious. Renewable energy companies are often among the riskiest types of companies to invest in and for this reason it is necessary to have a good understanding of the risk factors. This paper uses a variable beta model to investigate the determinants of renewable energy company risk. The empirical results show that company sales growth has a negative impact on company risk while oil price increases have a positive impact on company risk. When oil price returns are positive and moderate, increases in sales growth can offset the impact of oil price returns and this leads to lower systematic risk.
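    The variable-beta idea in this record (systematic risk that shifts with sales growth and oil price returns) can be illustrated with an interaction-term regression on simulated data, as below. The data-generating coefficients are invented and are not the paper's estimates.

```python
# Sketch of a variable-beta market model:
# r_i = a + (b0 + b1*sales_growth + b2*oil_return) * r_m + e.
# Data are simulated; coefficients are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(11)
n = 500

r_m = rng.normal(0.005, 0.04, n)          # market return
sales_growth = rng.normal(0.02, 0.05, n)  # company sales growth
oil_return = rng.normal(0.0, 0.06, n)     # oil price return

# Assumed data-generating process: risk falls with sales growth, rises with oil returns
beta = 1.2 - 2.0 * sales_growth + 1.5 * oil_return
r_i = 0.001 + beta * r_m + rng.normal(0, 0.02, n)

# Estimate by OLS with interaction terms: regress r_i on [1, r_m, sales*r_m, oil*r_m]
X = np.column_stack([np.ones(n), r_m, sales_growth * r_m, oil_return * r_m])
coef, *_ = np.linalg.lstsq(X, r_i, rcond=None)
a_hat, b0_hat, b1_hat, b2_hat = coef

print(f"baseline beta b0: {b0_hat:.2f}")
print(f"sales-growth effect on beta b1: {b1_hat:.2f} (negative => lower risk)")
print(f"oil-return effect on beta b2:   {b2_hat:.2f} (positive => higher risk)")
```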

  3. User's guide to the Yucca Mountain Integrating Model (YMIM) Version 2.1

    International Nuclear Information System (INIS)

    Gansemer, J.; Lamont, A.

    1995-04-01

    The Yucca Mountain Integrating Model (YMIM) is an integrated model of the engineered barrier system. It contains models of the processes of waste container failure and nuclide release from the fuel rods. YMIM is driven by scenarios of container and rod temperature, near-field chemistry, and near-field hydrology provided by other modules. It is designed to be highly modular so that a model of an individual process can be easily modified or replaced without interfering with the models of other processes. This manual describes the process models and provides instructions for setting up and running YMIM Version 2.1.

  4. The Lagrangian particle dispersion model FLEXPART-WRF VERSION 3.1

    Energy Technology Data Exchange (ETDEWEB)

    Brioude, J.; Arnold, D.; Stohl, A.; Cassiani, M.; Morton, Don; Seibert, P.; Angevine, W. M.; Evan, S.; Dingwell, A.; Fast, Jerome D.; Easter, Richard C.; Pisso, I.; Bukhart, J.; Wotawa, G.

    2013-11-01

    The Lagrangian particle dispersion model FLEXPART was originally designed for calculating long-range and mesoscale dispersion of air pollutants from point sources, such as after an accident in a nuclear power plant. In the meantime FLEXPART has evolved into a comprehensive tool for atmospheric transport modeling and analysis at different scales. This multiscale need from the modeler community has encouraged new developments in FLEXPART. In this document, we present a version that works with the Weather Research and Forecasting (WRF) mesoscale meteorological model. Simple procedures on how to run FLEXPART-WRF are presented along with special options and features that differ from its predecessor versions. In addition, test case data, the source code and visualization tools are provided to the reader as supplementary material.
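    The model class itself (advecting many particles with a mean wind plus a random-walk turbulence term) is simple to sketch, as below. This is a generic illustration of Lagrangian particle dispersion, not FLEXPART or its physics, and all numbers are illustrative.

```python
# Minimal 2-D Lagrangian particle dispersion sketch: particles released from a
# point source are advected by a mean wind and spread by random-walk turbulence.
import numpy as np

rng = np.random.default_rng(5)

n_particles, n_steps, dt = 2000, 360, 10.0       # particles, steps, seconds
u_mean, v_mean = 5.0, 0.5                        # mean wind components [m/s]
sigma_turb = 1.0                                 # turbulent velocity scale [m/s]

x = np.zeros(n_particles)                        # release at the origin
y = np.zeros(n_particles)

for _ in range(n_steps):
    x += (u_mean + sigma_turb * rng.standard_normal(n_particles)) * dt
    y += (v_mean + sigma_turb * rng.standard_normal(n_particles)) * dt

# Crude concentration proxy: particle count inside a downwind sampling box
in_box = (x > 17_000) & (x < 19_000) & (np.abs(y - 1_800) < 500)
print(f"plume centre roughly at x={x.mean():.0f} m, y={y.mean():.0f} m")
print(f"particles in sample box: {in_box.sum()} of {n_particles}")
```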

  5. Towards New Empirical Versions of Financial and Accounting Models Corrected for Measurement Errors

    OpenAIRE

    Francois-Éric Racicot; Raymond Théoret; Alain Coen

    2006-01-01

    In this paper, we propose a new empirical version of the Fama and French Model based on the Hausman (1978) specification test and aimed at discarding measurement errors in the variables. The proposed empirical framework is general enough to be used for correcting other financial and accounting models of measurement errors. Removing measurement errors is important at many levels, such as information disclosure, corporate governance and the protection of investors.
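    The basic idea of a Hausman-type comparison between OLS and an instrumental-variable estimator can be sketched on simulated single-regressor data, as below. This does not reproduce the paper's multi-factor Fama-French setting; the instrument and all parameters are illustrative assumptions.

```python
# Sketch of a Hausman-type test for errors-in-variables: compare OLS with a
# just-identified IV estimator on simulated data with measurement error.
import numpy as np

rng = np.random.default_rng(8)
n = 2000

x_true = rng.normal(0, 1, n)
z = x_true + rng.normal(0, 0.5, n)        # instrument: related to x_true only
x_obs = x_true + rng.normal(0, 0.7, n)    # observed regressor with measurement error
y = 1.0 + 2.0 * x_true + rng.normal(0, 1, n)

def with_const(v):
    return np.column_stack([np.ones(len(v)), v])

# OLS (inconsistent under measurement error)
X = with_const(x_obs)
b_ols = np.linalg.solve(X.T @ X, X.T @ y)
res_ols = y - X @ b_ols
v_ols = (res_ols @ res_ols / (n - 2)) * np.linalg.inv(X.T @ X)

# Simple IV / 2SLS with instrument z (consistent)
Z = with_const(z)
b_iv = np.linalg.solve(Z.T @ X, Z.T @ y)
res_iv = y - X @ b_iv
v_iv = (res_iv @ res_iv / (n - 2)) * np.linalg.inv(Z.T @ X) @ (Z.T @ Z) @ np.linalg.inv(X.T @ Z)

# Hausman statistic on the slope coefficient
diff = b_iv[1] - b_ols[1]
h_stat = diff**2 / (v_iv[1, 1] - v_ols[1, 1])
print(f"OLS slope {b_ols[1]:.3f}, IV slope {b_iv[1]:.3f}, Hausman stat {h_stat:.1f}")
print("a large statistic (vs chi-square, 1 df) points to measurement error in x")
```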

  6. Description and evaluation of the Model for Ozone and Related chemical Tracers, version 4 (MOZART-4)

    Directory of Open Access Journals (Sweden)

    L. K. Emmons

    2010-01-01

    The Model for Ozone and Related chemical Tracers, version 4 (MOZART-4) is an offline global chemical transport model particularly suited for studies of the troposphere. The updates of the model from its previous version MOZART-2 are described, including an expansion of the chemical mechanism to include more detailed hydrocarbon chemistry and bulk aerosols. Online calculations of a number of processes, such as dry deposition, emissions of isoprene and monoterpenes and photolysis frequencies, are now included. Results from an eight-year simulation (2000–2007) are presented and evaluated. The MOZART-4 source code and standard input files are available for download from the NCAR Community Data Portal (http://cdp.ucar.edu).

  7. A one-dimensional material transfer model for HECTR version 1.5

    International Nuclear Information System (INIS)

    Geller, A.S.; Wong, C.C.

    1991-08-01

    HECTR (Hydrogen Event Containment Transient Response) is a lumped-parameter computer code developed for calculating the pressure-temperature response to combustion in a nuclear power plant containment building. The code uses a control-volume approach and subscale models to simulate the mass, momentum, and energy transfer occurring in the containment during a loss-of-coolant accident (LOCA). This document describes one-dimensional subscale models for mass and momentum transfer, and the modifications to the code required to implement them. Two problems were analyzed: the first corresponding to a standard problem studied with previous HECTR versions, the second to experiments. The performance of the revised code relative to previous HECTR versions is discussed, as is the ability of the code to model the experiments. 8 refs., 5 figs., 3 tabs.
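    The lumped-parameter, control-volume idea can be illustrated with a toy two-compartment exchange, as sketched below. This is a generic illustration only; the geometry, flow coefficient and time stepping are assumptions and do not correspond to HECTR's models or inputs.

```python
# Minimal lumped-parameter sketch: two well-mixed volumes exchanging gas through
# a junction, integrated with explicit Euler steps. Values are illustrative.
V1, V2 = 500.0, 2000.0         # compartment volumes [m^3]
k_flow = 2.5                   # junction volumetric exchange rate [m^3/s]
dt, t_end = 0.5, 600.0         # time step and duration [s]

c1, c2 = 0.08, 0.0             # initial hydrogen mole fractions (well mixed)

t = 0.0
while t < t_end:
    # Volume of gas exchanged through the junction this step, driven by the difference
    transfer = k_flow * (c1 - c2) * dt
    c1 -= transfer / V1
    c2 += transfer / V2
    t += dt

equilibrium = (0.08 * V1) / (V1 + V2)
print(f"c1={c1:.4f}, c2={c2:.4f}, well-mixed equilibrium={equilibrium:.4f}")
```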

  8. Information risk and security modeling

    Science.gov (United States)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines and NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and of architectural guidelines such as ISO7498-2, will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics the research presentation will explore the appropriate usage of these standards. The paper will discuss the standards' approaches to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed approach, a 3D visual presentation and development of the Information Security Model, will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and the defined risk and security space for modeling and measuring.

  9. Cabin Environment Physics Risk Model

    Science.gov (United States)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
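    A back-of-the-envelope version of the "propagation time" idea is sketched below: how long after loss of CO2 removal does the cabin reach an assumed hazard threshold? Cabin volume, crew size, generation rate and limits are illustrative numbers, not the CEPR model's inputs.

```python
# Back-of-the-envelope time-to-hazard estimate after a CO2-removal failure.
# All values are illustrative assumptions; pressure/volume effects are ignored.
cabin_volume_m3 = 10.0
crew = 4
co2_generation_m3_per_person_hr = 0.02     # assumed metabolic CO2 generation
initial_co2_fraction = 0.004               # 0.4 % by volume
hazard_co2_fraction = 0.02                 # 2 % by volume (assumed abort limit)

# Volume fraction rises roughly linearly while removal is lost
rate_fraction_per_hr = crew * co2_generation_m3_per_person_hr / cabin_volume_m3
time_to_hazard_hr = (hazard_co2_fraction - initial_co2_fraction) / rate_fraction_per_hr

print(f"CO2 rise rate: {rate_fraction_per_hr:.4f} fraction/hr")
print(f"time from ECLSS failure to hazardous cabin: {time_to_hazard_hr:.1f} hr")
```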

  10. The Hamburg Oceanic Carbon Cycle Circulation Model. Version 1. Version 'HAMOCC2s' for long time integrations

    Energy Technology Data Exchange (ETDEWEB)

    Heinze, C.; Maier-Reimer, E. [Max-Planck-Institut fuer Meteorologie, Hamburg (Germany)

    1999-11-01

    The Hamburg Ocean Carbon Cycle Circulation Model (HAMOCC, configuration HAMOCC2s) predicts the atmospheric carbon dioxide partial pressure (as induced by oceanic processes), production rates of biogenic particulate matter, and geochemical tracer distributions in the water column as well as the bioturbated sediment. Besides the carbon cycle this model version includes also the marine silicon cycle (silicic acid in the water column and the sediment pore waters, biological opal production, opal flux through the water column and opal sediment pore water interaction). The model is based on the grid and geometry of the LSG ocean general circulation model (see the corresponding manual, LSG=Large Scale Geostrophic) and uses a velocity field provided by the LSG-model in 'frozen' state. In contrast to the earlier version of the model (see Report No. 5), the present version includes a multi-layer sediment model of the bioturbated sediment zone, allowing for variable tracer inventories within the complete model system. (orig.)

  11. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  12. Digital elevation models for site investigation programme in Oskarshamn. Site description version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Brydsten, Lars; Stroemgren, Maarten [Umeaa Univ. (Sweden). Dept. of Biology and Environmental Science]

    2005-06-01

    In the Oskarshamn area, a digital elevation model has been produced using elevation data from many elevation sources on both land and sea. Many elevation model users are only interested in elevation models over land, so the model has been designed in three versions: Version 1 describes land surface, lake water surface, and sea bottom. Version 2 describes land surface, sediment levels at lake bottoms, and sea bottoms. Version 3 describes land surface, sediment levels at lake bottoms, and sea surface. In cases where the different sources of data were not in point form (such as existing elevation models of land or depth lines from nautical charts), they have been converted to point values using GIS software. Because data from some sources often overlaps with data from other sources, several tests were conducted to determine if both sources of data or only one source would be included in the dataset used for the interpolation procedure. The tests resulted in the decision to use only the source judged to be of highest quality for most areas with overlapping data sources. All data were combined into a database of approximately 3.3 million points unevenly spread over an area of about 800 km². The large number of data points made it difficult to construct the model with a single interpolation procedure, so the area was divided into 28 sub-models that were processed one by one and finally merged together into one single model. The software ArcGis 8.3 and its extension Geostatistical Analysis were used for the interpolation. The Ordinary Kriging method was used for interpolation. This method allows both a cross validation and a validation before the interpolation is conducted. Cross validation with different Kriging parameters was performed and the model with the most reasonable statistics was chosen. Finally, a validation with the most appropriate Kriging parameters was performed in order to verify that the model fit unmeasured localities. Since both the

  13. Digital elevation models for site investigation programme in Oskarshamn. Site description version 1.2

    International Nuclear Information System (INIS)

    Brydsten, Lars; Stroemgren, Maarten

    2005-06-01

    In the Oskarshamn area, a digital elevation model has been produced using elevation data from many elevation sources on both land and sea. Many elevation model users are only interested in elevation models over land, so the model has been designed in three versions: Version 1 describes land surface, lake water surface, and sea bottom. Version 2 describes land surface, sediment levels at lake bottoms, and sea bottoms. Version 3 describes land surface, sediment levels at lake bottoms, and sea surface. In cases where the different sources of data were not in point form (such as existing elevation models of land or depth lines from nautical charts), they have been converted to point values using GIS software. Because data from some sources often overlaps with data from other sources, several tests were conducted to determine if both sources of data or only one source would be included in the dataset used for the interpolation procedure. The tests resulted in the decision to use only the source judged to be of highest quality for most areas with overlapping data sources. All data were combined into a database of approximately 3.3 million points unevenly spread over an area of about 800 km². The large number of data points made it difficult to construct the model with a single interpolation procedure, so the area was divided into 28 sub-models that were processed one by one and finally merged together into one single model. The software ArcGis 8.3 and its extension Geostatistical Analysis were used for the interpolation. The Ordinary Kriging method was used for interpolation. This method allows both a cross validation and a validation before the interpolation is conducted. Cross validation with different Kriging parameters was performed and the model with the most reasonable statistics was chosen. Finally, a validation with the most appropriate Kriging parameters was performed in order to verify that the model fit unmeasured localities. Since both the quality and the
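    The interpolation step (ordinary kriging of scattered elevation points onto a regular grid, with a held-out validation) can be sketched as below. This assumes the pykrige package (pykrige.ok.OrdinaryKriging) rather than the ArcGIS tools used in the report, and the "survey" points are synthetic.

```python
# Sketch of ordinary kriging of scattered elevations onto a regular grid using
# pykrige (assumed available). The elevation data below are synthetic.
import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(2)

# Synthetic scattered elevation observations over a 1 km x 1 km tile
x = rng.uniform(0, 1000, 400)
y = rng.uniform(0, 1000, 400)
z = 10.0 + 0.01 * x - 0.005 * y + 2.0 * np.sin(x / 150.0) + rng.normal(0, 0.3, 400)

# Ordinary kriging with a spherical variogram, cross-checked on held-out points
ok = OrdinaryKriging(x[:350], y[:350], z[:350], variogram_model="spherical")
z_pred, z_var = ok.execute("points", x[350:], y[350:])
rmse = float(np.sqrt(np.mean((np.asarray(z_pred) - z[350:]) ** 2)))
print(f"validation RMSE on 50 held-out points: {rmse:.2f} m")

# Interpolate onto a regular 20 m grid (the digital elevation model surface)
gridx = np.arange(0.0, 1000.0, 20.0)
gridy = np.arange(0.0, 1000.0, 20.0)
dem, dem_var = ok.execute("grid", gridx, gridy)
print("DEM grid shape:", dem.shape)
```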

  14. Fuzzy audit risk modeling algorithm

    Directory of Open Access Journals (Sweden)

    Zohreh Hajihaa

    2011-07-01

    Fuzzy logic has created suitable mathematics for making decisions in uncertain environments, including professional judgments. One such situation is the assessment of auditee risks. During recent years, risk based audit (RBA) has been regarded as one of the main tools to fight against fraud. The main issue in RBA is to determine the overall audit risk an auditor accepts, which impacts the efficiency of an audit. The primary objective of this research is to redesign the audit risk model (ARM) proposed by auditing standards. The proposed model of this paper uses fuzzy inference systems (FIS) based on the judgments of audit experts. The implementation of the proposed fuzzy technique uses triangular fuzzy numbers to express the inputs, and the Mamdani method along with center of gravity is incorporated for defuzzification. The proposed model uses three FISs for audit, inherent and control risks, and there are five levels of linguistic variables for outputs. The FISs include 25, 25 and 81 if-then rules, respectively, and official Iranian audit experts confirm all the rules.
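    The Mamdani machinery the record relies on (triangular membership functions, min-based rule firing and centre-of-gravity defuzzification) is sketched below in plain numpy. The two toy rules and all membership parameters are illustrative assumptions; the paper's FISs use 25/25/81 expert rules.

```python
# Plain-numpy sketch of a tiny Mamdani fuzzy inference step with triangular
# membership functions and centroid defuzzification. Rules are illustrative.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

# Universe of discourse for the output "audit risk" (0 = low, 1 = high)
risk_x = np.linspace(0.0, 1.0, 501)
risk_low = tri(risk_x, -0.5, 0.0, 0.5)
risk_high = tri(risk_x, 0.5, 1.0, 1.5)

# Crisp inputs (judgments scaled to [0, 1]): inherent risk and control weakness
inherent, control_weak = 0.7, 0.6

# Memberships of the inputs in "high" fuzzy sets (single set per input for brevity)
mu_inherent_high = tri(inherent, 0.4, 1.0, 1.6)
mu_control_weak_high = tri(control_weak, 0.4, 1.0, 1.6)

# Rule 1: IF inherent risk is high AND controls are weak THEN risk is high (min AND)
fire_high = min(mu_inherent_high, mu_control_weak_high)
# Rule 2: IF inherent risk is NOT high THEN risk is low
fire_low = 1.0 - mu_inherent_high

# Mamdani implication (clip), aggregation (max), then centroid defuzzification
aggregated = np.maximum(np.minimum(risk_high, fire_high),
                        np.minimum(risk_low, fire_low))
crisp_risk = np.sum(risk_x * aggregated) / np.sum(aggregated)
print(f"defuzzified audit risk: {crisp_risk:.2f}")
```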

  15. Thermal modelling. Preliminary site description. Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Back, Paer-Erik; Bengtsson, Anna; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)

    2005-08-01

    This report presents the thermal site descriptive model for the Forsmark area, version 1.2. The main objective of this report is to present the thermal modelling work where data has been identified, quality controlled, evaluated and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at canister scale has been modelled for two different lithological domains (RFM029 and RFM012, both dominated by granite to granodiorite (101057)). A main modelling approach has been used to determine the mean value of the thermal conductivity. Two alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological model for the Forsmark area, version 1.2, together with rock type models constituted from measured and calculated (from mineral composition) thermal conductivities. Results indicate that the mean of thermal conductivity is expected to exhibit a small variation between the different domains, 3.46 W/(m·K) for RFM012 to 3.55 W/(m·K) for RFM029. The spatial distribution of the thermal conductivity does not follow a simple model. Lower and upper 95% confidence limits are based on the modelling results, but have been rounded off to only two significant figures. Consequently, the lower limit is 2.9 W/(m·K), while the upper is 3.8 W/(m·K). This is applicable to both the investigated domains. The temperature dependence is rather small, with a decrease in thermal conductivity of 10.0% per 100°C increase in temperature for the dominating rock type. There are a number of important uncertainties associated with these results. One of the uncertainties concerns the representative scale for the canister. Another important uncertainty is the methodological uncertainties associated with the upscaling of thermal conductivity from cm-scale to canister scale. In addition, the representativeness of rock samples is

  16. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    Energy Technology Data Exchange (ETDEWEB)

    Back, Paer-Erik; Sundberg, Jan [Geo Innova AB (Sweden)

    2007-09-15

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for the final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be

  17. Thermal site descriptive model. A strategy for the model development during site investigations - version 2

    International Nuclear Information System (INIS)

    Back, Paer-Erik; Sundberg, Jan

    2007-09-01

    This report presents a strategy for describing, predicting and visualising the thermal aspects of the site descriptive model. The strategy is an updated version of an earlier strategy applied in all SDM versions during the initial site investigation phase at the Forsmark and Oskarshamn areas. The previous methodology for thermal modelling did not take the spatial correlation fully into account during simulation. The result was that the variability of thermal conductivity in the rock mass was not sufficiently well described. Experience from earlier thermal SDMs indicated that development of the methodology was required in order to describe the spatial distribution of thermal conductivity in the rock mass in a sufficiently reliable way, taking both variability within rock types and between rock types into account. A good description of the thermal conductivity distribution is especially important for the lower tail. This tail is important for the design of a repository because it affects the canister spacing. The presented approach is developed to be used for the final SDM regarding thermal properties, primarily thermal conductivity. Specific objectives for the strategy of thermal stochastic modelling are: Description: statistical description of the thermal conductivity of a rock domain. Prediction: prediction of thermal conductivity in a specific rock volume. Visualisation: visualisation of the spatial distribution of thermal conductivity. The thermal site descriptive model should include the temperature distribution and thermal properties of the rock mass. The temperature is the result of the thermal processes in the repository area. Determination of thermal transport properties can be made using different methods, such as laboratory investigations, field measurements, modelling from mineralogical composition and distribution, modelling from density logging and modelling from temperature logging. The different types of data represent different scales, which has to be
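    The stochastic upscaling idea behind these records can be illustrated with a simple Monte-Carlo scheme: sample small-scale conductivities per rock type, mix rock types by assumed proportions, and upscale to a larger volume, here with a geometric mean. All distribution parameters and proportions are illustrative assumptions, not SDM values, and the geometric mean is only one possible upscaling rule.

```python
# Sketch of Monte-Carlo upscaling of thermal conductivity from cm scale to a
# larger (canister-like) scale. All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(4)

N_VOLUMES = 10_000       # simulated larger-scale volumes
N_SUBCELLS = 200         # cm-scale samples per volume

# Hypothetical rock types: (proportion, mean W/(m*K), std W/(m*K))
rock_types = [(0.80, 3.6, 0.25),   # dominant granitoid
              (0.15, 2.9, 0.30),   # subordinate gneiss
              (0.05, 2.4, 0.35)]   # minor mafic rock

proportions = np.array([p for p, _, _ in rock_types])
means = np.array([m for _, m, _ in rock_types])
stds = np.array([s for _, _, s in rock_types])

upscaled = np.empty(N_VOLUMES)
for i in range(N_VOLUMES):
    # Draw a rock type for each sub-cell, then its cm-scale conductivity
    idx = rng.choice(len(rock_types), size=N_SUBCELLS, p=proportions)
    k_cm = rng.normal(means[idx], stds[idx])
    k_cm = np.clip(k_cm, 0.5, None)                   # keep conductivities physical
    upscaled[i] = np.exp(np.mean(np.log(k_cm)))       # geometric-mean upscaling

print(f"mean upscaled conductivity: {upscaled.mean():.2f} W/(m*K)")
print(f"lower 2.5% tail (design-relevant): {np.percentile(upscaled, 2.5):.2f} W/(m*K)")
```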

  18. COMODI: an ontology to characterise differences in versions of computational models in biology.

    Science.gov (United States)

    Scharm, Martin; Waltemath, Dagmar; Mendes, Pedro; Wolkenhauer, Olaf

    2016-07-11

    Open model repositories provide ready-to-reuse computational models of biological systems. Models within those repositories evolve over time, leading to different model versions. Taken together, the underlying changes reflect a model's provenance and thus can give valuable insights into the studied biology. Currently, however, changes cannot be semantically interpreted. To improve this situation, we developed an ontology of terms describing changes in models. The ontology can be used by scientists and within software to characterise model updates at the level of single changes. When studying or reusing a model, these annotations help with determining the relevance of a change in a given context. We manually studied changes in selected models from BioModels and the Physiome Model Repository. Using the BiVeS tool for difference detection, we then performed an automatic analysis of changes in all models published in these repositories. The resulting set of concepts led us to define candidate terms for the ontology. In a final step, we aggregated and classified these terms and built the first version of the ontology. We present COMODI, an ontology needed because COmputational MOdels DIffer. It empowers users and software to describe changes in a model on the semantic level. COMODI also enables software to implement user-specific filter options for the display of model changes. Finally, COMODI is a step towards predicting how a change in a model influences the simulation results. COMODI, coupled with our algorithm for difference detection, ensures the transparency of a model's evolution, and it enhances the traceability of updates and error corrections. COMODI is encoded in OWL. It is openly available at http://comodi.sems.uni-rostock.de/ .
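
    As a rough illustration of how such change annotations could be produced in software, the sketch below records one model change as RDF triples with rdflib. The namespace URI and the term names (Update, affects, hasReason, ModelCuration) are placeholders assumed for the example; the actual COMODI identifiers should be taken from the ontology itself.

      from rdflib import Graph, Namespace, RDF

      # Placeholder namespaces and term names, assumed for illustration only;
      # the real COMODI terms are published at http://comodi.sems.uni-rostock.de/ .
      COMODI = Namespace("http://example.org/comodi#")
      EX = Namespace("http://example.org/model-changes#")

      g = Graph()
      change = EX["change-0042"]                                     # one change in a model's history
      g.add((change, RDF.type, COMODI["Update"]))                    # kind of change (assumed term)
      g.add((change, COMODI["affects"], EX["parameter_k1"]))         # affected entity (assumed term)
      g.add((change, COMODI["hasReason"], COMODI["ModelCuration"]))  # motivation (assumed term)

      print(g.serialize(format="turtle"))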

  19. Main modelling features of the ASTEC V2.1 major version

    International Nuclear Information System (INIS)

    Chatelard, P.; Belon, S.; Bosland, L.; Carénini, L.; Coindreau, O.; Cousin, F.; Marchetto, C.; Nowack, H.; Piar, L.; Chailan, L.

    2016-01-01

    Highlights: • Recent modelling improvements of the ASTEC European severe accident code are outlined. • Key new physical models now available in the ASTEC V2.1 major version are described. • ASTEC progress towards a multi-design reactor code is illustrated for BWR and PHWR. • ASTEC strong link with the on-going EC CESAM FP7 project is emphasized. • Main remaining modelling issues (to which IRSN efforts are now directed) are given. - Abstract: A new major version of the European severe accident integral code ASTEC, developed by IRSN with some GRS support, was delivered in November 2015 to the ASTEC worldwide community. Main modelling features of this V2.1 version are summarised in this paper. In particular, the in-vessel coupling technique between the reactor coolant system thermal-hydraulics module and the core degradation module has been strongly re-engineered to remove some well-known weaknesses of the former V2.0 series. The V2.1 version also includes new core degradation models specifically addressing BWR and PHWR reactor types, as well as several other physical modelling improvements, notably on reflooding of severely damaged cores, Zircaloy oxidation under air atmosphere, corium coolability during corium concrete interaction and source term evaluation. Moreover, this V2.1 version constitutes the backbone of the CESAM FP7 project, whose final objective is to further improve ASTEC for use in Severe Accident Management analysis of the Gen.II–III nuclear power plants presently under operation or foreseen in the near future in Europe. As part of this European project, IRSN efforts to continuously improve both code numerical robustness and computing performances at plant scale as well as users’ tools are being intensified. Besides, ASTEC will continue to capitalise the whole body of knowledge on severe accident phenomenology by progressively keeping physical models at the state of the art through regular feedback from the interpretation of the current and

  20. GARUSO - Version 1.0. Uncertainty model for multipath ultrasonic transit time gas flow meters

    Energy Technology Data Exchange (ETDEWEB)

    Lunde, Per; Froeysa, Kjell-Eivind; Vestrheim, Magne

    1997-09-01

    This report describes an uncertainty model for ultrasonic transit time gas flow meters configured with parallel chords, and a PC program, GARUSO Version 1.0, implemented for calculation of the meter's relative expanded uncertainty. The program, which is based on the theoretical uncertainty model, is used to carry out a simplified and limited uncertainty analysis for a 12'' 4-path meter, where examples of input and output uncertainties are given. The model predicts a relative expanded uncertainty for the meter at a level which further justifies today's increasing tendency to use this type of instrument for fiscal metering of natural gas. 52 refs., 15 figs., 11 tabs.
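
    The report's detailed uncertainty model is not reproduced here, but the GUM-style root-sum-square combination of relative standard uncertainties that underlies this kind of analysis can be sketched as follows; the contributions and numbers are illustrative assumptions, not GARUSO's actual uncertainty budget.

      import math

      def relative_expanded_uncertainty(contributions, k=2.0):
          """Root-sum-square combination of relative standard uncertainties.
          contributions: list of (sensitivity_coefficient, relative_standard_uncertainty)."""
          combined = math.sqrt(sum((c * u) ** 2 for c, u in contributions))
          return k * combined  # expanded uncertainty with coverage factor k

      # Illustrative contributions only (e.g. transit times, path geometry,
      # integration method, installation effects), not the report's budget.
      inputs = [(1.0, 0.0008), (1.0, 0.0010), (2.0, 0.0005), (1.0, 0.0015)]
      print(f"Relative expanded uncertainty (k=2): {100 * relative_expanded_uncertainty(inputs):.2f} %")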

  1. A multisensor evaluation of the asymmetric convective model, version 2, in southeast Texas.

    Science.gov (United States)

    Kolling, Jenna S; Pleim, Jonathan E; Jeffries, Harvey E; Vizuete, William

    2013-01-01

    There currently exist a number of planetary boundary layer (PBL) schemes that can represent the effects of turbulence in daytime convective conditions, although these schemes remain a large source of uncertainty in meteorology and air quality model simulations. This study evaluates a recently developed combined local and nonlocal closure PBL scheme, the Asymmetric Convective Model, version 2 (ACM2), against PBL observations taken from radar wind profilers, a ground-based lidar, and multiple daytime radiosonde balloon launches. These observations were compared against PBL predictions from the Weather Research and Forecasting (WRF) model version 3.1 with the ACM2 PBL scheme option, and the Fifth-Generation Meteorological Model (MM5) version 3.7.3 with the Eta PBL scheme option that is currently being used to develop ozone control strategies in southeast Texas. MM5 and WRF predictions during the regulatory modeling episode were evaluated on their ability to predict the rise and fall of the PBL during daytime convective conditions across southeastern Texas. The MM5-predicted PBL heights consistently underpredicted the observations and were also lower than the WRF predictions. The analysis reveals that the MM5 predicted a slower-rising and shallower PBL not representative of the daytime urban boundary layer. In contrast, the WRF model predicted a more accurate PBL evolution, improving the root mean square error (RMSE) both temporally and spatially. The WRF model also more accurately predicted vertical profiles of temperature and moisture in the lowest 3 km of the atmosphere. Inspection of median surface temperature and moisture time-series plots revealed higher predicted surface temperatures in WRF and more surface moisture in MM5. These could not be attributed to surface heat fluxes, and thus the differences in performance of the WRF and MM5 models are likely due to the PBL schemes. An accurate depiction of the diurnal evolution of the planetary boundary layer (PBL) is
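
    The kind of point comparison underlying the bias and RMSE statements above can be sketched in a few lines; the PBL heights below are made-up numbers, not the study's profiler or model data.

      import numpy as np

      def bias_and_rmse(predicted, observed):
          """Mean bias and root mean square error of predicted vs. observed PBL heights (m)."""
          error = np.asarray(predicted, float) - np.asarray(observed, float)
          return error.mean(), np.sqrt((error ** 2).mean())

      # Hypothetical afternoon PBL heights (m) at one site; values are illustrative only.
      observed = [850, 1200, 1650, 1900, 2050]
      wrf_acm2 = [800, 1150, 1600, 1950, 2000]
      mm5_eta  = [600,  900, 1300, 1500, 1700]

      for name, pred in [("WRF/ACM2", wrf_acm2), ("MM5/Eta", mm5_eta)]:
          b, r = bias_and_rmse(pred, observed)
          print(f"{name}: bias = {b:+.0f} m, RMSE = {r:.0f} m")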

  2. Incorporation of detailed eye model into polygon-mesh versions of ICRP-110 reference phantoms.

    Science.gov (United States)

    Nguyen, Thang Tat; Yeom, Yeon Soo; Kim, Han Sung; Wang, Zhao Jun; Han, Min Cheol; Kim, Chan Hyeong; Lee, Jai Ki; Zankl, Maria; Petoussi-Henss, Nina; Bolch, Wesley E; Lee, Choonsik; Chung, Beom Sun

    2015-11-21

    The dose coefficients for the eye lens reported in ICRP 2010 Publication 116 were calculated using both a stylized model and the ICRP-110 reference phantoms, according to the type of radiation, energy, and irradiation geometry. To maintain consistency of lens dose assessment, in the present study we incorporated the ICRP-116 detailed eye model into the converted polygon-mesh (PM) version of the ICRP-110 reference phantoms. After the incorporation, the dose coefficients for the eye lens were calculated and compared with those of the ICRP-116 data. The results showed generally a good agreement between the newly calculated lens dose coefficients and the values of ICRP 2010 Publication 116. Significant differences were found for some irradiation cases due mainly to the use of different types of phantoms. Considering that the PM version of the ICRP-110 reference phantoms preserve the original topology of the ICRP-110 reference phantoms, it is believed that the PM version phantoms, along with the detailed eye model, provide more reliable and consistent dose coefficients for the eye lens.

  3. Risk factors for cesarean section and instrumental vaginal delivery after successful external cephalic version

    NARCIS (Netherlands)

    de Hundt, Marcella; Vlemmix, Floortje; Bais, Joke M. J.; de Groot, Christianne J.; Mol, Ben Willem; Kok, Marjolein

    2016-01-01

    The aim of this article is to examine whether we could identify factors that predict cesarean section and instrumental vaginal delivery in women who had a successful external cephalic version. We used data from a previous randomized trial among 25 hospitals and their referring midwife practices in the

  4. Incremental testing of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7

    Directory of Open Access Journals (Sweden)

    K. M. Foley

    2010-03-01

    This paper describes the scientific and structural updates to the latest release of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7 (v4.7) and points the reader to additional resources for further details. The model updates were evaluated relative to observations and results from previous model versions in a series of simulations conducted to incrementally assess the effect of each change. The focus of this paper is on five major scientific upgrades: (a) updates to the heterogeneous N2O5 parameterization, (b) improvement in the treatment of secondary organic aerosol (SOA), (c) inclusion of dynamic mass transfer for coarse-mode aerosol, (d) revisions to the cloud model, and (e) new options for the calculation of photolysis rates. Incremental test simulations over the eastern United States during January and August 2006 are evaluated to assess the model response to each scientific improvement, providing explanations of differences in results between v4.7 and previously released CMAQ model versions. Particulate sulfate predictions are improved across all monitoring networks during both seasons due to cloud module updates. Numerous updates to the SOA module improve the simulation of seasonal variability and decrease the bias in organic carbon predictions at urban sites in the winter. Bias in the total mass of fine particulate matter (PM2.5) is dominated by overpredictions of unspeciated PM2.5 (PMother) in the winter and by underpredictions of carbon in the summer. The CMAQv4.7 model results show slightly worse performance for ozone predictions. However, changes to the meteorological inputs are found to have a much greater impact on ozone predictions compared to changes to the CMAQ modules described here. Model updates had little effect on existing biases in wet deposition predictions.

  5. Model of MSD Risk Assessment at Workplace

    OpenAIRE

    K. Sekulová; M. Šimon

    2015-01-01

    This article focuses on a model for assessing the risk of upper-extremity musculoskeletal disorders (MSDs) at the workplace. The model uses risk factors that are responsible for damage to the musculoskeletal system. Based on statistical calculations, the model can estimate the MSD risk faced by workers who are exposed to these risk factors. The model can also estimate how much the MSD risk would decrease if these risk factors were eliminated.

  6. Scope Complexity Options Risks Excursions (SCORE) Version 3.0 Mathematical Description.

    Energy Technology Data Exchange (ETDEWEB)

    Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Samberson, Jonell Nicole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shettigar, Subhasini [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jungels, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Welch, Kimberly M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Dean A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-03-01

    The purpose of the Scope, Complexity, Options, Risks, Excursions (SCORE) model is to estimate the relative complexity of design variants of future warhead options. The results of this model allow those considering these options to understand the complexity tradeoffs between proposed warhead options. The core idea of SCORE is to divide a warhead option into a well-defined set of scope elements and then estimate the complexity of each scope element against a well-understood reference system. The uncertainty associated with estimates can also be captured. A weighted summation of the relative complexity of each scope element is used to determine the total complexity of the proposed warhead option or portions of the warhead option (i.e., a National Work Breakdown Structure code). The SCORE analysis process is a growing multi-organizational Nuclear Security Enterprise (NSE) effort, under the management of the NA-12-led Enterprise Modeling and Analysis Consortium (EMAC), that has provided the data elicitation, integration and computation needed to support the out-year Life Extension Program (LEP) cost estimates included in the Stockpile Stewardship Management Plan (SSMP).
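
    A hedged sketch of the weighted-summation idea with a simple Monte Carlo treatment of estimate uncertainty follows; the scope elements, weights and complexity ranges are invented for illustration and are not SCORE data.

      import numpy as np

      # Invented scope elements: weight and (low, mode, high) complexity relative
      # to a reference system. Not taken from the SCORE model itself.
      elements = {
          "physics_package": (0.30, (0.8, 1.2, 1.8)),
          "non_nuclear":     (0.45, (0.9, 1.0, 1.4)),
          "qualification":   (0.25, (1.0, 1.5, 2.5)),
      }

      rng = np.random.default_rng(0)
      total = sum(
          w * rng.triangular(lo, mode, hi, size=100_000)
          for w, (lo, mode, hi) in elements.values()
      )
      print(f"Relative complexity: median {np.median(total):.2f}, "
            f"90% interval [{np.percentile(total, 5):.2f}, {np.percentile(total, 95):.2f}]")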

  7. Statistical model of fractures and deformation zones. Preliminary site description, Laxemar subarea, version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hermanson, Jan; Forssberg, Ola [Golder Associates AB, Stockholm (Sweden); Fox, Aaron; La Pointe, Paul [Golder Associates Inc., Redmond, WA (United States)

    2005-10-15

    The goal of this summary report is to document the data sources, software tools, experimental methods, assumptions, and model parameters in the discrete-fracture network (DFN) model for the local model volume in Laxemar, version 1.2. The model parameters presented herein are intended for use by other project modeling teams. Individual modeling teams may elect to simplify or use only a portion of the DFN model, depending on their needs. This model is not intended to be a flow model or a mechanical model; as such, only the geometrical characterization is presented. The derivations of the hydraulic or mechanical properties of the fractures or their subsurface connectivities are not within the scope of this report. This model represents analyses carried out on particular data sets. If additional data are obtained, or values for existing data are changed or excluded, the conclusions reached in this report, and the parameter values calculated, may change as well. The model volume is divided into two subareas; one located on the Simpevarp peninsula adjacent to the power plant (Simpevarp), and one further to the west (Laxemar). The DFN parameters described in this report were determined by analysis of data collected within the local model volume. As such, the final DFN model is only valid within this local model volume and the modeling subareas (Laxemar and Simpevarp) within.

  8. A computationally efficient description of heterogeneous freezing: A simplified version of the Soccer ball model

    Science.gov (United States)

    Niedermeier, Dennis; Ervens, Barbara; Clauss, Tina; Voigtländer, Jens; Wex, Heike; Hartmann, Susan; Stratmann, Frank

    2014-01-01

    In a recent study, the Soccer ball model (SBM) was introduced for modeling and/or parameterizing heterogeneous ice nucleation processes. The model applies classical nucleation theory. It allows for a consistent description of both apparently singular and stochastic ice nucleation behavior, by distributing contact angles over the nucleation sites of a particle population assuming a Gaussian probability density function. The original SBM utilizes the Monte Carlo technique, which hampers its usage in atmospheric models, as fairly time-consuming calculations must be performed to obtain statistically significant results. Thus, we have developed a simplified and computationally more efficient version of the SBM. We successfully used the new SBM to parameterize experimental nucleation data of, e.g., bacterial ice nucleation. Both SBMs give identical results; however, the new model is computationally less expensive as confirmed by cloud parcel simulations. Therefore, it is a suitable tool for describing heterogeneous ice nucleation processes in atmospheric models.
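
    The flavour of the approach can be conveyed with a toy calculation: each particle carries several nucleation sites whose contact angles follow a Gaussian distribution, the heterogeneous nucleation rate is reduced by the classical-nucleation-theory compatibility factor, and the population frozen fraction follows by numerical integration rather than Monte Carlo sampling. All parameter values are invented; this is not the published SBM parameterization.

      import numpy as np

      def form_factor(theta):
          """Geometric compatibility factor of classical nucleation theory."""
          c = np.cos(theta)
          return (2.0 + c) * (1.0 - c) ** 2 / 4.0

      def frozen_fraction(mu_theta, sigma_theta, n_sites, barrier, j0, site_area, time):
          """Frozen fraction of a particle population whose site contact angles are
          Gaussian-distributed; 'barrier' lumps the homogeneous energy barrier at the
          temperature of interest. Illustrative toy model, not the fitted SBM."""
          theta = np.linspace(1e-3, np.pi, 2000)
          p = np.exp(-0.5 * ((theta - mu_theta) / sigma_theta) ** 2)
          p /= np.trapz(p, theta)                             # truncated, normalised Gaussian
          j_het = j0 * np.exp(-barrier * form_factor(theta))  # heterogeneous nucleation rate
          p_site_unfrozen = np.trapz(p * np.exp(-j_het * site_area * time), theta)
          return 1.0 - p_site_unfrozen ** n_sites             # independent sites per particle

      # Assumed parameters: mean contact angle 76 deg, spread 8 deg, 10 sites per particle
      print(frozen_fraction(np.radians(76), np.radians(8), 10, 20.0, 1e10, 1e-11, 10.0))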

  9. Community Land Model Version 3.0 (CLM3.0) Developer's Guide

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, FM

    2004-12-21

    This document describes the guidelines adopted for software development of the Community Land Model (CLM) and serves as a reference to the entire code base of the released version of the model. The version of the code described here is Version 3.0 which was released in the summer of 2004. This document, the Community Land Model Version 3.0 (CLM3.0) User's Guide (Vertenstein et al., 2004), the Technical Description of the Community Land Model (CLM) (Oleson et al., 2004), and the Community Land Model's Dynamic Global Vegetation Model (CLM-DGVM): Technical Description and User's Guide (Levis et al., 2004) provide the developer, user, or researcher with details of implementation, instructions for using the model, a scientific description of the model, and a scientific description of the Dynamic Global Vegetation Model integrated with CLM respectively. The CLM is a single column (snow-soil-vegetation) biogeophysical model of the land surface which can be run serially (on a laptop or personal computer) or in parallel (using distributed or shared memory processors or both) on both vector and scalar computer architectures. Written in Fortran 90, CLM can be run offline (i.e., run in isolation using stored atmospheric forcing data), coupled to an atmospheric model (e.g., the Community Atmosphere Model (CAM)), or coupled to a climate system model (e.g., the Community Climate System Model Version 3 (CCSM3)) through a flux coupler (e.g., Coupler 6 (CPL6)). When coupled, CLM exchanges fluxes of energy, water, and momentum with the atmosphere. The horizontal land surface heterogeneity is represented by a nested subgrid hierarchy composed of gridcells, landunits, columns, and plant functional types (PFTs). This hierarchical representation is reflected in the data structures used by the model code. Biophysical processes are simulated for each subgrid unit (landunit, column, and PFT) independently, and prognostic variables are maintained for each subgrid unit
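
    The nested subgrid hierarchy described above can be pictured with a small data-structure sketch; the class and field names are illustrative Python stand-ins, not the actual CLM Fortran data structures.

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class PFT:
          pft_type: int
          weight: float                 # fraction of the parent column

      @dataclass
      class Column:
          weight: float                 # fraction of the parent landunit
          pfts: List[PFT] = field(default_factory=list)

      @dataclass
      class Landunit:
          kind: str                     # e.g. "vegetated", "lake", "glacier", "urban"
          weight: float                 # fraction of the parent gridcell
          columns: List[Column] = field(default_factory=list)

      @dataclass
      class Gridcell:
          lat: float
          lon: float
          landunits: List[Landunit] = field(default_factory=list)

      cell = Gridcell(60.0, 17.5, [
          Landunit("vegetated", 0.8, [Column(1.0, [PFT(1, 0.6), PFT(7, 0.4)])]),
          Landunit("lake", 0.2, [Column(1.0)]),
      ])
      print(sum(lu.weight for lu in cell.landunits))   # landunit weights sum to 1 per gridcell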

  10. The modified version of the centre-of-mass correction to the bag model

    International Nuclear Information System (INIS)

    Bartelski, J.; Tatur, S.

    1986-01-01

    We propose an improvement of the recently considered version of the centre-of-mass correction to the bag model. We identify the nucleon bag with a physical nucleon confined in an external fictitious spherical well potential with an additional external fictitious pressure characterized by the parameter b. The introduction of such a pressure restores the conservation of the canonical energy-momentum tensor, which was lost in the former model. We propose several methods to determine the numerical value of b. We calculate the Roper resonance mass as well as the static electroweak parameters of a nucleon with the centre-of-mass corrections taken into account. 7 refs., 1 tab. (author)

  11. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
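
    For orientation, the sketch below computes Value-at-Risk as an empirical quantile of a P&L series and a naive model-risk add-on as the shortfall against a benchmark quantile. The distributions are synthetic and the adjustment is far cruder than the article's framework.

      import numpy as np

      def empirical_var(pnl, alpha=0.01):
          """Value-at-Risk as the alpha-quantile of profit-and-loss, reported as a positive loss."""
          return -np.quantile(pnl, alpha)

      rng = np.random.default_rng(7)
      pnl_model = rng.normal(0.0, 1.0, 2500)            # synthetic internal-model P&L
      pnl_benchmark = 1.1 * rng.standard_t(4, 2500)     # heavier-tailed benchmark (assumed)

      var_model = empirical_var(pnl_model)
      var_benchmark = empirical_var(pnl_benchmark)
      addon = max(var_benchmark - var_model, 0.0)       # crude capital add-on for model risk
      print(f"1% VaR: model {var_model:.2f}, benchmark {var_benchmark:.2f}, add-on {addon:.2f}")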

  12. Cancer Genetics Risk Assessment and Counseling (PDQ®)—Health Professional Version

    Science.gov (United States)

    Cancer genetics risk assessment and genetic counseling includes family history, psychosocial assessments, and education on hereditary cancer syndromes, testing, and risk. Get more information including the ethical, legal, and social implications of genetic testing in this summary for clinicians.

  13. Shortened version of the work ability index to identify workers at risk of long-term sickness absence.

    Science.gov (United States)

    Schouten, Lianne S; Bültmann, Ute; Heymans, Martijn W; Joling, Catelijne I; Twisk, Jos W R; Roelen, Corné A M

    2016-04-01

    The Work Ability Index (WAI) identifies non-sicklisted workers at risk of future long-term sickness absence (LTSA). The WAI is a complicated instrument and inconvenient for use in large-scale surveys. We investigated whether shortened versions of the WAI identify non-sicklisted workers at risk of LTSA. Prospective study including two samples of non-sicklisted workers participating in occupational health checks between 2010 and 2012. A heterogeneous development sample (N= 2899) was used to estimate logistic regression coefficients for the complete WAI, a shortened WAI version without the list of diseases, and single-item Work Ability Score (WAS). These three instruments were calibrated for predictions of different (≥2, ≥4 and ≥6 weeks) LTSA durations in a validation sample of non-sicklisted workers (N= 3049) employed at a steel mill, differentiating between manual (N= 1710) and non-manual (N= 1339) workers. The discriminative ability was investigated by receiver operating characteristic analysis. All three instruments under-predicted the LTSA risks in both manual and non-manual workers. The complete WAI discriminated between individuals at high and low risk of LTSA ≥2, ≥4 and ≥6 weeks in manual and non-manual workers. Risk predictions and discrimination by the shortened WAI without the list of diseases were as good as the complete WAI. The WAS showed poorer discrimination in manual and non-manual workers. The WAI without the list of diseases is a good alternative to the complete WAI to identify non-sicklisted workers at risk of future LTSA durations ≥2, ≥4 and ≥6 weeks. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
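
    The development/validation logic above can be mimicked on synthetic data: fit a logistic regression in one sample and check discrimination by the area under the ROC curve in another. The predictors and coefficients below are made up and are not the WAI items.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(3)
      n = 2000
      work_ability = rng.normal(40, 5, n)        # hypothetical WAI-like score
      age = rng.normal(45, 10, n)
      logit = -2.5 - 0.10 * (work_ability - 40) + 0.03 * (age - 45)   # assumed true model
      ltsa = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      X = np.column_stack([work_ability, age])
      X_dev, y_dev, X_val, y_val = X[:1000], ltsa[:1000], X[1000:], ltsa[1000:]

      model = LogisticRegression().fit(X_dev, y_dev)                  # development sample
      auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])    # validation sample
      print(f"Discrimination in the validation sample: AUC = {auc:.2f}")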

  14. Spatiotemporal Modeling of Community Risk

    Science.gov (United States)

    2016-03-01

    Cites Ceyhan, Ertugay, and Duzgun, “Exploratory and Inferential Methods for Spatio-Temporal Analysis of Residential Fire Clustering in Urban Areas,” in connection with the placement of fire resources spread across the community. Spatiotemporal modeling shows that actualized risk is dynamic and relatively patterned.

  15. Taking Stock and Taking Steps: The Case for an Adolescent Version of the Short-Term Assessment of Risk and Treatability.

    Science.gov (United States)

    Viljoen, Jodi L; Cruise, Keith R; Nicholls, Tonia L; Desmarais, Sarah L; Webster, Christopher

    2012-01-01

    The field of violence risk assessment has matured considerably, possibly advancing beyond its own adolescence. At this point in the field's evolution, it is more important than ever for the development of any new device to be accompanied by a strong rationale and the capacity to provide a unique contribution. With this issue in mind, we first take stock of the field of adolescent risk assessment in order to describe the rapid progress that this field has made, as well as the gaps that led us to adapt the Short-Term Assessment of Risk and Treatability (START; Webster, Martin, Brink, Nicholls, & Desmarais, 2009) for use with adolescents. We view the Short-Term Assessment of Risk and Treatability: Adolescent Version (START:AV; Nicholls, Viljoen, Cruise, Desmarais, & Webster, 2010; Viljoen, Cruise, Nicholls, Desmarais, & Webster, in progress) as complementing other risk measures in four primary ways: 1) rather than focusing solely on violence risk, it examines broader adverse outcomes to which some adolescents are vulnerable (including self-harm, suicide, victimization, substance abuse, unauthorized leave, self-neglect, general offending); 2) it places a balanced emphasis on adolescents' strengths; 3) it focuses on dynamic factors that are relevant to short-term assessment, risk management, and treatment planning; and 4) it is designed for both mental health and justice populations. We describe the developmentally-informed approach we took in the adaptation of the START for adolescents, and outline future steps for the continuing validation and refinement of the START:AV.

  16. The complex model of risk and progression of AMD estimation

    Directory of Open Access Journals (Sweden)

    V. S. Akopyan

    2012-01-01

    Purpose: To develop a method and a statistical model to estimate the individual risk of AMD and the risk of progression to advanced AMD using clinical and genetic risk factors. Methods: A statistical risk assessment model was developed using stepwise binary logistic regression analysis. To estimate population differences in the prevalence of allelic gene variants and to adapt the models to the population of the Moscow region, genotyping and assessment of the influence of other risk factors were performed in two groups: patients with different stages of AMD (n = 74) and a control group (n = 116). Genetic risk factors included in the study: polymorphisms in the complement system genes (C3 and CFH), genes at the 10q26 locus (ARMS2 and HTRA1), and a polymorphism in the mitochondrial gene MT-ND2. Clinical risk factors included in the study: age, gender, high body mass index, and smoking history. Results: A comprehensive analysis of genetic and clinical risk factors for AMD in the study group was performed. A statistical model for assessing the individual risk of AMD was compiled (sensitivity 66.7%, specificity 78.5%, AUC = 0.76). From the risk factors for late AMD, a statistical model describing the probability of late AMD was compiled (sensitivity 66.7%, specificity 78.3%, AUC = 0.73). The developed system also allows the most likely form of late AMD, dry or wet, to be determined. Conclusion: The developed test system and the mathematical algorithm for determining the risk of AMD and the risk of progression to advanced AMD have fair diagnostic value and are promising for use in clinical practice.

  17. MESOI Version 2.0: an interactive mesoscale Lagrangian puff dispersion model with deposition and decay

    International Nuclear Information System (INIS)

    Ramsdell, J.V.; Athey, G.F.; Glantz, C.S.

    1983-11-01

    MESOI Version 2.0 is an interactive Lagrangian puff model for estimating the transport, diffusion, deposition and decay of effluents released to the atmosphere. The model is capable of treating simultaneous releases from as many as four release points, which may be elevated or at ground-level. The puffs are advected by a horizontal wind field that is defined in three dimensions. The wind field may be adjusted for expected topographic effects. The concentration distribution within the puffs is initially assumed to be Gaussian in the horizontal and vertical. However, the vertical concentration distribution is modified by assuming reflection at the ground and the top of the atmospheric mixing layer. Material is deposited on the surface using a source depletion, dry deposition model and a washout coefficient model. The model also treats the decay of a primary effluent species and the ingrowth and decay of a single daughter species using a first order decay process. This report is divided into two parts. The first part discusses the theoretical and mathematical bases upon which MESOI Version 2.0 is based. The second part contains the MESOI computer code. The programs were written in the ANSI standard FORTRAN 77 and were developed on a VAX 11/780 computer. 43 references, 14 figures, 13 tables
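
    The core of such a puff model is the Gaussian concentration kernel with ground reflection; a stripped-down version is sketched below (mixing-layer reflection, deposition and decay are omitted, and all numbers are illustrative).

      import numpy as np

      def puff_concentration(q, x, y, z, xc, yc, h, sx, sy, sz):
          """Concentration from one Gaussian puff of mass q centred at (xc, yc) with
          effective release height h, reflected at the ground. MESOI additionally
          reflects at the mixing-layer top and depletes the puff by deposition and decay."""
          g_xy = np.exp(-0.5 * ((x - xc) / sx) ** 2) * np.exp(-0.5 * ((y - yc) / sy) ** 2)
          g_z = np.exp(-0.5 * ((z - h) / sz) ** 2) + np.exp(-0.5 * ((z + h) / sz) ** 2)
          return q * g_xy * g_z / ((2.0 * np.pi) ** 1.5 * sx * sy * sz)

      # Illustrative numbers: 1 kg puff, 50 m release height, ground-level receptor
      print(puff_concentration(1.0, 500.0, 0.0, 0.0, 400.0, 20.0, 50.0, 80.0, 80.0, 40.0), "kg/m3")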

  18. A p-version embedded model for simulation of concrete temperature fields with cooling pipes

    Directory of Open Access Journals (Sweden)

    Sheng Qiang

    2015-07-01

    Pipe cooling is an effective method of mass concrete temperature control, but its accurate and convenient numerical simulation is still a cumbersome problem. An improved embedded model, considering the water temperature variation along the pipe, was proposed for simulating the temperature field of early-age concrete structures containing cooling pipes. The improved model was verified with an engineering example. Then, the p-version self-adaption algorithm for the improved embedded model was deduced, and the initial values and boundary conditions were examined. Comparison of some numerical samples shows that the proposed model can provide satisfying precision and a higher efficiency. The analysis efficiency can be doubled at the same precision, even for a large-scale element. The p-version algorithm can fit grids of different sizes for the temperature field simulation. The convenience of the proposed algorithm lies in the possibility of locating more pipe segments in one element without requiring as regular an element shape as in the explicit model.

  19. Manual risk zoning wind turbines. Final version. 3rd updated version May 2013. 3 ed.; Handboek risicozonering windturbines. Eindversie. 3e geactualiseerde versie mei 2013

    Energy Technology Data Exchange (ETDEWEB)

    Faasen, C.J.; Franck, P.A.L.; Taris, A.M.H.W. [DNV KEMA, Arnhem (Netherlands)

    2013-05-15

    The manual has been drafted to enable a quick inventory of the safety risks of wind turbines, which can speed up the licensing procedures for the installation of wind turbines. Attention is paid to the following categories: building areas, roads, waterways, railroads, industrial areas, underground cables and mains, aboveground cables, power transmission lines, dikes and dams, and telecommunication. For each of these categories the risks are calculated. This is an updated version of the manual, which was first published in 2000 and updated in 2005 [Dutch] This is an update of the 2005 Handbook. The original Handbook was drawn up in 2000. The 2000 Handbook was compiled by ECN on behalf of Novem (now: Agentschap NL) with the aim of offering a uniform method for carrying out quantitative risk analyses and for testing the results against acceptance criteria. The Handbook answered the demand from both project developers and authorities for a generally applicable method to calculate the safety risks of wind turbines for various aspects of the surroundings. In 2005 an update of the Handbook was published by ECN and KEMA on behalf of SenterNovem (now part of Agentschap NL). It went into more detail on turbines with larger capacities and added calculation examples to the Handbook. Because of the further development of wind turbine technology, amended legislation, the fact that the calculation models had become outdated, and the fact that various kinds of infrastructure are increasingly realised bundled or clustered with wind turbines, it is desirable and necessary to map the potential risks in a consistent and unambiguous manner. Agentschap NL therefore commissioned DNV KEMA in 2012 to update the Handbook once again. In doing so, the results of the report 'Rekenmethodiek zonering windturbines in relatie tot gas- en elektrische infrastructuur' (2012), which

  20. Analysis of uncertainty in modeling perceived risks

    International Nuclear Information System (INIS)

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)

  1. Description of the new version 4.0 of the tritium model UFOTRI including user guide

    International Nuclear Information System (INIS)

    Raskob, W.

    1993-08-01

    In view of the future operation of fusion reactors, the release of tritium may play a dominant role during normal operation as well as after accidents. Because its physical and chemical properties differ significantly from those of other radionuclides, the model UFOTRI for assessing the radiological consequences of accidental tritium releases has been developed. It describes the behaviour of tritium in the biosphere and calculates the radiological impact on individuals and the population due to direct exposure and via the ingestion pathways. Processes such as the conversion of tritium gas into tritiated water (HTO) in the soil, re-emission after deposition and the conversion of HTO into organically bound tritium are considered. The use of UFOTRI in its probabilistic mode shows the spectrum of the radiological impact together with the associated probability of occurrence. A first model version was established in 1991. As the ongoing work on the main processes governing the behaviour of tritium in the environment has produced new results, the model has been improved in several respects. The report describes the changes incorporated into the model since 1991. Additionally, it provides the updated user guide for handling the revised UFOTRI version, which will be distributed to interested organizations. (orig.)

  2. Model risk analysis for risk management and option pricing

    NARCIS (Netherlands)

    Kerkhof, F.L.J.

    2003-01-01

    Due to the growing complexity of products in financial markets, market participants rely more and more on quantitative models for trading and risk management decisions. This introduces a fairly new type of risk, namely, model risk. In the first part of this thesis we investigate the quantitative

  3. Dynamic Computation of Change Operations in Version Management of Business Process Models

    Science.gov (United States)

    Küster, Jochen Malte; Gerth, Christian; Engels, Gregor

    Version management of business process models requires that changes can be resolved by applying change operations. In order to give a user maximal freedom concerning the application order of change operations, position parameters of change operations must be computed dynamically during change resolution. In such an approach, change operations with computed position parameters must be applicable on the model and dependencies and conflicts of change operations must be taken into account because otherwise invalid models can be constructed. In this paper, we study the concept of partially specified change operations where parameters are computed dynamically. We provide a formalization for partially specified change operations using graph transformation and provide a concept for their applicability. Based on this, we study potential dependencies and conflicts of change operations and show how these can be taken into account within change resolution. Using our approach, a user can resolve changes of business process models without being unnecessarily restricted to a certain order.
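
    A toy illustration of conflict detection between change operations is given below; the operation kinds and the conflict rule are simplified assumptions and do not reproduce the paper's graph-transformation formalisation.

      from dataclasses import dataclass
      from typing import Optional

      @dataclass(frozen=True)
      class ChangeOp:
          kind: str                              # "insert", "delete" or "move"
          element: str                           # affected process-model element
          position_after: Optional[str] = None   # computed dynamically in the paper

      def conflicts(a: ChangeOp, b: ChangeOp) -> bool:
          """Simplified rule: operations conflict if they touch the same element in
          incompatible ways (delete vs. move/insert, or two moves of the same element)."""
          if a.element != b.element:
              return False
          return {a.kind, b.kind} in ({"delete", "move"}, {"delete", "insert"}, {"move"})

      ops = [ChangeOp("move", "TaskA", "TaskC"),
             ChangeOp("delete", "TaskA"),
             ChangeOp("insert", "TaskB", "TaskA")]
      pairs = [(i, j) for i in range(len(ops)) for j in range(i + 1, len(ops))
               if conflicts(ops[i], ops[j])]
      print("conflicting operation pairs:", pairs)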

  4. Update of the Polar SWIFT model for polar stratospheric ozone loss (Polar SWIFT version 2)

    Science.gov (United States)

    Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2017-07-01

    The Polar SWIFT model is a fast scheme for calculating the chemistry of stratospheric ozone depletion in polar winter. It is intended for use in global climate models (GCMs) and Earth system models (ESMs) to enable the simulation of mutual interactions between the ozone layer and climate. To date, climate models often use prescribed ozone fields, since a full stratospheric chemistry scheme is computationally very expensive. Polar SWIFT is based on a set of coupled differential equations, which simulate the polar vortex-averaged mixing ratios of the key species involved in polar ozone depletion on a given vertical level. These species are O3, chemically active chlorine (ClOx), HCl, ClONO2 and HNO3. The only external input parameters that drive the model are the fraction of the polar vortex in sunlight and the fraction of the polar vortex below the temperatures necessary for the formation of polar stratospheric clouds. Here, we present an update of the Polar SWIFT model introducing several improvements over the original model formulation. In particular, the model is now trained on vortex-averaged reaction rates of the ATLAS Chemistry and Transport Model, which enables a detailed look at individual processes and an independent validation of the different parameterizations contained in the differential equations. The training of the original Polar SWIFT model was based on fitting complete model runs to satellite observations and did not allow for this. A revised formulation of the system of differential equations is developed, which closely fits vortex-averaged reaction rates from ATLAS that represent the main chemical processes influencing ozone. In addition, a parameterization for the HNO3 change by denitrification is included. The rates of change of the concentrations of the chemical species of the Polar SWIFT model are purely chemical rates of change in the new version, whereas in the original Polar SWIFT model, they included a transport effect caused by the
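
    To convey the structure (a handful of coupled, vortex-averaged ODEs driven by a sunlight fraction and a PSC fraction), a toy system is integrated below. The equations and rate constants are placeholders for illustration, not the fitted Polar SWIFT coefficients.

      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, y, f_sun, f_psc, k_act=0.05, k_loss=0.02, k_deact=0.01):
          """Toy vortex-averaged chemistry: chlorine activation on PSCs, deactivation,
          and sunlight-driven catalytic ozone loss. Placeholder rates, not Polar SWIFT."""
          o3, clox, hcl = y
          activation = k_act * f_psc(t) * hcl          # HCl -> ClOx on PSC surfaces
          deactivation = k_deact * clox                # ClOx -> reservoir species
          ozone_loss = k_loss * f_sun(t) * clox * o3   # catalytic cycle needs sunlight
          return [-ozone_loss, activation - deactivation, -activation + deactivation]

      f_sun = lambda t: min(1.0, max(0.0, (t - 60.0) / 60.0))   # sunlight returns in spring
      f_psc = lambda t: 1.0 if t < 90.0 else 0.0                # PSCs present until day 90

      sol = solve_ivp(rhs, (0.0, 150.0), [3.0, 0.0, 2.5], args=(f_sun, f_psc))
      print("ozone at day 150 (illustrative units):", round(sol.y[0, -1], 2))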

  5. QMM – A Quarterly Macroeconomic Model of the Icelandic Economy. Version 2.0

    DEFF Research Database (Denmark)

    Ólafsson, Tjörvi

    This paper documents and describes Version 2.0 of the Quarterly Macroeconomic Model of the Central Bank of Iceland (QMM). QMM and the underlying quarterly database have been under construction since 2001 at the Research and Forecasting Division of the Economics Department at the Bank; the model was first implemented in the forecasting round for the Monetary Bulletin 2006/1 in March 2006. QMM is used by the Bank for forecasting and various policy simulations and therefore plays a key role as an organisational framework for viewing the medium-term future when formulating monetary policy at the Bank. This paper

  6. GOOSE Version 1.4: A powerful object-oriented simulation environment for developing reactor models

    International Nuclear Information System (INIS)

    Nypaver, D.J.; March-Leuba, C.; Abdalla, M.A.; Guimaraes, L.

    1992-01-01

    A prototype software package for a fully interactive Generalized Object-Oriented Simulation Environment (GOOSE) is being developed at Oak Ridge National Laboratory. Dynamic models are easily constructed and tested; fully interactive capabilities allow the user to alter model parameters and complexity without recompilation. This environment provides access to powerful tools such as numerical integration packages, graphical displays, and online help. In GOOSE, portability has been achieved by creating the environment in Objective-C, which is supported by a variety of platforms including UNIX and DOS. GOOSE Version 1.4 introduces new enhancements such as the capability of creating ''initial'', ''dynamic'', and ''digital'' methods. The object-oriented approach to simulation used in GOOSE combines the concept of modularity with the additional features of allowing precompilation, optimization, testing, and validation of individual modules. Once a library of classes has been defined and compiled, models can be built and modified without recompilation. GOOSE Version 1.4 is primarily command-line driven

  7. A RETRAN-02 model of the Sizewell B PCSR design - the Winfrith one-loop model, version 3.0

    International Nuclear Information System (INIS)

    Kinnersly, S.R.

    1983-11-01

    A one-loop RETRAN-02 model of the Sizewell B Pre Construction Safety Report (PCSR) design, set up at Winfrith, is described and documented. The model is suitable for symmetrical pressurised transients. Comparison with data from the Sizewell B PCSR shows that the model is a good representation of that design. Known errors, limitations and deficiencies are described. The mode of storage and maintenance at Winfrith using PROMUS (Program Maintenance and Update System) is noted. It is recommended that users modify the standard data by adding replacement cards to the end so as to aid in identification, use and maintenance of local versions. (author)

  8. Credit Risk Evaluation : Modeling - Analysis - Management

    OpenAIRE

    Wehrspohn, Uwe

    2002-01-01

    An analysis and further development of the building blocks of modern credit risk management: definitions of default; estimation of default probabilities; exposures; recovery rates; pricing; concepts of portfolio dependence; time horizons for risk calculations; quantification of portfolio risk; estimation of risk measures; portfolio analysis and portfolio improvement; evaluation and comparison of credit risk models; analytic portfolio loss distributions. The thesis contributes to the evaluatio...

  9. The Systems Biology Markup Language (SBML) Level 3 Package: Qualitative Models, Version 1, Release 1.

    Science.gov (United States)

    Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš

    2015-09-04

    Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.

  10. Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3 ...

    Science.gov (United States)

    To help stakeholders estimate the costs of a landfill gas (LFG) energy project, in 2002, LMOP developed a cost tool (LFGcost). Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA to assist landfills in evaluating the economic and financial feasibility of LFG energy project development. In 2014, LMOP developed a public version of the model, LFGcost-Web (Version 3.0), to allow landfill and industry stakeholders to evaluate project feasibility on their own. LFGcost-Web can analyze costs for 12 energy recovery project types. These project costs can be estimated with or without the costs of a gas collection and control system (GCCS). The EPA used select equations from LFGcost-Web to estimate costs of the regulatory options in the 2015 proposed revisions to the MSW Landfills Standards of Performance (also known as New Source Performance Standards) and the Emission Guidelines (hereinafter referred to collectively as the Landfill Rules). More specifically, equations derived from LFGcost-Web were applied to each landfill expected to be impacted by the Landfill Rules to estimate annualized installed capital costs and annual O&M costs of a gas collection and control system. In addition, after applying the LFGcost-Web equations to the list of landfills expected to require a GCCS in year 2025 as a result of the proposed Landfill Rules, the regulatory analysis evaluated whether electr
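
    The annualisation step mentioned above is conventionally done with a capital recovery factor; a generic sketch (not LFGcost-Web's own cost equations) is shown below with made-up numbers.

      def annualized_capital_cost(installed_cost, discount_rate, lifetime_years):
          """Annualise an up-front installed capital cost with a capital recovery factor."""
          r, n = discount_rate, lifetime_years
          crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
          return installed_cost * crf

      # Illustrative numbers: $2,000,000 GCCS capital, 8% discount rate, 15-year lifetime
      print(f"${annualized_capital_cost(2_000_000, 0.08, 15):,.0f} per year")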

  11. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (IBM PC VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal
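
    The log-normal description referred to above leads directly to exceedance probabilities for chosen fade depths; the sketch below uses made-up parameters, not output of the ACTS model.

      import numpy as np
      from scipy.stats import lognorm

      median_db = 2.0    # assumed median attenuation during rain, dB
      sigma_ln = 1.1     # assumed standard deviation of ln(attenuation)
      p_rain = 0.05      # assumed fraction of the year with rain on the path

      fade_depths = np.array([3.0, 6.0, 10.0, 20.0])                  # dB
      p_exceed = p_rain * lognorm.sf(fade_depths, s=sigma_ln, scale=median_db)
      for d, p in zip(fade_depths, p_exceed):
          print(f"P(attenuation > {d:4.1f} dB) = {100 * p:.3f} % of the year")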

  12. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    The frequency and intensity of rain attenuation affecting the communication between a satellite and an earth terminal is an important consideration in planning satellite links. The NASA Lewis Research Center Satellite Link Attenuation Model Program (LeRC-SLAM) provides a static and dynamic statistical assessment of the impact of rain attenuation on a communications link established between an earth terminal and a geosynchronous satellite. The program is designed for use in the specification, design and assessment of satellite links for any terminal location in the continental United States. The basis for LeRC-SLAM is the ACTS Rain Attenuation Prediction Model, which uses a log-normal cumulative probability distribution to describe the random process of rain attenuation on satellite links. The derivation of the statistics for the rainrate process at the specified terminal location relies on long term rainfall records compiled by the U.S. Weather Service during time periods of up to 55 years in length. The theory of extreme value statistics is also utilized. The user provides 1) the longitudinal position of the satellite in geosynchronous orbit, 2) the geographical position of the earth terminal in terms of latitude and longitude, 3) the height above sea level of the terminal site, 4) the yearly average rainfall at the terminal site, and 5) the operating frequency of the communications link (within 1 to 1000 GHz, inclusive). Based on the yearly average rainfall at the terminal location, LeRC-SLAM calculates the relevant rain statistics for the site using an internal data base. The program then generates rain attenuation data for the satellite link. This data includes a description of the static (i.e., yearly) attenuation process, an evaluation of the cumulative probability distribution for attenuation effects, and an evaluation of the probability of fades below selected fade depths. In addition, LeRC-SLAM calculates the elevation and azimuth angles of the terminal

  13. Validity study of the Beck Anxiety Inventory (Portuguese version) by the Rasch Rating Scale model

    Directory of Open Access Journals (Sweden)

    Sónia Quintão

    2013-01-01

    Our objective was to conduct a validation study of the Portuguese version of the Beck Anxiety Inventory (BAI) by means of the Rasch Rating Scale Model, and then compare it with the anxiety scales most used in Portugal. The sample consisted of 1,160 adults (427 men and 733 women), aged 18-82 years (M = 33.39; SD = 11.85). Instruments were the Beck Anxiety Inventory, the State-Trait Anxiety Inventory and the Zung Self-Rating Anxiety Scale. It was found that the Beck Anxiety Inventory's system of four categories, the data-model fit, and the person reliability were adequate. The measure can be considered unidimensional. Gender and age-related differences were not a threat to the validity. The BAI correlated significantly with the other anxiety measures. In conclusion, the BAI shows good psychometric quality.

  14. New Source Term Model for the RESRAD-OFFSITE Code Version 3

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Charley [Argonne National Lab. (ANL), Argonne, IL (United States); Gnanapragasam, Emmanuel [Argonne National Lab. (ANL), Argonne, IL (United States); Cheng, Jing-Jy [Argonne National Lab. (ANL), Argonne, IL (United States); Kamboj, Sunita [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Shih-Yew [Argonne National Lab. (ANL), Argonne, IL (United States)

    2013-06-01

    This report documents the new source term model developed and implemented in Version 3 of the RESRAD-OFFSITE code. This new source term model includes: (1) "first order release with transport" option, in which the release of the radionuclide is proportional to the inventory in the primary contamination and the user-specified leach rate is the proportionality constant, (2) "equilibrium desorption release" option, in which the user specifies the distribution coefficient which quantifies the partitioning of the radionuclide between the solid and aqueous phases, and (3) "uniform release" option, in which the radionuclides are released from a constant fraction of the initially contaminated material during each time interval and the user specifies the duration over which the radionuclides are released.
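
    The "first order release with transport" option described above implies a simple exponential depletion of the primary contamination; a sketch with illustrative numbers follows (the code itself does much more, including transport of the released activity).

      import numpy as np

      def first_order_release(inventory0, leach_rate, decay_const, t):
          """Inventory remaining in the primary contamination and the release rate when
          the release is proportional to the current inventory; radioactive decay included."""
          inventory = inventory0 * np.exp(-(leach_rate + decay_const) * t)
          return inventory, leach_rate * inventory

      t = np.linspace(0.0, 100.0, 5)                                         # years
      inv, release = first_order_release(1.0e9, 0.01, np.log(2) / 30.0, t)   # 30 a half-life assumed
      for ti, ri in zip(t, release):
          print(f"t = {ti:5.1f} a: release rate = {ri:.3e} Bq/a")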

  15. The SGHWR version of the Monte Carlo code W-MONTE. Part 1. The theoretical model

    International Nuclear Information System (INIS)

    Allen, F.R.

    1976-03-01

    W-MONTE provides a multi-group model of neutron transport in the exact geometry of a reactor lattice using Monte Carlo methods. It is currently restricted to uniform axial properties. Material data is normally obtained from a preliminary WIMS lattice calculation in the transport group structure. The SGHWR version has been required for analysis of zero energy experiments and special aspects of power reactor lattices, such as the unmoderated lattice region above the moderator when drained to dump height. Neutron transport is modelled for a uniform infinite lattice, simultaneously treating the cases of no leakage, radial leakage or axial leakage only, and the combined effects of radial and axial leakage. Multigroup neutron balance edits are incorporated for the separate effects of radial and axial leakage to facilitate the analysis of leakage and to provide effective diffusion theory parameters for core representation in reactor cores. (author)

  16. Comparison of three ice cloud optical schemes in climate simulations with community atmospheric model version 5

    Science.gov (United States)

    Zhao, Wenjie; Peng, Yiran; Wang, Bin; Yi, Bingqi; Lin, Yanluan; Li, Jiangnan

    2018-05-01

    A newly implemented Baum-Yang scheme for simulating ice cloud optical properties is compared with existing schemes (Mitchell and Fu schemes) in a standalone radiative transfer model and in the global climate model (GCM) Community Atmospheric Model Version 5 (CAM5). This study systematically analyzes the effect of different ice cloud optical schemes on global radiation and climate by a series of simulations with a simplified standalone radiative transfer model, atmospheric GCM CAM5, and a comprehensive coupled climate model. Results from the standalone radiative model show that Baum-Yang scheme yields generally weaker effects of ice cloud on temperature profiles both in shortwave and longwave spectrum. CAM5 simulations indicate that Baum-Yang scheme in place of Mitchell/Fu scheme tends to cool the upper atmosphere and strengthen the thermodynamic instability in low- and mid-latitudes, which could intensify the Hadley circulation and dehydrate the subtropics. When CAM5 is coupled with a slab ocean model to include simplified air-sea interaction, reduced downward longwave flux to surface in Baum-Yang scheme mitigates ice-albedo feedback in the Arctic as well as water vapor and cloud feedbacks in low- and mid-latitudes, resulting in an overall temperature decrease by 3.0/1.4 °C globally compared with Mitchell/Fu schemes. Radiative effect and climate feedback of the three ice cloud optical schemes documented in this study can be referred for future improvements on ice cloud simulation in CAM5.

  17. Immersion freezing by natural dust based on a soccer ball model with the Community Atmospheric Model version 5: climate effects

    Science.gov (United States)

    Wang, Yong; Liu, Xiaohong

    2014-12-01

    We introduce a simplified version of the soccer ball model (SBM) developed by Niedermeier et al (2014 Geophys. Res. Lett. 41 736-741) into the Community Atmospheric Model version 5 (CAM5). It is the first time that SBM is used in an atmospheric model to parameterize the heterogeneous ice nucleation. The SBM, which was simplified for its suitable application in atmospheric models, uses the classical nucleation theory to describe the immersion/condensation freezing by dust in the mixed-phase cloud regime. Uncertain parameters (mean contact angle, standard deviation of contact angle probability distribution, and number of surface sites) in the SBM are constrained by fitting them to recent natural dust (Saharan dust) datasets. With the SBM in CAM5, we investigate the sensitivity of modeled cloud properties to the SBM parameters, and find significant seasonal and regional differences in the sensitivity among the three SBM parameters. Changes of mean contact angle and the number of surface sites lead to changes of cloud properties in Arctic in spring, which could be attributed to the transport of dust ice nuclei to this region. In winter, significant changes of cloud properties induced by these two parameters mainly occur in northern hemispheric mid-latitudes (e.g., East Asia). In comparison, no obvious changes of cloud properties caused by changes of standard deviation can be found in all the seasons. These results are valuable for understanding the heterogeneous ice nucleation behavior, and useful for guiding the future model developments.

  18. Immersion freezing by natural dust based on a soccer ball model with the Community Atmospheric Model version 5: climate effects

    International Nuclear Information System (INIS)

    Wang, Yong; Liu, Xiaohong

    2014-01-01

    We introduce a simplified version of the soccer ball model (SBM) developed by Niedermeier et al (2014 Geophys. Res. Lett. 41 736–741) into the Community Atmospheric Model version 5 (CAM5). This is the first time that the SBM has been used in an atmospheric model to parameterize heterogeneous ice nucleation. The SBM, which was simplified to make it suitable for application in atmospheric models, uses classical nucleation theory to describe immersion/condensation freezing by dust in the mixed-phase cloud regime. Uncertain parameters (mean contact angle, standard deviation of the contact angle probability distribution, and number of surface sites) in the SBM are constrained by fitting them to recent natural dust (Saharan dust) datasets. With the SBM in CAM5, we investigate the sensitivity of modeled cloud properties to the SBM parameters, and find significant seasonal and regional differences in the sensitivity among the three SBM parameters. Changes of the mean contact angle and the number of surface sites lead to changes of cloud properties in the Arctic in spring, which can be attributed to the transport of dust ice nuclei to this region. In winter, significant changes of cloud properties induced by these two parameters occur mainly in northern hemispheric mid-latitudes (e.g., East Asia). In comparison, no obvious changes of cloud properties caused by changes of the standard deviation are found in any season. These results are valuable for understanding heterogeneous ice nucleation behavior, and useful for guiding future model developments. (letter)

  19. Incorporating remote sensing-based ET estimates into the Community Land Model version 4.5

    Directory of Open Access Journals (Sweden)

    D. Wang

    2017-07-01

    Land surface models bear substantial biases in simulating surface water and energy budgets despite the continuous development and improvement of model parameterizations. To reduce model biases, Parr et al. (2015) proposed a method incorporating satellite-based evapotranspiration (ET) products into land surface models. Here we apply this bias correction method to the Community Land Model version 4.5 (CLM4.5) and test its performance over the conterminous US (CONUS). We first calibrate a relationship between the observational ET from the Global Land Evaporation Amsterdam Model (GLEAM) product and the model ET from CLM4.5, and assume that this relationship holds beyond the calibration period. During the validation or application period, a simulation using the default CLM4.5 (CLM) is conducted first, and its output is combined with the calibrated observational-vs.-model ET relationship to derive a corrected ET; an experiment (CLMET) is then conducted in which the model-generated ET is overwritten with the corrected ET. Using the observations of ET, runoff, and soil moisture content as benchmarks, we demonstrate that CLMET greatly improves the hydrological simulations over most of the CONUS, and the improvement is stronger in the eastern CONUS than the western CONUS and is strongest over the Southeast CONUS. For any specific region, the degree of the improvement depends on whether the relationship between observational and model ET remains time-invariant (a fundamental hypothesis of the Parr et al. (2015) method) and whether water is the limiting factor in places where ET is underestimated. While the bias correction method improves hydrological estimates without improving the physical parameterization of land surface models, results from this study do provide guidance for physically based model development effort.
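    As a rough illustration of the calibrate-then-overwrite idea summarized above, the sketch below fits a simple linear relationship between model ET and observational ET over a calibration period and applies it during an application period. The linear form, the variable names and the sample numbers are assumptions made for the example; the published method may calibrate the relationship differently (for instance per grid cell or per calendar month).

```python
import numpy as np

def calibrate(et_model_cal, et_obs_cal):
    """Fit observational ET (e.g. GLEAM) as a linear function of model ET
    over the calibration period: et_obs ~ a * et_model + b."""
    a, b = np.polyfit(et_model_cal, et_obs_cal, deg=1)
    return a, b

def correct(et_model, a, b):
    """Apply the calibrated relationship to model ET from the application
    period, assuming the relationship is time-invariant."""
    return np.clip(a * et_model + b, 0.0, None)   # ET cannot be negative

# Illustrative monthly ET series (mm/day); not real GLEAM or CLM4.5 output.
et_model_cal = np.array([1.2, 2.5, 3.8, 4.1, 3.0, 1.5])
et_obs_cal   = np.array([1.0, 2.1, 3.2, 3.6, 2.7, 1.3])
a, b = calibrate(et_model_cal, et_obs_cal)

et_model_run = np.array([1.4, 2.8, 4.0])
print(correct(et_model_run, a, b))   # corrected ET that would overwrite the model value
```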

  20. Statistical models for competing risk analysis

    International Nuclear Information System (INIS)

    Sather, H.N.

    1976-08-01

    Research results are presented on three new models with potential applications to competing risks problems. One section covers the basic statistical relationships underlying the subsequent competing risks model development. Another discusses the problem of comparing cause-specific risk structure by competing risks theory in two homogeneous populations, P1 and P2. Weibull models, which allow more generality than the Berkson and Elveback models, are studied for the effect of time on the hazard function. The use of concomitant information for modeling single-risk survival is extended to the multiple failure mode domain of competing risks. The model used to illustrate this methodology is a life table model which has constant hazards within pre-designated intervals of the time scale. Two parametric models for bivariate dependent competing risks, which provide interesting alternatives, are proposed and examined.

  1. Risk matrix model for rotating equipment

    Directory of Open Access Journals (Sweden)

    Wassan Rano Khan

    2014-07-01

    Different industries have various residual risk levels for their rotating equipment; accordingly, the failure occurrence rates and the associated failure consequence categories differ. A generalized risk matrix model is therefore developed in this study which can fit the various available risk matrix standards. This generalized risk matrix will be helpful for developing new risk matrices that fit the required risk assessment scenario for rotating equipment. A power generation system was taken as a case study. It was observed that eight subsystems were under risk. Only the vibration monitoring system fell into the high risk category, while the remaining seven subsystems fell into the serious and medium risk categories.
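    A minimal sketch of how a generalized likelihood-consequence risk matrix could be encoded is given below. The five-by-five scales and the category boundaries are placeholders and would need to be aligned with whichever risk matrix standard is being matched; they are not taken from the paper.

```python
# Illustrative likelihood x consequence lookup; boundaries are placeholders.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "frequent": 5}
CONSEQUENCE = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

def risk_category(likelihood: str, consequence: str) -> str:
    """Map a likelihood/consequence pair to a risk category via a score."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        return "high"
    if score >= 8:
        return "serious"
    if score >= 4:
        return "medium"
    return "low"

print(risk_category("likely", "major"))     # -> high
print(risk_category("possible", "minor"))   # -> medium
```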

  2. VALIDATION OF THE ASTER GLOBAL DIGITAL ELEVATION MODEL VERSION 3 OVER THE CONTERMINOUS UNITED STATES

    Directory of Open Access Journals (Sweden)

    D. Gesch

    2016-06-01

    The ASTER Global Digital Elevation Model Version 3 (GDEM v3) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009 and GDEM Version 2 (v2) in 2011. The absolute vertical accuracy of GDEM v3 was calculated by comparison with more than 23,000 independent reference geodetic ground control points from the U.S. National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v3 is 8.52 meters. This compares with the RMSE of 8.68 meters for GDEM v2. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v3 mean error of −1.20 meters reflects an overall negative bias in GDEM v3. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover type to provide insight into how GDEM v3 performs in various land surface conditions. While the RMSE varies little across cover types (6.92 to 9.25 meters), the mean error (bias) does appear to be affected by land cover type, ranging from −2.99 to +4.16 meters across 14 land cover classes. These results indicate that in areas where built or natural aboveground features are present, GDEM v3 is measuring elevations above the ground level, a condition noted in assessments of previous GDEM versions (v1 and v2) and an expected condition given the type of stereo-optical image data collected by ASTER. GDEM v3 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v3 has elevations that are higher in the canopy than SRTM. The overall validation effort also included an evaluation of the GDEM v3 water mask. In general, the number of distinct water polygons in GDEM v3 is much lower than the number in a reference land cover dataset, but the total areas compare much more closely.
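    The validation statistics quoted above (mean error and RMSE against geodetic control points, optionally segmented by land cover) are straightforward to compute; the sketch below shows one way to do so. The toy control-point values are invented for illustration and have no connection to the actual NGS reference data.

```python
import numpy as np

def vertical_accuracy(dem_elev, ref_elev, land_cover=None):
    """Mean error (bias) and RMSE of DEM elevations against control points,
    optionally segmented by land cover class."""
    err = np.asarray(dem_elev, dtype=float) - np.asarray(ref_elev, dtype=float)
    stats = {"all": (err.mean(), np.sqrt(np.mean(err ** 2)))}
    if land_cover is not None:
        cover = np.asarray(land_cover)
        for cls in np.unique(cover):
            e = err[cover == cls]
            stats[cls] = (e.mean(), np.sqrt(np.mean(e ** 2)))
    return stats

# Toy control points (meters); the real validation used >23,000 NGS points.
dem   = [102.1, 98.7, 250.3, 12.9]
ref   = [100.0, 99.5, 248.0, 14.0]
cover = ["forest", "open", "forest", "open"]
print(vertical_accuracy(dem, ref, cover))
```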

  3. Validation of the ASTER Global Digital Elevation Model version 3 over the conterminous United States

    Science.gov (United States)

    Gesch, Dean B.; Oimoen, Michael J.; Danielson, Jeffrey J.; Meyer, David; Halounova, L; Šafář, V.; Jiang, J.; Olešovská, H.; Dvořáček, P.; Holland, D.; Seredovich, V.A.; Muller, J.P.; Pattabhi Rama Rao, E.; Veenendaal, B.; Mu, L.; Zlatanova, S.; Oberst, J.; Yang, C.P.; Ban, Y.; Stylianidis, S.; Voženílek, V.; Vondráková, A.; Gartner, G.; Remondino, F.; Doytsher, Y.; Percivall, George; Schreier, G.; Dowman, I.; Streilein, A.; Ernst, J.

    2016-01-01

    The ASTER Global Digital Elevation Model Version 3 (GDEM v3) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009 and GDEM Version 2 (v2) in 2011. The absolute vertical accuracy of GDEM v3 was calculated by comparison with more than 23,000 independent reference geodetic ground control points from the U.S. National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v3 is 8.52 meters. This compares with the RMSE of 8.68 meters for GDEM v2. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v3 mean error of −1.20 meters reflects an overall negative bias in GDEM v3. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover type to provide insight into how GDEM v3 performs in various land surface conditions. While the RMSE varies little across cover types (6.92 to 9.25 meters), the mean error (bias) does appear to be affected by land cover type, ranging from −2.99 to +4.16 meters across 14 land cover classes. These results indicate that in areas where built or natural aboveground features are present, GDEM v3 is measuring elevations above the ground level, a condition noted in assessments of previous GDEM versions (v1 and v2) and an expected condition given the type of stereo-optical image data collected by ASTER. GDEM v3 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v3 has elevations that are higher in the canopy than SRTM. The overall validation effort also included an evaluation of the GDEM v3 water mask. In general, the number of distinct water polygons in GDEM v3 is much lower than the number in a reference land cover dataset, but the total areas compare much more closely.

  4. Community Practice Implementation of a Self-administered Version of PREMM1,2,6 to Assess Risk for Lynch Syndrome.

    Science.gov (United States)

    Luba, Daniel G; DiSario, James A; Rock, Colleen; Saraiya, Devki; Moyes, Kelsey; Brown, Krystal; Rushton, Kristen; Ogara, Maydeen M; Raphael, Mona; Zimmerman, Dayna; Garrido, Kimmie; Silguero, Evelyn; Nelson, Jonathan; Yurgelun, Matthew B; Kastrinos, Fay; Wenstrup, Richard J; Syngal, Sapna

    2018-01-01

    incorporation of PREMM1,2,6 into their clinical practice, and that they would continue using it to assess risk for Lynch syndrome. A patient self-administered version of the PREMM1,2,6 Lynch syndrome risk assessment model can be used systematically in community-based gastroenterology and endoscopy practices. Copyright © 2018 AGA Institute. Published by Elsevier Inc. All rights reserved.

  5. 78 FR 32224 - Availability of Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional...

    Science.gov (United States)

    2013-05-29

    ... Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional Discussion Topics in Connect America Cost Model Virtual Workshop AGENCY: Federal Communications Commission. ACTION: Proposed rule... America Cost Model (CAM v3.1.2), which allows Commission staff and interested parties to calculate costs...

  6. Version 2.0 of the European Gas Model. Changes and their impact on the German gas sector

    International Nuclear Information System (INIS)

    Balmert, David; Petrov, Konstantin

    2015-01-01

    In January 2015 ACER, the European Agency for the Cooperation of Energy Regulators, presented an updated version of its target model for the inner-European natural gas market, also referred to as version 2.0 of the Gas Target Model. During 2014 the existing model, originally developed by the Council of European Energy Regulators (CEER) and launched in 2011, had been analysed, revised and updated in preparation of the new version. While it has few surprises to offer, the new Gas Target Model specifies and goes into greater detail on many elements of the original model. Some of the new content is highly relevant to the German gas sector, not least the deliberations on the current key issues, which are security of supply and the ability of the gas markets to function.

  7. Hydrogeochemical evaluation for Simpevarp model version 1.2. Preliminary site description of the Simpevarp area

    International Nuclear Information System (INIS)

    Laaksoharju, Marcus

    2004-12-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involves the investigation of two locations, Simpevarp and Forsmark, to determine their geological, hydrogeochemical and hydrogeological characteristics. Present work completed has resulted in Model version 1.2 which represents the second evaluation of the available Simpevarp groundwater analytical data collected up to April, 2004. The deepest fracture groundwater samples with sufficient analytical data reflected depths down to 1.7 km. Model version 1.2 focusses on geochemical and mixing processes affecting the groundwater composition in the uppermost part of the bedrock, down to repository levels, and eventually extending to 1000 m depth. The groundwater flow regimes at Laxemar/Simpevarp are considered local and extend down to depths of around 600-1000 m depending on local topography. The marked differences in the groundwater flow regimes between Laxemar and Simpevarp are reflected in the groundwater chemistry where four major hydrochemical groups of groundwaters (types A-D) have been identified: TYPE A: This type comprises dilute groundwaters ( 3 type present at shallow ( 300 m) levels at Simpevarp, and at even greater depths (approx. 1200 m) at Laxemar. At Simpevarp the groundwaters are mainly Na-Ca-Cl with increasingly enhanced Br and SO 4 with depth. At Laxemar they are mainly Ca-Na-Cl also with increasing enhancements of Br and SO 4 with depth. Main reactions involve ion exchange (Ca). At both sites a glacial component and a deep saline component are present. At Simpevarp the saline component may be potentially non marine and/or non-marine/old Littorina marine in origin; at Laxemar it is more likely to be non-marine in origin. TYPE D: This type comprises reducing highly saline groundwaters (> 20 000 mg/L Cl; to a maximum of ∼70 g/L TDS) and only has been identified at Laxemar at depths exceeding 1200 m. It is mainly Ca-Na-Cl with higher Br but lower SO 4 compared

  8. Evaluation of a new CNRM-CM6 model version for seasonal climate predictions

    Science.gov (United States)

    Volpi, Danila; Ardilouze, Constantin; Batté, Lauriane; Dorel, Laurant; Guérémy, Jean-François; Déqué, Michel

    2017-04-01

    This work presents the quality assessment of a new version of the Météo-France coupled climate prediction system, which has been developed in the EU COPERNICUS Climate Change Services framework to carry out seasonal forecasts. The system is based on the CNRM-CM6 model, with Arpege-Surfex 6.2.2 as the atmosphere/land component and Nemo 3.2 as the ocean component, which has the sea-ice component Gelato 6.0 directly embedded. In order to obtain a robust diagnostic, the experiment is composed of 60 ensemble members generated with stochastic dynamic perturbations. The experiment has been performed over a 37-year re-forecast period from 1979 to 2015, with two start dates per year, on May 1st and November 1st. The evaluation of the predictive skill of the model is presented from two perspectives. On the one hand, the ability of the model to faithfully respond to positive or negative ENSO, NAO and QBO events is assessed, independently of the predictability of these events; this assessment is carried out through a composite analysis, and shows that the model succeeds in reproducing the main patterns of 2-meter temperature, precipitation and geopotential height at 500 hPa during the winter season. On the other hand, the model's predictive skill for the same events (positive and negative ENSO, NAO and QBO) is evaluated.

  9. A description of the FAMOUS (version XDBUA) climate model and control run

    Directory of Open Access Journals (Sweden)

    A. Osprey

    2008-12-01

    FAMOUS is an ocean-atmosphere general circulation model of low resolution, capable of simulating approximately 120 years of model climate per wallclock day using current high performance computing facilities. It uses most of the same code as HadCM3, a widely used climate model of higher resolution and computational cost, and has been tuned to reproduce the same climate reasonably well. FAMOUS is useful for climate simulations where the computational cost makes the application of HadCM3 unfeasible, either because of the length of simulation or the size of the ensemble desired. We document a number of scientific and technical improvements to the original version of FAMOUS. These improvements include changes to the parameterisations of ozone and sea-ice which alleviate a significant cold bias from high northern latitudes and the upper troposphere, and the elimination of volume-averaged drifts in ocean tracers. A simple model of the marine carbon cycle has also been included. A particular goal of FAMOUS is to conduct millennial-scale paleoclimate simulations of Quaternary ice ages; to this end, a number of useful changes to the model infrastructure have been made.

  10. Models for assessing and managing credit risk

    Directory of Open Access Journals (Sweden)

    Neogradi Slađana

    2014-01-01

    This essay deals with the definition of a model for assessing and managing credit risk. Risk is an inseparable component of any ordinary credit transaction. The essay looks at the different aspects of the identification and classification of risk in the banking industry, as well as the key components of modern risk management. The first part analyses the impact of credit risk on banks and empirical models for determining the financial difficulties in which a company may find itself; on the basis of these models, a bank can reduce the number of approved risky assets. The second part considers models for improving credit risk management, with emphasis on Basel I, II and III, and the third part concludes which model is most appropriate and gives the best results for measuring credit risk in domestic banks.

  11. Living with risk. A global review of disaster reduction initiatives. Preliminary version

    International Nuclear Information System (INIS)

    2002-01-01

    In recent years the world has witnessed an interminable succession of disasters - floods, storms, earthquakes, landslides, volcanic eruptions and wildfires that have claimed many thousands of lives, caused material losses in the tens of billions of dollars, and inflicted a terrible toll on developing countries in particular, where disasters divert attention and resources needed desperately to escape poverty. Communities will always face natural hazards, but today's disasters are often generated by, or at least exacerbated by, human activities. At the most dramatic level, human activities are changing the natural balance of the earth, interfering as never before with the atmosphere, the oceans, the polar ice caps, the forest cover and the natural pillars that make our world a livable home. But we are also putting ourselves in harm's way in less visible ways. At no time in human history have so many people lived in cities clustered around seismically active areas. Destitution and demographic pressure have led more people than ever before to live in flood plains or in areas prone to landslides. Poor land-use planning; environmental mismanagement; and a lack of regulatory mechanisms both increase the risk and exacerbate the effects of disasters. Living with risk: a global review of disaster reduction is the first comprehensive effort by the United Nations system to take stock of disaster reduction initiatives throughout the world. Coordinated by the secretariat of the International Strategy for Disaster Reduction (ISDR), the report discusses current disaster trends, assesses policies aimed at mitigating the impact of disasters, and offers examples of successful initiatives. It also recommends that risk reduction be integrated into sustainable development at all levels - global, national and local. Most of all, Living with risk shows that we are far from helpless in the face of natural hazards. Early warning and risk reduction measures have been important factors in

  12. Hydrogeochemical evaluation of the Forsmark site, model version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Laaksoharju, Marcus (ed.) [GeoPoint AB, Sollentuna (Sweden); Gimeno, Maria; Auque, Luis; Gomez, Javier [Univ. of Zaragoza (Spain). Dept. of Earth Sciences; Smellie, John [Conterra AB, Uppsala (Sweden); Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden); Gurban, Ioana [3D-Terra, Montreal (Canada)

    2004-01-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involves the investigation of two locations, Forsmark and Simpevarp, on the eastern coast of Sweden to determine their geological, geochemical and hydrogeological characteristics. Present work completed has resulted in model version 1.1 which represents the first evaluation of the available Forsmark groundwater analytical data collected up to May 1, 2003 (i.e. the first 'data freeze'). The HAG group had access to a total of 456 water samples collected mostly from the surface and sub-surface environment (e.g. soil pipes in the overburden, streams and lakes); only a few samples were collected from drilled boreholes. The deepest samples reflected depths down to 200 m. Furthermore, most of the waters sampled (74%) lacked crucial analytical information that restricted the evaluation. Consequently, model version 1.1 focussed on the processes taking place in the uppermost part of the bedrock rather than at repository levels. The complex groundwater evolution and patterns at Forsmark are a result of many factors such as: a) the flat topography and closeness to the Baltic Sea resulting in relative small hydrogeological driving forces which can preserve old water types from being flushed out, b) the changes in hydrogeology related to glaciation/deglaciation and land uplift, c) repeated marine/lake water regressions/transgressions, and d) organic or inorganic alteration of the groundwater caused by microbial processes or water/rock interactions. The sampled groundwaters reflect to various degrees modern or ancient water/rock interactions and mixing processes. Based on the general geochemical character and the apparent age two major water types occur in Forsmark: fresh-meteoric waters with a bicarbonate imprint and low residence times (tritium values above detection limit), and brackish-marine waters with Cl contents up to 6,000 mg/L and longer residence times (tritium

  13. Thermal modelling. Preliminary site description Simpevarp subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Back, Paer-Erik; Bengtsson, Anna; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)

    2005-08-15

    This report presents the thermal site descriptive model for the Simpevarp subarea, version 1.2. The main objective of this report is to present the thermal modelling work where data has been identified, quality controlled, evaluated and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at possible canister scale has been modelled for four different lithological domains (RSMA01 (Aevroe granite), RSMB01 (Fine-grained dioritoid), RSMC01 (mixture of Aevroe granite and Quartz monzodiorite), and RSMD01 (Quartz monzodiorite)). A main modelling approach has been used to determine the mean value of the thermal conductivity. Three alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological model for the Simpevarp subarea, version 1.2 together with rock type models constituted from measured and calculated (from mineral composition) thermal conductivities. For one rock type, the Aevroe granite (501044), density loggings within the specific rock type has also been used in the domain modelling in order to consider the spatial variability within the Aevroe granite. This has been possible due to the presented relationship between density and thermal conductivity, valid for the Aevroe granite. Results indicate that the mean of thermal conductivity is expected to exhibit only a small variation between the different domains, from 2.62 W/(m.K) to 2.80 W/(m.K). The standard deviation varies according to the scale considered and for the canister scale it is expected to range from 0.20 to 0.28 W/(m.K). Consequently, the lower confidence limit (95% confidence) for the canister scale is within the range 2.04-2.35 W/(m.K) for the different domains. The temperature dependence is rather small with a decrease in thermal conductivity of 1.1-3.4% per 100 deg C increase in temperature for the dominating rock

  14. Thermal modelling. Preliminary site description Simpevarp subarea - version 1.2

    International Nuclear Information System (INIS)

    Sundberg, Jan; Back, Paer-Erik; Bengtsson, Anna; Laendell, Maerta

    2005-08-01

    This report presents the thermal site descriptive model for the Simpevarp subarea, version 1.2. The main objective of this report is to present the thermal modelling work where data has been identified, quality controlled, evaluated and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at possible canister scale has been modelled for four different lithological domains (RSMA01 (Aevroe granite), RSMB01 (Fine-grained dioritoid), RSMC01 (mixture of Aevroe granite and Quartz monzodiorite), and RSMD01 (Quartz monzodiorite)). A main modelling approach has been used to determine the mean value of the thermal conductivity. Three alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological model for the Simpevarp subarea, version 1.2 together with rock type models constituted from measured and calculated (from mineral composition) thermal conductivities. For one rock type, the Aevroe granite (501044), density loggings within the specific rock type has also been used in the domain modelling in order to consider the spatial variability within the Aevroe granite. This has been possible due to the presented relationship between density and thermal conductivity, valid for the Aevroe granite. Results indicate that the mean of thermal conductivity is expected to exhibit only a small variation between the different domains, from 2.62 W/(m.K) to 2.80 W/(m.K). The standard deviation varies according to the scale considered and for the canister scale it is expected to range from 0.20 to 0.28 W/(m.K). Consequently, the lower confidence limit (95% confidence) for the canister scale is within the range 2.04-2.35 W/(m.K) for the different domains. The temperature dependence is rather small with a decrease in thermal conductivity of 1.1-3.4% per 100 deg C increase in temperature for the dominating rock

  15. Thermal modelling. Preliminary site description Laxemar subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Sundberg, Jan; Wrafter, John; Back, Paer-Erik; Laendell, Maerta [Geo Innova AB, Linkoeping (Sweden)

    2006-02-15

    This report presents the thermal site descriptive model for the Laxemar subarea, version 1.2. The main objective of this report is to present the thermal modelling work where data has been identified, quality controlled, evaluated and summarised in order to make an upscaling to lithological domain level possible. The thermal conductivity at canister scale has been modelled for five different lithological domains: RSMA (Aevroe granite), RSMBA (mixture of Aevroe granite and fine-grained dioritoid), RSMD (quartz monzodiorite), RSME (diorite/gabbro) and RSMM (mix domain with high frequency of diorite to gabbro). A base modelling approach has been used to determine the mean value of the thermal conductivity. Four alternative/complementary approaches have been used to evaluate the spatial variability of the thermal conductivity at domain level. The thermal modelling approaches are based on the lithological domain model for the Laxemar subarea, version 1.2 together with rock type models based on measured and calculated (from mineral composition) thermal conductivities. For one rock type, Aevroe granite (501044), density loggings have also been used in the domain modelling in order to evaluate the spatial variability within the Aevroe granite. This has been possible due to an established relationship between density and thermal conductivity, valid for the Aevroe granite. Results indicate that the means of thermal conductivity for the various domains are expected to exhibit a variation from 2.45 W/(m.K) to 2.87 W/(m.K). The standard deviation varies according to the scale considered, and for the 0.8 m scale it is expected to range from 0.17 to 0.29 W/(m.K). Estimates of lower tail percentiles for the same scale are presented for all five domains. The temperature dependence is rather small with a decrease in thermal conductivity of 1.1-5.3% per 100 deg C increase in temperature for the dominant rock types. There are a number of important uncertainties associated with these
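    The domain-level statistics reported in these thermal site-description records (mean, standard deviation and a lower confidence limit of thermal conductivity) can be illustrated with the simple calculation below. It assumes an approximately normal distribution of conductivity at the chosen scale and reports the 5% lower tail, which is a simplification; the SKB reports combine several complementary upscaling approaches, and the sample values here are hypothetical.

```python
import numpy as np
from scipy import stats

def domain_conductivity_stats(samples_w_per_mk, percentile=0.05):
    """Mean, standard deviation and a lower-tail percentile of thermal
    conductivity for a lithological domain, assuming an approximately
    normal distribution at the chosen scale (a simplifying assumption)."""
    k = np.asarray(samples_w_per_mk, dtype=float)
    mean, sd = k.mean(), k.std(ddof=1)
    lower = stats.norm.ppf(percentile, loc=mean, scale=sd)
    return mean, sd, lower

# Hypothetical canister-scale conductivity values for one domain, W/(m*K).
samples = [2.45, 2.78, 2.66, 2.91, 2.58, 2.73, 2.49, 2.84]
mean, sd, lower = domain_conductivity_stats(samples)
print(f"mean {mean:.2f}, sd {sd:.2f}, lower 5% limit {lower:.2f} W/(m*K)")
```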

  16. Solid waste projection model: Database user's guide (Version 1.0)

    International Nuclear Information System (INIS)

    Carr, F.; Stiles, D.

    1991-01-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for preparing to use Version 1 of the SWPM database, for entering and maintaining data, and for performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions, and does not provide instructions in the use of Paradox, the database management system in which the SWPM database is established. 3 figs., 1 tab

  17. Solid Waste Projection Model: Database user's guide (Version 1.3)

    International Nuclear Information System (INIS)

    Blackburn, C.L.

    1991-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC) specifically to address Hanford solid waste management issues. This document is one of a set of documents supporting the SWPM system and providing instructions in the use and maintenance of SWPM components. This manual contains instructions for preparing to use Version 1.3 of the SWPM database, for entering and maintaining data, and for performing routine database functions. This document supports only those operations which are specific to SWPM database menus and functions and does not provide instruction in the use of Paradox, the database management system in which the SWPM database is established

  18. HAM Construction modeling using COMSOL with MatLab Modeling Guide version 1.0.

    NARCIS (Netherlands)

    Schijndel, van A.W.M.

    2006-01-01

    This paper presents a first modeling guide for the modeling and simulation of up to full 3D dynamic Heat, Air & Moisture (HAM) transport of building constructions using COMSOL with Matlab. The modeling scripts are provided in the appendix. Furthermore, all modeling files and results are published at

  19. HAM Construction modeling using COMSOL with MatLab Modeling Guide, version 1.0

    NARCIS (Netherlands)

    Schijndel, van A.W.M.

    2006-01-01

    This paper presents a first modeling guide for the modeling and simulation of up to full 3D dynamic Heat, Air & Moisture (HAM) transport of building constructions using COMSOL with Matlab. The modeling scripts are provided in the appendix. Furthermore, all modeling files and results are published at

  20. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    In order to meet the liquidity, safety and profitability objectives of commercial banks, optimization decisions based on loan portfolio risk analysis support the rational allocation of assets. Risk analysis and asset allocation are key techniques of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. An optimization decision model for the loan portfolio rate of return with Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) constraints reflects the bank's risk tolerance and gives direct control over the bank's potential loss. The paper analyses a general risk management model applied to portfolio problems with VaR and CVaR risk measures using a Lagrangian algorithm, and solves this otherwise difficult problem by matrix operations. In this formulation the portfolio problem with VaR and CVaR risk measures is easily understood as a hyperbola in mean-standard deviation space, and the calculation in the proposed method is straightforward.
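    The abstract above describes CVaR-constrained loan portfolio optimization solved with a Lagrangian/matrix approach. As an alternative illustration of the same kind of problem, the sketch below minimizes portfolio CVaR over discrete scenarios using the widely used Rockafellar-Uryasev linear-programming formulation. The scenario returns, the confidence level and the return target are invented for the example, and this formulation is not necessarily the one used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

def min_cvar_weights(scenario_returns, beta=0.95, target_return=0.01):
    """Minimize portfolio CVaR_beta over asset weights with the
    Rockafellar-Uryasev LP: variables are (weights x, VaR estimate alpha,
    per-scenario shortfalls u)."""
    r = np.asarray(scenario_returns)          # shape (n_scenarios, n_assets)
    s, n = r.shape
    # objective: alpha + 1/((1-beta)*s) * sum(u)
    c = np.concatenate([np.zeros(n), [1.0], np.full(s, 1.0 / ((1 - beta) * s))])
    # shortfall constraints: -r_k.x - alpha - u_k <= 0 for every scenario k
    A_ub = np.hstack([-r, -np.ones((s, 1)), -np.eye(s)])
    b_ub = np.zeros(s)
    # expected-return constraint: mean(r).x >= target_return
    A_ub = np.vstack([A_ub, np.concatenate([-r.mean(axis=0), [0.0], np.zeros(s)])])
    b_ub = np.append(b_ub, -target_return)
    # weights sum to one, long-only
    A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(s)]).reshape(1, -1)
    bounds = [(0, 1)] * n + [(None, None)] + [(0, None)] * s
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
    return res.x[:n], res.fun                  # optimal weights and CVaR of losses

# Hypothetical returns of three loan pools over six scenarios.
scenarios = np.array([[ 0.02,  0.01,  0.03],
                      [ 0.01,  0.02, -0.01],
                      [-0.03,  0.00,  0.01],
                      [ 0.02, -0.02,  0.02],
                      [ 0.00,  0.01,  0.00],
                      [ 0.03,  0.02, -0.02]])
weights, cvar = min_cvar_weights(scenarios, beta=0.9, target_return=0.005)
print(weights.round(3), round(cvar, 4))
```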

  1. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    Energy Technology Data Exchange (ETDEWEB)

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one-year research grant was to extend the two-stage clonal expansion (TSCE) model of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute exposure modelling but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piecewise-constant-parameter two-stage clonal expansion model of carcinogenesis. For a single acute exposure, or for multiple exposures in an acute series, either only the first mutation rate is allowed to vary with dose, or all parameters are dose dependent; for multiple continuous exposures, all parameters are allowed to vary with dose. With these analytical functions it becomes easy to evaluate cancer risks and to deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures to the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple-exposure version of the TSCE model was to be used to estimate the beagle dog lung cancer risks. The mathematical equations of the multiple-exposure versions of the TSCE model were developed. A draft manuscript, which is attached, provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed due to the fact

  2. Development of Arabic version of Berlin questionnaire to identify obstructive sleep apnea at risk patients

    Directory of Open Access Journals (Sweden)

    Abdel Baset M Saleh

    2011-01-01

    Results: The study demonstrated a high degree of internal consistency and stability over time for the developed ABQ. The Cronbach's alpha coefficient for the 10-item tool was 0.92. Validation of the ABQ against AHI at a cutoff >5 revealed a sensitivity of 97%, a specificity of 90%, and positive and negative predictive values of 96% and 93%, respectively. Conclusion: The ABQ is a reliable and valid scale for screening patients for the risk of OSA among Arabic-speaking nations, especially in resource-limited settings.

  3. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions

    Directory of Open Access Journals (Sweden)

    Hucka Michael

    2015-06-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  4. Systems Biology Markup Language (SBML) Level 2 Version 5: Structures and Facilities for Model Definitions.

    Science.gov (United States)

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
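    A minimal example of building an SBML Level 2 Version 5 document with the python-libsbml bindings is sketched below. It assumes python-libsbml is installed; the model content (a single species with a first-order decay reaction) is invented for illustration, and the exact method names should be checked against the libsbml API documentation and the specification itself.

```python
# Sketch of an SBML Level 2 Version 5 model via python-libsbml
# (assumes `pip install python-libsbml`).
import libsbml

doc = libsbml.SBMLDocument(2, 5)          # Level 2, Version 5
model = doc.createModel()
model.setId("simple_decay")

comp = model.createCompartment()
comp.setId("cell")
comp.setSize(1.0)

species = model.createSpecies()
species.setId("X")
species.setCompartment("cell")
species.setInitialAmount(10.0)

rxn = model.createReaction()
rxn.setId("decay")
reactant = rxn.createReactant()
reactant.setSpecies("X")
law = rxn.createKineticLaw()
law.setMath(libsbml.parseFormula("0.1 * X"))   # first-order decay rate

print(libsbml.writeSBMLToString(doc))      # serialized XML document
doc.checkConsistency()                     # apply the SBML validation rules
```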

  5. Overview of the Meso-NH model version 5.4 and its applications

    Directory of Open Access Journals (Sweden)

    C. Lac

    2018-05-01

    This paper presents the Meso-NH model version 5.4. Meso-NH is an atmospheric non-hydrostatic research model that is applied to a broad range of resolutions, from synoptic to turbulent scales, and is designed for studies of physics and chemistry. It is a limited-area model employing advanced numerical techniques, including monotonic advection schemes for scalar transport and fourth-order centered or odd-order WENO advection schemes for momentum. The model includes state-of-the-art physics parameterization schemes that are important to represent convective-scale phenomena and turbulent eddies, as well as flows at larger scales. In addition, Meso-NH has been expanded to provide capabilities for a range of Earth system prediction applications such as chemistry and aerosols, electricity and lightning, hydrology, wildland fires, volcanic eruptions, and cyclones with ocean coupling. Here, we present the main innovations to the dynamics and physics of the code since the pioneering paper of Lafore et al. (1998) and provide an overview of recent applications and couplings.

  6. Conceptual Model of an Application for Automated Generation of Webpage Mobile Versions

    Directory of Open Access Journals (Sweden)

    Todor Rachovski

    2017-11-01

    Accessing webpages through various types of mobile devices with different screen sizes and using different browsers has put new demands on web developers. The main challenge is the development of websites with a responsive design that adapts to the mobile device used. The article presents a conceptual model of an application for the automated generation of mobile pages. It has a five-layer architecture: database, database management layer, business logic layer, web services layer and presentation layer. The database stores all the data needed to run the application. The database management layer uses an ORM model to convert relational data into an object-oriented format and to control access to them. The business logic layer contains components that perform the actual work of building a mobile version of the page, including parsing, building a hierarchical model of the page and a number of transformations. The web services layer provides external applications with access to lower-level functionalities, and the presentation layer is responsible for choosing and using the appropriate CSS. A web application that uses the proposed model was developed and experiments were conducted.

  7. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    International Nuclear Information System (INIS)

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system-level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc

  8. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system-level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  9. Versioning of printed products

    Science.gov (United States)

    Tuijn, Chris

    2005-01-01

    During the definition of a printed product in an MIS system, a lot of attention is paid to the production process. The MIS systems typically gather all process-related parameters at such a level of detail that they can determine what the exact cost will be to make a specific product. This information can then be used to make a quote for the customer. Considerably less attention is paid to the content of the products since this does not have an immediate impact on the production costs (assuming that the number of inks or plates is known in advance). The content management is typically carried out either by the prepress systems themselves or by dedicated workflow servers uniting all people that contribute to the manufacturing of a printed product. Special care must be taken when considering versioned products. With versioned products we here mean distinct products that have a number of pages or page layers in common. Typical examples are comic books that have to be printed in different languages. In this case, the color plates can be shared over the different versions and the black plate will be different. Other examples are nation-wide magazines or newspapers that have an area with regional pages or advertising leaflets in different languages or currencies. When considering versioned products, the content will become an important cost factor. First of all, the content management (and associated proofing and approval cycles) becomes much more complex and, therefore, the risk that mistakes will be made increases considerably. Secondly, the real production costs are very much content-dependent because the content will determine whether plates can be shared across different versions or not and how many press runs will be needed. In this paper, we will present a way to manage different versions of a printed product. First, we will introduce a data model for version management. Next, we will show how the content of the different versions can be supplied by the customer
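    One possible reading of the version/data model discussed above is sketched below: versions of a product are treated as sets of plates, plates shared by every version can be imaged and run once, and version-specific plates (for example the black text plate per language) are counted separately. The class names and the two-language comic example are illustrative assumptions, not the data model proposed in the paper.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Plate:
    page: int
    separation: str          # e.g. "C", "M", "Y", or a version-specific "K"

@dataclass
class Version:
    name: str                # e.g. language or region
    plates: set = field(default_factory=set)

def plate_runs(versions):
    """Split plates into those shared by every version (imaged once) and
    those specific to a single version (imaged per version)."""
    all_sets = [v.plates for v in versions]
    shared = set.intersection(*all_sets)
    specific = {v.name: v.plates - shared for v in versions}
    return shared, specific

# Hypothetical 2-page comic printed in two languages: colour plates are
# common to both versions, the black (text) plate differs per language.
common = {Plate(p, s) for p in (1, 2) for s in ("C", "M", "Y")}
en = Version("EN", common | {Plate(p, "K-EN") for p in (1, 2)})
fr = Version("FR", common | {Plate(p, "K-FR") for p in (1, 2)})
shared, specific = plate_runs([en, fr])
print(len(shared), {name: len(plates) for name, plates in specific.items()})  # 6 {'EN': 2, 'FR': 2}
```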

  10. Simulated pre-industrial climate in Bergen Climate Model (version 2): model description and large-scale circulation features

    Directory of Open Access Journals (Sweden)

    O. H. Otterå

    2009-11-01

    The Bergen Climate Model (BCM) is a fully-coupled atmosphere-ocean-sea-ice model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate. Here, a pre-industrial multi-century simulation with an updated version of BCM is described and compared to observational data. The model is run without any form of flux adjustments and is stable for several centuries. The simulated climate reproduces the general large-scale circulation in the atmosphere reasonably well, except for a positive bias in the high latitude sea level pressure distribution. Also, by introducing an updated turbulence scheme in the atmosphere model a persistent cold bias has been eliminated. For the ocean part, the model drifts in sea surface temperatures and salinities are considerably reduced compared to earlier versions of BCM. Improved conservation properties in the ocean model have contributed to this. Furthermore, by choosing a reference pressure at 2000 m and including thermobaric effects in the ocean model, a more realistic meridional overturning circulation is simulated in the Atlantic Ocean. The simulated sea-ice extent in the Northern Hemisphere is in general agreement with observational data except for summer where the extent is somewhat underestimated. In the Southern Hemisphere, large negative biases are found in the simulated sea-ice extent. This is partly related to problems with the mixed layer parametrization, causing the mixed layer in the Southern Ocean to be too deep, which in turn makes it hard to maintain a realistic sea-ice cover here. However, despite some problematic issues, the pre-industrial control simulation presented here should still be appropriate for climate change studies requiring multi-century simulations.

  11. MODELING CREDIT RISK THROUGH CREDIT SCORING

    OpenAIRE

    Adrian Cantemir CALIN; Oana Cristina POPOVICI

    2014-01-01

    Credit risk governs all financial transactions and it is defined as the risk of suffering a loss due to certain shifts in the credit quality of a counterpart. Credit risk literature gravitates around two main modeling approaches: the structural approach and the reduced form approach. In addition to these perspectives, credit risk assessment has been conducted through a series of techniques such as credit scoring models, which form the traditional approach. This paper examines the evolution of...

  12. An overview of gambling disorder: from treatment approaches to risk factors [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    José M Menchon

    2018-04-01

    Gambling disorder (GD) has been reclassified recently into the “Substance-Related and Addictive Disorders” category of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), a landmark occurrence for a behavioral addiction. GD is characterized by recurrent, maladaptive gambling behavior that results in clinically significant distress. Although the number of randomized controlled trials assessing the effectiveness of pharmacological treatments is limited, some pharmacological treatments, notably opiate antagonists, have been employed in the treatment of GD. Patients with GD often present cognitive distortions and specific personality traits, making treatment more difficult. Cognitive behavioral therapy has become the most common psychological intervention for treating gambling problems, and it is effective in reducing gambling behavior. In this brief overview, we provide a report on the state of pharmacological and psychological treatments for gambling disorder. Risk factors and potential future lines of research are addressed.

  13. VALIDATION OF THE ASTER GLOBAL DIGITAL ELEVATION MODEL VERSION 2 OVER THE CONTERMINOUS UNITED STATES

    Directory of Open Access Journals (Sweden)

    D. Gesch

    2012-07-01

    The ASTER Global Digital Elevation Model Version 2 (GDEM v2) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009. The absolute vertical accuracy of GDEM v2 was calculated by comparison with more than 18,000 independent reference geodetic ground control points from the National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v2 is 8.68 meters. This compares with the RMSE of 9.34 meters for GDEM v1. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v2 mean error of –0.20 meters is a significant improvement over the GDEM v1 mean error of –3.69 meters. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover to examine the effects of cover types on measured errors. The GDEM v2 mean errors by land cover class verify that the presence of aboveground features (tree canopies and built structures) cause a positive elevation bias, as would be expected for an imaging system like ASTER. In open ground classes (little or no vegetation with significant aboveground height), GDEM v2 exhibits a negative bias on the order of 1 meter. GDEM v2 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v2 has elevations that are higher in the canopy than SRTM.

  14. Validation of the ASTER Global Digital Elevation Model Version 2 over the conterminous United States

    Science.gov (United States)

    Gesch, Dean B.; Oimoen, Michael J.; Zhang, Zheng; Meyer, David J.; Danielson, Jeffrey J.

    2012-01-01

    The ASTER Global Digital Elevation Model Version 2 (GDEM v2) was evaluated over the conterminous United States in a manner similar to the validation conducted for the original GDEM Version 1 (v1) in 2009. The absolute vertical accuracy of GDEM v2 was calculated by comparison with more than 18,000 independent reference geodetic ground control points from the National Geodetic Survey. The root mean square error (RMSE) measured for GDEM v2 is 8.68 meters. This compares with the RMSE of 9.34 meters for GDEM v1. Another important descriptor of vertical accuracy is the mean error, or bias, which indicates if a DEM has an overall vertical offset from true ground level. The GDEM v2 mean error of -0.20 meters is a significant improvement over the GDEM v1 mean error of -3.69 meters. The absolute vertical accuracy assessment results, both mean error and RMSE, were segmented by land cover to examine the effects of cover types on measured errors. The GDEM v2 mean errors by land cover class verify that the presence of aboveground features (tree canopies and built structures) cause a positive elevation bias, as would be expected for an imaging system like ASTER. In open ground classes (little or no vegetation with significant aboveground height), GDEM v2 exhibits a negative bias on the order of 1 meter. GDEM v2 was also evaluated by differencing with the Shuttle Radar Topography Mission (SRTM) dataset. In many forested areas, GDEM v2 has elevations that are higher in the canopy than SRTM.

  15. A generic method for automatic translation between input models for different versions of simulation codes

    International Nuclear Information System (INIS)

    Serfontein, Dawid E.; Mulder, Eben J.; Reitsma, Frederik

    2014-01-01

    A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as those for the VSOP codes, are often very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. The task of, for instance, nuclear regulators to verify the accuracy of such translated files can therefore be very difficult and cumbersome. This may cause translation errors not to be picked up, which may have disastrous consequences later on when a reactor with such a faulty design is built. A generic algorithm for producing such automatic translation codes may therefore ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications

  16. A generic method for automatic translation between input models for different versions of simulation codes

    Energy Technology Data Exchange (ETDEWEB)

    Serfontein, Dawid E., E-mail: Dawid.Serfontein@nwu.ac.za [School of Mechanical and Nuclear Engineering, North West University (PUK-Campus), PRIVATE BAG X6001 (Internal Post Box 360), Potchefstroom 2520 (South Africa); Mulder, Eben J. [School of Mechanical and Nuclear Engineering, North West University (South Africa); Reitsma, Frederik [Calvera Consultants (South Africa)

    2014-05-01

    A computer code was developed for the semi-automatic translation of input models for the VSOP-A diffusion neutronics simulation code to the format of the newer VSOP 99/05 code. In this paper, this algorithm is presented as a generic method for producing codes for the automatic translation of input models from the format of one code version to another, or even to that of a completely different code. Normally, such translations are done manually. However, input model files, such as for the VSOP codes, often are very large and may consist of many thousands of numeric entries that make no particular sense to the human eye. Therefore the task of, for instance, nuclear regulators to verify the accuracy of such translated files can be very difficult and cumbersome. This may cause translation errors not to be picked up, which may have disastrous consequences later on when a reactor with such a faulty design is built. Therefore a generic algorithm for producing such automatic translation codes may ease the translation and verification process to a great extent. It will also remove human error from the process, which may significantly enhance the accuracy and reliability of the process. The developed algorithm also automatically creates a verification log file which permanently records the names and values of each variable used, as well as the list of meanings of all the possible values. This should greatly facilitate reactor licensing applications.

  17. RALOC Mod 1/81: Program description of the RALOC version extended by the structural heat model HECU

    International Nuclear Information System (INIS)

    Pham, V.T.

    1984-01-01

    In the version RALOC-Mod 1/81 an expanded heat transfer model and a structural heat model are included. This feature allows for a realistic simulation of the thermodynamic and fluid-dynamic characteristics of the containment atmosphere. Steel and concrete substructures with a planar or rotational symmetry can be represented. The heat transfer calculations for the structures are problem oriented, taking into account the time and space dependencies. The influence of the heat transfer on the gas transport (in particular convection) in the reactor vessel is demonstrated by the numerical calculations. In contrast to the calculations without a simulation of the heat storage effects of the containment structures, which show a largely homogeneous hydrogen distribution, the results based on the HECU model give an inhomogeneous distribution during the first 8 to 12 days. However, these results are only examples of the application of the RALOC-Mod 1/81 code and are not intended to contribute to the discussion of hydrogen distributions in a PWR-type reactor. (orig./GL) [de

  18. A comparison of models for risk assessment

    International Nuclear Information System (INIS)

    Kellerer, A.M.; Jing Chen

    1993-01-01

    Various mathematical models have been used to represent the dependence of excess cancer risk on dose, age and time since exposure. For solid cancers, i.e. all cancers except leukaemia, the so-called relative risk model is usually employed. However, there can be quite different relative risk models. The most usual model for the quantification of the excess tumour rate among the atomic bomb survivors has been a dependence of the relative risk on age at exposure, but it has been shown recently that an age-attained model can be applied equally well to represent the same observations. The differences between the models and their implications are explained. It is also shown that the age-attained model is similar to the approaches that have been used in the analysis of lung cancer incidence among radon-exposed miners. A more unified approach to the modelling of radiation risks can thus be achieved. (3 figs.)
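
    Schematically, the two model families discussed above can be written as follows (illustrative parameterizations, not the exact forms fitted to the survivor data), with the baseline cancer rate lambda_0, dose d, age at exposure e and attained age a:

    ```latex
    % Illustrative excess-relative-risk parameterizations of the two model families.
    \begin{align*}
      \text{age-at-exposure model:}\quad
        \lambda(a, e, d) &= \lambda_0(a)\,\bigl[1 + \beta\, d\, \exp(\gamma\, e)\bigr],\\
      \text{age-attained model:}\quad
        \lambda(a, d)    &= \lambda_0(a)\,\bigl[1 + \beta\, d\, a^{\kappa}\bigr].
    \end{align*}
    ```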

  19. Structure function of holographic quark-gluon plasma: Sakai-Sugimoto model versus its noncritical version

    International Nuclear Information System (INIS)

    Bu Yanyan; Yang Jinmin

    2011-01-01

    Motivated by recent studies of deep inelastic scattering off the N=4 super-Yang-Mills (SYM) plasma, holographically dual to an $AdS_5 \times S^5$ black hole, we use the spacelike flavor current to probe the internal structure of a holographic quark-gluon plasma, which is described by the Sakai-Sugimoto model in the high-temperature phase (i.e., the chiral-symmetric phase). The plasma structure function is extracted from the retarded flavor current-current correlator. Our main aim in this paper is to explore the effect of nonconformality on these physical quantities. As usual, our study is under the supergravity approximation and the limit of large color number. Although the Sakai-Sugimoto model is nonconformal, which makes the calculations more involved than the well-studied N=4 SYM case, the result seems to indicate that the nonconformality has little essential effect on the physical picture of the internal structure of holographic plasma, which is consistent with the intuition from the asymptotic freedom of QCD at high energy. While the physical picture underlying our investigation is the same as the deep inelastic scattering off the N=4 SYM plasma with(out) flavor, the plasma structure functions are quantitatively different, especially their scaling dependence on the temperature, which can be recognized as model dependent. As a comparison, we also do the same analysis for the noncritical version of the Sakai-Sugimoto model, which is conformal in the sense that it has a constant dilaton vacuum. The result for this noncritical model is quite similar to the conformal N=4 SYM plasma. We therefore attribute the above difference to the effect of nonconformality of the Sakai-Sugimoto model.

  20. Competing Risks and Multistate Models with R

    CERN Document Server

    Beyersmann, Jan; Schumacher, Martin

    2012-01-01

    This book covers competing risks and multistate models, sometimes summarized as event history analysis. These models generalize the analysis of time to a single event (survival analysis) to analysing the timing of distinct terminal events (competing risks) and possible intermediate events (multistate models). Both R and multistate methods are promoted with a focus on nonparametric methods.
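
    For reference, the two standard quantities estimated in such analyses are the cause-specific hazard and the cumulative incidence function, written below for event type k with event time T and overall survival function S:

    ```latex
    % Cause-specific hazard and cumulative incidence function (CIF) for event type k.
    \begin{align*}
      \alpha_k(t) &= \lim_{\Delta t \to 0}
          \frac{P(t \le T < t+\Delta t,\ \text{cause}=k \mid T \ge t)}{\Delta t}, \\
      F_k(t) &= P(T \le t,\ \text{cause}=k)
              = \int_0^t S(u^-)\,\alpha_k(u)\,du,
      \qquad S(t) = \exp\Bigl(-\sum_{j}\int_0^t \alpha_j(u)\,du\Bigr).
    \end{align*}
    ```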

  1. Modeling Research Project Risks with Fuzzy Maps

    Science.gov (United States)

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…

  2. Chronic pancreatitis: review and update of etiology, risk factors, and management [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Angela Pham

    2018-05-01

    Full Text Available Chronic pancreatitis is a syndrome involving inflammation, fibrosis, and loss of acinar and islet cells which can manifest in unrelenting abdominal pain, malnutrition, and exocrine and endocrine insufficiency. The Toxic-Metabolic, Idiopathic, Genetic, Autoimmune, Recurrent and Severe Acute Pancreatitis, Obstructive (TIGAR-O classification system categorizes known causes and factors that contribute to chronic pancreatitis. Although determining disease etiology provides a framework for focused and specific treatments, chronic pancreatitis remains a challenging condition to treat owing to the often refractory, centrally mediated pain and the lack of consensus regarding when endoscopic therapy and surgery are indicated. Further complications incurred include both exocrine and endocrine pancreatic insufficiency, pseudocyst formation, bile duct obstruction, and pancreatic cancer. Medical treatment of chronic pancreatitis involves controlling pain, addressing malnutrition via the treatment of vitamin and mineral deficiencies and recognizing the risk of osteoporosis, and administering appropriate pancreatic enzyme supplementation and diabetic agents. Cornerstones in treatment include the recognition of pancreatic exocrine insufficiency and administration of pancreatic enzyme replacement therapy, support to cease smoking and alcohol consumption, consultation with a dietitian, and a systematic follow-up to assure optimal treatment effect.

  3. Risk Modelling for Passages in Approach Channel

    Directory of Open Access Journals (Sweden)

    Leszek Smolarek

    2013-01-01

    Full Text Available Methods of multivariate statistics, stochastic processes, and simulation are used to identify and assess the risk measures. This paper presents the use of generalized linear models and Markov models to study risks to ships along the approach channel. These models, combined with simulation testing, are used to determine the time required for continuous monitoring of endangered objects or the period at which the level of risk should be verified.

  4. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to the advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories: initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose.

  5. Fetal heart rate abnormalities during and after external cephalic version: Which fetuses are at risk and how are they delivered?

    Science.gov (United States)

    Kuppens, Simone M; Smailbegovic, Ida; Houterman, Saskia; de Leeuw, Ingrid; Hasaart, Tom H

    2017-10-17

    Fetal heart rate abnormalities (FHR) during and after external cephalic version (ECV) are relatively frequent. They may raise concern about fetal wellbeing. Only occasionally they may lead to an emergency cesarean section. Prospective cohort study in 980 women (> 34 weeks gestation) with a singleton fetus in breech presentation. During and after external cephalic version (ECV) FHR abnormalities were recorded. Obstetric variables and delivery outcome were evaluated. Primary outcome was to identify which fetuses are at risk for FHR abnormalities. Secondary outcome was to identify a possible relationship between FHR abnormalities during and after ECV and mode of delivery and fetal distress during subsequent labor. The overall success rate of ECV was 60% and in 9% of the attempts there was an abnormal FHR pattern. In two cases FHR abnormalities after ECV led to an emergency CS. Estimated fetal weight per 100 g (OR 0.90, CI: 0.87-0.94) and longer duration of the ECV-procedure (OR 1.13, CI: 1.05-1.21) were factors significantly associated with the occurrence of FHR abnormalities. FHR abnormalities were not associated with the mode of delivery or the occurrence of fetal distress during subsequent labor. FHR abnormalities during and after ECV are more frequent with lower estimated fetal weight and longer duration of the procedure. FHR abnormalities during and after ECV have no consequences for subsequent mode of delivery. They do not predict whether fetal distress will occur during labor. The Eindhoven Breech Intervention Study, NCT00516555 . Date of registration: August 13, 2007.

  6. The Extrapolar SWIFT model (version 1.0): fast stratospheric ozone chemistry for global climate models

    Science.gov (United States)

    Kreyling, Daniel; Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2018-03-01

    The Extrapolar SWIFT model is a fast ozone chemistry scheme for interactive calculation of the extrapolar stratospheric ozone layer in coupled general circulation models (GCMs). In contrast to the widely used prescribed ozone, the SWIFT ozone layer interacts with the model dynamics and can respond to atmospheric variability or climatological trends.The Extrapolar SWIFT model employs a repro-modelling approach, in which algebraic functions are used to approximate the numerical output of a full stratospheric chemistry and transport model (ATLAS). The full model solves a coupled chemical differential equation system with 55 initial and boundary conditions (mixing ratio of various chemical species and atmospheric parameters). Hence the rate of change of ozone over 24 h is a function of 55 variables. Using covariances between these variables, we can find linear combinations in order to reduce the parameter space to the following nine basic variables: latitude, pressure altitude, temperature, overhead ozone column and the mixing ratio of ozone and of the ozone-depleting families (Cly, Bry, NOy and HOy). We will show that these nine variables are sufficient to characterize the rate of change of ozone. An automated procedure fits a polynomial function of fourth degree to the rate of change of ozone obtained from several simulations with the ATLAS model. One polynomial function is determined per month, which yields the rate of change of ozone over 24 h. A key aspect for the robustness of the Extrapolar SWIFT model is to include a wide range of stratospheric variability in the numerical output of the ATLAS model, also covering atmospheric states that will occur in a future climate (e.g. temperature and meridional circulation changes or reduction of stratospheric chlorine loading).For validation purposes, the Extrapolar SWIFT model has been integrated into the ATLAS model, replacing the full stratospheric chemistry scheme. Simulations with SWIFT in ATLAS have proven that the
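
    The repro-modelling step described above can be sketched in a few lines: sample the full model, then fit a fourth-degree polynomial surrogate to the 24-hour ozone tendency. The example below is purely illustrative; it uses three synthetic inputs in place of SWIFT's nine variables and scikit-learn in place of the authors' fitting procedure.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Toy repro-model: fit a 4th-degree polynomial to samples of a tendency function
    # d(ozone)/dt = f(inputs); three scaled inputs stand in for the nine SWIFT variables.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(5000, 3))                        # e.g. scaled T, O3, Cly
    y = 0.3 * X[:, 0] - 0.8 * X[:, 1] ** 2 + 0.5 * X[:, 0] * X[:, 2]  # synthetic "full model" output

    surrogate = make_pipeline(PolynomialFeatures(degree=4), LinearRegression())
    surrogate.fit(X, y)                                               # one such fit per calendar month
    print("R^2 on the training sample:", surrogate.score(X, y))
    ```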

  7. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated the use of impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African University in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of the occurrence of risk (Nicholas risk model), was that to have a brighter risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire University. On the Bayesian analysis, thus the third finding, the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (students and infrastructure based) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in the HEI (University) fell within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEI.
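
    A generic reading of the likelihood/impact result can be illustrated as follows (this is not the study's model): risk exposure is taken as likelihood times impact, and the likelihood is updated from observed incident counts with a simple Beta-Binomial step. All numbers are hypothetical.

    ```python
    # Generic illustration: risk exposure = likelihood x impact, with the likelihood
    # updated from observed incidents via a Beta-Binomial Bayesian step.
    def risk_exposure(likelihood: float, impact: float) -> float:
        return likelihood * impact

    def update_likelihood(prior_a: float, prior_b: float, incidents: int, trials: int) -> float:
        """Posterior mean of a Beta(a, b) prior after observing incidents out of trials."""
        return (prior_a + incidents) / (prior_a + prior_b + trials)

    p = update_likelihood(prior_a=1, prior_b=9, incidents=3, trials=12)  # hypothetical data
    exposure = risk_exposure(p, impact=500_000)                          # hypothetical monetary impact
    print(f"posterior likelihood = {p:.2f}, expected exposure = {exposure:,.0f}")
    ```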

  8. The Validity and Reliability of the Turkish Version of the Neonatal Skin Risk Assessment Scale.

    Science.gov (United States)

    Sari, Çiğdem; Altay, Naime

    2017-03-01

    The study created a Turkish translation of the Neonatal Skin Risk Assessment Scale (NSRAS) that was developed by Huffines and Logsdon in 1997. Study authors used a cross-sectional survey design in order to determine the validity and reliability of the Turkish translation. The study was conducted at the neonatal intensive care unit of a university hospital in Ankara between March 15 and June 30, 2014. The research sample included 130 neonatal assessments from 17 patients. Data were collected by questionnaire regarding the characteristics of the participating neonates, 7 nurse observers, and the NSRAS and its subarticles. After translation and back-translation were performed to assess language validity of the scale, necessary corrections were made in line with expert suggestions, and content validity was ensured. Internal consistency of the scale was assessed by its homogeneity, Cronbach's α, and subarticle-general scale grade correlation. Cronbach's α for the scale overall was .88, and Cronbach's α values for the subarticles were between .83 and .90. Results showed a positive relationship among all the subarticles and the overall NSRAS scale grade (P < .05). Kaiser-Meyer-Olkin analysis was applied for sample sufficiency, and Bartlett test analysis was applied in order to assess the factor analysis of the sample. The Kaiser-Meyer-Olkin coefficient was 0.73, and the χ² value found according to the Bartlett test was statistically significant at an advanced level (P < .05). In the 6 subarticles of the scale and in the general scale total grade, a high, positive, and significant relationship among the grades given by the researcher and the nurse observers was found (P < .05). The Turkish NSRAS is reliable and valid.
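
    Cronbach's α itself is simple to compute from an observations-by-items score matrix, as in the sketch below; the demo ratings are hypothetical and unrelated to the study data.

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for an (observations x items) score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_var_sum = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var_sum / total_var)

    # Hypothetical ratings: 6 assessments x 6 subscale items (scored 1-4)
    demo = np.array([[3, 3, 4, 2, 3, 3],
                     [1, 2, 1, 1, 2, 1],
                     [4, 4, 3, 4, 4, 4],
                     [2, 2, 2, 3, 2, 2],
                     [3, 4, 4, 3, 3, 4],
                     [1, 1, 2, 1, 1, 2]])
    print(f"alpha = {cronbach_alpha(demo):.2f}")
    ```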

  9. Ariadne version 4 - a program for simulation of QCD cascades implementing the colour dipole model

    International Nuclear Information System (INIS)

    Loennblad, L.

    1992-01-01

    The fourth version of the Ariadne program for generating QCD cascades in the colour dipole approximation is presented. The underlying physics issues are discussed and a manual for using the program is given together with a few sample programs. The major changes from previous versions are the introduction of photon radiation from quarks and inclusion of interfaces to the LEPTO and PYTHIA programs. (orig.)

  10. Predictive Accuracy of Violence Risk Scale-Sexual Offender Version Risk and Change Scores in Treated Canadian Aboriginal and Non-Aboriginal Sexual Offenders.

    Science.gov (United States)

    Olver, Mark E; Sowden, Justina N; Kingston, Drew A; Nicholaichuk, Terry P; Gordon, Audrey; Beggs Christofferson, Sarah M; Wong, Stephen C P

    2018-04-01

    The present study examined the predictive properties of Violence Risk Scale-Sexual Offender version (VRS-SO) risk and change scores among Aboriginal and non-Aboriginal sexual offenders in a combined sample of 1,063 Canadian federally incarcerated men. All men participated in sexual offender treatment programming through the Correctional Service of Canada (CSC) at sites across its five regions. The Static-99R was also examined for comparison purposes. In total, 393 of the men were identified as Aboriginal (i.e., First Nations, Métis, Circumpolar) while 670 were non-Aboriginal and primarily White. Aboriginal men scored significantly higher on the Static-99R and VRS-SO and had higher rates of sexual and violent recidivism; however, there were no significant differences between Aboriginal and non-Aboriginal groups on treatment change with both groups demonstrating close to a half-standard deviation of change pre and post treatment. VRS-SO risk and change scores significantly predicted sexual and violent recidivism over fixed 5- and 10-year follow-ups for both racial/ancestral groups. Cox regression survival analyses also demonstrated positive treatment changes to be significantly associated with reductions in sexual and violent recidivism among Aboriginal and non-Aboriginal men after controlling baseline risk. A series of follow-up Cox regression analyses demonstrated that risk and change score information accounted for much of the observed differences between Aboriginal and non-Aboriginal men in rates of sexual recidivism; however, marked group differences persisted in rates of general violent recidivism even after controlling for these covariates. The results support the predictive properties of VRS-SO risk and change scores with treated Canadian Aboriginal sexual offenders.

  11. Korean risk assessment model for breast cancer risk prediction.

    Science.gov (United States)

    Park, Boyoung; Ma, Seung Hyun; Shin, Aesun; Chang, Myung-Chul; Choi, Ji-Yeob; Kim, Sungwan; Han, Wonshik; Noh, Dong-Young; Ahn, Sei-Hyun; Kang, Daehee; Yoo, Keun-Young; Park, Sue K

    2013-01-01

    We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT) based upon equations developed for the Gail model for predicting breast cancer risk. Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risk produced using the modified Gail model, which applied Korean incidence and mortality data and the parameter estimators from the original Gail model, with those produced using the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and National Cancer Center (NCC) cohort. The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risk for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risk for the cases than for the controls, suggesting that the KoBCRAT is more appropriate for predicting breast cancer risk in Korean women, especially urban women.

  12. Energy Integration for 2050 - A Strategic Impact Model (2050 SIM), Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    2010-10-01

    The United States (U.S.) energy infrastructure is among the most reliable, accessible, and economic in the world. On the other hand, it is also excessively reliant on foreign energy sources, experiences high volatility in energy prices, does not always practice good stewardship of finite indigenous energy resources, and emits significant quantities of greenhouse gas. The U.S. Department of Energy is conducting research and development on advanced nuclear reactor concepts and technologies, including High Temperature Gas Reactor (HTGR) technologies, directed at helping the United States meet its current and future energy challenges. This report discusses the Draft Strategic Impact Model (SIM), an initial version of which was created during the later part of FY-2010. SIM was developed to analyze and depict the benefits of various energy sources in meeting the energy demand and to provide an overall system understanding of the tradeoffs between building and using HTGRs versus other existing technologies for providing energy (heat and electricity) to various energy-use sectors in the United States. This report also provides the assumptions used in the model, the rationale for the methodology, and the references for the source documentation and source data used in developing the SIM.

  13. Energy Integration for 2050 - A Strategic Impact Model (2050 SIM), Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    John Collins

    2011-09-01

    The United States (U.S.) energy infrastructure is among the most reliable, accessible, and economic in the world. On the other hand, it is also excessively reliant on foreign energy sources, experiences high volatility in energy prices, does not always practice good stewardship of finite indigenous energy resources, and emits significant quantities of greenhouse gas. The U.S. Department of Energy is conducting research and development on advanced nuclear reactor concepts and technologies, including High Temperature Gas Reactor (HTGR) technologies, directed at helping the United States meet its current and future energy challenges. This report discusses the Draft Strategic Impact Model (SIM), an initial version of which was created during the later part of FY-2010. SIM was developed to analyze and depict the benefits of various energy sources in meeting the energy demand and to provide an overall system understanding of the tradeoffs between building and using HTGRs versus other existing technologies for providing energy (heat and electricity) to various energy-use sectors in the United States. This report also provides the assumptions used in the model, the rationale for the methodology, and the references for the source documentation and source data used in developing the SIM.

  14. Planar version of the CPT-even gauge sector of the standard model extension

    International Nuclear Information System (INIS)

    Ferreira Junior, Manoel M.; Casana, Rodolfo; Gomes, Adalto Rodrigues; Carvalho, Eduardo S.

    2011-01-01

    The CPT-even abelian gauge sector of the Standard Model Extension is represented by the Maxwell term supplemented by $(K_F)_{\mu\nu\rho\sigma}F^{\mu\nu}F^{\rho\sigma}$, where the Lorentz-violating background tensor, $(K_F)_{\mu\nu\rho\sigma}$, possesses the symmetries of the Riemann tensor and a double null trace, which renders nineteen independent components. Of these, ten components yield birefringence while nine are nonbirefringent. In the present work, we examine the planar version of this theory, obtained by means of a typical dimensional reduction procedure to (1 + 2) dimensions. We obtain a kind of planar scalar electrodynamics, which is composed of a gauge sector containing six Lorentz-violating coefficients, a scalar field endowed with a noncanonical kinetic term, and a coupling term that links the scalar and gauge sectors. The dispersion relation is exactly determined, revealing that the six parameters related to the pure electromagnetic sector do not yield birefringence at any order. In this model, the birefringence may appear only as a second-order effect associated with the coupling tensor linking the gauge and scalar sectors. The equations of motion are written and solved in the stationary regime. The Lorentz-violating parameters do not alter the asymptotic behavior of the fields but induce an angular dependence not observed in the Maxwell planar theory. The energy-momentum tensor was evaluated as well, revealing that the theory presents energy stability. (author)

  15. A multi-sectoral version of the Post-Keynesian growth model

    Directory of Open Access Journals (Sweden)

    Ricardo Azevedo Araujo

    2015-03-01

    Full Text Available With this inquiry, we seek to develop a disaggregated version of the post-Keynesian approach to economic growth, by showing that indeed it can be treated as a particular case of the Pasinettian model of structural change and economic expansion. By relying upon vertical integration it becomes possible to carry out the analysis initiated by Kaldor (1956) and Robinson (1956, 1962), and followed by Dutt (1984), Rowthorn (1982) and later Bhaduri and Marglin (1990), in a multi-sectoral model in which demand and productivity increase at different paces in each sector. By adopting this approach it is possible to show that the structural economic dynamics is conditioned not only by patterns of evolving demand and the diffusion of technological progress but also by the distributive features of the economy, which can give rise to different regimes of economic growth. Besides, we find it possible to determine the natural rate of profit that makes the mark-up rate constant over time.

  16. Systems Security Engineering Capability Maturity Model (SSECMM), Model Description, Version 1.1

    National Research Council Canada - National Science Library

    1997-01-01

    This document is designed to acquaint the reader with the SSE-CMM Project as a whole and present the project's major work product - the Systems Security Engineering Capability Maturity Model (SSE-CMM...

  17. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

    Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But, as is evident from the literature, a number of risks, such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk and organizational structural risk, adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interaction among the various risks which affect the performance of offshore outsourcing. To this effect, the authors have identified various risks through a review of the extant literature. From this information, an integrated model using interpretive structural modelling (ISM) for risks affecting offshore outsourcing is developed and the structural relationships between these risks are modeled. Further, MICMAC analysis is done to analyze the driving power and dependence of the risks, which shall be helpful to managers to identify and classify important criteria and to reveal the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.
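
    The MICMAC step reduces to row and column sums of the final reachability matrix: row sums give driving power and column sums give dependence. The sketch below uses a small hypothetical matrix over the five risk categories named above, not the matrix derived in the paper.

    ```python
    import numpy as np

    # Hypothetical final reachability matrix; a 1 in cell (i, j) means risk i drives risk j.
    risks = ["political", "cultural", "regulatory", "opportunistic", "structural"]
    R = np.array([[1, 1, 1, 1, 1],
                  [1, 1, 1, 1, 1],
                  [0, 0, 1, 1, 0],
                  [0, 0, 0, 1, 0],
                  [0, 0, 0, 1, 1]])

    driving_power = R.sum(axis=1)   # row sums: how many risks each one reaches
    dependence = R.sum(axis=0)      # column sums: how many risks reach it
    for name, drv, dep in zip(risks, driving_power, dependence):
        print(f"{name:13s} driving power = {drv}, dependence = {dep}")
    ```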

  18. Hydrogeochemical evaluation for Simpevarp model version 1.2. Preliminary site description of the Simpevarp area

    Energy Technology Data Exchange (ETDEWEB)

    Laaksoharju, Marcus (ed.) [Geopoint AB, Stockholm (Sweden)

    2004-12-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involves the investigation of two locations, Simpevarp and Forsmark, to determine their geological, hydrogeochemical and hydrogeological characteristics. Present work completed has resulted in Model version 1.2 which represents the second evaluation of the available Simpevarp groundwater analytical data collected up to April, 2004. The deepest fracture groundwater samples with sufficient analytical data reflected depths down to 1.7 km. Model version 1.2 focusses on geochemical and mixing processes affecting the groundwater composition in the uppermost part of the bedrock, down to repository levels, and eventually extending to 1000 m depth. The groundwater flow regimes at Laxemar/Simpevarp are considered local and extend down to depths of around 600-1000 m depending on local topography. The marked differences in the groundwater flow regimes between Laxemar and Simpevarp are reflected in the groundwater chemistry where four major hydrochemical groups of groundwaters (types A-D) have been identified: TYPE A: This type comprises dilute groundwaters (< 1000 mg/L Cl; 0.5-2.0 g/L TDS) of Na-HCO{sub 3} type present at shallow (<200 m) depths at Simpevarp, but at greater depths (0-900 m) at Laxemar. At both localities the groundwaters are marginally oxidising close to the surface, but otherwise reducing. Main reactions involve weathering, ion exchange (Ca, Mg), surface complexation, and dissolution of calcite. Redox reactions include precipitation of Fe-oxyhydroxides and some microbially mediated reactions (SRB). Meteoric recharge water is mainly present at Laxemar whilst at Simpevarp potential mixing of recharge meteoric water and a modern sea component is observed. Localised mixing of meteoric water with deeper saline groundwaters is indicated at both Laxemar and Simpevarp. TYPE B: This type comprises brackish groundwaters (1000-6000 mg/L Cl; 5-10 g/L TDS) present at

  19. A spatially-dynamic preliminary risk assessment of the American peregrine falcon at the Los Alamos National Laboratory (version 1)

    International Nuclear Information System (INIS)

    Gallegos, A.F.; Gonzales, G.J.; Bennett, K.D.

    1997-06-01

    The Endangered Species Act and the Record of Decision on the Dual Axis Radiographic Hydrodynamic Test Facility at the Los Alamos National Laboratory require protection of the American peregrine falcon. A preliminary risk assessment of the peregrine was performed using a custom FORTRAN model and a geographical information system. Estimated doses to the falcon were compared against toxicity reference values to generate hazard indices. Hazard index results indicated no unacceptable risk to the falcon from the soil ingestion pathway, including a measure of cumulative effects from multiple contaminants that assumes a linear additive toxicity type. Scaling home ranges on the basis of maximizing falcon height for viewing prey decreased estimated risk by 69% in a canyons-based home range and increased estimated risk by 40% in a river-based home range. Improving model realism by weighting simulated falcon foraging based on distance from potential nest sites decreased risk by 93% in one exposure unit and by 82% in a second exposure unit. It was demonstrated that choice of toxicity reference values can have a substantial impact on risk estimates. Adding bioaccumulation factors for several organics increased partial hazard quotients by a factor of 110, but increased the mean hazard index by only 0.02 units. Adding a food consumption exposure pathway in the form of biomagnification factors for 15 contaminants of potential ecological concern increased the mean hazard index to 1.16 (± 1.0), which is above the level of acceptability (1.0). Aroclor-1254, dichlorodiphenyltrichloroethane (DDT) and dichlorodiphenyldichloroethylene (DDE) accounted for 81% of the estimated risk that includes soil ingestion and food consumption contaminant pathways and a biomagnification component. Information on risk by specific geographical location was generated, which can be used to manage contaminated areas, falcon habitat, facility siting, and/or facility operations. 123 refs., 10 figs., 2 tabs

  20. A spatially-dynamic preliminary risk assessment of the American peregrine falcon at the Los Alamos National Laboratory (version 1)

    Energy Technology Data Exchange (ETDEWEB)

    Gallegos, A.F.; Gonzales, G.J.; Bennett, K.D. [and others

    1997-06-01

    The Endangered Species Act and the Record of Decision on the Dual Axis Radiographic Hydrodynamic Test Facility at the Los Alamos National Laboratory require protection of the American peregrine falcon. A preliminary risk assessment of the peregrine was performed using a custom FORTRAN model and a geographical information system. Estimated doses to the falcon were compared against toxicity reference values to generate hazard indices. Hazard index results indicated no unacceptable risk to the falcon from the soil ingestion pathway, including a measure of cumulative effects from multiple contaminants that assumes a linear additive toxicity type. Scaling home ranges on the basis of maximizing falcon height for viewing prey decreased estimated risk by 69% in a canyons-based home range and increased estimated risk by 40% in a river-based home range. Improving model realism by weighting simulated falcon foraging based on distance from potential nest sites decreased risk by 93% in one exposure unit and by 82% in a second exposure unit. It was demonstrated that choice of toxicity reference values can have a substantial impact on risk estimates. Adding bioaccumulation factors for several organics increased partial hazard quotients by a factor of 110, but increased the mean hazard index by only 0.02 units. Adding a food consumption exposure pathway in the form of biomagnification factors for 15 contaminants of potential ecological concern increased the mean hazard index to 1.16 (± 1.0), which is above the level of acceptability (1.0). Aroclor-1254, dichlorodiphenyltrichloroethane (DDT) and dichlorodiphenyldichloroethylene (DDE) accounted for 81% of the estimated risk that includes soil ingestion and food consumption contaminant pathways and a biomagnification component. Information on risk by specific geographical location was generated, which can be used to manage contaminated areas, falcon habitat, facility siting, and/or facility operations. 123 refs., 10 figs., 2 tabs.

  1. Concordance for prognostic models with competing risks

    DEFF Research Database (Denmark)

    Wolbers, Marcel; Blanche, Paul; Koller, Michael T

    2014-01-01

    The concordance probability is a widely used measure to assess discrimination of prognostic models with binary and survival endpoints. We formally define the concordance probability for a prognostic model of the absolute risk of an event of interest in the presence of competing risks and relate i...
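
    One variant of the concordance probability for predictions of the absolute risk pi of the event of interest (cause 1) can be written schematically as below; the paper gives the precise definition and an estimator that also handles censoring and competing events.

    ```latex
    % T_i is the event time and D_i the event type of subject i; pi_i is the predicted
    % absolute risk of the cause-1 event. A schematic form of one concordance variant:
    \[
      C = P\bigl(\pi_i > \pi_j \;\big|\; T_i < T_j,\ D_i = 1\bigr).
    \]
    ```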

  2. Why operational risk modelling creates inverse incentives

    NARCIS (Netherlands)

    Doff, R.

    2015-01-01

    Operational risk modelling has become commonplace in large international banks and is gaining popularity in the insurance industry as well. This is partly due to financial regulation (Basel II, Solvency II). This article argues that operational risk modelling is fundamentally flawed, despite efforts

  3. A NetCDF version of the two-dimensional energy balance model based on the full multigrid algorithm

    Science.gov (United States)

    Zhuang, Kelin; North, Gerald R.; Stevens, Mark J.

    A NetCDF version of the two-dimensional energy balance model based on the full multigrid method in Fortran is introduced for both pedagogical and research purposes. Based on the land-sea-ice distribution, orbital elements, greenhouse gas concentrations, and albedo, the code calculates the global seasonal surface temperature. A step-by-step guide with examples is provided for practice.
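
    The underlying model is a seasonally forced diffusive energy balance equation of the familiar form below (written from the standard formulation of such models, with C the heat capacity, Q S the distributed insolation, a the co-albedo, A + BT the outgoing longwave parameterization and D the diffusion coefficient):

    ```latex
    % Standard form of a two-dimensional diffusive energy balance model.
    \[
      C(\hat{x})\,\frac{\partial T}{\partial t}
        = Q\,S(\hat{x}, t)\,a(\hat{x})
        - \bigl(A + B\,T\bigr)
        + \nabla \cdot \bigl(D(\hat{x})\,\nabla T\bigr).
    \]
    ```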

  4. The Community WRF-Hydro Modeling System Version 4 Updates: Merging Toward Capabilities of the National Water Model

    Science.gov (United States)

    McAllister, M.; Gochis, D.; Dugger, A. L.; Karsten, L. R.; McCreight, J. L.; Pan, L.; Rafieeinasab, A.; Read, L. K.; Sampson, K. M.; Yu, W.

    2017-12-01

    The community WRF-Hydro modeling system is publicly available and provides researchers and operational forecasters with a flexible and extensible capability for multi-scale, multi-physics hydrologic modeling that can be run independently of, or fully interactively with, the WRF atmospheric model. The core WRF-Hydro physics model contains very high-resolution descriptions of terrestrial hydrologic process representations such as land-atmosphere exchanges of energy and moisture, snowpack evolution, infiltration, terrain routing, channel routing, basic reservoir representation and hydrologic data assimilation. Complementing the core physics components of WRF-Hydro is an ecosystem of pre- and post-processing tools that facilitate the preparation of terrain and meteorological input data, an open-source hydrologic model evaluation toolset (Rwrfhydro), hydrologic data assimilation capabilities with DART and advanced model visualization capabilities. The National Center for Atmospheric Research (NCAR), through collaborative support from the National Science Foundation and other funding partners, provides community support for the entire WRF-Hydro system through a variety of mechanisms. This presentation summarizes the enhanced user support capabilities that are being developed for the community WRF-Hydro modeling system. These products and services include a new website, open-source code repositories, documentation and user guides, test cases, online training materials, live, hands-on training sessions, an email listserv, and individual user support via email through a new help desk ticketing system. The WRF-Hydro modeling system and supporting tools, which now include re-gridding scripts and model calibration, have recently been updated to Version 4 and are merging toward capabilities of the National Water Model.

  5. ASTER Global Digital Elevation Model Version 2 - summary of validation results

    Science.gov (United States)

    Tachikawa, Tetushi; Kaku, Manabu; Iwasaki, Akira; Gesch, Dean B.; Oimoen, Michael J.; Zhang, Z.; Danielson, Jeffrey J.; Krieger, Tabatha; Curtis, Bill; Haase, Jeff; Abrams, Michael; Carabajal, C.; Meyer, Dave

    2011-01-01

    On June 29, 2009, NASA and the Ministry of Economy, Trade and Industry (METI) of Japan released a Global Digital Elevation Model (GDEM) to users worldwide at no charge as a contribution to the Global Earth Observing System of Systems (GEOSS). This “version 1” ASTER GDEM (GDEM1) was compiled from over 1.2 million scenebased DEMs covering land surfaces between 83°N and 83°S latitudes. A joint U.S.-Japan validation team assessed the accuracy of the GDEM1, augmented by a team of 20 cooperators. The GDEM1 was found to have an overall accuracy of around 20 meters at the 95% confidence level. The team also noted several artifacts associated with poor stereo coverage at high latitudes, cloud contamination, water masking issues and the stacking process used to produce the GDEM1 from individual scene-based DEMs (ASTER GDEM Validation Team, 2009). Two independent horizontal resolution studies estimated the effective spatial resolution of the GDEM1 to be on the order of 120 meters.

  6. GENII Version 2 Users’ Guide

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.

    2004-03-08

    The GENII Version 2 computer code was developed for the Environmental Protection Agency (EPA) at Pacific Northwest National Laboratory (PNNL) to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) and the radiological risk estimating procedures of Federal Guidance Report 13 into updated versions of existing environmental pathway analysis models. The resulting environmental dosimetry computer codes are compiled in the GENII Environmental Dosimetry System. The GENII system was developed to provide a state-of-the-art, technically peer-reviewed, documented set of programs for calculating radiation dose and risk from radionuclides released to the environment. The codes were designed with the flexibility to accommodate input parameters for a wide variety of generic sites. Operation of a new version of the codes, GENII Version 2, is described in this report. Two versions of the GENII Version 2 code system are available, a full-featured version and a version specifically designed for demonstrating compliance with the dose limits specified in 40 CFR 61.93(a), the National Emission Standards for Hazardous Air Pollutants (NESHAPS) for radionuclides. The only differences lie in the limitation of the capabilities of the user to change specific parameters in the NESHAPS version. This report describes the data entry, accomplished via interactive, menu-driven user interfaces. Default exposure and consumption parameters are provided for both the average (population) and maximum individual; however, these may be modified by the user. Source term information may be entered as radionuclide release quantities for transport scenarios, or as basic radionuclide concentrations in environmental media (air, water, soil). For input of basic or derived concentrations, decay of parent radionuclides and ingrowth of radioactive decay products prior to the start of the exposure scenario may be considered. A single code run can

  7. Atmospheric radionuclide transport model with radon postprocessor and SBG module. Model description version 2.8.0; ARTM. Atmosphaerisches Radionuklid-Transport-Modell mit Radon Postprozessor und SBG-Modul. Modellbeschreibung zu Version 2.8.0

    Energy Technology Data Exchange (ETDEWEB)

    Richter, Cornelia; Sogalla, Martin; Thielen, Harald; Martens, Reinhard

    2015-04-20

    The study on the atmospheric radionuclide transport model with radon postprocessor and SBG module (model description version 2.8.0) covers the following issues: determination of emissions, radioactive decay, atmospheric dispersion calculation for radioactive gases, atmospheric dispersion calculation for radioactive dusts, determination of the gamma cloud radiation (gamma submersion), terrain roughness, effective source height, calculation area and model points, geographic reference systems and coordinate transformations, meteorological data, use of invalid meteorological data sets, consideration of statistical uncertainties, consideration of housings, consideration of bumpiness, consideration of terrain roughness, use of frequency distributions of the hourly dispersion situation, consideration of the vegetation period (summer), the radon post processor radon.exe, the SBG module, modeling of wind fields, shading settings.

  8. Uniform California earthquake rupture forecast, version 3 (UCERF3): the time-independent model

    Science.gov (United States)

    Field, Edward H.; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David D.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin R.; Page, Morgan T.; Parsons, Thomas; Powers, Peter M.; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua; ,

    2013-01-01

    In this report we present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation assumptions and to include multifault ruptures, both limitations of the previous model (UCERF2). The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level "grand inversion" that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (for example, magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1,440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded because of lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (for example, constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of

  9. Calculating excess lifetime risk in relative risk models

    International Nuclear Information System (INIS)

    Vaeth, M.; Pierce, D.A.

    1990-01-01

    When assessing the impact of radiation exposure it is common practice to present the final conclusions in terms of excess lifetime cancer risk in a population exposed to a given dose. The present investigation is mainly a methodological study focusing on some of the major issues and uncertainties involved in calculating such excess lifetime risks and related risk projection methods. The age-constant relative risk model used in the recent analyses of the cancer mortality that was observed in the follow-up of the cohort of A-bomb survivors in Hiroshima and Nagasaki is used to describe the effect of the exposure on the cancer mortality. In this type of model the excess relative risk is constant in age-at-risk, but depends on the age-at-exposure. Calculation of excess lifetime risks usually requires rather complicated life-table computations. In this paper we propose a simple approximation to the excess lifetime risk; the validity of the approximation for low levels of exposure is justified empirically as well as theoretically. This approximation provides important guidance in understanding the influence of the various factors involved in risk projections. Among the further topics considered are the influence of a latent period, the additional problems involved in calculations of site-specific excess lifetime cancer risks, the consequences of a leveling off or a plateau in the excess relative risk, and the uncertainties involved in transferring results from one population to another. The main part of this study relates to the situation with a single, instantaneous exposure, but a brief discussion is also given of the problem with a continuous exposure at a low-dose rate
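
    In schematic life-table terms, the quantity being approximated can be written as below, where lambda_0 is the baseline cancer mortality rate, S* the survival function including the radiation-induced excess, S the baseline survival function and e the age at exposure; the second line is the kind of simple low-dose approximation discussed in the text (a sketch of the idea, not the paper's exact expression).

    ```latex
    % Excess lifetime risk under an age-constant relative risk model (schematic).
    \begin{align*}
      \mathrm{ELR} &= \int_{e}^{\infty} \mathrm{ERR}\;\lambda_0(a)\,S^{*}(a)\,da, \\
      \mathrm{ELR} &\approx \mathrm{ERR} \times
        \underbrace{\int_{e}^{\infty} \lambda_0(a)\,S(a)\,da}_{\text{baseline lifetime risk beyond age } e}
        \qquad \text{for small excess risks.}
    \end{align*}
    ```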

  10. Aquaplaning : Development of a Risk Pond Model from Road Surface Measurements

    OpenAIRE

    Nygårdhs, Sara

    2003-01-01

    Aquaplaning accidents are relatively rare, but could have fatal effects. The task of this master’s thesis is to use data from the Laser Road Surface Tester to detect road sections with risk of aquaplaning. A three-dimensional model based on data from road surface measurements is created using MATLAB (version 6.1). From this general geometrical model of the road, a pond model is produced from which the theoretical risk ponds are detected. A risk pond indication table is further created. The...

  11. Coronary risk assessment by point-based vs. equation-based Framingham models: significant implications for clinical care.

    Science.gov (United States)

    Gordon, William J; Polansky, Jesse M; Boscardin, W John; Fung, Kathy Z; Steinman, Michael A

    2010-11-01

    US cholesterol guidelines use original and simplified versions of the Framingham model to estimate future coronary risk and thereby classify patients into risk groups with different treatment strategies. We sought to compare risk estimates and risk group classification generated by the original, complex Framingham model and the simplified, point-based version. We assessed 2,543 subjects age 20-79 from the 2001-2006 National Health and Nutrition Examination Surveys (NHANES) for whom Adult Treatment Panel III (ATP-III) guidelines recommend formal risk stratification. For each subject, we calculated the 10-year risk of major coronary events using the original and point-based Framingham models, and then compared differences in these risk estimates and whether these differences would place subjects into different ATP-III risk groups (<10%, 10-20%, or >20% risk). Using standard procedures, all analyses were adjusted for survey weights, clustering, and stratification to make our results nationally representative. Among 39 million eligible adults, the original Framingham model categorized 71% of subjects as having "moderate" (<10%) risk, with the remainder falling into the "moderately high" (10-20%) and "high" (>20%) risk groups. Estimates of coronary risk by the original and point-based models often differed substantially. The point-based system classified 15% of adults (5.7 million) into different risk groups than the original model, with 10% (3.9 million) misclassified into higher risk groups and 5% (1.8 million) into lower risk groups, for a net impact of classifying 2.1 million adults into higher risk groups. These risk group misclassifications would impact guideline-recommended drug treatment strategies for 25-46% of affected subjects. Patterns of misclassifications varied significantly by gender, age, and underlying CHD risk. Compared to the original Framingham model, the point-based version misclassifies millions of Americans into risk groups for which guidelines recommend different treatment strategies.

  12. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.
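
    A minimal sketch of the kind of calculation implied above: expected annual monetary loss summed over attack scenarios, each the product of an annual occurrence frequency and a consequence term scaled by indicators of population concentration and critical infrastructure. All names, numbers, and weights are hypothetical.

    ```python
    # Expected annual terrorism loss for a region, summed over hypothetical scenarios.
    scenarios = [
        {"name": "transit attack", "annual_freq": 0.002, "base_loss": 5.0e8},
        {"name": "industrial site", "annual_freq": 0.001, "base_loss": 1.2e9},
    ]
    population_index = 1.4       # regional indicators, 1.0 = national average (hypothetical)
    infrastructure_index = 1.1

    def expected_annual_loss(scenarios, pop_idx, infra_idx):
        """Sum of frequency x consequence, scaled by regional attribute indices."""
        return sum(s["annual_freq"] * s["base_loss"] * pop_idx * infra_idx for s in scenarios)

    loss = expected_annual_loss(scenarios, population_index, infrastructure_index)
    print(f"expected annual loss: ${loss:,.0f}")
    ```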

  13. Hydrogeochemical evaluation of the Simpevarp area, model version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Laaksoharju, Marcus (ed.) [Geopoint AB, Stockholm (Sweden); Smellie, John [Conterra AB, Uppsala (Sweden); Gimeno, Maria; Auque, Luis; Gomez, Javier [Univ. of Zaragoza (Spain). Dept. of Earth Sciences; Tullborg, Eva-Lena [Terralogica AB, Graabo (Sweden); Gurban, Ioana [3D-Terra (Sweden)

    2004-02-01

    Siting studies for SKB's programme of deep geological disposal of nuclear fuel waste currently involves the investigation of two locations, Simpevarp and Forsmark, on the eastern coast of Sweden to determine their geological, hydrogeochemical and hydrogeological characteristics. Present work completed has resulted in model version 1.1 which represents the first evaluation of the available Simpevarp groundwater analytical data collected up to July 1st, 2003 (i.e. the first 'data freeze' of the site). The HAG (Hydrochemical Analytical Group) group had access to a total of 535 water samples collected from the surface and sub-surface environment (e.g. soil pipes in the overburden, streams and lakes); only a few samples were collected from drilled boreholes. The deepest fracture groundwater samples with sufficient analytical data reflected depths down to 250 m. Furthermore, most of the waters sampled (79%) lacked crucial analytical information that restricted the evaluation. Consequently, model version 1.1 focussed on the processes taking place in the uppermost part of the bedrock rather than at repository levels. The complex groundwater evolution and patterns at Simpevarp are a result of many factors such as: a) the flat topography and proximity to the Baltic Sea, b) changes in hydrogeology related to glaciation/deglaciation and land uplift, c) repeated marine/lake water regressions/transgressions, and d) organic or inorganic alteration of the groundwater composition caused by microbial processes or water/rock interactions. The sampled groundwaters reflect to various degrees of modern or ancient water/rock interactions and mixing processes. Higher topography to the west of Simpevarp has resulted in hydraulic gradients which have partially flushed out old water types. Except for sea waters, most surface waters and some groundwaters from percussion boreholes are fresh, non-saline waters according to the classification used for Aespoe groundwaters. The rest

  14. Prevalence, outcome, and women’s experiences of external cephalic version in a low-risk population.

    NARCIS (Netherlands)

    Rijnders, M.; Offerhaus, P.; Dommelen, P. van; Wiegers, T.; Buitendijk, S.

    2010-01-01

    BACKGROUND: Until recently, external cephalic version to prevent breech presentation at birth was not widely accepted. The objective of our study was to assess the prevalence, outcomes, and women's experiences of external cephalic version to improve the implementation of the procedure in the

  15. User's guide to the MESOI diffusion model: Version 1.1 (for Data General Eclipse S/230 with AFOS)

    International Nuclear Information System (INIS)

    Athey, G.F.; Ramsdell, J.V.

    1982-09-01

    MESOI is an interactive, Lagrangian puff trajectory model. The model theory is documented separately (Ramsdell and Athey, 1981). Version 1.1 is a modified form of the original 1.0. It is designed to run on a Data General Eclipse computer. The model has improved support features which make it useful as an emergency response tool. This report is intended to provide the user with the information necessary to successfully conduct model simulations using MESOI Version 1.1 and to use the support programs STAPREP and EXPLT. The user is also provided with information on the use of the data file maintenance and review program UPDATE. Examples are given for the operation of the program. Test data sets are described which allow the user to practice with the programs and to confirm proper implementation and execution.

  16. A Network Model of Credit Risk Contagion

    Directory of Open Access Journals (Sweden)

    Ting-Qiang Chen

    2012-01-01

    A network model of credit risk contagion is presented that accounts for the behaviors of credit risk holders and financial market regulators as well as the network structure. Introducing stochastic dominance theory, we discuss the mechanisms through which the degree of individual relationships, individual attitudes toward credit risk contagion, individual ability to resist credit risk contagion, the monitoring strength of the financial market regulators, and the network structure each affect credit risk contagion. Several derived and proved propositions are then verified through numerical simulations.

  17. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. These data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
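
    As an illustration of the log-normal degree-of-belief modelling described above, the sketch below pools several hypothetical expert estimates of an initiator event frequency by equal weighting in log space; the medians, error factors and the simple variance-averaging rule are assumptions made for illustration, not the elicitation scheme of the report.

```python
# Sketch: equal-weight pooling of expert frequency estimates under a
# log-normal degree-of-belief model (illustrative values, not from the report).
import numpy as np

# Each expert gives a median and an "error factor" (95th/50th percentile ratio)
# for an initiator event frequency (events per year).
medians = np.array([1e-3, 3e-3, 5e-4])
error_factors = np.array([3.0, 10.0, 5.0])

# Log-normal parameters implied by each expert's judgement.
mu = np.log(medians)                      # mean of ln(frequency)
sigma = np.log(error_factors) / 1.645     # EF = exp(1.645 * sigma)

# Equal-weight pooling in log space (geometric mean of the medians); the pooled
# spread here simply averages the variances and ignores expert dependency.
mu_pool = mu.mean()
sigma_pool = np.sqrt((sigma**2).mean())

pooled_median = np.exp(mu_pool)
pooled_mean = np.exp(mu_pool + 0.5 * sigma_pool**2)
print(f"pooled median = {pooled_median:.2e} /yr, pooled mean = {pooled_mean:.2e} /yr")
```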

  18. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. These data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.

  19. Risk modelling study for carotid endarterectomy.

    Science.gov (United States)

    Kuhan, G; Gardiner, E D; Abidia, A F; Chetter, I C; Renwick, P M; Johnson, B F; Wilkinson, A R; McCollum, P T

    2001-12-01

    The aims of this study were to identify factors that influence the risk of stroke or death following carotid endarterectomy (CEA) and to develop a model to aid in comparative audit of vascular surgeons and units. A series of 839 CEAs performed by four vascular surgeons between 1992 and 1999 was analysed. Multiple logistic regression analysis was used to model the effect of 15 possible risk factors on the 30-day risk of stroke or death. Outcome was compared for four surgeons and two units after adjustment for the significant risk factors. The overall 30-day stroke or death rate was 3.9 per cent (29 of 741). Heart disease, diabetes and stroke were significant risk factors. The 30-day predicted stroke or death rates increased with increasing risk scores. The observed 30-day stroke or death rate was 3.9 per cent for both vascular units and varied from 3.0 to 4.2 per cent for the four vascular surgeons. Differences in the outcomes between the surgeons and vascular units did not reach statistical significance after risk adjustment. Diabetes, heart disease and stroke are significant risk factors for stroke or death following CEA. The risk score model identified patients at higher risk and aided in comparative audit.
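
    The modelling step described above (multiple logistic regression of 30-day stroke or death on candidate risk factors, followed by a risk score) can be sketched as follows; the data are synthetic and the coefficients are invented, so this illustrates only the approach, not the study's fitted model.

```python
# Sketch: logistic-regression risk model for 30-day stroke/death after CEA,
# fitted to synthetic data (factor names mirror the abstract; data are invented).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 800
X = np.column_stack([
    rng.binomial(1, 0.30, n),  # heart disease
    rng.binomial(1, 0.15, n),  # diabetes
    rng.binomial(1, 0.20, n),  # prior stroke
])
logit = -3.5 + 0.8 * X[:, 0] + 0.7 * X[:, 1] + 0.9 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # simulated 30-day outcome

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
odds_ratios = np.exp(model.params[1:])               # per-factor odds ratios
risk_score = X @ model.params[1:]                    # simple additive risk score
predicted_risk = model.predict(sm.add_constant(X))   # 30-day predicted risk
print("odds ratios:", np.round(odds_ratios, 2))
```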

  20. Suicidal Ideation and Interpersonal Needs: Factor Structure of a Short Version of the Interpersonal Needs Questionnaire in an At-Risk Military Sample.

    Science.gov (United States)

    Allan, Nicholas P; Gros, Daniel F; Hom, Melanie A; Joiner, Thomas E; Stecker, Tracy

    2016-01-01

    The interpersonal-psychological theory of suicide posits that perceived burdensomeness (PB; i.e., the belief that others would be better off if one were dead) and thwarted belongingness (TB; i.e., the belief that one lacks meaningful social connections) are both necessary risk factors for the development of suicidal ideation. To test these relations, measures are needed that are well validated, especially in samples of at-risk adults. The current study was designed to examine the factor structure of an eight-item version of the Interpersonal Needs Questionnaire (INQ) in a sample of 405 U.S. past and current military personnel (mean age = 31.57 years, SD = 7.28; 90.4% male) who endorsed current suicidal ideation and/or a past suicide attempt. Analyses were conducted using confirmatory factor analysis (CFA). A bifactor model comprising a general factor, labeled interpersonal needs, and two specific factors, labeled PB and TB, fit the data best. The general factor captured a high proportion of overall variance (81.9%). In contrast, the TB factor captured only a modest amount of variance in items meant to capture this factor (59.1%) and the PB factor captured very little variance in items meant to capture this factor (13.5%). Further, only the interpersonal needs factor was associated with lifetime and past-week suicidal ideation as well as suicidal ideation frequency and duration. The current findings indicate that, for the INQ-8 in high-risk military personnel, a general interpersonal needs factor accounted for the relations PB and TB share with suicidal ideation.

  1. Risk of developmental dysplasia of the hip in breech presentation: the effect of successful external cephalic version.

    Science.gov (United States)

    Lambeek, A F; De Hundt, M; Vlemmix, F; Akerboom, B M C; Bais, J M J; Papatsonis, D N M; Mol, B W J; Kok, M

    2013-04-01

    To evaluate the effect of successful external cephalic version on the incidence of developmental dysplasia of the hip (DDH) requiring treatment in singleton breech presentation at term. Observational cohort study. Three large teaching hospitals in the Netherlands. Women with a singleton breech presentation of 34 weeks of gestation or more, who underwent an external cephalic version attempt. We made a comparison of the incidence of DDH between children born in breech presentation and children born in cephalic presentation after a successful external cephalic version. The incidence of DDH requiring either conservative treatment, with a harness, or surgical treatment. A total of 498 newborns were included in the study, of which 40 (8%) were diagnosed with DDH and 35 required treatment. Multivariate analysis showed that female gender (OR 2.79, 95% CI 1.23-6.35) and successful external cephalic version (OR 0.29, 95% CI 0.09-0.95) were independently associated with DDH. A successful external cephalic version is associated with a lower incidence of DDH, although a high percentage of children born after a successful external cephalic version still appear to have DDH. A larger cohort study is needed to establish the definite nature of this relationship. Until then, we recommend the same screening policy for infants born in cephalic position after a successful external cephalic version as for infants born in breech position. © 2012 The Authors BJOG An International Journal of Obstetrics and Gynaecology © 2012 RCOG.

  2. Single-Column Modeling of Convection During the CINDY2011/DYNAMO Field Campaign With the CNRM Climate Model Version 6

    Science.gov (United States)

    Abdel-Lathif, Ahmat Younous; Roehrig, Romain; Beau, Isabelle; Douville, Hervé

    2018-03-01

    A single-column model (SCM) approach is used to assess the CNRM climate model (CNRM-CM) version 6 ability to represent the properties of the apparent heat source (Q1) and moisture sink (Q2) as observed during the 3 month CINDY2011/DYNAMO field campaign, over its Northern Sounding Array (NSA). The performance of the CNRM SCM is evaluated in a constrained configuration in which the latent and sensible heat surface fluxes are prescribed, as, when forced by observed sea surface temperature, the model is strongly limited by the underestimate of the surface fluxes, most probably related to the SCM forcing itself. The model exhibits a significant cold bias in the upper troposphere, near 200 hPa, and strong wet biases close to the surface and above 700 hPa. The analysis of the Q1 and Q2 profile distributions emphasizes the properties of the convective parameterization of the CNRM-CM physics. The distribution of the Q2 profile is particularly challenging. The model strongly underestimates the frequency of occurrence of the deep moistening profiles, which likely involve misrepresentation of the shallow and congestus convection. Finally, a statistical approach is used to objectively define atmospheric regimes and construct a typical convection life cycle. A composite analysis shows that the CNRM SCM captures the general transition from bottom-heavy to mid-heavy to top-heavy convective heating. Some model errors are shown to be related to the stratiform regimes. The moistening observed during the shallow and congestus convection regimes also requires further improvements of this CNRM-CM physics.

  3. Competing Risks Copula Models for Unemployment Duration

    DEFF Research Database (Denmark)

    Lo, Simon M. S.; Stephan, Gesine; Wilke, Ralf

    2017-01-01

    The copula graphic estimator (CGE) for competing risks models has received little attention in empirical research, despite having been developed into a comprehensive research method. In this paper, we bridge the gap between theoretical developments and applied research by considering a general class of competing risks copula models, which nests popular models such as the Cox proportional hazards model, the semiparametric multivariate mixed proportional hazards model (MMPHM), and the CGE as special cases. Analyzing the effects of a German Hartz reform on unemployment duration, we illustrate

  4. Risk Monitoring through Traceability Information Model

    OpenAIRE

    Juan P. Zamora; Wilson Adarme; Laura Palacios

    2012-01-01

    This paper shows a traceability framework for supply risk monitoring, beginning with the identification, analysis, and evaluation of the supply chain risk and focusing on the supply operations of the Health Care Institutions with oncology services in Bogota, Colombia. It includes a brief presentation of the state of the art of the Supply Chain Risk Management and traceability systems in logistics operations, and it concludes with the methodology to integrate the SCRM model with the traceabili...

  5. Criterion of Semi-Markov Dependent Risk Model

    Institute of Scientific and Technical Information of China (English)

    Xiao Yun MO; Xiang Qun YANG

    2014-01-01

    A rigorous definition of the semi-Markov dependent risk model is given. This model is a generalization of the Markov dependent risk model. A criterion and necessary conditions for the semi-Markov dependent risk model are obtained. The results clarify the relations between the elements of the semi-Markov dependent risk model and are also applicable to the Markov dependent risk model.

  6. Risk management model of winter navigation operations

    International Nuclear Information System (INIS)

    Valdez Banda, Osiris A.; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-01-01

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish–Swedish Winter Navigation System. This establishes the requirements and limitations for the vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions, which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps provided in the Formal Safety Assessment (FSA) by the International Maritime Organization (IMO) and adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are the most probable, while major oil spills are found to be very unlikely but possible. - Highlights: •A model to assess and manage the risk of winter navigation operations is proposed. •The risks of oil spills in winter navigation in the Gulf of Finland are analysed. •The model assesses and prioritizes actions to control the risk of the operations. •The model suggests navigational training as the most efficient risk control option.
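
    A Bayesian network of the kind described above ultimately reduces to conditional-probability arithmetic; the minimal sketch below decomposes a seasonal oil-spill probability by operation type. The operation categories and all probabilities are hypothetical placeholders, not values from the article.

```python
# Sketch: marginal oil-spill probability for one winter season, decomposed by
# operation type as in a simple Bayesian network (all numbers hypothetical).
operations = {
    # operation: (share of traffic, P(collision | operation), P(spill | collision))
    "independent navigation": (0.50, 2.0e-3, 0.10),
    "convoy":                 (0.30, 1.5e-3, 0.10),
    "escort/towing":          (0.20, 5.0e-4, 0.08),
}

# Law of total probability over the operation types.
p_spill = sum(share * p_coll * p_spill_given_coll
              for share, p_coll, p_spill_given_coll in operations.values())
print(f"P(oil spill in season) ~ {p_spill:.2e}")

# Which operations dominate the spill risk (posterior share by operation).
for op, (share, p_coll, p_sc) in operations.items():
    print(f"{op}: {share * p_coll * p_sc / p_spill:.1%} of spill risk")
```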

  7. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST-funded research project CORAS, in which Institutt for energiteknikk takes part as responsible for the work package for Risk Analysis. The main objective of the CORAS project is to develop a framework to support risk assessment of security-critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to carry out a risk assessment, one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions, such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identify and assess security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  8. Modeling CANDU type fuel behaviour during extended burnup irradiations using a revised version of the ELESIM code

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Richmond, W.R.

    1992-05-01

    The high-burnup database for CANDU fuel, with a variety of cases, offers a good opportunity to check models of fuel behaviour and to identify areas for improvement. Good agreement of calculated values of fission-gas release and sheath hoop strain with experimental data indicates that the global behaviour of the fuel element is adequately simulated by a computer code. Using the ELESIM computer code, the fission-gas release, swelling, and fuel pellet expansion models were analysed, and changes were made to the models for gaseous swelling and for diffusional release of fission-gas atoms to the grain boundaries. Using this revised version of ELESIM, satisfactory agreement with measured values of fission-gas release was found for most of the high-burnup database cases. It is concluded that the revised version of the ELESIM code is able to simulate high-burnup as well as low-burnup CANDU fuel with reasonable accuracy.

  9. [A model list of high risk drugs].

    Science.gov (United States)

    Cotrina Luque, J; Guerrero Aznar, M D; Alvarez del Vayo Benito, C; Jimenez Mesa, E; Guzman Laura, K P; Fernández Fernández, L

    2013-12-01

    «High-risk drugs» are those that have a very high «risk» of causing death or serious injury if an error occurs during their use. The Institute for Safe Medication Practices (ISMP) has prepared a high-risk drugs list applicable to the general population (with no differences between the pediatric and adult population). Thus, there is a lack of information for the pediatric population. The main objective of this work is to develop a high-risk drug list adapted to the neonatal or pediatric population as a reference model for the pediatric hospital health workforce. We performed a literature search in May 2012 to identify any published lists or references in relation to pediatric and/or neonatal high-risk drugs. A total of 15 studies were found, from which 9 were selected. A model list was developed based mainly on the ISMP list, adding drugs strongly perceived as high risk in pediatrics and removing those whose pediatric use was anecdotal. There is no published list that suits pediatric risk management. The list of pediatric and neonatal high-risk drugs presented here could be a «reference list of high-risk drugs» for pediatric hospitals. Using this list, together with training, will help to prevent medication errors at each step of the drug supply chain (prescribing, transcribing, dispensing and administration). Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.

  10. A new risk prediction model for critical care: the Intensive Care National Audit & Research Centre (ICNARC) model.

    Science.gov (United States)

    Harrison, David A; Parry, Gareth J; Carpenter, James R; Short, Alasdair; Rowan, Kathy

    2007-04-01

    To develop a new model to improve risk prediction for admissions to adult critical care units in the UK. Prospective cohort study. The setting was 163 adult, general critical care units in England, Wales, and Northern Ireland, December 1995 to August 2003. Patients were 216,626 critical care admissions. None. The performance of different approaches to modeling physiologic measurements was evaluated, and the best methods were selected to produce a new physiology score. This physiology score was combined with other information relating to the critical care admission (age, diagnostic category, source of admission, and cardiopulmonary resuscitation before admission) to develop a risk prediction model. Modeling interactions between diagnostic category and physiology score enabled the inclusion of groups of admissions that are frequently excluded from risk prediction models. The new model showed good discrimination (mean c index 0.870) and fit (mean Shapiro's R 0.665, mean Brier's score 0.132) in 200 repeated validation samples and performed well when compared with recalibrated versions of existing published risk prediction models in the cohort of patients eligible for all models. The hypothesis of perfect fit was rejected for all models, including the Intensive Care National Audit & Research Centre (ICNARC) model, as is to be expected in such a large cohort. The ICNARC model demonstrated better discrimination and overall fit than existing risk prediction models, even following recalibration of these models. We recommend it be used to replace previously published models for risk adjustment in the UK.
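
    The validation statistics quoted above (the c index for discrimination and the Brier score for overall fit) can be reproduced on any set of predicted risks and observed outcomes; a minimal sketch with synthetic predictions, assuming scikit-learn is available:

```python
# Sketch: discrimination (c index) and overall-fit (Brier score) statistics of
# the kind reported for the ICNARC model, computed on synthetic predictions.
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(1)
true_risk = rng.beta(2, 18, size=5000)                 # hypothetical admission risks
died = rng.binomial(1, true_risk)                      # observed hospital outcome
predicted = np.clip(true_risk + rng.normal(0, 0.03, 5000), 0, 1)  # model output

c_index = roc_auc_score(died, predicted)               # equals the c index for a binary outcome
brier = brier_score_loss(died, predicted)              # mean squared prediction error
print(f"c index = {c_index:.3f}, Brier score = {brier:.3f}")
```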

  11. A NetCDF version of the two-dimensional energy balance model based on the full multigrid algorithm

    Directory of Open Access Journals (Sweden)

    Kelin Zhuang

    2017-01-01

    A NetCDF version of the two-dimensional energy balance model based on the full multigrid method in Fortran is introduced for both pedagogical and research purposes. Based on the land–sea–ice distribution, orbital elements, greenhouse gas concentrations, and albedo, the code calculates the global seasonal surface temperature. A step-by-step guide with examples is provided for practice.

  12. Programs OPTMAN and SHEMMAN Version 6 (1999) - Coupled-Channels optical model and collective nuclear structure calculation -

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Jong Hwa; Lee, Jeong Yeon; Lee, Young Ouk; Sukhovitski, Efrem Sh [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-01-01

    Programs SHEMMAN and OPTMAN (Version 6) have been developed for determination of nuclear Hamiltonian parameters and for optical model calculations, respectively. The optical model calculations by OPTMAN, with coupling schemes built on the wave functions of a non-axial soft rotator, are self-consistent, since the parameters of the nuclear Hamiltonian are determined by adjusting the energies of collective levels to experimental values with SHEMMAN prior to the optical model calculation. The programs have been installed at the Nuclear Data Evaluation Laboratory of KAERI. This report is intended as a brief manual for these codes. 43 refs., 9 figs., 1 tab. (Author)

  13. Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Final Report, Version 2)

    Science.gov (United States)

    EPA announced the availability of the final report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2). This update furthered land change modeling by providing nationwide housing developmen...

  14. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  15. SHADOW3: a new version of the synchrotron X-ray optics modelling package

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez del Rio, Manuel, E-mail: srio@esrf.eu [European Synchrotron Radiation Facility, 6 Jules Horowitz, 38000 Grenoble (France); Canestrari, Niccolo [CNRS, Grenoble (France); European Synchrotron Radiation Facility, 6 Jules Horowitz, 38000 Grenoble (France); Jiang, Fan; Cerrina, Franco [Boston University, 8 St Mary’s Street, Boston, MA 02215 (United States)

    2011-09-01

    SHADOW3, a new version of the popular X-ray tracing code SHADOW, is presented. An important step has been made in restructuring the code following new computer engineering standards, ending with a modular Fortran 2003 structure and an application programming interface (API). The new code has been designed to be compatible with the original file-oriented SHADOW philosophy, but simplifying the compilation, installation and use. In addition, users can now become programmers using the newly designed SHADOW3 API for creating scripts, macros and programs; being able to deal with optical system optimization, image simulation, and also low transmission calculations requiring a large number of rays (>10⁶). Plans for future development and questions on how to accomplish them are also discussed.

  16. SHADOW3: a new version of the synchrotron X-ray optics modelling package

    International Nuclear Information System (INIS)

    Sanchez del Rio, Manuel; Canestrari, Niccolo; Jiang, Fan; Cerrina, Franco

    2011-01-01

    SHADOW3, a new version of the popular X-ray tracing code SHADOW, is presented. An important step has been made in restructuring the code following new computer engineering standards, ending with a modular Fortran 2003 structure and an application programming interface (API). The new code has been designed to be compatible with the original file-oriented SHADOW philosophy, but simplifying the compilation, installation and use. In addition, users can now become programmers using the newly designed SHADOW3 API for creating scripts, macros and programs; being able to deal with optical system optimization, image simulation, and also low transmission calculations requiring a large number of rays (>10⁶). Plans for future development and questions on how to accomplish them are also discussed.

  17. Lung cancer risk models from experimental animals

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1988-03-01

    The objective of this paper is to present analyses of data based on methods that adequately account for time-related factors and competing risks, and that yield results expressed in a form comparable to results obtained from recent analyses of epidemiological studies of humans exposed to radon and radon daughters. These epidemiological analyses have modeled the hazard, or age-specific death rates, as a function of factors such as dose and dose rate, time from exposure, and time from cessation of exposure. The starting point for many of the analyses of human data has been constant relative risk modeling, in which the age-specific death rates are assumed to be a function of cumulative dose, and the risks due to exposure are assumed to be proportional to the age-specific baseline death rates. However, departures from this initial model, such as dependence of risks on age at risk and/or time from exposure, have been investigated. These analyses have frequently been based on a non-parametric model that requires minimal assumptions regarding the baseline risks and their dependence on age.
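
    The constant relative risk form referred to above can be written generically as follows (a generic excess-relative-risk parameterization, not necessarily the exact form used in the paper):

```latex
% Constant relative (excess relative) risk model for the age-specific hazard:
\[
  \lambda(a \mid D) \;=\; \lambda_0(a)\,\bigl[\,1 + \beta\,D(a)\,\bigr],
\]
% where $\lambda_0(a)$ is the baseline age-specific death rate, $D(a)$ the
% cumulative dose accrued by age $a$, and $\beta$ the excess relative risk per
% unit dose. The departures mentioned above replace the constant $\beta$ with a
% function of age at risk and/or time since exposure, $\beta(a, t)$.
```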

  18. Quantitative occupational risk model: Single hazard

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Aneziris, O.N.; Bellamy, L.J.; Ale, B.J.M.; Oh, J.

    2017-01-01

    A model for the quantification of occupational risk of a worker exposed to a single hazard is presented. The model connects the working conditions and worker behaviour to the probability of an accident resulting in one of three types of consequence: recoverable injury, permanent injury and death. Working conditions and safety barriers in place to reduce the likelihood of an accident are included. Logical connections are modelled through an influence diagram. Quantification of the model is based on two sources of information: a) the number of accidents observed over a period of time and b) assessment of exposure data for activities and working conditions over the same period of time and the same working population. The effectiveness of risk-reducing measures affecting the working conditions, worker behaviour and/or safety barriers can be quantified through the effect of these measures on occupational risk. - Highlights: • Quantification of occupational risk from a single hazard. • Influence diagram connects working conditions, worker behaviour and safety barriers. • Necessary data include the number of accidents and the total exposure of the workers. • Effectiveness of risk-reducing measures is quantified through their impact on the risk. • An example illustrates the methodology.
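
    The two data sources named above (observed accident counts and exposure assessments) combine into risk estimates by simple rate arithmetic; the sketch below illustrates this with invented figures and an invented 30% barrier-improvement factor, not the influence-diagram quantification of the paper.

```python
# Sketch: quantifying single-hazard occupational risk from accident counts and
# exposure data, split over consequence types (all figures hypothetical).
accidents = {"recoverable injury": 40, "permanent injury": 6, "death": 1}
exposure_hours = 2.5e7          # total hours the working population was exposed
hours_per_worker_year = 1.6e3   # annual exposure of one worker

for consequence, count in accidents.items():
    rate_per_hour = count / exposure_hours
    annual_risk = rate_per_hour * hours_per_worker_year
    print(f"{consequence}: {annual_risk:.2e} per worker-year")

# A risk-reducing measure acting on a safety barrier can be expressed as a
# multiplier on the accident rate, e.g. a 30% reduction in the death rate:
reduced_death_risk = (accidents["death"] / exposure_hours) * hours_per_worker_year * 0.7
print(f"death risk after a 30% barrier improvement: {reduced_death_risk:.2e} per worker-year")
```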

  19. Conceptual models for cumulative risk assessment.

    Science.gov (United States)

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  20. Item and response-category functioning of the Persian version of the KIDSCREEN-27: Rasch partial credit model

    Directory of Open Access Journals (Sweden)

    Jafari Peyman

    2012-10-01

    Background: The purpose of the study was to determine whether the Persian version of the KIDSCREEN-27 has the optimal number of response categories to measure health-related quality of life (HRQoL) in children and adolescents. Moreover, we aimed to determine if all the items contributed adequately to their own domain. Findings: The Persian version of the KIDSCREEN-27 was completed by 1083 school children and 1070 of their parents. The Rasch partial credit model (PCM) was used to investigate item statistics and the ordering of response categories. The PCM showed that no item was misfitting. The PCM also revealed that successive response categories for all items were located in the expected order, except for category 1 in self- and proxy-reports. Conclusions: Although Rasch analysis confirms that all the items belong to their own underlying construct, response categories should be reorganized and evaluated in further studies, especially in children with chronic conditions.
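
    For reference, the partial credit model used above assigns category probabilities of the standard Rasch PCM form (standard notation, not specific to this study):

```latex
% Rasch partial credit model: probability that person n scores category k on item i.
\[
  P(X_{ni}=k) \;=\;
  \frac{\exp\!\Bigl(\sum_{j=0}^{k}\bigl(\theta_n-\delta_{ij}\bigr)\Bigr)}
       {\sum_{m=0}^{M_i}\exp\!\Bigl(\sum_{j=0}^{m}\bigl(\theta_n-\delta_{ij}\bigr)\Bigr)},
  \qquad
  \sum_{j=0}^{0}\bigl(\theta_n-\delta_{ij}\bigr)\equiv 0,
\]
% where $\theta_n$ is the person location (here, the HRQoL level), $\delta_{ij}$
% are the item step (threshold) parameters, and $M_i$ is the highest category of
% item $i$. Disordered estimates of the $\delta_{ij}$ are the "reversed
% thresholds" that motivate collapsing or reorganizing response categories.
```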

  1. The use of biologically based cancer risk models in radiation epidemiology

    International Nuclear Information System (INIS)

    Krewski, D.; Zielinski, J.M.; Hazelton, W.D.; Garner, M.J.; Moolgavkar, S.H.

    2003-01-01

    Biologically based risk projection models for radiation carcinogenesis seek to describe the fundamental biological processes involved in neoplastic transformation of somatic cells into malignant cancer cells. A validated biologically based model, whose parameters have a direct biological interpretation, can also be used to extrapolate cancer risks to different exposure conditions with some confidence. In this article, biologically based models for radiation carcinogenesis, including the two-stage clonal expansion (TSCE) model and its extensions, are reviewed. The biological and mathematical bases for such models are described, and the implications of key model parameters for cancer risk assessment examined. Specific applications of versions of the TSCE model to important epidemiologic datasets are discussed, including the Colorado uranium miners' cohort; a cohort of Chinese tin miners; the lifespan cohort of atomic bomb survivors in Hiroshima and Nagasaki; and a cohort of over 200,000 workers included in the National Dose Registry (NDR) of Canada. (author)

  2. Fuzzy logic model to quantify risk perception

    International Nuclear Information System (INIS)

    Bukh, Julia; Dickstein, Phineas

    2008-01-01

    The aim of this study is to quantify public risk perception of the nuclear field so that it can be considered in decision making whenever public involvement is sought. The proposed model includes both qualitative factors, such as familiarity and voluntariness, and numerical factors influencing risk perception, such as probability of occurrence and severity of consequence. Since some of these factors can be characterized only by qualitative expressions, and their determination is linked with vagueness, imprecision and uncertainty, the most suitable method for the risk level assessment is Fuzzy Logic, which models qualitative aspects of knowledge and reasoning processes without employing precise quantitative analyses. This work, then, offers a Fuzzy-Logic-based means of representing risk perception by a single numerical feature, which can be weighted and accounted for in decision-making procedures. (author)
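
    A Mamdani-style fuzzy inference step of the kind the abstract describes can be sketched with plain NumPy; the membership-function shapes, the single rule, and the 0-10 scales below are illustrative assumptions, not the model proposed by the authors.

```python
# Sketch: a tiny Mamdani-style fuzzy inference step for risk perception,
# using triangular membership functions (shapes and rules are illustrative).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Crisp inputs on 0-10 scales: how familiar and how voluntary the exposure feels.
familiarity, voluntariness = 3.0, 2.0

low_fam = tri(familiarity, 0.0, 0.001, 5.0)     # degree to which familiarity is "low"
low_vol = tri(voluntariness, 0.0, 0.001, 5.0)   # degree to which voluntariness is "low"

# Rule: IF familiarity is low AND voluntariness is low THEN perceived risk is high.
rule_strength = min(low_fam, low_vol)

# Defuzzify by clipping the "high risk" output set and taking its centroid.
x = np.linspace(0.0, 10.0, 1001)
high_risk = np.minimum(tri(x, 5.0, 10.0, 10.001), rule_strength)
perceived_risk = np.sum(x * high_risk) / np.sum(high_risk)
print(f"perceived risk level ~ {perceived_risk:.1f} / 10")
```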

  3. UNSAT-H Version 3.0: Unsaturated Soil Water and Heat Flow Model Theory, User Manual, and Examples

    International Nuclear Information System (INIS)

    Fayer, M.J.

    2000-01-01

    The UNSAT-H model was developed at Pacific Northwest National Laboratory (PNNL) to assess the water dynamics of arid sites and, in particular, estimate recharge fluxes for scenarios pertinent to waste disposal facilities. During the last 4 years, the UNSAT-H model received support from the Immobilized Waste Program (IWP) of the Hanford Site's River Protection Project. This program is designing and assessing the performance of on-site disposal facilities to receive radioactive wastes that are currently stored in single- and double-shell tanks at the Hanford Site (LMHC 1999). The IWP is interested in estimates of recharge rates for current conditions and long-term scenarios involving the vadose zone disposal of tank wastes. Simulation modeling with UNSAT-H is one of the methods being used to provide those estimates (e.g., Rockhold et al. 1995; Fayer et al. 1999). To achieve the above goals for assessing water dynamics and estimating recharge rates, the UNSAT-H model addresses soil water infiltration, redistribution, evaporation, plant transpiration, deep drainage, and soil heat flow as one-dimensional processes. The UNSAT-H model simulates liquid water flow using Richards' equation (Richards 1931), water vapor diffusion using Fick's law, and sensible heat flow using the Fourier equation. This report documents UNSAT-H Version 3.0. The report includes the bases for the conceptual model and its numerical implementation, benchmark test cases, example simulations involving layered soils and plants, and the code manual. Version 3.0 is an enhanced-capability update of UNSAT-H Version 2.0 (Fayer and Jones 1990). New features include hysteresis, an iterative solution of head and temperature, an energy balance check, the modified Picard solution technique, additional hydraulic functions, multiple-year simulation capability, and general enhancements.
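
    The governing liquid-flow equation cited above takes the following general one-dimensional form (a textbook statement of Richards' equation with a plant-uptake sink, not UNSAT-H's specific discretization; the sign of the gravity term depends on the coordinate convention):

```latex
% Richards' equation for 1-D vertical unsaturated flow with a sink term
% (z positive upward):
\[
  \frac{\partial \theta}{\partial t}
  \;=\;
  \frac{\partial}{\partial z}\!\left[K(h)\left(\frac{\partial h}{\partial z}+1\right)\right]
  \;-\; S(z,t),
\]
% where $\theta$ is the volumetric water content, $h$ the matric head, $K(h)$
% the unsaturated hydraulic conductivity, and $S$ the root-uptake sink. Vapor
% flux follows Fick's law, $q_v=-D_v\,\partial\rho_v/\partial z$, and sensible
% heat flow the Fourier equation, $q_h=-k_T\,\partial T/\partial z$.
```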

  4. Lifestyle-based risk model for fall risk assessment

    OpenAIRE

    Sannino, Giovanna; De Falco, Ivanoe; De Pietro, Guiseppe

    2016-01-01

    Purpose: The aim of this study was to identify the explicit relationship between life-style and the risk of falling under the form of a mathematical model. Starting from some personal and behavioral information of a subject as, e.g., weight, height, age, data about physical activity habits, and concern about falling, the model would estimate the score of her/his Mini-Balance Evaluation Systems (Mini-BES) test. This score ranges within 0 and 28, and the lower its value the more likely the subj...

  5. Simulations of the Mid-Pliocene Warm Period Using Two Versions of the NASA-GISS ModelE2-R Coupled Model

    Science.gov (United States)

    Chandler, M. A.; Sohl, L. E.; Jonas, J. A.; Dowsett, H. J.; Kelley, M.

    2013-01-01

    The mid-Pliocene Warm Period (mPWP) bears many similarities to aspects of future global warming as projected by the Intergovernmental Panel on Climate Change (IPCC, 2007). Both marine and terrestrial data point to high-latitude temperature amplification, including large decreases in sea ice and land ice, as well as expansion of warmer climate biomes into higher latitudes. Here we present our most recent simulations of the mid-Pliocene climate using the CMIP5 version of the NASA-GISS Earth System Model (ModelE2-R). We describe the substantial impact associated with a recent correction made in the implementation of the Gent-McWilliams ocean mixing scheme (GM), which has a large effect on the simulation of ocean surface temperatures, particularly in the North Atlantic Ocean. The effect of this correction on the Pliocene climate results would not have been easily determined from examining its impact on the preindustrial runs alone, a useful demonstration of how the consequences of code improvements as seen in modern climate control runs do not necessarily portend the impacts in extreme climates. Both the GM-corrected and GM-uncorrected simulations were contributed to the Pliocene Model Intercomparison Project (PlioMIP) Experiment 2. Many findings presented here corroborate results from other PlioMIP multi-model ensemble papers, but we also emphasize features in the ModelE2-R simulations that are unlike the ensemble means. The corrected version yields results that more closely resemble the ocean core data as well as the PRISM3D reconstructions of the mid-Pliocene, especially the dramatic warming in the North Atlantic and Greenland-Iceland-Norwegian Sea, which in the new simulation appears to be far more realistic than previously found with older versions of the GISS model. Our belief is that continued development of key physical routines in the atmospheric model, along with higher resolution and recent corrections to mixing parameterisations in the ocean model, have led

  6. Risk of developmental dysplasia of the hip in breech presentation: the effect of successful external cephalic version

    NARCIS (Netherlands)

    Lambeek, A. F.; de Hundt, M.; Vlemmix, F.; Akerboom, B. M. C.; Bais, J. M. J.; Papatsonis, D. N. M.; Mol, B. W. J.; Kok, M.

    2013-01-01

    To evaluate the effect of successful external cephalic version on the incidence of developmental dysplasia of the hip (DDH) requiring treatment in singleton breech presentation at term. Observational cohort study. Three large teaching hospitals in the Netherlands. Women with a singleton breech

  7. Risk Measurement and Risk Modelling Using Applications of Vine Copulas

    Directory of Open Access Journals (Sweden)

    David E. Allen

    2017-09-01

    This paper features an application of Regular Vine copulas, which are a novel and recently developed statistical and mathematical tool that can be applied in the assessment of composite financial risk. Copula-based dependence modelling is a popular tool in financial applications, but is usually applied to pairs of securities. By contrast, Vine copulas provide greater flexibility and permit the modelling of complex dependency patterns using the rich variety of bivariate copulas which may be arranged and analysed in a tree structure to explore multiple dependencies. The paper features the use of Regular Vine copulas in an analysis of the co-dependencies of 10 major European Stock Markets, as represented by individual market indices and the composite STOXX 50 index. The sample runs from 2005 to the end of 2013 to permit an exploration of how correlations change in different economic circumstances using three different sample periods: pre-GFC (January 2005–July 2007), GFC (July 2007–September 2009), and post-GFC (September 2009–December 2013). The empirical results suggest that the dependencies change in a complex manner, and are subject to change in different economic circumstances. One of the attractions of this approach to risk modelling is the flexibility in the choice of distributions used to model co-dependencies. The practical application of Regular Vine metrics is demonstrated via an example of the calculation of the VaR of a portfolio made up of the indices.
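
    Regular vines are assembled from bivariate (pair) copulas; the sketch below shows one such building block, a Gaussian pair copula fitted via Kendall's tau on simulated returns. The data, the choice of a Gaussian family, and the rank-based transform are illustrative assumptions, not the paper's fitted R-vine.

```python
# Sketch: one bivariate building block of an R-vine -- a Gaussian pair copula
# fitted to two index return series via Kendall's tau (data simulated here).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Simulated daily returns for two markets with some dependence.
cov = [[1.0, 0.6], [0.6, 1.0]]
returns = rng.multivariate_normal([0.0, 0.0], cov, size=1500) * 0.01

# Probability-integral transform to uniform pseudo-observations via ranks.
u = stats.rankdata(returns, axis=0) / (len(returns) + 1)

# Gaussian copula parameter from Kendall's tau: rho = sin(pi * tau / 2).
tau, _ = stats.kendalltau(u[:, 0], u[:, 1])
rho = np.sin(np.pi * tau / 2)
print(f"Kendall tau = {tau:.3f}, implied Gaussian copula rho = {rho:.3f}")

# In a full vine, conditional (h-function) transforms of u would feed the next tree.
```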

  8. Categorical Inputs, Sensitivity Analysis, Optimization and Importance Tempering with tgp Version 2, an R Package for Treed Gaussian Process Models

    Directory of Open Access Journals (Sweden)

    Robert B. Gramacy

    2010-02-01

    This document describes the new features in version 2.x of the tgp package for R, implementing treed Gaussian process (GP) models. The topics covered include methods for dealing with categorical inputs and excluding inputs from the tree or GP part of the model; fully Bayesian sensitivity analysis for inputs/covariates; sequential optimization of black-box functions; and a new Monte Carlo method for inference in multi-modal posterior distributions that combines simulated tempering and importance sampling. These additions extend the functionality of tgp across all models in the hierarchy: from Bayesian linear models, to classification and regression trees (CART), to treed Gaussian processes with jumps to the limiting linear model. It is assumed that the reader is familiar with the baseline functionality of the package, outlined in the first vignette (Gramacy 2007).

  9. Land-total and Ocean-total Precipitation and Evaporation from a Community Atmosphere Model version 5 Perturbed Parameter Ensemble

    Energy Technology Data Exchange (ETDEWEB)

    Covey, Curt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lucas, Donald D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Trenberth, Kevin E. [National Center for Atmospheric Research, Boulder, CO (United States)

    2016-03-02

    This document presents the large-scale water budget statistics of a perturbed input-parameter ensemble of atmospheric model runs. The model is Version 5.1.02 of the Community Atmosphere Model (CAM). These runs are the “C-Ensemble” described by Qian et al., “Parametric Sensitivity Analysis of Precipitation at Global and Local Scales in the Community Atmosphere Model CAM5” (Journal of Advances in Modeling the Earth System, 2015). As noted by Qian et al., the simulations are “AMIP type” with temperature and sea ice boundary conditions chosen to match surface observations for the five-year period 2000-2004. There are 1100 ensemble members in addition to one run with default input-parameter values.

  10. Refinement and evaluation of the Massachusetts firm-yield estimator model version 2.0

    Science.gov (United States)

    Levin, Sara B.; Archfield, Stacey A.; Massey, Andrew J.

    2011-01-01

    The firm yield is the maximum average daily withdrawal that can be extracted from a reservoir without risk of failure during an extended drought period. Previously developed procedures for determining the firm yield of a reservoir were refined and applied to 38 reservoir systems in Massachusetts, including 25 single- and multiple-reservoir systems that were examined during previous studies and 13 additional reservoir systems. Changes to the firm-yield model include refinements to the simulation methods and input data, as well as the addition of several scenario-testing capabilities. The simulation procedure was adapted to run at a daily time step over a 44-year simulation period, and daily streamflow and meteorological data were compiled for all the reservoirs for input to the model. Another change to the model-simulation methods is the adjustment of the scaling factor used in estimating groundwater contributions to the reservoir. The scaling factor is used to convert the daily groundwater-flow rate into a volume by multiplying the rate by the length of reservoir shoreline that is hydrologically connected to the aquifer. Previous firm-yield analyses used a constant scaling factor that was estimated from the reservoir surface area at full pool. The use of a constant scaling factor caused groundwater flows during periods when the reservoir stage was very low to be overestimated. The constant groundwater scaling factor used in previous analyses was replaced with a variable scaling factor that is based on daily reservoir stage. This change reduced instability in the groundwater-flow algorithms and produced more realistic groundwater-flow contributions during periods of low storage. Uncertainty in the firm-yield model arises from many sources, including errors in input data. The sensitivity of the model to uncertainty in streamflow input data and uncertainty in the stage-storage relation was examined. A series of Monte Carlo simulations were performed on 22 reservoirs
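
    The firm-yield definition above lends itself to a simple daily mass-balance search: find the largest constant withdrawal that never empties the reservoir over the simulation period. The sketch below uses synthetic inflows and omits the evaporation and stage-dependent groundwater terms that the report's model includes.

```python
# Sketch: firm yield as the largest constant daily withdrawal that never empties
# the reservoir over the simulation period (synthetic inflows; evaporation and
# groundwater contributions are omitted for brevity).
import numpy as np

rng = np.random.default_rng(3)
days = 44 * 365
inflow = rng.gamma(shape=0.6, scale=8.0, size=days)   # daily inflow, Mgal/d
capacity = 2000.0                                     # usable storage, Mgal

def survives(withdrawal):
    """Run the daily mass balance; fail if storage ever goes negative."""
    storage = capacity
    for q in inflow:
        storage = min(capacity, storage + q - withdrawal)
        if storage < 0:
            return False
    return True

lo, hi = 0.0, inflow.mean()                           # bracket the firm yield
for _ in range(40):                                   # bisection on the yield
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if survives(mid) else (lo, mid)
print(f"firm yield ~ {lo:.2f} Mgal/d (mean inflow {inflow.mean():.2f} Mgal/d)")
```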

  11. Evaluation of dust and trace metal estimates from the Community Multiscale Air Quality (CMAQ model version 5.0

    Directory of Open Access Journals (Sweden)

    K. W. Appel

    2013-07-01

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transformation, transport, and fate of the many different air pollutant species that comprise particulate matter (PM), including dust (or soil). The CMAQ model version 5.0 (CMAQv5.0) has several enhancements over the previous version of the model for estimating the emission and transport of dust, including the ability to track the specific elemental constituents of dust and have the model-derived concentrations of those elements participate in chemistry. The latest version of the model also includes a parameterization to estimate emissions of dust due to wind action. The CMAQv5.0 modeling system was used to simulate the entire year 2006 for the continental United States, and the model estimates were evaluated against daily surface-based measurements from several air quality networks. The CMAQ modeling system overall did well replicating the observed soil concentrations in the western United States (mean bias generally around ±0.5 μg m⁻³); however, the model consistently overestimated the observed soil concentrations in the eastern United States (mean bias generally between 0.5–1.5 μg m⁻³), regardless of season. The performance of the individual trace metals was highly dependent on the network, species, and season, with relatively small biases for Fe, Al, Si, and Ti throughout the year at the Interagency Monitoring of Protected Visual Environments (IMPROVE) sites, while Ca, K, and Mn were overestimated and Mg underestimated. For the urban Chemical Speciation Network (CSN) sites, Fe, Mg, and Mn, while overestimated, had comparatively better performance throughout the year than the other trace metals, which were consistently overestimated, including very large overestimations of Al (380%), Ti (370%) and Si (470%) in the fall. An underestimation of nighttime mixing in the urban areas appears to contribute to the overestimation of

  12. Result Summary for the Area 5 Radioactive Waste Management Site Performance Assessment Model Version 4.110

    International Nuclear Information System (INIS)

    2011-01-01

    Results for Version 4.110 of the Area 5 Radioactive Waste Management Site (RWMS) performance assessment (PA) model are summarized. Version 4.110 includes the fiscal year (FY) 2010 inventory estimate, including a future inventory estimate. Version 4.110 was implemented in GoldSim 10.11(SP4). The following changes have been implemented since the last baseline model, Version 4.105: (1) Updated the inventory and disposal unit configurations with data through the end of FY 2010. (2) Implemented Federal Guidance Report 13 Supplemental CD dose conversion factors (U.S. Environmental Protection Agency, 1999). Version 4.110 PA results comply with air pathway and all-pathways annual total effective dose (TED) performance objectives (Tables 2 and 3, Figures 1 and 2). Air pathway results decrease moderately for all scenarios. The time of the maximum for the air pathway open rangeland scenario shifts from 1,000 to 100 years (y). All-pathways annual TED increases for all scenarios except the resident scenario. The maximum member of public all-pathways dose occurs at 1,000 y for the resident farmer scenario. The resident farmer dose was predominantly due to technetium-99 (Tc-99) (82 percent) and lead-210 (Pb-210) (13 percent). Pb-210 present at 1,000 y is produced predominantly by radioactive decay of uranium-234 (U-234) present at the time of disposal. All results for the postdrilling and intruder-agriculture scenarios comply with the performance objectives (Tables 4 and 5, Figures 3 and 4). The postdrilling intruder results are similar to Version 4.105 results. The intruder-agriculture results are similar to Version 4.105, except for the Pit 6 Radium Disposal Unit (RaDU). The intruder-agriculture result for the Shallow Land Burial (SLB) disposal units is a significant fraction of the performance objective and exceeds the performance objective at the 95th percentile. The intruder-agriculture dose is due predominantly to Tc-99 (75 percent) and U-238 (9.5 percent). The acute

  13. Salutary effects of high-intensity interval training in persons with elevated cardiovascular risk [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Jerome L. Fleg

    2016-09-01

    Although moderate-intensity continuous training (MICT) has been the traditional model for aerobic exercise training for over four decades, a growing body of literature has demonstrated equal if not greater improvement in aerobic capacity and similar beneficial effects on body composition, glucose metabolism, blood pressure, and quality of life from high-intensity interval training (HIIT). An advantage of HIIT over MICT is the shorter time required to perform the same amount of energy expenditure. The current brief review summarizes the effects of HIIT on peak aerobic capacity and cardiovascular risk factors in healthy adults and those with various cardiovascular diseases, including coronary artery disease, chronic heart failure, and post heart transplantation.

  14. A Probabilistic Typhoon Risk Model for Vietnam

    Science.gov (United States)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal Provinces from the low-lying Mekong River delta region in the southwest to the Red River Delta region in Northern Vietnam are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with a maximum sustained wind speed of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed Central Vietnam as a category 2 typhoon causing significant damage to properties. As tropical cyclone risk is expected to increase with increasing exposure and population growth along the coastal Provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  15. Risk terrain modeling predicts child maltreatment.

    Science.gov (United States)

    Daley, Dyann; Bachmann, Michael; Bachmann, Brittany A; Pedigo, Christian; Bui, Minh-Thuy; Coffman, Jamye

    2016-12-01

    As indicated by research on the long-term effects of adverse childhood experiences (ACEs), maltreatment has far-reaching consequences for affected children. Effective prevention measures have been elusive, partly due to difficulty in identifying vulnerable children before they are harmed. This study employs Risk Terrain Modeling (RTM), an analysis of the cumulative effect of environmental factors thought to be conducive for child maltreatment, to create a highly accurate prediction model for future substantiated child maltreatment cases in the City of Fort Worth, Texas. The model is superior to commonly used hotspot predictions and more beneficial in aiding prevention efforts in a number of ways: 1) it identifies the highest risk areas for future instances of child maltreatment with improved precision and accuracy; 2) it aids the prioritization of risk-mitigating efforts by informing about the relative importance of the most significant contributing risk factors; 3) since predictions are modeled as a function of easily obtainable data, practitioners do not have to undergo the difficult process of obtaining official child maltreatment data to apply it; 4) the inclusion of a multitude of environmental risk factors creates a more robust model with higher predictive validity; and, 5) the model does not rely on a retrospective examination of past instances of child maltreatment, but adapts predictions to changing environmental conditions. The present study introduces and examines the predictive power of this new tool to aid prevention efforts seeking to improve the safety, health, and wellbeing of vulnerable children. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  16. Risk considerations related to lung modeling

    International Nuclear Information System (INIS)

    Masse, R.; Cross, F.T.

    1989-01-01

    Improved lung models provide a more accurate assessment of dose from inhalation exposures and, therefore, more accurate dose-response relationships for risk evaluation and exposure limitation. Epidemiological data for externally irradiated persons indicate that the numbers of excess respiratory tract carcinomas differ in the upper airways, bronchi, and distal lung. Neither their histogenesis and anatomical location nor their progenitor cells are known with sufficient accuracy for accurate assessment of the microdosimetry. The nuclei of sensitive cells generally can be assumed to be distributed at random in the epithelium, beneath the mucus and tips of the beating cilia and cells. In stratified epithelia, basal cells may be considered the only cells at risk. Upper-airway tumors have been observed in both therapeutically irradiated patients and in Hiroshima-Nagasaki survivors. The current International Commission on Radiological Protection Lung-Model Task Group proposes that the upper airways and lung have a similar relative risk coefficient for cancer induction. The partition of the risk weighting factor, therefore, will be proportional to the spontaneous death rate from tumors, and 80% of the weighting factor for the respiratory tract should be attributed to the lung. For Weibel lung-model branching generations 0 to 16 and 17 to 23, the Task Group proposes an 80/20 partition of the risk, i.e., 64% and 16%, respectively, of the total risk. Regarding risk in animals, recent data in rats indicate a significantly lower effectiveness for lung-cancer induction at low doses from insoluble long-lived alpha-emitters than from Rn daughters. These findings are due, in part, to the fact that different regions of the lung are irradiated. Tumors in the lymph nodes are rare in people and animals exposed to radiation.44 references

  17. Modeling foreign exchange risk premium in Armenia

    Czech Academy of Sciences Publication Activity Database

    Poghosyan, T.; Kočenda, E.; Zemčík, Petr

    2008-01-01

    Roč. 44, č. 1 (2008), s. 41-61 ISSN 1540-496X R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:AV0Z70850503 Keywords : foreign exchange risk premium * Armenia * affine term structure models Subject RIV: AH - Economics Impact factor: 0.611, year: 2008

  18. Modeling foreign exchange risk premium in Armenia

    Czech Academy of Sciences Publication Activity Database

    Poghosyan, Tigran; Kočenda, Evžen; Zemčík, P.

    2008-01-01

    Roč. 44, č. 1 (2008), s. 41-61 ISSN 1540-496X R&D Projects: GA MŠk LC542 Institutional research plan: CEZ:MSM0021620846 Keywords : foreign exchange risk premium * Armenia * affine term structure models Subject RIV: AH - Economics Impact factor: 0.611, year: 2008

  19. A Probabilistic Asteroid Impact Risk Model

    Science.gov (United States)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
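
    The following Python sketch mimics the Monte Carlo structure described above: sample uncertain impactor properties, convert them to impact energy, apply a crude ground-damage proxy, and aggregate affected-population outcomes. All distributions, the damage-radius coefficient and the population densities are illustrative assumptions, not the values used in the PAIR model.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000                                                 # sampled impact scenarios

        # Illustrative parameter distributions (placeholders, not the published ones)
        diameter = rng.lognormal(mean=3.0, sigma=1.0, size=n)       # m
        density  = rng.uniform(1500.0, 3500.0, size=n)              # kg/m^3
        velocity = rng.uniform(11e3, 30e3, size=n)                  # m/s
        pop_density = rng.lognormal(mean=2.0, sigma=2.0, size=n)    # people/km^2

        # Impact energy in megatons TNT (1 Mt = 4.184e15 J)
        mass = density * (np.pi / 6.0) * diameter**3
        energy_mt = 0.5 * mass * velocity**2 / 4.184e15

        # Crude stand-in for the ground-damage model: damage radius scaling
        # roughly with the cube root of yield (the coefficient is a placeholder).
        damage_radius_km = 2.0 * np.cbrt(energy_mt)
        affected = np.pi * damage_radius_km**2 * pop_density

        # Aggregate the scenario outcomes into simple risk metrics
        print("mean affected population per impact:", affected.mean())
        print("99th percentile:", np.percentile(affected, 99))

    Percentiles of the aggregated outcome distribution are the kind of quantity that could then be compared against a risk tolerance posture when discussing a minimum asteroid size of concern.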

  20. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation. Value of value at risk; Horizon problems and extreme events in financial risk management; Methods of evaluating value-at-risk estimates.

  1. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process, and reasons are given for using probability-based modeling of the inputs. The author also presents ways to model distributions for individual inputs and for multivariate input structures when dependence and other constraints may be present.
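
    A minimal sketch of the kind of input modeling discussed here, assuming two hypothetical risk-code inputs: a Latin hypercube sample is pushed through a Gaussian copula to impose dependence, then mapped to chosen marginal distributions. The marginals, the 0.7 correlation and the variable names are illustrative and are not taken from the paper.

        import numpy as np
        from scipy import stats
        from scipy.stats import qmc

        # Latin hypercube sample of two correlated inputs via a Gaussian copula:
        # sample uniforms, correlate them through the normal quantile transform,
        # then map back to the marginal distribution chosen for each input.
        n = 1000
        u = qmc.LatinHypercube(d=2, seed=42).random(n)              # uniforms in (0, 1)

        corr = np.array([[1.0, 0.7],
                         [0.7, 1.0]])                               # assumed dependence
        L = np.linalg.cholesky(corr)
        z = stats.norm.ppf(u) @ L.T                                 # correlated standard normals
        u_dep = stats.norm.cdf(z)                                   # back to correlated uniforms

        # Marginals are illustrative choices, not prescribed by the paper
        leak_rate = stats.lognorm(s=0.5, scale=1e-3).ppf(u_dep[:, 0])
        failure_p = stats.beta(a=2, b=50).ppf(u_dep[:, 1])

        inputs = np.column_stack([leak_rate, failure_p])            # feed to the risk code
        print(inputs[:5])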

  2. Update of the Polar SWIFT model for polar stratospheric ozone loss (Polar SWIFT version 2)

    Directory of Open Access Journals (Sweden)

    I. Wohltmann

    2017-07-01

    Full Text Available The Polar SWIFT model is a fast scheme for calculating the chemistry of stratospheric ozone depletion in polar winter. It is intended for use in global climate models (GCMs) and Earth system models (ESMs) to enable the simulation of mutual interactions between the ozone layer and climate. To date, climate models often use prescribed ozone fields, since a full stratospheric chemistry scheme is computationally very expensive. Polar SWIFT is based on a set of coupled differential equations, which simulate the polar vortex-averaged mixing ratios of the key species involved in polar ozone depletion on a given vertical level. These species are O3, chemically active chlorine (ClOx), HCl, ClONO2 and HNO3. The only external input parameters that drive the model are the fraction of the polar vortex in sunlight and the fraction of the polar vortex below the temperatures necessary for the formation of polar stratospheric clouds. Here, we present an update of the Polar SWIFT model introducing several improvements over the original model formulation. In particular, the model is now trained on vortex-averaged reaction rates of the ATLAS Chemistry and Transport Model, which enables a detailed look at individual processes and an independent validation of the different parameterizations contained in the differential equations. The training of the original Polar SWIFT model was based on fitting complete model runs to satellite observations and did not allow for this. A revised formulation of the system of differential equations is developed, which closely fits vortex-averaged reaction rates from ATLAS that represent the main chemical processes influencing ozone. In addition, a parameterization for the HNO3 change by denitrification is included. The rates of change of the concentrations of the chemical species of the Polar SWIFT model are purely chemical rates of change in the new version, whereas in the original Polar SWIFT model, they included a transport effect

  3. Implementation of methane cycling for deep time, global warming simulations with the DCESS Earth System Model (Version 1.2)

    DEFF Research Database (Denmark)

    Shaffer, Gary; Villanueva, Esteban Fernández; Rondanelli, Roberto

    2017-01-01

    Geological records reveal a number of ancient, large and rapid negative excursions of carbon-13 isotope. Such excursions can only be explained by massive injections of depleted carbon to the Earth System over a short duration. These injections may have forced strong global warming events, sometimes....... With this improved DCESS model version and paleo-reconstructions, we are now better armed to gauge the amounts, types, time scales and locations of methane injections driving specific, observed deep time, global warming events....

  4. Vortex dynamics in nonrelativistic version of Abelian Higgs model: Effects of the medium on the vortex motion

    Directory of Open Access Journals (Sweden)

    Kozhevnikov Arkadii

    2016-01-01

    Full Text Available The closed vortex dynamics is considered in the nonrelativistic version of the Abelian Higgs Model. The effect of the exchange of excitations propagating in the medium on the vortex string motion is taken into account. The effective action and the equation of motion are obtained, both including the exchange of propagating excitations between distant segments of the vortex and the possibility of its interaction with a static fermion-asymmetric background. They are applied to the derivation of the time dependence of the basic geometrical characteristics of the contour.

  5. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE), Version 5.0. Volume 5, Systems Analysis and Risk Assessment (SARA) tutorial manual

    International Nuclear Information System (INIS)

    Sattison, M.B.; Russell, K.D.; Skinner, N.L.

    1994-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of several microcomputer programs that were developed to create and analyze probabilistic risk assessments (PRAs) primarily for nuclear power plants. This volume is the tutorial manual for the Systems Analysis and Risk Assessment (SARA) System Version 5.0, a microcomputer-based system used to analyze the safety issues of a "family" [i.e., a power plant, a manufacturing facility, any facility on which a probabilistic risk assessment (PRA) might be performed]. A series of lessons is provided that guides the user through some basic steps common to most analyses performed with SARA. The example problems presented in the lessons build on one another, and in combination, lead the user through all aspects of SARA sensitivity analysis capabilities.
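
    To make the kind of quantification such PRA tools perform concrete, the sketch below evaluates a top-event probability from minimal cut sets using the standard min-cut-set upper bound; the basic events, their probabilities and the cut sets are hypothetical and unrelated to the SARA lessons.

        # Hypothetical basic-event probabilities and minimal cut sets for a small
        # fault tree; the min-cut-set upper bound is a common PRA approximation.
        basic_events = {"PUMP_A": 3.0e-3, "PUMP_B": 3.0e-3, "VALVE_C": 1.0e-4,
                        "DG_FAIL": 2.0e-2, "OPERATOR": 5.0e-3}

        cut_sets = [("PUMP_A", "PUMP_B"),
                    ("VALVE_C",),
                    ("DG_FAIL", "OPERATOR")]

        def cut_set_prob(cs, p):
            # Probability of a cut set, assuming independent basic events
            prob = 1.0
            for event in cs:
                prob *= p[event]
            return prob

        # Min-cut-set upper bound: 1 - prod(1 - P(cut set i))
        prod_term = 1.0
        for cs in cut_sets:
            prod_term *= 1.0 - cut_set_prob(cs, basic_events)
        top_event = 1.0 - prod_term
        print(f"top event probability = {top_event:.3e}")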

  6. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  7. Mechanistic modeling for mammography screening risks

    International Nuclear Information System (INIS)

    Bijwaard, Harmen

    2008-01-01

    Full text: Western populations show a very high incidence of breast cancer and in many countries mammography screening programs have been set up for the early detection of these cancers. Through these programs large numbers of women (in the Netherlands, 700,000 per year) are exposed to low but not insignificant X-ray doses. ICRP-based risk estimates indicate that the number of breast cancer casualties due to mammography screening can be as high as 50 in the Netherlands per year. The number of lives saved is estimated to be much higher, but for an accurate calculation of the benefits of screening a better estimate of these risks is indispensable. Here, an attempt is made to better quantify the radiological risks of mammography screening through the application of a biologically based model for breast tumor induction by X-rays. The model is applied to data obtained from the National Institutes of Health in the U.S. These concern epidemiological data of female TB patients who received high X-ray breast doses in the period 1930-1950 through frequent fluoroscopy of their lungs. The mechanistic model that is used to describe the increased breast cancer incidence is based on an earlier study by Moolgavkar et al. (1980), in which the natural background incidence of breast cancer was modeled. The model allows for a more sophisticated extrapolation of risks to the low dose X-ray exposures that are common in mammography screening and to the higher ages that are usually involved. Furthermore, it allows for risk transfer to other (non-western) populations. The results have implications for decisions on the frequency of screening, the number of mammograms taken at each screening, minimum and maximum ages for screening and the transfer to digital equipment. (author)

  8. Risk-prone individuals prefer the wrong options on a rat version of the Iowa Gambling Task.

    Science.gov (United States)

    Rivalan, Marion; Ahmed, Serge H; Dellu-Hagedorn, Françoise

    2009-10-15

    Decision making in complex and conflicting situations, as measured in the widely used Iowa Gambling Task (IGT), can be profoundly impaired in psychiatric disorders, such as attention-deficit/hyperactivity disorder, drug addiction, and also in healthy individuals for whom immediate gratification prevails over long-term gain. The cognitive processes underlying these deficits are poorly understood, in part due to a lack of suitable animal models assessing complex decision making with good construct validity. We developed a rat gambling task analogous to the IGT that tracks, for the first time, the ongoing decision process within a single session in an operant cage. Rats could choose between various options. Disadvantageous options, as opposed to advantageous ones, offered bigger immediate food reward but were followed by longer, unpredictable penalties (time-out). The majority of rats can evaluate and deduce favorable options more or less rapidly according to task complexity, whereas others systematically choose disadvantageously. These interindividual differences are stable over time and do not depend on task difficulty or on the level of food restriction. We find that poor decision making does not result from a failure to acquire relevant information but from hypersensitivity to reward and higher risk taking in anxiogenic situations. These results suggest that rats, as well as human poor performers, share similar traits to those observed in decision-making related psychiatric disorders. These traits could constitute risk factors of developing such disorders. The rapid identification of poor decision makers using the rat gambling task should promote the discovery of the specific brain dysfunctions that cause maladapted decision making.

  9. Identifying child abuse and neglect risk among postpartum women in Japan using the Japanese version of the Kempe Family Stress Checklist.

    Science.gov (United States)

    Baba, Kaori; Kataoka, Yaeko

    2014-11-01

    The aims of this study were to determine the rate of women who are high-risk for child abuse and neglect in a perinatal unit in Japan, and to identify the factors associated with risk level. To assess the potential risk for child abuse and neglect the Japanese version of the Kempe Family Stress Checklist (FSC-J) was used to guide interviews with postpartum women. FSC-J uses a three-point scale to score 10 categories, categorizing responses as "no risk=0", "risk=5", and "high risk=10". The range of FSC-J is 0-100. Using an established cutoff point of 25, subjects were divided into high and low risk groups. For both groups, relationships between factors were analyzed. Of the 174 subjects who agreed to participate, 12 (6.9%) scored high-risk, and 162 (93.1%) scored low-risk. Adjusted odds ratio identified three associated factors as important for predicting risk level: past mental illness (OR=341.1), previous experience of intimate partner violence (OR=68.0), and having a partner who was unemployed (OR=14.5). Although this study was on a small sample of women in one hospital in Japan and a larger population would make this study much stronger, these results suggest that some 6.9% of postpartum women in Japan may be at high-risk for child abuse and neglect. It is critical, therefore, to develop a system for screening, intervention, and referral for such women and their children. Copyright © 2014 Elsevier Ltd. All rights reserved.
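
    A small sketch of the scoring rule described above: ten categories, each scored 0, 5 or 10, summed to a 0-100 total, with totals at or above the cutoff of 25 treated as high risk (the handling of the exact boundary is an assumption here). The example scores are invented.

        # Scoring sketch for the FSC-J as described in the abstract.
        ALLOWED = {0, 5, 10}
        CUTOFF = 25

        def fscj_total(scores):
            # Ten category scores, each 0 ("no risk"), 5 ("risk") or 10 ("high risk")
            if len(scores) != 10 or any(s not in ALLOWED for s in scores):
                raise ValueError("expected ten scores, each 0, 5 or 10")
            return sum(scores)

        def risk_group(scores):
            # Totals at or above the established cutoff are classified as high risk
            return "high risk" if fscj_total(scores) >= CUTOFF else "low risk"

        example = [0, 5, 0, 10, 0, 0, 5, 0, 10, 0]       # hypothetical interview result
        print(fscj_total(example), risk_group(example))   # 30, high risk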

  10. Risk analysis: divergent models and convergent interpretations

    Science.gov (United States)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.
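
    As a simplified stand-in for the curvature check described above (not the proportional hazards analysis itself), the sketch below fits a linear-quadratic relative-risk curve to hypothetical (dose, relative risk) points; a quadratic coefficient clearly different from zero would indicate curvature in the dose response.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical (dose, relative risk) points; the linear-quadratic excess
        # relative risk form below is a common way to probe for curvature.
        dose = np.array([0.0, 0.1, 0.25, 0.5, 1.0])            # Gy
        rr   = np.array([1.0, 1.05, 1.18, 1.45, 2.3])

        def lq_rr(d, alpha, beta):
            return 1.0 + alpha * d + beta * d**2

        (alpha, beta), cov = curve_fit(lq_rr, dose, rr, p0=[0.5, 0.5])
        print(f"alpha = {alpha:.3f} /Gy, beta = {beta:.3f} /Gy^2")
        # Comparing this fit against one with beta fixed at zero mirrors the
        # model-comparison logic described in the abstract.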

  11. Evaluation of the nutritional status of older hospitalised geriatric patients: a comparative analysis of a Mini Nutritional Assessment (MNA) version and the Nutritional Risk Screening (NRS 2002).

    Science.gov (United States)

    Christner, S; Ritt, M; Volkert, D; Wirth, R; Sieber, C C; Gaßmann, K-G

    2016-12-01

    The present study aimed to evaluate a short-form (MNA-SF) version of the Mini Nutritional Assessment (MNA), in which some of the items were operationalised, based on scores from tools used for a comprehensive geriatric assessment, as a method for analysing the nutritional status of hospitalised geriatric patients. We compared this MNA-SF version with the corresponding MNA long-form (MNA-LF) and Nutritional Risk Screening 2002 (NRS 2002) in terms of completion rate, prevalence and agreement regarding malnutrition and/or the risk of this. In total, 201 patients aged ≥65 years who were hospitalised in geriatric wards were included in this analysis. The MNA-SF, MNA-LF and NRS 2002 were completed in 98.0%, 95.5% and 99.5% of patients (P = 0.06), respectively. The MNA-SF, MNA-LF and NRS 2002 categorised 93.4%, 91.1% and 66.0% of patients as being malnourished or at risk of being malnourished (P evaluating the nutritional status of hospitalised geriatric patients. The NRS 2002 part 1 showed limited value as a prescreening aid in relation to the NRS 2002 part 2 in the same group of patients. © 2016 The British Dietetic Association Ltd.

  12. Risk management model in road transport systems

    Science.gov (United States)

    Sakhapov, R. L.; Nikolaeva, R. V.; Gatiyatullin, M. H.; Makhmutov, M. M.

    2016-08-01

    The article presents the results of a study of road safety indicators that influence the development and operation of the transport system. Road safety is considered as a continuous process of risk management. The authors constructed a model that relates social risk to a major road safety indicator, the level of motorization. The model gives a fairly accurate assessment of the level of social risk for any given level of motorization. The authors also calculated the dependence of socio-economic costs on the numbers of accidents and of people injured in them. The applicability of the concept of socio-economic damage rests on the presence of a linear relationship between the physical and economic indicators of damage from accidents. The optimization of social risk is reduced to finding the extremum of an objective function that characterizes the economic effect of implementing measures to improve safety. The calculations make it possible to maximize the net present value as a function of the costs of improving road safety, taking into account the socio-economic damage caused by accidents. The proposed econometric models make it possible to quantify the efficiency of the transportation system and allow simulation of changes in road safety indicators.
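
    The optimisation described above, finding the extremum of an economic objective, can be sketched as follows; the diminishing-returns benefit curve, discount rate and bounds are invented for illustration and are not the authors' econometric models.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def avoided_damage(x):
            # Annual socio-economic accident costs avoided by spending x (M$),
            # modelled here with diminishing returns purely for illustration.
            return 120.0 * (1.0 - np.exp(-x / 40.0))

        def npv(x, discount=0.08, years=10):
            # Net present value: discounted stream of avoided damage minus the
            # up-front cost of the road-safety measures.
            annuity = (1.0 - (1.0 + discount) ** -years) / discount
            return avoided_damage(x) * annuity - x

        res = minimize_scalar(lambda x: -npv(x), bounds=(0.0, 500.0), method="bounded")
        print(f"optimal spending ~ {res.x:.1f} M$, NPV ~ {npv(res.x):.1f} M$")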

  13. Recent extensions and use of the statistical model code EMPIRE-II - version: 2.17 Millesimo

    International Nuclear Information System (INIS)

    Herman, M.

    2003-01-01

    These lecture notes describe new features of the modular code EMPIRE-2.17, designed to perform comprehensive calculations of nuclear reactions using a variety of nuclear reaction models. Compared to version 2.13, the current release has been extended by including the coupled-channel mechanism, the exciton model, a Monte Carlo approach to preequilibrium emission, use of microscopic level densities, the width fluctuation correction, detailed calculation of the recoil spectra, and powerful plotting capabilities provided by the ZVView package. The second part of these lecture notes concentrates on the use of the code in practical calculations, with emphasis on the aspects relevant to nuclear data evaluation. In particular, adjusting model parameters is discussed in detail. (author)

  14. Construction of Site Risk Model using Individual Unit Risk Model in a NPP Site

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Ho Gon; Han, Sang Hoon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    Since the Fukushima accident, the need to estimate site risk has increased, both to assess the possibility of such a tremendous disaster recurring and to prevent it. For a site with a large fleet of nuclear power plants in particular, reliable site risk assessment is urgently needed to confirm safety. In Korea, there are several nuclear power plant sites with more than six NPPs. In general, the PSA risk model of a single NPP is very complicated, and the site risk model is expected to be even more complex. In this paper, a method for constructing a site risk model from individual unit risk models is proposed. A procedure for the development of the site damage (risk) model is proposed in the present paper. Since the site damage model is complicated in terms of the scale of the system and the dependencies among its components, conventional methods may not be applicable to many aspects of the problem.

  15. Value at Risk models for Energy Risk Management

    OpenAIRE

    Novák, Martin

    2010-01-01

    The main focus of this thesis lies on the description of Risk Management in the context of Energy Trading. The paper will predominantly discuss Value at Risk and its modifications as a main overall indicator of Energy Risk.
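
    As a reference point for the methods such a thesis covers, the sketch below computes a historical-simulation Value at Risk and the accompanying expected shortfall for a simulated daily P&L series; the distribution and scale of the P&L are arbitrary assumptions.

        import numpy as np

        # Historical-simulation VaR for a stream of daily P&L values, one of the
        # standard estimators discussed in energy risk management.
        rng = np.random.default_rng(1)
        pnl = rng.normal(loc=0.0, scale=250_000.0, size=1000)     # daily P&L in EUR

        confidence = 0.95
        var_95 = -np.percentile(pnl, 100.0 * (1.0 - confidence))
        print(f"1-day 95% VaR = {var_95:,.0f} EUR")

        # Expected shortfall (average loss beyond VaR), often reported alongside VaR
        es_95 = -pnl[pnl <= -var_95].mean()
        print(f"1-day 95% expected shortfall = {es_95:,.0f} EUR")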

  16. Online dynamical downscaling of temperature and precipitation within the iLOVECLIM model (version 1.1)

    Science.gov (United States)

    Quiquet, Aurélien; Roche, Didier M.; Dumas, Christophe; Paillard, Didier

    2018-02-01

    This paper presents the inclusion of an online dynamical downscaling of temperature and precipitation within the model of intermediate complexity iLOVECLIM v1.1. We describe the following methodology to generate temperature and precipitation fields on a 40 km × 40 km Cartesian grid of the Northern Hemisphere from the T21 native atmospheric model grid. Our scheme is not grid specific and conserves energy and moisture in the same way as the original climate model. We show that we are able to generate a high-resolution field which presents a spatial variability in better agreement with the observations compared to the standard model. Although the large-scale model biases are not corrected, for selected model parameters, the downscaling can induce a better overall performance compared to the standard version on both the high-resolution grid and on the native grid. Foreseen applications of this new model feature include the improvement of ice sheet model coupling and high-resolution land surface models.

  17. Online dynamical downscaling of temperature and precipitation within the iLOVECLIM model (version 1.1)

    Directory of Open Access Journals (Sweden)

    A. Quiquet

    2018-02-01

    Full Text Available This paper presents the inclusion of an online dynamical downscaling of temperature and precipitation within the model of intermediate complexity iLOVECLIM v1.1. We describe the following methodology to generate temperature and precipitation fields on a 40 km  ×  40 km Cartesian grid of the Northern Hemisphere from the T21 native atmospheric model grid. Our scheme is not grid specific and conserves energy and moisture in the same way as the original climate model. We show that we are able to generate a high-resolution field which presents a spatial variability in better agreement with the observations compared to the standard model. Although the large-scale model biases are not corrected, for selected model parameters, the downscaling can induce a better overall performance compared to the standard version on both the high-resolution grid and on the native grid. Foreseen applications of this new model feature include the improvement of ice sheet model coupling and high-resolution land surface models.

  18. Crop insurance: Risks and models of insurance

    Directory of Open Access Journals (Sweden)

    Čolović Vladimir

    2014-01-01

    Full Text Available The issue of crop protection is very important because of a variety of risks that could cause severe consequences. One type of risk protection is insurance. In the paper, the author describes various models of insurance in some EU countries and the systems of subsidizing insurance premiums by the state. The author also gives a picture of crop insurance in the U.S., noting that this country pays great attention to this matter. Crop insurance in Serbia, by contrast, is not at a high level. The main problem with crop insurance is not only the risks but also the way of protection through insurance. The basic question that arises, not only in the EU, is who will insure and protect crops. There are three possibilities: insurance companies under state control, insurance companies that are public-private partnerships, or private insurance companies on a purely commercial basis.

  19. A psychometric evaluation of the Swedish version of the Research Utilization Questionnaire using a Rasch measurement model.

    Science.gov (United States)

    Lundberg, Veronica; Boström, Anne-Marie; Malinowsky, Camilla

    2017-07-30

    Evidence-based practice and research utilisation has become a commonly used concept in health care. The Research Utilization Questionnaire (RUQ) has been recognised to be a widely used instrument measuring the perception of research utilisation among nursing staff in clinical practice. Few studies have however analysed the psychometric properties of the RUQ. The aim of this study was to examine the psychometric properties of the Swedish version of the three subscales in RUQ using a Rasch measurement model. This study has a cross-sectional design using a sample of 163 staff (response rate 81%) working in one nursing home in Sweden. Data were collected using the Swedish version of RUQ in 2012. The three subscales Attitudes towards research, Availability of and support for research use and Use of research findings in clinical practice were investigated. Data were analysed using a Rasch measurement model. The results indicate presence of multidimensionality in all subscales. Moreover, internal scale validity and person response validity also provide some less satisfactory results, especially for the subscale Use of research findings. Overall, there seems to be a problem with the negatively worded statements. The findings suggest that clarification and refining of items, including additional psychometric evaluation of the RUQ, are needed before using the instrument in clinical practice and research studies among staff in nursing homes. © 2017 Nordic College of Caring Science.
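
    A minimal sketch of the dichotomous Rasch model machinery behind such an evaluation: given assumed person abilities and item difficulties, expected scores and the usual outfit/infit mean-square item-fit statistics are computed from a made-up response matrix. This is not the analysis pipeline used in the study.

        import numpy as np

        theta = np.array([-0.5, 0.2, 1.1])                  # assumed person abilities
        b     = np.array([-1.0, -0.2, 0.4, 1.3])            # assumed item difficulties
        X     = np.array([[1, 1, 0, 0],
                          [1, 0, 1, 0],
                          [1, 1, 1, 1]])                    # invented 0/1 responses

        # Rasch model probability of a correct/endorsed response for each person-item pair
        P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        W = P * (1.0 - P)                                   # response variances
        Z2 = (X - P) ** 2 / W                               # squared standardized residuals

        outfit = Z2.mean(axis=0)                            # unweighted item fit (MSQ)
        infit  = ((X - P) ** 2).sum(axis=0) / W.sum(axis=0) # information-weighted item fit
        print("item outfit MSQ:", np.round(outfit, 2))
        print("item infit  MSQ:", np.round(infit, 2))

    Large mean-square values flag items that misfit the model, which is the kind of evidence behind conclusions such as the multidimensionality and negatively worded item problems reported above.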

  20. Validation of the Malaysian versions of Parents and Children Health Survey for Asthma by using Rasch model.

    Science.gov (United States)

    Hussein, Maryam Se; Akram, Waqas; Mamat, Mohd Nor; Majeed, Abu Bakar Abdul; Ismail, Nahlah Elkudssiah Binti

    2015-04-01

    In recent years, health-related quality of life (HRQOL) has become an important outcome measure in epidemiologic studies and clinical trials. For patients with asthma there are many instruments, but most of them have been developed in English. With the increase in research projects, researchers working in other languages have two options: either to develop a new measure or to translate an already developed one. The Children Health Survey for Asthma, developed by the American Academy of Pediatrics, has two versions, one for the parents (CHSA) and the other for the child (CHSA-C). However, there is no Malay version of the CHSA or the CHSA-C. The aim of this study was to translate and determine the validity and reliability of the Malaysian versions of the Parent and Children Health Survey for Asthma. The questionnaires were translated to Bahasa Malayu using previously established guidelines, and data from 180 respondents (asthmatic children and their parents) were analysed using the Rasch model, an approach that has been increasingly used in the health field and that explores the performance of each item rather than the total set score. Internal consistency was high for the parent questionnaire (CHSA) (reliability score 0.88 for persons and 0.97 for items) and good for the child questionnaire (CHSA-C) (reliability score 0.83 for persons and 0.94 for items). This study also shows that all items of both questionnaires (CHSA and CHSA-C) fit the Rasch model. The study produced questionnaires that are conceptually equivalent to the original, easy to understand for the children and their parents, and good in terms of internal consistency. Because the questionnaire has two versions, one for the child and the other for the parents, they could be used in clinical practice to measure the effect of asthma on the child and their families. This research translated two instruments into another language (Bahasa Malayu) and evaluated their reliability and

  1. Improvement of the projection models for radiogenic cancer risk

    International Nuclear Information System (INIS)

    Tong Jian

    2005-01-01

    Calculations of radiogenic cancer risk are based on risk projection models for specific cancer sites. Improvements have been made to the parameters used in the previous models, including the introduction of mortality and morbidity risk coefficients and age-/gender-specific risk coefficients. These coefficients have been applied to calculate the radiogenic cancer risks for specific organs and radionuclides under different exposure scenarios. (authors)

  2. Accounting for observation uncertainties in an evaluation metric of low latitude turbulent air-sea fluxes: application to the comparison of a suite of IPSL model versions

    Science.gov (United States)

    Servonnat, Jérôme; Găinuşă-Bogdan, Alina; Braconnot, Pascale

    2017-09-01

    Turbulent momentum and heat (sensible heat and latent heat) fluxes at the air-sea interface are key components of the whole energetics of the Earth's climate. The evaluation of these fluxes in the climate models is still difficult because of the large uncertainties associated with the reference products. In this paper we present an objective metric accounting for reference uncertainties to evaluate the annual cycle of the low latitude turbulent fluxes of a suite of IPSL climate models. This metric consists of a Hotelling T2 test between the simulated and observed field in a reduced space characterized by the dominant modes of variability that are common to both the model and the reference, taking into account the observational uncertainty. The test is thus more severe when uncertainties are small, as is the case for sea surface temperature (SST). The results of the test show that for almost all variables and all model versions the model-reference differences are not zero. It is not possible to distinguish between model versions for sensible heat and meridional wind stress, certainly due to the large observational uncertainties. All model versions share similar biases for the different variables. There is no improvement between the reference versions of the IPSL model used for CMIP3 and CMIP5. The test also reveals that the higher horizontal resolution fails to improve the representation of the turbulent surface fluxes compared to the other versions. The representation of the fluxes is further degraded in a version with improved atmospheric physics with an amplification of some of the biases in the Indian Ocean and in the intertropical convergence zone. The ranking of the model versions for the turbulent fluxes is not correlated with the ranking found for SST. This highlights that despite the fact that SST gradients are important for the large-scale atmospheric circulation patterns, other factors such as wind speed and air-sea temperature contrast play an important role.

  3. Medical Updates Number 5 to the International Space Station Probability Risk Assessment (PRA) Model Using the Integrated Medical Model

    Science.gov (United States)

    Butler, Doug; Bauman, David; Johnson-Throop, Kathy

    2011-01-01

    The Integrated Medical Model (IMM) Project has been developing a probabilistic risk assessment tool, the IMM, to help evaluate in-flight crew health needs and impacts to the mission due to medical events. This package is a follow-up to a data package provided in June 2009. The IMM currently represents 83 medical conditions and associated ISS resources required to mitigate medical events. IMM end state forecasts relevant to the ISS PRA model include evacuation (EVAC) and loss of crew life (LOCL). The current version of the IMM provides the basis for the operational version of IMM expected in the January 2011 timeframe. The objectives of this data package are: 1. To provide a preliminary understanding of medical risk data used to update the ISS PRA Model. The IMM has had limited validation and an initial characterization of maturity has been completed using NASA STD 7009 Standard for Models and Simulation. The IMM has been internally validated by IMM personnel but has not been validated by an independent body external to the IMM Project. 2. To support a continued dialogue between the ISS PRA and IMM teams. To ensure accurate data interpretation, and that IMM output format and content meets the needs of the ISS Risk Management Office and ISS PRA Model, periodic discussions are anticipated between the risk teams. 3. To help assess the differences between the current ISS PRA and IMM medical risk forecasts of EVAC and LOCL. Follow-on activities are anticipated based on the differences between the current ISS PRA medical risk data and the latest medical risk data produced by IMM.

  4. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    Science.gov (United States)

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 1 object-oriented parameter estimation code is here extended to Version 3 to incorporate additional algorithms and tools to further improve support for large and complex environmental modeling problems. PEST++ Version 3 includes the Gauss-Marquardt-Levenberg (GML) algorithm for nonlinear parameter estimation, Tikhonov regularization, integrated linear-based uncertainty quantification, options of integrated TCP/IP based parallel run management or external independent run management by use of a Version 2 update of the GENIE Version 1 software code, and utilities for global sensitivity analyses. The Version 3 code design is consistent with PEST++ Version 1 and continues to be designed to lower the barriers of entry for users as well as developers while providing efficient and optimized algorithms capable of accommodating large, highly parameterized inverse problems. As such, this effort continues the original focus of (1) implementing the most popular and powerful features of the PEST software suite in a fashion that is easy for novice or experienced modelers to use and (2) developing a software framework that is easy to extend.
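
    A minimal analogue of a GML-style estimation step, using SciPy's Levenberg-Marquardt least squares on a toy two-parameter model rather than PEST++ itself; the observations, weights and model function are invented, and the covariance estimate at the end stands in for the linear-based uncertainty quantification mentioned above.

        import numpy as np
        from scipy.optimize import least_squares

        obs_t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])         # hypothetical observation times
        obs_y = np.array([10.2, 7.4, 5.6, 3.1, 1.0])        # hypothetical observed values
        weights = np.full_like(obs_y, 1.0)

        def simulate(params, t):
            # Toy "model": a two-parameter exponential decay
            a, k = params
            return a * np.exp(-k * t)

        def residuals(params):
            return weights * (simulate(params, obs_t) - obs_y)

        fit = least_squares(residuals, x0=[5.0, 0.1], method="lm")
        print("estimated parameters:", fit.x)

        # First-order, linear-based uncertainty: approximate the parameter
        # covariance from the Jacobian at the optimum.
        J = fit.jac
        dof = max(len(obs_y) - len(fit.x), 1)
        sigma2 = (fit.fun @ fit.fun) / dof
        cov = sigma2 * np.linalg.inv(J.T @ J)
        print("parameter standard errors:", np.sqrt(np.diag(cov)))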

  5. Assessment of two versions of regional climate model in simulating the Indian Summer Monsoon over South Asia CORDEX domain

    Science.gov (United States)

    Pattnayak, K. C.; Panda, S. K.; Saraswat, Vaishali; Dash, S. K.

    2018-04-01

    This study assesses the performance of two versions of the Regional Climate Model (RegCM) in simulating the Indian summer monsoon over South Asia for the period 1998 to 2003, with an aim of conducting future climate change simulations. Two sets of experiments were carried out with two different versions of RegCM (viz. RegCM4.2 and RegCM4.3) with the lateral boundary forcings provided from the European Center for Medium Range Weather Forecast Reanalysis (ERA-interim) at 50 km horizontal resolution. The major updates in RegCM4.3 in comparison to the older version RegCM4.2 are the inclusion of measured solar irradiance in place of the hardcoded solar constant and additional layers in the stratosphere. The analysis shows that the Indian summer monsoon rainfall, moisture flux and surface net downward shortwave flux are better represented in RegCM4.3 than in the RegCM4.2 simulations. Excessive moisture flux in the RegCM4.2 simulation over the northern Arabian Sea and Peninsular India resulted in an overestimation of rainfall over the Western Ghats and the Peninsular region, as a result of which the all-India rainfall has been overestimated. RegCM4.3 has performed well over India as a whole as well as over its four homogeneous rainfall zones in reproducing the mean monsoon rainfall and the inter-annual variation of rainfall. Further, the monsoon onset, the low-level Somali Jet and the upper level tropical easterly jet are better represented in RegCM4.3 than in RegCM4.2. Thus, RegCM4.3 has performed better in simulating the mean summer monsoon circulation over South Asia. Hence, RegCM4.3 may be used to study future climate change over South Asia.

  6. The MIRAB Model of Small Island Economies in the Pacific and their Security Issues: Revised Version

    OpenAIRE

    Tisdell, Clem

    2014-01-01

    The MIRAB model of Pacific island micro-economies was developed in the mid-1980s by the New Zealand economists, Bertram and Watters, and dominated the literature on the economics of small island nations and economies until alternative models were proposed two decades later. Nevertheless, it is still an influential theory. MIRAB is an acronym for migration (MI), remittance (R) and foreign aid (A) and the public bureaucracy (B); the main components of the MIRAB model. The nature of this model i...

  7. Human Plague Risk: Spatial-Temporal Models

    Science.gov (United States)

    Pinzon, Jorge E.

    2010-01-01

    This chapter reviews the use of spatial-temporal models in identifying potential risks of plague outbreaks into the human population. Using Earth observations from satellite remote sensing, there has been a systematic analysis and mapping of the close coupling between the vectors of the disease and climate variability. The overall result is that the incidence of plague is correlated with positive El Nino/Southern Oscillation (ENSO).

  8. Implementation of methane cycling for deep time, global warming simulations with the DCESS Earth System Model (Version 1.2)

    DEFF Research Database (Denmark)

    Shaffer, Gary; Villanueva, Esteban Fernández; Rondanelli, Roberto

    2017-01-01

    Geological records reveal a number of ancient, large and rapid negative excursions of carbon-13 isotope. Such excursions can only be explained by massive injections of depleted carbon to the Earth System over a short duration. These injections may have forced strong global warming events, sometimes....... With this improved DCESS model version and paleo-reconstructions, we are now better armed to gauge the amounts, types, time scales and locations of methane injections driving specific, observed deep time, global warming events......., or from warming-induced dissociation of methane hydrate, a solid compound of methane and water found in ocean sediments. As a consequence of the ubiquity and importance of methane in major Earth events, Earth System models should include a comprehensive treatment of methane cycling but such a treatment...

  9. T-HERPS Version 1.0 User's Guide for Risk to Amphibians and Reptiles from Pesticides

    Science.gov (United States)

    T-HERPS estimates dietary exposure and risk to terrestrial-phase amphibians and reptiles from pesticide use. Currently approved for assessing exposure and risk to the California red-legged frog and terrestrial-phase herptiles with similar dietary behavior

  10. A framework for expanding aqueous chemistry in the Community Multiscale Air Quality (CMAQ) model version 5.1

    Science.gov (United States)

    Fahey, Kathleen M.; Carlton, Annmarie G.; Pye, Havala O. T.; Baek, Jaemeen; Hutzell, William T.; Stanier, Charles O.; Baker, Kirk R.; Wyat Appel, K.; Jaoui, Mohammed; Offenberg, John H.

    2017-04-01

    This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM - KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosenbrock solver (Rodas3) to integrate the stiff system of ordinary differential equations (ODEs) that describe the mass transfer, chemical kinetics, and scavenging processes of CMAQ clouds. CMAQ's standard cloud chemistry module (AQCHEM) is structurally limited to the treatment of a simple chemical mechanism. This work advances our ability to test and implement more sophisticated aqueous chemical mechanisms in CMAQ and further investigate the impacts of microphysical parameters on cloud chemistry. Box model cloud chemistry simulations were performed to choose efficient solver and tolerance settings, evaluate the implementation of the KPP solver, and assess the direct impacts of alternative solver and kinetic mass transfer on predicted concentrations for a range of scenarios. Month-long CMAQ simulations for winter and summer periods over the US reveal the changes in model predictions due to these cloud module updates within the full chemical transport model. While monthly average CMAQ predictions are not drastically altered between AQCHEM and AQCHEM - KMT, hourly concentration differences can be significant. With added in-cloud secondary organic aerosol (SOA) formation from biogenic epoxides (AQCHEM - KMTI), normalized mean error and bias statistics are slightly improved for 2-methyltetrols and 2-methylglyceric acid at the Research Triangle Park measurement site in North Carolina during the Southern Oxidant and Aerosol Study (SOAS) period. The added in-cloud chemistry leads to a monthly average increase of 11-18 % in cloud SOA at the surface in the eastern United States for June 2013.
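
    To illustrate the solver pattern described above, the sketch below integrates a toy aqueous SO2-oxidation system with SciPy's implicit Radau method standing in for the KPP-generated Rosenbrock (Rodas3) solver; the species, rate constants and initial concentrations are placeholders, not the AQCHEM-KMT mechanism.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy aqueous-phase system: dissolved SO2 oxidised by H2O2 and O3 with
        # widely differing effective rate constants (made-up values).
        k_h2o2, k_o3 = 7.5e4, 3.0e-2

        def rhs(t, y):
            so2, h2o2, o3, so4 = y
            r1 = k_h2o2 * so2 * h2o2
            r2 = k_o3 * so2 * o3
            return [-(r1 + r2), -r1, -r2, r1 + r2]

        y0 = [1.0e-6, 8.0e-7, 4.0e-8, 0.0]      # initial aqueous concentrations (mol/L)
        sol = solve_ivp(rhs, (0.0, 3600.0), y0, method="Radau", rtol=1e-6, atol=1e-12)
        print("sulfate formed after 1 h:", sol.y[3, -1])

    An implicit, stiff-capable integrator is the relevant design choice here; the KPP-generated Rosenbrock solver used in AQCHEM-KMT plays the same role inside CMAQ but is not available through SciPy.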

  11. Model of Axiological Dimension Risk Management

    Directory of Open Access Journals (Sweden)

    Kulińska Ewa

    2016-01-01

    Full Text Available The methodological basis for constructing the axiological dimension of risk management (ADRM) model of logistics processes was determined from results that identify the key prerequisites for integrating the management of logistics processes, the management of the value creation process, and risk management. Taking into account the contribution of the individual concepts to this new research area, its essence was defined as an integrated, structured instrumentation aimed at identifying and implementing logistics processes that support the creation of added value, as well as at identifying and eliminating risk factors that disturb the process of value creation for internal and external customers. The basis of the ADRM concept of logistics processes is the use of the potential inherent in the synergistic effects obtained by exploiting the prerequisites for integrating the management of logistics processes, value creation and risk management as the key determinants of value creation.

  12. On-the-fly confluence detection for statistical model checking (extended version)

    NARCIS (Netherlands)

    Hartmanns, Arnd; Timmer, Mark

    Statistical model checking is an analysis method that circumvents the state space explosion problem in model-based verification by combining probabilistic simulation with statistical methods that provide clear error bounds. As a simulation-based technique, it can only provide sound results if the

  13. Technical documentation and user's guide for City-County Allocation Model (CCAM). Version 1.0

    International Nuclear Information System (INIS)

    Clark, L.T. Jr.; Scott, M.J.; Hammer, P.

    1986-05-01

    The City-County Allocation Model (CCAM) was developed as part of the Monitored Retrievable Storage (MRS) Program. The CCAM model was designed to allocate population changes forecasted by the MASTER model to specific local communities within commuting distance of the MRS facility. The CCAM model was designed to then forecast the potential changes in demand for key community services such as housing, police protection, and utilities for these communities. The CCAM model uses a flexible on-line data base on demand for community services that is based on a combination of local service levels and state and national service standards. The CCAM model can be used to quickly forecast the potential community service consequences of economic development for local communities anywhere in the country. The remainder of this document is organized as follows. The purpose of this manual is to assist the user in understanding and operating the City-County Allocation Model (CCAM). The manual explains the data sources for the model and code modifications as well as the operational procedures.

  14. Comments on a time-dependent version of the linear-quadratic model

    International Nuclear Information System (INIS)

    Tucker, S.L.; Travis, E.L.

    1990-01-01

    The accuracy and interpretation of the 'LQ + time' model are discussed. Evidence is presented, based on data in the literature, that this model does not accurately describe the changes in isoeffect dose occurring with protraction of the overall treatment time during fractionated irradiation of the lung. This lack of fit of the model explains, in part, the surprisingly large values of γ/α that have been derived from experimental lung data. The large apparent time factors for lung suggested by the model are also partly explained by the fact that γT/α, despite having units of dose, actually measures the influence of treatment time on the effect scale, not the dose scale, and is shown to consistently overestimate the change in total dose. The unusually high values of α/β that have been derived for lung using the model are shown to be influenced by the method by which the model was fitted to data. Reanalyses of the data using a more statistically valid regression procedure produce estimates of α/β more typical of those usually cited for lung. Most importantly, published isoeffect data from lung indicate that the true deviation from the linear-quadratic (LQ) model is nonlinear in time, instead of linear, and also depends on other factors such as the effect level and the size of dose per fraction. Thus, the authors do not advocate the use of the 'LQ + time' expression as a general isoeffect model. (author). 32 refs.; 3 figs.; 1 tab

  15. Hydrogen Macro System Model User Guide, Version 1.2.1

    Energy Technology Data Exchange (ETDEWEB)

    Ruth, M.; Diakov, V.; Sa, T.; Goldsby, M.; Genung, K.; Hoseley, R.; Smith, A.; Yuzugullu, E.

    2009-07-01

    The Hydrogen Macro System Model (MSM) is a simulation tool that links existing and emerging hydrogen-related models to perform rapid, cross-cutting analysis. It allows analysis of the economics, primary energy-source requirements, and emissions of hydrogen production and delivery pathways.

  16. Electricity market pricing, risk hedging and modeling

    Science.gov (United States)

    Cheng, Xu

    In this dissertation, we investigate the pricing, price risk hedging/arbitrage, and simplified system modeling for a centralized LMP-based electricity market. In an LMP-based market model, the full AC power flow model and the DC power flow model are most widely used to represent the transmission system. We investigate the differences of dispatching results, congestion pattern, and LMPs for the two power flow models. An appropriate LMP decomposition scheme to quantify the marginal costs of the congestion and real power losses is critical for the implementation of financial risk hedging markets. However, the traditional LMP decomposition heavily depends on the slack bus selection. In this dissertation we propose a slack-independent scheme to break LMP down into energy, congestion, and marginal loss components by analyzing the actual marginal cost of each bus at the optimal solution point. The physical and economic meanings of the marginal effect at each bus provide accurate price information for both congestion and losses, and thus the slack-dependency of the traditional scheme is eliminated. With electricity priced at the margin instead of the average value, the market operator typically collects more revenue from power sellers than that paid to power buyers. According to the LMP decomposition results, the revenue surplus is then divided into two parts: congestion charge surplus and marginal loss revenue surplus. We apply the LMP decomposition results to the financial tools, such as financial transmission right (FTR) and loss hedging right (LHR), which have been introduced to hedge against price risks associated to congestion and losses, to construct a full price risk hedging portfolio. The two-settlement market structure and the introduction of financial tools inevitably create market manipulation opportunities. We investigate several possible market manipulation behaviors by virtual bidding and propose a market monitor approach to identify and quantify such
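
    For orientation, the sketch below shows the conventional slack-referenced decomposition of locational marginal prices into energy, congestion and loss components on a three-bus example (the dissertation proposes a slack-independent alternative). The shadow prices, shift factors and loss sensitivities are invented, and sign conventions vary between formulations.

        import numpy as np

        lam = 30.0                                      # system energy price ($/MWh)
        mu = np.array([5.0])                            # shadow price of one binding line
        ptdf = np.array([[0.4, -0.2, 0.1]])             # line-to-bus shift factors
        dloss_dp = np.array([0.02, -0.01, 0.03])        # marginal loss sensitivities

        energy = np.full(3, lam)
        congestion = -(mu @ ptdf)                       # congestion component at each bus
        loss = -lam * dloss_dp                          # marginal loss component

        lmp = energy + congestion + loss
        for i, (e, c, l, p) in enumerate(zip(energy, congestion, loss, lmp)):
            print(f"bus {i}: energy {e:.2f}  congestion {c:+.2f}  loss {l:+.2f}  LMP {p:.2f}")

    Summing the congestion and loss components over the cleared injections and withdrawals is what produces the congestion charge surplus and marginal loss revenue surplus discussed above.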

  17. Model Package Report: Central Plateau Vadose Zone Geoframework Version 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Springer, Sarah D.

    2018-03-27

    The purpose of the Central Plateau Vadose Zone (CPVZ) Geoframework model (GFM) is to provide a reasonable, consistent, and defensible three-dimensional (3D) representation of the vadose zone beneath the Central Plateau at the Hanford Site to support the Composite Analysis (CA) vadose zone contaminant fate and transport models. The GFM is a 3D representation of the subsurface geologic structure. From this 3D geologic model, exported results in the form of point, surface, and/or volumes are used as inputs to populate and assemble the various numerical model architectures, providing a 3D-layered grid that is consistent with the GFM. The objective of this report is to define the process used to produce a hydrostratigraphic model for the vadose zone beneath the Hanford Site Central Plateau and the corresponding CA domain.

  18. U.S. Coast Guard Guide for the Management of Crew Endurance Risk Factors - Version 1.0

    National Research Council Canada - National Science Library

    Comperatore, Carlos

    2001-01-01

    .... This Guide will show you how to identify and manage crew endurance risk factors. The step-by-step process will guide you in selecting and implementing the controls necessary to improve crew endurance...

  19. Regional scale ecological risk assessment: using the relative risk model

    National Research Council Canada - National Science Library

    Landis, Wayne G

    2005-01-01

    ...) in the performance of regional-scale ecological risk assessments. The initial chapters present the methodology and the critical nature of the interaction between risk assessors and decision makers...

  20. Multicomponent mass transport model: theory and numerical implementation (discrete-parcel-random-walk version)

    International Nuclear Information System (INIS)

    Ahlstrom, S.W.; Foote, H.P.; Arnett, R.C.; Cole, C.R.; Serne, R.J.

    1977-05-01

    The Multicomponent Mass Transfer (MMT) Model is a generic computer code, currently in its third generation, that was developed to predict the movement of radiocontaminants in the saturated and unsaturated sediments of the Hanford Site. This model was designed to use the water movement patterns produced by the unsaturated and saturated flow models coupled with dispersion and soil-waste reaction submodels to predict contaminant transport. This report documents the theoretical foundation and the numerical solution procedure of the current (third) generation of the MMT Model. The present model simulates mass transport processes using an analog referred to as the Discrete-Parcel-Random-Walk (DPRW) algorithm. The basic concepts of this solution technique are described and the advantages and disadvantages of the DPRW scheme are discussed in relation to more conventional numerical techniques such as the finite-difference and finite-element methods. Verification of the numerical algorithm is demonstrated by comparing model results with known closed-form solutions. A brief error and sensitivity analysis of the algorithm with respect to numerical parameters is also presented. A simulation of the tritium plume beneath the Hanford Site is included to illustrate the use of the model in a typical application. 32 figs
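
    A one-dimensional sketch in the spirit of the DPRW analog described above: each discrete parcel is advected with the pore-water velocity and given a random dispersive displacement each time step. The velocity, dispersivity, time step and parcel count are illustrative values, not Hanford parameters.

        import numpy as np

        rng = np.random.default_rng(7)
        n_parcels = 10_000
        x = np.zeros(n_parcels)                 # parcel positions (m)
        v = 0.5                                 # pore-water velocity (m/d)
        alpha_l = 2.0                           # longitudinal dispersivity (m)
        dt = 1.0                                # time step (d)
        D = alpha_l * v                         # dispersion coefficient (m^2/d)

        for _ in range(100):                    # 100 days of transport
            # Advective step plus a Gaussian dispersive step per parcel
            x += v * dt + rng.normal(0.0, np.sqrt(2.0 * D * dt), size=n_parcels)

        print("mean position:", x.mean(), "m   spread (std):", x.std(), "m")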

  1. Statistical analysis of fracture data, adapted for modelling Discrete Fracture Networks-Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Munier, Raymond

    2004-04-01

    The report describes the parameters which are necessary for DFN modelling, the way in which they can be extracted from the data base acquired during site investigations, and their assignment to geometrical objects in the geological model. The purpose here is to present a methodology for use in SKB modelling projects. Though the methodology is deliberately tuned to facilitate subsequent DFN modelling with other tools, some of the recommendations presented here are applicable to other aspects of geo-modelling as well. For instance, we here recommend a nomenclature to be used within SKB modelling projects, which are truly multidisciplinary, to ease communications between scientific disciplines and avoid misunderstanding of common concepts. This report originally occurred as an appendix to a strategy report for geological modelling (SKB-R--03-07). Strategy reports were intended to be successively updated to include experience gained during site investigations and site modelling. Rather than updating the entire strategy report, we choose to present the update of the appendix as a stand-alone document. This document thus replaces Appendix A2 in SKB-R--03-07. In short, the update consists of the following: The target audience has been broadened and as a consequence thereof, the purpose of the document. Correction of errors found in various formulae. All expressions have been rewritten. Inclusion of more worked examples in each section. A new section describing area normalisation. A new section on spatial correlation. A new section describing anisotropy. A new chapter describing the expected output from DFN modelling, within SKB projects.
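
    One of the orientation statistics such a DFN parameterisation needs can be sketched as follows: the mean pole and an approximate Fisher concentration parameter are estimated from a set of fracture poles. The poles are synthetic, and the kappa formula is the standard large-kappa approximation rather than anything prescribed in the report.

        import numpy as np

        rng = np.random.default_rng(3)
        # Synthetic fracture poles (unit vectors) clustered around one orientation
        poles = rng.normal([0.1, 0.2, 1.0], 0.15, size=(50, 3))
        poles /= np.linalg.norm(poles, axis=1, keepdims=True)

        resultant = poles.sum(axis=0)
        R = np.linalg.norm(resultant)           # length of the resultant vector
        n = len(poles)

        mean_pole = resultant / R
        kappa = (n - 1) / (n - R)               # standard approximation for large kappa
        print("mean pole:", np.round(mean_pole, 3), " Fisher kappa ~", round(kappa, 1))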

  2. Statistical analysis of fracture data, adapted for modelling Discrete Fracture Networks-Version 2

    International Nuclear Information System (INIS)

    Munier, Raymond

    2004-04-01

    The report describes the parameters which are necessary for DFN modelling, the way in which they can be extracted from the database acquired during site investigations, and their assignment to geometrical objects in the geological model. The purpose here is to present a methodology for use in SKB modelling projects. Though the methodology is deliberately tuned to facilitate subsequent DFN modelling with other tools, some of the recommendations presented here are applicable to other aspects of geo-modelling as well. For instance, we recommend here a nomenclature to be used within SKB modelling projects, which are truly multidisciplinary, to ease communication between scientific disciplines and avoid misunderstanding of common concepts. This report originally appeared as an appendix to a strategy report for geological modelling (SKB-R--03-07). Strategy reports were intended to be successively updated to include experience gained during site investigations and site modelling. Rather than updating the entire strategy report, we chose to present the update of the appendix as a stand-alone document. This document thus replaces Appendix A2 in SKB-R--03-07. In short, the update consists of the following: the target audience, and as a consequence the purpose of the document, has been broadened; errors found in various formulae have been corrected; all expressions have been rewritten; more worked examples have been included in each section; new sections describe area normalisation, spatial correlation and anisotropy; and a new chapter describes the expected output from DFN modelling within SKB projects.
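
    Fracture orientations of the kind parameterized in such DFN reports are commonly described with a univariate Fisher distribution; the sketch below shows one standard way to sample poles from it (illustrative only; the report's actual distributions and parameter values are not reproduced):

```python
import numpy as np

# Hedged sketch: sampling fracture-pole orientations from a univariate Fisher
# distribution about a vertical mean pole.  kappa and the set size are
# illustrative, not parameters from the report above.
rng = np.random.default_rng(0)

kappa = 15.0        # Fisher concentration parameter (illustrative)
n_fractures = 1000

u = rng.uniform(size=n_fractures)
# Inverse-CDF sampling of the colatitude theta about the mean pole
cos_theta = 1.0 + np.log(u + (1.0 - u) * np.exp(-2.0 * kappa)) / kappa
theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
phi = rng.uniform(0.0, 2.0 * np.pi, size=n_fractures)   # uniform azimuth

# Unit pole vectors (z-axis taken as the mean pole); rotating to an arbitrary
# mean pole is a standard extra step omitted here for brevity.
poles = np.column_stack([
    np.sin(theta) * np.cos(phi),
    np.sin(theta) * np.sin(phi),
    np.cos(theta),
])
print(poles[:3])
```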

  3. A Scalable Version of the Navy Operational Global Atmospheric Prediction System Spectral Forecast Model

    Directory of Open Access Journals (Sweden)

    Thomas E. Rosmond

    2000-01-01

    Full Text Available The Navy Operational Global Atmospheric Prediction System (NOGAPS) includes a state-of-the-art spectral forecast model similar to models run at several major operational numerical weather prediction (NWP) centers around the world. The model, developed by the Naval Research Laboratory (NRL) in Monterey, California, has run operationally at the Fleet Numerical Meteorological and Oceanographic Center (FNMOC) since 1982, and most recently is being run on a Cray C90 in a multi-tasked configuration. Typically the multi-tasked code runs on 10 to 15 processors with an overall parallel efficiency of about 90%. The operational resolution is T159L30, but other operational and research applications run at significantly lower resolutions. A scalable NOGAPS forecast model has been developed by NRL in anticipation of a FNMOC C90 replacement in about 2001, as well as for current NOGAPS research requirements to run on DOD High-Performance Computing (HPC) scalable systems. The model is designed to run with message passing (MPI). Model design criteria include bit reproducibility for different processor numbers and reasonably efficient performance on fully shared memory, distributed memory, and distributed shared memory systems for a wide range of model resolutions. Results for a wide range of processor numbers, model resolutions, and different vendor architectures are presented. Single node performance has been disappointing on RISC based systems, at least compared to vector processor performance. This is a common complaint, and will require careful re-examination of traditional numerical weather prediction (NWP) model software design and data organization to fully exploit future scalable architectures.
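
    The bit-reproducibility design criterion mentioned above can be illustrated with a small sketch (plain Python standing in for an MPI reduction; not NOGAPS code): summing partial results in an order that depends on the domain decomposition generally changes the floating-point result, whereas a fixed global summation order does not.

```python
import numpy as np

# Illustrative only: how a decomposition-dependent reduction breaks bit
# reproducibility, and how a fixed global summation order restores it.
field = np.random.default_rng(1).standard_normal(1_000_000).astype(np.float32)

def decomposition_dependent_sum(x, n_procs):
    # Each "processor" sums its own chunk; the partials are then combined in
    # an order that depends on the number of processors.
    chunks = np.array_split(x, n_procs)
    return sum(float(np.sum(c)) for c in chunks)

def reproducible_sum(x, n_procs):
    # Gather the local chunks back into global order, then sum once in a fixed
    # order; the result no longer depends on n_procs.
    chunks = np.array_split(x, n_procs)
    return float(np.sum(np.concatenate(chunks)))

print(decomposition_dependent_sum(field, 4) == decomposition_dependent_sum(field, 16))  # often False
print(reproducible_sum(field, 4) == reproducible_sum(field, 16))                        # True
```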

  4. Parameterization Improvements and Functional and Structural Advances in Version 4 of the Community Land Model

    Directory of Open Access Journals (Sweden)

    Andrew G. Slater

    2011-05-01

    Full Text Available The Community Land Model is the land component of the Community Climate System Model. Here, we describe a broad set of model improvements and additions that have been provided through the CLM development community to create CLM4. The model is extended with a carbon-nitrogen (CN) biogeochemical model that is prognostic with respect to vegetation, litter, and soil carbon and nitrogen states and vegetation phenology. An urban canyon model is added and a transient land cover and land use change (LCLUC) capability, including wood harvest, is introduced, enabling study of historic and future LCLUC on energy, water, momentum, carbon, and nitrogen fluxes. The hydrology scheme is modified with a revised numerical solution of the Richards equation and a revised ground evaporation parameterization that accounts for litter and within-canopy stability. The new snow model incorporates the SNow and Ice Aerosol Radiation model (SNICAR), which includes aerosol deposition, grain-size dependent snow aging, and vertically-resolved snowpack heating, as well as new snow cover and snow burial fraction parameterizations. The thermal and hydrologic properties of organic soil are accounted for and the ground column is extended to ~50-m depth. Several other minor modifications to the land surface types dataset, grass and crop optical properties, atmospheric forcing height, roughness length and displacement height, and the disposition of snow-capped runoff are also incorporated. Taken together, these augmentations to CLM result in improved soil moisture dynamics, drier soils, and stronger soil moisture variability. The new model also exhibits higher snow cover, cooler soil temperatures in organic-rich soils, greater global river discharge, and lower albedos over forests and grasslands, all of which are improvements compared to CLM3.5. When CLM4 is run with CN, the mean biogeophysical simulation is slightly degraded because the vegetation structure is prognostic rather

  5. On a discrete version of the CP 1 sigma model and surfaces immersed in R3

    International Nuclear Information System (INIS)

    Grundland, A M; Levi, D; Martina, L

    2003-01-01

    We present a discretization of the CP 1 sigma model. We show that the discrete CP 1 sigma model is described by a nonlinear partial second-order difference equation with rational nonlinearity. To derive discrete surfaces immersed in three-dimensional Euclidean space a 'complex' lattice is introduced. The so-obtained surfaces are characterized in terms of the quadrilateral cross-ratio of four surface points. In this way we prove that all surfaces associated with the discrete CP 1 sigma model are of constant mean curvature. An explicit example of such discrete surfaces is constructed
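
    For reference, the quadrilateral cross-ratio used in such characterizations has the standard complex form below (generic definition; the authors' point ordering and convention may differ):

```latex
% Standard complex cross-ratio of four points z_1, ..., z_4; the discrete
% surfaces above are characterized through a condition on this quantity
% evaluated at the four vertices of each quadrilateral.
q(z_1, z_2, z_3, z_4) = \frac{(z_1 - z_2)(z_3 - z_4)}{(z_2 - z_3)(z_4 - z_1)}
```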

  6. Technical manual for basic version of the Markov chain nest productivity model (MCnest)

    Science.gov (United States)

    The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...

  7. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    Science.gov (United States)

    Swales, Dustin J.; Pincus, Robert; Bodas-Salcedo, Alejandro

    2018-01-01

    The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP) gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

  8. The Cloud Feedback Model Intercomparison Project Observational Simulator Package: Version 2

    Directory of Open Access Journals (Sweden)

    D. J. Swales

    2018-01-01

    Full Text Available The Cloud Feedback Model Intercomparison Project Observational Simulator Package (COSP gathers together a collection of observation proxies or satellite simulators that translate model-simulated cloud properties to synthetic observations as would be obtained by a range of satellite observing systems. This paper introduces COSP2, an evolution focusing on more explicit and consistent separation between host model, coupling infrastructure, and individual observing proxies. Revisions also enhance flexibility by allowing for model-specific representation of sub-grid-scale cloudiness, provide greater clarity by clearly separating tasks, support greater use of shared code and data including shared inputs across simulators, and follow more uniform software standards to simplify implementation across a wide range of platforms. The complete package including a testing suite is freely available.

  9. FMCSA safety program effectiveness measurement : carrier intervention effectiveness model, version 1.0 : [analysis brief].

    Science.gov (United States)

    2015-01-01

    The Carrier Intervention Effectiveness Model (CIEM) : provides the Federal Motor Carrier Safety : Administration (FMCSA) with a tool for measuring : the safety benefits of carrier interventions conducted : under the Compliance, Safety, Accountability...

  10. Modeled Radar Attenuation Rate Profile at the Vostok 5G Ice Core Site, Antarctica, Version 1

    Data.gov (United States)

    National Aeronautics and Space Administration — This data set provides a modeled radar attenuation rate profile, showing the predicted contributions from pure ice and impurities to radar attenuation at the Vostok...

  11. User’s manual for basic version of MCnest Markov chain nest productivity model

    Science.gov (United States)

    The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...

  12. MAPSS: Mapped Atmosphere-Plant-Soil System Model, Version 1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — ABSTRACT: MAPSS (Mapped Atmosphere-Plant-Soil System) is a landscape to global vegetation distribution model that was developed to simulate the potential biosphere...

  13. MAPSS: Mapped Atmosphere-Plant-Soil System Model, Version 1.0

    Data.gov (United States)

    National Aeronautics and Space Administration — MAPSS (Mapped Atmosphere-Plant-Soil System) is a landscape to global vegetation distribution model that was developed to simulate the potential biosphere impacts and...

  14. Illustrating and homology modeling the proteins of the Zika virus [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Sean Ekins

    2016-09-01

    Full Text Available The Zika virus (ZIKV) is a flavivirus of the family Flaviviridae, which is similar to dengue virus, yellow fever and West Nile virus. Recent outbreaks in South America, Latin America, the Caribbean and in particular Brazil have led to concern over the spread of the disease and its potential to cause Guillain-Barré syndrome and microcephaly. Although ZIKV has been known for over 60 years, very little is known about the virus, with few publications and no crystal structures. No antivirals have been tested against it either in vitro or in vivo. ZIKV therefore epitomizes a neglected disease. Several steps have been proposed that could be taken to initiate ZIKV antiviral drug discovery, using both high-throughput screens and structure-based design based on homology models for the key proteins. We now describe preliminary homology models created for NS5, FtsJ, NS4B, NS4A, HELICc, DEXDc, peptidase S7, NS2B, NS2A, NS1, E stem, glycoprotein M, propeptide, capsid and glycoprotein E using SWISS-MODEL. Eleven out of 15 models pass our model quality criteria for their further use. While a ZIKV glycoprotein E homology model was initially described in the immature conformation as a trimer, we now describe the mature dimer conformer which allowed the construction of an illustration of the complete virion. By comparing illustrations of ZIKV based on this new homology model and the dengue virus crystal structure we propose potential differences that could be exploited for antiviral and vaccine design. The prediction of sites for glycosylation on this protein may also be useful in this regard. While we await a cryo-EM structure of ZIKV and eventual crystal structures of the individual proteins, these homology models provide the community with a starting point for structure-based design of drugs and vaccines as well as for computational virtual screening.

  15. Formal Analysis of Functional Behaviour for Model Transformations Based on Triple Graph Grammars - Extended Version

    OpenAIRE

    Hermann, Frank; Ehrig, Hartmut; Orejas, Fernando; Ulrike, Golas

    2010-01-01

    Triple Graph Grammars (TGGs) are a well-established concept for the specification of model transformations. In previous work we have formalized and analyzed already crucial properties of model transformations like termination, correctness and completeness, but functional behaviour - especially local confluence - is missing up to now. In order to close this gap we generate forward translation rules, which extend standard forward rules by translation attributes keeping track of the elements whi...

  16. Code-switched English Pronunciation Modeling for Swahili Spoken Term Detection (Pub Version, Open Access)

    Science.gov (United States)

    2016-05-03

    model (JSM), developed using Sequitur and trained on the CMUDict 0.7b American English dictionary (over 134k words), was used to detect English ... modeled using the closest Swahili vowel or vowel combination. In both cases these English L2P predictions were added to a dictionary as variants to swa... English queries as a function of overlap/correspondence with an existing reference English pronunciation dictionary. As the reference dictionary, we

  17. Impact of numerical choices on water conservation in the E3SM Atmosphere Model version 1 (EAMv1)

    Directory of Open Access Journals (Sweden)

    K. Zhang

    2018-06-01

    Full Text Available The conservation of total water is an important numerical feature for global Earth system models. Even small conservation problems in the water budget can lead to systematic errors in century-long simulations. This study quantifies and reduces various sources of water conservation error in the atmosphere component of the Energy Exascale Earth System Model. Several sources of water conservation error have been identified during the development of the version 1 (V1) model. The largest errors result from the numerical coupling between the resolved dynamics and the parameterized sub-grid physics. A hybrid coupling using different methods for fluid dynamics and tracer transport provides a reduction of water conservation error by a factor of 50 at 1° horizontal resolution as well as consistent improvements at other resolutions. The second largest error source is the use of an overly simplified relationship between the surface moisture flux and latent heat flux at the interface between the host model and the turbulence parameterization. This error can be prevented by applying the same (correct) relationship throughout the entire model. Two additional types of conservation error that result from correcting the surface moisture flux and clipping negative water concentrations can be avoided by using mass-conserving fixers. With all four error sources addressed, the water conservation error in the V1 model becomes negligible and insensitive to the horizontal resolution. The associated changes in the long-term statistics of the main atmospheric features are small. A sensitivity analysis is carried out to show that the magnitudes of the conservation errors in early V1 versions decrease strongly with temporal resolution but increase with horizontal resolution. The increased vertical resolution in V1 results in a very thin model layer at the Earth's surface, which amplifies the conservation error associated with the surface moisture flux correction. We note
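
    The negative-concentration clipping mentioned above can be made mass-conserving with a simple column-wise fixer of the following kind (an illustrative sketch under simplifying assumptions, not the EAMv1 implementation):

```python
import numpy as np

def conservative_clip(q, dp):
    """Clip negative tracer mixing ratios in a column while conserving the
    column-integrated mass.  q: mixing ratios per layer, dp: layer pressure
    thicknesses used as mass weights.  Illustrative sketch only."""
    q = np.asarray(q, dtype=float).copy()
    dp = np.asarray(dp, dtype=float)
    mass_before = np.sum(q * dp)
    q[q < 0.0] = 0.0                      # clip negatives
    mass_after = np.sum(q * dp)
    excess = mass_after - mass_before     # mass spuriously created by clipping
    if mass_after > 0.0:                  # degenerate all-zero column: nothing to rescale
        q *= (1.0 - excess / mass_after)  # rescale positives to restore the total
    return q

dp = np.array([100.0, 80.0, 60.0, 40.0])
q_fixed = conservative_clip([3e-3, -1e-4, 2e-3, 5e-4], dp)
print(q_fixed, np.sum(q_fixed * dp))      # column mass equals the pre-clip total
```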

  18. NGNP Risk Management Database: A Model for Managing Risk

    International Nuclear Information System (INIS)

    Collins, John

    2009-01-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool's design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  19. NGNP Risk Management Database: A Model for Managing Risk

    Energy Technology Data Exchange (ETDEWEB)

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.
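
    A hypothetical sketch of a minimal risk-register schema in the spirit of the capabilities listed above (table and column names are illustrative assumptions, not the actual NGNP RMS design):

```python
import sqlite3

# Hypothetical minimal risk-register schema: baseline risks, mitigation
# actions, status tracking, and organization by system/subsystem/component.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE ssc (
    ssc_id      INTEGER PRIMARY KEY,
    system      TEXT NOT NULL,
    subsystem   TEXT,
    component   TEXT,
    area        TEXT
);
CREATE TABLE risk (
    risk_id              INTEGER PRIMARY KEY,
    ssc_id               INTEGER REFERENCES ssc(ssc_id),
    title                TEXT NOT NULL,
    baseline_likelihood  INTEGER,   -- e.g. 1-5 scale (illustrative)
    baseline_consequence INTEGER,   -- e.g. 1-5 scale (illustrative)
    current_status       TEXT
);
CREATE TABLE mitigation_action (
    action_id   INTEGER PRIMARY KEY,
    risk_id     INTEGER REFERENCES risk(risk_id),
    description TEXT,
    due_date    TEXT,
    completed   INTEGER DEFAULT 0
);
""")

# Example tracking query: count of open risks per system
rows = con.execute("""
    SELECT s.system, COUNT(*) FROM risk r JOIN ssc s USING (ssc_id)
    WHERE r.current_status != 'closed' GROUP BY s.system
""").fetchall()
print(rows)
```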

  20. Hypnotic drug risks of mortality, infection, depression, and cancer: but lack of benefit [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Daniel F. Kripke

    2016-05-01

    Full Text Available This is a review of hypnotic drug risks and benefits, reassessing and updating advice presented to the Commissioner of the Food and Drug Administration (United States FDA). Almost every month, new information appears about the risks of hypnotics (sleeping pills). This review includes new information on the growing USA overdose epidemic, eight new epidemiologic studies of hypnotics’ mortality not available for previous compilations, and new emphasis on risks of short-term hypnotic prescription. The most important risks of hypnotics include excess mortality, especially overdose deaths, quiet deaths at night, infections, cancer, depression and suicide, automobile crashes, falls, and other accidents, and hypnotic-withdrawal insomnia. The short-term use of one to two prescriptions is associated with greater risk per dose than long-term use. Hypnotics are usually prescribed without approved indication, most often with specific contraindications, but even when indicated, there is little or no benefit. The recommended doses objectively increase sleep little if at all, daytime performance is often made worse, not better, and the lack of general health benefits is commonly misrepresented in advertising. Treatments such as the cognitive behavioral treatment of insomnia and bright light treatment of circadian rhythm disorders might offer safer and more effective alternative approaches to insomnia.

  1. Geological discrete fracture network model for the Olkiluoto site, Eurajoki, Finland. Version 2.0

    International Nuclear Information System (INIS)

    Fox, A.; Forchhammer, K.; Pettersson, A.; La Pointe, P.; Lim, D-H.

    2012-06-01

    This report describes the methods, analyses, and conclusions of the modeling team in the production of the 2010 revision to the geological discrete fracture network (DFN) model for the Olkiluoto Site in Finland. The geological DFN is a statistical model for stochastically simulating rock fractures and minor faults at a scale ranging from approximately 0.05 m to approximately 565m; deformation zones are expressly excluded from the DFN model. The DFN model is presented as a series of tables summarizing probability distributions for several parameters necessary for fracture modeling: fracture orientation, fracture size, fracture intensity, and associated spatial constraints. The geological DFN is built from data collected during site characterization (SC) activities at Olkiluoto, which is selected to function as a final deep geological repository for spent fuel and nuclear waste from the Finnish nuclear power program. Data used in the DFN analyses include fracture maps from surface outcrops and trenches, geological and structural data from cored drillholes, and fracture information collected during the construction of the main tunnels and shafts at the ONKALO laboratory. Unlike the initial geological DFN, which was focused on the vicinity of the ONKALO tunnel, the 2010 revisions present a model parameterization for the entire island. Fracture domains are based on the tectonic subdivisions at the site (northern, central, and southern tectonic units) presented in the Geological Site Model (GSM), and are further subdivided along the intersection of major brittle-ductile zones. The rock volume at Olkiluoto is dominated by three distinct fracture sets: subhorizontally-dipping fractures striking north-northeast and dipping to the east that is subparallel to the mean bedrock foliation direction, a subvertically-dipping fracture set striking roughly north-south, and a subvertically-dipping fracture set striking approximately east-west. The subhorizontally-dipping fractures

  2. Geological discrete fracture network model for the Olkiluoto site, Eurajoki, Finland. Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Fox, A.; Forchhammer, K.; Pettersson, A. [Golder Associates AB, Stockholm (Sweden); La Pointe, P.; Lim, D-H. [Golder Associates Inc. (Finland)

    2012-06-15

    This report describes the methods, analyses, and conclusions of the modeling team in the production of the 2010 revision to the geological discrete fracture network (DFN) model for the Olkiluoto Site in Finland. The geological DFN is a statistical model for stochastically simulating rock fractures and minor faults at a scale ranging from approximately 0.05 m to approximately 565m; deformation zones are expressly excluded from the DFN model. The DFN model is presented as a series of tables summarizing probability distributions for several parameters necessary for fracture modeling: fracture orientation, fracture size, fracture intensity, and associated spatial constraints. The geological DFN is built from data collected during site characterization (SC) activities at Olkiluoto, which is selected to function as a final deep geological repository for spent fuel and nuclear waste from the Finnish nuclear power program. Data used in the DFN analyses include fracture maps from surface outcrops and trenches, geological and structural data from cored drillholes, and fracture information collected during the construction of the main tunnels and shafts at the ONKALO laboratory. Unlike the initial geological DFN, which was focused on the vicinity of the ONKALO tunnel, the 2010 revisions present a model parameterization for the entire island. Fracture domains are based on the tectonic subdivisions at the site (northern, central, and southern tectonic units) presented in the Geological Site Model (GSM), and are further subdivided along the intersection of major brittle-ductile zones. The rock volume at Olkiluoto is dominated by three distinct fracture sets: subhorizontally-dipping fractures striking north-northeast and dipping to the east that is subparallel to the mean bedrock foliation direction, a subvertically-dipping fracture set striking roughly north-south, and a subvertically-dipping fracture set striking approximately east-west. The subhorizontally-dipping fractures

  3. Water, Energy, and Biogeochemical Model (WEBMOD), user’s manual, version 1

    Science.gov (United States)

    Webb, Richard M.T.; Parkhurst, David L.

    2017-02-08

    The Water, Energy, and Biogeochemical Model (WEBMOD) uses the framework of the U.S. Geological Survey (USGS) Modular Modeling System to simulate fluxes of water and solutes through watersheds. WEBMOD divides watersheds into model response units (MRU) where fluxes and reactions are simulated for the following eight hillslope reservoir types: canopy; snowpack; ponding on impervious surfaces; O-horizon; two reservoirs in the unsaturated zone, which represent preferential flow and matrix flow; and two reservoirs in the saturated zone, which also represent preferential flow and matrix flow. The reservoir representing ponding on impervious surfaces, currently not functional (2016), will be implemented once the model is applied to urban areas. MRUs discharge to one or more stream reservoirs that flow to the outlet of the watershed. Hydrologic fluxes in the watershed are simulated by modules derived from the USGS Precipitation Runoff Modeling System; the National Weather Service Hydro-17 snow model; and a topography-driven hydrologic model (TOPMODEL). Modifications to the standard TOPMODEL include the addition of heterogeneous vertical infiltration rates; irrigation; lateral and vertical preferential flows through the unsaturated zone; pipe flow draining the saturated zone; gains and losses to regional aquifer systems; and the option to simulate baseflow discharge by using an exponential, parabolic, or linear decrease in transmissivity. PHREEQC, an aqueous geochemical model, is incorporated to simulate chemical reactions as waters evaporate, mix, and react within the various reservoirs of the model. The reactions that can be specified for a reservoir include equilibrium reactions among water; minerals; surfaces; exchangers; and kinetic reactions such as kinetic mineral dissolution or precipitation, biologically mediated reactions, and radioactive decay. WEBMOD also simulates variations in the concentrations of the stable isotopes deuterium and oxygen-18 as a result of
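
    For orientation, the exponential-transmissivity option mentioned above corresponds, in the classical TOPMODEL formulation, to a baseflow expression of the following generic form (WEBMOD's exact implementation may differ):

```latex
% Exponential decrease of transmissivity with saturation deficit S, and the
% resulting catchment baseflow (classical TOPMODEL form, shown for orientation;
% the WEBMOD implementation may differ):
T(S) = T_0 \, e^{-S/m}, \qquad Q_b = Q_0 \, e^{-\bar{S}/m}
```

    Here T_0 is the transmissivity at zero deficit, m a scaling parameter, S-bar the catchment-average saturation deficit, and Q_0 the baseflow discharge when the average deficit is zero.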

  4. Biosphere-Atmosphere Transfer Scheme (BATS) version le as coupled to the NCAR community climate model. Technical note. [NCAR (National Center for Atmospheric Research)

    Energy Technology Data Exchange (ETDEWEB)

    Dickinson, R.E.; Henderson-Sellers, A.; Kennedy, P.J.

    1993-08-01

    A comprehensive model of land-surface processes, suitable for use with various National Center for Atmospheric Research (NCAR) General Circulation Models (GCMs), has been under development. Special emphasis has been given to properly describing the role of vegetation in modifying the surface moisture and energy budgets. The result of these efforts has been incorporated into a boundary package, referred to as the Biosphere-Atmosphere Transfer Scheme (BATS). The current frozen version, BATS1e, is a piece of software of about four thousand lines of code that runs either offline or coupled to the Community Climate Model (CCM).

  5. Geological discrete-fracture network model (version 1) for the Olkiluoto site, Finland

    International Nuclear Information System (INIS)

    Fox, A.; Buoro, A.; Dahlbo, K.; Wiren, L.

    2009-10-01

    This report describes the methods, analyses, and conclusions of the modelling team in the production of a discrete-fracture network (DFN) model for the Olkiluoto Site in Finland. The geological DFN is a statistical model for stochastically simulating rock fractures and minor faults at a scale ranging from approximately 0.05 m to approximately 500 m; an upper scale limit is not expressly defined, but the DFN model explicitly excludes structures at deformation-zone scales (∼ 500 m) and larger. The DFN model is presented as a series of tables summarizing probability distributions for several parameters necessary for fracture modelling: fracture orientation, fracture size, fracture intensity, and associated spatial constraints. The geological DFN is built from data collected during site characterization (SC) activities at Olkiluoto, which is currently planned to function as a final deep geological repository for spent fuel and nuclear waste from the Finnish nuclear power program. Data used in the DFN analyses include fracture maps from surface outcrops and trenches (as of July 2007), geological and structural data from cored boreholes (as of July 2007), and fracture information collected during the construction of the main tunnels and shafts at the ONKALO laboratory (January 2008). The modelling results suggest that the rock volume at Olkiluoto surrounding the ONKALO tunnel can be separated into three distinct volumes (fracture domains): an upper block, an intermediate block, and a lower block. The three fracture domains are bounded horizontally and vertically by large deformation zones. Fracture properties, such as fracture orientation and relative orientation set intensity, vary between fracture domains. The rock volume at Olkiluoto is dominated by three distinct fracture sets: subhorizontally-dipping fractures striking north-northeast and dipping to the east, a subvertically-dipping fracture set striking roughly north-south, and a subvertically-dipping fracture set

  6. Hydrogeological DFN modelling using structural and hydraulic data from KLX04. Preliminary site description Laxemar subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Follin, Sven [SF GeoLogic AB, Taeby (Sweden); Stigsson, Martin [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2006-04-15

    SKB is conducting site investigations for a high-level nuclear waste repository in fractured crystalline rocks at two coastal areas in Sweden. The two candidate areas are named Forsmark and Simpevarp. The site characterisation work is divided into two phases, an initial site investigation phase (ISI) and a complete site investigation phase (CSI). The results of the ISI phase are used as a basis for deciding on the subsequent CSI phase. On the basis of the CSI investigations a decision is made as to whether detailed characterisation will be performed (including sinking of a shaft). An integrated component in the site characterisation work is the development of site descriptive models. These comprise basic models in three dimensions with an accompanying text description. Central in the modelling work is the geological model which provides the geometrical context in terms of a model of deformation zones and the less fractured rock mass between the zones. Using the geological and geometrical description models as a basis, descriptive models for other disciplines (surface ecosystems, hydrogeology, hydrogeochemistry, rock mechanics, thermal properties and transport properties) will be developed. Great care is taken to arrive at a general consistency in the description of the various models and assessment of uncertainty and possible needs of alternative models. The main objective of this study is to support the development of a hydrogeological DFN model (Discrete Fracture Network) for the Preliminary Site Description of the Laxemar area on a regional-scale (SDM version L1.2). A more specific objective of this study is to assess the propagation of uncertainties in the geological DFN modelling reported for L1.2 into the groundwater flow modelling. An improved understanding is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. The latter will serve as a basis for describing the present

  7. Hydrogeological DFN modelling using structural and hydraulic data from KLX04. Preliminary site description Laxemar subarea - version 1.2

    International Nuclear Information System (INIS)

    Follin, Sven; Stigsson, Martin; Svensson, Urban

    2006-04-01

    SKB is conducting site investigations for a high-level nuclear waste repository in fractured crystalline rocks at two coastal areas in Sweden. The two candidate areas are named Forsmark and Simpevarp. The site characterisation work is divided into two phases, an initial site investigation phase (ISI) and a complete site investigation phase (CSI). The results of the ISI phase are used as a basis for deciding on the subsequent CSI phase. On the basis of the CSI investigations a decision is made as to whether detailed characterisation will be performed (including sinking of a shaft). An integrated component in the site characterisation work is the development of site descriptive models. These comprise basic models in three dimensions with an accompanying text description. Central in the modelling work is the geological model which provides the geometrical context in terms of a model of deformation zones and the less fractured rock mass between the zones. Using the geological and geometrical description models as a basis, descriptive models for other disciplines (surface ecosystems, hydrogeology, hydrogeochemistry, rock mechanics, thermal properties and transport properties) will be developed. Great care is taken to arrive at a general consistency in the description of the various models and assessment of uncertainty and possible needs of alternative models. The main objective of this study is to support the development of a hydrogeological DFN model (Discrete Fracture Network) for the Preliminary Site Description of the Laxemar area on a regional-scale (SDM version L1.2). A more specific objective of this study is to assess the propagation of uncertainties in the geological DFN modelling reported for L1.2 into the groundwater flow modelling. An improved understanding is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. The latter will serve as a basis for describing the present

  8. The SF-8 Spanish Version for Health-Related Quality of Life Assessment: Psychometric Study with IRT and CFA Models.

    Science.gov (United States)

    Tomás, José M; Galiana, Laura; Fernández, Irene

    2018-03-22

    The aim of the current research is to analyze the psychometric properties of the Spanish version of the SF-8, overcoming previous shortcomings. A double line of analysis was used: competitive structural equation models to establish factorial validity, and Item Response Theory to analyze item psychometric characteristics and information. A total of 593 people aged 60 years or older, attending lifelong learning programs at the university, were surveyed. Their age ranged from 60 to 92 years old; 67.6% were women. The survey included scales on personality dimensions, attitudes, perceptions, and behaviors related to aging. Competitive confirmatory models pointed to two factors (physical and mental health) as the best representation of the data: χ2(13) = 72.37 (p < .01); CFI = .99; TLI = .98; RMSEA = .08 (.06, .10). Item 5 was removed because of unreliability and cross-loading. Graded response models showed an appropriate fit of the two-parameter logistic model for both the physical and the mental dimensions. Item Information Curves and Test Information Functions pointed out that the SF-8 was more informative for low levels of health. The Spanish SF-8 has adequate psychometric properties, being better represented by two dimensions once Item 5 is removed. Gathering evidence on patient-reported outcome measures is of crucial importance, as this type of measurement instrument is increasingly used in the clinical arena.
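
    For reference, the graded response model referred to above has the standard Samejima form (generic notation; the study's exact parameterization is not reproduced here):

```latex
% Samejima's graded response model for item i with discrimination a_i and
% ordered category thresholds b_{ik} (standard form, shown for orientation):
P^{*}_{ik}(\theta) = P(X_i \ge k \mid \theta)
                   = \frac{1}{1 + e^{-a_i(\theta - b_{ik})}},
\qquad
P(X_i = k \mid \theta) = P^{*}_{ik}(\theta) - P^{*}_{i,k+1}(\theta)
```

    The two-parameter logistic model mentioned in the abstract is the dichotomous special case of the same boundary curve.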

  9. SITE-94. The CRYSTAL Geosphere Transport Model: Technical documentation version 2.1

    International Nuclear Information System (INIS)

    Worgan, K.; Robinson, P.

    1995-12-01

    CRYSTAL, a one-dimensional contaminant transport model of a densely fissured geosphere, was originally developed for the SKI Project-90 performance assessment program. It has since been extended to include matrix blocks of alternative basic geometries. CRYSTAL predicts the transport of arbitrary-length decay chains by advection, diffusion and surface sorption in the fissures and diffusion into the rock matrix blocks. The model equations are solved in Laplace transform space, and inverted numerically to the time domain. This approach avoids time-stepping and consequently is numerically very efficient. The source term for CRYSTAL may be supplied internally using either simple leaching or band release submodels, or by input of a general time-series output from a near-field model. The time series input is interfaced with the geosphere model using the method of convolution. The response of the geosphere to delta-function inputs from each nuclide is combined with the time series outputs from the near-field, to obtain the nuclide flux emerging from the far-field. 14 refs
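
    The convolution coupling described above can be illustrated with a short numerical sketch (hypothetical release and response curves, not CRYSTAL output):

```python
import numpy as np

# Illustrative sketch of the convolution coupling: the far-field flux is the
# near-field release time series convolved with the geosphere's response to a
# delta-function input.  All curves and parameters below are hypothetical.
dt = 1.0                                   # time step [a]
t = np.arange(0.0, 10_000.0, dt)

near_field_flux = np.exp(-t / 2_000.0)     # hypothetical near-field release rate
# Hypothetical unit impulse response of the geosphere (a delayed, dispersed
# breakthrough curve); in CRYSTAL this comes from the transport solution.
impulse_response = (t / 500.0**2) * np.exp(-t / 500.0)
impulse_response /= impulse_response.sum() * dt   # normalise to unit mass

# Discrete convolution, truncated to the original time window and scaled by dt
far_field_flux = np.convolve(near_field_flux, impulse_response)[: t.size] * dt
print(far_field_flux[::1000])
```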

  10. User's Manual MCnest - Markov Chain Nest Productivity Model Version 2.0

    Science.gov (United States)

    The Markov chain nest productivity model, or MCnest, is a set of algorithms for integrating the results of avian toxicity tests with reproductive life-history data to project the relative magnitude of chemical effects on avian reproduction. The mathematical foundation of MCnest i...

  11. A Functional Model of Sensemaking in a Neurocognitive Architecture (Open Access, Publisher’s Version)

    Science.gov (United States)

    2013-07-08

    updating processes involved in sensemaking. We do this by developing ACT-R models to specify how elementary cognitive modules and processes are marshaled ... [13] M. I. Posner, R. Goldsmith, and K. E. Welton Jr., “Perceived distance and the classification of distorted patterns,” Journal of Experimental

  12. Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3.0)

    Science.gov (United States)

    To help stakeholders estimate the costs of a landfill gas (LFG) energy project, in 2002, LMOP developed a cost tool (LFGcost). Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA to assist landfil...

  13. LANDFILL GAS EMISSIONS MODEL (LANDGEM) VERSION 3.02 USER'S GUIDE

    Science.gov (United States)

    The Landfill Gas Emissions Model (LandGEM) is an automated estimation tool with a Microsoft Excel interface that can be used to estimate emission rates for total landfill gas, methane, carbon dioxide, nonmethane organic compounds, and individual air pollutants from municipal soli...
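
    For orientation, LandGEM is based on a first-order decay model; a commonly cited form of its methane generation equation is shown below (consult the user's guide for the authoritative form, definitions, and units):

```latex
% First-order decay equation commonly cited for LandGEM v3.02 (shown for
% orientation; see the user's guide for the exact form and units):
Q_{\mathrm{CH_4}} = \sum_{i=1}^{n} \sum_{j=0.1}^{1}
    k \, L_0 \left(\frac{M_i}{10}\right) e^{-k t_{ij}}
```

    Here Q_CH4 is the annual methane generation, M_i the mass of waste accepted in year i, t_ij the age of the j-th tenth of that waste, k the methane generation rate constant, and L_0 the potential methane generation capacity.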

  14. Unit testing, model validation, and biological simulation [version 1; referees: 2 approved, 1 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Gopal P. Sarma

    2016-08-01

    Full Text Available The growth of the software industry has gone hand in hand with the development of tools and cultural practices for ensuring the reliability of complex pieces of software. These tools and practices are now acknowledged to be essential to the management of modern software. As computational models and methods have become increasingly common in the biological sciences, it is important to examine how these practices can accelerate biological software development and improve research quality. In this article, we give a focused case study of our experience with the practices of unit testing and test-driven development in OpenWorm, an open-science project aimed at modeling Caenorhabditis elegans. We identify and discuss the challenges of incorporating test-driven development into a heterogeneous, data-driven project, as well as the role of model validation tests, a category of tests unique to software which expresses scientific models.
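
    A minimal, hypothetical example of the distinction drawn above between an ordinary unit test and a model validation test (not OpenWorm code; the model function is a toy):

```python
import math

# The model function and thresholds below are hypothetical illustrations only.
def membrane_potential(t_ms, v_rest=-70.0, v_peak=30.0, tau=5.0):
    """Toy model of a decaying excursion from resting potential (mV)."""
    return v_rest + (v_peak - v_rest) * math.exp(-t_ms / tau)

def test_peak_at_time_zero():
    # Unit-style assertion on a specific, exactly known value of the code path.
    assert membrane_potential(0.0) == 30.0

def test_returns_to_rest():
    # Validation-style assertion: checks a scientific property of the model's
    # output (return to rest long after the stimulus) rather than a code path.
    assert abs(membrane_potential(1000.0) - (-70.0)) < 0.1

if __name__ == "__main__":
    test_peak_at_time_zero()
    test_returns_to_rest()
    print("all tests passed")
```

    Under pytest both functions would be collected automatically; the point is that the second encodes a domain expectation and would fail if the model, not just the code, were wrong.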

  15. The GRASP 3: Graphical Reliability Analysis Simulation Program. Version 3: A users' manual and modelling guide

    Science.gov (United States)

    Phillips, D. T.; Manseur, B.; Foster, J. W.

    1982-01-01

    Alternative definitions of system failure lead to complex analyses for which analytic solutions are available only for simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.
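
    A minimal Monte Carlo sketch in the spirit of such failure/repair simulation (illustrative only; GRASP's actual modelling constructs are not reproduced here):

```python
import random

# Two-component parallel system: the system fails only when both components
# are down.  Failure and repair times are drawn from exponential distributions
# with illustrative rates.
random.seed(7)

def simulate_availability(mtbf=100.0, mttr=8.0, horizon=10_000.0):
    """Return the estimated fraction of time at least one component is up."""
    def component_up_intervals():
        t, intervals = 0.0, []
        while t < horizon:
            up = random.expovariate(1.0 / mtbf)       # time to failure
            intervals.append((t, min(t + up, horizon)))
            t += up + random.expovariate(1.0 / mttr)  # add repair time
        return intervals

    a, b = component_up_intervals(), component_up_intervals()
    # Sample the timeline on a unit grid and count instants with both down
    down = 0
    for i in range(int(horizon)):
        ti = float(i)
        up_a = any(s <= ti < e for s, e in a)
        up_b = any(s <= ti < e for s, e in b)
        down += not (up_a or up_b)
    return 1.0 - down / int(horizon)

print(f"estimated availability: {simulate_availability():.4f}")
```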

  16. Integrated source-risk model for radon: A definition study

    International Nuclear Information System (INIS)

    Laheij, G.M.H.; Aldenkamp, F.J.; Stoop, P.

    1993-10-01

    The purpose of a source-risk model is to support policy making on radon mitigation by comparing effects of various policy options and to enable optimization of countermeasures applied to different parts of the source-risk chain. There are several advantages to developing and using a source-risk model: risk calculations are standardized; the effects of measures applied to different parts of the source-risk chain can be better compared because interactions are included; and sensitivity analyses can be used to determine the most important parameters within the total source-risk chain. After an inventory of processes and sources to be included in the source-risk chain, the models presently available in the Netherlands are investigated. The models were screened for completeness, validation and operational status. The investigation made clear that a source-risk chain model for radon can be realized by choosing the most suitable model for each part of the chain. However, the calculation of dose from the radon concentrations and the status of validation of most models should be improved. At present, calculations with the proposed source-risk model will give estimates with a large uncertainty. For further development of the source-risk model, interaction between source-risk modelling and experimental research is recommended. Organisational forms of the source-risk model are discussed. A source-risk model in which only simple models are included is also recommended. The other models are operated and administered by the model owners. The model owners run their models for combinations of input parameters. The output of the models is stored in a database which will be used for calculations with the source-risk model. 5 figs., 15 tabs., 7 appendices, 14 refs

  17. Preliminary site description: Groundwater flow simulations. Simpevarp area (version 1.1) modelled with CONNECTFLOW

    International Nuclear Information System (INIS)

    Hartley, Lee; Worth, David; Gylling, Bjoern; Marsic, Niko; Holmen, Johan

    2004-08-01

    The main objective of this study is to assess the role of known and unknown hydrogeological conditions for the present-day distribution of saline groundwater at the Simpevarp and Laxemar sites. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Descriptive Model in general and the Site Hydrogeological Description in particular. This is to serve as a basis for describing the present hydrogeological conditions as well as predictions of future hydrogeological conditions. This objective implies a testing of: geometrical alternatives in the structural geology and bedrock fracturing, variants in the initial and boundary conditions, and parameter uncertainties (i.e. uncertainties in the hydraulic property assignment). This testing is necessary in order to evaluate the impact on the groundwater flow field of the specified components and to promote proposals of further investigations of the hydrogeological conditions at the site. The general methodology for modelling transient salt transport and groundwater flow using CONNECTFLOW that was developed for Forsmark has been applied successfully also for Simpevarp. Because of time constraints, only a key set of variants was performed, focussed on the influences of DFN model parameters, the kinematic porosity, and the initial condition. Salinity data in deep boreholes available at the time of the project were too limited to allow a good calibration exercise. However, the model predictions are compared with the available data from KLX01 and KLX02 below. Once more salinity data are available, it may be possible to draw more definite conclusions based on the differences between variants. At the moment, though, the differences should just be used to understand the sensitivity of the models to various input parameters

  18. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
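
    As a generic illustration of tree-structured threat data of the kind described (the actual AutSEC identification and mitigation trees are not reproduced here; all names below are hypothetical):

```python
from dataclasses import dataclass, field
from typing import List

# Generic sketch of a threat tree with attached mitigations, in the spirit of
# (but not reproducing) the identification/mitigation trees described above.
@dataclass
class ThreatNode:
    description: str
    children: List["ThreatNode"] = field(default_factory=list)
    mitigations: List[str] = field(default_factory=list)

    def walk(self, depth: int = 0):
        """Yield (depth, node) pairs for reporting identified threats."""
        yield depth, self
        for child in self.children:
            yield from child.walk(depth + 1)

root = ThreatNode("Data flow crosses trust boundary")
root.children.append(
    ThreatNode("Tampering with messages in transit",
               mitigations=["mutual TLS", "message signing"]))
root.children.append(
    ThreatNode("Information disclosure of credentials",
               mitigations=["encrypt at rest", "least-privilege access"]))

for depth, node in root.walk():
    note = f" (mitigations: {', '.join(node.mitigations)})" if node.mitigations else ""
    print("  " * depth + "- " + node.description + note)
```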

  19. Hypnotic drug risks of mortality, infection, depression, and cancer: but lack of benefit [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Daniel F. Kripke

    2017-03-01

    Full Text Available This is a review of hypnotic drug risks and benefits, reassessing and updating advice presented to the Commissioner of the Food and Drug Administration (United States FDA). Almost every month, new information appears about the risks of hypnotics (sleeping pills). The most important risks of hypnotics include excess mortality, especially overdose deaths, quiet deaths at night, infections, cancer, depression and suicide, automobile crashes, falls, and other accidents, and hypnotic-withdrawal insomnia. Short-term use of one to two prescriptions is associated with greater risk per dose than long-term use. Hypnotics have usually been prescribed without approved indication, most often with specific contraindications, but even when indicated, there is little or no benefit. The recommended doses objectively increase sleep little if at all, daytime performance is often made worse, not better, and the lack of general health benefits is commonly misrepresented in advertising. Treatments such as the cognitive behavioral treatment of insomnia and bright light treatment of circadian rhythm disorders offer safer and more effective alternative approaches to insomnia.

  20. Measuring Psychological Trauma in the Workplace: Psychometric Properties of the Italian Version of the Psychological Injury Risk Indicator—A Cross-Sectional Study

    Directory of Open Access Journals (Sweden)

    Nicola Magnavita

    2015-01-01

    Full Text Available Background. The aim of this study was to cross-culturally adapt the Psychological Injury Risk Indicator (PIRI) and to validate its psychometric properties. Methods. Workers from 24 small companies were invited to self-complete the PIRI before undergoing their routine medical examination at the workplace. All participants (841 out of 845, 99.6%) were also asked to report occupational injuries and episodes of violence that had occurred at the workplace in the previous 12 months and were given the General Health Questionnaire (GHQ12) to complete. Results. Exploratory factor analysis revealed a 4-factor structure, “sleep problems,” “recovery failure,” “posttraumatic stress symptoms,” and “chronic fatigue,” which were the same subscales observed in the original version. The internal consistency was excellent (alpha = 0.932). ROC curve analysis revealed that the PIRI was much more efficient than the GHQ12 in diagnosing workers who had suffered trauma (workplace violence or injury in the previous year), as it revealed an area under the curve (AUC) of 0.679 (95% CI: 0.625–0.734) for the PIRI, while for the GHQ12 the AUC was 0.551 (not significant). Conclusions. This study, performed on a large population of workers, provides evidence of the validity of the Italian version of the PIRI.
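
    A sketch of the kind of ROC comparison reported above, using synthetic scores (the real PIRI and GHQ12 data are not reproduced; scikit-learn's roc_auc_score is assumed to be available):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Synthetic, illustrative data only: the real study compared PIRI and GHQ12
# scores against trauma (violence or injury) reported in the previous 12 months.
rng = np.random.default_rng(3)
n = 841
trauma = rng.binomial(1, 0.15, size=n)        # 1 = injury/violence reported
piri_score = rng.normal(10 + 4 * trauma, 5)   # hypothetical questionnaire scores
ghq12_score = rng.normal(11 + 1 * trauma, 5)

print("PIRI AUC :", round(roc_auc_score(trauma, piri_score), 3))
print("GHQ12 AUC:", round(roc_auc_score(trauma, ghq12_score), 3))
```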

  1. Business models for renewable energy in the built environment. Updated version

    Energy Technology Data Exchange (ETDEWEB)

    Wuertenberger, L.; Menkveld, M.; Vethman, P.; Van Tilburg, X. [ECN Policy Studies, Amsterdam (Netherlands); Bleyl, J.W. [Energetic Solutions, Graz (Austria)

    2012-04-15

    The project RE-BIZZ aims to provide insight to policy makers and market actors in the way new and innovative business models (and/or policy measures) can stimulate the deployment of renewable energy technologies (RET) and energy efficiency (EE) measures in the built environment. The project is initiated and funded by the IEA Implementing Agreement for Renewable Energy Technology Deployment (IEA-RETD). It analysed ten business models in three categories (amongst others different types of Energy Service Companies (ESCOs), Developing properties certified with a 'green' building label, Building owners profiting from rent increases after EE measures, Property Assessed Clean Energy (PACE) financing, On-bill financing, and Leasing of RET equipment) including their organisational and financial structure, the existing market and policy context, and an analysis of Strengths, Weaknesses, Opportunities and Threats (SWOT). The study concludes with recommendations for policy makers and other market actors.

  2. Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) (Version 2) (External Review Draft)

    Science.gov (United States)

    EPA announced the availability of the draft report, Updates to the Demographic and Spatial Allocation Models to Produce Integrated Climate and Land Use Scenarios (ICLUS) for a 30-day public comment period. The ICLUS version 2 (v2) modeling tool furthered land change mod...

  3. The Everglades Depth Estimation Network (EDEN) surface-water model, version 2

    Science.gov (United States)

    Telis, Pamela A.; Xie, Zhixiao; Liu, Zhongwei; Li, Yingru; Conrads, Paul

    2015-01-01

    The Everglades Depth Estimation Network (EDEN) is an integrated network of water-level gages, interpolation models that generate daily water-level and water-depth data, and applications that compute derived hydrologic data across the freshwater part of the greater Everglades landscape. The U.S. Geological Survey Greater Everglades Priority Ecosystems Science provides support for EDEN so that it can supply quality-assured monitoring data for the U.S. Army Corps of Engineers Comprehensive Everglades Restoration Plan.

  4. Ion temperature in the outer ionosphere - first version of a global empirical model

    Czech Academy of Sciences Publication Activity Database

    Třísková, Ludmila; Truhlík, Vladimír; Šmilauer, Jan; Smirnova, N. F.

    2004-01-01

    Vol. 34, No. 9 (2004), pp. 1998-2003. ISSN 0273-1177. R&D Projects: GA ČR GP205/02/P037; GA AV ČR IAA3042201; GA MŠk ME 651. Institutional research plan: CEZ:AV0Z3042911. Keywords: plasma temperatures * topside ionosphere * empirical models. Subject RIV: DG - Atmosphere Sciences, Meteorology. Impact factor: 0.548, year: 2004

  5. Air Force Systems Engineering Assessment Model (AF SEAM) Management Guide, Version 2

    Science.gov (United States)

    2010-09-21

    gleaned from experienced professionals who assisted with the model’s development. Examples of the references used include the following: • ISO/IEC ... Defense Acquisition Guidebook, Chapter 4 • AFI 63-1201, Life Cycle Systems Engineering • IEEE/EIA 12207, Software Life Cycle Processes • Air ... Selection criteria. Reference Material: IEEE/EIA 12207, MIL-HDBK-514. Other Considerations: Modeling, simulation and analysis techniques can be

  6. Modelling turbulent vertical mixing sensitivity using a 1-D version of NEMO

    Science.gov (United States)

    Reffray, G.; Bourdalle-Badie, R.; Calone, C.

    2015-01-01

    Through two numerical experiments, a 1-D vertical model called NEMO1D was used to investigate physical and numerical turbulent-mixing behaviour. The results show that all the turbulent closures tested (k+l from Blanke and Delecluse, 1993, and two equation models: generic length scale closures from Umlauf and Burchard, 2003) are able to correctly reproduce the classical test of Kato and Phillips (1969) under favourable numerical conditions while some solutions may diverge depending on the degradation of the spatial and time discretization. The performances of turbulence models were then compared with data measured over a 1-year period (mid-2010 to mid-2011) at the PAPA station, located in the North Pacific Ocean. The modelled temperature and salinity were in good agreement with the observations, with a maximum temperature error between -2 and 2 °C during the stratified period (June to October). However, the results also depend on the numerical conditions. The vertical RMSE varied, for different turbulent closures, from 0.1 to 0.3 °C during the stratified period and from 0.03 to 0.15 °C during the homogeneous period. This 1-D configuration at the PAPA station (called PAPA1D) is now available in NEMO as a reference configuration including the input files and atmospheric forcing set described in this paper. Thus, all the results described can be recovered by downloading and launching PAPA1D. The configuration is described on the NEMO site (http://www.nemo-ocean.eu/Using-NEMO/Configurations/C1D_PAPA). This package is a good starting point for further investigation of vertical processes.

  7. Software Design Description for the Navy Coastal Ocean Model (NCOM) Version 4.0

    Science.gov (United States)

    2008-12-31

    Recipes Software, U.S., p. 659. Rood, R. B., (1987). Numerical advection algorithms and their role in atmospheric transport and chemistry models ... cstr, lenc) Data Declaration: Integer lenc, Character cstr. Coamps_uvg2uv Subroutine COAMPS_UVG2UV ... are removed from the substrings. Calling Sequence: strpars(cline, cdelim, nstr, cstr, nsto, ierr) NRL/MR/7320--08-9149

  8. The Canadian Defence Input-Output Model DIO Version 4.41

    Science.gov (United States)

    2011-09-01

    Request to develop DND-tailored Input/Output Model. Electronic communication from Allen Weldon to Team Leader, Defence Economics Team on March 12, 2011 ... and similar containers 166 1440 Handbags, wallets and similar personal articles such as eyeglass and cigar cases and coin purses 167 1450 Cotton yarn ... 408 3600 Radar and radio navigation equipment 409 3619 Semi-conductors 410 3621 Printed circuits 411 3622 Integrated circuits 412 3623 Other electronic

  9. Regional groundwater flow model for a glaciation scenario. Simpevarp subarea - version 1.2

    International Nuclear Information System (INIS)

    Jaquet, O.; Siegel, P.

    2006-10-01

    A groundwater flow model (glaciation model) was developed at a regional scale in order to study long term transient effects related to a glaciation scenario likely to occur in response to climatic changes. Conceptually the glaciation model was based on the regional model of Simpevarp and was then extended to a mega-regional scale (of several hundred kilometres) in order to account for the effects of the ice sheet. These effects were modelled using transient boundary conditions provided by a dynamic ice sheet model describing the phases of glacial build-up, glacial completeness and glacial retreat needed for the glaciation scenario. The results demonstrate the strong impact of the ice sheet on the flow field, in particular during the phases of the build-up and the retreat of the ice sheet. These phases last for several thousand years and may cause large amounts of melt water to reach the level of the repository and below. The highest fluxes of melt water are located in the vicinity of the ice margin. As the ice sheet approaches the repository location, the advective effects gain dominance over diffusive effects in the flow field. In particular, up-coning effects are likely to occur at the margin of the ice sheet leading to potential increases in salinity at repository level. For the base case, the entire salinity field of the model is almost completely flushed out at the end of the glaciation period. The flow patterns are strongly governed by the location of the conductive features in the subglacial layer. The influence of these glacial features is essential for the salinity distribution as is their impact on the flow trajectories and, therefore, on the resulting performance measures. Travel times and F-factor were calculated using the method of particle tracking. Glacial effects cause major consequences on the results. In particular, average travel times from the repository to the surface are below 10 a during phases of glacial build-up and retreat. In comparison

  10. Regional groundwater flow model for a glaciation scenario. Simpevarp subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Jaquet, O.; Siegel, P. [Colenco Power Engineering Ltd, Baden-Daettwil (Switzerland)

    2006-10-15

    A groundwater flow model (glaciation model) was developed at a regional scale in order to study long term transient effects related to a glaciation scenario likely to occur in response to climatic changes. Conceptually the glaciation model was based on the regional model of Simpevarp and was then extended to a mega-regional scale (of several hundred kilometres) in order to account for the effects of the ice sheet. These effects were modelled using transient boundary conditions provided by a dynamic ice sheet model describing the phases of glacial build-up, glacial completeness and glacial retreat needed for the glaciation scenario. The results demonstrate the strong impact of the ice sheet on the flow field, in particular during the phases of the build-up and the retreat of the ice sheet. These phases last for several thousand years and may cause large amounts of melt water to reach the level of the repository and below. The highest fluxes of melt water are located in the vicinity of the ice margin. As the ice sheet approaches the repository location, the advective effects gain dominance over diffusive effects in the flow field. In particular, up-coning effects are likely to occur at the margin of the ice sheet leading to potential increases in salinity at repository level. For the base case, the entire salinity field of the model is almost completely flushed out at the end of the glaciation period. The flow patterns are strongly governed by the location of the conductive features in the subglacial layer. The influence of these glacial features is essential for the salinity distribution as is their impact on the flow trajectories and, therefore, on the resulting performance measures. Travel times and F-factor were calculated using the method of particle tracking. Glacial effects cause major consequences on the results. In particular, average travel times from the repository to the surface are below 10 a during phases of glacial build-up and retreat. In comparison

  11. CHROMAT™ Version 1.1--Soil Chromium Attenuation Evaluation Model

    International Nuclear Information System (INIS)

    Felmy, A.R.; Rai, D.; Zachara, J.M.; Thapa, M.; Gold, M.

    1992-07-01

    This document is the user's manual and technical reference for the Soil Chromium Attenuation Model (CHROMAT™), a computer code designed to calculate both the dissolved Cr concentration and the amount of Cr attenuated in soils as a result of the geochemical reactions that occur as Cr-containing leachates migrate through porous soils. The dissolved Cr concentration and the amount of Cr attenuated are calculated using thermodynamic (mechanistic) data for aqueous complexation reactions, adsorption/desorption reactions, and precipitation/dissolution reactions involving both Cr(III) and Cr(VI) species. Use of this mechanistic approach means that CHROMAT™ requires a minimum amount of site-specific data on leachate and soil characteristics. CHROMAT™ is distributed in executable form for IBM and IBM-compatible personal computers through a license from the Electric Power Research Institute (EPRI). The user interacts with CHROMAT™ using menu-driven screen displays. Interactive on-line help options are available. Output from the code can be obtained in tabular or graphic form. This manual describes the development of CHROMAT™, including experimental data development in support of the model and model validation studies. The thermodynamic data and computational algorithm are also described. Example problems and results are included.

  12. Two modified versions of the speciation code PHREEQE for modelling macromolecule-proton/cation interaction

    International Nuclear Information System (INIS)

    Falck, W.E.

    1991-01-01

    There is a growing need to consider the influence of organic macromolecules on the speciation of ions in natural waters. It is recognized that a simple discrete ligand approach to the binding of protons/cations to organic macromolecules is not appropriate to represent heterogeneities of binding site distributions. A more realistic approach has been incorporated into the speciation code PHREEQE which retains the discrete ligand approach but modifies the binding intensities using an electrostatic (surface complexation) model. To allow for different conformations of natural organic material two alternative concepts have been incorporated: it is assumed that (a) the organic molecules form rigid, impenetrable spheres, and (b) the organic molecules form flat surfaces. The former concept will be more appropriate for molecules in the smaller size range, while the latter will be more representative for larger size molecules or organic surface coatings. The theoretical concept is discussed and the relevant changes to the standard PHREEQE code are explained. The modified codes are called PHREEQEO-RS and PHREEQEO-FS for the rigid-sphere and flat-surface models respectively. Improved output facilities for data transfer to other computers, e.g. the Macintosh, are introduced. Examples where the model is tested against literature data are shown and practical problems are discussed. Appendices contain listings of the modified subroutines GAMMA and PTOT, an example input file and an example command procedure to run the codes on VAX computers
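    The electrostatic correction described above can be sketched generically: in surface complexation models, an intrinsic binding constant is modified by a Boltzmann factor that depends on the potential at the binding surface, so apparent proton/cation binding strengthens or weakens as the macromolecule charges up. The snippet below is a minimal illustration of that idea under assumed values, not the PHREEQEO-RS/FS code; the function name and the example potential are invented for illustration.

    ```python
    import math

    F = 96485.332      # Faraday constant, C/mol
    R = 8.314462       # gas constant, J/(mol K)

    def apparent_log_k(log_k_intrinsic, delta_z, psi_volts, temp_kelvin=298.15):
        """Boltzmann-corrected (apparent) binding constant used in generic surface
        complexation models: K_app = K_int * exp(-delta_z * F * psi / (R * T)).

        delta_z   : change in surface charge caused by the binding reaction
        psi_volts : electrostatic potential at the binding plane (V), assumed here
        """
        correction = -delta_z * F * psi_volts / (R * temp_kelvin)
        return log_k_intrinsic + correction / math.log(10.0)

    # Example: proton binding (delta_z = +1) to a negatively charged organic surface
    print(apparent_log_k(log_k_intrinsic=4.0, delta_z=+1, psi_volts=-0.05))  # apparent binding strengthens
    print(apparent_log_k(log_k_intrinsic=4.0, delta_z=+1, psi_volts=-0.10))
    ```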

  13. Nuclear power investment risk economic model

    International Nuclear Information System (INIS)

    Houghton, W.J.; Postula, F.D.

    1985-12-01

    This paper describes an economic model which was developed to evaluate the net costs incurred by a utility due to an accident-induced outage at a nuclear power plant. During such an outage the portion of the plant operating costs associated with power production is saved; however, the owning utility faces a sizable expense as fossil fuels are burned as a substitute for the incapacitated nuclear power. Additional expenses are incurred by the utility for plant repair and, if necessary, decontamination costs. The model makes provision for mitigating these costs by sales of power, property damage insurance payments, tax write-offs and increased rates. Over 60 economic variables contribute to the net cost uncertainty. The values of these variables are treated as uncertainty distributions and are used in a Monte Carlo computer program to evaluate the cost uncertainty (investment risk) associated with damage which could occur from various categories of initiating accidents. As an example, results of computations for various levels of damage associated with a loss-of-coolant accident are shown as a range of consequential plant downtime and unrecovered cost. A typical investment risk profile is shown for these types of accidents. Cost/revenue values for each economic factor are presented for a Three Mile Island-II-type accident, e.g., uncontrolled core heatup. 4 refs., 6 figs., 3 tabs
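    The Monte Carlo treatment of uncertain cost variables described above can be sketched compactly. The distributions, variable names and magnitudes below are illustrative assumptions only, not values from the model; the point is to show how sampling uncertain cost and revenue components yields an investment-risk profile (a distribution of net cost).

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # number of Monte Carlo trials

    # Illustrative (assumed) uncertainty distributions for a few cost/revenue items, in M$
    repair_cost        = rng.lognormal(mean=np.log(200), sigma=0.5, size=N)
    decon_cost         = rng.lognormal(mean=np.log(300), sigma=0.7, size=N)
    replacement_power  = rng.normal(loc=400, scale=80, size=N)      # fossil substitution cost
    insurance_recovery = rng.uniform(low=50, high=250, size=N)      # property damage insurance
    fuel_opex_saved    = rng.normal(loc=60, scale=10, size=N)       # nuclear O&M/fuel not spent

    net_cost = repair_cost + decon_cost + replacement_power - insurance_recovery - fuel_opex_saved

    # Investment-risk profile: summary statistics and exceedance percentiles
    for p in (50, 90, 99):
        print(f"P{p} net cost: {np.percentile(net_cost, p):,.0f} M$")
    print(f"mean net cost: {net_cost.mean():,.0f} M$")
    ```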

  14. Regional hydrogeological simulations. Numerical modelling using ConnectFlow. Preliminary site description Simpevarp sub area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Hoch, Andrew; Hunter, Fiona; Jackson, Peter [Serco Assurance, Risley (United Kingdom); Marsic, Niko [Kemakta Konsult, Stockholm (Sweden)

    2005-02-01

    The objective of this study is to support the development of a preliminary Site Description of the Simpevarp area on a regional-scale based on the available data of August 2004 (Data Freeze S1.2) and the previous Site Description. A more specific objective of this study is to assess the role of known and unknown hydrogeological conditions for the present-day distribution of saline groundwater in the Simpevarp area on a regional-scale. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local-scale as well as predictions of future hydrogeological conditions. Other key objectives were to identify the model domain required to simulate regional flow and solute transport at the Simpevarp area and to incorporate a new geological model of the deformation zones produced for Version S1.2. Another difference with Version S1.1 is the increased effort invested in conditioning the hydrogeological property models to the fracture boremap and hydraulic data. A new methodology was developed for interpreting the discrete fracture network (DFN) by integrating the geological description of the DFN (GeoDFN) with the hydraulic test data from Posiva Flow-Log and Pipe-String System double-packer techniques to produce a conditioned Hydro-DFN model. This was done in a systematic way that addressed uncertainties associated with the assumptions made in interpreting the data, such as the relationship between fracture transmissivity and length. Consistent hydraulic data was only available for three boreholes, and therefore only relatively simplistic models were proposed as there is not sufficient data to justify extrapolating the DFN away from the boreholes based on rock domain, for example. Significantly, a far greater quantity of hydro-geochemical data was available for calibration in the

  15. The operational eEMEP model version 10.4 for volcanic SO2 and ash forecasting

    Science.gov (United States)

    Steensen, Birthe M.; Schulz, Michael; Wind, Peter; Valdebenito, Álvaro M.; Fagerli, Hilde

    2017-05-01

    This paper presents a new version of the EMEP MSC-W model called eEMEP developed for transportation and dispersion of volcanic emissions, both gases and ash. EMEP MSC-W is usually applied to study problems with air pollution and aerosol transport and requires some adaptation to treat volcanic eruption sources and effluent dispersion. The operational set-up of model simulations in case of a volcanic eruption is described. Important choices have to be made to achieve CPU efficiency so that emergency situations can be tackled in time, answering relevant questions of ash advisory authorities. An efficient model needs to balance the complexity of the model and resolution. We have investigated here a meteorological uncertainty component of the volcanic cloud forecast by using a consistent ensemble meteorological dataset (GLAMEPS forecast) at three resolutions for the case of SO2 emissions from the 2014 Barðarbunga eruption. The low resolution (40 × 40 km) ensemble members show larger agreement in plume position and intensity, suggesting that the ensemble here does not give much added value. To compare the dispersion at different resolutions, we compute the area where the column load of the volcanic tracer, here SO2, is above a certain threshold, varied for testing purposes between 0.25 and 50 Dobson units. The increased numerical diffusion causes a larger area (+34 %) to be covered by the volcanic tracer in the low resolution simulations than in the high resolution ones. The higher resolution (10 × 10 km) ensemble members show higher column loads farther away from the volcanic eruption site in narrower clouds. Cloud positions are more varied between the high resolution members, and the cloud forms resemble the observed clouds more than the low resolution ones. For a volcanic emergency case this means that to obtain quickly results of the transport of volcanic emissions, an individual simulation with our low resolution is sufficient; however, to forecast peak

  16. User's guide to revised method-of-characteristics solute-transport model (MOC--version 3.1)

    Science.gov (United States)

    Konikow, Leonard F.; Granato, G.E.; Hornberger, G.Z.

    1994-01-01

    The U.S. Geological Survey computer model to simulate two-dimensional solute transport and dispersion in ground water (Konikow and Bredehoeft, 1978; Goode and Konikow, 1989) has been modified to improve management of input and output data and to provide progressive run-time information. All opening and closing of files are now done automatically by the program. Names of input data files are entered either interactively or using a batch-mode script file. Names of output files, created automatically by the program, are based on the name of the input file. In the interactive mode, messages are written to the screen during execution to allow the user to monitor the status and progress of the simulation and to anticipate total running time. Information reported and updated during a simulation includes the current pumping period and time step, number of particle moves, and percentage completion of the current time step. The batch mode enables a user to run a series of simulations consecutively, without additional control. A report of the model's activity in the batch mode is written to a separate output file, allowing later review. The user has several options for creating separate output files for different types of data. The formats are compatible with many commercially available applications, which facilitates graphical postprocessing of model results. Geohydrology and Evaluation of Stream-Aquifer Relations in the Apalachicola-Chattahoochee-Flint River Basin, Southeastern Alabama, Northwestern Florida, and Southwestern Georgia By Lynn J. Torak, Gary S. Davis, George A. Strain, and Jennifer G. Herndon Abstract The lower Apalachicola-Chattahoochee-Flint River Basin is underlain by Coastal Plain sediments of pre-Cretaceous to Quaternary age consisting of alternating units of sand, clay, sandstone, dolomite, and limestone that gradually thicken and dip gently to the southeast. The stream-aquifer system consists of carbonate (limestone and dolomite) and clastic sediments

  17. Presentation, calibration and validation of the low-order, DCESS Earth System Model (Version 1)

    Directory of Open Access Journals (Sweden)

    J. O. Pepke Pedersen

    2008-11-01

    Full Text Available A new, low-order Earth System Model is described, calibrated and tested against Earth system data. The model features modules for the atmosphere, ocean, ocean sediment, land biosphere and lithosphere and has been designed to simulate global change on time scales of years to millions of years. The atmosphere module considers radiation balance, meridional transport of heat and water vapor between low-mid latitude and high latitude zones, heat and gas exchange with the ocean and sea ice and snow cover. Gases considered are carbon dioxide and methane for all three carbon isotopes, nitrous oxide and oxygen. The ocean module has 100 m vertical resolution, carbonate chemistry and prescribed circulation and mixing. Ocean biogeochemical tracers are phosphate, dissolved oxygen, dissolved inorganic carbon for all three carbon isotopes and alkalinity. Biogenic production of particulate organic matter in the ocean surface layer depends on phosphate availability but with lower efficiency in the high latitude zone, as determined by model fit to ocean data. The calcite to organic carbon rain ratio depends on surface layer temperature. The semi-analytical, ocean sediment module considers calcium carbonate dissolution and oxic and anoxic organic matter remineralisation. The sediment is composed of calcite, non-calcite mineral and reactive organic matter. Sediment porosity profiles are related to sediment composition and a bioturbated layer of 0.1 m thickness is assumed. A sediment segment is ascribed to each ocean layer and segment area stems from observed ocean depth distributions. Sediment burial is calculated from sedimentation velocities at the base of the bioturbated layer. Bioturbation rates and oxic and anoxic remineralisation rates depend on organic carbon rain rates and dissolved oxygen concentrations. The land biosphere module considers leaves, wood, litter and soil. Net primary production depends on atmospheric carbon dioxide concentration and

  18. RadCon: A radiological consequences model. Technical guide - Version 2.0

    International Nuclear Information System (INIS)

    Crawford, J; Domel, R.U.; Harris, F.F.; Twining, J.R.

    2000-05-01

    A Radiological Consequence model (RadCon) is being developed at ANSTO to assess the radiological consequences, after an incident, in any climate, using appropriate meteorological and radiological transfer parameters. The major areas of interest to the developers are tropical and subtropical climates. This is particularly so given that it is anticipated that nuclear energy will become a mainstay for economies in these regions within the foreseeable future. Therefore, data acquisition and use of parameter values have been concentrated primarily on these climate types. Atmospheric dispersion and deposition for Australia can be modelled and supplied by the Regional Specialised Meteorological Centre (RSMC, one of five in the world), which is part of the Bureau of Meteorology Research Centre (BMRC) (Puri et al., 1992). RadCon combines these data (i.e. the time-dependent air and ground concentration generated by the dispersion model or measured quantities in the case of an actual incident) with specific regional parameter values to determine the dose to people via the major pathways of external and internal irradiation. For the external irradiation calculations, data are needed on lifestyle information such as the time spent indoors/outdoors, the high/low physical activity rates for different groups of people (especially critical groups) and shielding factors for housing types. For the internal irradiation calculations, data are needed on food consumption, effect of food processing, transfer parameters (soil to plant, plant to animal) and interception values appropriate for the region under study. Where the relevant data are not available, default temperate data are currently used. The results of a wide-ranging literature search have highlighted where specific research will be initiated to determine the information required for tropical and sub-tropical regions. The user is able to initiate sensitivity analyses within RadCon. This allows the parameters to be ranked in

  19. Dayton Aircraft Cabin Fire Model, Version 3, Volume I. Physical Description.

    Science.gov (United States)

    1982-06-01

    [Only fragments of the physical description survive extraction. They indicate that flame contact is assumed for any surface directly above a burning element, provided the current flame length makes contact possible; that no horizontal extension of the flames beneath the surface is considered; that the equation for computing the flame length is presented in Section 5; that the values chosen for DACFIR3 are 0.15 for Ec and 0.10 for Ep; and that the Steward model is also used to compute the flame length, hf, for the fire.]

  20. ITS Version 3.0: Powerful, user-friendly software for radiation modelling

    International Nuclear Information System (INIS)

    Kensek, R.P.; Halbleib, J.A.; Valdez, G.D.

    1993-01-01

    ITS (the Integrated Tiger Series) is a powerful, but user-friendly, software package permitting state-of-the-art modelling of electron and/or photon radiation effects. The programs provide the Monte Carlo solution of linear, time-independent, coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields. The ITS system combines operational simplicity and physical accuracy in order to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems.

  1. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people, involved in the operations being assessed. The idea is to assess fully the underlying performance shaping factors (PSFs) and dependence effects which result either in reliable or unreliable human performance

  2. Model for Analysis of the Energy Demand (MAED) users' manual for version MAED-1

    International Nuclear Information System (INIS)

    1986-09-01

    This manual is organized into two major parts. The first part includes eight main sections describing how to use the MAED-1 computer program and the second one consists of five appendices giving some additional information about the program. Concerning the main sections of the manual, Section 1 gives a summary description and some background information about the MAED-1 model. Section 2 describes the MAED-1 model in more detail. Section 3 introduces some concepts, mainly related to the computer requirements imposed by the program, that are used throughout this document. Sections 4 to 7 describe how to execute each of the various programs (or modules) of the MAED-1 package. The description for each module shows the user how to prepare the control and data cards needed to execute the module and how to interpret the printed output produced. Section 8 recapitulates the use of MAED-1 for carrying out energy and electricity planning studies, describes the several phases normally involved in this type of study and provides the user with practical hints about the most important aspects that need to be verified at each phase while executing the various MAED modules.

  3. An improved risk-explicit interval linear programming model for pollution load allocation for watershed management.

    Science.gov (United States)

    Xia, Bisheng; Qian, Xin; Yao, Hong

    2017-11-01

    Although the risk-explicit interval linear programming (REILP) model has solved the problem of having interval solutions, it has an equity problem, which can lead to unbalanced allocation between different decision variables. Therefore, an improved REILP model is proposed. This model adds an equity objective function and three constraint conditions to overcome this equity problem. In this case, pollution reduction is in proportion to pollutant load, which supports balanced development between different regional economies. The model is used to solve the problem of pollution load allocation in a small transboundary watershed. Compared with the original REILP model result, our model achieves equity between the upstream and downstream pollutant loads; it also avoids assigning the greatest pollution reduction to the sources nearest to the control section. The model provides a better solution to the problem of pollution load allocation than previous versions.
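    The equity rule described above, pollution reduction in proportion to pollutant load, can be illustrated in a few lines. This is a schematic sketch with assumed loads and an assumed total reduction target, not the improved REILP formulation itself, which additionally carries interval coefficients, risk terms and three extra constraints.

    ```python
    # Proportional (equity-based) allocation of a required total load reduction:
    # each source reduces in proportion to its share of the total pollutant load.
    loads = {"upstream_A": 120.0, "upstream_B": 80.0, "downstream_C": 200.0}  # t/yr (assumed)
    total_reduction_target = 100.0  # t/yr (assumed)

    total_load = sum(loads.values())
    allocation = {src: total_reduction_target * load / total_load for src, load in loads.items()}

    for src, cut in allocation.items():
        print(f"{src}: reduce {cut:.1f} t/yr ({cut / loads[src]:.0%} of its load)")
    # Every source reduces the same fraction of its own load (here 25%), which is the equity property.
    ```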

  4. Model-based mitigation of availability risks

    NARCIS (Netherlands)

    Zambon, E.; Bolzoni, D.; Etalle, S.; Salvato, M.

    2007-01-01

    The assessment and mitigation of risks related to the availability of the IT infrastructure is becoming increasingly important in modern organizations. Unfortunately, present standards for risk assessment and mitigation show limitations when evaluating and mitigating availability risks. This is due

  5. Model-Based Mitigation of Availability Risks

    NARCIS (Netherlands)

    Zambon, Emmanuele; Bolzoni, D.; Etalle, Sandro; Salvato, Marco

    2007-01-01

    The assessment and mitigation of risks related to the availability of the IT infrastructure is becoming increasingly important in modern organizations. Unfortunately, present standards for Risk Assessment and Mitigation show limitations when evaluating and mitigating availability risks. This is due

  6. ISM Approach to Model Offshore Outsourcing Risks

    OpenAIRE

    Kumar, Sunand; Sharma, Rajiv Kumar; Chauhan, Prashant

    2014-01-01

    In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But, as is evident from the literature, a number of risks, such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk and organizational structural risk, adversely affect the performance of offshore outsourcing in a supply chain ...

  7. Health risks associated with biogas. Assessment of health risks related to the injection of biogas into the natural gas network. Affset opinion. Collective expertise report. Final version

    International Nuclear Information System (INIS)

    JAEG, Jean-Philippe; Bajeat, Philippe; Wenisch, Sandrine; Bellenfant, Gael; Godon, Jean-Jacques; Keck, Gerard; Lattes, Armand; Moletta-Denat, Marina; Naja, Ghinwa; Ramalho, Olivier; Zdanevitch, Isabelle; ALARY, Rene; RAMEL, Martine

    2008-10-01

    This publication reports a study which aimed at acquiring and analysing available bibliographical data regarding risks associated with the exposure to toxic compounds in relationship with the injection of biogas into the natural gas network; at characterising biogas composition, notably its content in potentially toxic compounds, with respect to the currently distributed natural gas; at assessing health risks related to the exposure to toxic agents before and after combustion, also with respect to the currently distributed natural gas; and, based on this risk assessment, at determining biogas composition characteristics. After a presentation of the context, scope and modalities of this study, the report proposes an overview of various contextual aspects related to biogas (interest, production means, purification processes, valorisation, injection processes) and then discusses chemical risks related to biogas: bibliographical study, biogas chemical composition, and chemical composition of biogas combustion residues. It also discusses microbiological risks. Several appendices are provided.

  8. Modeling urban flood risk territories for Riga city

    Science.gov (United States)

    Piliksere, A.; Sennikovs, J.; Virbulis, J.; Bethers, U.; Bethers, P.; Valainis, A.

    2012-04-01

    the Gumbel extreme value analysis. The hydrological modelling driven by the temperature and precipitation data series from regional climate models was used for evaluation of rain event maximums in the future periods. The usage of the climate model data in hydrological models causes systematic errors; therefore the bias correction method (Sennikovs, Bethers, 2009) was applied for determination of the future rainfall intensities. A SWMM model was built for the urban area. Objects of hydraulic importance (manifold, penstock, ditch, pumping station, weir, well, catchment sub-basin, etc.) were included in the model. There exist a pure rain sewage system and a mixed rain-water/household sewage system in Riga. The sewage system, with wastewater load proportional to population density, was taken into account and calibrated. The model system was calibrated for a real rain event against the water flux time series into the sewage treatment plant of Riga. A high-resolution (~1.5 points per square meter) digital terrain map was used as the base for the finite element mesh for the geospatial mapping of results of hydraulic calculations. Main results of the study are (1) detection of the hot spots of densely populated urban areas; (2) identification of the weak chains of the melioration and sewage systems; (3) mapping of the elevation of groundwater mainly caused by snow melting. A. Piliksere, A. Valainis, J. Seņņikovs (2011), A flood risk assessment for Riga city taking account climate changes, EGU, Vienna, Austria. EPA (2004), Storm water management model. User's manual version 5.0. US Environmental Protection Agency. J. Sennikovs, U. Bethers (2009), Statistical downscaling method of regional climate model results for hydrological modelling. 18th World IMACS/MODSIM Congress, Cairns, Australia.
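    As a brief illustration of the Gumbel extreme value step mentioned above, the snippet below fits a Gumbel distribution to a series of annual rainfall maxima and evaluates a 20-year return level; the data are synthetic and the code is not the workflow used in the study.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    # Synthetic annual maxima of daily rainfall (mm) -- assumed values for illustration only
    annual_max = np.array([42.0, 55.3, 38.7, 61.2, 47.8, 52.1, 70.4, 44.9, 58.6, 49.3,
                           66.0, 41.5, 53.8, 59.9, 45.2])

    loc, scale = gumbel_r.fit(annual_max)

    # Rainfall depth with a 20-year return period (exceeded with probability 1/20 in a given year)
    return_period = 20
    rain_20yr = gumbel_r.ppf(1.0 - 1.0 / return_period, loc=loc, scale=scale)
    print(f"Gumbel fit: loc={loc:.1f} mm, scale={scale:.1f} mm, 20-yr event = {rain_20yr:.1f} mm")
    ```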

  9. MIG version 0.0 model interface guidelines: Rules to accelerate installation of numerical models into any compliant parent code

    Energy Technology Data Exchange (ETDEWEB)

    Brannon, R.M.; Wong, M.K.

    1996-08-01

    A set of model interface guidelines, called MIG, is presented as a means by which any compliant numerical material model can be rapidly installed into any parent code without having to modify the model subroutines. Here, "model" usually means a material model such as one that computes stress as a function of strain, though the term may be extended to any numerical operation. "Parent code" means a hydrocode, finite element code, etc. which uses the model and enforces, say, the fundamental laws of motion and thermodynamics. MIG requires the model developer (who creates the model package) to specify model needs in a standardized but flexible way. MIG includes a dictionary of technical terms that allows developers and parent code architects to share a common vocabulary when specifying field variables. For portability, database management is the responsibility of the parent code. Input/output occurs via structured calling arguments. As much model information as possible (such as the lists of required inputs, as well as lists of precharacterized material data and special needs) is supplied by the model developer in an ASCII text file. Every MIG-compliant model also has three required subroutines to check data, to request extra field variables, and to perform model physics. To date, the MIG scheme has proven flexible in beta installations of a simple yield model, plus a more complicated viscodamage yield model, three electromechanical models, and a complicated anisotropic microcrack constitutive model. The MIG yield model has been successfully installed using identical subroutines in three vectorized parent codes and one parallel C++ code, all predicting comparable results. By maintaining one model for many codes, MIG facilitates code-to-code comparisons and reduces duplication of effort, thereby reducing the cost of installing and sharing models in diverse new codes.
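    The MIG requirement that every compliant model expose three subroutines (check data, request extra field variables, perform the model physics) maps naturally onto a small interface. The sketch below is a schematic Python rendering of that contract with invented names and a toy elastic update; MIG itself targets Fortran/C parent codes and defines the calling arguments precisely, so treat this only as an illustration of the structure.

    ```python
    class MIGStyleModel:
        """Schematic three-entry-point interface in the spirit of MIG (names are illustrative)."""

        def check_data(self, params):
            """Entry point 1: validate user-supplied material parameters."""
            if params.get("shear_modulus", 0.0) <= 0.0:
                raise ValueError("shear_modulus must be positive")
            return params

        def request_extra_variables(self):
            """Entry point 2: tell the parent code which extra field variables to allocate."""
            return ["equivalent_plastic_strain"]

        def run_physics(self, stress, strain_increment, params, extra):
            """Entry point 3: advance the material state (toy linear-elastic update)."""
            stress_new = stress + 2.0 * params["shear_modulus"] * strain_increment
            extra["equivalent_plastic_strain"] += 0.0  # elastic only in this sketch
            return stress_new, extra


    # A parent code would drive the model through the same three calls every cycle:
    model = MIGStyleModel()
    p = model.check_data({"shear_modulus": 26.0e9})
    state = {name: 0.0 for name in model.request_extra_variables()}
    sigma, state = model.run_physics(stress=0.0, strain_increment=1.0e-4, params=p, extra=state)
    print(sigma, state)
    ```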

  10. Offshore Wind Guidance Document: Oceanography and Sediment Stability (Version 1) Development of a Conceptual Site Model.

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Jesse D.; Jason Magalen; Craig Jones

    2014-06-01

    This guidance document provides the reader with an overview of the key environmental considerations for a typical offshore wind coastal location and the tools to help guide the reader through a thorough planning process. It will enable readers to identify the key coastal processes relevant to their offshore wind site and perform pertinent analysis to guide siting and layout design, with the goal of minimizing costs associated with planning, permitting, and long-term maintenance. The document highlights site characterization and assessment techniques for evaluating spatial patterns of sediment dynamics in the vicinity of a wind farm under typical, extreme, and storm conditions. Finally, the document describes the assimilation of all of this information into the conceptual site model (CSM) to aid the decision-making processes.

  11. Theoretical modelling of epigenetically modified DNA sequences [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Alexandra Teresa Pires Carvalho

    2015-05-01

    Full Text Available We report herein a set of calculations designed to examine the effects of epigenetic modifications on the structure of DNA. The incorporation of methyl, hydroxymethyl, formyl and carboxy substituents at the 5-position of cytosine is shown to hardly affect the geometry of CG base pairs, but to result in rather larger changes to hydrogen-bond and stacking binding energies, as predicted by dispersion-corrected density functional theory (DFT methods. The same modifications within double-stranded GCG and ACA trimers exhibit rather larger structural effects, when including the sugar-phosphate backbone as well as sodium counterions and implicit aqueous solvation. In particular, changes are observed in the buckle and propeller angles within base pairs and the slide and roll values of base pair steps, but these leave the overall helical shape of DNA essentially intact. The structures so obtained are useful as a benchmark of faster methods, including molecular mechanics (MM and hybrid quantum mechanics/molecular mechanics (QM/MM methods. We show that previously developed MM parameters satisfactorily reproduce the trimer structures, as do QM/MM calculations which treat bases with dispersion-corrected DFT and the sugar-phosphate backbone with AMBER. The latter are improved by inclusion of all six bases in the QM region, since a truncated model including only the central CG base pair in the QM region is considerably further from the DFT structure. This QM/MM method is then applied to a set of double-stranded DNA heptamers derived from a recent X-ray crystallographic study, whose size puts a DFT study beyond our current computational resources. These data show that still larger structural changes are observed than in base pairs or trimers, leading us to conclude that it is important to model epigenetic modifications within realistic molecular contexts.

  12. Forsmark site investigation. Assessment of the validity of the rock domain model, version 1.2, based on the modelling of gravity and petrophysical data

    International Nuclear Information System (INIS)

    Isaksson, Hans; Stephens, Michael B.

    2007-11-01

    This document reports the results gained by the geophysical modelling of rock domains based on gravity and petrophysical data, which is one of the activities performed within the site investigation work at Forsmark. The main objective with this activity is to assess the validity of the geological rock domain model version 1.2, and to identify discrepancies in the model that may indicate a need for revision of the model or a need for additional investigations. The verification is carried out by comparing the calculated gravity model response, which takes account of the geological model, with a local gravity anomaly that represents the measured data. The model response is obtained from the three-dimensional geometry and the petrophysical data provided for each rock domain in the geological model. Due to model boundary conditions, the study is carried out in a smaller area within the regional model area. Gravity model responses are calculated in three stages; an initial model, a base model and a refined base model. The refined base model is preferred and is used for comparison purposes. In general, there is a good agreement between the refined base model that makes use of the rock domain model, version 1.2 and the measured gravity data, not least where it concerns the depth extension of the critical rock domain RFM029. The most significant discrepancy occurs in the area extending from the SFR office to the SFR underground facility and further to the northwest. It is speculated that this discrepancy is caused by a combination of an overestimation of the volume of gabbro (RFM016) that plunges towards the southeast in the rock domain model, and an underestimation of the volume of occurrence of pegmatite and pegmatitic granite that are known to be present and occur as larger bodies around SFR. Other discrepancies are noted in rock domain RFM022, which is considered to be overestimated in the rock domain model, version 1.2, and in rock domain RFM017, where the gravity
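    For orientation, the basic link between a density contrast and a gravity response that the comparison above relies on can be shown with the textbook infinite-slab approximation, Δg = 2πGΔρh. This is only a back-of-the-envelope check with assumed numbers, not the 3-D forward modelling used for the rock domain model.

    ```python
    import math

    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def bouguer_slab_anomaly(delta_rho, thickness_m):
        """Gravity anomaly (in mGal) of an infinite horizontal slab with density
        contrast delta_rho (kg/m^3) and thickness thickness_m (m): dg = 2*pi*G*drho*h."""
        dg_si = 2.0 * math.pi * G * delta_rho * thickness_m  # m/s^2
        return dg_si * 1.0e5                                 # 1 mGal = 1e-5 m/s^2

    # Example (assumed numbers): a 500 m thick gabbro body ~300 kg/m^3 denser than the host rock
    print(f"{bouguer_slab_anomaly(300.0, 500.0):.2f} mGal")
    ```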

  13. Simulating the 2012 High Plains Drought Using Three Single Column Model Versions of the Community Earth System Model (SCM-CESM)

    Science.gov (United States)

    Medina, I. D.; Denning, S.

    2014-12-01

    The impact of changes in the frequency and severity of drought on fresh water sustainability is a great concern for many regions of the world. One such location is the High Plains, where the local economy is primarily driven by fresh water withdrawals from the Ogallala Aquifer, which accounts for approximately 30% of total irrigation withdrawals from all U.S. aquifers combined. Modeling studies that focus on the feedback mechanisms that control the climate and eco-hydrology during times of drought are limited in the sense that they use conventional General Circulation Models (GCMs) with grid length scales ranging from one hundred to several hundred kilometers. Additionally, these models utilize crude statistical parameterizations of cloud processes for estimating sub-grid fluxes of heat and moisture and have a poor representation of land surface heterogeneity. For this research, we focus on the 2012 High Plains drought, and will perform numerical simulations using three single column model versions of the Community Earth System Model (SCM-CESM) at multiple sites overlying the Ogallala Aquifer for the 2010-2012 period. In the first version of SCM-CESM, CESM will be used in standard mode (Community Atmospheric Model (CAM) coupled to a single instance of the Community Land Model (CLM)), secondly, CESM will be used in Super-Parameterized mode (SP-CESM), where a cloud resolving model (CRM consists of 32 atmospheric columns) replaces the standard CAM atmospheric parameterization and is coupled to a single instance of CLM, and thirdly, CESM is used in "Multi Instance" SP-CESM mode, where an instance of CLM is coupled to each CRM column of SP-CESM (32 CRM columns coupled to 32 instances of CLM). To assess the physical realism of the land-atmosphere feedbacks simulated at each site by all versions of SCM-CESM, differences in simulated energy and moisture fluxes will be computed between years for the 2010-2012 period, and will be compared to differences calculated using

  14. Fuel Cell Power Model Version 2: Startup Guide, System Designs, and Case Studies. Modeling Electricity, Heat, and Hydrogen Generation from Fuel Cell-Based Distributed Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Steward, D.; Penev, M.; Saur, G.; Becker, W.; Zuboy, J.

    2013-06-01

    This guide helps users get started with the U.S. Department of Energy/National Renewable Energy Laboratory Fuel Cell Power (FCPower) Model Version 2, which is a Microsoft Excel workbook that analyzes the technical and economic aspects of high-temperature fuel cell-based distributed energy systems with the aim of providing consistent, transparent, comparable results. This type of energy system would provide onsite-generated heat and electricity to large end users such as hospitals and office complexes. The hydrogen produced could be used for fueling vehicles or stored for later conversion to electricity.

  15. A Method and a Model for Describing Competence and Adjustment: A Preschool Version of the Classroom Behavior Inventory.

    Science.gov (United States)

    Schaefer, Earl S.; Edgerton, Marianna D.

    A preschool version of the Classroom Behavior Inventory which provides a method for collecting valid data on a child's classroom behavior from day care and preschool teachers, was developed to complement the earlier form which was developed and validated for elementary school populations. The new version was tested with a pilot group of twenty-two…

  16. Realisation of dosimetric studies for workplaces with a risk of exposure to ionizing radiations (version 2). Practical guide

    International Nuclear Information System (INIS)

    Donadille, L.; Rehel, J.L.; Deligne, J.M.; Queinnec, F.; Aubert, B.; Bottollier-Depois, J.F.; Clairand, I.; Jourdain, J.R.; Rannou, A.

    2010-01-01

    This guide proposes a methodological approach to help carry out dosimetric workplace studies complying with the French regulation and necessary to identify risks of radiological exposure, optimize radiation protection, classify the workers into different categories and the workplaces into different areas. Additional information is provided relating to the main objectives of a workplace study, the French regulatory context, and the main sources and pathways of exposure to ionizing radiation. Radiation protection and operational quantities are recalled. Recommendations about the selection and use of detectors and about the implementation of calculation methods are also provided. The general methodological approach is applied and developed into 'workplace sheets', each one devoted to a particular type of workplace. (authors)

  17. Sensitivity of precipitation to parameter values in the community atmosphere model version 5

    Energy Technology Data Exchange (ETDEWEB)

    Johannesson, Gardar; Lucas, Donald; Qian, Yun; Swiler, Laura Painton; Wildey, Timothy Michael

    2014-03-01

    One objective of the Climate Science for a Sustainable Energy Future (CSSEF) program is to develop the capability to thoroughly test and understand the uncertainties in the overall climate model and its components as they are being developed. The focus on uncertainties involves sensitivity analysis: the capability to determine which input parameters have a major influence on the output responses of interest. This report presents some initial sensitivity analysis results performed by Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), and Pacific Northwest National Laboratory (PNNL). In the 2011-2012 timeframe, these laboratories worked in collaboration to perform sensitivity analyses of a set of CAM5 2° runs, where the response metrics of interest were precipitation metrics. The three labs performed their sensitivity analysis (SA) studies separately and then compared results. Overall, the results were quite consistent with each other although the methods used were different. This exercise provided a robustness check of the global sensitivity analysis metrics and identified some strongly influential parameters.

  18. The natural defense system and the normative self model [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Philippe Kourilsky

    2016-05-01

    Full Text Available Infectious agents are not the only aggressors, and the immune system is not the sole defender of the organism. In an enlarged perspective, the ‘normative self model’ postulates that a ‘natural defense system’ protects man and other complex organisms against the environmental and internal hazards of life, including infections and cancers. It involves multiple error detection and correction mechanisms that confer robustness to the body at all levels of its organization. According to the model, the self relies on a set of physiological norms, and NONself (meaning: Non Obedient to the Norms of the self) is anything ‘off-norms’. The natural defense system comprises a set of ‘civil defenses’ (to which all cells in organs and tissues contribute) and a ‘professional army’, made up of a smaller set of mobile cells. Mobile and non-mobile cells differ in their tuning abilities. Tuning extends the recognition capabilities of NONself by the mobile cells, which increases their defensive function. To prevent them from drifting, which would compromise self/NONself discrimination, the more plastic mobile cells need to periodically refer to the more stable non-mobile cells to keep within physiological standards.

  19. Expanding pedestrian injury risk to the body region level: how to model passive safety systems in pedestrian injury risk functions.

    Science.gov (United States)

    Niebuhr, Tobias; Junge, Mirko; Achmus, Stefanie

    2015-01-01

    Assessment of the effectiveness of advanced driver assistance systems (ADAS) plays a crucial role in accident research. A common way to evaluate the effectiveness of new systems is to determine the potential for injury severity reduction. Because injury risk functions describe the probability of an injury of a given severity conditional on a technical accident severity (closing speed, delta V, barrier equivalent speed, etc.), they are predestined for such evaluations. Recent work has presented an approach for modelling the pedestrian injury risk in pedestrian-to-passenger car accidents as a family of functions. This approach gave explicit and easily interpretable formulae for the injury risk conditional on the closing speed of the car. These results are extended to injury risk functions for pedestrian body regions. Starting with a double-checked German In-depth Accident Study (GIDAS) pedestrian-to-car accident data set (N = 444) and a functional-anatomical definition of the body regions, investigations on the influence of specific body regions on the overall injury severity will be presented. As the measure of injury severity, the ISSx, a rescaled version of the well-known Injury Severity Score (ISS), was used. Though the traditional ISS is computed by summation of the squares of the 3 most severely injured body regions, the ISSx is computed by summation of the exponentials of the Abbreviated Injury Scale (AIS) severities of the 3 most severely injured body regions. The exponentials used are scaled to fit the ISS range of values between 0 and 75. Three body regions (head/face/neck, thorax, hip/legs) clearly dominated abdominal and upper extremity injuries; that is, the latter 2 body regions had no influence at all on the overall injury risk over the range of technical accident severities. Thus, the ISSx is well described by use of the injury codes from the same body regions for any pedestrian injury severity. As a mathematical consequence, the ISSx becomes explicitly
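    The contrast drawn above between the traditional ISS (sum of the squares of the three highest AIS severities) and the rescaled ISSx (sum of exponentials of the three highest AIS severities, rescaled to the 0-75 range) can be made concrete with a short sketch. The ISS part follows the standard definition; the exact exponential base and rescaling used for the ISSx are not given in the abstract, so the scaling below is an assumption for illustration only.

    ```python
    def iss(region_ais):
        """Traditional Injury Severity Score: sum of squares of the three highest
        AIS severities across body regions (set to 75 if any AIS equals 6)."""
        if max(region_ais) >= 6:
            return 75
        top3 = sorted(region_ais, reverse=True)[:3]
        return sum(a * a for a in top3)

    def issx(region_ais, base=2.0):
        """Sketch of the rescaled ISSx: sum of exponentials of the three highest AIS
        severities, scaled so that three AIS-6 injuries map to 75. The exponential
        base and the scaling are assumptions, not the published coefficients."""
        top3 = sorted(region_ais, reverse=True)[:3]
        raw = sum(base ** a for a in top3)
        return 75.0 * raw / (3 * base ** 6)

    # Example: AIS severities for head, thorax, legs, abdomen, upper extremities
    ais = [4, 3, 2, 0, 1]
    print(iss(ais), round(issx(ais), 1))
    ```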

  20. Columbia River Statistical Update Model, Version 4.0 (COLSTAT4): Background documentation and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, G.; Damschen, D.W.; Brockhaus, R.D.

    1987-08-01

    Daily-averaged temperature and flow information on the Columbia River just downstream of Priest Rapids Dam and upstream of river mile 380 were collected and stored in a data base. The flow information corresponds to discharges that were collected daily from October 1, 1959, through July 28, 1986. The temperature information corresponds to values that were collected daily from January 1, 1965, through May 27, 1986. The computer model, COLSTAT4 (Columbia River Statistical Update - Version 4.0 model), uses the temperature-discharge data base to statistically analyze temperature and flow conditions by computing the frequency of occurrence and duration of selected temperatures and flow rates for the Columbia River. The COLSTAT4 code analyzes the flow and temperature information in a sequential time frame (i.e., a continuous analysis over a given time period); it also analyzes this information in a seasonal time frame (i.e., a periodic analysis over a specific season from year to year). A provision is included to enable the user to edit and/or extend the data base of temperature and flow information. This report describes the COLSTAT4 code and the information contained in its data base.
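    The frequency-of-occurrence and duration statistics that COLSTAT4 computes from the daily temperature and flow series can be sketched in a few lines. The code below works on synthetic daily data and a single threshold; it illustrates the statistic, not the COLSTAT4 algorithm or its data base.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic daily river temperature (deg C) over ~3 years -- illustrative only
    temps = 10.0 + 7.0 * np.sin(2 * np.pi * np.arange(1095) / 365.0) + rng.normal(0, 1.0, 1095)

    threshold = 15.0
    above = temps > threshold

    # Frequency of occurrence: fraction of days exceeding the threshold
    frequency = above.mean()

    # Durations: lengths of consecutive runs of days above the threshold
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if above[0]:
        starts = np.r_[0, starts]
    if above[-1]:
        ends = np.r_[ends, above.size]
    durations = ends - starts

    print(f"exceeded {threshold} C on {frequency:.0%} of days; "
          f"{durations.size} events, mean duration {durations.mean():.1f} days")
    ```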

  1. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    Full Text Available The article deals with the problem of assessing information security risks in ERP systems. ERP-system functions and architecture are studied. A model of malicious impacts on the levels of the ERP-system architecture is composed. A model-based risk assessment, combining quantitative and qualitative approaches and built on a partial unification of three methods for studying information security risks (security models with full overlapping, the CRAMM technique and the FRAP technique), is developed.
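    As a rough illustration of the kind of combined quantitative/qualitative scoring such a model produces, the snippet below rates assumed threats against the architecture levels of an ERP system with a simple likelihood × impact product and qualitative bands; it is not the unified CRAMM/FRAP/full-overlapping method developed in the article, and all threats and numbers are invented.

    ```python
    # Illustrative (assumed) threats per ERP architecture level: (likelihood 0-1, impact 1-10)
    threats = {
        "presentation layer": {"phishing of user credentials": (0.4, 6)},
        "application layer":  {"privilege escalation":         (0.2, 9)},
        "database layer":     {"SQL injection":                (0.3, 9),
                               "backup theft":                 (0.1, 8)},
    }

    def band(score):
        """Map a quantitative score onto a qualitative risk band (thresholds are assumed)."""
        return "high" if score >= 2.0 else "medium" if score >= 1.0 else "low"

    for level, items in threats.items():
        for threat, (likelihood, impact) in items.items():
            score = likelihood * impact
            print(f"{level:20s} {threat:30s} risk={score:.1f} ({band(score)})")
    ```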

  2. Neck keloids: evaluation of risk factors and recommendation for keloid staging system [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Michael H. Tirgan

    2016-08-01

    Full Text Available Importance: Health care providers have long struggled with recurrent and hard-to-treat keloids. Advancing our understanding of natural history and risk factors for development of large, very large and massive neck keloids can lead to improved treatment outcomes. A clinical staging system for the categorization of keloid lesions, as well as for grouping of keloid patients according to the extent of skin involvement, is both fundamental for the design and delivery of a proper plan of care and an absolute necessity for methodical trial design and interpretation of the results thereof. Objective: To review clinical presentation and natural history of neck keloids; to explore risk factors for development of large, very large and massive neck keloids; and to propose a clinical staging system that allows for categorization of keloid lesions by their size and grouping of keloid patients by the extent of their skin involvement.  Setting: This is a retrospective analysis of 82 consecutive patients with neck keloids who were seen by the author in his keloid specialty medical practice.  Intervention: Non-surgical treatment was offered to all patients.  Results: Neck-area keloids were found to have several unique characteristics. All 65 African Americans in this study had keloidal lesions elsewhere on their skin. Very large and massive neck keloids appear to be race-specific and almost exclusively seen among African Americans. Submandibular and submental skin was the most commonly involved area of the neck. Keloid removal surgery was found to be the main risk factor for development of very large and massive neck keloids.  Conclusions and relevance: Surgical removal of neck keloids results in wounding of the skin and triggers a pathological wound-healing response that often leads to formation of a much larger keloid.  Given the potential for greater harm from surgery, the author proposes a non-surgical approach for treatment of all primary neck keloids. Author

  3. Updating sea spray aerosol emissions in the Community Multiscale Air Quality (CMAQ) model version 5.0.2

    Science.gov (United States)

    Gantt, B.; Kelly, J. T.; Bash, J. O.

    2015-11-01

    Sea spray aerosols (SSAs) impact the particle mass concentration and gas-particle partitioning in coastal environments, with implications for human and ecosystem health. Model evaluations of SSA emissions have mainly focused on the global scale, but regional-scale evaluations are also important due to the localized impact of SSAs on atmospheric chemistry near the coast. In this study, SSA emissions in the Community Multiscale Air Quality (CMAQ) model were updated to enhance the fine-mode size distribution, include sea surface temperature (SST) dependency, and reduce surf-enhanced emissions. Predictions from the updated CMAQ model and those of the previous release version, CMAQv5.0.2, were evaluated using several coastal and national observational data sets in the continental US. The updated emissions generally reduced model underestimates of sodium, chloride, and nitrate surface concentrations for coastal sites in the Bay Regional Atmospheric Chemistry Experiment (BRACE) near Tampa, Florida. Including SST dependency to the SSA emission parameterization led to increased sodium concentrations in the southeastern US and decreased concentrations along parts of the Pacific coast and northeastern US. The influence of sodium on the gas-particle partitioning of nitrate resulted in higher nitrate particle concentrations in many coastal urban areas due to increased condensation of nitric acid in the updated simulations, potentially affecting the predicted nitrogen deposition in sensitive ecosystems. Application of the updated SSA emissions to the California Research at the Nexus of Air Quality and Climate Change (CalNex) study period resulted in a modest improvement in the predicted surface concentration of sodium and nitrate at several central and southern California coastal sites. This update of SSA emissions enabled a more realistic simulation of the atmospheric chemistry in coastal environments where marine air mixes with urban pollution.

  4. A numerical 4D Collision Risk Model

    Science.gov (United States)

    Schmitt, Pal; Culloch, Ross; Lieber, Lilian; Kregting, Louise

    2017-04-01

    With the growing number of marine renewable energy (MRE) devices being installed across the world, some concern has been raised about the possibility of harming mobile marine fauna by collision. Although physical contact between a MRE device and an organism has not been reported to date, these novel sub-sea structures pose a challenge for accurately estimating collision risks as part of environmental impact assessments. Even if the animal motion is simplified to linear translation, ignoring likely evasive behaviour, the mathematical problem of establishing an impact probability is not trivial. We present a numerical algorithm to obtain such probability distributions using transient, four-dimensional simulations of a novel marine renewable device concept, Deep Green, Minesto's power plant, hereafter referred to as the 'kite', which flies in a figure-of-eight configuration. Simulations were carried out altering several configurations including kite depth, kite speed and kite trajectory while keeping the speed of the moving object constant. Since the kite assembly is defined as two parts in the model, a tether (attached to the seabed) and the kite, the collision risk of each part is reported independently. By comparing the number of collisions with the number of collision-free simulations, a probability of impact for each simulated position in the cross-section of the area is obtained. Results suggest that close to the bottom, where the tether amplitude is small, the path is always blocked and the impact probability is 100% as expected. However, higher up in the water column, the collision probability is twice as high at the midline, where the tether passes twice per period, than at the extremes of its trajectory. The collision probability distribution is much more complex in the upper end of the water column, where the kite and tether can simultaneously collide with the object. Results demonstrate the viability of such models, which can also incorporate empirical
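    The counting step described above, comparing colliding trajectories with collision-free ones for each simulated position, reduces to a simple Monte Carlo estimate. The sketch below does this for a drastically simplified stand-in (an object drifting at a random depth past a circular sweep area) and is not the transient 4-D kite-and-tether model; the geometry and the proximity criterion are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_sim = 20_000

    # Simplified stand-in geometry (assumed): an object drifts horizontally through the
    # water column and may cross a circular region of radius R swept by the device.
    R = 5.0                 # m, radius of the swept area
    centre_depth = 20.0     # m, depth of the centre of the swept area
    depths = rng.uniform(0.0, 40.0, n_sim)        # object transit depth
    phases = rng.uniform(0.0, 2 * np.pi, n_sim)   # device position when the object arrives

    # Collision if the object's depth lies within 0.5 m of the device at that phase
    device_depth = centre_depth + R * np.sin(phases)
    collide = np.abs(depths - device_depth) < 0.5   # 0.5 m proximity criterion (assumed)

    p_collision = collide.sum() / n_sim
    print(f"estimated collision probability per transit: {p_collision:.3%}")
    ```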

  5. The global aerosol-climate model ECHAM-HAM, version 2: sensitivity to improvements in process representations

    Directory of Open Access Journals (Sweden)

    K. Zhang

    2012-10-01

    Full Text Available This paper introduces and evaluates the second version of the global aerosol-climate model ECHAM-HAM. Major changes have been brought into the model, including new parameterizations for aerosol nucleation and water uptake, an explicit treatment of secondary organic aerosols, modified emission calculations for sea salt and mineral dust, the coupling of aerosol microphysics to a two-moment stratiform cloud microphysics scheme, and alternative wet scavenging parameterizations. These revisions extend the model's capability to represent details of the aerosol lifecycle and its interaction with climate. Nudged simulations of the year 2000 are carried out to compare the aerosol properties and global distribution in HAM1 and HAM2, and to evaluate them against various observations. Sensitivity experiments are performed to help identify the impact of each individual update in model formulation.

    Results indicate that from HAM1 to HAM2 there is a marked weakening of aerosol water uptake in the lower troposphere, reducing the total aerosol water burden from 75 Tg to 51 Tg. The main reason is that the newly introduced κ-Köhler-theory-based water uptake scheme uses a lower value for the maximum relative humidity cutoff. Particulate organic matter loading in HAM2 is considerably higher in the upper troposphere, because the explicit treatment of secondary organic aerosols allows highly volatile oxidation products of the precursors to be vertically transported to regions of very low temperature and to form aerosols there. Sulfate, black carbon, particulate organic matter and mineral dust in HAM2 have longer lifetimes than in HAM1 because of weaker in-cloud scavenging, which is in turn related to lower autoconversion efficiency in the newly introduced two-moment cloud microphysics scheme. Modification in the sea salt emission scheme causes a significant increase in the ratio (from 1.6 to 7.7) between accumulation mode and coarse mode emission fluxes of
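    The κ-Köhler-based water uptake mentioned above relates the equilibrium wet size of a particle to its hygroscopicity parameter κ and the ambient relative humidity. Ignoring the Kelvin (curvature) term, the standard relation gives the diameter growth factor directly, as in the sketch below; the relative-humidity cutoff and the κ value are assumptions for illustration, not the HAM2 settings.

    ```python
    def growth_factor(kappa, rh, rh_max=0.98):
        """Equilibrium diameter growth factor from kappa-Koehler theory without the
        Kelvin term: gf = (1 + kappa * aw / (1 - aw)) ** (1/3), with aw = min(rh, rh_max).
        The relative-humidity cutoff rh_max is an assumed value."""
        aw = min(rh, rh_max)
        return (1.0 + kappa * aw / (1.0 - aw)) ** (1.0 / 3.0)

    # Example: ammonium-sulfate-like particle (kappa ~ 0.6) at 90% and 98% relative humidity
    for rh in (0.90, 0.98):
        print(f"RH={rh:.0%}: growth factor = {growth_factor(0.6, rh):.2f}")
    ```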

  6. Estimating internal exposure risks by the relative risk and the National Institute of Health risk models

    International Nuclear Information System (INIS)

    Mehta, S.K.; Sarangapani, R.

    1995-01-01

    This paper presents tabulations of risk (R) and person-years of life lost (PYLL) for acute exposures of individual organs at ages 20 and 40 yrs for the Indian and Japanese populations to illustrate the effect of age at exposure in the two models. Results are also presented for the organ-wise nominal probability coefficients (NPC) and PYLL for individual organs for the age-distributed Indian population by the two models. The results presented show that for all organs the estimates of PYLL and NPC for the Indian population are lower than those for the Japanese population by both models except for oesophagus, breast and ovary by the relative risk (RR) model, where the opposite trend is observed. The results also show that the Indian all-cancer value of NPC averaged over the two models is 2.9 × 10⁻² Sv⁻¹, significantly lower than the world average value of 5 × 10⁻² Sv⁻¹ estimated by the ICRP. (author). 9 refs., 2 figs., 2 tabs
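    A nominal probability coefficient translates a collective dose into an expected number of cancers by simple multiplication. The short check below uses the all-cancer coefficients quoted above (2.9 × 10⁻² Sv⁻¹ and the ICRP value of 5 × 10⁻² Sv⁻¹) with an assumed collective dose purely for illustration.

    ```python
    # Expected number of cancers = collective dose (person-Sv) x nominal probability coefficient (per Sv)
    collective_dose = 1000.0   # person-Sv, assumed for illustration
    npc_india_avg = 2.9e-2     # per Sv, value quoted in the abstract
    npc_icrp = 5.0e-2          # per Sv, ICRP nominal value quoted in the abstract

    print(f"Indian coefficient: {collective_dose * npc_india_avg:.0f} expected cancers")
    print(f"ICRP coefficient:   {collective_dose * npc_icrp:.0f} expected cancers")
    ```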

  7. Interrater and test-retest reliability and validity of the Norwegian version of the BESTest and mini-BESTest in people with increased risk of falling.

    Science.gov (United States)

    Hamre, Charlotta; Botolfsen, Pernille; Tangen, Gro Gujord; Helbostad, Jorunn L

    2017-04-20

    The Balance Evaluation Systems Test (BESTest) was developed to assess the underlying systems for balance control in order to individually tailor rehabilitation interventions for people with balance disorders. A short form, the Mini-BESTest, was developed as a screening test. The study aimed to assess interrater and test-retest reliability of the Norwegian version of the BESTest and the Mini-BESTest in community-dwelling people with increased risk of falling, and to assess concurrent validity against the Fall Efficacy Scale-International (FES-I); it was an observational study with a cross-sectional design. Forty-two persons with increased risk of falling (elderly over 65 years of age, persons with a history of stroke or Multiple Sclerosis) were assessed twice by two raters. Relative reliability was analysed with the Intraclass Correlation Coefficient (ICC), and absolute reliability with the standard error of measurement (SEM) and smallest detectable change (SDC). Concurrent validity was assessed against the FES-I using Spearman's rho. The BESTest showed very good interrater reliability (ICC = 0.98, SEM = 1.79, SDC95 = 5.0) and test-retest reliability (rater A/rater B: ICC = 0.89/0.89, SEM = 3.9/4.3, SDC95 = 10.8/11.8). The Mini-BESTest also showed very good interrater reliability (ICC = 0.95, SEM = 1.19, SDC95 = 3.3) and test-retest reliability (rater A/rater B: ICC = 0.85/0.84, SEM = 1.8/1.9, SDC95 = 4.9/5.2). The correlations between the FES-I and both the BESTest and the Mini-BESTest were moderate (Spearman's rho -0.51 and -0.50). Both the BESTest and the Mini-BESTest thus showed very good interrater and test-retest reliability when assessed in a heterogeneous sample of people with increased risk of falling. The concurrent validity measured against the FES-I showed moderate correlation. The results are comparable with earlier studies and indicate that the Norwegian versions can be used in daily clinic and in research.
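
    The absolute reliability figures quoted above follow from the ICC and the between-subject standard deviation through the usual formulas SEM = SD·√(1 − ICC) and SDC95 = 1.96·√2·SEM. A minimal sketch, with a made-up standard deviation rather than the Norwegian sample's value:

```python
# Standard formulas relating ICC, SEM and SDC95 as used in reliability studies
# such as this one. The sample standard deviation below is a hypothetical value,
# not taken from the Norwegian BESTest data.
import math

def sem(sd, icc):
    """Standard error of measurement."""
    return sd * math.sqrt(1.0 - icc)

def sdc95(sem_value):
    """Smallest detectable change at the 95% level for a test-retest design."""
    return 1.96 * math.sqrt(2.0) * sem_value

sd_total_score = 12.7          # hypothetical between-subject SD of total scores
icc_interrater = 0.98
s = sem(sd_total_score, icc_interrater)
print(round(s, 2), round(sdc95(s), 2))
```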

  8. EIA model documentation: World oil refining logistics demand model, "WORLD" reference manual. Version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    1994-04-11

    This manual is intended primarily for use as a reference by analysts applying the WORLD model to regional studies. It also provides overview information on WORLD features of potential interest to managers and analysts. Broadly, the manual covers WORLD model features in progressively increasing detail. Section 2 provides an overview of the WORLD model, how it has evolved, what its design goals are, what it produces, and where it can be taken with further enhancements. Section 3 reviews model management, covering data sources, managing over-optimization, calibration and seasonality, check-points for case construction and common errors. Section 4 describes in detail the WORLD system, including: data and program systems in overview; details of mainframe and PC program control and files; model generation, size management, debugging and error analysis; use with different optimizers; and reporting and results analysis. Section 5 provides a detailed description of every WORLD model data table, covering model controls, case and technology data. Section 6 goes into the details of WORLD matrix structure. It provides an overview, describes how regional definitions are controlled and defines the naming conventions for all model rows, columns, right-hand sides, and bounds. It also includes a discussion of the formulation of product blending and specifications in WORLD. Several Appendices supplement the main sections.

  9. Study of the Eco-Economic Indicators by Means of the New Version of the Merge Integrated Model Part 2

    Directory of Open Access Journals (Sweden)

    Boris Vadimovich Digas

    2016-03-01

    Full Text Available One of the most relevant issues of the day is the forecasting problem of climatic changes and mitigation of their consequences. The official point of view reflected in the Climate doctrine of the Russian Federation consists in the recognition of the need to develop a state approach to the climatic problems and related issues on the basis of a comprehensive scientific analysis of ecological, economic and social factors. For this purpose, integrated assessment models of an interdisciplinary character are employed. Their functionality is characterized by the possibility of constructing and testing various dynamic scenarios of complex systems. The main purposes of the computing experiments described in the article are a review of the consequences of hypothetical participation of Russia in greenhouse gas reduction initiatives such as the Kyoto Protocol, and the testing of one of the methods for calculating the green gross domestic product, which represents the efficiency of environmental management, within the modelling. To implement these goals, the MERGE optimization model is used; its classical version is intended for the quantitative estimation of the results of applying nature protection strategies. The components of the model are the eco-power module, the climatic module and the module of loss estimates. The main attention is paid to the adaptation of the MERGE model to the current state of the world economy in the conditions of a complicated geopolitical situation and to the introduction of a new component to the model, implementing a simplified method for calculating the green gross domestic product. The draft scenario conditions and the key macroeconomic forecast parameters of the socio-economic development of Russia for 2016 and the planning period of 2017−2018, prepared by the Ministry of Economic Development of the Russian Federation, are used as the basic source of input data for the analysis of possible trajectories of the

  10. Study of the Eco-Economic Indicators by Means of the New Version of the Merge Integrated Model. Part 1

    Directory of Open Access Journals (Sweden)

    Boris Vadimovich Digas

    2015-12-01

    Full Text Available One of the most relevant issues of the day is the forecasting problem of climatic changes and mitigation of their consequences. The official point of view reflected in the Climate doctrine of the Russian Federation consists in the recognition of the need to develop a state approach to the climatic problems and related issues on the basis of a comprehensive scientific analysis of ecological, economic and social factors. For this purpose, integrated assessment models of an interdisciplinary character are employed. Their functionality is characterized by the possibility of constructing and testing various dynamic scenarios of complex systems. The main purposes of the computing experiments described in the article are a review of the consequences of hypothetical participation of Russia in greenhouse gas reduction initiatives such as the Kyoto Protocol, and the testing of one of the methods for calculating the green GDP, which represents the efficiency of environmental management, within the modelling. To implement these goals, the MERGE optimization model is used; its classical version is intended for the quantitative estimation of the results of applying nature protection strategies. The components of the model are the eco-power module, the climatic module and the module of loss estimates. The main attention is paid to the adaptation of the MERGE model to the current state of the world economy in the conditions of a complicated geopolitical situation and to the introduction of a new component to the model, implementing a simplified method for calculating the green GDP. The draft scenario conditions and the key macroeconomic forecast parameters of the socio-economic development of Russia for 2016 and the planning period of 2017−2018, prepared by the Ministry of Economic Development of the Russian Federation, are used as the basic source of input data for the analysis of possible trajectories of the economic development of Russia and the

  11. Thermal Site Descriptive Model. A strategy for the model development during site investigations. Version 1.0

    International Nuclear Information System (INIS)

    Sundberg, Jan

    2003-04-01

    Site investigations are in progress for the siting of a deep repository for spent nuclear fuel. As part of the planning work, strategies are developed for site descriptive modelling regarding different disciplines, amongst them the thermal conditions. The objective of the strategy for a thermal site descriptive model is to guide the practical implementation of evaluating site specific data during the site investigations. It is understood that further development may be needed. The model describes the thermal properties and other thermal parameters of intact rock, fractures and fracture zones, and of the rock mass. The methodology is based on estimation of thermal properties of intact rock and discontinuities, using both empirical and theoretical/numerical approaches, and estimation of thermal processes using mathematical modelling. The methodology will be used and evaluated for the thermal site descriptive modelling at the Aespoe Hard Rock Laboratory

  12. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models. Final Technical Report

    International Nuclear Information System (INIS)

    Hoel, David G.

    2012-01-01

    The basic purpose of this one year research grant was to extend the two stage clonal expansion model (TSCE) of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute exposure models but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop the analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piece-wise constant parameter two-stage clonal expansion model of carcinogenesis. For acute exposure and multiple exposures of acute series, either only the first mutation rate is allowed to vary with the dose, or all the parameters are allowed to be dose dependent; for multiple continuous exposures, all the parameters are allowed to vary with the dose. With these analytical functions, it becomes easy to evaluate the risks of cancer, and one can deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures to data from the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions for pulmonary plutonium exposure, the multiple-exposure versions of the TSCE model were to be used to estimate beagle dog lung cancer risks. The mathematical equations of the multiple exposure versions of the TSCE model were developed. A draft manuscript which is attached provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed due to the fact
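
    For reference, the survival and hazard functions targeted by these derivations are linked by the standard relations below; the piecewise-constant-parameter TSCE solutions themselves are given in the attached draft manuscript and are not reproduced here.

```latex
% Standard survival/hazard relations; S_k is the conditional survival over
% exposure interval k, which the TSCE derivations express analytically.
\[
  S(t) = \exp\!\Big(-\int_0^t h(u)\,\mathrm{d}u\Big), \qquad
  h(t) = -\frac{\mathrm{d}}{\mathrm{d}t}\,\ln S(t),
\]
\[
  S(t) = \prod_{k=1}^{n} S_k, \qquad
  S_k = P\big(T > t_k \mid T > t_{k-1}\big), \qquad
  0 = t_0 < t_1 < \dots < t_n = t .
\]
```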

  13. VELMA Ecohydrological Model, Version 2.0 -- Analyzing Green Infrastructure Options for Enhancing Water Quality and Ecosystem Service Co-Benefits

    Science.gov (United States)

    This 2-page factsheet describes an enhanced version (2.0) of the VELMA eco-hydrological model. VELMA – Visualizing Ecosystem Land Management Assessments – has been redesigned to assist communities, land managers, policy makers and other decision makers in evaluating the effecti...

  14. MODIFIED N.R.C. VERSION OF THE U.S.G.S. SOLUTE TRANSPORT MODEL. VOLUME 2. INTERACTIVE PREPROCESSOR PROGRAM

    Science.gov (United States)

    The methods described in the report can be used with the modified N.R.C. version of the U.S.G.S. Solute Transport Model to predict the concentration of chemical parameters in a contaminant plume. The two volume report contains program documentation and user's manual. The program ...

  15. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]
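
    The categorical combination logic described above (threat × vulnerability → likelihood, likelihood × consequence → risk) can be sketched as simple lookup tables. The category labels and matrix entries below are hypothetical placeholders, not the prototype's actual categories, and the evaluator's uncertainty propagation is omitted.

```python
# Hypothetical category-matrix sketch of the combination logic described for the
# prototype: threat x vulnerability -> likelihood, likelihood x consequence -> risk.
# The labels and matrix entries are illustrative, not the model's own.
LIKELIHOOD = {                       # keys: (threat, vulnerability)
    ("low",  "low"):  "low",    ("low",  "high"): "medium",
    ("high", "low"):  "medium", ("high", "high"): "high",
}
RISK = {                             # keys: (likelihood, consequence)
    ("low",    "minor"): "low",    ("low",    "severe"): "medium",
    ("medium", "minor"): "medium", ("medium", "severe"): "high",
    ("high",   "minor"): "medium", ("high",   "severe"): "high",
}

def diversion_risk(threat, vulnerability, consequence):
    likelihood = LIKELIHOOD[(threat, vulnerability)]
    return likelihood, RISK[(likelihood, consequence)]

print(diversion_risk("high", "low", "severe"))   # -> ('medium', 'high')
```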

  16. Bankruptcy risk model and empirical tests

    Science.gov (United States)

    Podobnik, Boris; Horvatic, Davor; Petersen, Alexander M.; Urošević, Branko; Stanley, H. Eugene

    2010-01-01

    We analyze the size dependence and temporal stability of firm bankruptcy risk in the US economy by applying Zipf scaling techniques. We focus on a single risk factor—the debt-to-asset ratio R—in order to study the stability of the Zipf distribution of R over time. We find that the Zipf exponent increases during market crashes, implying that firms go bankrupt with larger values of R. Based on the Zipf analysis, we employ Bayes’s theorem and relate the conditional probability that a bankrupt firm has a ratio R with the conditional probability of bankruptcy for a firm with a given R value. For 2,737 bankrupt firms, we demonstrate size dependence in assets change during the bankruptcy proceedings. Prepetition firm assets and petition firm assets follow Zipf distributions but with different exponents, meaning that firms with smaller assets adjust their assets more than firms with larger assets during the bankruptcy process. We compare bankrupt firms with nonbankrupt firms by analyzing the assets and liabilities of two large subsets of the US economy: 2,545 Nasdaq members and 1,680 New York Stock Exchange (NYSE) members. We find that both assets and liabilities follow a Pareto distribution. The finding is not a trivial consequence of the Zipf scaling relationship of firm size quantified by employees—although the market capitalization of Nasdaq stocks follows a Pareto distribution, the same distribution does not describe NYSE stocks. We propose a coupled Simon model that simultaneously evolves both assets and debt with the possibility of bankruptcy, and we also consider the possibility of firm mergers. PMID:20937903
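
    The Bayes step described in this abstract, relating P(R | bankrupt) to P(bankrupt | R), can be sketched with histogram estimates of the two densities. The synthetic data and the prior bankruptcy probability below are illustrative, not the paper's sample of 2,737 bankrupt firms.

```python
# Sketch of the Bayes step: relate the distribution of the debt-to-asset ratio R
# among bankrupt firms to the conditional probability of bankruptcy given R.
# The data are synthetic, not the paper's dataset.
import numpy as np

rng = np.random.default_rng(0)
r_all = rng.lognormal(mean=-1.0, sigma=0.5, size=100_000)      # R for all firms
r_bankrupt = rng.lognormal(mean=-0.3, sigma=0.5, size=2_000)   # assumed higher R
p_bankrupt = 0.02                                              # prior P(bankrupt)

bins = np.linspace(0.0, 3.0, 31)
pdf_all, _ = np.histogram(r_all, bins=bins, density=True)       # P(R)
pdf_bkr, _ = np.histogram(r_bankrupt, bins=bins, density=True)  # P(R | bankrupt)

# Bayes' theorem per bin: P(bankrupt | R) = P(R | bankrupt) * P(bankrupt) / P(R)
with np.errstate(divide="ignore", invalid="ignore"):
    p_bkr_given_r = np.where(pdf_all > 0, pdf_bkr * p_bankrupt / pdf_all, np.nan)

centers = 0.5 * (bins[:-1] + bins[1:])
for c, p in zip(centers[::5], p_bkr_given_r[::5]):
    print(f"R≈{c:.2f}  P(bankrupt|R)≈{p:.3f}")
```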

  17. Environmental risk assessment using the Persian version of the Home Falls and Accidents Screening Tool (HOME FAST) in Iranian elderly

    Directory of Open Access Journals (Sweden)

    Bahareh Maghfouri

    2013-05-01

    Full Text Available Introduction: One of the common problems among older people is falling. Falls inside homes and on streets account for a large share of incidents among Iranian elderly, so efforts to identify environmental hazards at home, together with home modification, can reduce falls and injuries in the elderly. The aim of this study was to identify elderly people at risk of falling using a screening tool (HOME FAST) and to determine the reliability of this tool. Material and Methods: For the reliability study, 60 older persons were selected through the health housing centres of the town councils in five geographical regions of Tehran. Participants were aged 60 to 65 years, and the HOME FAST tool was administered in two stages (inter-rater and test-retest). Results: Test-retest reliability showed agreement above 0.8 for the items, indicating very good reliability, and the relative agreement for each item within the domains ranged from 0.65 to 1, indicating moderate to high reliability. Inter-rater reliability likewise showed agreement above 0.8 for the items, indicating very good reliability, while the relative agreement for each item within the domains ranged from 0.01 to 1, indicating poor to high reliability. Conclusion: This study shows that the reliability of the HOME FAST is high. The findings suggest that the test objectives are appropriate for fall prevention and that the tool shows acceptable reliability, so it can be used as a tool by professionals.

  18. Lord-Wingersky Algorithm Version 2.0 for Hierarchical Item Factor Models with Applications in Test Scoring, Scale Alignment, and Model Fit Testing.

    Science.gov (United States)

    Cai, Li

    2015-06-01

    Lord and Wingersky's (Appl Psychol Meas 8:453-461, 1984) recursive algorithm for creating summed score based likelihoods and posteriors has a proven track record in unidimensional item response theory (IRT) applications. Extending the recursive algorithm to handle multidimensionality is relatively simple, especially with fixed quadrature because the recursions can be defined on a grid formed by direct products of quadrature points. However, the increase in computational burden remains exponential in the number of dimensions, making the implementation of the recursive algorithm cumbersome for truly high-dimensional models. In this paper, a dimension reduction method that is specific to the Lord-Wingersky recursions is developed. This method can take advantage of the restrictions implied by hierarchical item factor models, e.g., the bifactor model, the testlet model, or the two-tier model, such that a version of the Lord-Wingersky recursive algorithm can operate on a dramatically reduced set of quadrature points. For instance, in a bifactor model, the dimension of integration is always equal to 2, regardless of the number of factors. The new algorithm not only provides an effective mechanism to produce summed score to IRT scaled score translation tables properly adjusted for residual dependence, but leads to new applications in test scoring, linking, and model fit checking as well. Simulated and empirical examples are used to illustrate the new applications.
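
    A minimal unidimensional version of the Lord-Wingersky recursion, for dichotomous 2PL items on a fixed normal quadrature, is sketched below; the item parameters are made up, and the hierarchical dimension-reduction extension that the paper develops is not shown.

```python
# Minimal Lord-Wingersky recursion for the summed-score distribution of
# dichotomous items at fixed quadrature points (unidimensional case).
# Item parameters and the N(0,1) quadrature are illustrative, not from the paper.
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def summed_score_likelihood(theta, a, b):
    """L(s | theta) for s = 0..n via the Lord-Wingersky recursion."""
    like = np.array([1.0])                      # score distribution before any item
    for ai, bi in zip(a, b):
        p = p_correct(theta, ai, bi)
        new = np.zeros(like.size + 1)
        new[:-1] += like * (1.0 - p)            # item answered incorrectly
        new[1:]  += like * p                    # item answered correctly
        like = new
    return like

a = np.array([1.2, 0.8, 1.5, 1.0])              # hypothetical discriminations
b = np.array([-0.5, 0.0, 0.5, 1.0])             # hypothetical difficulties
nodes, weights = np.polynomial.hermite_e.hermegauss(21)
weights = weights / weights.sum()               # normalised N(0,1) quadrature
marginal = sum(w * summed_score_likelihood(t, a, b) for t, w in zip(nodes, weights))
print(marginal.round(3), marginal.sum())        # marginal summed-score probabilities
```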

  19. The globalization of risk and risk perception: why we need a new model of risk communication for vaccines.

    Science.gov (United States)

    Larson, Heidi; Brocard Paterson, Pauline; Erondu, Ngozi

    2012-11-01

    Risk communication and vaccines is complex and the nature of risk perception is changing, with perceptions converging, evolving and having impacts well beyond specific geographic localities and points in time, especially when amplified through the Internet and other modes of global communication. This article examines the globalization of risk perceptions and their impacts, including the example of measles and the globalization of measles, mumps and rubella (MMR) vaccine risk perceptions, and calls for a new, more holistic model of risk assessment, risk communication and risk mitigation, embedded in an ongoing process of risk management for vaccines and immunization programmes. It envisions risk communication as an ongoing process that includes trust-building strategies hand-in-hand with operational and policy strategies needed to mitigate and manage vaccine-related risks, as well as perceptions of risk.

  20. SHEDS-Multimedia Model Version 3 (a) Technical Manual; (b) User Guide; and (c) Executable File to Launch SAS Program and Install Model

    Science.gov (United States)

    Reliable models for assessing human exposures are important for understanding health risks from chemicals. The Stochastic Human Exposure and Dose Simulation model for multimedia, multi-route/pathway chemicals (SHEDS-Multimedia), developed by EPA’s Office of Research and Developm...

  1. National Insect and Disease Risk Map (NIDRM)--cutting edge software for rapid insect and disease risk model development

    Science.gov (United States)

    Frank J. Krist

    2010-01-01

    The Forest Health Technology Enterprise Team (FHTET) of the U.S. Forest Service is leading an effort to produce the next version of the National Insect and Disease Risk Map (NIDRM) for targeted release in 2011. The goal of this effort is to update spatial depictions of risk of tree mortality based on: (1) newly derived 240-m geospatial information depicting the...

  2. Modeling issues in nuclear plant fire risk analysis

    International Nuclear Information System (INIS)

    Siu, N.

    1989-01-01

    This paper discusses various issues associated with current models for analyzing the risk due to fires in nuclear power plants. Particular emphasis is placed on the fire growth and suppression models, these being unique to the fire portion of the overall risk analysis. Potentially significant modeling improvements are identified; also discussed are a variety of modeling issues where improvements will help the credibility of the analysis, without necessarily changing the computed risk significantly. The mechanistic modeling of fire initiation is identified as a particularly promising improvement for reducing the uncertainties in the predicted risk. 17 refs., 5 figs. 2 tabs

  3. A Knowledge-Based Model of Audit Risk

    OpenAIRE

    Dhar, Vasant; Lewis, Barry; Peters, James

    1988-01-01

    Within the academic and professional auditing communities, there has been growing concern about how to accurately assess the various risks associated with performing an audit. These risks are difficult to conceptualize in terms of numeric estimates. This article discusses the development of a prototype computational model (computer program) that assesses one of the major audit risks -- inherent risk. This program bases most of its inferencing activities on a qualitative model of a typical bus...

  4. Enhanced leak detection risk model development

    Energy Technology Data Exchange (ETDEWEB)

    Harron, Lorna; Barlow, Rick; Farquhar, Ted [Enbridge Pipelines Inc., Edmonton, Alberta (Canada)

    2010-07-01

    Increasing concerns and attention to pipeline safety have led pipeline companies and regulatory agencies to extend their approaches to pipeline integrity. The implementation of High Consequence Areas (HCAs) has especially had an impact on the development of integrity management protocols (IMPs) for pipelines. These IMPs can require that a risk-based assessment of integrity issues be applied to specific HCA risk factors. This paper addresses the development of an operational risk assessment approach for pipeline leak detection requirements for HCAs. A detailed risk assessment algorithm that includes 25 risk variables and 28 consequence variables was developed for application to all HCA areas. This paper describes the consultative workshop process that was used to develop this algorithm. Included in this description is how the process addressed various methods of leak detection across a wide variety of pipelines. The paper also looks at development challenges and future steps in applying operational risk assessment techniques to mainline leak detection risk management.

  5. Brayton Cycle Numerical Modeling using the RELAP5-3D code, version 4.3.4

    Energy Technology Data Exchange (ETDEWEB)

    Longhini, Eduardo P.; Lobo, Paulo D.C.; Guimarães, Lamartine N.F.; Filho, Francisco A.B.; Ribeiro, Guilherme B., E-mail: edu_longhini@yahoo.com.br [Instituto de Estudos Avançados (IEAv), São José dos Campos, SP (Brazil). Divisão de Energia Nuclear

    2017-07-01

    This work contributes to enabling and developing technologies for building fast micro reactors that generate heat and electric energy, with the purpose of warming and electrically supplying spacecraft equipment and also producing nuclear space propulsion. For this purpose, the Brayton cycle proves to be an excellent approach for space nuclear power. The gas Brayton thermal cycle is a closed cycle with two adiabatic processes and two isobaric processes. The components performing the cycle's processes are the compressor, turbine, heat source, cold source and recuperator. The working fluid's mass flow therefore runs through the thermal cycle, converting thermal energy into electrical energy usable in space and land devices. The objective is to numerically model the gas Brayton thermal cycle at nominal operation with one turbomachine composed of the radial-inflow compressor and turbine of a 40.8 kWe Brayton Rotating Unit (BRU). The Brayton cycle numerical modeling is being performed with the program RELAP5-3D, version 4.3.4. The nominal operation uses as working fluid a mixture of 40 g/mole He-Xe with a flow rate of 1.85 kg/s, a shaft rotational speed of 45 krpm, compressor and turbine inlet temperatures of 400 K and 1149 K, respectively, and a compressor exit pressure of 0.931 MPa. The aim is then to obtain the corresponding physical data for operating each cycle component and the overall cycle at this nominal operating point. (author)

  6. Brayton Cycle Numerical Modeling using the RELAP5-3D code, version 4.3.4

    International Nuclear Information System (INIS)

    Longhini, Eduardo P.; Lobo, Paulo D.C.; Guimarães, Lamartine N.F.; Filho, Francisco A.B.; Ribeiro, Guilherme B.

    2017-01-01

    This work contributes to enabling and developing technologies for building fast micro reactors that generate heat and electric energy, with the purpose of warming and electrically supplying spacecraft equipment and also producing nuclear space propulsion. For this purpose, the Brayton cycle proves to be an excellent approach for space nuclear power. The gas Brayton thermal cycle is a closed cycle with two adiabatic processes and two isobaric processes. The components performing the cycle's processes are the compressor, turbine, heat source, cold source and recuperator. The working fluid's mass flow therefore runs through the thermal cycle, converting thermal energy into electrical energy usable in space and land devices. The objective is to numerically model the gas Brayton thermal cycle at nominal operation with one turbomachine composed of the radial-inflow compressor and turbine of a 40.8 kWe Brayton Rotating Unit (BRU). The Brayton cycle numerical modeling is being performed with the program RELAP5-3D, version 4.3.4. The nominal operation uses as working fluid a mixture of 40 g/mole He-Xe with a flow rate of 1.85 kg/s, a shaft rotational speed of 45 krpm, compressor and turbine inlet temperatures of 400 K and 1149 K, respectively, and a compressor exit pressure of 0.931 MPa. The aim is then to obtain the corresponding physical data for operating each cycle component and the overall cycle at this nominal operating point. (author)
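
    A back-of-the-envelope ideal-gas estimate of the cycle temperatures can be made from the stated inlet conditions, since a 40 g/mole He-Xe mixture behaves as a monatomic ideal gas (cp = (5/2)R/M ≈ 520 J kg⁻¹ K⁻¹, γ = 5/3). The pressure ratio and component efficiencies below are assumed values for illustration only; this is not the RELAP5-3D model of the BRU.

```python
# Back-of-the-envelope closed Brayton cycle estimate for a 40 g/mol He-Xe
# working fluid using the stated inlet temperatures (400 K compressor,
# 1149 K turbine). The pressure ratio and component efficiencies are assumed.
R_UNIV = 8.314          # J/(mol K)
M = 0.040               # kg/mol (He-Xe mixture)
cp = 2.5 * R_UNIV / M   # monatomic ideal gas: cp = (5/2) R / M  [J/(kg K)]
gamma = 5.0 / 3.0

t1, t3 = 400.0, 1149.0  # compressor inlet, turbine inlet [K]
pr = 1.9                # assumed compressor pressure ratio
eta_c, eta_t = 0.80, 0.88   # assumed isentropic efficiencies
mdot = 1.85             # kg/s, from the record

t2s = t1 * pr ** ((gamma - 1.0) / gamma)        # ideal compressor outlet
t2 = t1 + (t2s - t1) / eta_c                    # actual compressor outlet
t4s = t3 * pr ** (-(gamma - 1.0) / gamma)       # ideal turbine outlet
t4 = t3 - eta_t * (t3 - t4s)                    # actual turbine outlet

w_net = mdot * cp * ((t3 - t4) - (t2 - t1))     # net shaft power before losses [W]
print(round(cp, 1), round(t2, 1), round(t4, 1),
      round(w_net / 1e3, 1), "kW shaft (gross, illustrative)")
```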

  7. Modeling the structure of the attitudes and belief scale 2 using CFA and bifactor approaches: Toward the development of an abbreviated version.

    Science.gov (United States)

    Hyland, Philip; Shevlin, Mark; Adamson, Gary; Boduszek, Daniel

    2014-01-01

    The Attitudes and Belief Scale-2 (ABS-2: DiGiuseppe, Leaf, Exner, & Robin, 1988. The development of a measure of rational/irrational thinking. Paper presented at the World Congress of Behavior Therapy, Edinburgh, Scotland.) is a 72-item self-report measure of evaluative rational and irrational beliefs widely used in Rational Emotive Behavior Therapy research contexts. However, little psychometric evidence exists regarding the measure's underlying factor structure. Furthermore, given the length of the ABS-2, there is a need for an abbreviated version that can be administered when there are time demands on the researcher, such as in clinical settings. This study sought to examine a series of theoretical models hypothesized to represent the latent structure of the ABS-2 within an alternative models framework, using traditional confirmatory factor analysis as well as a bifactor modeling approach. Furthermore, this study also sought to develop a psychometrically sound abbreviated version of the ABS-2. Three hundred and thirteen (N = 313) active emergency service personnel completed the ABS-2. Results indicated that for each model, the application of bifactor modeling procedures improved model fit statistics, and a novel eight-factor intercorrelated solution was identified as the best fitting model of the ABS-2. However, the observed fit indices failed to satisfy commonly accepted standards. A 24-item abbreviated version was thus constructed, and an intercorrelated eight-factor solution yielded satisfactory model fit statistics. Current results support the use of a bifactor modeling approach to determining the factor structure of the ABS-2. Furthermore, results provide empirical support for the psychometric properties of the newly developed abbreviated version.

  8. Adequacy of relative and absolute risk models for lifetime risk estimate of radiation-induced cancer

    International Nuclear Information System (INIS)

    McBride, M.; Coldman, A.J.

    1988-03-01

    This report examines the applicability of the relative (multiplicative) and absolute (additive) models in predicting lifetime risk of radiation-induced cancer. A review of the epidemiologic literature, and a discussion of the mathematical models of carcinogenesis and their relationship to these models of lifetime risk, are included. Based on the available data, the relative risk model for the estimation of lifetime risk is preferred for non-sex-specific epithelial tumours. However, because of lack of knowledge concerning other determinants of radiation risk and of background incidence rates, considerable uncertainty in modelling lifetime risk still exists. Therefore, it is essential that follow-up of exposed cohorts be continued so that population-based estimates of lifetime risk are available

  9. Sigmoidal response model for radiation risk

    International Nuclear Information System (INIS)

    Kondo, Sohei

    1995-01-01

    From epidemiologic studies, we find no measurable increase in the incidences of birth defects and cancer after low-level exposure to radiation. Based on modern understanding of the molecular basis of teratogenesis and cancer, I attempt to explain thresholds observed in atomic bomb survivors, radium painters, uranium workers and patients injected with Thorotrast. Teratogenic injury induced by doses below threshold will be completely eliminated as a result of altruistic death (apoptosis) of injured cells. Various lines of evidence obtained show that oncomutations produced in cancerous cells after exposure to radiation are of spontaneous origin and that ionizing radiation acts not as an oncomutation inducer but as a tumor promoter by induction of chronic wound-healing activity. The tissue damage induced by radiation has to be repaired by cell growth and this creates opportunity for clonal expansion of a spontaneously occurring preneoplastic cell. If the wound-healing error model is correct, there must be a threshold dose range of radiation giving no increase in cancer risk. (author)

  10. A comparative review of radiation-induced cancer risk models

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Hee; Kim, Ju Youl [FNC Technology Co., Ltd., Yongin (Korea, Republic of); Han, Seok Jung [Risk and Environmental Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-06-15

    With the need for a domestic level 3 probabilistic safety assessment (PSA), it is essential to develop a Korea-specific code. Health effect assessments study radiation-induced impacts; in particular, long-term health effects are evaluated in terms of cancer risk. The objective of this study was to analyze the latest cancer risk models developed by foreign organizations and to compare the methodology of how they were developed. This paper also provides suggestions regarding the development of Korean cancer risk models. A review of cancer risk models was carried out targeting the latest models: the NUREG model (1993), the BEIR VII model (2006), the UNSCEAR model (2006), the ICRP 103 model (2007), and the U.S. EPA model (2011). The methodology of how each model was developed is explained, and the cancer sites, dose and dose rate effectiveness factor (DDREF) and mathematical models are also described in the sections presenting differences among the models. The NUREG model was developed by assuming that the risk was proportional to the risk coefficient and dose, while the BEIR VII, UNSCEAR, ICRP, and U.S. EPA models were derived from epidemiological data, principally from Japanese atomic bomb survivors. The risk coefficient does not consider individual characteristics, as the values were calculated in terms of population-averaged cancer risk per unit dose. However, the models derived by epidemiological data are a function of sex, exposure age, and attained age of the exposed individual. Moreover, the methodologies can be used to apply the latest epidemiological data. Therefore, methodologies using epidemiological data should be considered first for developing a Korean cancer risk model, and the cancer sites and DDREF should also be determined based on Korea-specific studies. This review can be used as a basis for developing a Korean cancer risk model in the future.

  11. User's guide to the MESOI diffusion model: Version 1. 1 (for Data General Eclipse S/230 with AFOS)

    Energy Technology Data Exchange (ETDEWEB)

    Athey, G.F.; Ramsdell, J.V.

    1982-09-01

    MESOI is an interactive, Lagrangian puff trajectory model. The model theory is documented separately (Ramsdell and Athey, 1981). Version 1.1 is a modified form of the original 1.0. It is designed to run on a Data General Eclipse computer. The model has improved support features which make it useful as an emergency response tool. This report is intended to provide the user with the information necessary to successfully conduct model simulations using MESOI Version 1.1 and to use the support programs STAPREP and EXPLT. The user is also provided with information on the use of the data file maintenance and review program UPDATE. Examples are given for the operation of the program. Test data sets are described which allow the user to practice with the programs and to confirm proper implementation and execution.

  12. Advances in Disaster Modeling, Simulation and Visualization for Sandstorm Risk Management in North China

    Directory of Open Access Journals (Sweden)

    Hang Lei

    2012-05-01

    Full Text Available Dust storms in North China result in high concentrations of airborne dust particles, which cause detrimental effects on human health as well as social and economic losses and environmental degradation. To investigate the impact of land surface processes on dust storms, we simulate two dust storm events in North China during spring 2002 using two versions of a dust storm prediction system developed by the Institute for Atmospheric Physics (IAP) in Beijing, China. The primary difference between the IAP Sandstorm Prediction System (IAPS 1.0) and the more recent version (IAPS 2.0) is the land surface modeling. IAPS 1.0 is based on the Oregon State University (OSU) land surface model, whereas the latest version of the dust storm prediction system (IAPS 2.0) uses NOAH land surface schemes for land surface modeling within a meteorological model, MM5. This work investigates whether the improved land surface modeling affects the modeling of sandstorms. It is shown that an integrated sandstorm management system can be used to aid the following tasks: ensure sandstorm monitoring and warning; incorporate weather forecasts; ascertain the risk of a sandstorm disaster; integrate multiple technologies (for example, GIS, remote sensing, and information processing technology); track the progress of the storm in real-time; exhibit flexibility, accuracy and reliability (by using multiple sources of data, including in-situ meteorological observations); and monitor PM10 and PM2.5 dust concentrations in airborne dustfalls. The results indicate that with the new land surface scheme, the simulation of soil moisture is greatly improved, leading to a better estimate of the threshold frictional velocity, a key parameter for estimating surface dust emissions. In this study, we also discuss specific mechanisms by which land surface processes affect dust storm modeling and make recommendations for further improvements to numerical dust storm simulations.

  13. Probabilistic Model for Integrated Assessment of the Behavior at the T.D.P. Version 2; Modelo Probabilista de Evaluación Integrada del Comportamiento de la P.D.T. Versión 2

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, A.; Eguilior, S.; Recreo, F

    2015-07-01

    This report documents the completion of the first phase of the implementation of the ABACO2G methodology (Bayes Application to Geological Storage of CO2) and the final version of the ABACO2G probabilistic model for the injection phase, ahead of its future validation in the experimental field of the Technology Development Plant in Hontomín (Burgos). The model, which determines the probabilistic risk component of a geological CO2 storage using the formalism of Bayesian networks and Monte Carlo sampling, yields quantitative probability functions for the total CO2 storage system and for each of its subsystems (the storage subsystem and primary seal, the secondary containment subsystem, and the tertiary dispersion subsystem). It comprises the implementation of the stochastic time evolution of the CO2 plume during the injection period, the stochastic time evolution of the drying front, the probabilistic evolution of the pressure front, decoupled from the CO2 plume progress front, and the implementation of submodels and leakage probability functions through the major leakage risk elements (fractures/faults and wells/deep boreholes), which together define the space of events for estimating the risks associated with the CO2 geological storage system. The activities covered in this report replaced the qualitative estimation submodels of the former ABACO2G version, developed during Phase I of project ALM-10-017, with analytical, semi-analytical or numerical submodels for the main risk elements (wells and fractures), in order to obtain an integrated probabilistic model of a CO2 storage complex in carbonate formations that meets the needs of the integrated behavior evaluation of the Technology Development Plant in Hontomín.

  14. Tutorial in biostatistics: competing risks and multi-state models

    NARCIS (Netherlands)

    Putter, H.; Fiocco, M.; Geskus, R. B.

    2007-01-01

    Standard survival data measure the time span from some time origin until the occurrence of one type of event. If several types of events occur, a model describing progression to each of these competing risks is needed. Multi-state models generalize competing risks models by also describing

  15. Temperature and Humidity Profiles in the TqJoint Data Group of AIRS Version 6 Product for the Climate Model Evaluation

    Science.gov (United States)

    Ding, Feng; Fang, Fan; Hearty, Thomas J.; Theobald, Michael; Vollmer, Bruce; Lynnes, Christopher

    2014-01-01

    The Atmospheric Infrared Sounder (AIRS) mission is entering its 13th year of global observations of the atmospheric state, including temperature and humidity profiles, outgoing long-wave radiation, cloud properties, and trace gases. Thus AIRS data have been widely used, among other things, for short-term climate research and as an observational component for model evaluation. One instance is the fifth phase of the Coupled Model Intercomparison Project (CMIP5), which uses AIRS version 5 data in the climate model evaluation. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) is the home of processing, archiving, and distribution services for data from the AIRS mission. The GES DISC, in collaboration with the AIRS Project, released data from the version 6 algorithm in early 2013. The new algorithm represents a significant improvement over previous versions in terms of greater stability, yield, and quality of products. The ongoing Earth System Grid for next generation climate model research project, a collaborative effort of GES DISC and NASA JPL, will bring temperature and humidity profiles from AIRS version 6. The AIRS version 6 product adds a new "TqJoint" data group, which contains data for a common set of observations across water vapor and temperature at all atmospheric levels and is suitable for climate process studies. How different may the monthly temperature and humidity profiles in the "TqJoint" group be from those in the "Standard" group, where temperature and water vapor are not always valid at the same time? This study aims to answer the question by comprehensively comparing the temperature and humidity profiles from the "TqJoint" group and the "Standard" group. The comparison includes mean differences at different levels globally and over land and ocean. We are also working on examining the sampling differences between the "TqJoint" and "Standard" groups using MERRA data.

  16. Statistical model of fractures and deformations zones for Forsmark. Preliminary site description Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, Paul R. [Golder Associate Inc., Redmond, WA (United States); Olofsson, Isabelle; Hermanson, Jan [Golder Associates AB, Uppsala (Sweden)

    2005-04-01

    Compared to version 1.1, a much larger amount of data especially from boreholes is available. Both one-hole interpretation and Boremap indicate the presence of high and low fracture intensity intervals in the rock mass. The depth and width of these intervals varies from borehole to borehole but these constant fracture intensity intervals are contiguous and present quite sharp transitions. There is not a consistent pattern of intervals of high fracture intensity at or near to the surface. In many cases, the intervals of highest fracture intensity are considerably below the surface. While some fractures may have occurred or been reactivated in response to surficial stress relief, surficial stress relief does not appear to be a significant explanatory variable for the observed variations in fracture intensity. Data from the high fracture intensity intervals were extracted and statistical analyses were conducted in order to identify common geological factors. Stereoplots of fracture orientation versus depth for the different fracture intensity intervals were also produced for each borehole. Moreover percussion borehole data were analysed in order to identify the persistence of these intervals throughout the model volume. The main conclusions of these analyses are the following: The fracture intensity is conditioned by the rock domain, but inside a rock domain intervals of high and low fracture intensity are identified. The intervals of high fracture intensity almost always correspond to intervals with distinct fracture orientations (whether a set, most often the NW sub-vertical set, is highly dominant, or some orientation sets are missing). These high fracture intensity intervals are positively correlated to the presence of first and second generation minerals (epidote, calcite). No clear correlation for these fracture intensity intervals has been identified between holes. Based on these results the fracture frequency has been calculated in each rock domain for the

  17. Statistical model of fractures and deformations zones for Forsmark. Preliminary site description Forsmark area - version 1.2

    International Nuclear Information System (INIS)

    La Pointe, Paul R.; Olofsson, Isabelle; Hermanson, Jan

    2005-04-01

    Compared to version 1.1, a much larger amount of data especially from boreholes is available. Both one-hole interpretation and Boremap indicate the presence of high and low fracture intensity intervals in the rock mass. The depth and width of these intervals varies from borehole to borehole but these constant fracture intensity intervals are contiguous and present quite sharp transitions. There is not a consistent pattern of intervals of high fracture intensity at or near to the surface. In many cases, the intervals of highest fracture intensity are considerably below the surface. While some fractures may have occurred or been reactivated in response to surficial stress relief, surficial stress relief does not appear to be a significant explanatory variable for the observed variations in fracture intensity. Data from the high fracture intensity intervals were extracted and statistical analyses were conducted in order to identify common geological factors. Stereoplots of fracture orientation versus depth for the different fracture intensity intervals were also produced for each borehole. Moreover percussion borehole data were analysed in order to identify the persistence of these intervals throughout the model volume. The main conclusions of these analyses are the following: The fracture intensity is conditioned by the rock domain, but inside a rock domain intervals of high and low fracture intensity are identified. The intervals of high fracture intensity almost always correspond to intervals with distinct fracture orientations (whether a set, most often the NW sub-vertical set, is highly dominant, or some orientation sets are missing). These high fracture intensity intervals are positively correlated to the presence of first and second generation minerals (epidote, calcite). No clear correlation for these fracture intensity intervals has been identified between holes. Based on these results the fracture frequency has been calculated in each rock domain for the

  18. Conceptual Model of Offshore Wind Environmental Risk Evaluation System

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Richard M.; Copping, Andrea E.; Van Cleve, Frances B.; Unwin, Stephen D.; Hamilton, Erin L.

    2010-06-01

    In this report we describe the development of the Environmental Risk Evaluation System (ERES), a risk-informed analytical process for estimating the environmental risks associated with the construction and operation of offshore wind energy generation projects. The development of ERES for offshore wind is closely allied to a concurrent process undertaken to examine environmental effects of marine and hydrokinetic (MHK) energy generation, although specific risk-relevant attributes will differ between the MHK and offshore wind domains. During FY10, a conceptual design of ERES for offshore wind will be developed. The offshore wind ERES mockup described in this report will provide a preview of the functionality of a fully developed risk evaluation system that will use risk assessment techniques to determine priority stressors on aquatic organisms and environments from specific technology aspects, identify key uncertainties underlying high-risk issues, compile a wide-range of data types in an innovative and flexible data organizing scheme, and inform planning and decision processes with a transparent and technically robust decision-support tool. A fully functional version of ERES for offshore wind will be developed in a subsequent phase of the project.

  19. Enigma Version 12

    Science.gov (United States)

    Shores, David; Goza, Sharon P.; McKeegan, Cheyenne; Easley, Rick; Way, Janet; Everett, Shonn; Guerra, Mark; Kraesig, Ray; Leu, William

    2013-01-01

    Enigma Version 12 software combines model building, animation, and engineering visualization into one concise software package. Enigma employs a versatile user interface to allow average users access to even the most complex pieces of the application. Using Enigma eliminates the need to buy and learn several software packages to create an engineering visualization. Models can be created and/or modified within Enigma down to the polygon level. Textures and materials can be applied for additional realism. Within Enigma, these models can be combined to create systems of models that have a hierarchical relationship to one another, such as a robotic arm. Then these systems can be animated within the program or controlled by an external application programming interface (API). In addition, Enigma provides the ability to use plug-ins. Plug-ins allow the user to create custom code for a specific application and access the Enigma model and system data, but still use the Enigma drawing functionality. CAD files can be imported into Enigma and combined to create systems of computer graphics models that can be manipulated with constraints. An API is available so that an engineer can write a simulation and drive the computer graphics models with no knowledge of computer graphics. An animation editor allows an engineer to set up sequences of animations generated by simulations or by conceptual trajectories in order to record these to high-quality media for presentation.

  20. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice

  1. Rock mechanics site descriptive model-theoretical approach. Preliminary site description Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Fredriksson, Anders; Olofsson, Isabelle [Golder Associates AB, Uppsala (Sweden)

    2005-12-15

    The present report summarises the theoretical approach to estimate the mechanical properties of the rock mass in relation to the Preliminary Site Descriptive Modelling, version 1.2 Forsmark. The theoretical approach is based on a discrete fracture network (DFN) description of the fracture system in the rock mass and on the results of mechanical testing of intact rock and on rock fractures. To estimate the mechanical properties of the rock mass a load test on a rock block with fractures is simulated with the numerical code 3DEC. The location and size of the fractures are given by DFN-realisations. The rock block was loaded under plane strain conditions. From the calculated relationship between stresses and deformations the mechanical properties of the rock mass were determined. The influence of the geometrical properties of the fracture system on the mechanical properties of the rock mass was analysed by loading 20 blocks based on different DFN-realisations. The material properties of the intact rock and the fractures were kept constant. The properties are set equal to the mean value of each measured material property. The influence of the variation of the properties of the intact rock and of the mechanical properties of the fractures is estimated by analysing numerical load tests on one specific block (one DFN-realisation) with combinations of properties for intact rock and fractures. Each parameter varies from its lowest value to its highest value while the rest of the parameters are held constant, equal to the mean value. The resulting distribution was expressed as a variation around the value determined with mean values on all parameters. To estimate the resulting distribution of the mechanical properties of the rock mass a Monte-Carlo simulation was performed by generating values from the two distributions independently of each other. The two values were added and the statistical properties of the resulting distribution were determined.

  2. Rock mechanics site descriptive model-theoretical approach. Preliminary site description Forsmark area - version 1.2

    International Nuclear Information System (INIS)

    Fredriksson, Anders; Olofsson, Isabelle

    2005-12-01

    The present report summarises the theoretical approach to estimate the mechanical properties of the rock mass in relation to the Preliminary Site Descriptive Modelling, version 1.2 Forsmark. The theoretical approach is based on a discrete fracture network (DFN) description of the fracture system in the rock mass and on the results of mechanical testing of intact rock and on rock fractures. To estimate the mechanical properties of the rock mass a load test on a rock block with fractures is simulated with the numerical code 3DEC. The location and size of the fractures are given by DFN-realisations. The rock block was loaded under plane strain conditions. From the calculated relationship between stresses and deformations the mechanical properties of the rock mass were determined. The influence of the geometrical properties of the fracture system on the mechanical properties of the rock mass was analysed by loading 20 blocks based on different DFN-realisations. The material properties of the intact rock and the fractures were kept constant. The properties are set equal to the mean value of each measured material property. The influence of the variation of the properties of the intact rock and of the mechanical properties of the fractures is estimated by analysing numerical load tests on one specific block (one DFN-realisation) with combinations of properties for intact rock and fractures. Each parameter varies from its lowest value to its highest value while the rest of the parameters are held constant, equal to the mean value. The resulting distribution was expressed as a variation around the value determined with mean values on all parameters. To estimate the resulting distribution of the mechanical properties of the rock mass a Monte-Carlo simulation was performed by generating values from the two distributions independently of each other. The two values were added and the statistical properties of the resulting distribution were determined.
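
    The Monte Carlo step described above (adding independent draws from the geometry-related and property-related distributions and summarising the result) is generic enough to sketch directly; the normal distributions and the mean value used below are placeholders, not the Forsmark data.

```python
# Generic sketch of the Monte Carlo step described above: draw independently
# from the two variation sources, add the samples, and summarise the result.
# Distributions and parameters are placeholders, not the Forsmark site data.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
geometry_variation = rng.normal(loc=0.0, scale=2.0, size=n)   # DFN-realisation spread
property_variation = rng.normal(loc=0.0, scale=1.5, size=n)   # intact rock / fracture spread
mean_value = 55.0                                              # e.g. a rock-mass modulus [GPa]

rock_mass_property = mean_value + geometry_variation + property_variation
print(rock_mass_property.mean(), rock_mass_property.std(),
      np.percentile(rock_mass_property, [5, 50, 95]))
```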

  3. Assessment of radionuclide databases in CAP88 mainframe version 1.0 and Windows-based version 3.0.

    Science.gov (United States)

    LaBone, Elizabeth D; Farfán, Eduardo B; Lee, Patricia L; Jannik, G Timothy; Donnelly, Elizabeth H; Foley, Trevor Q

    2009-09-01

    In this study the radionuclide databases for two versions of the Clean Air Act Assessment Package-1988 (CAP88) computer model were assessed in detail. CAP88 estimates radiation dose and the risk of health effects to human populations from radionuclide emissions to air. This program is used by several U.S. Department of Energy (DOE) facilities to comply with National Emission Standards for Hazardous Air Pollutants regulations. CAP88 Mainframe, referred to as version 1.0 on the U.S. Environmental Protection Agency Web site (http://www.epa.gov/radiation/assessment/CAP88/), was the very first CAP88 version released in 1988. Some DOE facilities, including the Savannah River Site, still employ this version (1.0), while others use the more user-friendly personal computer Windows-based version 3.0 released in December 2007. Version 1.0 uses the program RADRISK, based on International Commission on Radiological Protection Publication 30, as its radionuclide database. Version 3.0 uses half-life, dose, and risk factor values based on Federal Guidance Report 13. Differences in these values could cause different results for the same input exposure data (same scenario), depending on which version of CAP88 is used. Consequently, the differences between the two versions are being assessed in detail at Savannah River National Laboratory. The version 1.0 and 3.0 database files contain 496 and 838 radionuclides, respectively, and though one would expect the newer version to include all of the 496 radionuclides, 35 radionuclides are listed in version 1.0 that are not included in version 3.0. The majority of these have either extremely short or long half-lives or are no longer in production; however, some of the short-lived radionuclides might produce progeny of great interest at DOE sites. In addition, 122 radionuclides were found to have different half-lives in the two versions, with 21 differing by more than 3 percent and 12 by more than 10 percent.
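
    A comparison of this kind reduces to set differences on the radionuclide lists and percent differences on the half-lives. The sketch below uses small stand-in dictionaries in place of the parsed CAP88 database files; the entries shown are illustrative.

```python
# Sketch of the kind of database comparison described above: radionuclides
# present in one CAP88 version but not the other, and percent differences in
# half-life. The dictionaries stand in for the parsed database files.
v1_half_life = {"H-3": 3.889e8, "Co-60": 1.663e8, "Xe-135": 3.29e4}   # seconds
v3_half_life = {"H-3": 3.891e8, "Co-60": 1.663e8, "Cs-137": 9.49e8}

only_in_v1 = sorted(set(v1_half_life) - set(v3_half_life))
only_in_v3 = sorted(set(v3_half_life) - set(v1_half_life))
print("in v1.0 only:", only_in_v1, "| in v3.0 only:", only_in_v3)

for nuclide in sorted(set(v1_half_life) & set(v3_half_life)):
    t1, t3 = v1_half_life[nuclide], v3_half_life[nuclide]
    pct = 100.0 * abs(t1 - t3) / t1
    if pct > 3.0:                      # the 3% screening threshold used above
        print(f"{nuclide}: {pct:.1f}% difference in half-life")
```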

  4. Infrastructure Upgrades to Support Model Longevity and New Applications: The Variable Infiltration Capacity Model Version 5.0 (VIC 5.0)

    Science.gov (United States)

    Nijssen, B.; Hamman, J.; Bohn, T. J.

    2015-12-01

    The Variable Infiltration Capacity (VIC) model is a macro-scale semi-distributed hydrologic model. VIC development began in the early 1990s and it has been used extensively, applied from basin to global scales. VIC has been applied in many use cases, including the construction of hydrologic data sets, trend analysis, data evaluation and assimilation, forecasting, coupled climate modeling, and climate change impact analysis. Ongoing applications of the VIC model include the University of Washington's drought monitor and forecast systems, and NASA's land data assimilation systems. The development of VIC version 5.0 focused on reconfiguring the legacy VIC source code to support a wider range of modern modeling applications. The VIC source code has been moved to a public Github repository to encourage participation by the model development community-at-large. The reconfiguration has separated the physical core of the model from the driver, which is responsible for memory allocation, pre- and post-processing, and I/O. VIC 5.0 includes four drivers that use the same physical model core: classic, image, CESM, and Python. The classic driver supports legacy VIC configurations and runs in the traditional time-before-space configuration. The image driver includes a space-before-time configuration, netCDF I/O, and uses MPI for parallel processing. This configuration facilitates the direct coupling of streamflow routing, reservoir, and irrigation processes within VIC. The image driver is the foundation of the CESM driver, which couples VIC to CESM's CPL7 and a prognostic atmosphere. Finally, we have added a Python driver that provides access to the functions and datatypes of VIC's physical core from a Python interface. This presentation demonstrates how reconfiguring legacy source code extends the life and applicability of a research model.

  5. Risk communication: a mental models approach

    National Research Council Canada - National Science Library

    Morgan, M. Granger (Millett Granger)

    2002-01-01

    ... information about risks. The procedure uses approaches from risk and decision analysis to identify the most relevant information; it also uses approaches from psychology and communication theory to ensure that its message is understood. This book is written in nontechnical terms, designed to make the approach feasible for anyone willing to try it. It is illustrat...

  6. Sensitivity of a modified version of the 'timed get up and go' test to predict fall risk in the elderly: a pilot study.

    Science.gov (United States)

    Giné-Garriga, Maria; Guerra, Míriam; Marí-Dell'Olmo, Marc; Martin, Carme; Unnithan, Viswanath B

    2009-01-01

    The purpose of this study was to assess the sensitivity of a modified version of the 'Timed Get Up and Go' (TGUG) test in predicting fall risk in elderly individuals, using both a quantitative and a qualitative approach in individuals older than 65 years. Ten subjects (83.4+/-4.5 years) undertook the test twice. To assess inter-rater reliability, three investigators timed the two trials using a stopwatch (quantitative). The reproducibility of a qualitative evaluation of the trials was assessed through the completion of an assessment questionnaire (AQ) at each trial by the three investigators. To assess the agreement between the three investigators, the coefficients of reliability (CR), intra-class correlation coefficients (ICC) and limits of agreement were determined for the total time to complete the test (TT). Cohen's weighted kappa and the ICC were calculated for the AQ. For the inter-group comparison, 60 subjects (74.2+/-4.9 years) were divided equally into four groups: (1) sedentary with a previous history of falls, (2) sedentary without a history of falls, (3) active with a history of falls, and (4) active without a history of falls. All of them undertook the modified TGUG test once, with one investigator doing the timing and completing the AQ. CR values for the TT were above 98%, with an ICC of 0.999. The differences in TT between the three investigators' measures ranged from 0.19 to 0.55 s (standard deviation of the mean difference). Cohen's weighted kappa ranged from 0.835 to 0.976, with an ICC of 0.954 for the AQ. In the inter-group comparison, significant differences were found between the groups. The modified TGUG test therefore appears sensitive in predicting fall risk in elderly individuals, with good inter-tester reliability from both a quantitative and a qualitative perspective.
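
    The agreement statistics mentioned above can be reproduced with standard tools. The sketch below computes Bland-Altman limits of agreement for total times from two raters and a linearly weighted Cohen's kappa for the questionnaire scores; the measurements are invented stand-ins, not the study data.

        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        # Invented total times (seconds) recorded by two raters for the same trials.
        tt_rater1 = np.array([12.3, 15.1, 11.8, 18.4, 14.0])
        tt_rater2 = np.array([12.5, 15.0, 12.1, 18.2, 14.3])

        diff = tt_rater1 - tt_rater2
        loa_low = diff.mean() - 1.96 * diff.std(ddof=1)   # lower limit of agreement
        loa_high = diff.mean() + 1.96 * diff.std(ddof=1)  # upper limit of agreement
        print(f"limits of agreement: [{loa_low:.2f}, {loa_high:.2f}] s")

        # Invented ordinal questionnaire (AQ) scores from the two raters.
        aq_rater1 = [0, 1, 2, 1, 3]
        aq_rater2 = [0, 1, 1, 1, 3]
        print("weighted kappa:", cohen_kappa_score(aq_rater1, aq_rater2, weights="linear"))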

  7. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds of the probability of ruin for the classical risk process extended with a constant interest
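
    For readers unfamiliar with the classical risk process referred to above, the following Python sketch estimates a finite-horizon ruin probability by Monte Carlo for a compound-Poisson surplus process with constant premium income; the parameters are illustrative, and the interest-rate extension studied in the thesis is not included.

        import numpy as np

        rng = np.random.default_rng(0)

        u0, premium_rate = 10.0, 1.2        # initial capital and premium income per unit time
        claim_rate, mean_claim = 1.0, 1.0   # Poisson claim intensity, exponential claim-size mean
        horizon, n_paths = 100.0, 20_000    # finite time horizon and number of simulated paths

        ruined = 0
        for _ in range(n_paths):
            t, claims = 0.0, 0.0
            while True:
                t += rng.exponential(1.0 / claim_rate)    # waiting time to the next claim
                if t > horizon:
                    break
                claims += rng.exponential(mean_claim)
                if u0 + premium_rate * t - claims < 0.0:  # surplus drops below zero: ruin
                    ruined += 1
                    break

        print("estimated finite-horizon ruin probability:", ruined / n_paths)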

  8. Performance of advanced self-shielding models in DRAGON Version4 on analysis of a high conversion light water reactor lattice

    International Nuclear Information System (INIS)

    Karthikeyan, Ramamoorthy; Hebert, Alain

    2008-01-01

    A high conversion light water reactor lattice has been analysed using the code DRAGON Version4. This analysis was performed to test the performance of the advanced self-shielding models incorporated in DRAGON Version4. The self-shielding models are broadly classified into two groups: 'equivalence in dilution' and 'subgroup approach'. Under the 'equivalence in dilution' approach we have analysed the generalized Stamm'ler model, with and without the Nordheim model and Riemann integration. These models were also analysed using the Livolant-Jeanpierre normalization. Under the 'subgroup approach', we have analysed the statistical self-shielding model based on physical probability tables and the Ribon extended self-shielding model based on mathematical probability tables. This analysis will help in understanding the performance of advanced self-shielding models for a lattice that is tight and has a large fraction of fissions occurring in the resonance region. The nuclear data for the analysis were generated in-house. NJOY99.90 was used to generate libraries in DRAGLIB format for analysis with DRAGON and in ACE (A Compact ENDF) format for analysis with MCNP5. The evaluated data files were chosen based on the recommendations of the IAEA Co-ordinated Research Project on the WIMS Library Update Project. The reference solution for the problem was obtained using the Monte Carlo code MCNP5. It was found that the Ribon extended self-shielding model based on mathematical probability tables with the correlation model performed better than all the other models.

  9. Operational risk quantification and modelling within Romanian insurance industry

    Directory of Open Access Journals (Sweden)

    Tudor Răzvan

    2017-07-01

    This paper aims at covering and describing the shortcomings of various models used to quantify and model operational risk within the insurance industry, with a particular focus on a Romanian-specific regulation: Norm 6/2015 concerning the operational risk arising from IT systems. While most of the local insurers are focusing on implementing the standard model to compute the operational risk solvency capital requirement, the local regulator has issued a norm that requires IT-based operational risks to be identified and assessed from an ISO 27001 perspective. The challenges raised by the correlations assumed in the standard model are substantially increased by this new regulation, which requires only the identification and quantification of the IT operational risks. Solvency II does not recommend a model or formula for integrating the newly identified risks into the operational risk capital requirement. In this context we assess the academic and practitioner understanding of the frequency-severity approach, Bayesian estimation techniques, scenario analysis, and risk accounting based on risk units, and how they could support the modelling of operational risks that are IT based. Developing an internal model only for the operational risk capital requirement has so far proved costly and not necessarily beneficial for the local insurers. As the IT component will play a key role in the future of the insurance industry, the result of this analysis provides a specific approach to operational risk modelling that can be implemented in the context of Solvency II, in the particular situation when (internal or external) operational risk databases are scarce or not available.
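
    The frequency-severity approach mentioned above is commonly implemented as a Monte Carlo aggregation of an annual loss distribution. The sketch below uses a Poisson frequency and a lognormal severity with purely illustrative parameters, and reads off the 99.5th percentile as a Solvency-II-style capital proxy.

        import numpy as np

        rng = np.random.default_rng(1)

        lam = 4.0                 # expected number of IT operational-risk events per year (illustrative)
        mu, sigma = 10.0, 1.5     # lognormal severity parameters (illustrative)
        n_years = 100_000         # number of simulated years

        # For each simulated year: draw a Poisson event count, then sum lognormal severities.
        annual_losses = np.array([
            rng.lognormal(mu, sigma, rng.poisson(lam)).sum()
            for _ in range(n_years)
        ])

        print(f"expected annual loss:      {annual_losses.mean():,.0f}")
        print(f"99.5% VaR (capital proxy): {np.percentile(annual_losses, 99.5):,.0f}")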

  10. Risk Modeling Approaches in Terms of Volatility Banking Transactions

    Directory of Open Access Journals (Sweden)

    Angelica Cucşa (Stratulat)

    2016-01-01

    The inseparability of risk and banking activity has been demonstrated ever since banking systems emerged, and the topic is equally important for current practice and for the future development of the banking sector. The banking sector develops under constraints set by the nature and number of existing risks and of those that may arise, which act to limit banking activity. We intend to develop approaches to analysing risk through mathematical models, including a model for ten actively traded picks on the Romanian capital market that will test investor reaction under controlled and uncontrolled risk conditions aggregated with harmonised factors.
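
    As one simple building block for the kind of volatility-based risk analysis described above, the sketch below estimates an exponentially weighted moving-average (EWMA, RiskMetrics-style) volatility from a return series; the returns are randomly generated stand-ins, not data for the ten Romanian picks.

        import numpy as np

        rng = np.random.default_rng(7)
        returns = rng.normal(0.0, 0.02, size=250)    # stand-in daily returns (~250 trading days)

        lam = 0.94                                   # RiskMetrics decay factor
        var = returns[:30].var()                     # seed the recursion with an initial variance
        for r in returns[30:]:
            var = lam * var + (1.0 - lam) * r * r    # EWMA variance update

        print(f"annualised EWMA volatility: {np.sqrt(var * 252):.1%}")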

  11. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    Science.gov (United States)

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
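
    To make the contrast between the first two decision rules concrete, the sketch below evaluates a household's choice of whether to invest in a loss-reducing measure under expected value and under a simple prospect-theory valuation with probability weighting and loss aversion; all numbers and functional forms are illustrative and are not taken from the article's agent-based model.

        p_flood = 0.01            # annual flood probability (illustrative)
        damage = 50_000.0         # flood damage without the measure
        reduction = 0.6           # fraction of damage avoided by the measure
        cost = 400.0              # annual cost of the loss-reducing measure

        def expected_value(invest: bool) -> float:
            loss = damage * (1.0 - reduction) if invest else damage
            return -(p_flood * loss + (cost if invest else 0.0))

        def prospect_value(invest: bool, gamma: float = 0.69,
                           alpha: float = 0.88, lam: float = 2.25) -> float:
            # Probability weighting and loss aversion in the style of Tversky & Kahneman (1992).
            # Simplified: the certain annual cost is added to both outcomes before valuation.
            w = p_flood**gamma / (p_flood**gamma + (1 - p_flood)**gamma) ** (1 / gamma)
            loss = damage * (1.0 - reduction) if invest else damage
            certain_cost = cost if invest else 0.0
            return -lam * (w * (loss + certain_cost) ** alpha + (1 - w) * certain_cost ** alpha)

        for rule in (expected_value, prospect_value):
            choice = max((True, False), key=rule)
            print(f"{rule.__name__}: {'invest' if choice else 'do not invest'}")

    With these illustrative numbers, the expected-value agent declines the measure while the prospect-theory agent, overweighting the small flood probability, chooses to invest, which is the sort of behavioral divergence the article studies.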

  12. Version 2.0 of the European Gas Model. Changes and their impact on the German gas sector; Das europaeische Gas Target Model 2.0. Aenderungen und Auswirkungen auf den deutschen Gassektor

    Energy Technology Data Exchange (ETDEWEB)

    Balmert, David; Petrov, Konstantin [DNV GL, Bonn (Germany)

    2015-06-15

    In January 2015 ACER, the European Agency for the Cooperation of Energy Regulators, presented an updated version of its target model for the inner-European natural gas market, also referred to as version 2.0 of the Gas Target Model. During 2014 the existing model, originally developed by the Council of European Energy Regulators (CEER) and launched in 2011, had been analysed, revised and updated in preparation of the new version. While it has few surprises to offer, the new Gas Target Model specifies and goes into greater detail on many elements of the original model. Some of the new content is highly relevant to the German gas sector, not least the deliberations on the current key issues of security of supply and the ability of the gas markets to function.

  13. Calibration plots for risk prediction models in the presence of competing risks

    DEFF Research Database (Denmark)

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-01-01

    A predicted risk of 17% can be called reliable if it can be expected that the event will occur to about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks...... prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with these problems, we propose to estimate calibration curves...
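
    Leaving aside the competing-risks and right-censoring complications the authors address, the basic idea of a calibration curve, grouping subjects by predicted risk and comparing the group mean prediction with the observed event proportion, can be sketched as follows with simulated, fully observed data.

        import numpy as np

        rng = np.random.default_rng(3)

        n = 5_000
        predicted = rng.uniform(0.0, 0.4, size=n)   # predicted absolute risks
        event = rng.binomial(1, predicted)          # simulate a perfectly calibrated outcome

        # Group subjects into deciles of predicted risk and compare prediction vs. observation.
        bins = np.quantile(predicted, np.linspace(0, 1, 11))
        group = np.digitize(predicted, bins[1:-1])

        for g in range(10):
            mask = group == g
            print(f"mean predicted {predicted[mask].mean():.3f}  "
                  f"observed {event[mask].mean():.3f}  (n={mask.sum()})")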

  14. The study of the risk management model of construction project

    International Nuclear Information System (INIS)

    Jiang Bo; Feng Yanping; Liu Changbin

    2010-01-01

    The paper first analyses the development of risk management for construction projects and the risk management process, and then briefly introduces the risk management experience of foreign project management. From a management-by-objectives point of view, the greatest risk comes from a lack of clarity in the project objectives, which leads to the emergence of project risks. Based on an analysis of the principles of project objective identification and risk allocation, the paper sets up a project management model in which insurance companies are involved in the whole project management process, and finally gives a brief analysis of the roles of the insurance company. (authors)

  15. Conceptualizing a Dynamic Fall Risk Model Including Intrinsic Risks and Exposures.

    Science.gov (United States)

    Klenk, Jochen; Becker, Clemens; Palumbo, Pierpaolo; Schwickert, Lars; Rapp, Kilan; Helbostad, Jorunn L; Todd, Chris; Lord, Stephen R; Kerse, Ngaire

    2017-11-01

    Falls are a major cause of injury and disability in older people, leading to serious health and social consequences including fractures, poor quality of life, loss of independence, and institutionalization. To design and provide adequate prevention measures, accurate understanding and identification of a person's individual fall risk is important. However, to date, the performance of fall risk models is weak compared with models estimating, for example, cardiovascular risk. This deficiency may result from 2 factors. First, current models consider risk factors to be stable for each person and not to change over time, an assumption that does not reflect real-life experience. Second, current models do not consider the interplay of individual exposure, including type of activity (eg, walking, undertaking transfers), and environmental risks (eg, lighting, floor conditions) in which the activity is performed. Therefore, we posit a dynamic fall risk model consisting of intrinsic risk factors that vary over time and exposure (activity in context). eHealth sensor technology (eg, smartphones) begins to enable the continuous measurement of both of the above factors. We illustrate our model with examples of real-world falls from the FARSEEING database. This dynamic framework for fall risk adds important aspects that may improve understanding of fall mechanisms, fall risk models, and the development of fall prevention interventions. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.
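
    The dynamic framework proposed above, a time-varying intrinsic risk combined with exposure, can be caricatured as a simple momentary hazard computation; the functional form, multipliers, and numbers below are invented for illustration and do not come from the FARSEEING data.

        from dataclasses import dataclass

        @dataclass
        class Exposure:
            activity: str              # e.g. "walking", "transfer"
            environment_factor: float  # > 1.0 for poor lighting, slippery floor, etc.

        # Invented relative-risk multipliers per activity.
        ACTIVITY_RISK = {"sitting": 0.1, "walking": 1.0, "transfer": 2.5}

        def momentary_fall_hazard(baseline: float, intrinsic_t: float, exposure: Exposure) -> float:
            """Hazard at time t = baseline x time-varying intrinsic factor x exposure factors."""
            return (baseline * intrinsic_t
                    * ACTIVITY_RISK[exposure.activity] * exposure.environment_factor)

        # Example: the same person, fatigued (higher intrinsic factor) during a transfer
        # in a poorly lit room, versus rested and sitting.
        print(momentary_fall_hazard(1e-4, 1.8, Exposure("transfer", 1.5)))
        print(momentary_fall_hazard(1e-4, 1.0, Exposure("sitting", 1.0)))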

  16. Risk matrix model applied to the outsourcing of logistics' activities

    Directory of Open Access Journals (Sweden)

    Fouad Jawab

    2015-09-01

    Purpose: This paper proposes the application of the risk matrix model in the field of logistics outsourcing. Such an application can serve as the basis for decision making regarding risk management in the logistics outsourcing process and allow its prevention. Design/methodology/approach: This study is based on the risk management of logistics outsourcing in the retail sector in Morocco. The authors identify all possible risks and then classify and prioritize them using the risk matrix model. Finally, four possible decisions are reached for the identified risks. The analysis was made possible through interviews and discussions with the heads of departments and agents who are directly involved in each outsourced activity. Findings and Originality/value: It is possible to improve the risk matrix model by proposing more personalized prevention measures according to each company that operates in mass-market retailing. Originality/value: This study is the only one conducted on the logistics outsourcing process in the retail sector in Morocco, using Label’vie as a case study. First, we identified as thoroughly as we could all possible risks, then we applied the risk matrix model to sort them in ascending order of importance and criticality. As a result, we could provide decision-makers with a mapping for effective risk control and better guidance of the risk management process.
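
    A risk matrix of the kind applied in the paper scores each identified risk by likelihood and severity and maps the product onto an action category; the thresholds and example risks below are invented for illustration and are not those elicited for the Moroccan retail case.

        # Likelihood and severity scored 1 (low) to 5 (high); thresholds are illustrative.
        def classify(likelihood: int, severity: int) -> str:
            score = likelihood * severity
            if score >= 15:
                return "avoid / redesign the outsourcing arrangement"
            if score >= 8:
                return "mitigate with contractual controls"
            if score >= 4:
                return "monitor via KPIs"
            return "accept"

        outsourcing_risks = {
            "provider bankruptcy": (2, 5),
            "delivery delays in peak season": (4, 3),
            "loss of in-house logistics know-how": (3, 2),
        }

        # Print the risks sorted from most to least critical with the suggested response.
        for name, (lik, sev) in sorted(outsourcing_risks.items(),
                                       key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
            print(f"{name:38s} score={lik * sev:2d} -> {classify(lik, sev)}")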

  17. A Process Model for Assessing Adolescent Risk for Suicide.

    Science.gov (United States)

    Stoelb, Matt; Chiriboga, Jennifer

    1998-01-01

    This comprehensive assessment process model includes primary, secondary, and situational risk factors and their combined implications and significance in determining an adolescent's level of risk for suicide. Empirical data and clinical intuition are integrated to form a working client model that guides the professional in continuously reassessing…

  18. Tests of control in the Audit Risk Model : Effective? Efficient?

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans)

    2004-01-01

    Lately, the Audit Risk Model has been subject to criticism. To gauge its validity, t