WorldWideScience

Sample records for event-based risk model

  1. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    Directory of Open Access Journals (Sweden)

    Ninna Reitzel Jensen

    2015-06-01

    Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance and unit-linked insurance. By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our model by conducting scenario analysis based on Monte Carlo simulation, but the model applies to scenarios in general and to worst-case and best-estimate scenarios in particular. In addition to easy computations, our model offers a common framework for the valuation of life insurance payments across product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account the Markov model for the state of the policyholder and, thereby, facilitating event risk.
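
    The scenario-based valuation described above boils down to simulating economic scenarios and discounting the payments they trigger. The sketch below is a deliberately simplified illustration of that recipe, not the authors' two-account model: the fund dynamics, the flat mortality rate standing in for event risk, the guarantee floor, and all parameter values are invented assumptions.

    ```python
    # Minimal sketch of scenario-based valuation by Monte Carlo; all dynamics
    # and parameters are illustrative assumptions, not the authors' model.
    import numpy as np

    rng = np.random.default_rng(0)
    n_scen, n_years = 10_000, 20
    r, mu, sigma = 0.02, 0.05, 0.15   # assumed risk-free rate and fund dynamics
    q = 0.01                          # assumed flat yearly mortality (event risk)
    premium, g = 1000.0, 0.01         # yearly premium and guaranteed rate

    pv = np.zeros(n_scen)
    for s in range(n_scen):
        savings, alive = 0.0, True
        for t in range(1, n_years + 1):
            savings = (savings + premium) * (1 + mu + sigma * rng.standard_normal())
            savings = max(savings, premium * t * (1 + g) ** t)  # guarantee floor
            if rng.random() < q:               # insurance event: benefit is paid
                pv[s], alive = savings * np.exp(-r * t), False
                break
        if alive:
            pv[s] = savings * np.exp(-r * n_years)   # maturity payment
    print(f"Monte Carlo value: {pv.mean():.1f}")
    ```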

  2. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories: initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose.

  3. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  4. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    DEFF Research Database (Denmark)

    Jensen, Ninna Reitzel; Schomacker, Kristian Juul

    2015-01-01

    Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance … product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account the Markov model for the state of the policyholder and, thereby, facilitating event risk.

  5. Initiating Events Modeling for On-Line Risk Monitoring Application

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.

    1998-01-01

    In order to make the on-line risk monitoring application of Probabilistic Risk Assessment more complete and realistic, special attention needs to be dedicated to initiating events modeling. Two different issues are of special importance: one is how to model initiating event frequencies according to the current plant configuration (equipment alignment and out-of-service status) and operating conditions (weather and various activities), and the second is how to preserve dependencies between the initiating events model and the rest of the PRA model. First, the paper will discuss how initiating events can be treated in an on-line risk monitoring application. Second, a practical example of initiating events modeling in EPRI's Equipment Out of Service on-line monitoring tool will be presented. Gains from the application and possible improvements will be discussed in the conclusion. (author)

  6. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  7. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be …
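
    For reference, plain historical VaR (the baseline that the hypothesized news-event sensitivity applies to) is just a loss quantile of the empirical return distribution. A minimal sketch with synthetic returns standing in for real data; the paper's event-based refinement is not shown.

    ```python
    # Historical VaR: the alpha-quantile of losses over a past return window.
    import numpy as np

    def historical_var(returns, alpha=0.95):
        """alpha-level VaR as the loss quantile of the empirical distribution."""
        return -np.quantile(returns, 1.0 - alpha)

    rng = np.random.default_rng(1)
    returns = rng.normal(0.0005, 0.02, size=500)   # assumed daily returns
    print(f"95% one-day VaR: {historical_var(returns):.4f}")
    ```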

  8. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accidents’ probabilities and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, a quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are remarkably meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering an icing region.
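
    The Monte Carlo treatment of risk evolution can be pictured as repeatedly simulating the fluctuating state variables and averaging the induced accident probability at each time step. The toy below mirrors only that general idea, not the authors' ESD formulation; the random-walk dynamics and the logistic risk map are invented.

    ```python
    # Toy Monte Carlo estimate of a time-varying risk curve driven by a
    # fluctuating system state variable (SSV); dynamics and risk map assumed.
    import numpy as np

    rng = np.random.default_rng(5)
    n_runs, n_steps = 2000, 100
    risk = np.zeros(n_steps)
    for _ in range(n_runs):
        ssv = 0.0
        for t in range(n_steps):
            ssv += rng.normal(0.0, 0.1)                 # SSV fluctuation
            risk[t] += 1 / (1 + np.exp(-(ssv - 3.0)))   # accident probability
    risk /= n_runs
    print("mean risk at t = 0, 50, 99:", risk[[0, 50, 99]].round(4))
    ```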

  9. Contrasting safety assessments of a runway incursion scenario: Event sequence analysis versus multi-agent dynamic risk modelling

    International Nuclear Information System (INIS)

    Stroeve, Sybert H.; Blom, Henk A.P.; Bakker, G.J.

    2013-01-01

    In the safety literature it has been argued that, in a complex socio-technical system, safety cannot be well analysed by event sequence based approaches, but requires capturing the complex interactions and performance variability of the socio-technical system. In order to evaluate the quantitative and practical consequences of these arguments, this study compares two approaches to assessing the accident risk of an example safety-critical socio-technical system. It contrasts an event sequence based assessment with a multi-agent dynamic risk model (MA-DRM) based assessment, both of which are performed for a particular runway incursion scenario. The event sequence analysis uses the well-known event tree modelling formalism and the MA-DRM based approach combines agent based modelling, hybrid Petri nets and rare event Monte Carlo simulation. The comparison addresses qualitative and quantitative differences in the methods, attained risk levels, and in the prime factors influencing the safety of the operation. The assessments show considerable differences in the accident risk implications of the performance of human operators and technical systems in the runway incursion scenario. In contrast with the event sequence based results, the MA-DRM based results show that the accident risk is not manifest from the performance of and relations between individual human operators and technical systems. Instead, the safety risk emerges from the totality of the performance and interactions in the agent based model of the safety-critical operation considered, which coincides very well with the argumentation in the safety literature.

  10. Framework for Modeling High-Impact, Low-Frequency Power Grid Events to Support Risk-Informed Decisions

    Energy Technology Data Exchange (ETDEWEB)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.; Dagle, Jeffery E.; Millard, W. David; Yao, Juan; Glantz, Clifford S.; Gourisetti, Sri Nikhil Gup

    2015-12-03

    Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the electric power grid’s security and resilience. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic, as there exists an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency events (HILFs) is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on the development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMD), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated based on historic experience; however, there does exist a stable of risk modeling techniques for rare events that have proven of value across a wide range of engineering application domains. There is an active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of this interest in the context of …

  11. Agent based models for testing city evacuation strategies under a flood event as strategy to reduce flood risk

    Science.gov (United States)

    Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran

    2016-04-01

    This research explores the use of Agent Based Models (ABM) and their potential to test large-scale evacuation strategies in coastal cities at risk from flooding due to extreme hydro-meteorological events, with the final purpose of disaster risk reduction by decreasing humans' exposure to the hazard. The first part of the paper corresponds to the theory used to build the models, such as complex adaptive systems (CAS) and the principles and uses of ABM in this field. This first section outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scale. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioral model as well as the framework used in this research; in particular, the PECS reference model is covered in this section. The last part of this section covers the main attributes or characteristics of human beings used to describe the agents. The third part of the paper shows the methodology used to build and implement the ABM model using Repast Simphony, an open-source agent-based modelling and simulation platform. The preliminary results for the first implementation in a region of the island of Sint Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for a further development of the model and its implementation and testing in a full-scale city.

  12. Risk based modelling

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Baker, A.E.

    1993-01-01

    Risk-based analysis is a tool becoming available to both engineers and managers to aid decision making concerning plant matters such as In-Service Inspection (ISI). In order to develop a risk-based method, some form of Structural Reliability Risk Assessment (SRRA) needs to be performed to provide a probability-of-failure ranking for all sites around the plant. A Probabilistic Risk Assessment (PRA) can then be carried out to combine these possible events with the capability of plant safety systems and procedures, to establish the consequences of failure for the sites. In this way the probabilities of failure are converted into a risk-based ranking which can be used to assist the process of deciding which sites should be included in an ISI programme. This paper reviews the technique and typical results of a risk-based ranking assessment carried out for nuclear power plant pipework. (author)

  13. Risk-based ranking of dominant contributors to maritime pollution events

    International Nuclear Information System (INIS)

    Wheeler, T.A.

    1993-01-01

    This report describes a conceptual approach for identifying dominant contributors to risk from maritime shipping of hazardous materials. Maritime transportation accidents are relatively common occurrences compared to more frequently analyzed contributors to public risk. Yet research on maritime safety and pollution incidents has not been guided by a systematic, risk-based approach. Maritime shipping accidents can be analyzed using event trees to group the accidents into 'bins,' or groups, of similar characteristics such as type of cargo, location of accident (e.g., harbor, inland waterway), type of accident (e.g., fire, collision, grounding), and size of release. The importance of specific types of events to each accident bin can be quantified. Then the overall importance of accident events to risk can be estimated by weighting the events' individual bin importance measures by the risk associated with each accident bin. 4 refs., 3 figs., 6 tabs
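
    The weighting scheme sketched in the abstract (an event's overall importance is its within-bin importance summed over bins, weighted by each bin's risk) can be made concrete with a toy calculation; every bin, event, and number below is invented.

    ```python
    # Risk-weighted importance ranking across accident bins (toy numbers).
    bins = {   # accident bin -> (bin risk, within-bin event importances)
        "harbor_collision":   (0.50, {"steering_failure": 0.6, "fire": 0.1}),
        "waterway_grounding": (0.30, {"steering_failure": 0.2, "fire": 0.4}),
        "open_sea_fire":      (0.20, {"steering_failure": 0.0, "fire": 0.9}),
    }
    events = {"steering_failure", "fire"}
    overall = {e: sum(risk * imp.get(e, 0.0) for risk, imp in bins.values())
               for e in events}
    for e, w in sorted(overall.items(), key=lambda kv: -kv[1]):
        print(f"{e}: {w:.3f}")
    ```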

  14. Multiple simultaneous event model for radiation carcinogenesis

    International Nuclear Information System (INIS)

    Baum, J.W.

    1979-01-01

    Theoretical Radiobiology and Risk Estimates includes reports on: Multiple Simultaneous Event Model for Radiation Carcinogenesis; Cancer Risk Estimates and Neutron RBE Based on Human Exposures; A Rationale for Nonlinear Dose Response Functions of Power Greater or Less Than One; and Rationale for One Double Event in Model for Radiation Carcinogenesis.

  15. Overview of the Graphical User Interface for the GERMcode (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERMcode calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERMcode also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ions and nuclear secondaries are evaluated. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERMcode for application to thick target experiments. The GERMcode provides scientists participating in NSRL experiments with the data needed for the interpretation of their experiments.
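
    One of the biophysical quantities mentioned, the Poisson distribution of ion hits for a specified cellular area, is a standard calculation: at fluence F, the expected number of traversals of an area A is F·A. The snippet below is that generic textbook computation with assumed values, not the GERM code itself.

    ```python
    # Poisson hit statistics for a cellular target under a particle fluence.
    import math

    fluence = 0.02          # particles per um^2 (assumed)
    area = 100.0            # sensitive cellular area in um^2 (assumed)
    lam = fluence * area    # expected number of traversals (here 2.0)

    def p_hits(k):
        """Probability of exactly k ion traversals of the cell area."""
        return math.exp(-lam) * lam ** k / math.factorial(k)

    print(f"P(0 hits) = {p_hits(0):.3f}, P(>=1 hit) = {1 - p_hits(0):.3f}")
    ```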

  16. Development of a new risk model for predicting cardiovascular events among hemodialysis patients: Population-based hemodialysis patients from the Japan Dialysis Outcome and Practice Patterns Study (J-DOPPS).

    Directory of Open Access Journals (Sweden)

    Yukiko Matsubara

    Cardiovascular (CV) events are the primary cause of death and of becoming bedridden among hemodialysis (HD) patients. The Framingham risk score (FRS) is useful for predicting the incidence of CV events in the general population, but is considered unsuitable for predicting the incidence of CV events in HD patients, given their characteristically atypical relationships between conventional risk factors and outcomes. We therefore aimed to develop a new prognostic prediction model for prevention and early detection of CV events among hemodialysis patients. We enrolled 3,601 maintenance HD patients based on their data from the Japan Dialysis Outcomes and Practice Patterns Study (J-DOPPS), phases 3 and 4. We longitudinally assessed the association between several potential candidate predictors and composite CV events in the year after study initiation. Potential candidate predictors included the component factors of the FRS and other HD-specific risk factors. We used multivariable logistic regression with backward stepwise selection to develop our new prediction model and generated a calibration plot. Additionally, we performed bootstrapping to assess the internal validity. We observed 328 composite CV events during the 1-year follow-up. The final prediction model contained six variables: age, diabetes status, history of CV events, dialysis time per session, and serum phosphorus and albumin levels. The new model showed significantly better discrimination than the FRS in both men (c-statistics: 0.76 for the new model, 0.64 for the FRS) and women (c-statistics: 0.77 for the new model, 0.60 for the FRS). Additionally, we confirmed the consistency between the observed and predicted results using the calibration plot. Further, we found discrimination and calibration in the bootstrapping cohort similar to those of the derivation model. We developed a new risk model consisting of only six predictors. Our new model predicted CV events more accurately than the FRS.
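
    The modelling recipe (multivariable logistic regression for one-year events, with discrimination summarized by a c-statistic) can be sketched as follows. The data are synthetic, the six predictor columns merely mimic the final J-DOPPS variables, and every coefficient and rate is invented.

    ```python
    # Hedged sketch: logistic prediction model plus c-statistic on synthetic data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(2)
    n = 3000
    X = np.column_stack([
        rng.normal(65, 10, n),    # age
        rng.integers(0, 2, n),    # diabetes
        rng.integers(0, 2, n),    # history of CV events
        rng.normal(4.0, 0.5, n),  # dialysis time per session (h)
        rng.normal(5.0, 1.0, n),  # serum phosphorus
        rng.normal(3.6, 0.4, n),  # serum albumin
    ])
    lp = -6 + 0.05 * X[:, 0] + 0.8 * X[:, 1] + 1.0 * X[:, 2] - 0.5 * (X[:, 5] - 3.6)
    y = rng.random(n) < 1 / (1 + np.exp(-lp))   # simulated 1-year CV events

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print("c-statistic:", round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))
    ```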

  17. Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

    International Nuclear Information System (INIS)

    Barker, Kash; Haimes, Yacov Y.

    2009-01-01

    Risk-based decision making often relies upon expert probability assessments, particularly for the consequences of disruptive events and when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences.

  18. Risk and sensitivity analysis in relation to external events

    International Nuclear Information System (INIS)

    Alzbutas, R.; Urbonas, R.; Augutis, J.

    2001-01-01

    This paper presents a risk and sensitivity analysis of the impacts of external events on safe operation in general, and on the Ignalina Nuclear Power Plant safety systems in particular. The analysis is based on deterministic and probabilistic assumptions and assessment of the external hazards. Real statistical data are used, as well as initial external event simulation. Preliminary screening criteria are applied. The analysis of external event impacts on safe NPP operation, assessment of event occurrence, sensitivity analysis, and recommendations for safety improvements are performed for the investigated external hazards. Events such as aircraft crash, extreme rains and winds, forest fire, and flying parts of the turbine are analysed. The models are developed and probabilities are calculated. As an example for sensitivity analysis, the model of aircraft impact is presented. The sensitivity analysis takes into account the uncertainty features raised by an external event and its model. Even in cases where the external events analysis shows rather limited danger, the sensitivity analysis can determine the causes with the highest influence. These possible variations can be significant for future safety levels and risk-based decisions. Calculations show that external events cannot significantly influence the safety level of Ignalina NPP operation; however, the events' occurrence and propagation can be sufficiently uncertain. (author)

  19. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ions and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their experiments.

  20. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    Wyss, Gregory D.; Duran, Felicia A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result, scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies.
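
    The core OBEST idea, recursive expansion of probabilistic branches so that every complete scenario arrives with an analytically computed likelihood (however small), can be shown in miniature. The toy below is not the OBEST object model; the stages and probabilities are invented.

    ```python
    # Recursive enumeration of all scenarios with exact branch probabilities.
    def enumerate_scenarios(stages, prefix=(), prob=1.0):
        if not stages:                        # a complete scenario is reached
            yield prefix, prob
            return
        branches, *rest = stages
        for outcome, p in branches.items():
            yield from enumerate_scenarios(rest, prefix + (outcome,), prob * p)

    stages = [
        {"pump_starts": 0.99, "pump_fails": 0.01},
        {"valve_opens": 0.995, "valve_sticks": 0.005},
    ]
    for scenario, p in enumerate_scenarios(stages):
        print(" -> ".join(scenario), f"p = {p:.6f}")
    ```

    Even the 5.0e-05 double-failure scenario appears exactly, which illustrates the analytical advantage over sampling-based discrete event simulation noted in the abstract.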

  1. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G.; Khrennikov, A.; Schlosshauer, M.; Weihs, G.

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified …

  2. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a …

  3. Events per variable for risk differences and relative risks using pseudo-observations

    DEFF Research Database (Denmark)

    Hansen, Stefan Nygaard; Andersen, Per Kragh; Parner, Erik Thorlund

    2014-01-01

    A method based on pseudo-observations has been proposed for direct regression modeling of functionals of interest with right-censored data, including the survival function, the restricted mean and the cumulative incidence function in competing risks. The models, once the pseudo-observations have been computed, can be fitted using standard generalized estimating equation software. Regression models can however yield problematic results if the number of covariates is large in relation to the number of events observed. Guidelines of events per variable are often used in practice. These rules …
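
    For the survival function, the pseudo-observation for subject i at time t is theta_i = n*S(t) - (n-1)*S_(-i)(t), where S is the Kaplan-Meier estimator and S_(-i) leaves subject i out. A bare-bones sketch on synthetic data; production analyses would use established survival packages.

    ```python
    # Jackknife pseudo-observations for S(t) from a Kaplan-Meier estimate.
    import numpy as np

    def km_surv(time, event, t):
        """Kaplan-Meier estimate of S(t); event = 1 observed, 0 censored."""
        s = 1.0
        for u in np.unique(time[event == 1]):
            if u > t:
                break
            s *= 1 - ((time == u) & (event == 1)).sum() / (time >= u).sum()
        return s

    def pseudo_obs(time, event, t):
        n = len(time)
        full = km_surv(time, event, t)
        loo = [km_surv(np.delete(time, i), np.delete(event, i), t) for i in range(n)]
        return n * full - (n - 1) * np.asarray(loo)

    rng = np.random.default_rng(3)
    time = rng.exponential(10, 50)
    event = (rng.random(50) < 0.7).astype(int)
    print(pseudo_obs(time, event, t=5.0)[:5].round(3))
    ```

    The resulting pseudo-observations can then be regressed on covariates with standard GEE software, as the abstract notes.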

  4. Using competing risks model and competing events in outcome of pulmonary tuberculosis patients

    Directory of Open Access Journals (Sweden)

    Mehdi Kazempour Dizaji

    2016-01-01

    Conclusions: The use of a competing risks model with competing events can provide a better way to understand the risk factors correlated with the outcome of the pulmonary TB process, especially among DR-TB patients.

  5. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. A morphism implemented on the basis of this theory is also described.

  6. Multilevel joint competing risk models

    Science.gov (United States)

    Karunarathna, G. H. S.; Sooriyarachchi, M. R.

    2017-09-01

    Joint modeling approaches are often encountered in biomedical and epidemiological studies for different outcomes of competing-risk time-to-event and count data in the presence of a cluster effect. Hospital length of stay (LOS) has been a widely used outcome measure of hospital utilization, serving as a benchmark measure involving multiple terminations such as discharge, transfer, death, and patients who have not completed the event of interest by the end of follow-up (censored) during hospitalization. Competing risk models provide a method of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple event types. In this study, the concept of joint modeling has been applied to dengue epidemiology in Sri Lanka, 2006-2008, to assess the relationship between different outcomes of LOS and the platelet count of dengue patients with a district cluster effect. Two key approaches have been applied to build the joint scenario. In the first approach, each competing risk is modeled separately using a binary logistic model, treating all other events as censored under a multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results compared to fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).

  7. Direct risk standardisation: a new method for comparing casemix adjusted event rates using complex models.

    Science.gov (United States)

    Nicholl, Jon; Jacques, Richard M; Campbell, Michael J

    2013-10-29

    Comparison of outcomes between populations or centres may be confounded by casemix differences, and standardisation is carried out to avoid this. However, when the casemix adjustment models are large and complex, direct standardisation has been described as "practically impossible", and indirect standardisation may lead to unfair comparisons. We propose a new method of directly standardising for risk rather than standardising for casemix, which overcomes these problems. Using a casemix model which is the same model as would be used in indirect standardisation, the risk in individuals is estimated. Risk categories are defined, and event rates in each category for each centre to be compared are calculated. A weighted sum of the risk category specific event rates is then calculated. We have illustrated this method using data on 6 million admissions to 146 hospitals in England in 2007/8 and an existing model with over 5000 casemix combinations, and a second dataset of 18,668 adult emergency admissions to 9 centres in the UK and overseas and a published model with over 20,000 casemix combinations and a continuous covariate. Substantial differences between conventional directly casemix standardised rates and rates from direct risk standardisation (DRS) were found. Results based on DRS were very similar to Standardised Mortality Ratios (SMRs) obtained from indirect standardisation, with similar standard errors. Direct risk standardisation using our proposed method is as straightforward as using conventional direct or indirect standardisation, always enables fair comparisons of performance to be made, can use continuous casemix covariates, and was found in our examples to have similar standard errors to the SMR. It should be preferred when there is a risk that conventional direct or indirect standardisation will lead to unfair comparisons.
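
    The procedure as described reduces to four steps: score every admission with the common casemix model, group admissions into risk categories, compute each centre's event rate within each category, and take a weighted sum of those rates using common weights. A synthetic-data sketch of exactly those steps; the predicted risks, cut-points, and weights are all assumptions.

    ```python
    # Direct risk standardisation (DRS) on synthetic data.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 20_000
    centre = rng.integers(0, 3, n)
    risk = rng.beta(1, 9, n)                  # casemix-model predicted risk
    died = rng.random(n) < risk * (1 + 0.1 * (centre == 2))   # centre 2 worse

    cats = np.digitize(risk, np.quantile(risk, [0.25, 0.5, 0.75]))  # 4 categories
    weights = np.bincount(cats) / n           # overall case mix as the standard

    for c in range(3):
        rates = np.array([died[(centre == c) & (cats == k)].mean() for k in range(4)])
        print(f"centre {c}: DRS rate = {weights @ rates:.4f}")
    ```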

  8. Quantifying the predictive accuracy of time-to-event models in the presence of competing risks.

    Science.gov (United States)

    Schoop, Rotraut; Beyersmann, Jan; Schumacher, Martin; Binder, Harald

    2011-02-01

    Prognostic models for time-to-event data play a prominent role in therapy assignment, risk stratification and inter-hospital quality assurance. The assessment of their prognostic value is vital not only for responsible resource allocation, but also for their widespread acceptance. The additional presence of competing risks to the event of interest requires proper handling not only on the model building side, but also during assessment. Research into methods for the evaluation of the prognostic potential of models accounting for competing risks is still needed, as most proposed methods measure either their discrimination or calibration, but do not examine both simultaneously. We adapt the prediction error proposal of Graf et al. (Statistics in Medicine 1999, 18, 2529–2545) and Gerds and Schumacher (Biometrical Journal 2006, 48, 1029–1040) to handle models with competing risks, i.e. more than one possible event type, and introduce a consistent estimator. A simulation study investigating the behaviour of the estimator in small sample size situations and for different levels of censoring together with a real data application follows.

  9. Population-based absolute risk estimation with survey data

    Science.gov (United States)

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
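
    With constant cause-specific hazards, the absolute-risk integral AR_1(tau) = integral from 0 to tau of lambda_1(t) * exp(-Lambda_1(t) - Lambda_2(t)) dt has a closed form, which makes a compact worked example. The hazard values and horizon below are invented; real applications would use the paper's nonparametric or piecewise-exponential baselines.

    ```python
    # Absolute (crude) risk of cause 1 in the presence of a competing cause,
    # assuming constant hazards:
    #   AR_1(tau) = lam1 / (lam1 + lam2) * (1 - exp(-(lam1 + lam2) * tau))
    import math

    lam_cvd, lam_cancer = 0.010, 0.015   # cause-specific hazards per year (assumed)
    tau = 10.0                           # time horizon in years

    total = lam_cvd + lam_cancer
    ar_cvd = lam_cvd / total * (1 - math.exp(-total * tau))
    print(f"10-year absolute CVD risk: {ar_cvd:.1%}")   # about 8.8%
    ```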

  10. Coronary risk assessment by point-based vs. equation-based Framingham models: significant implications for clinical care.

    Science.gov (United States)

    Gordon, William J; Polansky, Jesse M; Boscardin, W John; Fung, Kathy Z; Steinman, Michael A

    2010-11-01

    US cholesterol guidelines use original and simplified versions of the Framingham model to estimate future coronary risk and thereby classify patients into risk groups with different treatment strategies. We sought to compare risk estimates and risk group classification generated by the original, complex Framingham model and the simplified, point-based version. We assessed 2,543 subjects age 20-79 from the 2001-2006 National Health and Nutrition Examination Surveys (NHANES) for whom Adult Treatment Panel III (ATP-III) guidelines recommend formal risk stratification. For each subject, we calculated the 10-year risk of major coronary events using the original and point-based Framingham models, and then compared differences in these risk estimates and whether these differences would place subjects into different ATP-III risk groups (<10%, 10-20%, or >20% risk). Using standard procedures, all analyses were adjusted for survey weights, clustering, and stratification to make our results nationally representative. Among 39 million eligible adults, the original Framingham model categorized 71% of subjects as having "moderate" risk (<10% risk), with the remainder falling into the "moderately high" (10-20%) or "high" (>20%) risk groups. Estimates of coronary risk by the original and point-based models often differed substantially. The point-based system classified 15% of adults (5.7 million) into different risk groups than the original model, with 10% (3.9 million) misclassified into higher risk groups and 5% (1.8 million) into lower risk groups, for a net impact of classifying 2.1 million adults into higher risk groups. These risk group misclassifications would impact guideline-recommended drug treatment strategies for 25-46% of affected subjects. Patterns of misclassification varied significantly by gender, age, and underlying CHD risk. Compared to the original Framingham model, the point-based version misclassifies millions of Americans into risk groups for which guidelines recommend different treatment strategies.
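
    The mechanism behind the misclassification is that a points system rounds continuous regression coefficients to integer points before summing, so borderline patients can cross a treatment threshold. A toy contrast with invented coefficients and an illustrative baseline-risk function, not the actual Framingham equations.

    ```python
    # Equation-based vs point-based risk for one hypothetical patient.
    import math

    betas = {"age_decade": 0.45, "smoker": 0.70, "chol_band": 0.28}
    x = {"age_decade": 5, "smoker": 1, "chol_band": 2}
    unit = 0.28                          # 1 point on the linear-predictor scale

    lp_exact = sum(betas[k] * x[k] for k in betas)
    lp_points = unit * sum(round(b / unit) * x[k] for k, b in betas.items())

    def risk(lp):                        # illustrative 10-year risk curve
        return 1 - 0.95 ** math.exp(lp - 3.0)

    print(f"equation-based: {risk(lp_exact):.1%}, point-based: {risk(lp_points):.1%}")
    ```

    With these invented numbers the equation-based risk is about 8%, while the rounded points push the same patient above a 10% threshold, changing the risk group.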

  11. CATASTROPHIC EVENTS MODELING

    Directory of Open Access Journals (Sweden)

    Ciumas Cristina

    2013-07-01

    This paper presents the emergence and evolution of catastrophe models (cat models). Starting with the present context of extreme weather events and the features of catastrophic risk (cat risk), we'll make a chronological illustration, from a theoretical point of view, of the main steps taken in building such models. In this way the importance of interdisciplinarity can be observed. The first cat model considered contains three modules. For each of the identified modules (hazard, vulnerability and financial losses), a detailed overview will be provided, together with an exemplification for a potential earthquake measuring more than 7 on the Richter scale occurring nowadays in Bucharest. The key areas exposed to earthquakes in Romania will be identified. Then, based on past catastrophe data and taking into account present conditions of the housing stock, insurance coverage and the population of Bucharest, the impact will be quantified by determining potential losses. In order to accomplish this work we consider a scenario with data representing average values for the dwelling's surface, location and finishing works. At each step we'll make a reference to the earthquake of March 4, 1977 to see what would happen today if a similar event occurred. The value of the Bucharest housing stock will be determined taking firstly the market value, then the replacement value and ultimately the real value to quantify potential damages. Through this approach we can find the insurance coverage of potential losses and also the uncovered gap. A solution that may be taken into account by public authorities, for example by Bucharest City Hall, will be offered: in case such an event occurs, the impossibility of paying compensations to insured people, rebuilding infrastructure and public buildings, and helping the suffering persons should be avoided. An active public-private partnership should be created between government authorities, the Natural Disaster Insurance Pool, private …

  12. An Agent-Based Model of Evolving Community Flood Risk.

    Science.gov (United States)

    Tonn, Gina L; Guikema, Seth D

    2017-11-17

    Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent-based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near-miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high-risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk. This study lends insight into priorities for future work, including the development of more in-depth behavioral and decision rules at the individual and community level. © 2017 Society for Risk Analysis.

  13. Can Regional Climate Models be used in the assessment of vulnerability and risk caused by extreme events?

    Science.gov (United States)

    Nunes, Ana

    2015-04-01

    Extreme meteorological events played an important role in catastrophic occurrences observed in the past over densely populated areas in Brazil. This motivated the proposal of an integrated system for the analysis and assessment of vulnerability and risk caused by extreme events in urban areas that are particularly affected by complex topography. That requires a multi-scale approach, which is centered on a regional modeling system, consisting of a regional (spectral) climate model coupled to a land-surface scheme. This regional modeling system employs a boundary forcing method based on scale-selective bias correction and assimilation of satellite-based precipitation estimates. Scale-selective bias correction is a method similar to the spectral nudging technique for dynamical downscaling that allows internal modes to develop in agreement with the large-scale features, while the precipitation assimilation procedure improves the modeled deep convection and drives the land-surface scheme variables. Here, the scale-selective bias correction acts only on the rotational part of the wind field, letting the precipitation assimilation procedure correct moisture convergence, in order to reconstruct South America's current climate within the South American Hydroclimate Reconstruction Project. The hydroclimate reconstruction outputs might eventually produce improved initial conditions for high-resolution numerical integrations in metropolitan regions, generating more reliable short-term precipitation predictions and providing accurate hydrometeorological variables to higher-resolution geomorphological models. Better representation of deep convection at intermediate scales is relevant when the resolution of the regional modeling system is refined by any method to meet the scale of geomorphological dynamic models of stability and mass movement, assisting in the assessment of risk areas and the estimation of terrain stability over complex topography. The reconstruction of past extreme …

  14. Modeling Documents with Event Model

    Directory of Open Access Journals (Sweden)

    Longhui Wang

    2015-08-01

    Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way the brain deals with images and speech. In the field of NLP, topic models are one of the important ways of modeling documents. Topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose Event Model, which is unsupervised and based on the language processing mechanisms described by neurolinguistics, to model documents. In Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in the events. Event Model has two stages: word learning and dimensionality reduction. Word learning learns the semantics of words based on deep learning. Dimensionality reduction is the process of representing a document as a low-dimensional vector by a linear model that is completely different from topic models. Event Model achieves state-of-the-art results on document retrieval tasks.

  15. Flood-risk mapping: contributions towards an enhanced assessment of extreme events and associated risks

    Directory of Open Access Journals (Sweden)

    B. Büchele

    2006-01-01

    Currently, a shift from classical flood protection as an engineering task towards integrated flood risk management concepts can be observed. In this context, extreme events which exceed the design event of flood protection structures, as well as failure scenarios such as dike breaches, have to be considered more rigorously. Therefore, this study aims to enhance existing methods for hazard and risk assessment for extreme events and is divided into three parts. In the first part, a regionalization approach for flood peak discharges was further developed and substantiated, especially regarding recurrence intervals of 200 to 10 000 years and a large number of small ungauged catchments. Model comparisons show that more confidence may be placed in such flood estimates for ungauged areas and very long recurrence intervals than implied by statistical analysis alone. The hydraulic simulation in the second part is oriented towards hazard mapping and risk analyses covering the whole spectrum of relevant flood events. As the hydrodynamic simulation is directly coupled with a GIS, the results can be easily processed as local inundation depths for spatial risk analyses. For this, a new GIS-based software tool was developed, presented in the third part, which enables estimation of the direct flood damage to single buildings or areas based on different established stage-damage functions. Furthermore, a new multifactorial approach for damage estimation is presented, aiming at the improvement of damage estimation on the local scale by considering factors like building quality, contamination and precautionary measures. The methods and results from this study form the basis for comprehensive risk analyses and flood management strategies.

  16. Modeling risks: effects of area deprivation, family socio-economic disadvantage and adverse life events on young children's psychopathology.

    Science.gov (United States)

    Flouri, Eirini; Mavroveli, Stella; Tzavidis, Nikos

    2010-06-01

    The effects of contextual risk on young children's behavior are not appropriately modeled. To model the effects of area and family contextual risk on young children's psychopathology. The final study sample consisted of 4,618 Millennium Cohort Study (MCS) children, who were 3 years old, clustered in lower layer super output areas in nine strata in the UK. Contextual risk was measured by socio-economic disadvantage (SED) at both area and family level, and by distal and proximal adverse life events at family level. Multivariate response multilevel models that allowed for correlated residuals at both individual and area level, and univariate multilevel models estimated the effect of contextual risk on specific and broad psychopathology measured by the Strengths and Difficulties Questionnaire. The area SED/broad psychopathology association remained significant after family SED was controlled, but not after maternal qualifications and family adverse life events were added to the model. Adverse life events predicted psychopathology in all models. Family SED did not predict emotional symptoms or hyperactivity after child characteristics were added to the model with the family-level controls. Area-level SED predicts child psychopathology via family characteristics; family-level SED predicts psychopathology largely by its impact on development; and adverse life events predict psychopathology independently of earlier adversity, SED and child characteristics, as well as maternal psychopathology, parenting and education.

  17. Developing points-based risk-scoring systems in the presence of competing risks.

    Science.gov (United States)

    Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P

    2016-09-30

    Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
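
    Points systems are typically built by expressing each regression coefficient in multiples of a base unit (for example, the risk associated with being five years older) and rounding to integers; the paper adapts this recipe to competing-risks (subdistribution hazard) models. A minimal sketch with invented coefficients standing in for a fitted Fine-Gray model.

    ```python
    # Sullivan-style points from regression coefficients (invented values).
    coefs = {"age_per_5y": 0.30, "diabetes": 0.55, "prior_MI": 0.80}
    unit = coefs["age_per_5y"]    # 1 point = effect of 5 extra years of age

    points = {k: round(b / unit) for k, b in coefs.items()}
    print(points)                 # {'age_per_5y': 1, 'diabetes': 2, 'prior_MI': 3}

    patient = {"age_per_5y": 13, "diabetes": 1, "prior_MI": 0}   # e.g. age 65
    print("total points:", sum(points[k] * patient[k] for k in points))
    ```

    The total score is then mapped to a predicted cumulative incidence, which in the competing-risks setting comes from the subdistribution-hazard model rather than a Cox model.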

  18. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development can result in excessive amounts of eroded sediments entering waterways and causing harm to the biota living therein. However, construction site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are one of the most uncertain types of hydrologic models. This study presents models with an improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario and datasets mined from the literature for three additional sites - Treynor, Iowa; Coshocton, Ohio; and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during various phases of construction.
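
    The MLR branch of the study can be sketched as a log-linear regression of measured event soil loss on event-based USLE-style factors. Everything below is synthetic (factor values, the data-generating form, and the fitted coefficients); it is not the study's calibrated models.

    ```python
    # Log-linear MLR of event soil loss on USLE-style event factors (synthetic).
    import numpy as np

    rng = np.random.default_rng(6)
    n = 60
    EI30 = rng.lognormal(3.0, 0.6, n)      # event rainfall erosivity
    C = rng.uniform(0.05, 0.9, n)          # cover-management factor
    K, LS = 0.3, 1.2                       # soil erodibility and slope factors
    loss = 0.8 * EI30 * K * LS * C * rng.lognormal(0.0, 0.3, n)  # "measured" loss

    X = np.column_stack([np.ones(n), np.log(EI30), np.log(C)])
    beta, *_ = np.linalg.lstsq(X, np.log(loss), rcond=None)
    print("intercept, b_EI30, b_C:", beta.round(3))   # slopes should be near 1
    ```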

  19. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e., all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  20. Using an extended 2D hydrodynamic model for evaluating damage risk caused by extreme rain events: Flash-Flood-Risk-Map (FFRM) Upper Austria

    Science.gov (United States)

    Humer, Günter; Reithofer, Andreas

    2016-04-01

    Considering the increase in flash flood events causing massive damage in urban but also rural areas during the last years [1-4], the requirement for hydrodynamic calculation of flash-flood-prone areas and possible countermeasures has arisen for many municipalities and local governments. Besides the German-based URBAS project [1], the EU-funded FP7 research project "SWITCH-ON" [5] also addresses the damage risk caused by flash floods in the sub-project "FFRM" (Flash Flood Risk Map Upper Austria) by calculating the damage risk for buildings and vulnerable infrastructure like schools and hospitals caused by flash-flood-driven inundation. While danger zones in riverine flooding are established as an integral part of spatial planning, flash floods caused by overland runoff from extreme rain events have long been an underrated safety hazard, not only for buildings and infrastructure but for people and animals as well. Based on the widespread 2D model "hydro_as-2D", an extension was developed which calculates the runoff formation from spatially and temporally variable precipitation and determines, in two dimensions, the land surface runoff and its concentration. The conception of the model is to preprocess the precipitation data and calculate the effective runoff volume for a short time step of e.g. five minutes. This volume is applied to the nodes of the 2D model and the calculation of the hydrodynamic model is started. At the end of each time step, the model run is stopped, the preprocessing step is repeated and the hydraulic model calculation is continued. In view of the later use for the whole of Upper Austria (12,000 km²), a model grid of 25x25 m² was established using digital elevation data. Model parameters could be estimated for the small catchment of the river Ach, which was hit by an intense rain event with up to 109 mm per hour …
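
    The stepping scheme described above is essentially a coupling loop: compute the effective runoff volume for each five-minute step, load it onto the mesh nodes, advance the hydraulic solver, and repeat. The toy below mirrors only that control flow; the "solver" is a trivial stand-in for hydro_as-2D and every parameter is invented.

    ```python
    # Toy coupling loop: rainfall preprocessing feeding a per-step hydraulic update.
    import numpy as np

    DT = 300.0                        # coupling step in seconds (5 min)
    N_NODES, RUNOFF_COEF = 10, 0.6    # assumed mesh size and runoff coefficient

    def effective_runoff(rain_mm_per_h):
        """Effective inflow depth per node for one step (crude loss model)."""
        depth_m = RUNOFF_COEF * (rain_mm_per_h / 1000.0) * (DT / 3600.0)
        return np.full(N_NODES, depth_m)

    rain_series = [20, 60, 109, 40, 5]          # mm/h per step (invented storm)
    depth = np.zeros(N_NODES)
    for rain in rain_series:
        depth += effective_runoff(rain)         # load step volume onto the nodes
        depth *= 0.9                            # stand-in for routing/outflow
    print("final water depths (m):", depth.round(4))
    ```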

  1. The determination of risk areas for muddy floods based on a worst-case erosion modelling

    Science.gov (United States)

    Saathoff, Ulfert; Schindewolf, Marcus; Annika Arévalo, Sarah

    2013-04-01

    Soil erosion and muddy floods are a frequently occurring hazard in the German state of Saxony because of the topography, the high relief energy and the high proportion of arable land. Still, the events are rather heterogeneously distributed, and we do not know where damage is likely to occur. The goal of this study is to locate hot spots of muddy flood risk, with the objective of preventing high economic damage in the future. We applied a soil erosion and deposition map of Saxony calculated with the process-based soil erosion model EROSION 3D. This map shows the potential soil erosion and transported sediment for worst-case soil conditions and a 10-year rain storm event. Furthermore, a map of the current land use in the state is used. From the land use map, we extracted those areas that are especially vulnerable to muddy floods, such as residential and industrial areas, infrastructural facilities (e.g. power plants, hospitals) and highways. In combination with the output of the soil erosion model, the amount of sediment that enters each land use entity is calculated. Based on these data, a state-wide map with classified risks is created. The results are furthermore used to identify the risk of muddy floods for each municipality in Saxony. The results are evaluated against data from muddy flood events with documented locations that actually occurred during the period between 2000 and 2010. Additionally, plausibility tests were performed for selected areas (examination of land use, topography and soil). The results prove to be plausible, and most of the documented events can be explained by the modelled risk map. The created map can be used by different institutions, such as city and traffic planners, to estimate the risk of muddy flood occurrence at specific locations. Furthermore, the risk map can serve insurance companies in evaluating the insurance risk of a building. To make them easily accessible, the risk map will be published online via a web GIS.

  2. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST-funded research project CORAS, in which Institutt for energiteknikk takes part as responsible for the work package on risk analysis. The main objective of the CORAS project is to develop a framework to support risk assessment of security-critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to perform a risk assessment, one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identifying and assessing security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  3. Risk Management Technologies With Logic and Probabilistic Models

    CERN Document Server

    Solozhentsev, E D

    2012-01-01

    This book presents intellectual, innovative information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. The volume describes the following components of risk management technologies: LP-calculus; classes of LP-models of risk and efficiency; procedures for the different classes; special software for the different classes; examples of applications; and methods for estimating the probabilities of events based on expert information. Also described are a variety of training courses in these topics. The classes of risk models treated here are: LP-modeling, LP-classification, LP-efficiency, and LP-forecasting. Particular attention is paid to LP-models of the risk of failure to resolve difficult economic and technical problems. Amongst the discussed procedures of I3-technologies are the construction of LP-models,...

  4. Science-based risk assessments for rare events in a changing climate

    Science.gov (United States)

    Sobel, A. H.; Tippett, M. K.; Camargo, S. J.; Lee, C. Y.; Allen, J. T.

    2014-12-01

    History shows that substantial investments in protection against any specific type of natural disaster usually occur only after (usually shortly after) that specific type of disaster has happened in a given place. This is true even when it was well known before the event that there was a significant risk that it could occur. Presumably what psychologists Kahneman and Tversky have called "availability bias" is responsible, at least in part, for these failures to act on known but out-of-sample risks. While understandable, this human tendency prepares us poorly for events which are very rare (on the time scales of human lives) and even more poorly for a changing climate, as historical records become a poorer guide. A more forward-thinking and rational approach would require scientific risk assessments that can place meaningful probabilities on events that are rare enough to be absent from the historical record, and that can account for the influences of both anthropogenic climate change and low-frequency natural climate variability. The set of tools available for doing such risk assessments is still quite limited, particularly for some of the most extreme events such as tropical cyclones and tornadoes. We will briefly assess the state of the art for these events in particular, and describe some of our ongoing research to develop new tools for quantitative risk assessment using hybrids of statistical methods and physical understanding of the hazards.

  5. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    Full Text Available In order to meet commercial banks' objectives of liquidity, safety and profitability, loan portfolio optimization decisions based on risk analysis are needed for the rational allocation of assets. Risk analysis and asset allocation are key technologies of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. A decision model that optimizes the loan portfolio rate of return under Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) constraints reflects the bank's risk tolerance and gives the bank direct control over potential losses. The paper analyzes a general risk management model applied to portfolio problems with VaR and CVaR risk measures using the Lagrangian algorithm, and solves this highly difficult problem by matrix operations. With this formulation it is easy to see that the mean-risk frontier of the portfolio problem with VaR and CVaR risk measures is a hyperbola in mean-standard deviation space, and the proposed method is easy to calculate.
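
    The abstract solves the problem analytically via Lagrangian/matrix methods. As a numerical counterpart, the sketch below minimizes portfolio CVaR over return scenarios using the standard Rockafellar-Uryasev linearization; the scenario data, asset classes and target return are invented for illustration.

    ```python
    # CVaR-minimizing loan/asset mix via the Rockafellar-Uryasev LP.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    S, n = 500, 4                    # scenarios, loan classes (hypothetical)
    r = rng.normal([0.05, 0.07, 0.10, 0.12], [0.02, 0.04, 0.08, 0.12], (S, n))
    alpha, target = 0.95, 0.07       # CVaR level and required mean return

    # Decision vector x = [w_1..w_n, zeta, u_1..u_S]; loss_s = -r_s . w
    c = np.r_[np.zeros(n), 1.0, np.full(S, 1.0 / ((1 - alpha) * S))]
    A_ub = np.zeros((S + 1, n + 1 + S))
    A_ub[:S, :n] = -r                # -r_s.w - zeta - u_s <= 0
    A_ub[:S, n] = -1.0
    A_ub[:S, n + 1:] = -np.eye(S)
    A_ub[S, :n] = -r.mean(axis=0)    # mean return >= target
    b_ub = np.r_[np.zeros(S), -target]
    A_eq = np.zeros((1, n + 1 + S)); A_eq[0, :n] = 1.0  # weights sum to 1
    bounds = [(0, 1)] * n + [(None, None)] + [(0, None)] * S

    res = linprog(c, A_ub, b_ub, A_eq, [1.0], bounds=bounds, method="highs")
    print("weights:", res.x[:n].round(3), " CVaR:", round(res.fun, 4))
    ```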

  6. Event-based model diagnosis of rainfall-runoff model structures

    International Nuclear Information System (INIS)

    Stanzel, P.

    2012-01-01

    The objective of this research is a comparative evaluation of different rainfall-runoff model structures. Comparative model diagnostics facilitate the assessment of the strengths and weaknesses of each model. The application of multiple models allows an analysis of the simulation uncertainties arising from the selection of model structure, as compared with the effects of uncertain parameters and precipitation input. Four different model structures, including conceptual and physically based approaches, are compared. In addition to runoff simulations, results for soil moisture and for the runoff components of overland flow, interflow and base flow are analysed. Catchment runoff is simulated satisfactorily by all four model structures and shows only minor differences. Systematic deviations from runoff observations provide insight into model structural deficiencies. While physically based model structures capture some single runoff events better, they do not generally outperform conceptual model structures. Contributions to uncertainty in runoff simulations stemming from the choice of model structure are of similar magnitude to those arising from parameter selection and the representation of precipitation input. Variations in precipitation mainly affect the general level and peaks of runoff, while different model structures lead to different simulated runoff dynamics. Large differences between the four analysed models are detected in simulations of soil moisture and, even more pronounced, of runoff components. Soil moisture changes are more dynamic in the physically based model structures, which is in better agreement with observations. Streamflow contributions of overland flow are considerably lower in these models than in the more conceptual approaches. Observations of runoff components are rarely made and are not available in this study, but are shown to have high potential for an effective selection of appropriate model structures. (author)

  7. A point-based prediction model for cardiovascular risk in orthotopic liver transplantation: The CAR-OLT score.

    Science.gov (United States)

    VanWagner, Lisa B; Ning, Hongyan; Whitsett, Maureen; Levitsky, Josh; Uttal, Sarah; Wilkins, John T; Abecassis, Michael M; Ladner, Daniela P; Skaro, Anton I; Lloyd-Jones, Donald M

    2017-12-01

    Cardiovascular disease (CVD) complications are important causes of morbidity and mortality after orthotopic liver transplantation (OLT). There is currently no preoperative risk-assessment tool that allows physicians to estimate the risk for CVD events following OLT. We sought to develop a point-based prediction model (risk score) for CVD complications after OLT, the Cardiovascular Risk in Orthotopic Liver Transplantation risk score, among a cohort of 1,024 consecutive patients aged 18-75 years who underwent first OLT in a tertiary-care teaching hospital (2002-2011). The main outcome measures were major 1-year CVD complications, defined as death from a CVD cause or hospitalization for a major CVD event (myocardial infarction, revascularization, heart failure, atrial fibrillation, cardiac arrest, pulmonary embolism, and/or stroke). The bootstrap method yielded bias-corrected 95% confidence intervals for the regression coefficients of the final model. Among 1,024 first OLT recipients, major CVD complications occurred in 329 (32.1%). Variables selected for inclusion in the model (using model optimization strategies) included preoperative recipient age, sex, race, employment status, education status, history of hepatocellular carcinoma, diabetes, heart failure, atrial fibrillation, pulmonary or systemic hypertension, and respiratory failure. The discriminative performance of the point-based score (C statistic = 0.78, bias-corrected C statistic = 0.77) was superior to other published risk models for postoperative CVD morbidity and mortality, and it had appropriate calibration (Hosmer-Lemeshow P = 0.33). The point-based risk score can identify patients at risk for CVD complications after OLT surgery (available at www.carolt.us); this score may be useful for identification of candidates for further risk stratification or other management strategies to improve CVD outcomes after OLT. (Hepatology 2017;66:1968-1979). © 2017 by the American Association for the Study of Liver

  8. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Science.gov (United States)

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy. Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  9. Bayesian Analysis for Risk Assessment of Selected Medical Events in Support of the Integrated Medical Model Effort

    Science.gov (United States)

    Gilkey, Kelly M.; Myers, Jerry G.; McRae, Michael P.; Griffin, Elise A.; Kallrui, Aditya S.

    2012-01-01

    The Exploration Medical Capability project is creating a catalog of risk assessments using the Integrated Medical Model (IMM). The IMM is a software-based system intended to assist mission planners in preparing for spaceflight missions by helping them to make informed decisions about medical preparations and supplies needed for combating and treating various medical events using Probabilistic Risk Assessment. The objective is to use statistical analyses to inform the IMM decision tool with estimated probabilities of medical events occurring during an exploration mission. Because data regarding astronaut health are limited, Bayesian statistical analysis is used. Bayesian inference combines prior knowledge, such as data from the general U.S. population, the U.S. Submarine Force, or the analog astronaut population located at the NASA Johnson Space Center, with observed data for the medical condition of interest. The posterior results reflect the best evidence for specific medical events occurring in flight. Bayes theorem provides a formal mechanism for combining available observed data with data from similar studies to support the quantification process. The IMM team performed Bayesian updates on the following medical events: angina, appendicitis, atrial fibrillation, atrial flutter, dental abscess, dental caries, dental periodontal disease, gallstone disease, herpes zoster, renal stones, seizure, and stroke.
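
    A hedged sketch of the Bayesian update pattern described above: a prior for an in-flight medical event probability is combined with observed analog data via a conjugate Beta-Binomial model. All counts and the prior are invented; they are not the IMM's actual inputs.

    ```python
    # Beta-Binomial posterior for a medical event probability.
    from scipy import stats

    # Prior belief, e.g. encoded from general-population incidence:
    a0, b0 = 2.0, 198.0             # prior mean ~ 1% event probability

    # Hypothetical analog-population observations (e.g. submariners):
    events, trials = 3, 400

    a_post, b_post = a0 + events, b0 + trials - events
    posterior = stats.beta(a_post, b_post)
    print(f"posterior mean = {posterior.mean():.4f}")
    print("95% credible interval =",
          [round(q, 4) for q in posterior.interval(0.95)])
    ```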

  10. Risk prediction models for major adverse cardiac event (MACE) following percutaneous coronary intervention (PCI): A review

    Science.gov (United States)

    Manan, Norhafizah A.; Abidin, Basir

    2015-02-01

    Five percent of patients who underwent Percutaneous Coronary Intervention (PCI) experienced Major Adverse Cardiac Events (MACE) after the PCI procedure. Risk prediction of MACE following a PCI procedure is therefore helpful. This work describes a review of such prediction models currently in use. A literature search was done on the PubMed and SCOPUS databases. Thirty publications were found, but only 4 studies were chosen, based on the data used, design, and outcome of each study. Particular emphasis was given to, and comments made on, the study design, population, sample size, modeling method, predictors, outcomes, discrimination and calibration of each model. All the models had acceptable discrimination ability (C-statistic >0.7) and good calibration (Hosmer-Lemeshow P-value >0.05). The most common modeling method was multivariate logistic regression, and the most popular predictor was age.

  11. The cardiovascular event reduction tool (CERT)--a simplified cardiac risk prediction model developed from the West of Scotland Coronary Prevention Study (WOSCOPS).

    Science.gov (United States)

    L'Italien, G; Ford, I; Norrie, J; LaPuerta, P; Ehreth, J; Jackson, J; Shepherd, J

    2000-03-15

    The clinical decision to treat hypercholesterolemia is premised on an awareness of patient risk, and cardiac risk prediction models offer a practical means of determining such risk. However, these models are based on observational cohorts in which estimates of the treatment benefit are largely inferred. The West of Scotland Coronary Prevention Study (WOSCOPS) provides an opportunity to develop a risk-benefit prediction model from the primary event reduction actually observed in the trial. Five-year Cox model risk estimates were derived from all WOSCOPS subjects (n = 6,595 men, aged 45 to 64 years at baseline) using factors previously shown to be predictive of definite fatal coronary heart disease or nonfatal myocardial infarction. Model risk factors included age, diastolic blood pressure, total cholesterol/high-density lipoprotein ratio (TC/HDL), current smoking, diabetes, family history of fatal coronary heart disease, nitrate use or angina, and treatment (placebo/40-mg pravastatin). All risk factors were expressed as categorical variables to facilitate risk assessment. Risk estimates were incorporated into a simple, hand-held slide rule or risk tool. Risk estimates were identified for 5-year age bands (45 to 65 years), 4 categories of TC/HDL ratio (the highest being > or = 7.5), 2 levels of diastolic blood pressure (< 90 and > or = 90 mm Hg), from 0 to 3 additional risk factors (current smoking, diabetes, family history of premature fatal coronary heart disease, nitrate use or angina), and pravastatin treatment. Five-year risk estimates ranged from 2% in very low-risk subjects to 61% in very high-risk subjects. Risk reduction due to pravastatin treatment averaged 31%. Thus, the Cardiovascular Event Reduction Tool (CERT) is a risk prediction model derived from the WOSCOPS trial. Its use will help physicians identify patients who will benefit from cholesterol reduction.
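
    A Cox model becomes a usable risk tool through the relation risk(t) = 1 - S0(t)^exp(beta . x). The sketch below applies it with invented coefficients and baseline survival; the published CERT values live on the slide rule itself and are not reproduced here.

    ```python
    # Converting Cox regression output into a 5-year risk estimate:
    # risk = 1 - S0(5y) ** exp(beta . x). All numbers are placeholders.
    import math

    S0_5y = 0.96                    # hypothetical baseline 5-year survival
    beta = {"age_band": 0.30, "high_tc_hdl": 0.45, "dbp_ge_90": 0.20,
            "extra_risk_factors": 0.35, "pravastatin": -0.37}  # ~31% reduction

    def five_year_risk(x):
        lp = sum(beta[k] * v for k, v in x.items())   # linear predictor
        return 1.0 - S0_5y ** math.exp(lp)

    patient = {"age_band": 2, "high_tc_hdl": 1, "dbp_ge_90": 1,
               "extra_risk_factors": 1, "pravastatin": 0}
    print(f"5-year risk untreated:      {five_year_risk(patient):.1%}")
    patient["pravastatin"] = 1
    print(f"5-year risk on pravastatin: {five_year_risk(patient):.1%}")
    ```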

  12. Event-based rainfall-runoff modelling of the Kelantan River Basin

    Science.gov (United States)

    Basarudin, Z.; Adnan, N. A.; Latif, A. R. A.; Tahir, W.; Syafiqah, N.

    2014-02-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists, there are many causes that contribute to flood events. The two most dominant factors are meteorology (i.e. climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal catchments such as those in Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on hydrological model results for the Kelantan River catchment. Two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events of 2008 and 2004 were compared based on the rainfall data for both years. The events were modelled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated in the study, because the study only tries to quantify the rainfall changes between these two events when simulating the discharge and runoff values. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study demonstrates that the change in rainfall has a significant impact on the peak discharge and runoff depth in the study area.

  13. Event-based rainfall-runoff modelling of the Kelantan River Basin

    International Nuclear Information System (INIS)

    Basarudin, Z; Adnan, N A; Latif, A R A; Syafiqah, N; Tahir, W

    2014-01-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists, there are many causes that contribute to flood events. The two most dominant factors are meteorology (i.e. climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal catchments such as those in Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on hydrological model results for the Kelantan River catchment. Two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events of 2008 and 2004 were compared based on the rainfall data for both years. The events were modelled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated in the study, because the study only tries to quantify the rainfall changes between these two events when simulating the discharge and runoff values. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study demonstrates that the change in rainfall has a significant impact on the peak discharge and runoff depth in the study area.
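
    For readers unfamiliar with event-based loss models of the kind HEC-HMS offers, the SCS Curve Number method is a compact example of how an event rainfall depth is turned into direct runoff. The CN and rainfall values below are illustrative only, not calibrated Kelantan parameters.

    ```python
    # SCS Curve Number event runoff (one of the loss methods available in
    # HEC-HMS). P and CN values are invented for illustration.
    def scs_runoff_mm(P_mm: float, CN: float) -> float:
        """Direct runoff depth Q for event rainfall P, given a curve number."""
        S = 25400.0 / CN - 254.0          # potential retention [mm]
        Ia = 0.2 * S                      # initial abstraction
        if P_mm <= Ia:
            return 0.0
        return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

    for P in (40.0, 120.0, 250.0):        # e.g. monsoon event depths [mm]
        print(f"P = {P:5.1f} mm -> Q = {scs_runoff_mm(P, CN=75):6.1f} mm")
    ```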

  14. A comparative study of machine learning methods for time-to-event survival data for radiomics risk modelling.

    Science.gov (United States)

    Leger, Stefan; Zwanenburg, Alex; Pilz, Karoline; Lohaus, Fabian; Linge, Annett; Zöphel, Klaus; Kotzerke, Jörg; Schreiber, Andreas; Tinhofer, Inge; Budach, Volker; Sak, Ali; Stuschke, Martin; Balermpas, Panagiotis; Rödel, Claus; Ganswindt, Ute; Belka, Claus; Pigorsch, Steffi; Combs, Stephanie E; Mönnich, David; Zips, Daniel; Krause, Mechthild; Baumann, Michael; Troost, Esther G C; Löck, Steffen; Richter, Christian

    2017-10-16

    Radiomics applies machine learning algorithms to quantitative imaging data to characterise the tumour phenotype and predict clinical outcome. For the development of radiomics risk models, a variety of different algorithms is available and it is not clear which one gives optimal results. Therefore, we assessed the performance of 11 machine learning algorithms combined with 12 feature selection methods by the concordance index (C-Index), to predict loco-regional tumour control (LRC) and overall survival for patients with head and neck squamous cell carcinoma. The considered algorithms are able to deal with continuous time-to-event survival data. Feature selection and model building were performed on a multicentre cohort (213 patients) and validated using an independent cohort (80 patients). We found several combinations of machine learning algorithms and feature selection methods which achieve similar results, e.g. C-Index = 0.71 and BT-COX: C-Index = 0.70 in combination with Spearman feature selection. Using the best performing models, patients were stratified into groups of low and high risk of recurrence. Significant differences in LRC were obtained between both groups on the validation cohort. Based on the presented analysis, we identified a subset of algorithms which should be considered in future radiomics studies to develop stable and clinically relevant predictive models for time-to-event endpoints.
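
    The concordance index used to compare the algorithms can be computed directly. The sketch below implements the standard censoring-aware pairwise definition (Harrell's C) on toy data; it is a minimal reference implementation, not the study's evaluation code.

    ```python
    # Harrell's concordance index for right-censored time-to-event data:
    # among comparable pairs, the fraction where the higher predicted risk
    # corresponds to the earlier observed event (ties count one half).
    import numpy as np

    def c_index(time, event, risk):
        num, den = 0.0, 0.0
        n = len(time)
        for i in range(n):
            for j in range(n):
                # pair is comparable if i has the earlier time and an event
                if event[i] and time[i] < time[j]:
                    den += 1
                    if risk[i] > risk[j]:
                        num += 1
                    elif risk[i] == risk[j]:
                        num += 0.5
        return num / den

    time = np.array([5, 8, 12, 20, 25])
    event = np.array([1, 1, 0, 1, 0])      # 0 = censored
    risk = np.array([2.1, 1.5, 1.0, 0.7, 0.2])
    print(f"C-index = {c_index(time, event, risk):.2f}")
    ```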

  15. A Hybrid Methodology for Modeling Risk of Adverse Events in Complex Health-Care Settings.

    Science.gov (United States)

    Kazemi, Reza; Mosleh, Ali; Dierks, Meghan

    2017-03-01

    In spite of increased attention to quality and efforts to provide safe medical care, adverse events (AEs) are still frequent in clinical practice. Reports from various sources indicate that a substantial number of hospitalized patients suffer treatment-caused injuries while in the hospital. While risk cannot be entirely eliminated from health-care activities, an important goal is to develop effective and durable mitigation strategies to render the system "safer." In order to do this, though, we must develop models that comprehensively and realistically characterize the risk. In the health-care domain, this can be extremely challenging due to the wide variability in the way that health-care processes and interventions are executed and also due to the dynamic nature of risk in this particular domain. In this study, we have developed a generic methodology for evaluating dynamic changes in AE risk in acute care hospitals as a function of organizational and nonorganizational factors, using a combination of modeling formalisms. First, a system dynamics (SD) framework is used to demonstrate how organizational-level and policy-level contributions to risk evolve over time, and how policies and decisions may affect the general system-level contribution to AE risk. It also captures the feedback of organizational factors and decisions over time and the nonlinearities in these feedback effects. SD is a popular approach to understanding the behavior of complex social and economic systems. It is a simulation-based, differential equation modeling tool that is widely used in situations where the formal model is complex and an analytical solution is very difficult to obtain. Second, a Bayesian belief network (BBN) framework is used to represent patient-level factors and also physician-level decisions and factors in the management of an individual patient, which contribute to the risk of hospital-acquired AE. BBNs are networks of probabilities that can capture probabilistic relations

  16. A data-based model to locate mass movements triggered by seismic events in Sichuan, China.

    Science.gov (United States)

    de Souza, Fabio Teodoro

    2014-01-01

    Earthquakes affect the entire world and have catastrophic consequences. On May 12, 2008, an earthquake of magnitude 7.9 on the Richter scale occurred in the Wenchuan area of Sichuan province in China. This event, together with subsequent aftershocks, caused many avalanches, landslides, debris flows, collapses, and quake lakes, and induced numerous unstable slopes. This work proposes a methodology that uses a data mining approach and geographic information systems to predict these mass movements based on their association with the mainshock and aftershock epicenters, geologic faults, riverbeds, and topography. A dataset comprising 3,883 mass movements is analyzed, and models to predict the locations of these mass movements are developed. These predictive models could be used by the Chinese authorities as an important tool for identifying risk areas and rescuing survivors during similar events in the future.

  17. Risk evaluation system for operational events and inspection findings

    International Nuclear Information System (INIS)

    Lopez G, A.; Godinez S, V.; Lopez M, R.

    2010-10-01

    The Mexican Nuclear Regulatory Commission has developed an adaptation of the US NRC Significance Determination Process (SDP) to evaluate the risk significance of operational events and inspection findings at the Laguna Verde nuclear power plant. The Mexican Nuclear Regulatory Commission developed a plant-specific flow chart for preliminary screening instead of the open questionnaire used by the US NRC SDP, with the aim of improving the accuracy of the screening process. Also, the work sheets and supporting information tables required by the SDP were built into an Excel application which allows the risk evaluation to be performed automatically, focusing the regulatory staff's efforts on the risk significance analysis instead of on risk calculation tasks. In order to construct this tool, a simplified PRA model was developed and validated against the individual plant examination model. This paper describes the Mexican Nuclear Regulatory Commission's process and some risk evaluations of events performed using the Risk Evaluation System for Operational Events and Inspection Findings (SERHE, by its acronym in Spanish). (Author)

  18. Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island

    Science.gov (United States)

    Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark

    2015-04-01

    Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas combined with climate change and poor governance can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They, therefore, require holistic approaches to help understand their complexity in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Being located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualize socio-technical systems with an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and

  19. Recurrence models of volcanic events: Applications to volcanic risk assessment

    International Nuclear Information System (INIS)

    Crowe, B.M.; Picard, R.; Valentine, G.; Perry, F.V.

    1992-01-01

    An assessment of the risk of future volcanism has been conducted for the isolation of high-level radioactive waste at the potential Yucca Mountain site in southern Nevada. Risk used in this context refers to a combined assessment of the probability and consequences of future volcanic activity. Past studies established bounds on the probability of magmatic disruption of a repository. These bounds were revised as additional data were gathered from site characterization studies. The probability of direct intersection of a potential repository located in an eight-km² area of Yucca Mountain by ascending basalt magma was bounded by the range of 10⁻⁸ to 10⁻¹⁰ yr⁻¹. The consequences of magmatic disruption of a repository were estimated in previous studies to be limited. The exact releases from such an event depend on the strike of an intruding basalt dike relative to the repository geometry, the timing of the basaltic event relative to the age of the radioactive waste, and the mechanisms of release and dispersal of the waste radionuclides in the accessible environment. The combined low probability of repository disruption and the limited releases associated with this event established the basis for the judgement that the risk of future volcanism was relatively low. It was reasoned that the risk of future volcanism was not likely to result in disqualification of the potential Yucca Mountain site.
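
    The quoted annual probability bounds translate directly into lifetime disruption probabilities under a Poisson recurrence assumption. The sketch below simply evaluates 1 - exp(-lambda*T) at the stated bounds; the 10,000-year performance period is an illustrative assumption, not a figure from the abstract.

    ```python
    # Probability of at least one magmatic disruption over a performance
    # period, assuming Poisson recurrence at the bounding annual rates.
    import math

    T = 10_000.0                        # assumed performance period [years]
    for lam in (1e-8, 1e-10):           # bounding annual disruption rates
        p = 1.0 - math.exp(-lam * T)
        print(f"rate {lam:.0e}/yr over {T:.0f} yr -> P(disruption) ~ {p:.2e}")
    ```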

  20. A Risk Assessment System with Automatic Extraction of Event Types

    Science.gov (United States)

    Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula

    In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting weak signals of emerging risks as early as possible, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operating on a template graph whose leaves are event types. EventSpotter is based on a general-purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.

  1. Modelling and Simulating of Risk Behaviours in Virtual Environments Based on Multi-Agent and Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Linqin Cai

    2013-11-01

    Full Text Available Due to safety and ethical issues, traditional experimental approaches to modelling underground risk behaviours can be costly, dangerous and even impossible to realize. Based on multi-agent technology, a virtual coalmine platform for risk behaviour simulation is presented to model and simulate the human-machine-environment risk factors in underground coalmines. To reveal mine workers' risk behaviours, a fuzzy emotional behaviour model is proposed to simulate underground miners' responses to potentially hazardous events, based on cognitive appraisal theories and fuzzy logic techniques. The proposed emotion model can generate more believable behaviours for virtual miners according to personalized emotion states, internal motivation needs and behaviour selection thresholds. Finally, typical accident cases of underground hazard spotting and locomotive transport were implemented. The behavioural believability of the virtual miners was evaluated with a user assessment method. Experimental results show that the proposed models can create more realistic and reasonable behaviours in virtual coalmine environments, which can improve miners' risk awareness and further train miners' emergency decision-making ability when facing unexpected underground situations.

  2. Modelling of risk events with uncertain likelihoods and impacts in large infrastructure projects

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2010-01-01

    This paper presents contributions to the mathematical core of risk and uncertainty management in compliance with the principles of New Budgeting, laid out in 2008 by the Danish Ministry of Transport to be used in large infrastructure projects. Basically, the new principles are proposed in order to prevent future budget overruns. One of the central ideas is to introduce improved risk management processes, and the present paper addresses this particular issue. A relevant cost function in terms of unit prices and quantities is developed, and an event impact matrix with uncertain impacts from independent uncertain risk events is used to calculate the total uncertain risk budget. Cost impacts from the individual risk events on the individual project activities are kept precise track of in order to comply with the requirements of New Budgeting. Additionally, uncertain likelihoods for the occurrence of risk
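
    A Monte Carlo version of such an uncertain risk budget is easy to sketch: independent risk events with uncertain likelihoods and uncertain cost impacts are sampled and summed. All probabilities and impact intervals below are invented for illustration, not figures from the paper.

    ```python
    # Monte Carlo total risk budget from independent uncertain risk events.
    import numpy as np

    rng = np.random.default_rng(7)
    n_sim = 100_000
    # (probability interval, impact interval in million DKK) per risk event
    events = [((0.10, 0.30), (5.0, 20.0)),
              ((0.05, 0.15), (10.0, 50.0)),
              ((0.30, 0.60), (1.0, 5.0))]

    total = np.zeros(n_sim)
    for (p_lo, p_hi), (c_lo, c_hi) in events:
        p = rng.uniform(p_lo, p_hi, n_sim)          # uncertain likelihood
        occurs = rng.random(n_sim) < p
        impact = rng.uniform(c_lo, c_hi, n_sim)     # uncertain impact
        total += occurs * impact

    print(f"mean risk budget = {total.mean():.1f} MDKK, "
          f"90th percentile = {np.percentile(total, 90):.1f} MDKK")
    ```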

  3. Identification of the timing-of-events model with multiple competing exit risks from single-spell data

    DEFF Research Database (Denmark)

    Drepper, Bettina; Effraimidis, G.

    2016-01-01

    The identification result for the timing-of-events model (Abbring and Van den Berg, 2003b) is extended to a model with several competing exit risk equations. This extension allows one, for example, to simultaneously identify the different effects a benefit sanction has on the rate of finding work and leaving t

  4. A Basis Function Approach to Simulate Storm Surge Events for Coastal Flood Risk Assessment

    Science.gov (United States)

    Wu, Wenyan; Westra, Seth; Leonard, Michael

    2017-04-01

    Storm surge is a significant contributor to flooding in coastal and estuarine regions, especially when it coincides with other flood-producing mechanisms, such as extreme rainfall. Therefore, storm surge has long been a research focus in coastal flood risk assessment. Numerical models have often been developed to understand storm surge events for risk assessment (Kumagai et al. 2016; Li et al. 2016; Zhang et al. 2016; Bastidas et al. 2016; Bilskie et al. 2016; Dalledonne and Mayerle 2016; Haigh et al. 2014; Kodaira et al. 2016; Lapetina and Sheng 2015), and to assess how these events may change or evolve in the future (Izuru et al. 2015; Oey and Chou 2016). However, numerical models often require a large amount of input information, and difficulties arise when insufficient data are available (Madsen et al. 2015). Alternatively, statistical methods have been used to forecast storm surge based on historical data (Hashemi et al. 2016; Kim et al. 2016) or to examine long-term trends in storm surge events, especially under climate change (Balaguru et al. 2016; Oh et al. 2016; Rueda et al. 2016). In these studies, often only the peak of surge events is used, which results in the loss of dynamic information within a tidal cycle or surge event (i.e. the time series of storm surge values). In this study, we propose an alternative basis function (BF) based approach to examine the different attributes (e.g. peak and duration) of storm surge events using historical data. Two simple two-parameter BFs were used: the exponential function and the triangular function. High-quality hourly storm surge records from 15 tide gauges around Australia were examined. Significant locational and seasonal variability was found in the peak and duration of storm surge events, which provides additional insight into coastal flood risk. In addition, the simple form of these BFs allows fast simulation of storm surge events and minimises the complexity of joint probability
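
    The two named BF families can be written down with a peak parameter and a duration parameter. The exact parameterization used in the study is not given in the abstract, so the shapes and values below are plausible assumptions for illustration only.

    ```python
    # Two-parameter basis functions for a surge event: peak height `a`
    # and duration `d`. Shapes are assumed forms of the exponential and
    # triangular families named above.
    import numpy as np

    def triangular_bf(t, a, d):
        """Symmetric triangle: rises to peak a at d/2, back to 0 at d."""
        return np.where((t >= 0) & (t <= d),
                        a * (1.0 - np.abs(2.0 * t / d - 1.0)), 0.0)

    def exponential_bf(t, a, d):
        """Exponential rise-and-decay, peaking at a at time d/2."""
        return a * np.exp(-np.abs(t - d / 2.0) / (d / 4.0))

    t = np.arange(0.0, 48.0, 1.0)                 # hours
    surge_tri = triangular_bf(t, a=0.8, d=36.0)   # 0.8 m peak, 36 h event
    surge_exp = exponential_bf(t, a=0.8, d=36.0)
    print(f"triangular peak {surge_tri.max():.2f} m, "
          f"exponential peak {surge_exp.max():.2f} m")
    ```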

  5. Solar Energetic Particle Event Risks for Future Human Missions within the Inner Heliosphere

    Science.gov (United States)

    Over, S.; Ford, J.

    2017-12-01

    As astronauts travel beyond low-Earth orbit (LEO), space weather research will play a key role in determining the risks from space radiation. Of interest are the rare, large solar energetic particle (SEP) events that can cause significant medical effects during flight. Historical SEP data from the Geostationary Operational Environmental Satellites (GOES) program were analyzed, covering the period 1986 to 2016. The SEP event data were combined with a Monte Carlo approach to develop a risk model that determines maximum expected doses for missions within the inner heliosphere. Presented here are results from risk assessments for proposed Mars transits compared with a geostationary Earth-bound mission. Overall, the greatest risk was for the return from Mars with a Venus swing-by, due to the additional transit length and decreased distance from the Sun compared with traditional Hohmann transfers. The overall results do not indicate that the effects of SEP events alone would prohibit these missions based on current radiation limits, but the combination of doses from SEP events and galactic cosmic radiation may be significant and should be considered in all phases of mission design.
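
    The Monte Carlo pattern described above can be sketched compactly: Poisson event counts at a catalog-derived rate, with a heavy-tailed per-event dose. The rate, dose distribution and inverse-square radial scaling below are all assumptions for illustration, not values from the study.

    ```python
    # Monte Carlo sketch of mission SEP dose (all parameters assumed).
    import numpy as np

    rng = np.random.default_rng(1)
    n_sim = 50_000
    rate_per_yr = 8.0                  # assumed large-SEP-event rate
    mission_yr = 0.7                   # e.g. one transit leg
    r_AU = 0.85                        # mean Sun distance (swing-by leg)

    n_events = rng.poisson(rate_per_yr * mission_yr, n_sim)
    doses = np.zeros(n_sim)
    for i, k in enumerate(n_events):
        if k:
            per_event = rng.lognormal(mean=-1.0, sigma=1.2, size=k)  # [cSv]
            doses[i] = per_event.sum() / r_AU**2   # assumed 1/r^2 scaling

    print(f"mean mission SEP dose ~ {doses.mean():.2f} cSv; "
          f"99th percentile ~ {np.percentile(doses, 99):.2f} cSv")
    ```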

  6. Predictive Accuracy of a Cardiovascular Disease Risk Prediction Model in Rural South India – A Community Based Retrospective Cohort Study

    Directory of Open Access Journals (Sweden)

    Farah N Fathima

    2015-03-01

    Full Text Available Background: Identification of individuals at risk of developing cardiovascular diseases through risk stratification is the first step in primary prevention. Aims & Objectives: To assess the five-year risk of developing a cardiovascular event from retrospective data, and to assess the predictive accuracy of the non-laboratory-based National Health and Nutrition Examination Survey (NHANES) risk prediction model among individuals in a rural South Indian population. Materials & Methods: A community-based retrospective cohort study was conducted in three villages, where risk stratification was done for all eligible adults aged between 35 and 74 years at the time of initial assessment using the NHANES risk prediction charts. Household visits were made after a period of five years by trained doctors to determine cardiovascular outcomes. Results: 521 people fulfilled the eligibility criteria, of whom 486 (93.3%) could be traced after five years. 56.8% were in the low-risk, 36.6% in the moderate-risk and 6.6% in the high-risk categories. 29 persons (5.97%) had cardiovascular events over the five years, of which 24 events (82.7%) were nonfatal and five (17.25%) were fatal. The mean age of the people who developed cardiovascular events was 57.24 ± 9.09 years. The odds ratios for the three levels of risk showed a linear trend, the odds ratios for the moderate-risk and high-risk categories being 1.35 and 1.94, respectively, with the low-risk category as baseline. Conclusion: The non-laboratory-based NHANES charts did not accurately predict the occurrence of cardiovascular events in any of the risk categories.

  7. A Knowledge-Based Model of Audit Risk

    OpenAIRE

    Dhar, Vasant; Lewis, Barry; Peters, James

    1988-01-01

    Within the academic and professional auditing communities, there has been growing concern about how to accurately assess the various risks associated with performing an audit. These risks are difficult to conceptualize in terms of numeric estimates. This article discusses the development of a prototype computational model (computer program) that assesses one of the major audit risks -- inherent risk. This program bases most of its inferencing activities on a qualitative model of a typical bus...

  8. Incremental value of a genetic risk score for the prediction of new vascular events in patients with clinically manifest vascular disease.

    Science.gov (United States)

    Weijmans, Maaike; de Bakker, Paul I W; van der Graaf, Yolanda; Asselbergs, Folkert W; Algra, Ale; Jan de Borst, Gert; Spiering, Wilko; Visseren, Frank L J

    2015-04-01

    Several genetic markers are related to the incidence of cardiovascular events. We evaluated whether a genetic risk score (GRS) based on 30 single-nucleotide polymorphisms associated with coronary artery disease (CAD) can improve the prediction of 10-year risk of new cardiovascular events in patients with clinically manifest vascular disease. In 5742 patients with symptomatic vascular disease enrolled in the SMART study, we developed Cox regression models based on the SMART Risk Score (SRS) and on the SRS plus the GRS, in all patients, in patients with a history of acute arterial thrombotic events, and in patients with a history of more stable atherosclerosis and without CAD. The discriminatory ability was expressed by the c-statistic. Model calibration was evaluated by calibration plots. The incremental value of adding the GRS was assessed by the net reclassification index (NRI) and decision curve analysis. During a median follow-up of 6.5 years (IQR 4.0-9.5), the composite outcome of myocardial infarction, stroke, or vascular death occurred in 933 patients. Hazard ratios of the GRS ranging from 0.86 to 1.35 were observed. The discriminatory capacity of the SRS for the prediction of 10-year risk of cardiovascular events was fairly good (c-statistic 0.70, 95%CI 0.68-0.72), and similar for the model based on the SRS plus the GRS. Calibration of the models based on the SRS and the SRS plus GRS was adequate. No increase in c-statistics, categorical NRIs or decision curves was observed when adding the GRS. The continuous NRI improved only in patients with stable atherosclerosis (0.14, 95%CI 0.03-0.25), increasing further when excluding patients with a history of CAD (0.21, 95%CI 0.06-0.36). In patients with symptomatic vascular disease, a GRS did not improve prediction of 10-year risk of cardiovascular events beyond clinical characteristics. The GRS might improve risk prediction of first vascular events in the subgroup of patients with a history of stable atherosclerosis. Copyright © 2015 Elsevier

  9. Diet Activity Characteristic of Large-scale Sports Events Based on HACCP Management Model

    OpenAIRE

    Xiao-Feng Su; Li Guo; Li-Hua Gao; Chang-Zhuan Shao

    2015-01-01

    The study proposes dietary management for major sports events based on the HACCP management model, according to the characteristics of major sports events catering activities. Major sports events are not merely showcases of high-level competitive sport; they have become complex, comprehensive special events involving social, political, economic, cultural and other factors. Sporting events are conferred with ever more diverse goals and objectives of an economic, political, cultural, technological and other ...

  10. Calibration plots for risk prediction models in the presence of competing risks.

    Science.gov (United States)

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-08-15

    A predicted risk of 17% can be called reliable if it can be expected that the event will occur in about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks such as death due to other causes. For personalized medicine and patient counseling, it is necessary to check that the model is calibrated in the sense that it provides reliable predictions for all subjects. There are three often-encountered practical problems when the aim is to display or test whether a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with all three problems, we propose to estimate calibration curves for competing risks models based on jackknife pseudo-values that are combined with a nearest-neighborhood smoother and a cross-validation approach. Copyright © 2014 John Wiley & Sons, Ltd.
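
    A simplified version of such a calibration curve can be sketched by noting that with complete (uncensored) data the jackknife pseudo-values reduce to 0/1 event indicators, so the curve is just observed event frequency against predicted risk. Quantile binning below stands in for the nearest-neighborhood smoother; the data are synthetic and well calibrated by construction.

    ```python
    # Simplified calibration curve, ignoring censoring.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 5000
    pred = rng.uniform(0.01, 0.60, n)            # predicted absolute risks
    outcome = rng.random(n) < pred               # calibrated by design

    bins = np.quantile(pred, np.linspace(0, 1, 11))
    idx = np.digitize(pred, bins[1:-1])          # 10 quantile bins
    for b in range(10):
        m = idx == b
        print(f"predicted {pred[m].mean():.2f} -> "
              f"observed {outcome[m].mean():.2f}")
    ```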

  11. A Method to Quantify Plant Availability and Initiating Event Frequency Using a Large Event Tree, Small Fault Tree Model

    International Nuclear Information System (INIS)

    Kee, Ernest J.; Sun, Alice; Rodgers, Shawn; Popova, ElmiraV; Nelson, Paul; Moiseytseva, Vera; Wang, Eric

    2006-01-01

    South Texas Project uses a large fault tree to produce the scenarios (minimal cut sets) used in quantifying plant availability and event frequency predictions. On the other hand, the South Texas Project probabilistic risk assessment model uses a large event tree, small fault tree for quantifying core damage and radioactive release frequency predictions. The South Texas Project is converting its availability and event frequency model to use a large event tree, small fault tree in an effort to streamline application support and to provide additional detail in results. The availability and event frequency model, as well as the applications it supports (maintenance and operational risk management, system engineering health assessment, preventive maintenance optimization, and RIAM), are briefly described. A methodology to perform availability modeling in a large event tree, small fault tree framework is described in detail, as is how the methodology can be used to support South Texas Project maintenance and operations risk management. Differences from other fault tree methods and other recently proposed methods are discussed in detail. While the methods described are novel to the South Texas Project Risk Management program and to large event tree, small fault tree models, the concepts in the area of application support and availability modeling have wider applicability to the industry. (authors)
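
    The core arithmetic of event tree quantification is simple: a sequence frequency is the initiating event frequency times the branch probabilities along the sequence. The sketch below shows this with invented top events and split fractions (note how train B's failure probability is conditioned on train A's state, which is what the small supporting fault trees provide).

    ```python
    # Minimal large-event-tree-style sequence quantification.
    init_freq = 0.1                     # initiating events per year (invented)

    # Each sequence: list of (top event, branch probability used there)
    sequences = {
        "S1: both trains succeed": [("train A", 1 - 1e-2), ("train B", 1 - 5e-3)],
        "S2: A fails, B succeeds": [("train A", 1e-2), ("train B", 1 - 5e-2)],
        "S3: A fails, B fails":    [("train A", 1e-2), ("train B", 5e-2)],
    }
    for name, branches in sequences.items():
        f = init_freq
        for _, p in branches:
            f *= p                      # multiply split fractions down the tree
        print(f"{name}: {f:.3e} /yr")
    ```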

  12. Competing Risks and Multistate Models with R

    CERN Document Server

    Beyersmann, Jan; Schumacher, Martin

    2012-01-01

    This book covers competing risks and multistate models, sometimes summarized as event history analysis. These models generalize the analysis of time to a single event (survival analysis) to the analysis of the timing of distinct terminal events (competing risks) and possible intermediate events (multistate models). Both R and multistate methods are promoted, with a focus on nonparametric methods.

  13. Toward risk assessment 2.0: Safety supervisory control and model-based hazard monitoring for risk-informed safety interventions

    International Nuclear Information System (INIS)

    Favarò, Francesca M.; Saleh, Joseph H.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a staple in the engineering risk community, and it has become to some extent synonymous with the entire quantitative risk assessment undertaking. Limitations of PRA continue to occupy researchers, and workarounds are often proposed. After a brief review of this literature, we propose to address some of PRA's limitations by developing a novel framework and analytical tools for model-based system safety, or safety supervisory control, to guide safety interventions and support a dynamic approach to risk assessment and accident prevention. Our work shifts the emphasis from the pervading probabilistic mindset in risk assessment toward the notions of danger indices and hazard temporal contingency. The framework and tools here developed are grounded in Control Theory and make use of the state-space formalism in modeling dynamical systems. We show that the use of state variables enables the definition of metrics for accident escalation, termed hazard levels or danger indices, which measure the “proximity” of the system state to adverse events, and we illustrate the development of such indices. Monitoring of the hazard levels provides diagnostic information to support both on-line and off-line safety interventions. For example, we show how the application of the proposed tools to a rejected takeoff scenario provides new insight to support pilots’ go/no-go decisions. Furthermore, we augment the traditional state-space equations with a hazard equation and use the latter to estimate the times at which critical thresholds for the hazard level are (b)reached. This estimation process provides important prognostic information and produces a proxy for a time-to-accident metric or advance notice for an impending adverse event. The ability to estimate these two hazard coordinates, danger index and time-to-accident, offers many possibilities for informing system control strategies and improving accident prevention and risk mitigation
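
    A toy version of the monitoring loop described above can make the danger index and time-to-accident proxy concrete. The one-dimensional dynamics, thresholds and rates below are invented placeholders, not the paper's rejected-takeoff model.

    ```python
    # Toy safety-supervisory-control loop: integrate a scalar state,
    # define the danger index as x/x_crit, and extrapolate when the
    # hazard threshold would be (b)reached under current dynamics.
    dt, x_crit, threshold = 0.1, 100.0, 0.8
    x, v = 10.0, 4.0                     # state and its (constant) growth rate

    for step in range(1, 10_000):
        x += v * dt                      # state-space update: dx/dt = v
        t = step * dt
        danger = x / x_crit              # hazard level / danger index
        if danger >= threshold:
            print(f"t={t:.1f}: hazard threshold crossed, danger={danger:.2f}")
            break
        if step % 20 == 0:               # periodic prognostics
            tta = (threshold * x_crit - x) / v   # time-to-accident proxy
            print(f"t={t:.1f}: danger={danger:.2f}, time-to-threshold ~ {tta:.1f}")
    ```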

  14. Treatment-dependent and treatment-independent risk factors associated with the risk of diabetes-related events

    DEFF Research Database (Denmark)

    Wilke, Thomas; Mueller, Sabrina; Groth, Antje

    2015-01-01

    BACKGROUND: The aim of this study was to analyse which factors predict the real-world macro-/microvascular event, hospitalisation and death risk in patients with type 2 diabetes mellitus. Furthermore, we aimed to investigate whether there exists both an under- and an over-treatment risk for these patients. METHODS: We used a German claims/clinical data set covering the years 2010-12. Diabetes-related events were defined as (1) macrovascular and (2) microvascular events leading to inpatient hospitalisation, (3) other hospitalisations with type 2 diabetes mellitus as the main diagnosis, (4) all-cause death, and (5) a composite outcome including all event categories 1-4. Factors associated with event risk were analysed by a Kaplan-Meier curve analysis and by multivariable Cox regression models. RESULTS: 229,042 patients with type 2 diabetes mellitus (mean age 70.2 years; mean CCI 6.03) were included. Among factors

  15. Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling

    OpenAIRE

    Tramblay, Yves; Bouvier, Christophe; Martin, C.; Didon-Lescot, J. F.; Todorovik, D.; Domergue, J. M.

    2010-01-01

    Flash floods are the most destructive natural hazards that occur in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are very popular for operational purposes, but there is a need to reduce the uncertainties related to the estimation of initial moisture conditions prior to a flood event. This paper aims to compare several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture,...

  16. Risk analysis of urban gas pipeline network based on improved bow-tie model

    Science.gov (United States)

    Hao, M. J.; You, Q. J.; Yue, Z.

    2017-11-01

    Gas pipeline networks are a major hazard source in urban areas; in the event of an accident, there could be grave consequences. In order to understand more clearly the causes and consequences of gas pipeline network accidents, and to develop prevention and mitigation measures, the author puts forward the application of an improved bow-tie model to analyse the risks of urban gas pipeline networks. The improved bow-tie model analyses accident causes from four aspects (human, materials, environment and management) and consequences from four aspects (casualties, property loss, environment and society), and then quantifies the causes and consequences. Risk identification, risk analysis, risk assessment, risk control and risk management are clearly shown in the model figures, which can accordingly suggest prevention and mitigation measures to help reduce the accident rate of gas pipeline networks. The results show that the whole process of an accident can be visually investigated using the bow-tie model, which can also provide reasons for, and predict the consequences of, an unfortunate event. This is of great significance for analysing leakage failure of gas pipeline networks.

  17. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and the outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept in which modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed on a single time line, each event being added to a queue managed by a planner. Progression down the time line is guided by rules managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements, and the design is then derived from the requirements.
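
    The planner/scheduler pattern described above maps naturally onto a priority queue keyed by time. The sketch below is a dPRA-flavoured toy, not the IMM architecture: the event names, rates and the escalation rule are all invented for illustration.

    ```python
    # Events on a single time line in a min-heap (the "planner"); a rule
    # decides follow-on events (the "scheduler").
    import heapq, random

    random.seed(4)
    timeline = []                                # (time, event) min-heap
    heapq.heappush(timeline, (2.0, "renal stone onset"))
    heapq.heappush(timeline, (5.0, "scheduled EVA"))

    t_end, log = 30.0, []
    while timeline:
        t, ev = heapq.heappop(timeline)
        if t > t_end:
            break
        log.append((t, ev))
        # scheduler rule: an untreated condition may escalate later,
        # with an invented probability standing in for the model's statistics
        if ev == "renal stone onset" and random.random() < 0.3:
            heapq.heappush(timeline, (t + random.uniform(1, 5), "medical evac"))

    for t, ev in log:
        print(f"t={t:5.1f} d: {ev}")
    ```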

  18. Some implications of an event-based definition of exposure to the risk of road accident.

    Science.gov (United States)

    Elvik, Rune

    2015-03-01

    This paper proposes a new definition of exposure to the risk of road accident: any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space, or by requiring a road user to take action to avoid leaving the roadway. A typology of events representing a potential for an accident is proposed. Each event can be interpreted as a trial as defined in probability theory. Risk is the proportion of events that result in an accident. Defining exposure as events demanding the attention of road users implies that road users will learn from repeated exposure to these events, which in turn implies that there will normally be a negative relationship between exposure and risk. Four hypotheses regarding the relationship between exposure and risk are proposed. Preliminary tests support these hypotheses. Advantages and disadvantages of defining exposure as specific events are discussed. It is argued that developments in vehicle technology are likely to make events both observable and countable, thus ensuring that exposure is an operational concept. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and have disclosed several potentials for further exploitation within this important research field. In that connection, the CORAS methodology can be further improved towards utilization in more complex architectures and in other application domains such as the nuclear field. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  20. Integrated Monitoring and Modeling of Carbon Dioxide Leakage Risk Using Remote Sensing, Ground-Based Monitoring, Atmospheric Models and Risk-Indexing Tools

    Science.gov (United States)

    Burton, E. A.; Pickles, W. L.; Gouveia, F. J.; Bogen, K. T.; Rau, G. H.; Friedmann, J.

    2006-12-01

    estimating its associated risk, spatially and temporally. This requires integration of subsurface, surface and atmospheric data and models. To date, we have developed techniques to map risk based on predicted atmospheric plumes and GIS/MT (meteorologic-topographic) risk-indexing tools. This methodology was derived from the study of large CO2 releases from an abandoned well penetrating a natural CO2 reservoir at Crystal Geyser, Utah. This integrated approach will provide a powerful tool to screen for high-risk zones at proposed sequestration sites, to design and optimize surface networks for site monitoring, and/or to guide the setting of science-based regulatory compliance requirements for monitoring sequestration sites, as well as to target critical areas for first responders should a catastrophic-release event occur. This work was performed under the auspices of the U.S. Dept. of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.

  1. A risk-based multi-objective model for optimal placement of sensors in water distribution system

    Science.gov (United States)

    Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein

    2018-02-01

    In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for the optimal placement of sensors in a water distribution system (WDS). The model minimizes the risk caused by simultaneous multi-point contamination injection in the WDS using the CVaR approach. CVaR treats the uncertainties of contamination injection as a probability distribution and captures low-probability extreme events, i.e., the losses in the tail of the loss distribution. A four-objective optimization model based on the NSGA-II algorithm is developed to minimize the losses of contamination injection (through the CVaR of affected population and detection time) and also to minimize the two other main criteria of optimal sensor placement, namely the probability of undetected events and cost. Finally, to determine the best solution, the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of the Multi Criteria Decision Making (MCDM) approach, is utilized to rank the alternatives on the trade-off curve among objective functions. A sensitivity analysis is also performed to investigate the importance of each criterion on the PROMETHEE results under three relative weighting scenarios. The effectiveness of the proposed methodology is examined by applying it to the Lamerd WDS in the southwestern part of Iran. PROMETHEE suggests 6 sensors, suitably distributed to cover approximately all regions of the WDS. Optimal values of the CVaR of affected population and detection time, as well as the probability of undetected events, for the best solution are equal to 17,055 persons, 31 min and 0.045%, respectively. The obtained results of the proposed methodology in the Lamerd WDS show the applicability of the CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection in order to evaluate extreme value
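
    CVaR itself reduces to a simple computation once losses are sampled: the mean of the losses at or beyond the Value-at-Risk quantile. A minimal sketch, assuming a lognormal loss sample in place of the paper's simulated affected-population losses:

```python
# Conditional Value at Risk (CVaR) of a sampled loss distribution:
# the average loss in the worst (1 - alpha) tail.
import numpy as np

def cvar(losses, alpha=0.95):
    """Return (VaR, CVaR): the alpha-quantile and the tail mean above it."""
    losses = np.sort(np.asarray(losses))
    var = np.quantile(losses, alpha)
    tail = losses[losses >= var]
    return var, tail.mean()

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=8.0, sigma=1.0, size=10_000)  # hypothetical losses
var95, cvar95 = cvar(losses, alpha=0.95)
print(f"VaR(95%) = {var95:.0f} persons, CVaR(95%) = {cvar95:.0f} persons")
```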

  2. Application of declarative modeling approaches for external events

    International Nuclear Information System (INIS)

    Anoba, R.C.

    2005-01-01

    Probabilistic Safety Assessments (PSAs) are increasingly being used as a tool for supporting the acceptability of design, procurement, construction, operation, and maintenance activities at nuclear power plants. Since the issuance of Generic Letter 88-20 and the subsequent IPE/IPEEE assessments, the NRC has issued several Regulatory Guides, such as RG 1.174, to describe the use of PSA in risk-informed regulation activities. Most PSAs have the capability to address internal events, including internal floods. As more demands are placed on the PSA to support risk-informed applications, there has been a growing need to integrate other external events (seismic, fire, etc.) into the logic models. Most external events involve spatial dependencies and usually impact the logic models at the component level. Therefore, manual insertion of external event impacts into a complex integrated fault tree model may be too cumbersome for routine uses of the PSA. Within the past year, a declarative modeling approach has been developed to automate the injection of external events into the PSA. The intent of this paper is to introduce the concept of declarative modeling in the context of external event applications. A declarative modeling approach involves the definition of rules for the injection of external event impacts into the fault tree logic. A software tool such as EPRI's XInit program can be used to interpret the pre-defined rules and automatically inject external event elements into the PSA. The injection process can easily be repeated, as required, to address plant changes, sensitivity issues, changes in boundary conditions, etc. External event elements may include fire initiating events, seismic initiating events, seismic fragilities, fire-induced hot short events, special human failure events, etc. This approach has been applied at a number of nuclear power plants in the US as well as at a nuclear power plant in Romania. (authors)
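
    A schematic of the rule-based injection idea, not the XInit syntax: each declarative rule pairs a gate-name pattern with an external basic event, and every matching gate receives that event as an extra input. Gate and event names below are hypothetical.

```python
# Declarative injection of external-event impacts into fault tree logic.
fault_tree = {
    "PUMP_A_FAILS": ["PUMP_A_MECH", "PUMP_A_POWER"],   # OR-gate inputs
    "PUMP_B_FAILS": ["PUMP_B_MECH", "PUMP_B_POWER"],
}

# Rules: (gate-name pattern, external basic event to inject)
rules = [
    ("PUMP_A", "SEISMIC_ANCHOR_FAILURE_ROOM_12"),  # spatial dependency
    ("PUMP_",  "FIRE_INDUCED_HOT_SHORT_FZ3"),
]

def inject(tree, rules):
    """Apply each rule to every gate whose name matches the pattern."""
    for gate, inputs in tree.items():
        for pattern, basic_event in rules:
            if pattern in gate and basic_event not in inputs:
                inputs.append(basic_event)
    return tree

# The injection is repeatable: re-running with updated rules re-derives
# the external-event model after plant or boundary-condition changes.
for gate, inputs in inject(fault_tree, rules).items():
    print(gate, "->", inputs)
```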

  3. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    Science.gov (United States)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotics and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.

  4. Hygrothermal modelling of flooding events within historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Schellen, H.L.; Schijndel, van A.W.M.; Blades, N.

    2014-01-01

    Flooding events pose a high risk to valuable monumental buildings and their interiors. Due to higher river discharges and sea level rise, flooding events may occur more often in future. Hygrothermal building simulation models can be applied to investigate the impact of a flooding event on the

  5. Hygrothermal modelling of flooding events within historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Schijndel, van A.W.M.; Schellen, H.L.; Blades, N.; Mahdavi, A.; Mertens, B.

    2013-01-01

    Flooding events pose a high risk to valuable monumental buildings and their interiors. Due to higher river discharges and sea level rise, flooding events may occur more often in future. Hygrothermal building simulation models can be applied to investigate the impact of a flooding event on the

  6. Risk Based Milk Pricing Model at Dairy Farmers Level

    Directory of Open Access Journals (Sweden)

    W. Septiani

    2017-12-01

    Full Text Available The milk price paid by a cooperative institution to farmers does not fully cover the production cost, yet dairy farmers encounter various risks and uncertainties in conducting their business. The highest risk in the milk supply chain lies in the activities at the farm. This study was designed to formulate a model for calculating the milk price at the farmer's level based on risk. Risks that occur on farms include the risks of cow breeding, sanitation, health care, cattle feed management, milking and milk sales. The research was carried out on farms in the West Java region. There were five main stages in the preparation of this model: (1) identification and analysis of influential factors, (2) development of a conceptual model, (3) structural analysis and computation of production costs, (4) calculation of production costs with risk factors, and (5) the risk-based milk pricing model. This research established the relationship between risks on smallholder dairy farms and the production costs to be incurred by the farmers, and derived a formulation of a risk adjustment factor for the variable production costs of a dairy cattle farm. The difference between production costs with risk and total production costs without risk was about 8% to 10%. It could be concluded that the basic milk price proposed based on this research is around IDR 4,250-IDR 4,350/L for ownership of 3 to 4 cows. Including this risk value in the calculation of production costs is expected to increase farmer income.

  7. Development and validation of multivariable predictive model for thromboembolic events in lymphoma patients.

    Science.gov (United States)

    Antic, Darko; Milic, Natasa; Nikolovski, Srdjan; Todorovic, Milena; Bila, Jelena; Djurdjevic, Predrag; Andjelic, Bosko; Djurasinovic, Vladislava; Sretenovic, Aleksandra; Vukovic, Vojin; Jelicic, Jelena; Hayman, Suzanne; Mihaljevic, Biljana

    2016-10-01

    Lymphoma patients are at increased risk of thromboembolic events, but thromboprophylaxis in these patients is largely underused. We sought to develop and validate a simple model, based on individual clinical and laboratory patient characteristics, that would designate lymphoma patients at risk for a thromboembolic event. The study population included 1,820 lymphoma patients who were treated in the Lymphoma Departments at the Clinics of Hematology, Clinical Center of Serbia and Clinical Center Kragujevac. The model was developed using data from a derivation cohort (n = 1,236), and further assessed in the validation cohort (n = 584). Sixty-five patients (5.3%) in the derivation cohort and 34 (5.8%) patients in the validation cohort developed thromboembolic events. The variables independently associated with risk for thromboembolism were: previous venous and/or arterial events, mediastinal involvement, BMI >30 kg/m(2), reduced mobility, extranodal localization, development of neutropenia and hemoglobin level <100 g/L (high risk: score >3). For patients classified at risk (intermediate and high-risk scores), the model produced a negative predictive value of 98.5%, a positive predictive value of 25.1%, a sensitivity of 75.4%, and a specificity of 87.5%. A high-risk score had a positive predictive value of 65.2%. The diagnostic performance measures retained similar values in the validation cohort. The developed prognostic Thrombosis Lymphoma - ThroLy score is more specific for lymphoma patients than any other available score targeting thrombosis in cancer patients. Am. J. Hematol. 91:1014-1019, 2016. © 2016 Wiley Periodicals, Inc.
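
    The additive structure of such a score can be sketched as follows; the point weights and cut-offs here are illustrative placeholders, not the published ThroLy weights.

```python
# Sketch of a ThroLy-style additive risk score: binary clinical factors
# contribute points, and the total classifies patients into risk groups.
WEIGHTS = {  # illustrative point weights only
    "previous_thromboembolic_event": 2,
    "mediastinal_involvement": 1,
    "bmi_over_30": 2,
    "reduced_mobility": 1,
    "extranodal_localization": 1,
    "neutropenia": 1,
    "low_hemoglobin": 1,
}

def throly_like_score(patient: dict) -> tuple[int, str]:
    """Sum the weights of the factors present and map to a risk group."""
    score = sum(w for factor, w in WEIGHTS.items() if patient.get(factor))
    group = "high" if score > 3 else "intermediate" if score >= 2 else "low"
    return score, group

patient = {"previous_thromboembolic_event": True, "bmi_over_30": True}
print(throly_like_score(patient))  # -> (4, 'high')
```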

  8. Symptom-Hemodynamic Mismatch and Heart Failure Event Risk

    Science.gov (United States)

    Lee, Christopher S.; Hiatt, Shirin O.; Denfeld, Quin E.; Mudd, James O.; Chien, Christopher; Gelow, Jill M.

    2014-01-01

    Background Heart failure (HF) is a heterogeneous condition of both symptoms and hemodynamics. Objective The goal of this study was to identify distinct profiles among integrated data on physical and psychological symptoms and hemodynamics, and to quantify differences in 180-day event risk among the observed profiles. Methods A secondary analysis of data collected during two prospective cohort studies by a single group of investigators was performed. Latent class mixture modeling was used to identify distinct symptom-hemodynamic profiles. Cox proportional hazards modeling was used to quantify differences in event risk (HF emergency visit, hospitalization or death) among profiles. Results The mean age (n=291) was 57±13 years, 38% were female, and 61% had class III/IV HF. Three distinct symptom-hemodynamic profiles were identified. 17.9% of patients had concordant symptoms and hemodynamics (i.e. moderate physical and psychological symptoms matched a comparably moderate hemodynamic profile), 17.9% had severe symptoms and average hemodynamics, and 64.2% had poor hemodynamics and mild symptoms. Compared to those in the concordant profile, both profiles of symptom-hemodynamic mismatch were associated with a markedly increased event risk (severe symptoms hazard ratio = 3.38, p=0.033; poor hemodynamics hazard ratio = 3.48, p=0.016). Conclusions A minority of adults with HF have concordant symptoms and hemodynamics. Either profile of symptom-hemodynamic mismatch in HF is associated with a greater risk of healthcare utilization for HF or death. PMID:24988323

  9. Potential impact of single-risk-factor versus total risk management for the prevention of cardiovascular events in Seychelles.

    Science.gov (United States)

    Ndindjock, Roger; Gedeon, Jude; Mendis, Shanthi; Paccaud, Fred; Bovet, Pascal

    2011-04-01

    To assess the prevalence of cardiovascular (CV) risk factors in Seychelles, a middle-income African country, and compare the cost-effectiveness of single-risk-factor management (treating individuals with arterial blood pressure ≥ 140/90 mmHg and/or total serum cholesterol ≥ 6.2 mmol/l) with that of management based on total CV risk (treating individuals with a total CV risk ≥ 10% or ≥ 20%). CV risk factor prevalence and a CV risk prediction chart for Africa were used to estimate the 10-year risk of suffering a fatal or non-fatal CV event among individuals aged 40-64 years. These figures were used to compare single-risk-factor management with total risk management in terms of the number of people requiring treatment to avert one CV event and the number of events potentially averted over 10 years. Treatment for patients with high total CV risk (≥ 20%) was assumed to consist of a fixed-dose combination of several drugs (polypill). Cost analyses were limited to medication. A total CV risk of ≥ 10% and ≥ 20% was found among 10.8% and 5.1% of individuals, respectively. With single-risk-factor management, 60% of adults would need to be treated and 157 cardiovascular events per 100000 population would be averted per year, as opposed to 5% of adults and 92 events with total CV risk management. Management based on high total CV risk optimizes the balance between the number requiring treatment and the number of CV events averted. Total CV risk management is much more cost-effective than single-risk-factor management. These findings are relevant for all countries, but especially for those economically and demographically similar to Seychelles.
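
    The trade-off the authors describe can be made concrete with a back-of-the-envelope calculation of people treated per event averted, using the figures quoted in the abstract (per 100,000 adults, per year):

```python
# Treated-per-event-averted comparison for the two strategies in the
# abstract. The 100,000 population is a normalization assumption.
POP = 100_000

strategies = {
    # (fraction of adults treated, events averted per 100,000 per year)
    "single-risk-factor": (0.60, 157),
    "total-risk (>=20%)": (0.05, 92),
}

for name, (treated_fraction, averted) in strategies.items():
    treated = treated_fraction * POP
    print(f"{name}: {treated / averted:,.0f} treated per event averted")
# single-risk-factor: ~382 treated per event averted
# total-risk (>=20%): ~54 treated per event averted
```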

  10. Declarative event based models of concurrency and refinement in psi-calculi

    DEFF Research Database (Denmark)

    Normann, Håkon; Johansen, Christian; Hildebrandt, Thomas

    2015-01-01

    Psi-calculi constitute a parametric framework for nominal process calculi, where constraint-based process calculi and process calculi for mobility can be defined as instances. We apply here the framework of psi-calculi to provide a foundation for the exploration of declarative event-based process calculi with support for run-time refinement. We first provide a representation of the model of finite prime event structures as an instance of psi-calculi and prove that the representation respects the semantics up to concurrency diamonds and action refinement. We then proceed to give a psi-calculi representation of Dynamic Condition Response Graphs, which conservatively extends prime event structures to allow finite representations of ω-regular finite (and infinite) behaviours and have been shown to support run-time adaptation and refinement. We end by outlining the final aim of this research, which

  11. Event Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2001-01-01

    The purpose of this chapter is to discuss conceptual event modeling within a context of information modeling. Traditionally, information modeling has been concerned with the modeling of a universe of discourse in terms of information structures. However, most interesting universes of discourse are dynamic, and we present a modeling approach that can be used to model such dynamics. We characterize events as both information objects and change agents (Bækgaard 1997). When viewed as information objects, events are phenomena that can be observed and described. For example, borrow events in a library can

  12. Framework for event-based semidistributed modeling that unifies the SCS-CN method, VIC, PDM, and TOPMODEL

    Science.gov (United States)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-09-01

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and are missing ready-to-use analytical expressions that are analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions called VICx, PDMx, and TOPMODELx also are extended with a spatial description of the runoff concept of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (fraction of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths that are cumulative values for the storm duration, and the average unit area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
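
    For reference, the classical SCS-CN runoff curve that these event-based models generalize can be written in a few lines; the curve number below is an arbitrary example value.

```python
# SCS-CN event runoff: Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else 0,
# with potential retention S = 25400/CN - 254 (mm) and the conventional
# initial abstraction Ia = 0.2 * S.

def scs_cn_runoff(p_mm: float, cn: float) -> float:
    """Event runoff depth (mm) for storm rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0   # potential retention (mm)
    ia = 0.2 * s               # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

for p in (20, 50, 100):  # storm totals in mm, CN=75 as an example watershed
    print(f"P={p} mm -> Q={scs_cn_runoff(p, cn=75):.1f} mm")
```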

  13. Dynamic occupational risk model for offshore operations in harsh environments

    International Nuclear Information System (INIS)

    Song, Guozheng; Khan, Faisal; Wang, Hangzhou; Leighton, Shelly; Yuan, Zhi; Liu, Hanwen

    2016-01-01

    The expansion of offshore oil exploitation into remote areas (e.g., the Arctic) with harsh environments has significantly increased occupational risks. Among occupational accidents, slips, trips and falls from height (STFs) account for a significant portion. Thus, a dynamic risk assessment of these three main occupational accidents is meaningful for decreasing offshore occupational risks. Bow-tie models (BTs) were established in this study for the risk analysis of STFs considering extreme environmental factors. To relax the limitations of BTs, Bayesian networks (BNs) were developed based on the BTs to dynamically assess the risks of STFs. The occurrence and consequence probabilities of STFs were calculated using BTs and BNs, respectively, and the obtained probabilities verified the BNs' rationality and advantage. Furthermore, probability adaptation for STFs was accomplished in a specific scenario with BNs. Finally, posterior probabilities of basic events were obtained through diagnostic analysis, and critical basic events were analyzed based on their posterior likelihood of causing occupational accidents. The highlights of this study are the systematic analysis of STF accidents in offshore operations and the dynamic assessment of their risks considering harsh environmental factors. This study can guide the allocation of prevention resources and benefit the safety management of offshore operations. - Highlights: • A novel dynamic risk model for occupational accidents. • First time consideration of harsh environment in occupational accident modeling. • A Bayesian network based model for risk management strategies.
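
    The dynamic, diagnostic use of a BN can be illustrated with a two-parent toy network evaluated by direct enumeration; all conditional probabilities are hypothetical, chosen only to show the forward (predictive) and backward (diagnostic) calculations.

```python
# Toy Bayesian-network-style slip model: icy deck and lighting influence
# slip probability; Bayes' rule gives the posterior for an environmental
# cause given that a slip occurred.
P_ICE = 0.3        # P(icy deck) in a harsh-environment scenario
P_LOWLIGHT = 0.4   # P(poor lighting)

def p_slip(ice: bool, lowlight: bool) -> float:
    """CPT: P(slip | ice, lighting), illustrative numbers."""
    return {(True, True): 0.20, (True, False): 0.10,
            (False, True): 0.04, (False, False): 0.01}[(ice, lowlight)]

# Forward pass: marginal probability of a slip (sum over parent states).
marginal = sum(
    p_slip(i, l)
    * (P_ICE if i else 1 - P_ICE)
    * (P_LOWLIGHT if l else 1 - P_LOWLIGHT)
    for i in (True, False) for l in (True, False)
)

# Diagnostic analysis: posterior P(icy deck | slip) via Bayes' rule.
p_ice_and_slip = sum(
    p_slip(True, l) * P_ICE * (P_LOWLIGHT if l else 1 - P_LOWLIGHT)
    for l in (True, False)
)
print(f"P(slip) = {marginal:.4f}")
print(f"P(icy deck | slip) = {p_ice_and_slip / marginal:.3f}")
```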

  14. Automatic identification of web-based risk markers for health events

    DEFF Research Database (Denmark)

    Yom-Tov, Elad; Borsa, Diana; Hayward, Andrew C.

    2015-01-01

    but these are often limited in size and cost and can fail to take full account of diseases where there are social stigmas or to identify transient acute risk factors. Objective: Here we report that Web search engine queries coupled with information on Wikipedia access patterns can be used to infer health events...

  15. Applications of modelling historical catastrophic events with implications for catastrophe risk management

    Science.gov (United States)

    Sorby, A.; Grossi, P.; Pomonis, A.; Williams, C.; Nyst, M.; Onur, T.; Seneviratna, P.; Baca, A.

    2009-04-01

    The management of catastrophe risk is concerned with the quantification of financial losses, and their associated probabilities, for potential future catastrophes that might impact a region. Modelling of historical catastrophe events and, in particular, of the potential consequences if a similar event were to occur at the present day can provide insight to help bridge the gap between what we know can happen from historical experience and what potential losses might be out there in the "universe" of potential catastrophes. The 1908 Messina Earthquake (and accompanying local tsunami) was one of the most destructive earthquakes to have occurred in Europe and by most accounts remains Europe's deadliest, with over 70,000 casualties estimated. But what would the consequences be, in terms of financial and human losses, if a similar earthquake were to occur at the present day? Exposures, building stock and populations change over time, and the consequences of a similar earthquake at the present day may therefore differ appreciably from those observed in 1908. The city of Messina has been reconstructed several times in its history, including following the 1908 earthquake and again following the Second World War. The 1908 earthquake prompted the introduction of the first seismic design regulations in Italy, and since 1909 parts of the Messina and Calabria regions have been in the zones of highest seismic coefficient. Utilizing commercial catastrophe loss modelling technology - which combines the modelling of hazard, vulnerability, and financial losses on a database of property exposures - a modelled earthquake scenario of M7.2 in the Messina Straits region of Southern Italy is considered. This modelled earthquake is used to assess the potential consequences in terms of financial losses that an earthquake similar to the 1908 earthquake might have if it were to occur at the present day. Loss results are discussed in the context of applications for the financial

  16. Risk of affective disorders following prenatal exposure to severe life events: a Danish population-based cohort study.

    LENUS (Irish Health Repository)

    Khashan, Ali S

    2012-01-31

    OBJECTIVE: To examine the effect of prenatal exposure to severe life events on risk of affective disorders in the offspring. METHODS: In a cohort of 1.1 million Danish births from May 1978 until December 1997, mothers were considered exposed if one (or more) of their close relatives died or was diagnosed with serious illness up to 6 months before conception or during pregnancy. Offspring were followed up from their 10th birthday until their death, migration, onset of affective disorder or 31 December 2007; hospital admissions were identified by linkage to the Central Psychiatric Register. Log-linear Poisson regression was used for data analysis. RESULTS: The risk of affective disorders was increased in male offspring whose mothers were exposed to severe life events during the second trimester (adjusted RR 1.55 [95% CI 1.05-2.28]). There was an increased risk of male offspring affective disorders in relation to maternal exposure to death of a relative in the second trimester (adjusted RR 1.74 [95% CI 1.06-2.84]) or serious illness in a relative before pregnancy (adjusted RR 1.44 [95% CI 1.02-2.05]). There was no evidence for an association between prenatal exposure to severe life events and risk of female offspring affective disorders. CONCLUSIONS: Our population-based study suggests that prenatal maternal exposure to severe life events may increase the risk of affective disorders in male offspring. These findings are consistent with studies of populations exposed to famine and earthquake disasters which indicate that prenatal environment may influence the neurodevelopment of the unborn child.

  17. A methodology for the quantitative risk assessment of major accidents triggered by seismic events

    International Nuclear Information System (INIS)

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-01-01

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify all the possible scenarios that may follow a seismic event, to evaluate their credibility and finally to assess their expected consequences. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields, with a limited effort, a preliminary quantitative assessment of the contribution of the scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies evidenced that the scenarios initiated by seismic events may have a relevant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios simultaneously involving several plant units.
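
    The fragility-curve step is commonly modelled as a lognormal CDF in peak ground acceleration; a sketch with illustrative median capacity and dispersion values (not the paper's data):

```python
# Lognormal fragility curve, the standard equipment-dependent damage
# model in seismic QRA: P(damage | PGA) = Phi(ln(pga / median) / beta).
from math import erf, log, sqrt

def fragility(pga_g: float, median_g: float = 0.6, beta: float = 0.5) -> float:
    """Damage probability of an equipment item at peak ground acceleration pga_g."""
    z = log(pga_g / median_g) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF

for pga in (0.1, 0.3, 0.6, 1.0):
    print(f"PGA={pga:.1f} g -> P(damage)={fragility(pga):.3f}")
```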

  18. Immunological and cardiometabolic risk factors in the prediction of type 2 diabetes and coronary events: MONICA/KORA Augsburg case-cohort study.

    Directory of Open Access Journals (Sweden)

    Christian Herder

    Full Text Available BACKGROUND: This study compares inflammation-related biomarkers with established cardiometabolic risk factors in the prediction of incident type 2 diabetes and incident coronary events in a prospective case-cohort study within the population-based MONICA/KORA Augsburg cohort. METHODS AND FINDINGS: Analyses for type 2 diabetes are based on 436 individuals with and 1410 individuals without incident diabetes. Analyses for coronary events are based on 314 individuals with and 1659 individuals without incident coronary events. Mean follow-up times were almost 11 years. Areas under the receiver-operating characteristic curve (AUC), changes in Akaike's information criterion (ΔAIC), integrated discrimination improvement (IDI) and net reclassification index (NRI) were calculated for different models. A basic model consisting of age, sex and survey predicted type 2 diabetes with an AUC of 0.690. Addition of 13 inflammation-related biomarkers (CRP, IL-6, IL-18, MIF, MCP-1/CCL2, IL-8/CXCL8, IP-10/CXCL10, adiponectin, leptin, RANTES/CCL5, TGF-β1, sE-selectin, sICAM-1; all measured in nonfasting serum) increased the AUC to 0.801, whereas addition of cardiometabolic risk factors (BMI, systolic blood pressure, ratio total/HDL-cholesterol, smoking, alcohol, physical activity, parental diabetes) increased the AUC to 0.803 (ΔAUC [95% CI] 0.111 [0.092-0.149] and 0.113 [0.093-0.149], respectively, compared to the basic model). The combination of all inflammation-related biomarkers and cardiometabolic risk factors yielded a further increase in AUC to 0.847 (ΔAUC [95% CI] 0.044 [0.028-0.066] compared to the cardiometabolic risk model). Corresponding AUCs for incident coronary events were 0.807, 0.825 (ΔAUC [95% CI] 0.018 [0.013-0.038] compared to the basic model), 0.845 (ΔAUC [95% CI] 0.038 [0.028-0.059] compared to the basic model) and 0.851 (ΔAUC [95% CI] 0.006 [0.003-0.021] compared to the cardiometabolic risk model), respectively. CONCLUSIONS: Inclusion of multiple
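
    The AUC comparison at the heart of the analysis can be sketched with simulated data; the covariates, coefficients and sample below are placeholders, not MONICA/KORA data.

```python
# Compare the discrimination of a basic and an extended prediction model
# via the area under the ROC curve.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 2000
age = rng.normal(50, 10, n)
biomarker = rng.normal(0, 1, n)                  # e.g. an inflammation marker
logit = -6 + 0.08 * age + 0.8 * biomarker
y = rng.random(n) < 1 / (1 + np.exp(-logit))     # simulated incident disease

basic = LogisticRegression().fit(age.reshape(-1, 1), y)
extended = LogisticRegression().fit(np.column_stack([age, biomarker]), y)

auc_basic = roc_auc_score(y, basic.predict_proba(age.reshape(-1, 1))[:, 1])
auc_ext = roc_auc_score(
    y, extended.predict_proba(np.column_stack([age, biomarker]))[:, 1])
print(f"basic AUC={auc_basic:.3f}, extended AUC={auc_ext:.3f}, "
      f"dAUC={auc_ext - auc_basic:.3f}")
```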

  19. Methodological issues in cardiovascular epidemiology: the risk of determining absolute risk through statistical models

    Directory of Open Access Journals (Sweden)

    Demosthenes B Panagiotakos

    2006-09-01

    Full Text Available Demosthenes B Panagiotakos, Vassilis Stavrinos, Office of Biostatistics, Epidemiology, Department of Dietetics, Nutrition, Harokopio University, Athens, Greece. Abstract: During the past years there has been increasing interest in the development of cardiovascular disease functions that predict future events at the individual level. However, this effort has so far not been very successful, since several investigators have reported large differences in the estimation of the absolute risk among different populations. For example, it seems that predictive models that have been derived from US or north European populations overestimate the incidence of cardiovascular events in south European and Japanese populations. A potential explanation could be attributed to several factors, such as geographical, cultural, social, behavioral, as well as genetic variations between the investigated populations, in addition to various methodological and statistical issues relating to the estimation of these predictive models. Based on the current literature it can be concluded that, while risk prediction of future cardiovascular events is a useful tool and might be valuable in controlling the burden of the disease in a population, further work is required to improve the accuracy of the present predictive models. Keywords: cardiovascular disease, risk, models

  20. Mind the gap: modelling event-based and millennial-scale landscape dynamics

    NARCIS (Netherlands)

    Baartman, J.E.M.

    2012-01-01

    This research looks at landscape dynamics – erosion and deposition – from two different perspectives: long-term landscape evolution over millennial timescales on the one hand, and short-term event-based erosion and deposition on the other. For the first, landscape evolution models (LEMs) are

  1. Risk-based and deterministic regulation

    International Nuclear Information System (INIS)

    Fischer, L.E.; Brown, N.W.

    1995-07-01

    Both risk-based and deterministic methods are used for regulating the nuclear industry to protect public safety and health from undue risk. The deterministic method is one where performance standards are specified for each kind of nuclear system or facility. The deterministic performance standards address normal operations and design basis events, which include transient and accident conditions. The risk-based method uses probabilistic risk assessment methods to supplement the deterministic one by (1) addressing all possible events (including those beyond the design basis events), (2) using a systematic, logical process for identifying and evaluating accidents, and (3) considering alternative means to reduce accident frequency and/or consequences. Although both deterministic and risk-based methods have been successfully applied, there is a need for a better understanding of their applications and supportive roles. This paper describes the relationship between the two methods and how they are used to develop and assess regulations in the nuclear industry. Preliminary guidance is suggested for determining the need for using risk-based methods to supplement deterministic ones. However, it is recommended that more detailed guidance and criteria be developed for this purpose.

  2. LIFETIME LUNG CANCER RISKS ASSOCIATED WITH INDOOR RADON EXPOSURE BASED ON VARIOUS RADON RISK MODELS FOR CANADIAN POPULATION.

    Science.gov (United States)

    Chen, Jing

    2017-04-01

    This study calculates and compares the lifetime lung cancer risks associated with indoor radon exposure based on well-known risk models in the literature; two risk models are from joint studies among miners and the other three models were developed from pooling studies on residential radon exposure from China, Europe and North America respectively. The aim of this article is to make clear that the various models are mathematical descriptions of epidemiologically observed real risks in different environmental settings. The risk from exposure to indoor radon is real and it is normal that variations could exist among different risk models even when they were applied to the same dataset. The results show that lifetime risk estimates vary significantly between the various risk models considered here: the model based on the European residential data provides the lowest risk estimates, while models based on the European miners and Chinese residential pooling with complete dosimetry give the highest values. The lifetime risk estimates based on the EPA/BEIR-VI model lie within this range and agree reasonably well with the averages of risk estimates from the five risk models considered in this study. © Crown copyright 2016.

  3. Work stress and the risk of recurrent coronary heart disease events: A systematic review and meta-analysis.

    Science.gov (United States)

    Li, Jian; Zhang, Min; Loerbroks, Adrian; Angerer, Peter; Siegrist, Johannes

    2015-01-01

    Though much evidence indicates that work stress increases the risk of incident coronary heart disease (CHD), little is known about the role of work stress in the development of recurrent CHD events. The objective of this study was to review and synthesize the existing epidemiological evidence on whether work stress increases the risk of recurrent CHD events in patients with a first CHD event. A systematic literature search in the PubMed database (January 1990 - December 2013) for prospective studies was performed. Inclusion criteria were: peer-reviewed English papers with original data, studies with substantial follow-up (>3 years), end points defined as cardiac death or nonfatal myocardial infarction, and work stress assessed with reliable and valid instruments. Meta-analysis using random-effects modeling was conducted in order to synthesize the observed effects across the studies. Five papers derived from 4 prospective studies conducted in Sweden and Canada were included in this systematic review. The measurement of work stress was based on the Demand-Control model (4 papers) or the Effort-Reward Imbalance model (1 paper). According to the meta-analytic estimate based on 4 papers, a significant effect of work stress on the risk of recurrent CHD events (hazard ratio: 1.65, 95% confidence interval: 1.23-2.22) was observed. Our findings suggest that, in patients with a first CHD event, work stress is associated with a 65% increase in the relative risk of recurrent CHD events. Given the limited literature, more well-designed prospective research is needed to examine this association, in particular from regions of the world other than the West. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
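
    Random-effects pooling of log hazard ratios follows the DerSimonian-Laird estimator; the sketch below uses four invented study estimates, not the reviewed papers' data, to show the computation.

```python
# DerSimonian-Laird random-effects pooling of log hazard ratios.
import math

# (hazard ratio, 95% CI lower, upper) per study -- illustrative values
studies = [(1.8, 1.1, 2.9), (1.4, 0.9, 2.2), (2.0, 1.2, 3.3), (1.3, 0.8, 2.1)]

y = [math.log(hr) for hr, lo, hi in studies]                       # log-HR
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
w = [1 / s**2 for s in se]                                         # FE weights

# Between-study variance tau^2 (moment estimator)
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

w_re = [1 / (s**2 + tau2) for s in se]                             # RE weights
pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))
print(f"pooled HR = {math.exp(pooled):.2f} "
      f"[{math.exp(pooled - 1.96 * se_pooled):.2f}, "
      f"{math.exp(pooled + 1.96 * se_pooled):.2f}]")
```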

  4. Flood modelling with a distributed event-based parsimonious rainfall-runoff model: case of the karstic Lez river catchment

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2012-04-01

    Full Text Available Rainfall-runoff models are crucial tools for the statistical prediction of flash floods and real-time forecasting. This paper focuses on a karstic basin in the South of France and proposes a distributed parsimonious event-based rainfall-runoff model, coherent with the poor knowledge of both evaporative and underground fluxes. The model combines an SCS runoff model and a Lag and Route routing model for each cell of a regular grid mesh. The efficiency of the model is discussed not only in terms of satisfactorily simulating floods but also of obtaining powerful relationships between the initial condition of the model and various predictors of the initial wetness state of the basin, such as the base flow, the Hu2 index from the Meteo-France SIM model and the piezometric levels of the aquifer. The advantage of using meteorological radar rainfall in flood modelling is also assessed. Model calibration proved to be satisfactory using an hourly time step, with Nash criterion values ranging between 0.66 and 0.94 for eighteen of the twenty-one selected events. The radar rainfall inputs significantly improved the simulations or the assessment of the initial condition of the model for 5 events at the beginning of autumn, mostly in September-October (mean improvement of Nash is 0.09; correction in the initial condition ranges from −205 to 124 mm), but were less efficient for the events at the end of autumn. In this period, the weak vertical extension of the precipitation system and the low altitude of the 0 °C isotherm could affect the efficiency of radar measurements due to the distance between the basin and the radar (~60 km). The model initial condition S is correlated with the three tested predictors (R2 > 0.6). The interpretation of the model suggests that groundwater does not affect the first peaks of the flood, but can strongly impact subsequent peaks in the case of a multi-storm event. Because this kind of model is based on a limited

  5. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
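
    The decision rule reduces to comparing a likelihood ratio against a threshold derived from priors and decision costs. A minimal sketch under Gaussian measurement error, with all costs, priors and data invented for illustration:

```python
# Bayes-risk decision rule: accept the model when the likelihood ratio
# P(data | valid) / P(data | biased) exceeds a cost/prior threshold.
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

pred, sigma, bias = 10.0, 1.0, 2.0  # H0: data ~ N(pred, sigma); H1: shifted
data = [10.4, 9.7, 11.1]            # hypothetical experimental observations

lr = 1.0
for x in data:
    lr *= normal_pdf(x, pred, sigma) / normal_pdf(x, pred + bias, sigma)

p_valid = 0.5          # prior P(H0: model valid)
c_accept_bad = 5.0     # cost of accepting a biased model
c_reject_good = 1.0    # cost of rejecting a valid model
threshold = (c_accept_bad / c_reject_good) * ((1 - p_valid) / p_valid)

print(f"LR = {lr:.1f}, threshold = {threshold:.1f} ->",
      "accept model" if lr > threshold else "reject model")
```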

  6. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify the commonalities shared by all branches while describing their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher-education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.

  7. IT Operational Risk Measurement Model Based on Internal Loss Data of Banks

    Science.gov (United States)

    Hao, Xiaoling

    Business operation of banks relies increasingly on information technology (IT), and the most important role of IT is to guarantee the operational continuity of business processes. IT risk management efforts therefore need to be seen from the perspective of operational continuity. Traditional IT risk studies focused on IT asset-based risk analysis and risk-matrix-based qualitative risk evaluation. In practice, the IT risk management practices of the banking industry are still limited to the IT department and are not integrated into business risk management, which causes the two departments to work in isolation. This paper presents an improved methodology for dealing with IT operational risk. It adopts a quantitative measurement method based on internal business loss data about IT events, and uses Monte Carlo simulation to predict the potential losses. We establish the correlation between IT resources and business processes so that IT and business risk management can work synergistically.
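
    The frequency-severity Monte Carlo that such a measurement model typically relies on can be sketched in a few lines; the Poisson and lognormal parameters below are placeholders that would, in practice, be fitted to the bank's internal loss data.

```python
# Loss-distribution sketch: annual event counts ~ Poisson, per-event
# losses ~ lognormal, aggregated by Monte Carlo to a potential-loss
# quantile.
import numpy as np

rng = np.random.default_rng(42)
lam = 12              # mean number of IT loss events per year (assumed)
mu, sigma = 9.0, 1.2  # lognormal parameters of per-event loss (assumed)

annual_losses = np.array([
    rng.lognormal(mu, sigma, rng.poisson(lam)).sum()
    for _ in range(100_000)
])
print(f"expected annual loss: {annual_losses.mean():,.0f}")
print(f"99.9% quantile of annual loss: {np.quantile(annual_losses, 0.999):,.0f}")
```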

  8. Methods and Model Dependency of Extreme Event Attribution: The 2015 European Drought

    Science.gov (United States)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Vautard, Robert; van Oldenborgh, Geert J.; Wilcox, Laura; Seneviratne, Sonia I.

    2017-10-01

    Science on the role of anthropogenic influence on extreme weather events, such as heatwaves or droughts, has evolved rapidly in recent years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual climate with its probability in a hypothetical, counterfactual climate without human-induced climate change. Several methods can be used for event attribution, based on climate model simulations and observations, and usually researchers only assess a subset of methods and data sources. Here, we explore the role of methodological choices for the attribution of the 2015 meteorological summer drought in Europe. We present contradictory conclusions on the relevance of human influence as a function of the chosen data source and event attribution methodology. Assessments using the maximum number of models and counterfactual climates with pre-industrial greenhouse gas concentrations point to an enhanced drought risk in Europe. However, other evaluations show contradictory evidence. These results highlight the need for a multi-model and multi-method framework in event attribution research, especially for events with a low signal-to-noise ratio and high model dependency such as regional droughts.
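
    The core attribution quantity, the probability ratio between factual and counterfactual climates, can be sketched with two simulated ensembles; distributions, threshold and ensemble sizes below are illustrative only.

```python
# Probability ratio PR = p_factual / p_counterfactual for exceeding a
# drought threshold, estimated from two model ensembles.
import numpy as np

rng = np.random.default_rng(7)
# Summer precipitation-deficit index in two 1000-member ensembles
factual = rng.normal(loc=-0.3, scale=1.0, size=1000)        # with human influence
counterfactual = rng.normal(loc=0.0, scale=1.0, size=1000)  # pre-industrial GHGs

threshold = -1.5                   # event definition: deficit below threshold
p1 = np.mean(factual < threshold)
p0 = np.mean(counterfactual < threshold)
pr = p1 / p0
far = 1 - p0 / p1                  # fraction of attributable risk
print(f"p1={p1:.3f}, p0={p0:.3f}, PR={pr:.2f}, FAR={far:.2f}")
```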

  9. Design a Learning-Oriented Fall Event Reporting System Based on Kirkpatrick Model.

    Science.gov (United States)

    Zhou, Sicheng; Kang, Hong; Gong, Yang

    2017-01-01

    Patient falls have been a severe problem in healthcare facilities around the world due to their prevalence and cost. Routine fall prevention training programs are not as effective as expected. Using event reporting systems is the trend for reducing patient safety events such as falls, although the systems have some limitations at the current stage. We summarized these limitations through a literature review and developed an improved web-based fall event reporting system. The Kirkpatrick model, widely used in the business area for training program evaluation, has been integrated into the design of our system. Unlike traditional event reporting systems that only collect and store the reports, our system automatically annotates and analyzes the reported events, and provides users with timely knowledge support specific to the reported event. The paper illustrates the design of our system and how its features are intended to reduce patient falls by learning from previous errors.

  10. Surface water flood risk and management strategies for London: An Agent-Based Model approach

    Directory of Open Access Journals (Sweden)

    Jenkins Katie

    2016-01-01

    Full Text Available Flooding is recognised as one of the most common and costliest natural disasters in England. Flooding in urban areas during heavy rainfall is known as ‘surface water flooding’, considered to be the most likely cause of flood events and one of the greatest short-term climate risks for London. In this paper we present results from a novel Agent-Based Model designed to assess the interplay between different adaptation options, different agents, and the role of flood insurance and the flood insurance pool, Flood Re, in the context of climate change. The model illustrates how investment in adaptation options could reduce London’s surface water flood risk, today and in the future. However, the benefits can be outweighed by continued development in high-risk areas and the effects of climate change. Flood Re is beneficial in its function of providing affordable insurance, even under climate change. However, it offers no additional benefits in terms of overall risk reduction, and will face increasing pressure due to rising surface water flood risk in the future. The modelling approach and findings are highly relevant for reviewing the proposed Flood Re scheme, as well as for wider discussions on the potential of insurance schemes, and broader multi-sectoral partnerships, to incentivise flood risk management in the UK and internationally.

  11. Incidence of cardiovascular events and associated risk factors in kidney transplant patients: a competing risks survival analysis.

    Science.gov (United States)

    Seoane-Pillado, María Teresa; Pita-Fernández, Salvador; Valdés-Cañedo, Francisco; Seijo-Bestilleiro, Rocio; Pértega-Díaz, Sonia; Fernández-Rivera, Constantino; Alonso-Hernández, Ángel; González-Martín, Cristina; Balboa-Barreiro, Vanesa

    2017-03-07

    The high prevalence of cardiovascular risk factors among the renal transplant population accounts for increased mortality. The aim of this study is to determine the incidence of cardiovascular events and the factors associated with cardiovascular events in these patients. An observational ambispective follow-up study of renal transplant recipients (n = 2029) in the health district of A Coruña (Spain) during the period 1981-2011 was completed. Competing risks survival analysis methods were applied to estimate the cumulative incidence of developing cardiovascular events over time and to identify which characteristics were associated with the risk of these events. Post-transplant cardiovascular events are defined as the presence of myocardial infarction, invasive coronary artery therapy, cerebral vascular events, new-onset angina, congestive heart failure, rhythm disturbances, peripheral vascular disease and cardiovascular disease and death. The cause of death was identified through the medical history and death certificate using ICD9 (390-459, except: 427.5, 435, 446, 459.0). The mean age of patients at the time of transplantation was 47.0 ± 14.2 years; 62% were male. 16.5% had suffered some cardiovascular disease prior to transplantation and 9.7% had suffered a cardiovascular event. The mean follow-up period for the patients with a cardiovascular event was 3.5 ± 4.3 years. Applying competing risks methodology, it was observed that the cumulative incidence of the event was 5.0% one year after transplantation, 8.1% after five years, and 11.9% after ten years. After applying multivariate models, the variables with an independent effect for predicting cardiovascular events are: male sex, age of recipient, previous cardiovascular disorders, pre-transplant smoking and post-transplant diabetes. This study makes it possible to determine in kidney transplant patients, taking into account competing events, the incidence of post-transplant cardiovascular events and
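
    The competing-risks cumulative incidence function (in Aalen-Johansen form) can be estimated in a few lines for untied data; the toy follow-up records below are invented for illustration.

```python
# Cumulative incidence under competing risks:
# CIF_k(t) = sum over event times of S(t-) * d_k / n, where a competing
# death (cause 2) removes patients from being able to have a CV event
# (cause 1). Event codes: 0 = censored, 1 = CV event, 2 = competing death.

def cumulative_incidence(times, events, cause=1):
    data = sorted(zip(times, events))
    n = len(data)                      # number at risk
    surv, cif, curve = 1.0, 0.0, []
    for t, e in data:
        if e == cause:
            cif += surv * (1 / n)      # increment by S(t-) * hazard of cause
        if e != 0:
            surv *= 1 - 1 / n          # overall survival drops for any event
        n -= 1
        curve.append((t, cif))
    return curve

times = [0.8, 1.2, 2.0, 3.1, 4.5, 5.0, 6.3, 7.7, 9.0, 10.2]  # years
events = [1, 0, 2, 1, 0, 1, 2, 0, 1, 0]
for t, c in cumulative_incidence(times, events):
    print(f"t={t:4.1f}  CIF={c:.3f}")
```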

  12. Risk factors for hazardous events in olfactory-impaired patients.

    Science.gov (United States)

    Pence, Taylor S; Reiter, Evan R; DiNardo, Laurence J; Costanzo, Richard M

    2014-10-01

    Normal olfaction provides essential cues that allow early detection and avoidance of potentially hazardous situations. Thus, patients with impaired olfaction may be at increased risk of experiencing certain hazardous events, such as cooking or house fires, delayed detection of gas leaks, and exposure to or ingestion of toxic substances. To identify risk factors and potential trends over time in olfactory-related hazardous events in patients with impaired olfactory function. Retrospective cohort study of 1047 patients presenting to a university smell and taste clinic between 1983 and 2013. A total of 704 patients had both clinical olfactory testing and a hazard interview and were studied. On the basis of olfactory function testing results, patients were categorized as normosmic (n = 161), mildly hyposmic (n = 99), moderately hyposmic (n = 93), severely hyposmic (n = 142), and anosmic (n = 209). Patient evaluation included interview, examination, and olfactory testing. Incidence of specific olfaction-related hazardous events (ie, burning pots and/or pans, starting a fire while cooking, inability to detect gas leaks, inability to detect smoke, and ingestion of toxic substances or spoiled foods) by degree of olfactory impairment. The incidence of having experienced any hazardous event progressively increased with degree of impairment: normosmic (18.0%), mildly hyposmic (22.2%), moderately hyposmic (31.2%), severely hyposmic (32.4%), and anosmic (39.2%). Over 3 decades there was no significant change in the overall incidence of hazardous events. Analysis of demographic data (age, sex, race, smoking status, and etiology) revealed significant differences in the incidence of hazardous events based on age (among 397 patients <65 years, the incidence was higher than among the 31 of 146 patients ≥65 years [21.3%] who experienced an event), sex (women experienced events more often than the 73 of 265 men [27.6%] who did; P = .009), and race (among 98 African Americans, 41 [41.8%] with a hazardous event, vs 134 of 434 whites [30.9%]; P = .04)

  13. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    Science.gov (United States)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
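
    Event-tree propagation is a product of branch probabilities along each path from the initiating event to an outcome; a toy ATPF-style sketch with made-up frequencies:

```python
# Event-tree path probability: multiply branch probabilities from the
# initiating event through each branch point to the outcome of interest.
initiating_frequency = 0.1  # crater-wall collapses per year (hypothetical)

# Each branch point: (description, P(yes)); a path multiplies through.
branches = {
    "flow reaches inhabited sector": 0.3,
    "person present in sector": 0.2,
    "person fails to escape": 0.5,
}

p_path = initiating_frequency
for branch, p in branches.items():
    p_path *= p
    print(f"after '{branch}': {p_path:.4f} per year")

# Individual risk of fatality per year for a resident of the sector,
# which a risk classification scale would then map to a linguistic class.
print(f"individual risk: {p_path:.1e} per year")
```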

  14. Prediction of First Cardiovascular Disease Event in Type 1 Diabetes Mellitus: The Steno Type 1 Risk Engine.

    Science.gov (United States)

    Vistisen, Dorte; Andersen, Gregers Stig; Hansen, Christian Stevns; Hulman, Adam; Henriksen, Jan Erik; Bech-Nielsen, Henning; Jørgensen, Marit Eika

    2016-03-15

    Patients with type 1 diabetes mellitus are at increased risk of developing cardiovascular disease (CVD), but they are currently undertreated. There are no risk scores used on a regular basis in clinical practice for assessing the risk of CVD in type 1 diabetes mellitus. From 4306 clinically diagnosed adult patients with type 1 diabetes mellitus, we developed a prediction model for estimating the risk of a first fatal or nonfatal CVD event (ischemic heart disease, ischemic stroke, heart failure, and peripheral artery disease). Detailed clinical data including lifestyle factors were linked to event data from validated national registers. The risk prediction model was developed using a 2-stage approach. First, a nonparametric, data-driven approach was used to identify potentially informative risk factors and interactions (random forest and survival tree analysis). Second, based on results from the first step, Poisson regression analysis was used to derive the final model. The final CVD prediction model was externally validated in a different population of 2119 patients with type 1 diabetes mellitus. During a median follow-up of 6.8 years (interquartile range, 2.9-10.9), a total of 793 (18.4%) patients developed CVD. The final prediction model included age, sex, diabetes duration, systolic blood pressure, low-density lipoprotein cholesterol, hemoglobin A1c, albuminuria, glomerular filtration rate, smoking, and exercise. Discrimination was excellent for a 5-year CVD event, with a C-statistic of 0.826 (95% confidence interval, 0.807-0.845) in the derivation data and a C-statistic of 0.803 (95% confidence interval, 0.767-0.839) in the validation data. The Hosmer-Lemeshow test showed good calibration (P>0.05) in both cohorts. This high-performing CVD risk model allows for the implementation of decision rules in a clinical setting. © 2016 American Heart Association, Inc.
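
    A second-stage Poisson regression of this kind, with person-time as an offset and absolute risk recovered as 1 - exp(-rate × horizon), can be sketched with simulated data; the covariates and coefficients below are placeholders, not the Steno cohort model.

```python
# Poisson regression risk model with a person-time offset, then a
# predicted 5-year absolute risk for a hypothetical patient.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 4000
age = rng.normal(45, 12, n)
hba1c = rng.normal(8.0, 1.2, n)
followup = rng.uniform(1, 11, n)                 # years of follow-up
rate = np.exp(-9 + 0.07 * age + 0.25 * hba1c)    # true event rate per year
events = rng.poisson(rate * followup)            # simulated event counts

X = sm.add_constant(np.column_stack([age, hba1c]))
fit = sm.GLM(events, X, family=sm.families.Poisson(),
             offset=np.log(followup)).fit()

# Predicted 5-year CVD risk for a 50-year-old with HbA1c of 9%:
lin = fit.params @ [1, 50, 9.0]
print(f"5-year risk: {1 - np.exp(-np.exp(lin) * 5):.2%}")
```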

  15. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulate train movement while taking an energy-saving factor into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence from the previous trains, the following trains should be accelerated or braked frequently to control the headway distance, leading to more energy consumption. (general)
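    A minimal event-queue simulation in the same spirit is sketched below: trains traverse a sequence of single-line segments, and a following train must hold until a minimum headway behind its predecessor has elapsed. The segment times, headway and timetable are invented for the example.

```python
import heapq

SEGMENT_TIME = {0: 300, 1: 240, 2: 360}   # travel time per track segment (s)
MIN_HEADWAY = 120                         # minimum headway at segment entry (s)
N_TRAINS, N_SEGMENTS = 5, 3

# event = (time, train, segment about to be entered); departures every 60 s
events = [(i * 60, i, 0) for i in range(N_TRAINS)]
heapq.heapify(events)
last_entry = {}                           # last entry time per segment

while events:
    t, train, seg = heapq.heappop(events)
    if seg == N_SEGMENTS:
        print(f"train {train} reaches the terminus at t={t} s")
        continue
    # A following train may not enter a segment until the headway has elapsed;
    # this holding behaviour mirrors the braking the abstract attributes to
    # cumulative influence from preceding trains.
    t_enter = max(t, last_entry.get(seg, -MIN_HEADWAY) + MIN_HEADWAY)
    last_entry[seg] = t_enter
    heapq.heappush(events, (t_enter + SEGMENT_TIME[seg], train, seg + 1))
```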

  16. Adverse life events as risk factors for behavioural and emotional problems in a 7-year follow-up of a population-based child cohort.

    Science.gov (United States)

    Rasmussen, Cathrine Skovmand; Nielsen, Louise Gramstrup; Petersen, Dorthe Janne; Christiansen, Erik; Bilenberg, Niels

    2014-04-01

    The aim of the study was to identify risk factors for significant changes in emotional and behavioural problem load in a community-based cohort of Danish children aged 9-16 years, the risk factors being seven parental and two child-related adverse life events. Data on emotional and behavioural problems were obtained from parents filling in the Child Behavior Checklist (CBCL) when the child was 8-9 and again when 15 years old. Data on risk factors were drawn from Danish registers. Logistic regression was used to estimate crude and adjusted odds of change. Parental divorce significantly raised the odds ratio of an increase in emotional and behavioural problems; furthermore, the risk of deterioration in problem behaviour rose significantly with an increasing number of adverse life events. By dividing the children into four groups based on the pathway in problem load (increasers, decreasers, high persisters and low persisters), we found that children with a consistently high level of behavioural problems also had the highest number of adverse life events compared with any other group. Family break-up was found to be a significant risk factor. This supports findings in previous studies. The fact that no other risk factor proved to be of significance might be due to lack of power in the study. Children experiencing high levels of adverse life events are at high risk of chronic problem behaviour. Thus these risk factors should be assessed in daily clinical practice.

  17. Race/Ethnic Differences in the Associations of the Framingham Risk Factors with Carotid IMT and Cardiovascular Events.

    Science.gov (United States)

    Gijsberts, Crystel M; Groenewegen, Karlijn A; Hoefer, Imo E; Eijkemans, Marinus J C; Asselbergs, Folkert W; Anderson, Todd J; Britton, Annie R; Dekker, Jacqueline M; Engström, Gunnar; Evans, Greg W; de Graaf, Jacqueline; Grobbee, Diederick E; Hedblad, Bo; Holewijn, Suzanne; Ikeda, Ai; Kitagawa, Kazuo; Kitamura, Akihiko; de Kleijn, Dominique P V; Lonn, Eva M; Lorenz, Matthias W; Mathiesen, Ellisiv B; Nijpels, Giel; Okazaki, Shuhei; O'Leary, Daniel H; Pasterkamp, Gerard; Peters, Sanne A E; Polak, Joseph F; Price, Jacqueline F; Robertson, Christine; Rembold, Christopher M; Rosvall, Maria; Rundek, Tatjana; Salonen, Jukka T; Sitzer, Matthias; Stehouwer, Coen D A; Bots, Michiel L; den Ruijter, Hester M

    2015-01-01

    Clinical manifestations and outcomes of atherosclerotic disease differ between ethnic groups. In addition, the prevalence of risk factors is substantially different. Primary prevention programs are based on data derived from almost exclusively White people. We investigated how race/ethnic differences modify the associations of established risk factors with atherosclerosis and cardiovascular events. We used data from an ongoing individual participant meta-analysis involving 17 population-based cohorts worldwide. We selected 60,211 participants without cardiovascular disease at baseline with available data on ethnicity (White, Black, Asian or Hispanic). We generated a multivariable linear regression model containing risk factors and ethnicity predicting mean common carotid intima-media thickness (CIMT) and a multivariable Cox regression model predicting myocardial infarction or stroke. For each risk factor we assessed how the association with the preclinical and clinical measures of cardiovascular atherosclerotic disease was affected by ethnicity. Ethnicity appeared to significantly modify the associations between risk factors and CIMT and cardiovascular events. The association between age and CIMT was weaker in Blacks and Hispanics. Systolic blood pressure associated more strongly with CIMT in Asians. HDL cholesterol and smoking associated less with CIMT in Blacks. Furthermore, the association of age and total cholesterol levels with the occurrence of cardiovascular events differed between Blacks and Whites. The magnitude of associations between risk factors and the presence of atherosclerotic disease differs between race/ethnic groups. These subtle, yet significant differences provide insight in the etiology of cardiovascular disease among race/ethnic groups. These insights aid the race/ethnic-specific implementation of primary prevention.

  18. Low-Dose Aspirin Discontinuation and Risk of Cardiovascular Events: A Swedish Nationwide, Population-Based Cohort Study.

    Science.gov (United States)

    Sundström, Johan; Hedberg, Jakob; Thuresson, Marcus; Aarskog, Pernilla; Johannesen, Kasper Munk; Oldgren, Jonas

    2017-09-26

    There are increasing concerns about risks associated with aspirin discontinuation in the absence of major surgery or bleeding. We investigated whether long-term low-dose aspirin discontinuation and treatment gaps increase the risk of cardiovascular events. We performed a cohort study of 601 527 users of low-dose aspirin for primary or secondary prevention in the Swedish prescription register between 2005 and 2009 who were >40 years of age, were free from previous cancer, and had ≥80% adherence during the first observed year of treatment. Cardiovascular events were identified with the Swedish inpatient and cause-of-death registers. The first 3 months after a major bleeding or surgical procedure were excluded from the time at risk. During a median of 3.0 years of follow-up, 62 690 cardiovascular events occurred. Patients who discontinued aspirin had a higher rate of cardiovascular events than those who continued (multivariable-adjusted hazard ratio, 1.37; 95% confidence interval, 1.34-1.41), corresponding to an additional cardiovascular event observed per year in 1 of every 74 patients who discontinue aspirin. The risk increased shortly after discontinuation and did not appear to diminish over time. In long-term users, discontinuation of low-dose aspirin in the absence of major surgery or bleeding was associated with a >30% increased risk of cardiovascular events. Adherence to low-dose aspirin treatment in the absence of major surgery or bleeding is likely an important treatment goal. © 2017 American Heart Association, Inc.
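    The "1 of every 74 patients" figure follows from the absolute difference in event rates implied by the hazard ratio. The back-of-the-envelope arithmetic below reproduces the order of magnitude; the baseline event rate is an assumed input for illustration, not a number reported in the abstract.

```python
# Number needed to harm per year = 1 / absolute rate difference.
rate_continuing = 0.036           # assumed events per patient-year on aspirin
hazard_ratio = 1.37               # multivariable-adjusted HR from the study
rate_discontinuing = rate_continuing * hazard_ratio
nnh_per_year = 1 / (rate_discontinuing - rate_continuing)
print(round(nnh_per_year))        # ~75 with these assumed inputs
```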

  19. Relative risk for cardiovascular atherosclerotic events after smoking cessation: 6–9 years excess risk in individuals with familial hypercholesterolemia

    Directory of Open Access Journals (Sweden)

    Kastelein John JP

    2006-10-01

    Abstract Background Smoking history is often dichotomized or trichotomized into, for example, "never, ever or current smoking". However, smoking must be treated as a time-dependent covariate when lifetime data are available: individuals do not smoke at birth, there is usually wide variation in smoking history, and smoking cessation must also be considered. Methods We therefore analyzed smoking as a time-dependent risk factor for cardiovascular atherosclerotic events in a cohort of 2400 individuals with familial hypercholesterolemia who were followed from birth until 2004. Excess risk after smoking cessation was modelled in a Cox regression model with linear and exponentially decaying trends. The model with the highest likelihood value was used to estimate the decay of the excess risk of smoking. Results Atherosclerotic events were observed in 779 patients with familial hypercholesterolemia, and 1569 individuals had a smoking history. In the model with the highest likelihood value, the risk reduction after cessation follows a linear pattern with time, and it appears to take 6 to 9 years before the excess risk is reduced to zero. The risk of atherosclerotic events due to smoking was estimated as 2.1 (95% confidence interval 1.5; 2.9). Conclusion Excess risk due to smoking declined linearly after cessation over six to nine years.
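    The sketch below shows how a time-dependent covariate such as smoking status enters a Cox model using the long (start/stop) data format, assuming the lifelines library. The tiny dataset is invented; the paper's linear post-cessation decay would enter as an additional time-varying column (e.g. years since quitting, capped at the 6-9 year horizon).

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per interval of constant covariate values per subject.
long_df = pd.DataFrame({
    "id":      [1, 1, 2, 2, 3, 4],
    "start":   [0, 20, 0, 30, 0, 0],
    "stop":    [20, 45, 30, 50, 40, 25],
    "smoking": [0, 1, 1, 0, 0, 1],   # status during each interval
    "event":   [0, 1, 0, 1, 0, 1],   # atherosclerotic event ends the interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratio for current smoking status
```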

  20. Challenges of Modeling Flood Risk at Large Scales

    Science.gov (United States)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public sector or in the private sector, is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale such that all inter-dependencies in a river network are well understood. From an insurance perspective the jury is still out on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at a scale suitable for a viable insurance market for flood peril. In order to serve the insurance market a model has to be event-simulation based and has to provide financial risk estimation that forms the basis for risk pricing, risk transfer and risk management at all levels of the insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain, for which a model has been developed. The model comprises several physically motivated components so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator, simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing
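    The critical output described above, an annual distribution of portfolio losses, can be sketched with a simple frequency-severity Monte Carlo: Poisson event counts per year and a heavy-tailed per-event loss. All parameter values below are illustrative assumptions, not calibrated to the Great Britain model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 100_000
lam = 1.8                                    # mean flood events per year
mu, sigma = 15.0, 1.2                        # lognormal per-event loss (log scale)

annual_loss = np.zeros(n_years)
n_events = rng.poisson(lam, n_years)
for year in range(n_years):
    # Aggregate loss = sum of losses over all flood events in the year.
    annual_loss[year] = rng.lognormal(mu, sigma, n_events[year]).sum()

# Exceedance-probability curve read-out: the 1-in-200-year aggregate loss.
print(np.quantile(annual_loss, 1 - 1 / 200))
```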

  1. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    International Nuclear Information System (INIS)

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.
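    The cumulative prospect theory machinery referred to above rests on two functions: an S-shaped value function and an inverse-S probability weighting function. The sketch below uses the commonly cited Tversky-Kahneman (1992) parameter estimates, purely for illustration of why low-probability, high-consequence events loom large for a risk-averse decision maker.

```python
import numpy as np

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """S-shaped value function: risk-averse for gains, loss-averse for losses."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, np.abs(x) ** alpha, -lam * np.abs(x) ** beta)

def weight(p, gamma=0.61):
    """Inverse-S probability weighting: small probabilities are overweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

p, loss = 0.01, -1_000_000.0       # a 1% chance of a large loss
print(weight(p) * value(loss))     # CPT-weighted disutility of the prospect
print(p * loss)                    # raw expected monetary value, for contrast
```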

  2. Canadian population risk of radon induced lung cancer variation range assessment based on various radon risk models

    International Nuclear Information System (INIS)

    Chen, Jing

    2017-01-01

    To address public concerns regarding radon risk and variations in risk estimates based on various risk models available in the literature, lifetime lung cancer risks were calculated with five well-known risk models using more recent Canadian vital statistics (5-year averages from 2008 to 2012). Variations in population risk estimation among various models were assessed. The results showed that the Canadian population risk of radon induced lung cancer can vary from 5.0 to 17% for men and 5.1 to 18% for women based on different radon risk models. Averaged over the estimates from various risk models with better radon dosimetry, 13% of lung cancer deaths among Canadian males and 14% of lung cancer deaths among Canadian females were attributable to long-term indoor radon exposure. (authors)

  3. Stochastic Watershed Models for Risk Based Decision Making

    Science.gov (United States)

    Vogel, R. M.

    2017-12-01

    Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology providing stochastic streamflow models (SSMs), which could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet SSMs have fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk based water resource decision making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors, to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation
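    The SWM recipe (deterministic core, stochastic forcing, parameter and model error) can be sketched in a few lines. The one-parameter linear reservoir below is a stand-in for a real watershed model, and all distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

def linear_reservoir(rain, k=0.2, s0=10.0):
    """Deterministic watershed model: storage drains at rate k per step."""
    s, flow = s0, []
    for r in rain:
        s += r
        q = k * s
        s -= q
        flow.append(q)
    return np.array(flow)

n_ens, n_days = 100, 365
ensemble = np.empty((n_ens, n_days))
for i in range(n_ens):
    rain = rng.gamma(shape=0.3, scale=8.0, size=n_days)   # stochastic forcing
    k = max(rng.normal(0.2, 0.03), 0.01)                  # parameter uncertainty
    noise = rng.lognormal(0.0, 0.1, n_days)               # model error
    ensemble[i] = linear_reservoir(rain, k=k) * noise

print(np.percentile(ensemble[:, -1], [5, 50, 95]))  # streamflow uncertainty
```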

  4. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi-dimensional schemes that are customized to serve specific information needs. EVER is based on an event concept that is very well suited for multi-dimensional modeling because measurement data often represent events in multi-dimensional databases.

  5. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
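    A minimal sketch of the log-normal treatment argued for above is given below: each expert supplies a median and an error factor for an initiating event frequency, and estimates are pooled geometrically with equal weights. The equal-weight pooling is a simplification; calibration-based weights (classical) or a prior-likelihood update (Bayesian) are the refinements the report discusses.

```python
import numpy as np

# (median per year, error factor) elicited from three hypothetical experts
experts = [(1e-4, 3.0), (3e-4, 10.0), (5e-5, 3.0)]

medians = np.array([m for m, _ in experts])
# For a log-normal, error factor EF = 95th percentile / median, so ln EF = 1.645 sigma.
sigmas = np.log([ef for _, ef in experts]) / 1.645

pooled_mu = np.mean(np.log(medians))            # equal-weight geometric pooling
pooled_sigma = np.sqrt(np.mean(sigmas ** 2))    # one simple dispersion choice
print("pooled median:", np.exp(pooled_mu))
print("pooled 95th percentile:", np.exp(pooled_mu + 1.645 * pooled_sigma))
```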

  6. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed

  7. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.

  8. Stressful life events and cancer risk

    DEFF Research Database (Denmark)

    Bergelt, C; Prescott, E; Grønbaek, M

    2006-01-01

    In a prospective cohort study in Denmark of 8736 randomly selected people, no evidence was found among 1011 subjects who developed cancer that self-reported stressful major life events had increased their risk for cancer.

  9. Calibration plots for risk prediction models in the presence of competing risks

    DEFF Research Database (Denmark)

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-01-01

    A predicted risk of 17% can be called reliable if it can be expected that the event will occur to about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks such as death due to other causes. Three problems, however, complicate the assessment of whether such a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with these problems, we propose to estimate calibration curves
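    One simple way to visualise calibration under competing risks is to group patients by predicted risk and estimate the observed absolute risk per group with the Aalen-Johansen estimator, which (unlike 1 minus Kaplan-Meier) remains valid when competing events are present. The sketch below assumes the lifelines library and uses purely synthetic data; with real data the observed risk would be evaluated at the model's prediction horizon rather than at the end of follow-up.

```python
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(1)
n = 2000
pred = rng.uniform(0.05, 0.6, n)                     # model-predicted risks
time = rng.exponential(8, n)                         # follow-up time (years)
event = rng.choice([0, 1, 2], n, p=[0.5, 0.3, 0.2])  # 0=censored, 2=competing

for lo, hi in [(0.05, 0.2), (0.2, 0.4), (0.4, 0.6)]:
    mask = (pred >= lo) & (pred < hi)
    ajf = AalenJohansenFitter()
    ajf.fit(time[mask], event[mask], event_of_interest=1)
    observed = ajf.cumulative_density_.iloc[-1, 0]   # CIF at end of follow-up
    print(f"group {lo:.2f}-{hi:.2f}: mean predicted {pred[mask].mean():.2f}, "
          f"observed {observed:.2f}")
```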

  10. Model based climate information on drought risk in Africa

    Science.gov (United States)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is the setup of a drought impact modeling platform called Africa RiskView that aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based on the Water Requirement Satisfaction Index (WRSI) as the fundamental indicator of agricultural performance, and uses historical records of food assistance operations to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa RiskView it is possible, in principle, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step to this challenging task is the exploration of the sources of uncertainties affecting the assessment based on modeled climate change scenarios. For this purpose, a limited set of climate models has been selected in order to verify the relevance of using climate model output data with Africa RiskView and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and has relied on model output available at the time. In particular, only one regional downscaling was available for the entire African continent from the ENSEMBLES project. The analysis shows that current coarse-resolution global climate models cannot directly feed into the Africa RiskView risk-analysis tool. However, regional downscaling may help correct the inherent biases observed in the datasets. Further analysis is performed using the first data available under the CORDEX framework. In particular, we consider a set of simulations driven by boundary conditions from the ERA-Interim reanalysis to evaluate the skill drought

  11. Use of documentary sources on past flood events for flood risk management and land planning

    Science.gov (United States)

    Cœur, Denis; Lang, Michel

    2008-09-01

    The knowledge of past catastrophic events can improve flood risk mitigation policy through better risk awareness. As such historical information is usually available in Europe for the past five centuries, historians are able to understand how past societies dealt with flood risk, and hydrologists can include information on past floods in an adapted probabilistic framework. In France, Flood Risk Mitigation Maps are based either on the largest known historical flood event or on the 100-year flood event if it is greater. Two actions can be suggested to promote the use of historical information in flood risk management: (1) the development of a regional flood database, with both historical and current data, in order to get good feedback on recent events and to improve flood risk education and awareness; (2) the commitment to maintain a permanent reference network of hydrometeorological observations for climate change studies.

  12. Assessing hail risk for a building portfolio by generating stochastic events

    Science.gov (United States)

    Nicolet, Pierrick; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel; Nguyen, Liliane; Voumard, Jérémie

    2015-04-01

    Among the natural hazards affecting buildings, hail is one of the most costly and is nowadays a major concern for building insurance companies. In Switzerland, several costly events have been reported in recent years, among them the July 2011 event, which cost the Aargau public insurance company (north-western Switzerland) around EUR 125 million. This study presents new developments in a stochastic model that aims at evaluating the risk for a building portfolio. Thanks to insurance and meteorological radar data of the 2011 Aargau event, vulnerability curves are proposed by comparing the damage rate to the radar intensity (i.e. the maximum hailstone size reached during the event, deduced from the radar signal). From these data, vulnerability is defined by a two-step process. The first step defines the probability for a building to be affected (i.e. to claim damages), while the second, if the building is affected, attributes a damage rate to the building from a probability distribution specific to the intensity class. To assess the risk, stochastic events are then generated by summing a set of Gaussian functions with 6 random parameters (X and Y location, maximum hailstone size, standard deviation, eccentricity and orientation). The location of these functions is constrained by a general event shape and by the position of the previously defined functions of the same event. For each generated event, the total cost is calculated in order to obtain a distribution of event costs. The general event parameters (shape, size, …) as well as the distribution of the Gaussian parameters are inferred from two radar intensity maps, namely that of the aforementioned event and that of an event which occurred in 2009. After a large number of simulations, the hailstone size distribution obtained in different regions is compared to the distribution inferred from pre-existing hazard maps, built from a larger set of radar data. The simulation parameters are then
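    A stripped-down version of this event generator is sketched below: a hail footprint built by summing random 2D Gaussian cells, evaluated at building locations, combined with a two-step vulnerability (claim probability, then damage rate). All parameter ranges are invented, and the eccentricity/orientation parameters are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)

def random_event(points, n_cells=6):
    """Hail footprint: sum of random 2D Gaussian cells at building locations."""
    x, y = points
    size = np.zeros_like(x)
    for _ in range(n_cells):
        cx, cy = rng.uniform(0, 50, 2)     # cell centre (km)
        amp = rng.uniform(1, 6)            # peak hailstone size (cm)
        sx, sy = rng.uniform(2, 8, 2)      # cell extent (km); no rotation here
        size += amp * np.exp(-((x - cx) ** 2 / (2 * sx ** 2)
                               + (y - cy) ** 2 / (2 * sy ** 2)))
    return size

def damage_rate(hail_cm):
    """Two-step vulnerability: probability of a claim, then a damage rate."""
    affected = rng.random(hail_cm.shape) < np.clip(0.05 * hail_cm, 0, 1)
    return affected * np.clip(0.02 * hail_cm ** 2, 0, 1)

buildings = rng.uniform(0, 50, (2, 500))   # 500 buildings in a 50x50 km area
values = rng.lognormal(13, 0.5, 500)       # insured values (EUR)
hail = random_event(buildings)
print("simulated event loss (EUR):", round((damage_rate(hail) * values).sum()))
```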

  13. Probabilistic Models for Solar Particle Events

    Science.gov (United States)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  14. TERRITORIAL RISK ASSESMENT AFTER TERRORIST ACT: EXPRESS MODEL

    Directory of Open Access Journals (Sweden)

    M. M. Biliaiev

    2018-02-01

    Purpose. The paper involves the development of a method to assess the territorial risk in the event of a terrorist attack using a chemical agent. Methodology. To describe the scattering in the atmosphere of a chemical agent released during a terrorist attack, the equation of mass transfer of an impurity in atmospheric air is used. The equation takes into account the velocity of the wind flow, atmospheric diffusion, the intensity of the chemical agent emission, and the presence of buildings near the site of the emission of the chemically hazardous substance. For numerical integration of the modeling equation, a finite difference method is used. A feature of the developed numerical model is the possibility of assessing the territorial risk in the event of a terrorist attack under different weather conditions and in the presence of buildings. Findings. A specialized numerical model and software package has been developed that can be used to assess the territorial risk both in the case of terrorist attacks with the use of chemical agents and in the case of extreme situations at chemically hazardous facilities and in transport. The method can be implemented on small and medium-sized computers, which allows it to be widely used for solving problems of the class under consideration. The results of a computational experiment are presented that allow estimating the possibilities of the proposed method for assessing the territorial risk in the event of a terrorist attack using a chemical agent. Originality. An effective method of assessing the territorial risk in the event of a terrorist attack using a chemically hazardous substance is proposed. The method can be used to assess the territorial risk in an urban environment, which makes it possible to obtain adequate data on possible damage areas. The method is based on the numerical integration of the fundamental mass transfer equation, which expresses the law of conservation of mass in a liquid medium. Practical
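    The modelling core described above can be sketched as an explicit finite-difference integration of a 2D advection-diffusion equation for an instantaneous release. Wind speed, diffusivity and grid values below are illustrative, and buildings could be represented by masking the corresponding cells.

```python
import numpy as np

nx, ny, dx, dt = 100, 100, 10.0, 0.5   # grid cells, spacing (m), time step (s)
u, v, D = 2.0, 0.5, 5.0                # wind components (m/s), diffusivity (m2/s)
c = np.zeros((nx, ny))
c[20, 50] = 1e3                        # release location and initial mass

for _ in range(500):
    # Upwind advection (valid for u, v > 0) plus central-difference diffusion,
    # applied to interior cells; boundaries stay at zero concentration.
    adv_x = -u * (c[1:-1, 1:-1] - c[:-2, 1:-1]) / dx
    adv_y = -v * (c[1:-1, 1:-1] - c[1:-1, :-2]) / dx
    diff = D * (c[2:, 1:-1] + c[:-2, 1:-1] + c[1:-1, 2:] + c[1:-1, :-2]
                - 4 * c[1:-1, 1:-1]) / dx ** 2
    c[1:-1, 1:-1] += dt * (adv_x + adv_y + diff)

peak = np.unravel_index(c.argmax(), c.shape)
print("peak concentration", c.max(), "at cell", peak)
```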

  15. Drinking game play among first-year college student drinkers: an event-specific analysis of the risk for alcohol use and problems.

    Science.gov (United States)

    Ray, Anne E; Stapleton, Jerod L; Turrisi, Rob; Mun, Eun-Young

    2014-09-01

    College students who play drinking games (DGs) more frequently report higher levels of alcohol use and experience more alcohol-related harm. However, the extent to which they are at risk for increased consumption and harm as a result of DG play on a given event after accounting for their typical DG participation, and typical and event drinking, is unclear. We examined whether first-year students consumed more alcohol and were more likely to experience consequences on drinking occasions when they played DGs. Participants (n = 336) completed up to six web-based surveys following weekend drinking events in their first semester. Alcohol use, DG play, and consequences were reported for the Friday and Saturday prior to each survey. Typical DG tendencies were controlled in all models. Typical and event alcohol use were controlled in models predicting risk for consequences. Participants consumed more alcohol on DG versus non-DG events. All students were more likely to experience blackout drinking consequences when they played DGs. Women were more likely to experience social-interpersonal consequences when they played DGs. DG play is an event-specific risk factor for increased alcohol use among first-year students, regardless of individual DG play tendencies. Further, event DG play signals increased risk for blackout drinking consequences for all students, and social-interpersonal consequences for women, aside from the amount of alcohol consumed on those occasions as well as typical drinking behaviors. Prevention efforts to reduce high-risk drinking may be strengthened by highlighting both event- and person-specific risks of DG play.

  16. Weather based risks and insurances for agricultural production

    Science.gov (United States)

    Gobin, Anne

    2015-04-01

    Extreme weather events such as frost, drought, heat waves and rain storms can have devastating effects on cropping systems. According to both the agriculture and finance sectors, a risk assessment of extreme weather events and their impact on cropping systems is needed. The principle of return periods or frequencies of natural hazards is adopted in many countries as the basis of eligibility for the compensation of associated losses. For adequate risk management and eligibility, hazard maps for events with a 20-year return period are often used. Damages due to extreme events are strongly dependent on crop type, crop stage, soil type and soil conditions. The impact of extreme weather events particularly during the sensitive periods of the farming calendar therefore requires a modelling approach to capture the mixture of non-linear interactions between the crop, its environment and the occurrence of the meteorological event in the farming calendar. Physically based crop models such as REGCROP (Gobin, 2010) assist in understanding the links between different factors causing crop damage. Subsequent examination of the frequency, magnitude and impacts of frost, drought, heat stress and soil moisture stress in relation to the cropping season and crop sensitive stages allows for risk profiles to be confronted with yields, yield losses and insurance claims. The methodology is demonstrated for arable food crops, bio-energy crops and fruit. The perspective of rising risk-exposure is exacerbated further by limited aid received for agricultural damage, an overall reduction of direct income support to farmers and projected intensification of weather extremes with climate change. Though average yields have risen continuously due to technological advances, there is no evidence that relative tolerance to adverse weather events has improved. The research is funded by the Belgian Science Policy Organisation (Belspo) under contract nr SD/RI/03A.

  17. The use of biologically based cancer risk models in radiation epidemiology

    International Nuclear Information System (INIS)

    Krewski, D.; Zielinski, J.M.; Hazelton, W.D.; Garner, M.J.; Moolgavkar, S.H.

    2003-01-01

    Biologically based risk projection models for radiation carcinogenesis seek to describe the fundamental biological processes involved in neoplastic transformation of somatic cells into malignant cancer cells. A validated biologically based model, whose parameters have a direct biological interpretation, can also be used to extrapolate cancer risks to different exposure conditions with some confidence. In this article, biologically based models for radiation carcinogenesis, including the two-stage clonal expansion (TSCE) model and its extensions, are reviewed. The biological and mathematical bases for such models are described, and the implications of key model parameters for cancer risk assessment examined. Specific applications of versions of the TSCE model to important epidemiologic datasets are discussed, including the Colorado uranium miners' cohort; a cohort of Chinese tin miners; the lifespan cohort of atomic bomb survivors in Hiroshima and Nagasaki; and a cohort of over 200,000 workers included in the National Dose Registry (NDR) of Canada. (author)

  18. Physics-based Entry, Descent and Landing Risk Model

    Science.gov (United States)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system (TPS) failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on the risk of failure were analyzed. A comparison of the direct computation and response surface approach was undertaken.

  19. Limits of Risk Predictability in a Cascading Alternating Renewal Process Model.

    Science.gov (United States)

    Lin, Xin; Moussawi, Alaa; Korniss, Gyorgy; Bakdash, Jonathan Z; Szymanski, Boleslaw K

    2017-07-27

    Most risk analysis models systematically underestimate the probability and impact of catastrophic events (e.g., economic crises, natural disasters, and terrorism) by not taking into account interconnectivity and interdependence of risks. To address this weakness, we propose the Cascading Alternating Renewal Process (CARP) to forecast interconnected global risks. However, assessments of the model's prediction precision are limited by lack of sufficient ground truth data. Here, we establish prediction precision as a function of input data size by using alternative long ground truth data generated by simulations of the CARP model with known parameters. We illustrate the approach on a model of fires in artificial cities assembled from basic city blocks with diverse housing. The results confirm that parameter recovery variance exhibits power law decay as a function of the length of available ground truth data. Using CARP, we also demonstrate estimation on a disparate real-world dataset with dependencies: prediction precision for a global risk model based on the World Economic Forum Global Risk Report. We conclude that the CARP model is an efficient method for predicting catastrophic cascading events with potential applications to emerging local and global interconnected risks.

  20. Automated reasoning with dynamic event trees: a real-time, knowledge-based decision aide

    International Nuclear Information System (INIS)

    Touchton, R.A.; Gunter, A.D.; Subramanyan, N.

    1988-01-01

    The models and data contained in a probabilistic risk assessment (PRA) Event Sequence Analysis represent a wealth of information that can be used for dynamic calculation of event sequence likelihood. In this paper we report a new and unique computerization methodology which utilizes these data. This sub-system (referred to as PREDICTOR) has been developed and tested as part of a larger system. PREDICTOR performs a real-time (re)calculation of the estimated likelihood of core-melt as a function of plant status. This methodology uses object-oriented programming techniques from the artificial intelligence discipline that enable one to codify event tree and fault tree logic models and associated probabilities developed in a PRA study. Existence of off-normal conditions is reported to PREDICTOR, which then updates the relevant failure probabilities throughout the event tree and fault tree models by dynamically replacing the off-the-shelf (or prior) probabilities with new probabilities based on the current situation. The new event probabilities are immediately propagated through the models (using 'demons') and an updated core-melt probability is calculated. Along the way, the dominant non-success path of each event tree is determined and highlighted. (author)
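    A toy version of the dynamic-updating idea is sketched below: branch probabilities live on objects, an off-normal condition overrides the off-the-shelf value, and the core-melt estimate is immediately recomputed. The tree structure and numbers are invented; a real PRA propagates many sequences through linked event and fault trees rather than a single AND path.

```python
class BranchPoint:
    def __init__(self, name, p_fail):
        self.name = name
        self.prior = p_fail          # off-the-shelf PRA probability
        self.current = p_fail

    def report_condition(self, p_fail):
        """Dynamically replace the prior with a status-based probability."""
        self.current = p_fail

tree = [BranchPoint("reactor trip", 1e-5),
        BranchPoint("aux feedwater", 3e-4),
        BranchPoint("high-pressure injection", 2e-3)]

def core_melt_probability(tree):
    # Simplification: core melt requires failure of every system on the path.
    p = 1.0
    for bp in tree:
        p *= bp.current
    return p

print("baseline:", core_melt_probability(tree))
tree[1].report_condition(0.5)        # aux feedwater reported degraded
print("updated: ", core_melt_probability(tree))
```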

  1. Tutorial in biostatistics: competing risks and multi-state models

    NARCIS (Netherlands)

    Putter, H.; Fiocco, M.; Geskus, R. B.

    2007-01-01

    Standard survival data measure the time span from some time origin until the occurrence of one type of event. If several types of events occur, a model describing progression to each of these competing risks is needed. Multi-state models generalize competing risks models by also describing transitions to intermediate events.

  2. An overview of techniques for linking high-dimensional molecular data to time-to-event endpoints by risk prediction models.

    Science.gov (United States)

    Binder, Harald; Porzelius, Christine; Schumacher, Martin

    2011-03-01

    Analysis of molecular data promises identification of biomarkers for improving prognostic models, thus potentially enabling better patient management. For identifying such biomarkers, risk prediction models can be employed that link high-dimensional molecular covariate data to a clinical endpoint. In low-dimensional settings, a multitude of statistical techniques already exists for building such models, e.g. allowing for variable selection or for quantifying the added value of a new biomarker. We provide an overview of techniques for regularized estimation that transfer this toward high-dimensional settings, with a focus on models for time-to-event endpoints. Techniques for incorporating specific covariate structure are discussed, as well as techniques for dealing with more complex endpoints. Employing gene expression data from patients with diffuse large B-cell lymphoma, some typical modeling issues from low-dimensional settings are illustrated in a high-dimensional application. First, the performance of classical stepwise regression is compared to stage-wise regression, as implemented by a component-wise likelihood-based boosting approach. A second issue arises when artificially transforming the response into a binary variable. The effects of the resulting loss of efficiency and potential bias in a high-dimensional setting are illustrated, and a link to competing risks models is provided. Finally, we discuss conditions for adequately quantifying the added value of high-dimensional gene expression measurements, both at the stage of model fitting and when performing evaluation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
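    One openly available implementation of component-wise gradient boosting for survival endpoints is in the scikit-survival package; the sketch below applies it to synthetic high-dimensional data (n much smaller than p) as a stand-in for the gene expression setting. The class name and API are scikit-survival's, assumed installed; this is not the authors' own boosting code.

```python
import numpy as np
from sksurv.ensemble import ComponentwiseGradientBoostingSurvivalAnalysis
from sksurv.util import Surv

rng = np.random.default_rng(0)
n, p = 200, 1000                           # n << p, as with expression data
X = rng.normal(size=(n, p))
risk = 0.8 * X[:, 0] - 0.6 * X[:, 1]       # only two informative covariates
time = rng.exponential(np.exp(-risk))      # event times depend on the risk score
event = rng.random(n) < 0.7                # roughly 30% censoring
y = Surv.from_arrays(event=event, time=time)

# Each boosting step updates a single covariate's coefficient, giving the
# stage-wise, implicitly variable-selecting behaviour discussed above.
model = ComponentwiseGradientBoostingSurvivalAnalysis(n_estimators=100)
model.fit(X, y)
print("training concordance index:", model.score(X, y))
```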

  3. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  4. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and wave theory.

  5. Modeling crowd behavior based on the discrete-event multiagent approach

    OpenAIRE

    Лановой, Алексей Феликсович; Лановой, Артем Алексеевич

    2014-01-01

    A crowd is a temporary, relatively unorganized group of people who are in close physical contact with each other. The individual behavior of a person outside the crowd is determined by many factors associated with his or her intellectual activity, but inside the crowd a person loses identity and begins to obey simpler laws of behavior. The paper describes one approach to the construction of a multi-level model of the crowd using a discrete-event multiagent approach. Based on this analysi...

  6. Cognitive complexity of the medical record is a risk factor for major adverse events.

    Science.gov (United States)

    Roberson, David; Connell, Michael; Dillis, Shay; Gauvreau, Kimberlee; Gore, Rebecca; Heagerty, Elaina; Jenkins, Kathy; Ma, Lin; Maurer, Amy; Stephenson, Jessica; Schwartz, Margot

    2014-01-01

    Patients in tertiary care hospitals are more complex than in the past, but the implications of this are poorly understood as "patient complexity" has been difficult to quantify. We developed a tool, the Complexity Ruler, to quantify the amount of data (as bits) in the patient’s medical record. We designated the amount of data in the medical record as the cognitive complexity of the medical record (CCMR). We hypothesized that CCMR is a useful surrogate for true patient complexity and that higher CCMR correlates with risk of major adverse events. The Complexity Ruler was validated by comparing the measured CCMR with physician rankings of patient complexity on specific inpatient services. It was tested in a case-control model of all patients with major adverse events at a tertiary care pediatric hospital from 2005 to 2006. The main outcome measure was an externally reported major adverse event. We measured CCMR for 24 hours before the event, and we estimated lifetime CCMR. Above empirically derived cutoffs, 24-hour and lifetime CCMR were risk factors for major adverse events (odds ratios, 5.3 and 6.5, respectively). In a multivariate analysis, CCMR alone was essentially as predictive of risk as a model that started with 30-plus clinical factors. CCMR correlates with physician assessment of complexity and risk of adverse events. We hypothesize that increased CCMR increases the risk of physician cognitive overload. An automated version of the Complexity Ruler could allow identification of at-risk patients in real time.
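    One way to realize the "amount of data (as bits)" idea is to approximate a record's information content by its compressed size, as sketched below. This is an illustration of the concept only, not the instrument used in the study, and the example records are invented.

```python
import zlib

def ccmr_bits(record_text: str) -> int:
    """Cognitive complexity of the medical record, approximated in bits."""
    return 8 * len(zlib.compress(record_text.encode("utf-8")))

simple = "Well child visit. No medications. No allergies."
complex_ = ("Ex-24wk preemie. s/p Norwood, Glenn. Meds: furosemide, aspirin, "
            "sildenafil. 14 subspecialty consults this admission.")
print(ccmr_bits(simple), ccmr_bits(complex_))   # higher bits = higher CCMR
```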

  7. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with an opportunity for hands-on demonstrations. Brief descriptions of the tools are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation by using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties for beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  8. Cardiovascular risk prediction in HIV-infected patients: comparing the Framingham, atherosclerotic cardiovascular disease risk score (ASCVD), Systematic Coronary Risk Evaluation for the Netherlands (SCORE-NL) and Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) risk prediction models.

    Science.gov (United States)

    Krikke, M; Hoogeveen, R C; Hoepelman, A I M; Visseren, F L J; Arends, J E

    2016-04-01

    The aim of the study was to compare the predictions of five popular cardiovascular disease (CVD) risk prediction models, namely the Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) model, the Framingham Heart Study (FHS) coronary heart disease (FHS-CHD) and general CVD (FHS-CVD) models, the American Heart Association (AHA) atherosclerotic cardiovascular disease risk score (ASCVD) model and the Systematic Coronary Risk Evaluation for the Netherlands (SCORE-NL) model. A cross-sectional design was used to compare the cumulative CVD risk predictions of the models. Furthermore, the predictions of the general CVD models were compared with those of the HIV-specific D:A:D model using three categories (<10%, 10-20% and >20%) to categorize the risk and to determine the degree to which patients were categorized similarly or in a higher/lower category. A total of 997 HIV-infected patients were included in the study: 81% were male and they had a median age of 46 [interquartile range (IQR) 40-52] years, a known duration of HIV infection of 6.8 (IQR 3.7-10.9) years, and a median time on ART of 6.4 (IQR 3.0-11.5) years. The D:A:D, ASCVD and SCORE-NL models gave a lower cumulative CVD risk than the FHS-CVD and FHS-CHD models. Comparing the general CVD models with the D:A:D model, the FHS-CVD and FHS-CHD models only classified 65% and 79% of patients, respectively, in the same category as did the D:A:D model. However, for the ASCVD and SCORE-NL models, this percentage was 89% and 87%, respectively. Furthermore, FHS-CVD and FHS-CHD attributed a higher CVD risk to 33% and 16% of patients, respectively, while this percentage was much lower for the D:A:D, ASCVD and SCORE-NL models. This could have consequences regarding overtreatment, drug-related adverse events and drug-drug interactions. © 2015 British HIV Association.

  9. Satellite Collision Modeling with Physics-Based Hydrocodes: Debris Generation Predictions of the Iridium-Cosmos Collision Event and Other Impact Events

    International Nuclear Information System (INIS)

    Springer, H.K.; Miller, W.O.; Levatin, J.L.; Pertica, A.J.; Olivier, S.S.

    2010-01-01

    Satellite collision debris poses risks to existing space assets and future space missions. Predictive models of debris generated from these hypervelocity collisions are critical for developing accurate space situational awareness tools and effective mitigation strategies. Hypervelocity collisions involve complex phenomena that span several time- and length-scales. We have developed a satellite collision debris modeling approach consisting of a Lagrangian hydrocode enriched with smooth particle hydrodynamics (SPH), advanced material failure models, detailed satellite mesh models, and massively parallel computers. These computational studies enable us to investigate the influence of satellite center-of-mass (CM) overlap and orientation, relative velocity, and material composition on the size, velocity, and material type distributions of collision debris. We have applied our debris modeling capability to the recent Iridium 33-Cosmos 2251 collision event. While the relative velocity was well understood in this event, the degree of satellite CM overlap and orientation was ill-defined. In our simulations, we varied the collision CM overlap and orientation of the satellites from nearly maximum overlap to partial overlap on the outermost extents of the satellites (i.e., solar panels and gravity boom). As expected, we found that with increased satellite overlap, the overall debris cloud mass and momentum (transfer) increase, the average debris size decreases, and the debris velocity increases. The largest predicted debris can also provide insight into which satellite components were further removed from the impact location. A significant fraction of the momentum transfer is imparted to the smallest debris (< 1-5mm, dependent on mesh resolution), especially in large CM overlap simulations. While the inclusion of the smallest debris is critical to enforcing mass and momentum conservation in hydrocode simulations, there seems to be relatively little interest in their

  10. Carotid Atherosclerosis Progression and Risk of Cardiovascular Events in a Community in Taiwan.

    Science.gov (United States)

    Chen, Pei-Chun; Jeng, Jiann-Shing; Hsu, Hsiu-Ching; Su, Ta-Chen; Chien, Kuo-Liong; Lee, Yuan-Teh

    2016-05-12

    The authors investigated the association between progression of carotid atherosclerosis and incidence of cardiovascular disease in a community cohort in Taiwan. Such data have rarely been reported in Asian populations. Study subjects were 1,398 participants who underwent ultrasound measures of common carotid artery intima-media thickness (IMT) and extracranial carotid artery plaque score at both 1994-1995 and 1999-2000 surveys. A Cox proportional hazards model was used to assess the risk of incident cardiovascular disease. During a median follow-up of 13 years (1999-2013), 71 strokes and 68 coronary events occurred. The 5-year individual IMT change was not associated with development of cardiovascular events in unadjusted and adjusted models. Among subjects without plaque in 1994-1995, we observed elevated risk associated with the presence of new plaque (plaque score >0 in 1999-2000) in a dose-response manner in unadjusted and age- and sex-adjusted models. The associations attenuated and became statistically non-significant after controlling for cardiovascular risk factors (hazard ratio [95% confidence interval] for plaque score >2 vs. 0: stroke, 1.61 [0.79-3.27], coronary events, 1.13 [0.48-2.69]). This study suggested that carotid plaque formation measured by ultrasound is associated with an increased risk of developing cardiovascular disease, and cardiovascular risk factors explain the associations to a large extent.

  11. Alternative measures of risk of extreme events in decision trees

    International Nuclear Information System (INIS)

    Frohwein, H.I.; Lambert, J.H.; Haimes, Y.Y.

    1999-01-01

    A need is identified for a methodology to control extreme events, defined as low-probability, high-consequence incidents, in sequential decisions. A variety of alternative and complementary measures of the risk of extreme events are examined for their usability as objective functions in sequential decisions, represented as single- or multiple-objective decision trees. Earlier work addressed difficulties, related to non-separability, in minimizing some measures of the risk of extreme events in sequential decisions. In an extension of these results, it is shown how some non-separable measures of the risk of extreme events can be interpreted in terms of separable constituents of risk, thereby enabling a wider class of measures of the risk of extreme events to be handled in a straightforward manner in a decision tree. Also for extreme events, results are given to enable minimax- and Hurwicz-criterion analyses in decision trees. An example demonstrates the incorporation of different measures of the risk of extreme events in a multi-objective decision tree. Conceptual formulations for optimizing non-separable measures of the risk of extreme events are identified as an important area for future investigation.
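    For a single decision stage, the Hurwicz criterion named above blends an alternative's best- and worst-case outcomes; the sketch below scores invented loss scenarios this way. With alpha = 0 over losses it reduces to the minimax rule also mentioned in the abstract.

```python
def hurwicz(losses, alpha=0.3):
    """Hurwicz score over losses: alpha weights the best case (optimism)."""
    return alpha * min(losses) + (1 - alpha) * max(losses)

alternatives = {
    "retrofit now":   [5.0, 6.0, 7.0],    # losses under three scenarios
    "defer 10 years": [1.0, 2.0, 30.0],   # cheap unless the rare event hits
}
best = min(alternatives, key=lambda a: hurwicz(alternatives[a]))
print({a: hurwicz(l) for a, l in alternatives.items()}, "->", best)
```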

  12. Managing and understanding risk perception of surface leaks from CCS sites: risk assessment for emerging technologies and low-probability, high-consequence events

    Science.gov (United States)

    Augustin, C. M.

    2015-12-01

    Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occurring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migrations, material interactions, and atmospheric dispersion for leaks of various duration and volume. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings -analyzed in Slovic and Weber's 2002 framework - that show a high unknown, high dread risk perception of leaks from a CCS site. Secondary findings are a

  13. A quantile-based Time at Risk: A new approach for assessing risk in financial markets

    Science.gov (United States)

    Bolgorian, Meysam; Raei, Reza

    2013-11-01

    In this paper, we provide a new measure for the evaluation of risk in financial markets. This measure is based on the return interval of critical events in financial markets or other investment situations. Our main goal was to devise a model analogous to Value at Risk (VaR). Whereas VaR, for a given financial asset, probability level, and time horizon, gives a critical value such that the likelihood that the loss on the asset over the time horizon exceeds this value equals the given probability level, our concept of Time at Risk (TaR), using a probability distribution function of return intervals, provides a critical time such that the probability that the return interval of a critical event exceeds this time equals the given probability level. As an empirical application, we applied our model to data from the Tehran Stock Exchange Price Index (TEPIX) as a financial asset (market portfolio) and reported the results.
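
A minimal sketch of the TaR idea as described: take the empirical return intervals of critical events and read off the quantile at the given probability level. The loss threshold, the synthetic return series, and the 5% level are assumptions for illustration; the paper itself works with a fitted return-interval distribution and TEPIX data.

```python
import numpy as np

def time_at_risk(returns, critical_loss, prob_level):
    """Estimate Time at Risk: the time t such that
    P(return interval of a critical event > t) = prob_level."""
    # Indices of days on which the loss exceeds the critical threshold.
    event_days = np.flatnonzero(returns <= -critical_loss)
    if len(event_days) < 2:
        raise ValueError("too few critical events to estimate TaR")
    intervals = np.diff(event_days)            # return intervals (in days)
    # Empirical (1 - prob_level) quantile of the return-interval distribution.
    return np.quantile(intervals, 1.0 - prob_level)

# Illustrative daily returns (a synthetic stand-in for TEPIX data).
rng = np.random.default_rng(0)
r = rng.standard_t(df=4, size=5000) * 0.01
print(f"TaR(5% loss, 5% level): {time_at_risk(r, 0.05, 0.05):.0f} days")
```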

  14. Analysis and modeling of a hail event consequences on a building portfolio

    Science.gov (United States)

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

    North-West Switzerland was affected by a severe hailstorm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 cantons (out of 26) with a public insurance. The aim of this project is to benefit from the collected insurance data to better understand and estimate the risk of such events. In a first step, a simple hail event simulator, which had been developed for a previous hail episode, is modified. The geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians instead of 1D Gaussians on profiles, as was the case in the previous version. The tool is then tested on this new event in order to establish its ability to give a fast damage estimate based on the radar image and on building values and locations. The geometrical properties are used in a further step to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, it is expected that the orientation of the roof influences the aging of the material by controlling the frequency and amplitude of freeze-thaw cycles, thus changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures. This information is then compared with insurance claims. A last part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the

  15. A case for multi-model and multi-approach based event attribution: The 2015 European drought

    Science.gov (United States)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle

    2017-04-01

    Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.
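
The factual-versus-counterfactual comparison at the heart of event attribution reduces to a ratio of exceedance probabilities. The sketch below assumes two synthetic ensembles and an arbitrary drought threshold; real studies such as this one use large GCM ensembles and carefully defined event thresholds.

```python
import numpy as np

def probability_ratio(factual, counterfactual, threshold):
    """Attribution metric PR = P1 / P0: the change in occurrence
    probability of events exceeding `threshold` due to human forcing."""
    p1 = np.mean(factual >= threshold)         # factual-world probability
    p0 = np.mean(counterfactual >= threshold)  # counterfactual probability
    return p1 / p0

# Illustrative ensembles of a drought index (larger values = drier).
rng = np.random.default_rng(1)
factual = rng.normal(0.3, 1.0, 10_000)         # world with anthropogenic forcing
counterfactual = rng.normal(0.0, 1.0, 10_000)  # hypothetical world without it

pr = probability_ratio(factual, counterfactual, threshold=2.0)
print(f"probability ratio: {pr:.2f}")          # PR > 1: event made more likely
print(f"FAR: {1 - 1/pr:.2f}")                  # fraction of attributable risk
```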

  16. Racial differences in risks for first cardiovascular events and noncardiovascular death: the Atherosclerosis Risk in Communities study, the Cardiovascular Health Study, and the Multi-Ethnic Study of Atherosclerosis.

    Science.gov (United States)

    Feinstein, Matthew; Ning, Hongyan; Kang, Joseph; Bertoni, Alain; Carnethon, Mercedes; Lloyd-Jones, Donald M

    2012-07-03

    No studies have compared first cardiovascular disease (CVD) events and non-CVD death between races in a competing risks framework, which examines risks for numerous events simultaneously. We used competing Cox models to estimate hazards for first CVD events and non-CVD death within and between races in 3 multicenter, National Heart, Lung, and Blood Institute-sponsored cohorts. Of 14 569 Atherosclerosis Risk in Communities (ARIC) study participants aged 45 to 64 years with mean follow-up of 10.5 years, 11.6% had CVD and 5.0% had non-CVD death as first events; among 4237 Cardiovascular Health Study (CHS) study participants aged 65 to 84 years and followed for 8.5 years, these figures were 43.2% and 15.7%, respectively. Middle-aged blacks were significantly more likely than whites to experience any CVD as a first event; this disparity disappeared by older adulthood and after adjustment for CVD risk factors. The pattern of results was similar for Multi-Ethnic Study of Atherosclerosis (MESA) participants. Traditional Cox and competing risks models yielded different results for coronary heart disease risk. Black men appeared somewhat more likely than white men to experience coronary heart disease with use of a standard Cox model (hazard ratio 1.06; 95% CI 0.90, 1.26), whereas they appeared less likely than white men to have a first coronary heart disease event with use of a competing risks model (hazard ratio, 0.77; 95% CI, 0.60, 1.00). CVD affects blacks at an earlier age than whites; this may be attributable in part to elevated CVD risk factor levels among blacks. Racial disparities in first CVD incidence disappear by older adulthood. Competing risks analyses may yield somewhat different results than traditional Cox models and provide a complementary approach to examining risks for first CVD events.
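
The distinction the authors draw between a standard Cox analysis and a competing risks analysis rests on the cumulative incidence function, which, unlike one minus the Kaplan-Meier estimate, does not treat competing deaths as censoring. Below is a small self-contained sketch of the nonparametric (Aalen-Johansen-type) estimator on synthetic data; the event codes and latent-time construction are assumptions for illustration, not the study's data.

```python
import numpy as np

def cumulative_incidence(times, events, cause):
    """Nonparametric cumulative incidence of `cause` under competing risks
    (events: 0 = censored, 1 = first CVD event, 2 = non-CVD death)."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times)
    surv = 1.0                                  # overall event-free survival S(t-)
    cif = 0.0
    out = []
    for t, e in zip(times, events):
        if e == cause:
            cif += surv * (1.0 / at_risk)       # cause-specific hazard increment
        if e != 0:
            surv *= 1.0 - 1.0 / at_risk         # any first event removes subject
        at_risk -= 1
        out.append((t, cif))
    return out

# Synthetic cohort: exponential latent times for CVD and non-CVD death.
rng = np.random.default_rng(7)
t_cvd, t_oth = rng.exponential(20, 2000), rng.exponential(30, 2000)
t_cens = rng.uniform(5, 25, 2000)
time = np.minimum.reduce([t_cvd, t_oth, t_cens])
event = np.select([t_cvd == time, t_oth == time], [1, 2], default=0)
print("CIF of first CVD event at end of follow-up:",
      round(cumulative_incidence(time, event, cause=1)[-1][1], 3))
```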

  17. Modeling urban flood risk territories for Riga city

    Science.gov (United States)

    Piliksere, A.; Sennikovs, J.; Virbulis, J.; Bethers, U.; Bethers, P.; Valainis, A.

    2012-04-01

    Riga, the capital of Latvia, is located on River Daugava at the Gulf of Riga. The main flooding risks of Riga city are: (1) storm-caused water setup in the southern part of the Gulf of Riga (storm event), (2) water level increase caused by Daugava River discharge maxima (spring snow-melting event) and (3) strong rainfall or rapid snow melting in densely populated urban areas. The first two flooding factors were discussed previously (Piliksere et al, 2011). The aims of the study were (1) the identification of flood risk situations in densely populated areas, (2) the quantification of the flooding scenarios caused by rain and snow-melting events of different return periods nowadays, in the near future (2021-2050) and in the far future (2071-2100), taking into account the projections of climate change, (3) the estimation of the groundwater level for Riga city, (4) the building and calibration of a hydrological mathematical model based on SWMM (EPA, 2004) for the domain potentially vulnerable to rain and snow-melt flooding events, (5) the calculation of rain and snow-melting flood events with different return periods, and (6) the mapping of the potentially flooded areas on a fine grid. The time series of short-term precipitation events during the warm period of the year (i.e., rain events) were analyzed for a 35-year period. Annual maxima of precipitation intensity for events of different durations (5 min; 15 min; 1 h; 3 h; 6 h; 12 h; 1 day; 2 days; 4 days; 10 days) were calculated. The time series of long-term simultaneous precipitation data and observations of the reduction of snow-cover thickness were analyzed for a 27-year period. Snow-thawing periods were detected and maxima of snow-melting intensity for events of different durations (1 day; 2 days; 4 days; 7 days; 10 days) were calculated. According to the occurrence probability, six scenarios for each event for nowadays, the near and the far future, with return periods of once in 5, 10, 20, 50, 100 and 200 years, were constructed based on
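
Return-period scenarios of the kind listed here are commonly derived by fitting an extreme-value distribution to the annual maxima; the sketch below does this with a GEV fit in SciPy. The synthetic 35-year record and the parameter choices are assumptions, not the Riga data.

```python
import numpy as np
from scipy import stats

# Illustrative annual maxima of 1 h precipitation intensity (mm/h);
# a real analysis would use the 35-year record for each duration.
rng = np.random.default_rng(3)
annual_max = stats.genextreme.rvs(c=-0.1, loc=20, scale=5,
                                  size=35, random_state=rng)

# Fit a GEV distribution to the annual maxima.
c, loc, scale = stats.genextreme.fit(annual_max)

# Return level for return period T: the (1 - 1/T) quantile of the GEV.
for T in (5, 10, 20, 50, 100, 200):
    level = stats.genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>4}-yr event: {level:6.1f} mm/h")
```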

  18. Competing risks of cancer mortality and cardiovascular events in individuals with multimorbidity

    Directory of Open Access Journals (Sweden)

    Elizabeth A. Bayliss

    2014-08-01

    Full Text Available Background: Cancer patients with cardiovascular and other comorbidities are at concurrent risk of multiple adverse outcomes. However, most treatment decisions are guided by evidence from single-outcome models, which may be misleading for multimorbid patients. Objective: We assessed the interacting effects of cancer, cardiovascular, and other morbidity burdens on the competing outcomes of cancer mortality, serious cardiovascular events, and other-cause mortality. Design: We analyzed a cohort of 6,500 adults with initial cancer diagnosis between 2001 and 2008, SEER 5-year survival ≥26%, and a range of cardiovascular risk factors. We estimated the cumulative incidence of cancer mortality, a serious cardiovascular event (myocardial infarction, coronary revascularization, or cardiovascular mortality), and other-cause mortality over 5 years, and identified factors associated with the competing risks of each outcome using cause-specific Cox proportional hazard models. Results: Following cancer diagnosis, there were 996 (15.3%) cancer deaths, 328 (5.1%) serious cardiovascular events, and 542 (8.3%) deaths from other causes. In all, 4,634 (71.3%) cohort members had none of these outcomes. Although cancer prognosis had the greatest effect, cardiovascular and other morbidity also independently increased the hazard of each outcome. The effect of cancer prognosis on outcome was greatest in year 1, and the effect of other morbidity was greater in individuals with better cancer prognoses. Conclusion: In multimorbid oncology populations, comorbidities interact to affect the competing risk of different outcomes. Quantifying these risks may provide persons with cancer plus cardiovascular and other comorbidities more accurate information for shared decision-making than risks calculated from single-outcome models.

  19. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface...... for predicting the covariate specific absolute risks, their confidence intervals, and their confidence bands based on right censored time to event data. We provide explicit formulas for our implementation of the estimator of the (stratified) baseline hazard function in the presence of tied event times. As a by...... functionals. The software presented here is implemented in the riskRegression package....

  20. Adverse life events as risk factors for behavioural and emotional problems in a 7-year follow-up of a population-based child cohort

    DEFF Research Database (Denmark)

    Rasmussen, Cathrine Skovmand; Nielsen, Louise Gramstrup; Petersen, Dorthe Janne

    2014-01-01

    Background and aim: The aim of the study was to identify risk factors for significant changes in emotional and behavioural problem load in a community-based cohort of Danish children aged 9-16 years, the risk factors being seven parental and two child-related adverse life events. Methods: Data on emotional and behavioural problems was obtained from parents filling in the Child Behavior Checklist (CBCL) when the child was 8-9 and again when 15 years old. Data on risk factors was drawn from Danish registers. Analysis used was logistic regression for crude and adjusted change. Results: Parental divorce significantly raised the odds ratio of an increase in emotional and behavioural problems; furthermore, the risk of deterioration in problem behaviour rose significantly with increasing number of adverse life events. By dividing the children into four groups based on the pathway in problem load (increasers...

  1. Integrating Urban Infrastructure and Health System Impact Modeling for Disasters and Mass-Casualty Events

    Science.gov (United States)

    Balbus, J. M.; Kirsch, T.; Mitrani-Reiser, J.

    2017-12-01

    Over recent decades, natural disasters and mass-casualty events in the United States have repeatedly revealed the serious consequences of health care facility vulnerability and the subsequent ability to deliver care for the affected people. Advances in predictive modeling and vulnerability assessment for health care facility failure, integrated infrastructure, and extreme weather events have now enabled a more rigorous scientific approach to evaluating health care system vulnerability and assessing impacts of natural and human disasters as well as the value of specific interventions. Concurrent advances in computing capacity also allow, for the first time, full integration of these multiple individual models, along with the modeling of population behaviors and mass-casualty responses during a disaster. A team of federal and academic investigators led by the National Center for Disaster Medicine and Public Health (NCDMPH) is developing a platform for integrating extreme event forecasts, health risk/impact assessment and population simulations, critical infrastructure (electrical, water, transportation, communication) impact and response models, health care facility-specific vulnerability and failure assessments, and health system/patient flow responses. The integration of these models is intended to develop much greater understanding of critical tipping points in the vulnerability of health systems during natural and human disasters and build an evidence base for specific interventions. Development of such a modeling platform will greatly facilitate the assessment of potential concurrent or sequential catastrophic events, such as a terrorism act following a severe heat wave or hurricane. This presentation will highlight the development of this modeling platform as well as applications not just for the US health system, but also for international science-based disaster risk reduction efforts, such as the Sendai Framework and the WHO SMART hospital project.

  2. Major life events and risk of Parkinson's disease

    DEFF Research Database (Denmark)

    Rod, Naja Hulvej; Hansen, Johnni; Schernhammer, Eva

    2010-01-01

    major life events are risk factors for Parkinson's disease. Between 1986 and 2006, we identified 13,695 patients with a primary diagnosis of Parkinson's disease (PD) in the Danish National Hospital Register. Each case was frequency-matched by age and gender to five population controls. Information on major life events... before onset of PD was ascertained from national registries. Among men, the number of life events was associated with risk of Parkinson's disease in an inverse dose-response manner (P ... 0.34-0.99). Life events were not associated with PD in women. In contrast, a higher risk of PD was observed among women who had never been married (1.16; 1.04-1.29) and among men (1.47; 1.18-1.82) and women (1.30; 1.05-1.61) who had never been employees. The lower risk of Parkinson's disease among men who had...

  3. Climate Change and Hydrological Extreme Events - Risks and Perspectives for Water Management in Bavaria and Québec

    Science.gov (United States)

    Ludwig, R.

    2017-12-01

    There is as yet no confirmed knowledge of whether and how climate change contributes to the magnitude and frequency of hydrological extreme events and how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High-performance computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high-resolution (12 km) data from the CRCM5 RCM for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to account for the assessment of both natural climate variability and climate change. The large ensemble, with several thousand model years, provides the potential to capture rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g., WaSiM) for all major catchments in Bavaria and Southern Québec at high temporal (3 h) and spatial (500 m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. The specific data situation makes it possible to establish a new method of 'virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data which reveal preferential triggers of hydrological extreme events. The presentation will highlight first results from the analysis of the large-scale ClimEx model ensemble, showing the

  4. Measuring the coupled risks: A copula-based CVaR model

    Science.gov (United States)

    He, Xubiao; Gong, Pu

    2009-01-01

    Integrated risk management for financial institutions requires an approach for aggregating risk types (such as market and credit) whose distributional shapes vary considerably. Financial institutions often ignore the coupling between risks and thus underestimate their overall financial risk. We constructed a copula-based Conditional Value-at-Risk (CVaR) model for market and credit risks. This technique allows us to incorporate realistic marginal distributions that capture essential empirical features of these risks, such as skewness and fat tails, while allowing for a rich dependence structure. Finally, a numerical simulation method is used to implement the model. Our results indicate that the coupled risks for a listed company's stock may be undervalued if credit risk is ignored, especially for listed companies with bad credit quality.
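
A minimal sketch of the copula-plus-CVaR construction: a Gaussian copula (the abstract does not specify the copula family, so this is a simplifying assumption) couples a fat-tailed market marginal with a skewed credit marginal, and the simulated aggregate loss yields VaR and CVaR. Comparing against an independence assumption illustrates the undervaluation effect the authors report; all parameters are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n = 200_000

# Gaussian copula with correlation rho couples market and credit losses.
rho = 0.5
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = stats.norm.cdf(z)                          # coupled uniform marginals

# Realistic marginals: Student-t market loss (fat tails),
# lognormal credit loss (skewness); illustrative parameter choices.
market = stats.t.ppf(u[:, 0], df=4)
credit = stats.lognorm.ppf(u[:, 1], s=0.8)

total = market + credit                        # aggregate loss
alpha = 0.99
var = np.quantile(total, alpha)                # Value-at-Risk
cvar = total[total >= var].mean()              # CVaR / expected shortfall

# Compare with the independence assumption (coupling ignored).
indep = market + rng.permutation(credit)
print(f"coupled  VaR99={var:.2f}  CVaR99={cvar:.2f}")
print(f"indep.   CVaR99={indep[indep >= np.quantile(indep, alpha)].mean():.2f}")
```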

  5. Increased risk of arterial thromboembolic events after Staphylococcus aureus bacteremia

    DEFF Research Database (Denmark)

    Mejer, N; Gotland, N; Uhre, M L

    2015-01-01

    OBJECTIVES: An association between infection and arterial thromboembolic events (ATE) has been suggested. Here we examined the risk of myocardial infarction (MI), stroke and other ATE after Staphylococcus aureus bacteremia (SAB). METHODS: Danish register-based nation-wide observational cohort study...

  6. Formal safety assessment based on relative risks model in ship navigation

    Energy Technology Data Exchange (ETDEWEB)

    Hu Shenping [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: sphu@mmc.shmtu.edu.cn; Fang Quangen [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: qgfang@mmc.shmtu.edu.cn; Xia Haibo [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: hbxia@mmc.shmtu.edu.cn; Xi Yongtao [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: xiyt@mmc.shmtu.edu.cn

    2007-03-15

    Formal safety assessment (FSA) is a structured and systematic methodology aiming at enhancing maritime safety. It is now widely used in the shipping industry around the world. On the basis of an analysis of the FSA approach, this paper discusses quantitative risk assessment and the generic risk model in FSA, especially frequency and severity criteria in ship navigation. It then puts forward a new model based on relative risk assessment (MRRA). The model presents a risk-assessment approach based on fuzzy functions and takes five factors into account, including detailed information about accident characteristics. It has already been used for the assessment of pilotage safety in Shanghai harbor, China. Consequently, MRRA proves to be a useful method for solving practical problems in the risk assessment of ship navigation safety.

  7. Formal safety assessment based on relative risks model in ship navigation

    International Nuclear Information System (INIS)

    Hu Shenping; Fang Quangen; Xia Haibo; Xi Yongtao

    2007-01-01

    Formal safety assessment (FSA) is a structured and systematic methodology aiming at enhancing maritime safety. It is now widely used in the shipping industry around the world. On the basis of an analysis of the FSA approach, this paper discusses quantitative risk assessment and the generic risk model in FSA, especially frequency and severity criteria in ship navigation. It then puts forward a new model based on relative risk assessment (MRRA). The model presents a risk-assessment approach based on fuzzy functions and takes five factors into account, including detailed information about accident characteristics. It has already been used for the assessment of pilotage safety in Shanghai harbor, China. Consequently, MRRA proves to be a useful method for solving practical problems in the risk assessment of ship navigation safety.

  8. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    OpenAIRE

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female), performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory perfor...

  9. 20. Prediction of 10-year risk of hard coronary events among Saudi adults based on prevalence of heart disease risk factors

    Directory of Open Access Journals (Sweden)

    Muhammad Adil Soofi

    2015-10-01

    Conclusions: Our study is the first to estimate the 10-year risk of HCE among adults in an emerging country; it found that a significant proportion of the younger population is at risk of developing hard coronary events. Public awareness programs to control risk factors are warranted.

  10. Ambulatory blood pressure monitoring and development of cardiovascular events in high-risk patients included in the Spanish ABPM registry: the CARDIORISC Event study.

    Science.gov (United States)

    de la Sierra, Alejandro; Banegas, José R; Segura, Julián; Gorostidi, Manuel; Ruilope, Luis M

    2012-04-01

    Ambulatory blood pressure monitoring (ABPM) is superior to conventional BP measurement in predicting outcome, with baseline 24-h, daytime and night-time absolute values, as well as relative nocturnal decline, as powerful determinants of prognosis. We aimed to evaluate ABPM estimates on the appearance of cardiovascular events and mortality in a cohort of high-risk treated hypertensive patients. A total of 2115 treated hypertensive patients with high or very high added risk were evaluated by means of office and 24-h ABPM. Cardiovascular events and mortality were assessed after a median follow-up of 4 years. Two hundred and sixty-eight patients (12.7%) experienced a primary event (nonfatal coronary or cerebrovascular event, heart failure hospitalization or cardiovascular death) and 114 died (45 from cardiovascular causes). In a multiple Cox regression model, and after adjusting for baseline cardiovascular risk and office BP, night-time SBP predicted cardiovascular events [hazard ratio for each SD increase: 1.45; 95% confidence interval (CI) 1.29-1.59]. Values above 130 mmHg increased the risk by 52% in comparison to values less than 115 mmHg. In addition to clinical determinants of cardiovascular risk and conventional BP, ABPM performed during treatment adds prognostic significance on the development of cardiovascular events in high-risk hypertensive patients. Among different ABPM-derived values, night-time SBP is the most potent predictor of outcome.

  11. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    Science.gov (United States)

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study amounts to an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
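
The first two household decision models can be contrasted on a single stylized choice: invest in a loss-reducing measure or not. The sketch below compares expected utility (CRRA) with a prospect-theory valuation using standard Tversky-Kahneman parameters; all monetary figures and probabilities are illustrative assumptions, and the Bayesian-updating variant is omitted for brevity.

```python
# Household choice: invest in flood-proofing (cost, damage reduced by 80%)
# or do nothing. Illustrative numbers, not the paper's calibration.
p_flood, damage, cost, wealth = 0.01, 100_000, 5_000, 200_000

def expected_utility(invest, gamma=1.5):
    """CRRA expected utility of end-of-period wealth."""
    u = lambda w: w ** (1 - gamma) / (1 - gamma)
    w0 = wealth - (cost if invest else 0.0)
    loss = damage * (0.2 if invest else 1.0)
    return p_flood * u(w0 - loss) + (1 - p_flood) * u(w0)

def pt_value(invest, lam=2.25, a=0.88, delta=0.69):
    """Prospect-theory valuation relative to current wealth (losses only):
    loss aversion (lam), diminishing sensitivity (a), prob. weighting."""
    base = cost if invest else 0.0                    # certain outlay
    flood_loss = base + damage * (0.2 if invest else 1.0)
    v = lambda x: -lam * abs(x) ** a                  # loss value function
    w = lambda p: p**delta / (p**delta + (1 - p)**delta) ** (1 / delta)
    return w(p_flood) * v(flood_loss) + (1 - w(p_flood)) * v(base)

for name, f in [("EUT", expected_utility), ("PT", pt_value)]:
    choice = "invest" if f(True) > f(False) else "do nothing"
    print(f"{name}: household would {choice}")
```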

  12. Adverse life events increase risk for postpartum psychiatric episodes: A population-based epidemiologic study.

    Science.gov (United States)

    Meltzer-Brody, S; Larsen, J T; Petersen, L; Guintivano, J; Florio, A Di; Miller, W C; Sullivan, P F; Munk-Olsen, T

    2018-02-01

    Trauma histories may increase risk of perinatal psychiatric episodes. We designed an epidemiological population-based cohort study to explore if adverse childhood experiences (ACE) in girls increases risk of later postpartum psychiatric episodes. Using Danish registers, we identified women born in Denmark between January 1980 and December 1998 (129,439 childbirths). Exposure variables were ACE between ages 0 and 15 including: (1) family disruption, (2) parental somatic illness, (3) parental labor market exclusion, (4) parental criminality, (5) parental death, (6) placement in out-of-home care, (7) parental psychopathology excluding substance use, and (8) parental substance use disorder. Primary outcome was first occurrence of in- or outpatient contact 0-6 months postpartum at a psychiatric treatment facility with any psychiatric diagnoses, ICD-10, F00-F99 (N = 651). We conducted survival analyses using Cox proportional hazard regressions of postpartum psychiatric episodes. Approximately 52% of the sample experienced ACE, significantly increasing risk of any postpartum psychiatric diagnosis. Highest risks were observed among women who experienced out-of-home placement, hazard ratio (HR) 2.57 (95% CI: 1.90-3.48). Women experiencing two adverse life events had higher risks of postpartum psychiatric diagnosis, HR: 1.88 (95% CI: 1.51-2.36), compared to those with one ACE, HR: 1.24 (95% CI: 1.03-1.49), and no ACE, HR: 1.00 (reference group). ACE primarily due to parental psychopathology and disability contributes to increased risk of postpartum psychiatric episodes; and greater numbers of ACE increase risk for postpartum psychiatric illness with an observed dose-response effect. Future work should explore genetic and environmental factors that increase risk and/or confer resilience. © 2017 Wiley Periodicals, Inc.

  13. Blended Risk Approach in Applying PSA Models to Risk-Based Regulations

    International Nuclear Information System (INIS)

    Dimitrijevic, V. B.; Chapman, J. R.

    1996-01-01

    In this paper, the authors will discuss a modern approach in applying PSA models in risk-based regulation. The Blended Risk Approach is a combination of traditional and probabilistic processes. It is receiving increased attention in different industries in the U. S. and abroad. The use of the deterministic regulations and standards provides a proven and well understood basis on which to assess and communicate the impact of change to plant design and operation. Incorporation of traditional values into risk evaluation is working very well in the blended approach. This approach is very application specific. It includes multiple risk attributes, qualitative risk analysis, and basic deterministic principles. In blending deterministic and probabilistic principles, this approach ensures that the objectives of the traditional defense-in-depth concept are not compromised and the design basis of the plant is explicitly considered. (author)

  14. Generalization of the event-based Carnevale-Hines integration scheme for integrate-and-fire models

    NARCIS (Netherlands)

    van Elburg, R.A.J.; van Ooyen, A.

    2009-01-01

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on
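
The core of an event-based scheme of this kind is that the coupled membrane and synaptic equations are advanced by their closed-form solution between events rather than on a fixed time grid. The sketch below shows the exact subthreshold update for an exponentially decaying excitatory current only; unlike the Carnevale-Hines scheme it checks the threshold only at event times (a simplification), and all parameters are illustrative.

```python
import math

# Exact state update of a leaky integrate-and-fire neuron with an
# exponentially decaying excitatory synaptic current, advanced only at
# incoming spike events (no fixed time step). Illustrative parameters.
TAU_M, TAU_S = 20.0, 5.0            # membrane / synaptic time constants (ms)
E_L, V_TH, V_RESET = -70.0, -54.0, -70.0
R = 10.0                            # membrane resistance (MOhm)

def advance(v, i_syn, dt):
    """Closed-form solution of tau_m dV/dt = -(V - E_L) + R*I, I' = -I/tau_s."""
    em, es = math.exp(-dt / TAU_M), math.exp(-dt / TAU_S)
    coef = R * i_syn * TAU_S / (TAU_S - TAU_M)   # requires tau_s != tau_m
    return E_L + (v - E_L) * em + coef * (es - em), i_syn * es

v, i_syn, t = E_L, 0.0, 0.0
events = [(2.0, 1.1), (4.0, 1.1), (5.0, 1.1), (30.0, 0.5)]  # (time ms, nA)
for t_ev, w in events:
    v, i_syn = advance(v, i_syn, t_ev - t)       # exact jump to next event
    i_syn += w                                   # instantaneous synaptic kick
    t = t_ev
    if v >= V_TH:                                # threshold check at events only
        print(f"spike at t={t:.1f} ms")
        v = V_RESET
print(f"final V={v:.2f} mV at t={t} ms")
```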

  15. Generalization of the Event-Based Carnevale-Hines Integration Scheme for Integrate-and-Fire Models

    NARCIS (Netherlands)

    van Elburg, Ronald A. J.; van Ooyen, Arjen

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on

  16. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones especially long-term work zones increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events--age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S)--in the event tree. Since the estimated value of probability for some intermediate event may have large uncertainty, the uncertainty can thus be characterized by a random variable. The consequence estimation model takes into account the combination effects of speed and emergency medical service response time (ERT) on the consequence of work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease of individual fatality risk and 44% reduction of individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction of individual fatality risk and 0.05% reduction of individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in the casualty risk mitigation. 2010 Elsevier Ltd. All rights reserved.
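
The structure of the proposed QRA model, a crash frequency propagated through an event tree and then summarized as individual and societal risk, can be miniaturized as follows. The branch probabilities, casualty counts, and reduced set of intermediate events are placeholders, not the paper's calibrated Michigan values.

```python
from itertools import product

# Minimal event-tree sketch: work-zone crash frequency times conditional
# branch probabilities over intermediate events gives scenario frequencies;
# frequency-consequence pairs then yield societal-risk summaries.
crash_freq = 12.0                                  # crashes per year (assumed)
branches = {                                       # conditional probs (assumed)
    "vehicle":  {"car": 0.8, "truck": 0.2},
    "light":    {"day": 0.6, "night": 0.4},
    "severity": {"PDO": 0.70, "injury": 0.25, "fatal": 0.05},
}
casualties = {"PDO": 0, "injury": 1, "fatal": 1}   # persons per scenario

scenarios = []
for combo in product(*(br.items() for br in branches.values())):
    prob = 1.0
    for _, p in combo:
        prob *= p                                  # multiply along the path
    severity = combo[-1][0]                        # outcome of the last event
    scenarios.append((crash_freq * prob, casualties[severity], severity))

societal_risk = sum(f * n for f, n, _ in scenarios)       # casualties / year
fatality_freq = sum(f for f, _, s in scenarios if s == "fatal")
print(f"societal risk: {societal_risk:.2f} casualties/yr")
print(f"fatal-scenario frequency: {fatality_freq:.2f} /yr")
```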

  17. Using a High-Resolution Ensemble Modeling Method to Inform Risk-Based Decision-Making at Taylor Park Dam, Colorado

    Science.gov (United States)

    Mueller, M.; Mahoney, K. M.; Holman, K. D.

    2015-12-01

    The Bureau of Reclamation (Reclamation) is responsible for the safety of Taylor Park Dam, located in central Colorado at an elevation of 9300 feet. A key aspect of dam safety is anticipating extreme precipitation, runoff, and the associated inflow of water to the reservoir within a probabilistic framework for risk analyses. The Cooperative Institute for Research in Environmental Sciences (CIRES) has partnered with Reclamation to improve understanding and estimation of precipitation in the western United States, including the Taylor Park watershed. A significant challenge is that Taylor Park Dam is located in a relatively data-sparse region, surrounded by mountains exceeding 12,000 feet. To better estimate heavy precipitation events in this basin, a high-resolution modeling approach is used. The Weather Research and Forecasting (WRF) model is employed to simulate events that have produced observed peaks in streamflow at the location of interest. Importantly, an ensemble of model simulations is run for each event so that uncertainty bounds (i.e., forecast error) can be provided and the model outputs used more effectively in Reclamation's risk assessment framework. Model estimates of precipitation (and the uncertainty thereof) are then used in rainfall-runoff models to determine the probability of inflows to the reservoir for use in Reclamation's dam safety risk analyses.

  18. Maximum likelihood estimation of semiparametric mixture component models for competing risks data.

    Science.gov (United States)

    Choi, Sangbum; Huang, Xuelin

    2014-09-01

    In the analysis of competing risks data, the cumulative incidence function is a useful quantity to characterize the crude risk of failure from a specific event type. In this article, we consider an efficient semiparametric analysis of mixture component models on cumulative incidence functions. Under the proposed mixture model, latency survival regressions given the event type are performed through a class of semiparametric models that encompasses the proportional hazards model and the proportional odds model, allowing for time-dependent covariates. The marginal proportions of the occurrences of cause-specific events are assessed by a multinomial logistic model. Our mixture modeling approach is advantageous in that it makes a joint estimation of model parameters associated with all competing risks under consideration, satisfying the constraint that the cumulative probability of failing from any cause adds up to one given any covariates. We develop a novel maximum likelihood scheme based on semiparametric regression analysis that facilitates efficient and reliable estimation. Statistical inferences can be conveniently made from the inverse of the observed information matrix. We establish the consistency and asymptotic normality of the proposed estimators. We validate small sample properties with simulations and demonstrate the methodology with a data set from a study of follicular lymphoma. © 2014, The International Biometric Society.

  19. Stimulating household flood risk mitigation investments through insurance and subsidies: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Botzen, Wouter; de Moel, Hans; Aerts, Jeroen

    2015-04-01

    In the period 1998-2009, floods triggered roughly 52 billion euro in insured economic losses, making floods the most costly natural hazard in Europe. Climate change and socio-economic trends are expected to further aggravate flood losses in many regions. Research shows that flood risk can be significantly reduced if households install protective measures, and that the implementation of such measures can be stimulated through flood insurance schemes and subsidies. However, the effectiveness of such incentives to stimulate implementation of loss-reducing measures greatly depends on the decision process of individuals and is hardly studied. In our study, we developed an Agent-Based Model that integrates flood damage models, insurance mechanisms, subsidies, and household behaviour models to assess the effectiveness of different economic tools in stimulating households to invest in loss-reducing measures. Since the effectiveness depends on the decision-making process of individuals, the study compares different household decision models ranging from standard economic models, to economic models for decision making under risk, to more complex decision models integrating economic models and risk perceptions, opinion dynamics, and the influence of flood experience. The results show the effectiveness of incentives to stimulate investment in loss-reducing measures for different household behavior types, while assuming climate change scenarios. It shows how complex decision models can better reproduce observed real-world behaviour compared to traditional economic models. Furthermore, since flood events are included in the simulations, the results provide an analysis of the dynamics in insured and uninsured losses for households, the costs of reducing risk by implementing loss-reducing measures, the capacity of the insurance market, and the cost of government subsidies under different scenarios. The model has been applied to the City of Rotterdam in The Netherlands.

  20. An Integrated Risk Index Model Based on Hierarchical Fuzzy Logic for Underground Risk Assessment

    Directory of Open Access Journals (Sweden)

    Muhammad Fayaz

    2017-10-01

    Full Text Available Available space in congested cities is getting scarce due to growing urbanization in the recent past. The utilization of underground space is considered as a solution to the limited space in smart cities. The number of underground facilities is growing day by day in the developing world. Typical underground facilities include transit subways, parking lots, electric lines, and water supply and sewer lines. The occurrence of accidents due to underground facilities is a random phenomenon. To avoid accidental losses, a risk assessment method is required to conduct continuous risk assessment and report any abnormality before it happens. In this paper, we propose a hierarchical fuzzy inference-based model for underground risk assessment. The proposed hierarchical fuzzy inference architecture reduces the total number of rules in the rule base. Rule reduction is important because the curse of dimensionality damages transparency and interpretation, as it is very hard to understand and justify hundreds or thousands of fuzzy rules. Computation time also increases as the number of rules increases. The proposed model needs 175 rules, with eight input parameters, to compute the risk index, whereas conventional fuzzy logic would require 390,625 rules with the same number of input parameters. Hence, the proposed model significantly reduces the curse of dimensionality. Rule design for fuzzy logic is also a tedious task. In this paper, we also introduce new rule schemes, namely maximum rule-based and average rule-based; both schemes can be used interchangeably according to the logic needed for rule design. The experimental results show that the proposed method is a sound choice for risk index calculation when the number of input variables is large.
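
The reported rule counts follow directly from combinatorics: a flat rule base over eight inputs with five membership levels each needs 5^8 rules, while a hierarchy of small inference units needs only the sum of the units' rule tables. A quick check, assuming (plausibly, though not stated in this abstract) seven two-input units:

```python
# Rule-count comparison for flat vs. hierarchical fuzzy inference.
levels = 5
flat_rules = levels ** 8
print(flat_rules)                    # 390625, matching the reported figure

# Hypothetical hierarchy: four 2-input units feeding two 2-input units
# feeding one 2-input unit (7 units total, each with 5**2 = 25 rules).
hier_rules = 7 * levels ** 2
print(hier_rules)                    # 175, matching the reported count
```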

  1. Including model uncertainty in risk-informed decision making

    International Nuclear Information System (INIS)

    Reinert, Joshua M.; Apostolakis, George E.

    2006-01-01

    Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and are known to have significant model uncertainties. Because we work with basic event probabilities, this methodology is not appropriate for analyzing uncertainties that cause a structural change to the model, such as success criteria. We use the risk achievement worth (RAW) importance measure with respect to both the core damage frequency (CDF) and the change in core damage frequency (ΔCDF) to identify potentially important basic events. We cross-check these with generically important model uncertainties. Then, sensitivity analysis is performed on the basic event probabilities, which are used as a proxy for the model parameters, to determine how much error in these probabilities would need to be present in order to impact the decision. A previously submitted licensing basis change is used as a case study. Analysis using the SAPHIRE program identifies 20 basic events as important, four of which have model uncertainties that have been identified in the literature as generally important. The decision is fairly insensitive to uncertainties in these basic events. In three of these cases, one would need to show that model uncertainties would lead to basic event probabilities that would be between two and four orders of magnitude larger than modeled in the risk assessment before they would become important to the decision. More detailed analysis would be required to determine whether these higher probabilities are reasonable. Methods to perform this analysis from the literature are reviewed and an example is demonstrated using the case study.
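
RAW itself is straightforward to compute once a risk model is in hand: set one basic event probability to 1 and take the ratio of the resulting risk metric to the baseline. A toy fault-tree illustration with assumed probabilities and structure (not the SAPHIRE case-study model):

```python
# Risk Achievement Worth on a toy model: RAW_i is the ratio of the risk
# metric with basic event i assumed failed (p_i = 1) to the baseline.
# Illustrative system logic: core damage if (A and B) or C.
def cdf(p):
    pA, pB, pC = p
    return 1 - (1 - pA * pB) * (1 - pC)    # exact for independent events

base = {"A": 1e-2, "B": 3e-3, "C": 1e-4}   # assumed basic event probabilities
cdf0 = cdf(tuple(base.values()))

for name in base:
    p = dict(base)
    p[name] = 1.0                           # set this basic event to failed
    raw = cdf(tuple(p.values())) / cdf0
    print(f"RAW({name}) = {raw:10.1f}")
```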

  2. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.

  3. Various sizes of sliding event bursts in the plastic flow of metallic glasses based on a spatiotemporal dynamic model

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Jingli, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn; Chen, Cun [School of Mathematics and Statistics, Zhengzhou University, Zhengzhou 450001 (China); Wang, Gang, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn [Laboratory for Microstructures, Shanghai University, Shanghai 200444 (China); Cheung, Wing-Sum [Department of Mathematics, The University of HongKong, HongKong (China); Sun, Baoan; Mattern, Norbert [IFW-dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Siegmund, Stefan [Department of Mathematics, TU Dresden, D-01062 Dresden (Germany); Eckert, Jürgen [IFW-dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Institute of Materials Science, TU Dresden, D-01062 Dresden (Germany)

    2014-07-21

    This paper presents a spatiotemporal dynamic model based on the interaction between multiple shear bands in the plastic flow of metallic glasses during compressive deformation. Sliding events of various sizes burst forth in the plastic deformation as shear branches of different scales are generated; microscopic creep events and delocalized sliding events are analyzed on the basis of the established model. The paper discusses the spatially uniform solutions and the traveling-wave solution. The phase space of the spatially uniform system reflects the chaotic state of the system at a lower strain rate. Moreover, numerical simulation shows that the microscopic creep events are manifested at a lower strain rate, whereas the delocalized sliding events are manifested at a higher strain rate.

  4. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation: the value of value at risk; horizon problems and extreme events in financial risk management; and methods of evaluating value-at-risk estimates.

  5. Some implications of an event-based definition of exposure to the risk of road accident

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes a new definition of exposure to the risk of road accident as any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space, or by requiring a road user to take action to avoid leaving the road...

  6. Characteristics and Future Changes of Great Mississippi Flood Events in a Global Coupled Climate Model

    Science.gov (United States)

    van der Wiel, K.; Kapnick, S. B.; Vecchi, G.; Smith, J. A.

    2017-12-01

    The Mississippi-Missouri river catchment houses millions of people and much of the U.S. national agricultural production. Severe flooding events can therefore have large negative societal, natural and economic impacts. GFDL FLOR, a global coupled climate model (atmosphere, ocean, land, and sea ice, with an integrated river routing module), is used to investigate the characteristics of great Mississippi floods with an average return period of 100 years. Model experiments under pre-industrial greenhouse gas forcing were conducted for 3400 years, such that the most extreme flooding events were explicitly modeled and the land and/or atmospheric causes could be investigated. It is shown that melt of snow pack and frozen sub-surface water in the Missouri and Upper Mississippi basins primes the river system, subsequently sensitizing it to above-average precipitation in the Ohio and Tennessee basins. The months preceding the greatest flooding events are above average wet, leading to moist sub-surface conditions. Anomalous melt depends on the availability of frozen water in the catchment; therefore, anomalous amounts of sub-surface frozen water and an anomalously large snow pack in winter (Nov-Feb) make the river system susceptible to these great flooding events in spring (Feb-Apr). An additional experiment of 1200 years under transient greenhouse gas forcing (RCP4.5, 5 members) was done to investigate potential future changes in flood risk. Based on a peak-over-threshold method, it is found that the number of great flooding events decreases in a warmer future. This decrease coincides with a decreasing occurrence of large melt events, and occurs despite increasing numbers of large precipitation events. Though the model results indicate a decreasing risk for the greatest flooding events, the predictability of events might decrease in a warmer future given the changing characters of melt and precipitation.
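
A peak-over-threshold analysis of the kind mentioned typically fits a Generalized Pareto Distribution to flow exceedances and converts the fit into return levels. The sketch below uses synthetic annual peaks and an assumed 95th-percentile threshold; it is not the GFDL FLOR analysis, and all numbers are illustrative.

```python
import numpy as np
from scipy import stats

# Peak-over-threshold sketch: fit a Generalized Pareto Distribution to
# annual-peak discharge exceedances and estimate the 100-year flood.
rng = np.random.default_rng(5)
peaks = rng.gumbel(loc=30_000, scale=8_000, size=3400)  # synthetic, m^3/s

u = np.quantile(peaks, 0.95)                  # threshold choice (assumed)
exc = peaks[peaks > u] - u
c, loc, scale = stats.genpareto.fit(exc, floc=0.0)

rate = len(exc) / len(peaks)                  # exceedances per year
T = 100                                       # return period (years)
q100 = u + stats.genpareto.ppf(1 - 1 / (rate * T), c, loc=0.0, scale=scale)
print(f"estimated {T}-yr flood: {q100:,.0f} m^3/s")
```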

  7. Review of the severe accident risk reduction program (SARRP) containment event trees

    International Nuclear Information System (INIS)

    1986-05-01

    As part of the Severe Accident Risk Reduction Program, researchers at Sandia National Laboratories have constructed a group of containment event trees to be used in the analysis of key accident sequences for light water reactors (LWR) during postulated severe accidents. The ultimate goal of the program is to provide to the NRC staff a current assessment of the risk from severe reactor accidents for a group of five light water reactors. This review specifically focuses on the development and construction of the containment event trees and the results for containment failure probability, modes, and timing. The report first gives the background on the program, the review criteria, and a summary of the observations, findings, and recommendations. Secondly, the individual reviews of each committee member on the event trees are presented. Finally, a review is provided of the computer model used to construct and evaluate the event trees.

  8. Application of a Lifestyle-Based Tool to Estimate Premature Cardiovascular Disease Events in Young Adults: The Coronary Artery Risk Development in Young Adults (CARDIA) Study.

    Science.gov (United States)

    Gooding, Holly C; Ning, Hongyan; Gillman, Matthew W; Shay, Christina; Allen, Norrina; Goff, David C; Lloyd-Jones, Donald; Chiuve, Stephanie

    2017-09-01

    Few tools exist for assessing the risk for early atherosclerotic cardiovascular disease (ASCVD) events in young adults. To assess the performance of the Healthy Heart Score (HHS), a lifestyle-based tool that estimates ASCVD events in older adults, for ASCVD events occurring before 55 years of age. This prospective cohort study included 4893 US adults aged 18 to 30 years from the Coronary Artery Risk Development in Young Adults (CARDIA) study. Participants underwent measurement of lifestyle factors from March 25, 1985, through June 7, 1986, and were followed up for a median of 27.1 years (interquartile range, 26.9-27.2 years). Data for this study were analyzed from February 24 through December 12, 2016. The HHS includes age, smoking status, body mass index, alcohol intake, exercise, and a diet score composed of self-reported daily intake of cereal fiber, fruits and/or vegetables, nuts, sugar-sweetened beverages, and red and/or processed meats. The HHS in the CARDIA study was calculated using sex-specific equations produced by its derivation cohorts. The ability of the HHS to assess the 25-year risk for ASCVD (death from coronary heart disease, nonfatal myocardial infarction, and fatal or nonfatal ischemic stroke) in the total sample, in race- and sex-specific subgroups, and in those with and without clinical ASCVD risk factors at baseline. Model discrimination was assessed with the Harrell C statistic; model calibration, with Greenwood-Nam-D'Agostino statistics. The study population of 4893 participants included 2205 men (45.1%) and 2688 women (54.9%) with a mean (SD) age at baseline of 24.8 (3.6) years; 2483 (50.7%) were black; and 427 (8.7%) had at least 1 clinical ASCVD risk factor (hypertension, hyperlipidemia, or diabetes types 1 and 2). Among these participants, 64 premature ASCVD events occurred in women and 99 in men. The HHS showed moderate discrimination for ASCVD risk assessment in this diverse population of mostly healthy young adults (C statistic, 0

  9. A Semi Risk-Based Approach for Managing Urban Drainage Systems under Extreme Rainfall

    Directory of Open Access Journals (Sweden)

    Carlos Salinas-Rodriguez

    2018-03-01

    Full Text Available Conventional design standards for urban drainage systems are not set to deal with extreme rainfall events. As these events are becoming more frequent, there is room for proposing new planning approaches and standards that are flexible enough to cope with a wide range of rainfall events. In this paper, a semi risk-based approach is presented as a simple and practical way for the analysis and management of rainfall flooding at the precinct scale. This approach uses various rainfall events as input parameters for the analysis of the flood hazard and impacts, and categorises the flood risk in different levels, ranging from very low to very high risk. When visualised on a map, the insight into the risk levels across the precinct will enable engineers and spatial planners to identify and prioritise interventions to manage the flood risk. The approach is demonstrated for a sewer district in the city of Rotterdam, the Netherlands, using a one-dimensional (1D)/two-dimensional (2D) flood model. The risk level of this area is classified as being predominantly very low or low, with a couple of locations with high and very high risk. For these locations interventions, such as disconnection and lowering street profiles, have been proposed and analysed with the 1D/2D flood model. The interventions were shown to be effective in reducing the risk levels from very high/high risk to medium/low risk.

  10. Perioperative Respiratory Adverse Events in Pediatric Ambulatory Anesthesia: Development and Validation of a Risk Prediction Tool.

    Science.gov (United States)

    Subramanyam, Rajeev; Yeramaneni, Samrat; Hossain, Mohamed Monir; Anneken, Amy M; Varughese, Anna M

    2016-05-01

    Perioperative respiratory adverse events (PRAEs) are the most common cause of serious adverse events in children receiving anesthesia. The primary aim of this study was to develop and validate a risk prediction tool for the occurrence of PRAE from the onset of anesthesia induction until discharge from the postanesthesia care unit in children younger than 18 years undergoing elective ambulatory anesthesia for surgery and radiology. The incidence of PRAE was studied. We analyzed data from 19,059 patients from our department's quality improvement database. The predictor variables were age, sex, ASA physical status, morbid obesity, preexisting pulmonary disorder, preexisting neurologic disorder, and location of ambulatory anesthesia (surgery or radiology). Composite PRAE was defined as the presence of any 1 of the following events: intraoperative bronchospasm, intraoperative laryngospasm, postoperative apnea, postoperative laryngospasm, postoperative bronchospasm, or postoperative prolonged oxygen requirement. Development and validation of the risk prediction tool were performed with a split-sampling technique that divided the database into 2 independent cohorts, based on the year in which the patient received ambulatory anesthesia, with models fit by logistic regression. A risk score was developed from the regression coefficients. The performance of the risk prediction tool was assessed by using tests of discrimination and calibration. The overall incidence of composite PRAE was 2.8%. The derivation cohort included 8904 patients, and the validation cohort included 10,155 patients. The risk of PRAE was 3.9% in the development cohort and 1.8% in the validation cohort. Age ≤ 3 years (versus >3 years), ASA physical status II or III (versus ASA physical status I), morbid obesity, preexisting pulmonary disorder, and surgery (versus radiology) significantly predicted the occurrence of PRAE in a multivariable logistic regression
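
Deriving an integer risk score from regression coefficients, as described, commonly means scaling the coefficients by the smallest one and rounding. A sketch on synthetic data with hypothetical binary predictors loosely mirroring those named in the abstract; the coefficients and prevalences are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
n = 5000
# Hypothetical binary predictors: age<=3, ASA II/III, morbid obesity,
# pulmonary disorder, surgery (vs radiology). Prevalences are assumed.
X = rng.binomial(1, [0.3, 0.4, 0.05, 0.1, 0.8] * np.ones((n, 5)))
logit = -4.0 + X @ np.array([0.8, 0.5, 0.7, 0.9, 0.6])   # assumed truth
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression().fit(X, y)

# Integer risk score: scale each coefficient by the smallest and round,
# a common way to turn a regression model into a bedside tool.
points = np.round(model.coef_[0] / np.abs(model.coef_[0]).min()).astype(int)
print("points per predictor:", points)
print("example patient (age<=3 with pulmonary disorder):",
      points[0] + points[3], "points")
```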

  11. Intelligent judgements over health risks in a spatial agent-based model.

    Science.gov (United States)

    Abdulkareem, Shaheen A; Augustijn, Ellen-Wien; Mustafa, Yaseen T; Filatova, Tatiana

    2018-03-20

    Millions of people worldwide are exposed to deadly infectious diseases on a regular basis. Breaking news of the Zika outbreak, for instance, made it to the main media titles internationally. Perceiving disease risks motivates people to adapt their behavior toward a safer and more protective lifestyle. Computational science is instrumental in exploring patterns of disease spread emerging from many individual decisions and interactions among agents and their environment by means of agent-based models. Yet, current disease models rarely consider simulating dynamics in risk perception and its impact on adaptive protective behavior. Social sciences offer insights into individual risk perception and corresponding protective actions, while machine learning provides algorithms and methods to capture these learning processes. This article presents an innovative approach to extend agent-based disease models by capturing behavioral aspects of decision-making in a risky context using machine learning techniques. We illustrate it with a case of cholera in Kumasi, Ghana, accounting for spatial and social risk factors that affect intelligent behavior and corresponding disease incidents. The results of computational experiments comparing intelligent with zero-intelligent representations of agents in a spatial disease agent-based model are discussed. We present a spatial disease agent-based model (ABM) with agents' behavior grounded in Protection Motivation Theory. Spatial and temporal patterns of disease diffusion among zero-intelligent agents are compared to those produced by a population of intelligent agents. Two Bayesian Networks (BNs) are designed and coded using R and further integrated with the NetLogo-based Cholera ABM. The first is a one-tier BN1 (only risk perception), the second is a two-tier BN2 (risk and coping behavior). We run three experiments (zero-intelligent agents, BN1 intelligence and BN2 intelligence) and report the results per experiment in terms of
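
    The toy snippet below illustrates the idea of coupling a risk-perception table to agent decisions, using a hand-rolled conditional probability table rather than the study's R-coded BNs and NetLogo ABM; all probabilities are invented.

    # Sketch of a one-tier "risk perception" node conditioned on two binary
    # risk factors (spatial exposure, social influence), feeding a coping step.
    import numpy as np

    # P(perceive high risk | spatial, social) -- assumed values, not the paper's.
    cpt = {(0, 0): 0.05, (0, 1): 0.30, (1, 0): 0.40, (1, 1): 0.85}

    rng = np.random.default_rng(42)

    def agent_step(spatial, social):
        """One decision step: sample perception, then protective action."""
        perceives_risk = rng.random() < cpt[(spatial, social)]
        # Two-tier extension: protect only if risk is perceived (assumed prob.).
        protects = perceives_risk and rng.random() < 0.7
        return perceives_risk, protects

    # Simulate 1000 agents near a contaminated source with social contacts.
    outcomes = [agent_step(spatial=1, social=1) for _ in range(1000)]
    print("share perceiving risk:", np.mean([o[0] for o in outcomes]))
    print("share taking protection:", np.mean([o[1] for o in outcomes]))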

  12. Modeling and simulation of discrete event systems

    CERN Document Server

    Choi, Byoung Kyu

    2013-01-01

    Computer modeling and simulation (M&S) allows engineers to study and analyze complex systems. Discrete-event system (DES)-M&S is used in modern management, industrial engineering, computer science, and the military. As computer speeds and memory capacity increase, so DES-M&S tools become more powerful and more widely used in solving real-life problems. Based on over 20 years of evolution within a classroom environment, as well as on decades-long experience in developing simulation-based solutions for high-tech industries, Modeling and Simulation of Discrete-Event Systems is the only book on

  13. Program package for data preparation of RISK events

    International Nuclear Information System (INIS)

    Denes, E.; Wagner, I.; Nagy, J.

    1980-01-01

    A FORTRAN program package written for the CDC-6500 computer is presented. The SMHV program is designed to transform data obtained from events of the RISK streamer chamber, measured with the SAMET or PUOS devices, into the HEVAS data format needed by the geometrical reconstruction program. Such a transformation provides standardization of measurement data processing within the RISK collaboration and the capability of direct input into the program for geometrical reconstruction of events registered on film from the RISK streamer chamber.

  14. Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments

    Science.gov (United States)

    Manning, Ted A.; Lawrence, Scott L.

    2014-01-01

    As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.

  15. Hierarchical Context Modeling for Video Event Recognition.

    Science.gov (United States)

    Wang, Xiaoyang; Ji, Qiang

    2016-10-11

    Current video event recognition research remains largely target-centered. For real-world surveillance videos, target-centered event recognition faces great challenges due to large intra-class target variation, limited image resolution, and poor detection and tracking results. To mitigate these challenges, we introduced a context-augmented video event recognition approach. Specifically, we explicitly capture different types of contexts from three levels including image level, semantic level, and prior level. At the image level, we introduce two types of contextual features including the appearance context features and interaction context features to capture the appearance of context objects and their interactions with the target objects. At the semantic level, we propose a deep model based on deep Boltzmann machine to learn event object representations and their interactions. At the prior level, we utilize two types of prior-level contexts including scene priming and dynamic cueing. Finally, we introduce a hierarchical context model that systematically integrates the contextual information at different levels. Through the hierarchical context model, contexts at different levels jointly contribute to the event recognition. We evaluate the hierarchical context model for event recognition on benchmark surveillance video datasets. Results show that incorporating contexts in each level can improve event recognition performance, and jointly integrating three levels of contexts through our hierarchical model achieves the best performance.

  16. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    Science.gov (United States)

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD events risk (0.91, 0.85 to 0.97) per 0.1 mEq/L (P for nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P for nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD events risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
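
    For readers unfamiliar with the pooling step, the sketch below applies inverse-variance (fixed-effect) pooling of log relative risks and computes Cochran's Q and I² on invented study values; it does not use the data of this meta-analysis.

    # Sketch: inverse-variance pooling of study-level relative risks, with an
    # I^2 heterogeneity estimate. All study values below are made up.
    import numpy as np

    rr = np.array([0.80, 0.92, 0.75, 0.88, 1.01])        # hypothetical study RRs
    ci_upper = np.array([0.95, 1.10, 0.96, 1.02, 1.30])  # hypothetical upper CIs

    log_rr = np.log(rr)
    se = (np.log(ci_upper) - log_rr) / 1.96  # back out SE from the 95% CI
    w = 1.0 / se**2                           # inverse-variance weights

    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))

    # Cochran's Q and I^2 for heterogeneity.
    Q = np.sum(w * (log_rr - pooled) ** 2)
    df = len(rr) - 1
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print("pooled RR %.2f (95%% CI %.2f-%.2f), I^2 = %.0f%%"
          % (np.exp(pooled), np.exp(lo), np.exp(hi), I2))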

  17. Risk of cerebrovascular events in persons with and without HIV: A Danish nationwide population-based cohort study

    DEFF Research Database (Denmark)

    Rasmussen, Line D; Engsig, Frederik Neess; Christensen, Hanne

    2011-01-01

    OBJECTIVE: To assess the risk of cerebrovascular events (CVE) in HIV-infected individuals and evaluate the impact of proven risk factors, injection drug abuse (IDU), immunodeficiency, highly active antiretroviral therapy (HAART) and family-related risk factors. DESIGN: Nationwide, population...

  18. Risk-based systems analysis for emerging technologies: Applications of a technology risk assessment model to public decision making

    International Nuclear Information System (INIS)

    Quadrel, M.J.; Fowler, K.M.; Cameron, R.; Treat, R.J.; McCormack, W.D.; Cruse, J.

    1995-01-01

    The risk-based systems analysis model was designed to establish funding priorities among competing technologies for tank waste remediation. The model addresses a gap in the Department of Energy's (DOE's) "toolkit" for establishing funding priorities among emerging technologies by providing disciplined risk and cost assessments of candidate technologies within the context of a complete remediation system. The model comprises a risk and cost assessment and a decision interface. The former assesses the potential reductions in risk and cost offered by new technology relative to the baseline risk and cost of an entire system. The latter places this critical information in the context of other values articulated by decision makers and stakeholders in the DOE system. The risk assessment portion of the model is demonstrated for two candidate technologies for tank waste retrieval (arm-based mechanical retrieval, the "long reach arm") and subsurface barriers (close-coupled chemical barriers). Relative changes from the base case in cost and risk are presented for these two technologies to illustrate how the model works. The model and associated software build on previous work performed for DOE's Office of Technology Development and the former Underground Storage Tank Integrated Demonstration, and complement a decision making tool presented at Waste Management 1994 for integrating technical judgements and non-technical (stakeholder) values when making technology funding decisions.

  19. ClimEx - Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec

    Science.gov (United States)

    Ludwig, Ralf; Baese, Frank; Braun, Marco; Brietzke, Gilbert; Brissette, Francois; Frigon, Anne; Giguère, Michel; Komischke, Holger; Kranzlmueller, Dieter; Leduc, Martin; Martel, Jean-Luc; Ricard, Simon; Schmid, Josef; von Trentini, Fabian; Turcotte, Richard; Weismueller, Jens; Willkofer, Florian; Wood, Raul

    2017-04-01

    The recent accumulation of extreme hydrological events in Bavaria and Québec has stimulated scientific as well as societal interest. In addition to the challenges of an improved prediction of such situations and the implications for the associated risk management, there is, as yet, no confirmed knowledge of whether and how climate change contributes to the magnitude and frequency of hydrological extreme events and how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on the meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High Performance Computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high resolution data (12km) of the CRCM5 RCM for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to account for the assessment of both natural climate variability and climate change. The large ensemble with several thousand model years provides the potential to catch rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and Southern Québec in high temporal (3h) and spatial (500m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. This specific data situation makes it possible to establish a new method for 'virtual perfect prediction', which assesses climate change impacts

  20. Concordance for prognostic models with competing risks

    DEFF Research Database (Denmark)

    Wolbers, Marcel; Blanche, Paul; Koller, Michael T

    2014-01-01

    The concordance probability is a widely used measure to assess discrimination of prognostic models with binary and survival endpoints. We formally define the concordance probability for a prognostic model of the absolute risk of an event of interest in the presence of competing risks and relate i...
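
    As a reference point, the snippet below hand-rolls Harrell's concordance for right-censored data, the single-event measure that the paper generalises to the competing-risks setting; the data are synthetic.

    # Sketch: Harrell's concordance for censored time-to-event data (the
    # single-event baseline; the competing-risks definition extends this).
    import numpy as np

    def concordance(time, event, risk):
        """Fraction of usable pairs where the higher-risk subject fails first."""
        n_conc, n_tied, n_pairs = 0, 0, 0
        for i in range(len(time)):
            if not event[i]:
                continue  # subject i must have an observed event
            for j in range(len(time)):
                if time[j] > time[i]:  # j is still at risk when i fails
                    n_pairs += 1
                    if risk[i] > risk[j]:
                        n_conc += 1
                    elif risk[i] == risk[j]:
                        n_tied += 1
        return (n_conc + 0.5 * n_tied) / n_pairs

    rng = np.random.default_rng(3)
    risk = rng.normal(size=200)
    time = rng.exponential(np.exp(-risk))   # higher risk -> earlier event
    event = rng.random(200) < 0.7           # ~30% censoring
    print("C-index: %.2f" % concordance(time, event, risk))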

  1. Stochastic modeling of central apnea events in preterm infants.

    Science.gov (United States)

    Clark, Matthew T; Delos, John B; Lake, Douglas E; Lee, Hoshik; Fairchild, Karen D; Kattwinkel, John; Moorman, J Randall

    2016-04-01

    A near-ubiquitous pathology in very low birth weight infants is neonatal apnea, breathing pauses with slowing of the heart and falling blood oxygen. Events of substantial duration occasionally occur after an infant is discharged from the neonatal intensive care unit (NICU). It is not known whether apneas result from a predictable process or from a stochastic process, but the observation that they occur in seemingly random clusters justifies the use of stochastic models. We use a hidden-Markov model to analyze the distribution of durations of apneas and the distribution of times between apneas. The model suggests the presence of four breathing states, ranging from very stable (with an average lifetime of 12 h) to very unstable (with an average lifetime of 10 s). Although the states themselves are not visible, the mathematical analysis gives estimates of the transition rates among these states. We have obtained these transition rates, and shown how they change with post-menstrual age; as expected, the residence time in the more stable breathing states increases with age. We also extrapolated the model to predict the frequency of very prolonged apnea during the first year of life. This paradigm, stochastic modeling of cardiorespiratory control in neonatal infants to estimate risk for severe clinical events, may be a first step toward personalized risk assessment for life threatening apnea events after NICU discharge.
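
    To make the four-state picture concrete, the sketch below simulates a continuous-time Markov chain with assumed mean state lifetimes spanning the 12-hour-to-10-second range quoted above; the jump structure and rates are illustrative, not the fitted model.

    # Sketch: simulate a 4-state continuous-time Markov chain of breathing
    # stability and report the fraction of time spent in each state.
    import numpy as np

    rng = np.random.default_rng(7)
    lifetimes = np.array([12 * 3600.0, 600.0, 60.0, 10.0])  # seconds, stable -> unstable
    P = np.array([[0.0, 1.0, 0.0, 0.0],    # assumed jump probabilities
                  [0.5, 0.0, 0.5, 0.0],
                  [0.0, 0.5, 0.0, 0.5],
                  [0.0, 0.0, 1.0, 0.0]])

    def simulate(t_max):
        t, s, visits = 0.0, 0, []
        while t < t_max:
            dwell = rng.exponential(lifetimes[s])  # exponential residence time
            visits.append((s, dwell))
            t += dwell
            s = rng.choice(4, p=P[s])
        return visits

    visits = simulate(t_max=7 * 24 * 3600)  # one simulated week
    time_in = np.zeros(4)
    for s, d in visits:
        time_in[s] += d
    print("fraction of time per state:", np.round(time_in / time_in.sum(), 3))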

  2. Web-Based versus High-Fidelity Simulation Training for Certified Registered Nurse Anesthetists in the Management of High Risk/Low Occurrence Anesthesia Events

    Science.gov (United States)

    Kimemia, Judy

    2017-01-01

    Purpose: The purpose of this project was to compare web-based to high-fidelity simulation training in the management of high risk/low occurrence anesthesia related events, to enhance knowledge acquisition for Certified Registered Nurse Anesthetists (CRNAs). This project was designed to answer the question: Is web-based training as effective as…

  3. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    Science.gov (United States)

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the pumped quality upon transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations to an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics that can vary between stakeholders. In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept, mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.
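
    The sketch below illustrates the mass-discharge-based aggregation over stochastically occurring spills in a Monte Carlo loop; hazard rates, attenuation factors and the quality threshold are all invented, and the real STORM framework additionally propagates flow-and-transport model uncertainty, which is omitted here.

    # Sketch: aggregate mass discharge at the well over stochastic spills from
    # several hazards, and estimate the annual exceedance probability.
    import numpy as np

    rng = np.random.default_rng(11)
    hazards = [  # (annual spill probability, mean spill mass, attenuation)
        (0.10, 50.0, 0.02),   # e.g. fuel station (invented)
        (0.30, 10.0, 0.05),   # e.g. farmland application (invented)
        (0.02, 200.0, 0.01),  # e.g. industrial site (invented)
    ]
    threshold = 1.0  # permissible mass discharge at the well (arbitrary units)

    n_sim = 100_000
    exceed = 0
    for _ in range(n_sim):
        discharge = 0.0
        for p_spill, mean_mass, atten in hazards:
            if rng.random() < p_spill:  # does this hazard spill this year?
                discharge += rng.exponential(mean_mass) * atten
        exceed += discharge > threshold
    print("annual exceedance probability: %.4f" % (exceed / n_sim))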

  4. Addressing dependability by applying an approach for model-based risk assessment

    International Nuclear Information System (INIS)

    Gran, Bjorn Axel; Fredriksen, Rune; Thunem, Atoosa P.-J.

    2007-01-01

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of a trust case development.

  5. Addressing dependability by applying an approach for model-based risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

    2007-11-15

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of a trust case development.

  6. One-year adherence to warfarin treatment for venous thromboembolism in high-risk patients and its association with long-term risk of recurrent events.

    Science.gov (United States)

    Chen, Shih-Yin; Wu, Ning; Gulseth, Michael; LaMori, Joyce; Bookhart, Brahim K; Boulanger, Luke; Fields, Larry; Schein, Jeff

    2013-05-01

    Warfarin is the predominant oral anticoagulant used for the prevention of recurrent venous thromboembolism (VTE) events. However, its long-term use is complicated by the need to manage the drug within a narrow therapeutic range and by possible food and drug interactions. To examine the association between 1-year adherence, measured through compliance with and persistence on warfarin treatment for VTE, and long-term risk of recurrent events among patients at high risk. Medical and pharmacy claims for patients with commercial or Medicare supplemental insurance in the Thomson Reuters MarketScan database were analyzed. Adult patients with medical claims with an associated VTE diagnosis between January 1, 2006, and March 31, 2008, were identified. The index date was defined as the date of the first observed VTE claim or the date of discharge if the index event was a hospital stay. High-risk patients (patients with cancer, or noncancer patients who did not have reversible risk factors during the 3-month period prior to the index date) who filled a warfarin prescription within 2 weeks of the index date were included. Persistence was evaluated in terms of discontinuation, defined as a 90-day gap in warfarin supply during a 1-year assessment period following the index date. Compliance was measured by the proportion of days covered (PDC) over the 1-year assessment period, with PDC less than 0.8 defined as noncompliance. Recurrent VTE events were identified as hospitalizations where VTE was the primary diagnosis after the 1-year assessment period and until patients were lost to follow-up. The association between adherence to warfarin therapy and VTE recurrence was evaluated descriptively via Kaplan-Meier curves and a Cox proportional hazards model, adjusted for patient demographic and clinical characteristics. A similar analysis using the medication possession ratio (MPR) as a measure of compliance was also performed in a subset of patients who had filled at least 2 warfarin
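
    To illustrate the two adherence measures used above, the snippet below computes the proportion of days covered (PDC) and checks for a 90-day gap over a one-year window from a toy list of fill dates; real claims processing must additionally handle overlapping fills and stockpiling.

    # Sketch: persistence (90-day gap) and compliance (PDC < 0.8) from fill
    # dates and days supplied; the fill history is invented.
    from datetime import date, timedelta

    fills = [(date(2007, 1, 5), 30), (date(2007, 2, 6), 30),
             (date(2007, 3, 10), 30), (date(2007, 7, 1), 30)]  # (fill date, days)
    index_date, window = date(2007, 1, 5), 365

    covered = set()
    for start, days in fills:
        for k in range(days):
            d = start + timedelta(days=k)
            if (d - index_date).days < window:
                covered.add(d)
    pdc = len(covered) / window

    # Discontinuation: any 90-day run with no supply during the window.
    gaps, last_end = [], index_date
    for start, days in fills:           # fills assumed sorted by date
        gaps.append((start - last_end).days)
        last_end = max(last_end, start + timedelta(days=days))
    gaps.append((index_date + timedelta(days=window) - last_end).days)

    print("PDC = %.2f -> %s" % (pdc, "compliant" if pdc >= 0.8 else "noncompliant"))
    print("discontinued:", any(g >= 90 for g in gaps))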

  7. Weather based risks and insurances for crop production in Belgium

    Science.gov (United States)

    Gobin, Anne

    2014-05-01

    Extreme weather events such as late frosts, droughts, heat waves and rain storms can have devastating effects on cropping systems. Damages due to extreme events are strongly dependent on crop type, crop stage, soil type and soil conditions. The perspective of rising risk-exposure is exacerbated further by limited aid received for agricultural damage, an overall reduction of direct income support to farmers and projected intensification of weather extremes with climate change. According to both the agriculture and finance sectors, a risk assessment of extreme weather events and their impact on cropping systems is needed. The impact of extreme weather events, particularly during the sensitive periods of the farming calendar, requires a modelling approach to capture the mixture of non-linear interactions between the crop, its environment and the occurrence of the meteorological event. The risk of soil moisture deficit increases towards harvesting, such that drought stress occurs in spring and summer. Conversely, waterlogging occurs mostly during early spring and autumn. Risks of temperature stress appear during winter and spring for chilling and during summer for heat. Since crop development is driven by thermal time and photoperiod, the regional crop model REGCROP (Gobin, 2010) made it possible to examine the likely frequency, magnitude and impacts of frost, drought, heat stress and waterlogging in relation to the cropping season and crop-sensitive stages. The risk profiles were subsequently confronted with yields, yield losses and insurance claims for different crops. Physically based crop models such as REGCROP assist in understanding the links between different factors causing crop damage as demonstrated for cropping systems in Belgium. Extreme weather events have already precipitated contraction of insurance coverage in some markets (e.g. hail insurance), and the process can be expected to continue if the losses or damages from such events increase in the future. Climate
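
    Since crop development in models like REGCROP is driven by thermal time, the following small sketch accumulates growing degree days from daily temperature extremes; the base temperature is crop-specific and assumed here.

    # Sketch: thermal-time (growing degree day) accumulation.
    def gdd(t_min, t_max, t_base=5.0):
        """Degree days for one day: mean temperature above an assumed base."""
        return max(0.0, (t_min + t_max) / 2.0 - t_base)

    daily = [(3, 11), (5, 14), (7, 16), (2, 9)]  # (Tmin, Tmax) in deg C
    total = sum(gdd(tmin, tmax) for tmin, tmax in daily)
    print("accumulated thermal time: %.1f GDD" % total)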

  8. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and the use of the PSAEA guidelines (PSA-based Event Analysis). In 2000, risk-based precursor analysis has increasingly become a part of the AVN process of feedback of operating experience, and constitutes in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions made it possible to identify, and in some cases also to quantify, several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. During 2000, PSAEA has increasingly become a part of the AVN process of feedback of operating experience. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events) and resulted in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding for instance event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  9. Risk-oriented approach application at planning and organizing antiepidemic provision of mass events

    Directory of Open Access Journals (Sweden)

    D.V. Efremenko

    2017-03-01

    Full Text Available Mass events tend to become increasingly dangerous for population health, as they give rise to various health risks, including risks of infectious pathologies. Our research goal was to work out scientifically grounded approaches to assessing and managing epidemiologic risks, and to analyze how they were applied during preparation for and conduct of the 2014 Olympics, as well as other mass events which took place in 2014–2016. We assessed the risks of epidemiologic complications with the use of diagnostic test-systems and a new technique that allows for the peculiarities of mass events. The technique is based on ranking infections into 3 potential danger categories according to defined criteria representing quantitative and qualitative predictive parameters (predictors). Application of the risk-oriented approach and multi-factor analysis allowed us to determine the maximum possible requirements for providing sanitary-epidemiologic welfare for each separate nosologic form. Enhancing our laboratory base with test-systems for specific indication, in line with the accomplished calculations, enabled us, on the one hand, to secure the required preparations and, on the other hand, to avoid unnecessary expenditures. To facilitate the decision-making process during the 2014 Olympics we used an innovative product, namely a computer program based on a geoinformation system (GIS). It helped us to simplify and accelerate information exchange within the frameworks of intra- and interdepartmental interaction. A "dynamic epidemiologic threshold" was calculated daily for measles, chickenpox, acute enteric infections and acute respiratory viral infections of various etiology. If it was exceeded, or if an "epidemiologic spot" became possible for one or several nosologies, an automatic warning appeared in the GIS. Planning prevention activities regarding feral herd infections and zoogenous extremely dangerous infections which were endemic
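
    A toy version of the daily "dynamic epidemiologic threshold" check might look as follows; the baseline rule (mean plus two standard deviations over the previous 14 days) is an assumption, since the abstract does not give the actual formula.

    # Sketch: warn when today's case count exceeds a moving baseline.
    import statistics

    def check_threshold(history, today):
        mean = statistics.mean(history)
        sd = statistics.stdev(history)
        threshold = mean + 2 * sd  # assumed baseline rule
        return today > threshold, threshold

    cases = [4, 6, 5, 7, 5, 6, 8, 5, 4, 6, 7, 5, 6, 5]  # last 14 days
    alert, thr = check_threshold(cases, today=13)
    print("threshold %.1f -> %s" % (thr, "WARNING" if alert else "ok"))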

  10. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  11. Disruptive event uncertainties in a perturbation approach to nuclear waste repository risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, T.F.

    1980-09-01

    A methodology is developed for incorporating a full range of the principal forecasting uncertainties into a risk analysis of a nuclear waste repository. The result of this methodology is a set of risk curves similar to those used by Rasmussen in WASH-1400. The set of curves is partially derived from a perturbation approach to analyze potential disruptive event sequences. Such a scheme could be useful in truncating the number of disruptive event scenarios and providing guidance to those establishing data-base development priorities.

  12. Cardiovascular event-free survival after adjuvant radiation therapy in breast cancer patients stratified by cardiovascular risk

    International Nuclear Information System (INIS)

    Onwudiwe, Nneka C; Kwok, Young; Onukwugha, Eberechukwu; Sorkin, John D; Zuckerman, Ilene H; Shaya, Fadia T; Daniel Mullins, C

    2014-01-01

    The objective of this study was to estimate the risk of a cardiovascular event or death associated with modern radiation in a population of elderly female breast cancer patients with varying baseline cardiovascular risk. The data used for this analysis are from the linked Surveillance, Epidemiology, and End-Results (SEER)-Medicare database. The retrospective cohort study included women aged 66 years and older with stage 0–III breast cancer diagnosed between 2000 and 2005. Women were grouped as low, intermediate, or high cardiovascular risk based on the presence of certain clinical diagnoses. The risk for the combined outcome of a hospitalization for a cardiovascular event or death within 6 months and 24 months of diagnosis was estimated using a multivariable Cox model. The median follow-up time was 24 months. Among the 91,612 women with American Joint Committee on Cancer (AJCC) stage 0–III breast cancer: 39,555 (43.2%) were treated with radiation therapy and 52,057 (56.8%) were not. The receipt of radiation therapy in the first 6 months was associated with a statistically significant increased risk for the combined outcome in women categorized as high risk (HR = 1.510; 95% CI, 1.396–1.634) or intermediate risk (HR = 1.415; 95% CI, 1.188–1.686) but not low risk (HR = 1.027; 95% CI, 0.798–1.321). Women with a prior medical history of cardiovascular disease treated with radiation therapy are at increased risk for an event and should be monitored for at least 6 months following treatment with radiation therapy
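
    A minimal sketch of the kind of multivariable Cox model described, using the lifelines library on synthetic data with an assumed radiation-by-risk-group interaction; the covariates, effect sizes and follow-up window are invented, not the SEER-Medicare analysis.

    # Sketch: Cox model for time to a combined cardiovascular/death outcome.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(5)
    n = 5000
    df = pd.DataFrame({
        "radiation": rng.integers(0, 2, n),
        "high_cv_risk": rng.integers(0, 2, n),
        "age": rng.normal(75, 6, n),
    })
    # Assumed hazard: radiation is harmful mainly in the high-risk group.
    lp = (0.1 * df.radiation + 0.8 * df.high_cv_risk
          + 0.4 * df.radiation * df.high_cv_risk + 0.03 * (df.age - 75))
    df["interaction"] = df.radiation * df.high_cv_risk
    df["T"] = rng.exponential(24 * np.exp(-lp))   # months to event
    df["E"] = (df["T"] < 24).astype(int)          # observed within follow-up
    df.loc[df.E == 0, "T"] = 24.0                 # administrative censoring

    cph = CoxPHFitter()
    cph.fit(df[["radiation", "high_cv_risk", "interaction", "age", "T", "E"]],
            duration_col="T", event_col="E")
    cph.print_summary()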

  13. Markov chains and semi-Markov models in time-to-event analysis.

    Science.gov (United States)

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
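
    The snippet below shows the core mechanic: a discrete-time transition matrix with transient states, a competing outcome and an absorbing death state, iterated to give state-occupancy probabilities over time; the states and probabilities are invented.

    # Sketch: a 4-state discrete-time Markov chain for time-to-event analysis.
    import numpy as np

    states = ["healthy", "impaired", "event", "death"]  # illustrative labels
    P = np.array([  # one-step transition probabilities (assumed)
        [0.90, 0.06, 0.02, 0.02],
        [0.05, 0.80, 0.08, 0.07],
        [0.00, 0.00, 0.90, 0.10],
        [0.00, 0.00, 0.00, 1.00],
    ])

    dist = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts healthy
    for step in range(1, 11):
        dist = dist @ P                     # propagate one time step
        print("t=%2d " % step, dict(zip(states, np.round(dist, 3))))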

  14. Risk Assessment of Engineering Project Financing Based on PPP Model

    Directory of Open Access Journals (Sweden)

    Ma Qiuli

    2017-01-01

    Full Text Available At present, project financing channels are limited and urban facilities are in short supply, so the risk assessment and prevention mechanisms for financing need further improvement to reduce project financing risk. In view of this, a fuzzy comprehensive evaluation model of project financing risk is established, combining the method of fuzzy comprehensive evaluation with the analytic hierarchy process. The soundness and effectiveness of the model are verified with the example of the world port project in Luohe city, and it provides a basis and reference for engineering project financing based on the PPP mode.
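
    A minimal sketch of the combined method: AHP weights taken from the principal eigenvector of a pairwise-comparison matrix, composed with a fuzzy evaluation matrix. The judgments and membership degrees below are invented, not those of the Luohe case.

    # Sketch: AHP weighting plus fuzzy comprehensive evaluation.
    import numpy as np

    # Pairwise comparison of 3 risk criteria (e.g. political, financial, build).
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    vals, vecs = np.linalg.eig(A)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    w = w / w.sum()  # AHP criterion weights from the principal eigenvector

    # Fuzzy membership of each criterion over ratings (low, medium, high risk).
    R = np.array([[0.2, 0.5, 0.3],
                  [0.5, 0.4, 0.1],
                  [0.6, 0.3, 0.1]])
    B = w @ R        # composite membership vector
    print("weights:", np.round(w, 3))
    print("fuzzy result (low/med/high):", np.round(B / B.sum(), 3))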

  15. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning from the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for the modelling and control of complex robotic systems. The system is structured, controlled and analysed using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementing the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models can be simplified to run on general-purpose computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete-event systems are a pragmatic tool for modelling industrial systems. To capture timing, a timed Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulating the proposed robotic system with timed Petri nets offers the opportunity to observe the timing behaviour of the robots; using the goods-transport and transmission times obtained by spot measurements, graphics are obtained that show the average time per transport activity for the parameter sets of finished products.

  16. Can discrete event simulation be of use in modelling major depression?

    Science.gov (United States)

    Le Lay, Agathe; Despiegel, Nicolas; François, Clément; Duru, Gérard

    2006-12-05

    Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression as well as its economic implications as closely as possible. In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors), our aim was to clarify to what extent "Discrete Event Simulation" (DES) models provide methodological benefits in depicting disease evolution. We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. The major drawback to Markov models is that they may not be suitable to tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states which would render the analysis entirely too complex. We show that DES resolve these weaknesses and that its flexibility allow patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients attitude towards treatment, together with any disease-related events (adverse events, suicide attempt etc.). DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful compared with Markov processes.
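
    A sketch of what such a patient-level DES might look like in Python with SimPy: recurrent depressive episodes whose inter-episode interval shrinks with disease history, one of the dependencies the authors argue Markov cohort models handle poorly. All rates are invented.

    # Sketch: recurrent-episode DES with history-dependent event times.
    import random
    import simpy

    def patient(env, pid, log):
        episodes = 0
        while env.now < 520:  # follow 10 years, in weeks
            # Assumed: mean time to next episode shrinks with each prior episode.
            yield env.timeout(random.expovariate(1.0 / max(10, 104 - 20 * episodes)))
            episodes += 1
            duration = random.expovariate(1.0 / 12)  # episode length, weeks
            yield env.timeout(duration)
            log.append((pid, env.now, episodes, duration))

    random.seed(1)
    env, log = simpy.Environment(), []
    for pid in range(1000):
        env.process(patient(env, pid, log))
    env.run(until=520)
    print("episodes per patient: %.2f" % (len(log) / 1000))

    Because each simulated patient carries its own attributes and history, risk factors such as age, sex and treatment attitude can modify the event rates directly, without multiplying health states as a Markov model would require.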

  17. An assessment of the risk significance of human errors in selected PSAs and operating events

    International Nuclear Information System (INIS)

    Palla, R.L. Jr.; El-Bassioni, A.

    1991-01-01

    Sensitivity studies based on Probabilistic Safety Assessments (PSAs) for a pressurized water reactor and a boiling water reactor are described. In each case human errors modeled in the PSAs were categorized according to such factors as error type, location, timing, and plant personnel involved. Sensitivity studies were then conducted by varying the error rates in each category and evaluating the corresponding change in total core damage frequency and accident sequence frequency. Insights obtained are discussed and reasons for differences in risk sensitivity between plants are explored. A separate investigation into the role of human error in risk-important operating events is also described. This investigation involved the analysis of data from the USNRC Accident Sequence Precursor program to determine the effect of operator-initiated events on accident precursor trends, and to determine whether improved training can be correlated to current trends. The findings of this study are also presented. 5 refs., 15 figs., 1 tab

  18. Proton pump inhibitor monotherapy and the risk of cardiovascular events in patients with gastro-esophageal reflux disease: a meta-analysis.

    Science.gov (United States)

    Sun, S; Cui, Z; Zhou, M; Li, R; Li, H; Zhang, S; Ba, Y; Cheng, G

    2017-02-01

    Proton pump inhibitors (PPIs) are commonly used as potent gastric acid secretion antagonists for gastro-esophageal disorders and their overall safety in patients with gastro-esophageal reflux disease (GERD) is considered to be good and they are well-tolerated. However, recent studies have suggested that PPIs may be a potential independent risk factor for cardiovascular adverse events. The aim of our meta-analysis was to examine the association between PPI monotherapy and cardiovascular events in patients with GERD. A literature search involved examination of relevant databases up to July 2015 including PubMed, Cochrane Library, EMBASE, and ClinicalTrials.gov, as well as selected randomized controlled trials (RCTs) reporting cardiovascular events with PPI exposure in GERD patients. In addition, the pooled risk ratio (RR) and heterogeneity were assessed based on a fixed effects model of the meta-analysis and the I² statistic, respectively. Seventeen RCTs covering 7540 patients were selected. The pooled data suggested that the use of PPIs was associated with a 70% increased cardiovascular risk (RR=1.70, 95% CI: [1.13-2.56], P=.01, I²=0%). Furthermore, higher risks of adverse cardiovascular events in the omeprazole subgroup (RR=3.17, 95% CI: [1.43-7.03], P=.004, I²=25%) and long-term treatment subgroup (RR=2.33, 95% CI: [1.33-4.08], P=.003, I²=0%) were found. PPI monotherapy can be a risk factor for cardiovascular adverse events. Omeprazole could significantly increase the risk of cardiovascular events and, so, should be used carefully. © 2016 John Wiley & Sons Ltd.

  19. N reactor external events probabilistic risk assessment

    International Nuclear Information System (INIS)

    Baxter, J.T.

    1989-01-01

    An external events probabilistic risk assessment of the N Reactor has been completed. The methods used are those currently being proposed for external events analysis in NUREG-1150. Results are presented for the external hazards that survived preliminary screening. They are earthquake, fire, and external flood. Core damage frequencies for these hazards are shown to be comparable to those for commercial pressurized water reactors. Dominant fire sequences are described and related to 10 CFR 50, Appendix R design requirements. Potential remedial measures that reduce fire core damage risk are described including modifications to fire protection systems, procedure changes, and addition of new administrative controls. Dominant seismic sequences are described. The effect of non-safety support system dependencies on seismic risk is presented

  20. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    International Nuclear Information System (INIS)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described

  1. Risk-based performance indicators

    International Nuclear Information System (INIS)

    Azarm, M.A.; Boccio, J.L.; Vesely, W.E.; Lofgren, E.

    1987-01-01

    The purpose of risk-based indicators is to monitor plant safety. Safety is measured by monitoring the potential for core melt (core-melt frequency) and the public risk. Targets for these measures can be set consistent with NRC safety goals. In this process, the performance of safety systems, support systems, major components, and initiating events can be monitored using measures such as unavailability, failure or occurrence frequency. The changes in performance measures and their trends are determined from the time behavior of monitored measures by differentiating between stochastic and actual variations. Therefore, degradation, as well as improvement in the plant safety performance, can be determined. The development of risk-based performance indicators will also provide the means to trace a change in the safety measures to specific problem areas which are amenable to root cause analysis and inspection audits. In addition, systematic methods will be developed to identify specific improvement policies using the plant information system for the identified problem areas. The final product of the performance indicator project will be a methodology, and an integrated and validated set of software packages which, if properly interfaced with the logic model software of a plant, can monitor the plant performance as plant information is provided as input

  2. TEMAC, Top Event Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

    1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem - Maxima of: 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement
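
    The snippet below shows, on an invented three-cut-set example, the kind of operation TEMAC performs on a top event: quantifying it with the rare-event approximation and ranking basic events by a one-at-a-time sensitivity. It is not TEMAC's actual matrix algorithm.

    # Sketch: top-event quantification from minimal cut sets (rare-event
    # approximation) with a simple sensitivity ranking; all data invented.
    import math

    cut_sets = [("pumpA", "pumpB"), ("valveC",), ("pumpA", "diesel", "valveC")]
    p = {"pumpA": 1e-2, "pumpB": 2e-2, "valveC": 1e-3, "diesel": 5e-2}

    def top_event(prob):
        # Rare-event approximation: sum over minimal cut-set probabilities.
        return sum(math.prod(prob[e] for e in cs) for cs in cut_sets)

    base = top_event(p)
    print("top event frequency ~ %.2e" % base)
    # Sensitivity: effect of a 10x improvement in each basic event.
    for e in p:
        trial = dict(p, **{e: p[e] / 10})
        print("%-7s -> %.2e (%.0f%% of base)"
              % (e, top_event(trial), 100 * top_event(trial) / base))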

  3. Towards renewed health economic simulation of type 2 diabetes: risk equations for first and second cardiovascular events from Swedish register data.

    Directory of Open Access Journals (Sweden)

    Aliasghar Ahmad Kiadaliri

    Full Text Available OBJECTIVE: Predicting the risk of future events is an essential part of health economic simulation models. In pursuit of this goal, the current study aims to predict the risk of developing first and second acute myocardial infarction, heart failure, non-acute ischaemic heart disease, and stroke after diagnosis in patients with type 2 diabetes, using data from the Swedish National Diabetes Register. MATERIAL AND METHODS: Register data on 29,034 patients with type 2 diabetes were analysed over five years of follow up (baseline 2003). To develop and validate the risk equations, the sample was randomly divided into training (75%) and test (25%) subsamples. The Weibull proportional hazard model was used to estimate the coefficients of the risk equations, and these were validated in both the training and the test samples. RESULTS: In total, 4,547 first and 2,418 second events were observed during the five years of follow up. Experiencing a first event substantially elevated the risk of subsequent events. There were heterogeneities in the effects of covariates within as well as between events; for example, while for females the hazard ratio of having a first acute myocardial infarction was 0.79 (0.70-0.90), the hazard ratio of a second was 1.21 (0.98-1.48). The hazards of second events decreased as the time since first events elapsed. The equations showed adequate calibration and discrimination (C statistics range: 0.70-0.84) in test samples. CONCLUSION: The accuracy of health economic simulation models of type 2 diabetes can be improved by ensuring that they account for the heterogeneous effects of covariates on the risk of first and second cardiovascular events. Thus it is important to extend such models by including risk equations for second cardiovascular events.
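
    To illustrate the form of such risk equations, the sketch below evaluates a Weibull proportional-hazards five-year risk with a higher baseline hazard after a first event; the sex coefficients echo the hazard ratios quoted above (0.79 first, 1.21 second), while the baseline parameters and HbA1c effect are invented.

    # Sketch: Weibull proportional-hazards risk equations for first and
    # second events; coefficients and baselines are illustrative only.
    import math

    def five_year_risk(x_beta, lam=0.01, shape=1.3, t=5.0):
        """1 - S(t) under Weibull PH: H(t) = lam * t^shape * exp(x_beta)."""
        return 1.0 - math.exp(-lam * t**shape * math.exp(x_beta))

    beta = {"female_first": -0.24, "female_second": 0.19, "hba1c": 0.12}
    lp_first = beta["female_first"] + beta["hba1c"] * (8.0 - 7.0)
    lp_second = beta["female_second"] + beta["hba1c"] * (8.0 - 7.0)

    print("5-yr first-event risk:  %.1f%%" % (100 * five_year_risk(lp_first)))
    print("5-yr second-event risk: %.1f%%" % (100 * five_year_risk(lp_second, lam=0.05)))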

  4. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    Science.gov (United States)

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  5. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    Science.gov (United States)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
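
    The sketch below computes a few of the listed metrics, RMSE, prediction efficiency, probability of detection, probability of false detection and the Heidke Skill Score, from a toy model-observation pair; it stands in for the CAMEL library's metric suite rather than reproducing it.

    # Sketch: model-data comparison metrics from a time series and the binary
    # contingency table of threshold exceedances; values are synthetic.
    import numpy as np

    obs = np.array([1.0, 3.0, 2.5, 4.0, 3.5, 1.5])
    mod = np.array([1.2, 2.4, 2.8, 3.6, 4.1, 1.1])

    rmse = np.sqrt(np.mean((mod - obs) ** 2))
    pe = 1.0 - np.sum((mod - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    o, m = obs >= 2.5, mod >= 2.5            # binary events: exceedance
    hits = np.sum(o & m)
    misses = np.sum(o & ~m)
    false = np.sum(~o & m)
    corr_neg = np.sum(~o & ~m)

    pod = hits / (hits + misses)
    pofd = false / (false + corr_neg)
    n = hits + misses + false + corr_neg
    exp_correct = ((hits + misses) * (hits + false)
                   + (corr_neg + misses) * (corr_neg + false)) / n
    hss = (hits + corr_neg - exp_correct) / (n - exp_correct)
    print("RMSE %.2f  PE %.2f  POD %.2f  POFD %.2f  HSS %.2f"
          % (rmse, pe, pod, pofd, hss))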

  6. Large-scale model-based assessment of deer-vehicle collision risk.

    Directory of Open Access Journals (Sweden)

    Torsten Hothorn

    Full Text Available Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer-vehicle collisions and as browsers of palatable trees have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high efforts and costs associated with attempts to estimate population sizes of free-living ungulates living in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on >74,000 deer-vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer-vehicle collisions and to investigate the relationship between deer-vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer-vehicle collisions, which allows nonlinear environment-deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types on the scale of Bavarian municipalities. Based on this risk model, we propose a new "deer-vehicle collision index" for deer management. We show that the risk of deer-vehicle collisions is positively correlated to browsing intensity and to harvest numbers. Overall, our results demonstrate that the number of deer-vehicle collisions can be predicted with high precision on the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer-vehicle collisions provides a cost-efficient instrument for deer management on the landscape scale. The measures derived from our model provide valuable information for planning road protection and defining
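
    A minimal stand-in for the collision-count modelling idea: a Poisson regression of collisions per municipality with road length as an exposure offset, fitted on synthetic data. The paper's actual approach is more flexible (nonlinear effects, spatial heterogeneity), and the covariates here are assumptions.

    # Sketch: Poisson count model for deer-vehicle collisions per municipality.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(8)
    n = 500                              # municipalities
    road_km = rng.uniform(5, 80, n)      # exposure: road length
    forest = rng.uniform(0, 1, n)        # share of forest cover (assumed)
    browsing = rng.uniform(0, 1, n)      # browsing intensity (assumed)
    mu = road_km * np.exp(-3.0 + 1.2 * forest + 0.8 * browsing)
    collisions = rng.poisson(mu)

    X = sm.add_constant(np.column_stack([forest, browsing]))
    fit = sm.GLM(collisions, X, family=sm.families.Poisson(),
                 offset=np.log(road_km)).fit()
    print(fit.params)  # positive browsing coefficient mirrors the finding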

  7. Simple probabilistic method for relative risk evaluation of nuclear terrorism events

    International Nuclear Information System (INIS)

    Zhang Songbai; Wu Jun

    2006-01-01

    On the basis of event-tree and probability analysis methods, a probabilistic method for nuclear terrorism risk was built, and the risk of terrorism events was analyzed. Using statistical data and hypothetical data for the relevant events, the relative probabilities of the four kinds of nuclear terrorism events were obtained, and the relative risks of these four kinds of nuclear terrorism events were calculated with this probabilistic method. The illustrated case shows that the descending sequence of damages from the four kinds of nuclear terrorism events for a single event is as follows: nuclear explosive and improvised nuclear explosive, attack on a nuclear facility, and 'dirty bomb'. Under the hypothetical conditions, the descending sequence of possibilities for the four kinds of nuclear terrorism events is: 'dirty bomb', attack on a nuclear facility, improvised nuclear explosive and nuclear explosive, but the descending sequence of risks is: 'dirty bomb', improvised nuclear explosive, attack on a nuclear facility, and nuclear explosive. (authors)
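
    The relative-risk logic of the abstract reduces to probability times damage per scenario; the snippet below ranks four scenarios that way, with numbers chosen only so that the ordering matches the abstract's illustrative result.

    # Sketch: relative risk = relative probability x relative damage per event.
    scenarios = {  # type: (relative probability, relative damage per event)
        "nuclear explosive":            (0.001, 1000.0),
        "improvised nuclear explosive": (0.005, 800.0),
        "attack on nuclear facility":   (0.030, 100.0),
        "dirty bomb":                   (0.500, 10.0),
    }

    ranked = sorted(scenarios.items(),
                    key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
    for name, (prob, damage) in ranked:
        print("%-30s risk = %.2f" % (name, prob * damage))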

  8. The impact of climate change on catastrophe risk models : implications for catastrophe risk markets in developing countries

    OpenAIRE

    Seo, John; Mahul, Olivier

    2009-01-01

    Catastrophe risk models allow insurers, reinsurers and governments to assess the risk of loss from catastrophic events, such as hurricanes. These models rely on computer technology and the latest earth and meteorological science information to generate thousands if not millions of simulated events. Recently observed hurricane activity, particularly in the 2004 and 2005 hurricane seasons, i...

  9. AN ANALYSIS OF RISK EVENTS IN THE OIL-TANKER MAINTENANCE BUSINESS

    Directory of Open Access Journals (Sweden)

    Roque Rabechini Junior

    2012-12-01

    Full Text Available This work presents the results of an investigation into risk events and their respective causes, carried out in ship maintenance undertakings in the logistical sector of the Brazilian oil industry. Its theoretical and conceptual positioning lies in the aspects of risk management that support decision making by executives in the tanker-maintenance business. The case-study method was used with a qualitative approach of an exploratory nature, and a descriptive format was chosen for the presentation of data. Through the analysis of 75 risk events in tanker-docking projects it was possible to extract the eight of greatest relevance. The risk analysis facilitated the identification of actions aimed at their mitigation. In conclusion, it was possible to propose a risk-framework model in four categories, HSE (health, safety and the environment), technical, externalities and management, designed to provide tanker-docking business executives and administrators with evidence of actions to assist in their decision-making processes. Finally, the authors identified proposals for further study as well as showing the principal limitations of the study.

  10. Risk-based safety indicators

    International Nuclear Information System (INIS)

    Sedlak, J.

    2001-12-01

    The report is structured as follows: 1. Risk-based safety indicators: Typology of risk-based indicators (RBIs); Tools for defining RBIs; Requirements for the PSA model; Data sources for RBIs; Types of risks monitored; RBIs and operational safety indicators; Feedback from operating experience; PSO model modification for RBIs; RBI categorization; RBI assessment; RBI applications; Suitable RBI applications. 2. Proposal for risk-based indicators: Acquiring information from operational experience; Method of acquiring safety relevance coefficients for the systems from a PSA model; Indicator definitions; On-line indicators. 3. Annex: Application of RBIs worldwide. (P.A.)

  11. How to model mutually exclusive events based on independent causal pathways in Bayesian network models

    OpenAIRE

    Fenton, N.; Neil, M.; Lagnado, D.; Marsh, W.; Yet, B.; Constantinou, A.

    2016-01-01

    We show that existing Bayesian network (BN) modelling techniques cannot capture the correct intuitive reasoning in the important case when a set of mutually exclusive events need to be modelled as separate nodes instead of states of a single node. A previously proposed ‘solution’, which introduces a simple constraint node that enforces mutual exclusivity, fails to preserve the prior probabilities of the events, while other proposed solutions involve major changes to the original model. We pro...

  12. Discrete event dynamic system (DES)-based modeling for dynamic material flow in the pyroprocess

    International Nuclear Information System (INIS)

    Lee, Hyo Jik; Kim, Kiho; Kim, Ho Dong; Lee, Han Soo

    2011-01-01

    A modeling and simulation methodology was proposed in order to implement the dynamic material flow of the pyroprocess. Since the static mass balance provides the limited information on the material flow, it is hard to predict dynamic behavior according to event. Therefore, a discrete event system (DES)-based model named, PyroFlow, was developed at the Korea Atomic Energy Research Institute (KAERI). PyroFlow is able to calculate dynamic mass balance and also show various dynamic operational results in real time. By using PyroFlow, it is easy to rapidly predict unforeseeable results, such as throughput in unit process, accumulated product in buffer and operation status. As preliminary simulations, bottleneck analyses in the pyroprocess were carried out and consequently it was presented that operation strategy had influence on the productivity of the pyroprocess.

  13. Prediction of Major Vascular Events after Stroke

    DEFF Research Database (Denmark)

    Ovbiagele, Bruce; Goldstein, Larry B.; Amarenco, Pierre

    2014-01-01

    BACKGROUND: Identifying patients with recent stroke or transient ischemic attack (TIA) at high risk of major vascular events (MVEs; stroke, myocardial infarction, or vascular death) may help optimize the intensity of secondary preventive interventions. We evaluated the relationships between...... the baseline Framingham Coronary Risk Score (FCRS) and a novel risk prediction model and with the occurrence of MVEs after stroke or TIA in subjects enrolled in the Stroke Prevention by Aggressive Reduction in Cholesterol Level (SPARCL) trial. METHODS: Data from the 4731 subjects enrolled in the SPARCL study...... were analyzed. Hazard ratios (HRs) from Cox regression models were used to determine the risk of subsequent MVEs based on the FCRS predicting 20% or more 10-year coronary heart disease risk. The novel risk model was derived based on multivariable modeling with backward selection. Model discrimination...

  14. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    Full Text Available The objective of this research is to apply DBBE model to discover the constructs to measure the religious event as a business brand on the bases of devotees’ perception. SEM technique was applied to measure the hypothesized model of which CFA put to analyze the model and a theoretical model was made to measure the model fit. Sample size was of 500. The base of brand loyalty was affected directly by image and quality. This information might be beneficial to event management and sponsors in making brand and operating visitors’ destinations. More importantly, the brand of these religious events in Pakistan can be built as a strong tourism product.

  15. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-11-13

    that may include users, applications, operating systems, servers, hosts, routers, switches, control planes , and instrumentation planes , many of...which lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology...configuration errors in environment designs for several cyber range events. The rest of the paper is organized as follows. Section 2 provides an overview of

  16. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting...... of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  17. Serum uric acid level as a cardio-cerebrovascular event risk factor in middle-aged and non-obese Chinese men.

    Science.gov (United States)

    Li, Zhi-Jun; Yi, Chen-Ju; Li, Jing; Tang, Na

    2017-04-11

    The role of uric acid as a risk factor for cardio-cerebrovascular diseases is controversial. In this study, we aimed to investigate the relationship between serum uric acid level and the risk of cardio-cerebrovascular events in middle-aged and non-obese Chinese men. We included 3152 participants from the health examination center of Tongji Hospital from June 2007 to June 2010. Clinical examination and medical records were collected at the annual health examination. The hazard ratios (HRs) of uric acid for cardio-cerebrovascular events were calculated by Cox proportional hazards models. Generalized additive model and threshold effect analysis were used to explore the non-linear relationship between serum uric acid level and the incidence of cardio-cerebrovascular event. The mean follow-up time was 52 months. When the participants were classified into four groups by the serum acid quarter (Q1-Q4), the HRs (95% CI) of Q2-Q4 for cardio-cerebrovascular events were 1.26 (0.83, 1.92), 1.97 (1.33, 2.91) and 2.05 (1.40, 3.01), respectively, compared with the reference (Q1). The actual incidence and conditional incidence of cardio-cerebrovascular events in the high serum acid group were higher than those in the low serum acid group, which were stratified by the turning point (sUA = 372 μmol/L). We also showed a strong prognostic accuracy of the multiple variable-based score in 3 years and 5 years, with area under the receiver operating characteristic (ROC) curve of 0.790 (0.756-0.823) and 0.777 (0.749-0.804), respectively. Serum uric acid level is a strong risk factor for cardio-cerebrovascular events.

  18. Soil moisture prediction: bridging event and continuous runoff modelling

    NARCIS (Netherlands)

    Sheikh, V.

    2006-01-01

    The general objective of this study was to investigate the possibility of providing spatially distributed soil moisture data for event-based hydrological models close before a rainfall event. The study area is known as "Catsop", a small catchmment in south Limburg. The models used are: LISEM and

  19. Generation of risk importance information from severe accident PSA model

    International Nuclear Information System (INIS)

    Seo, Mi Ro; Kim, Hyeong Taek; Moon, Chan Kook

    2012-01-01

    One of the important objects conducting Probabilistic Safety Assessment (PSA) is the relative evaluation of importance of the component or function that is greatly affected to the plant safety. This evaluation is performed by the importance assessment methods such as Risk Reduction Worth, Risk Achievement Worth, and Fuss el Vessley method from the aspect of core damage frequency (CDF). In the Level 1 PSA model, the importance of each component can be evaluated since the CDF is calculated by the combination of the branch probability of event tree and the component failure probability in the fault tree. But, the Level 2 PSA model in order to assess the containment integrity cannot evaluate the risk importance by the above methods because the model is consisted of 3 parts, plant damage status, containment event tree, and source term category. So, in the field that the Level 2 PSA risk importance information should be reflected, such as maintenance rule program, risk importance has been determined by the subjective judgment of the model developer. This study was performed in order to generate the risk importance information more objectively and systematically in the Level 2 PSA model, focused on the containment event tree in the domain PHWR Level 2 PSA model

  20. External event analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1989-01-01

    The US Nuclear Regulatory Commission is sponsoring probabilistic risk assessments of six operating commercial nuclear power plants as part of a major update of the understanding of risk as provided by the original WASH-1400 risk assessments. In contrast to the WASH-1400 studies, at least two of the NUREG-1150 risk assessments will include an analysis of risks due to earthquakes, fires, floods, etc., which are collectively known as eternal events. This paper summarizes the methods to be used in the external event analysis for NUREG-1150 and the results obtained to date. The two plants for which external events are being considered are Surry and Peach Bottom, a PWR and BWR respectively. The external event analyses (through core damage frequency calculations) were completed in June 1989, with final documentation available in September. In contrast to most past external event analyses, wherein rudimentary systems models were developed reflecting each external event under consideration, the simplified NUREG-1150 analyses are based on the availability of the full internal event PRA systems models (event trees and fault trees) and make use of extensive computer-aided screening to reduce them to sequence cut sets important to each external event. This provides two major advantages in that consistency and scrutability with respect to the internal event analysis is achieved, and the full gamut of random and test/maintenance unavailabilities are automatically included, while only those probabilistically important survive the screening process. Thus, full benefit of the internal event analysis is obtained by performing the internal and external event analyses sequentially

  1. Serious adverse events and the risk of stroke in patients with rheumatoid arthritis: results from the German RABBIT cohort.

    Science.gov (United States)

    Meissner, Y; Richter, A; Manger, B; Tony, H P; Wilden, E; Listing, J; Zink, A; Strangfeld, A

    2017-09-01

    In the general population, the incidence of stroke is increased following other serious events and hospitalisation. We investigated the impact of serious adverse events on the risk of stroke in patients with rheumatoid arthritis (RA), taking risk factors and treatment into account. Using data of the German biologics register RABBIT (Rheumatoid Arthritis: Observation of Biologic Therapy) with 12354 patients with RA, incidence rates (IRs) and risk factors for stroke were investigated using multi-state and Cox proportional hazard models. In addition, in a nested case-control study, all patients with stroke were matched 1:2 to patients with identical baseline risk profile and analysed using a shared frailty model. During follow-up, 166 strokes were reported. The overall IR was 3.2/1000 patient-years (PY) (95% CI 2.7 to 3.7). It was higher after a serious adverse event (IR: 9.0 (7.3 to 11.0)), particularly within 30 days after the event (IR: 94.9 (72.6 to 121.9)). The adjusted Cox model showed increased risks of age per 5 years (HR: 1.4 (1.3 to 1.5)), hyperlipoproteinaemia (HR: 1.6 (1.0 to 2.5)) and smoking (HR: 1.9 (1.3 to 2.6)). The risk decreased with better physical function (HR: 0.9 (0.8 to 0.96)). In the case-control study, 163 patients were matched to 326 controls. Major risk factors for stroke were untreated cardiovascular disease (HR: 3.3 (1.5 to 7.2)) and serious infections (HR:4.4 (1.6 to 12.5)) or other serious adverse events (HR: 2.6 (1.4 to 4.8)). Incident adverse events, in particular serious infections, and insufficient treatment of cardiovascular diseases are independent drivers of the risk of stroke. Physicians should be aware that patients who experience a serious event are at increased risk of subsequent stroke. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. Development of transient initiating event frequencies for use in probabilistic risk assessments

    International Nuclear Information System (INIS)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approx.2400 events being added to EPRI's approx.3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors

  3. Hospital deaths and adverse events in Brazil

    Directory of Open Access Journals (Sweden)

    Pavão Ana Luiza B

    2011-09-01

    Full Text Available Abstract Background Adverse events are considered a major international problem related to the performance of health systems. Evaluating the occurrence of adverse events involves, as any other outcome measure, determining the extent to which the observed differences can be attributed to the patient's risk factors or to variations in the treatment process, and this in turn highlights the importance of measuring differences in the severity of the cases. The current study aims to evaluate the association between deaths and adverse events, adjusted according to patient risk factors. Methods The study is based on a random sample of 1103 patient charts from hospitalizations in the year 2003 in 3 teaching hospitals in the state of Rio de Janeiro, Brazil. The methodology involved a retrospective review of patient charts in two stages - screening phase and evaluation phase. Logistic regression was used to evaluate the relationship between hospital deaths and adverse events. Results The overall mortality rate was 8.5%, while the rate related to the occurrence of an adverse event was 2.9% (32/1103 and that related to preventable adverse events was 2.3% (25/1103. Among the 94 deaths analyzed, 34% were related to cases involving adverse events, and 26.6% of deaths occurred in cases whose adverse events were considered preventable. The models tested showed good discriminatory capacity. The unadjusted odds ratio (OR 11.43 and the odds ratio adjusted for patient risk factors (OR 8.23 between death and preventable adverse event were high. Conclusions Despite discussions in the literature regarding the limitations of evaluating preventable adverse events based on peer review, the results presented here emphasize that adverse events are not only prevalent, but are associated with serious harm and even death. These results also highlight the importance of risk adjustment and multivariate models in the study of adverse events.

  4. Multi-state model for studying an intermediate event using time-dependent covariates: application to breast cancer.

    Science.gov (United States)

    Meier-Hirmer, Carolina; Schumacher, Martin

    2013-06-20

    The aim of this article is to propose several methods that allow to investigate how and whether the shape of the hazard ratio after an intermediate event depends on the waiting time to occurrence of this event and/or the sojourn time in this state. A simple multi-state model, the illness-death model, is used as a framework to investigate the occurrence of this intermediate event. Several approaches are shown and their advantages and disadvantages are discussed. All these approaches are based on Cox regression. As different time-scales are used, these models go beyond Markov models. Different estimation methods for the transition hazards are presented. Additionally, time-varying covariates are included into the model using an approach based on fractional polynomials. The different methods of this article are then applied to a dataset consisting of four studies conducted by the German Breast Cancer Study Group (GBSG). The occurrence of the first isolated locoregional recurrence (ILRR) is studied. The results contribute to the debate on the role of the ILRR with respect to the course of the breast cancer disease and the resulting prognosis. We have investigated different modelling strategies for the transition hazard after ILRR or in general after an intermediate event. Including time-dependent structures altered the resulting hazard functions considerably and it was shown that this time-dependent structure has to be taken into account in the case of our breast cancer dataset. The results indicate that an early recurrence increases the risk of death. A late ILRR increases the hazard function much less and after the successful removal of the second tumour the risk of death is almost the same as before the recurrence. With respect to distant disease, the appearance of the ILRR only slightly increases the risk of death if the recurrence was treated successfully. It is important to realize that there are several modelling strategies for the intermediate event and that

  5. Blood pressure variability and risk of cardiovascular events and death in patients with hypertension and different baseline risks.

    Science.gov (United States)

    Mehlum, Maria H; Liestøl, Knut; Kjeldsen, Sverre E; Julius, Stevo; Hua, Tsushung A; Rothwell, Peter M; Mancia, Giuseppe; Parati, Gianfranco; Weber, Michael A; Berge, Eivind

    2018-01-20

    Blood pressure variability is associated with increased risk of cardiovascular events, particularly in high-risk patients. We assessed if variability was associated with increased risk of cardiovascular events and death in hypertensive patients at different risk levels. The Valsartan Antihypertensive Long-term Use Evaluation trial was a randomized controlled trial of valsartan vs. amlodipine in patients with hypertension and different risks of cardiovascular events, followed for a mean of 4.2 years. We calculated standard deviation (SD) of mean systolic blood pressure from visits from 6 months onward in patients with ≥3 visits and no events during the first 6 months. We compared the risk of cardiovascular events in the highest and lowest quintile of visit-to-visit blood pressure variability, using Cox regression. For analysis of death, variability was analysed as a continuous variable. Of 13 803 patients included, 1557 (11.3%) had a cardiovascular event and 1089 (7.9%) died. Patients in the highest quintile of SD had an increased risk of cardiovascular events [hazard ratio (HR) 2.1, 95% confidence interval (95% CI) 1.7-2.4; P risk of death (HR 1.10, 95% CI 1.04-1.17; P = 0.002). Associations were stronger among younger patients and patients with lower systolic blood pressure, and similar between patients with different baseline risks, except for higher risk of death among patients with established cardiovascular disease. Higher visit-to-visit systolic blood pressure variability is associated with increased risk of cardiovascular events in patients with hypertension, irrespective of baseline risk of cardiovascular events. Associations were stronger in younger patients and in those with lower mean systolic blood pressure. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author(s) 2018. For permissions, please email: journals.permissions@oup.com.

  6. Can discrete event simulation be of use in modelling major depression?

    Directory of Open Access Journals (Sweden)

    François Clément

    2006-12-01

    Full Text Available Abstract Background Depression is among the major contributors to worldwide disease burden and adequate modelling requires a framework designed to depict real world disease progression as well as its economic implications as closely as possible. Objectives In light of the specific characteristics associated with depression (multiple episodes at varying intervals, impact of disease history on course of illness, sociodemographic factors, our aim was to clarify to what extent "Discrete Event Simulation" (DES models provide methodological benefits in depicting disease evolution. Methods We conducted a comprehensive review of published Markov models in depression and identified potential limits to their methodology. A model based on DES principles was developed to investigate the benefits and drawbacks of this simulation method compared with Markov modelling techniques. Results The major drawback to Markov models is that they may not be suitable to tracking patients' disease history properly, unless the analyst defines multiple health states, which may lead to intractable situations. They are also too rigid to take into consideration multiple patient-specific sociodemographic characteristics in a single model. To do so would also require defining multiple health states which would render the analysis entirely too complex. We show that DES resolve these weaknesses and that its flexibility allow patients with differing attributes to move from one event to another in sequential order while simultaneously taking into account important risk factors such as age, gender, disease history and patients attitude towards treatment, together with any disease-related events (adverse events, suicide attempt etc.. Conclusion DES modelling appears to be an accurate, flexible and comprehensive means of depicting disease progression compared with conventional simulation methodologies. Its use in analysing recurrent and chronic diseases appears particularly useful

  7. Power System Event Ranking Using a New Linear Parameter-Varying Modeling with a Wide Area Measurement System-Based Approach

    Directory of Open Access Journals (Sweden)

    Mohammad Bagher Abolhasani Jabali

    2017-07-01

    Full Text Available Detecting critical power system events for Dynamic Security Assessment (DSA is required for reliability improvement. The approach proposed in this paper investigates the effects of events on dynamic behavior during nonlinear system response while common approaches use steady-state conditions after events. This paper presents some new and enhanced indices for event ranking based on time-domain simulation and polytopic linear parameter-varying (LPV modeling of a power system. In the proposed approach, a polytopic LPV representation is generated via linearization about some points of the nonlinear dynamic behavior of power system using wide-area measurement system (WAMS concepts and then event ranking is done based on the frequency response of the system models on the vertices. Therefore, the nonlinear behaviors of the system in the time of fault occurrence are considered for events ranking. The proposed algorithm is applied to a power system using nonlinear simulation. The comparison of the results especially in different fault conditions shows the advantages of the proposed approach and indices.

  8. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1126 deaths, displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy to cope with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. However, adaptive behaviour towards flood risk reduction and the interaction between the government, insurers, and individuals has hardly been studied in large-scale flood risk assessments. In this study, an European Agent-Based Model is developed including agent representatives for the administrative stakeholders of European Member states, insurers and reinsurers markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach this study is a first contribution to overcome the limitations of traditional large-scale flood risk models in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.

  9. Episodes, events, and models

    Directory of Open Access Journals (Sweden)

    Sangeet eKhemlani

    2015-10-01

    Full Text Available We describe a novel computational theory of how individuals segment perceptual information into representations of events. The theory is inspired by recent findings in the cognitive science and cognitive neuroscience of event segmentation. In line with recent theories, it holds that online event segmentation is automatic, and that event segmentation yields mental simulations of events. But it posits two novel principles as well: first, discrete episodic markers track perceptual and conceptual changes, and can be retrieved to construct event models. Second, the process of retrieving and reconstructing those episodic markers is constrained and prioritized. We describe a computational implementation of the theory, as well as a robotic extension of the theory that demonstrates the processes of online event segmentation and event model construction. The theory is the first unified computational account of event segmentation and temporal inference. We conclude by demonstrating now neuroimaging data can constrain and inspire the construction of process-level theories of human reasoning.

  10. Higher risk of offspring schizophrenia following antenatal maternal exposure to severe adverse life events

    DEFF Research Database (Denmark)

    Khashan, Ali; Abel, Kathryn; McNamee, R.

    2008-01-01

    CONTEXT: Most societies believe that a mother's psychological state can influence her unborn baby. Severe adverse life events during pregnancy have been consistently associated with an elevated risk of low birth weight and prematurity. Such events during the first trimester have also been...... associated with risk of congenital malformations. OBJECTIVE: To assess the effect in offspring of antenatal maternal exposure to an objective measure of stress on risk of adverse neurodevelopment, specifically schizophrenia. We hypothesized that the strongest relationship would be to maternal exposures...... not linked with a higher risk of schizophrenia. CONCLUSIONS: Our population-based study suggests that severe stress to a mother during the first trimester may alter the risk of schizophrenia in offspring. This finding is consistent with ecological evidence from whole populations exposed to severe stressors...

  11. Task-based dermal exposure models for regulatory risk assessment

    NARCIS (Netherlands)

    Warren, N.D.; Marquart, H.; Christopher, Y.; Laitinen, J.; Hemmen, J.J. van

    2006-01-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of

  12. Elevated Serum Neopterin is Associated with Increased Risk of Cardiovascular Events in Acute Coronary Syndromes

    Directory of Open Access Journals (Sweden)

    Anwar Santoso

    2009-04-01

    Full Text Available BACKGROUND: Neopterin is a soluble biomarker of monocyte activation and its increased concentration might be expressed in atherosclerosis. Until recently, there has been lacking of information on the prognostic role of neopterin in acute coronary syndromes (ACS. The study was aimed at measuring the associations between elevated serum neopterin and increased risk of cardiovascular (CV events in ACS. METHODS: This was a prospective cohort study, recruited 71 ACS patients from January 31 through August 31, 2007 in Sanglah Hospital of Udayana School of Medicine, Denpasar, Bali. Cardiovascular events, such as: CV death, recurrent myocardial infarction, stroke and recurrent myocardial ischemia were previously defined. Relative risk and survival rate were measured successively by Cox proportional model and Kaplan-Meier curve. RESULTS: Of 71 ACS patients aged 56.8±9.5 years, 21 (29.5% subjects underwent CV events. Overall mean followup was 151.6 (95% CI: 129.7-173.5 days. Baseline characteristic were similarly distributed between groups with the highest quartile neopterin level (≥14.7 nmol/L than those with lowest quartile (≤6.2 nmol/L. Patients with the highest quartile had the worst survival curve than those with the lowest quartile (log-rank test; p=0.047. On Cox proportional model, relative risk of highest quartile group was 5.84 (95% CI: 1.19-28.47; p=0.029 compared to lowest quartile, after being adjusted with other predictors. CONCLUSIONS: Elevated serum neopterin is associated with increased risk of CV events in acute coronary syndromes. KEYWORDS: neopterin, cardiovascular events, acute coronary syndromes.

  13. Quantitative microbial risk assessment for spray irrigation of dairy manure based on an empirical fate and transport model

    Science.gov (United States)

    Burch, Tucker R; Spencer, Susan K.; Stokdyk, Joel; Kieke, Burney A; Larson, Rebecca A; Firnstahl, Aaron; Rule, Ana M; Borchardt, Mark A.

    2017-01-01

    BACKGROUND: Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. OBJECTIVES: We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. METHODS: We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irri- gation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. RESULTS: Median risk estimates from Monte Carlo simulations ranged from 10−5 to 10−2 and decreased with distance from the source. Risk estimates for Salmonella or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as a significant predictor more often than relative humidity, solar irradiation, or temperature. CONCLUSIONS: Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk.

  14. Modeling the Impact of Control on the Attractiveness of Risk in a Prospect Theory Framework

    Science.gov (United States)

    Young, Diana L.; Goodie, Adam S.; Hall, Daniel B.

    2010-01-01

    Many decisions involve a degree of personal control over event outcomes, which is exerted through one’s knowledge or skill. In three experiments we investigated differences in decision making between prospects based on a) the outcome of random events and b) the outcome of events characterized by control. In Experiment 1, participants estimated certainty equivalents (CEs) for bets based on either random events or the correctness of their answers to U.S. state population questions across the probability spectrum. In Experiment 2, participants estimated CEs for bets based on random events, answers to U.S. state population questions, or answers to questions about 2007 NCAA football game results. Experiment 3 extended the same procedure as Experiment 1 using a within-subjects design. We modeled data from all experiments in a prospect theory framework to establish psychological mechanisms underlying decision behavior. Participants weighted the probabilities associated with bets characterized by control so as to reflect greater risk attractiveness relative to bets based on random events, as evidenced by more elevated weighting functions under conditions of control. This research elucidates possible cognitive mechanisms behind increased risk taking for decisions characterized by control, and implications for various literatures are discussed. PMID:21278906

  15. Modeling the Impact of Control on the Attractiveness of Risk in a Prospect Theory Framework.

    Science.gov (United States)

    Young, Diana L; Goodie, Adam S; Hall, Daniel B

    2011-01-01

    Many decisions involve a degree of personal control over event outcomes, which is exerted through one's knowledge or skill. In three experiments we investigated differences in decision making between prospects based on a) the outcome of random events and b) the outcome of events characterized by control. In Experiment 1, participants estimated certainty equivalents (CEs) for bets based on either random events or the correctness of their answers to U.S. state population questions across the probability spectrum. In Experiment 2, participants estimated CEs for bets based on random events, answers to U.S. state population questions, or answers to questions about 2007 NCAA football game results. Experiment 3 extended the same procedure as Experiment 1 using a within-subjects design. We modeled data from all experiments in a prospect theory framework to establish psychological mechanisms underlying decision behavior. Participants weighted the probabilities associated with bets characterized by control so as to reflect greater risk attractiveness relative to bets based on random events, as evidenced by more elevated weighting functions under conditions of control. This research elucidates possible cognitive mechanisms behind increased risk taking for decisions characterized by control, and implications for various literatures are discussed.

  16. Sugar-sweetened beverages, vascular risk factors and events

    DEFF Research Database (Denmark)

    Keller, Amelie; Heitmann, Berit L; Olsen, Nanna

    2015-01-01

    , while two of three studies, including both men and women, found direct associations between SSB consumption and stroke; however, the association was significant among women only. All included studies examining vascular risk factors found direct associations between SSB consumption and change in blood...... pressure, blood lipid or blood sugar. CONCLUSIONS: The reviewed studies generally showed that SSB intake was related to vascular risk factors, whereas associations with vascular events were less consistent. Due to a limited number of published papers, especially regarding vascular events, the strength......OBJECTIVE: A high intake of sugar-sweetened beverages (SSB) has been linked to weight gain, obesity and type 2 diabetes; however, the influence on CVD risk remains unclear. Therefore, our objective was to summarize current evidence for an association between SSB consumption and cardiovascular risk...

  17. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    International Nuclear Information System (INIS)

    Brinkmann, Markus; Eichbaum, Kathrin; Kammann, Ulrike; Hudjetz, Sebastian; Cofalla, Catrina; Buchinger, Sebastian; Reifferscheid, Georg; Schüttrumpf, Holger; Preuss, Thomas

    2014-01-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studies using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will be likely facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios

  18. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    Energy Technology Data Exchange (ETDEWEB)

    Brinkmann, Markus; Eichbaum, Kathrin [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Kammann, Ulrike [Thünen-Institute of Fisheries Ecology, Palmaille 9, 22767 Hamburg (Germany); Hudjetz, Sebastian [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Cofalla, Catrina [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Buchinger, Sebastian; Reifferscheid, Georg [Federal Institute of Hydrology (BFG), Department G3: Biochemistry, Ecotoxicology, Am Mainzer Tor 1, 56068 Koblenz (Germany); Schüttrumpf, Holger [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Preuss, Thomas [Department of Environmental Biology and Chemodynamics, Institute for Environmental Research,ABBt- Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); and others

    2014-07-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studies using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will be likely facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios.

  19. A process-based model for the definition of hydrological alert systems in landslide risk mitigation

    Directory of Open Access Journals (Sweden)

    M. Floris

    2012-11-01

    Full Text Available The definition of hydrological alert systems for rainfall-induced landslides is strongly related to a deep knowledge of the geological and geomorphological features of the territory. Climatic conditions, spatial and temporal evolution of the phenomena and characterization of landslide triggering, together with propagation mechanisms, are the key elements to be considered. Critical steps for the development of the systems consist of the identification of the hydrological variable related to landslide triggering and of the minimum rainfall threshold for landslide occurrence.

    In this paper we report the results from a process-based model to define a hydrological alert system for the Val di Maso Landslide, located in the northeastern Italian Alps and included in the Vicenza Province (Veneto region, NE Italy. The instability occurred in November 2010, due to an exceptional rainfall event that hit the Vicenza Province and the entire NE Italy. Up to 500 mm in 3-day cumulated rainfall generated large flood conditions and triggered hundreds of landslides. During the flood, the Soil Protection Division of the Vicenza Province received more than 500 warnings of instability phenomena. The complexity of the event and the high level of risk to infrastructure and private buildings are the main reasons for deepening the specific phenomenon occurred at Val di Maso.

    Empirical and physically-based models have been used to identify the minimum rainfall threshold for the occurrence of instability phenomena in the crown area of Val di Maso landslide, where a retrogressive evolution by multiple rotational slides is expected. Empirical models helped in the identification and in the evaluation of recurrence of critical rainfall events, while physically-based modelling was essential to verify the effects on the slope stability of determined rainfall depths. Empirical relationships between rainfall and landslide consist of the calculation of rainfall

  20. STakeholder-Objective Risk Model (STORM): Determiningthe aggregated risk of multiple contaminant hazards in groundwater well catchments

    DEFF Research Database (Denmark)

    Enzenhoefer, R.; Binning, Philip John; Nowak, W.

    2015-01-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any......-pathway-receptor concept, mass-discharge-based aggregation of stochastically occuring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired...

  1. Multivariate operational risk: dependence modelling with Lévy copulas

    OpenAIRE

    Böcker, K. and Klüppelberg, C.

    2015-01-01

    Simultaneous modelling of operational risks occurring in different event type/business line cells poses the challenge for operational risk quantification. Invoking the new concept of L´evy copulas for dependence modelling yields simple approximations of high quality for multivariate operational VAR.

  2. A calibration hierarchy for risk models was defined: from utopia to empirical data.

    Science.gov (United States)

    Van Calster, Ben; Nieboer, Daan; Vergouwe, Yvonne; De Cock, Bavo; Pencina, Michael J; Steyerberg, Ewout W

    2016-06-01

    Calibrated risk models are vital for valid decision support. We define four levels of calibration and describe implications for model development and external validation of predictions. We present results based on simulated data sets. A common definition of calibration is "having an event rate of R% among patients with a predicted risk of R%," which we refer to as "moderate calibration." Weaker forms of calibration only require the average predicted risk (mean calibration) or the average prediction effects (weak calibration) to be correct. "Strong calibration" requires that the event rate equals the predicted risk for every covariate pattern. This implies that the model is fully correct for the validation setting. We argue that this is unrealistic: the model type may be incorrect, the linear predictor is only asymptotically unbiased, and all nonlinear and interaction effects should be correctly modeled. In addition, we prove that moderate calibration guarantees nonharmful decision making. Finally, results indicate that a flexible assessment of calibration in small validation data sets is problematic. Strong calibration is desirable for individualized decision support but unrealistic and counter productive by stimulating the development of overly complex models. Model development and external validation should focus on moderate calibration. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Modeling logistic performance in quantitative microbial risk assessment.

    Science.gov (United States)

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.

  4. A Risk Prediction Model for Sporadic CRC Based on Routine Lab Results.

    Science.gov (United States)

    Boursi, Ben; Mamtani, Ronac; Hwang, Wei-Ting; Haynes, Kevin; Yang, Yu-Xiao

    2016-07-01

    Current risk scores for colorectal cancer (CRC) are based on demographic and behavioral factors and have limited predictive values. To develop a novel risk prediction model for sporadic CRC using clinical and laboratory data in electronic medical records. We conducted a nested case-control study in a UK primary care database. Cases included those with a diagnostic code of CRC, aged 50-85. Each case was matched with four controls using incidence density sampling. CRC predictors were examined using univariate conditional logistic regression. Variables with p value CRC prediction models which included age, sex, height, obesity, ever smoking, alcohol dependence, and previous screening colonoscopy had an AUC of 0.58 (0.57-0.59) with poor goodness of fit. A laboratory-based model including hematocrit, MCV, lymphocytes, and neutrophil-lymphocyte ratio (NLR) had an AUC of 0.76 (0.76-0.77) and a McFadden's R2 of 0.21 with a NRI of 47.6 %. A combined model including sex, hemoglobin, MCV, white blood cells, platelets, NLR, and oral hypoglycemic use had an AUC of 0.80 (0.79-0.81) with a McFadden's R2 of 0.27 and a NRI of 60.7 %. Similar results were shown in an internal validation set. A laboratory-based risk model had good predictive power for sporadic CRC risk.

  5. Validation in the Absence of Observed Events.

    Science.gov (United States)

    Lathrop, John; Ezell, Barry

    2016-04-01

    This article addresses the problem of validating models in the absence of observed events, in the area of weapons of mass destruction terrorism risk assessment. We address that problem with a broadened definition of "validation," based on stepping "up" a level to considering the reason why decisionmakers seek validation, and from that basis redefine validation as testing how well the model can advise decisionmakers in terrorism risk management decisions. We develop that into two conditions: validation must be based on cues available in the observable world; and it must focus on what can be done to affect that observable world, i.e., risk management. That leads to two foci: (1) the real-world risk generating process, and (2) best use of available data. Based on our experience with nine WMD terrorism risk assessment models, we then describe three best use of available data pitfalls: SME confidence bias, lack of SME cross-referencing, and problematic initiation rates. Those two foci and three pitfalls provide a basis from which we define validation in this context in terms of four tests--Does the model: … capture initiation? … capture the sequence of events by which attack scenarios unfold? … consider unanticipated scenarios? … consider alternative causal chains? Finally, we corroborate our approach against three validation tests from the DOD literature: Is the model a correct representation of the process to be simulated? To what degree are the model results comparable to the real world? Over what range of inputs are the model results useful? © 2015 Society for Risk Analysis.

  6. Catastrophe loss modelling of storm-surge flood risk in eastern England.

    Science.gov (United States)

    Muir Wood, Robert; Drayton, Michael; Berger, Agnete; Burgess, Paul; Wright, Tom

    2005-06-15

    Probabilistic catastrophe loss modelling techniques, comprising a large stochastic set of potential storm-surge flood events, each assigned an annual rate of occurrence, have been employed for quantifying risk in the coastal flood plain of eastern England. Based on the tracks of the causative extratropical cyclones, historical storm-surge events are categorized into three classes, with distinct windfields and surge geographies. Extreme combinations of "tide with surge" are then generated for an extreme value distribution developed for each class. Fragility curves are used to determine the probability and magnitude of breaching relative to water levels and wave action for each section of sea defence. Based on the time-history of water levels in the surge, and the simulated configuration of breaching, flow is time-stepped through the defences and propagated into the flood plain using a 50 m horizontal-resolution digital elevation model. Based on the values and locations of the building stock in the flood plain, losses are calculated using vulnerability functions linking flood depth and flood velocity to measures of property loss. The outputs from this model for a UK insurance industry portfolio include "loss exceedence probabilities" as well as "average annualized losses", which can be employed for calculating coastal flood risk premiums in each postcode.

  7. Development of transient initiating event frequencies for use in probabilistic risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approx.2400 events being added to EPRI's approx.3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.

  8. Modelling the interaction between flooding events and economic growth

    Directory of Open Access Journals (Sweden)

    J. Grames

    2015-06-01

    Socio-hydrology describes the interaction between the socio-economy and water. Recent models analyze the interplay of community risk-coping culture, flooding damage and economic growth (Di Baldassarre et al., 2013; Viglione et al., 2014). These models descriptively explain the feedbacks between socio-economic development and natural disasters like floods. Contrary to these descriptive models, our approach develops an optimization model, where the intertemporal decision of an economic agent interacts with the hydrological system. In order to build this first economic growth model describing the interaction between the consumption and investment decisions of an economic agent and the occurrence of flooding events, we transform an existing descriptive stochastic model into an optimal deterministic model. The intermediate step is to formulate and simulate a descriptive deterministic model. We develop a periodic water function to approximate the former discrete stochastic time series of rainfall events. Due to the non-autonomous exogenous periodic rainfall function, the long-term path of consumption and investment will be periodic.
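
    The key modelling device here is the periodic water function that replaces the discrete stochastic rainfall series. A minimal sketch of such a non-autonomous periodic forcing (the functional form and parameter values are assumptions for illustration, not the paper's calibration):

    ```python
    import numpy as np

    def periodic_rainfall(t, mean=1.0, amplitude=0.8, period=1.0):
        """Deterministic periodic water function approximating a stochastic
        rainfall series: a non-negative forcing with the given period
        (assumed sinusoidal form)."""
        return mean * (1.0 + amplitude * np.sin(2.0 * np.pi * t / period))

    t = np.linspace(0.0, 3.0, 301)   # three rainfall periods
    w = periodic_rainfall(t)
    print(f"mean {w.mean():.2f}, min {w.min():.2f}, max {w.max():.2f}")
    ```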

  9. Risk of Vascular Thrombotic Events Following Discontinuation of Antithrombotics After Peptic Ulcer Bleeding.

    Science.gov (United States)

    Kim, Seung Young; Hyun, Jong Jin; Suh, Sang Jun; Jung, Sung Woo; Jung, Young Kul; Koo, Ja Seol; Yim, Hyung Joon; Park, Jong Jae; Chun, Hoon Jai; Lee, Sang Woo

    2016-04-01

    To evaluate whether the risk of cardiovascular events increases when antithrombotics are discontinued after ulcer bleeding. Peptic ulcer bleeding associated with antithrombotics has increased due to the increase in the proportion of the elderly population. Little is known about the long-term effects of discontinuing antithrombotics after peptic ulcer bleeding. The aim of this study was to evaluate whether the risk of cardiovascular events increases when antithrombotics are discontinued after ulcer bleeding. We reviewed the medical records of patients with ulcer bleeding who were taking antiplatelet agents or anticoagulants at the time of ulcer bleeding. A Cox regression model was used to adjust for potential confounders and to analyze the association between discontinuation of antithrombotic drugs after ulcer bleeding and thrombotic events such as ischemic heart disease or stroke. Of the 544 patients with ulcer bleeding, 72 patients who were taking antithrombotics and followed up for >2 months were analyzed. Forty patients discontinued antithrombotics after ulcer bleeding (discontinuation group) and 32 patients continued antithrombotics with or without transient interruption (continuation group). Thrombotic events developed more often in the discontinuation group than in the continuation group [7/32 (21.9%) vs. 1/40 (2.5%), P=0.019]. The hazard ratio for thrombotic events when antithrombotics remained discontinued was 10.9 (95% confidence interval, 1.3-89.7). There were no significant differences in recurrent bleeding events between the 2 groups. Discontinuation of antithrombotics after peptic ulcer bleeding increases the risk of cardiovascular events. Therefore, caution should be taken when discontinuing antithrombotics after ulcer bleeding.

  10. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    Science.gov (United States)

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and computing event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
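
    The abstention behaviour described here can be pictured as a three-way decision rule over the estimated event probability. A schematic sketch (the thresholds below stand in for the paper's derived, cost-based confidence criteria):

    ```python
    def event_decision(p_event, lower=0.2, upper=0.8):
        """Three-way decision rule over an estimated event probability:
        raise an alarm, rule the event out, or abstain and wait for more
        data. The thresholds are placeholders for cost-derived criteria."""
        if p_event >= upper:
            return "alarm"       # confident enough to act despite alarm cost
        if p_event <= lower:
            return "no-event"    # confident enough to rule the event out
        return "abstain"         # uncertainty too high; defer the decision

    for p in (0.05, 0.5, 0.93):
        print(p, event_decision(p))
    ```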

  11. Component Degradation Susceptibilities As The Bases For Modeling Reactor Aging Risk

    International Nuclear Information System (INIS)

    Unwin, Stephen D.; Lowry, Peter P.; Toyooka, Michael Y.

    2010-01-01

    The extension of nuclear power plant operating licenses beyond 60 years in the United States will be necessary if we are to meet national energy needs while addressing the issues of carbon and climate. Characterizing the operating risks associated with aging reactors is problematic because the principal tool for risk-informed decision-making, Probabilistic Risk Assessment (PRA), is not ideally suited to addressing aging systems. The components most likely to drive risk in an aging reactor - the passives - receive limited treatment in PRA, and furthermore, standard PRA methods are based on the assumption of stationary failure rates: a condition unlikely to be met in an aging system. A critical barrier to modeling passives aging on the wide scale required for a PRA is that there is seldom sufficient field data to populate parametric failure models, nor are practical physics models available to predict out-year component reliability. The methodology described here circumvents some of these data and modeling needs by using materials degradation metrics, integrated with conventional PRA models, to produce risk importance measures for specific aging mechanisms and component types. We suggest that these measures have multiple applications, from the risk-screening of components to the prioritization of materials research.

  12. riskRegression

    DEFF Research Database (Denmark)

    Ozenne, Brice; Sørensen, Anne Lyngholm; Scheike, Thomas

    2017-01-01

    In the presence of competing risks a prediction of the time-dynamic absolute risk of an event can be based on cause-specific Cox regression models for the event and the competing risks (Benichou and Gail, 1990). We present computationally fast and memory optimized C++ functions with an R interface... As a by-product we obtain fast access to the baseline hazards (compared to survival::basehaz()) and predictions of survival probabilities, their confidence intervals and confidence bands. Confidence intervals and confidence bands are based on point-wise asymptotic expansions of the corresponding statistical...
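
    The underlying absolute-risk formula of Benichou and Gail combines the cause-specific hazard of the event with overall event-free survival. A numpy sketch on a discrete time grid, with made-up constant hazards standing in for fitted Cox model output:

    ```python
    import numpy as np

    # Discrete time grid and hypothetical cause-specific hazards for a single
    # covariate profile (stand-ins for two fitted cause-specific Cox models).
    times = np.linspace(0.1, 10.0, 100)
    dt = np.diff(times, prepend=0.0)
    h1 = 0.02 * np.ones_like(times)   # hazard of the event of interest
    h2 = 0.01 * np.ones_like(times)   # hazard of the competing risk

    # Event-free survival just before each time point.
    surv = np.exp(-np.cumsum((h1 + h2) * dt))
    surv_before = np.concatenate(([1.0], surv[:-1]))

    # Absolute risk (cumulative incidence) of cause 1:
    # F1(t) = integral up to t of S(u-) * h1(u) du  (Benichou and Gail, 1990).
    F1 = np.cumsum(surv_before * h1 * dt)
    print(f"10-year absolute risk of the event: {F1[-1]:.3f}")
    ```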

  13. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features reflect and represent the totals to some extent. The forewarning level is judged by the maximum-probability rule, and management strategies are then proposed in light of local conditions with the aim of downgrading heavy warnings. This study takes Taihu Basin as an example. After application and verification of the forewarning model for water pollution risk against actual and simulated data from 2000 to 2009, the forewarning level in 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory and flexible in method, reasonable in result and simple in structure, and it has strong logical superiority and regional adaptability, providing a new way to warn of water pollution risk.
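
    The decision step, the maximum-probability rule over the posterior, is easy to make concrete. A minimal sketch with an assumed prior and likelihoods (illustrative values, not the Taihu Basin figures):

    ```python
    import numpy as np

    # Hypothetical setup: three warning levels, a prior over them, and the
    # likelihood of the observed index values under each level.
    levels = ["light", "moderate", "severe"]
    prior = np.array([0.5, 0.3, 0.2])
    likelihood = np.array([0.05, 0.25, 0.60])  # P(observed indexes | level)

    # Bayes theorem: posterior proportional to prior times likelihood.
    posterior = prior * likelihood
    posterior /= posterior.sum()

    # Maximum-probability rule: issue the level with the highest posterior.
    warning = levels[int(np.argmax(posterior))]
    print(dict(zip(levels, posterior.round(3))), "->", warning)
    ```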

  14. A risk-based review of Instrument Air systems at nuclear power plants

    International Nuclear Information System (INIS)

    DeMoss, G.; Lofgren, E.; Rothleder, B.; Villeran, M.; Ruger, C.

    1990-01-01

    The broad objective of this analysis was to provide risk-based information to help focus regulatory actions related to Instrument Air (IA) systems at operating nuclear power plants. We first created an extensive data base of summarized and characterized IA-related events that gave a qualitative indication of the nature and severity of these events. Additionally, this data base was used to calculate the frequencies of certain events, which were used in the risk analysis. The risk analysis consisted of reviewing published PRAs and NRC Accident Sequence Precursor reports for IA-initiated accident sequences, IA interactions with frontline systems, and IA-related risk-significant events. Sensitivity calculations were performed when possible. Generically, IA was found to contribute less to total risk than many safety systems; however, specific design weaknesses in safety systems, non-safety systems, and the IA system were found to be significant contributors to risk. 22 refs., 13 figs., 24 tabs

  15. Dealing with project complexity by matrix-based propagation modelling for project risk analysis

    OpenAIRE

    Fang , Chao; Marle , Franck

    2012-01-01

    Engineering projects face growing complexity and are thus exposed to numerous and interdependent risks. In this paper, we present a quantitative method for modelling propagation behaviour in the project risk network. The construction of the network requires the involvement of the project manager and related experts using the Design Structure Matrix (DSM) method. A matrix-based risk propagation model is introduced to calculate risk propagation and thus to re-eva...
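
    A common way to make matrix-based propagation concrete is to accumulate network effects through the Neumann series (I - M)^-1 r over the DSM of propagation links. A sketch with illustrative values (this is one standard formulation; the paper's exact update rule may differ):

    ```python
    import numpy as np

    # Hypothetical 3-risk network. M[i, j] = strength with which risk j, once
    # active, contributes to triggering risk i (a DSM of propagation links).
    M = np.array([[0.0, 0.3, 0.0],
                  [0.1, 0.0, 0.2],
                  [0.4, 0.0, 0.0]])
    r = np.array([0.20, 0.05, 0.10])  # spontaneous (local) risk levels

    # Accumulate propagation over all path lengths via the Neumann series
    # (I - M)^-1 r, valid when the spectral radius of M is below 1.
    assert max(abs(np.linalg.eigvals(M))) < 1.0
    r_propagated = np.linalg.solve(np.eye(3) - M, r)
    print(r_propagated)  # re-evaluated risks including network effects
    ```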

  16. Theorizing "Big Events" as a potential risk environment for drug use, drug-related harm and HIV epidemic outbreaks.

    Science.gov (United States)

    Friedman, Samuel R; Rossi, Diana; Braine, Naomi

    2009-05-01

    Political-economic transitions in the Soviet Union, Indonesia, and China, but not the Philippines, were followed by HIV epidemics among drug users. Wars also may sometimes increase HIV risk. Based on similarities in some of the causal pathways through which wars and transitions can affect HIV risk, we use the term "Big Events" to include both. We first critique several prior epidemiological models of Big Events as inadequately incorporating social agency and as somewhat imprecise and over-generalizing in their sociology. We then suggest a model using the following concepts: first, event-specific HIV transmission probabilities are functions of (a) the probability that partners are infection-discordant; (b) the infection-susceptibility of the uninfected partner; (c) the infectivity of the infected--as well as (d) the behaviours engaged in. These probabilities depend on the distributions of HIV and other variables in populations. Sexual or injection events incorporate risk behaviours and are embedded in sexual and injection partnership patterns and community networks, which in turn are shaped by the content of normative regulation in communities. Wars and transitions can change socio-economic variables that can sometimes precipitate increases in the numbers of people who engage in high-risk drug and sexual networks and behaviours and in the riskiness of what they do. These variables that Big Events affect may include population displacement; economic difficulties and policies; police corruption, repressiveness, and failure to preserve order; health services; migration; social movements; gender roles; and inter-communal violence--which, in turn, affect normative regulation, youth alienation, networks and behaviours. As part of these pathways, autonomous action by neighbourhood residents, teenagers, drug users and sex workers to maintain their economic welfare, health or happiness may affect many of these variables or otherwise mediate whether HIV epidemics follow

  17. Engineering models for catastrophe risk and their application to insurance

    Science.gov (United States)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like all other insurance (fire, auto), adopted an actuarial approach in the past, that is, insurance rates were determined based on historical loss experience. Due to the fact that an earthquake is a rare event with severe consequences, irrational determination of premium rates and a lack of understanding of the scale of potential loss left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper intends to introduce how engineering models can assist in quantifying earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  18. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  19. Modelling Extreme Events (Hurricanes) at the Seafloor in the Gulf of Mexico:

    Science.gov (United States)

    Syvitski, J. P.; Jenkins, C. J.; Meiburg, E. H.; Radhakrishnan, S.; Harris, C. K.; Arango, H.; Kniskern, T. A.; Hutton, E.; Auad, G.

    2016-02-01

    The subsea infrastructure of the N Gulf of Mexico is exposed to risks of seabed failure and flowage under extreme storm events. Numerical assessments of the likelihood, location and severity of those phenomena would help in planning. A project under BOEM couples advanced modelling modules in order to begin such a system. The period 2008-10 was used for test data, covering hurricanes Gustav and Ike in the Mississippi to De Soto Canyons region. Currents, tides and surface waves were computed using the Regional Ocean Modeling System (ROMS) and river discharges from WBMsed. The Community Sediment Transport Model (CSTMS) calculated the concurrent sediment erosion-transport-deposition. Local sediment properties were from the dbSEABED database. The preferred paths of near-bottom sediment flows were based on a stream analysis of the bathymetry. Locations and timings of suspended sediment gravity flow were identified by applying energy flow ignition criteria. Wave-induced mass failure and subbottom liquefaction were assessed using a bevy of marine geotechnical models. The persistence, densities and velocities of turbidity flows yielded by the disruption of the sediment masses were calculated using high-Reynolds Number adaptations of LES/RANS-TURBINS models (Large-Eddy Simulation / Reynolds Averaged Navier-Stokes). A valuable experience in the project was devising workflows and linkages between these advanced, but independent models. We thank H Arango, T Kniskern, J Birchler and S Radhakrishnan for their help in this. Results: as known, much of the shelf sediment mantle is suspended and/or moved during hurricanes. Many short-lived gravity-flow ignitions happen on the shelf; only those at the shelf edge will ignite into fast, erosive currents. Sediment patchiness and vagaries of hurricane path mean that the pattern alters from event to event. To understand the impacts on infrastructure, a numerical process-based modelling approach will be essential - along the lines we

  20. Density of calcium in the ascending thoracic aorta and risk of incident cardiovascular disease events.

    Science.gov (United States)

    Thomas, Isac C; McClelland, Robyn L; Michos, Erin D; Allison, Matthew A; Forbang, Nketi I; Longstreth, W T; Post, Wendy S; Wong, Nathan D; Budoff, Matthew J; Criqui, Michael H

    2017-10-01

    The volume and density of coronary artery calcium (CAC) both independently predict cardiovascular disease (CVD) beyond standard risk factors, with CAC density inversely associated with incident CVD after accounting for CAC volume. We tested the hypothesis that ascending thoracic aorta calcium (ATAC) volume and density predict incident CVD events independently of CAC. The Multi-Ethnic Study of Atherosclerosis (MESA) is a prospective cohort study of participants without clinical CVD at baseline. ATAC and CAC were measured from baseline cardiac computed tomography (CT). Cox regression models were used to estimate the associations of ATAC volume and density with incident coronary heart disease (CHD) events and CVD events, after adjustment for standard CVD risk factors and CAC volume and density. Among 6811 participants, 234 (3.4%) had prevalent ATAC and 3395 (49.8%) had prevalent CAC. Over 10.3 years, 355 CHD and 562 CVD events occurred. One-standard-deviation higher ATAC density was associated with a lower risk of CHD (HR 0.48 [95% CI 0.29-0.79]). ATAC density was inversely associated with incident CHD and CVD after adjustment for CVD risk factors and CAC volume and density. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Classification of radiation-hazardous objects by the ecological risk rate, based on the concept of the International Nuclear Event Scale (INES)

    International Nuclear Information System (INIS)

    Vetrov, V.A.

    2003-01-01

    The principal categories of radiation-hazardous objects (RHO) with a real risk of accidental release of radioactive substances (RS) into the environment are considered: nuclear fuel cycle plants (NFC, including NPPs); ships with nuclear propulsion units and the corresponding service facilities; facilities related to nuclear weapons (design, manufacture, storage, etc.); territories contaminated as a result of nuclear accidents and nuclear facility tests; and civil enterprises using radioactive sources. To assess the ecological risk rate of RHOs, the International Nuclear Event Scale (INES), implemented by the IAEA for NPPs, is suggested. In the specialists' opinion, the INES criteria could be applied to the assessment of radiation events at other RHOs, which makes it possible to rank RHOs by their potential environmental hazard in the case of an accident. For qualitative classification of RHOs, assessment of the main parameters influencing the risk of a radioactive release (the amount (total activity) of radioactive substances, the possibility of a chain reaction developing, the robustness of technological parameters, etc.) is suggested. On the basis of INES, all of the above-listed kinds of RHO can be conditionally separated into three categories in the case of an accident: (1) the most radiation-dangerous objects, at which severe and serious radiation accidents (corresponding to INES levels 4-7) could occur; (2) RHOs at which there is a risk of accidents accompanied by RS releases (accidents up to INES level 4); and (3) RHOs with no practical possibility of such events (incidents not higher than INES level 3). Introduction of the suggested classification makes it possible to rationalize safety control requirements for radiation monitoring purposes, both for the RHOs themselves and for local systems

  2. A risk standard based on societal cost with bounded consequences

    International Nuclear Information System (INIS)

    Worledge, D.H.

    1982-01-01

    A risk standard is proposed that relates the frequency of occurrence of single events to the consequences of the events. Maximum consequences and risk aversion are used to give the cumulative risk curve a shape similar to the results of a risk assessment and to bound the expectation of deaths. Societal costs in terms of deaths are used to fix the parameters of the model together with an approximate comparison with individual risks. The proposed standard is compared with some practical applications of risk assessment to nuclear reactor systems
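
    A frequency-consequence criterion of this kind can be written as an allowed annual event frequency that falls off faster than 1/N (risk aversion) and vanishes beyond a maximum consequence. A schematic sketch with placeholder parameters, not the proposed standard's actual values:

    ```python
    def allowed_frequency(n_deaths, k=1e-2, alpha=1.5, n_max=1000):
        """Schematic frequency-consequence criterion: allowed annual
        frequency of a single event with n_deaths consequences. alpha > 1
        encodes risk aversion; consequences beyond n_max are not tolerated
        at any frequency. All parameter values are illustrative."""
        if n_deaths > n_max:
            return 0.0
        return k / n_deaths ** alpha

    for n in (1, 10, 100, 1000, 2000):
        print(n, allowed_frequency(n))
    ```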

  3. Cost-effectiveness of rosuvastatin in comparison with generic atorvastatin and simvastatin in a Swedish population at high risk of cardiovascular events

    Directory of Open Access Journals (Sweden)

    Gandhi SK

    2012-01-01

    Background: To assess the long-term cost-effectiveness of rosuvastatin therapy compared with generic simvastatin and generic atorvastatin in reducing the incidence of cardiovascular events and mortality in a Swedish population with Framingham risk ≥20%. Methods: A probabilistic Monte Carlo simulation model based on data from JUPITER (the Justification for the Use of Statins in Prevention: an Intervention Trial Evaluating Rosuvastatin) was used to estimate the long-term cost-effectiveness of rosuvastatin 20 mg daily versus simvastatin or atorvastatin 40 mg for the prevention of cardiovascular death and morbidity. The three-stage model included cardiovascular event prevention simulating the 4 years of JUPITER, initial prevention beyond the trial, and subsequent cardiovascular event prevention. A Swedish health care payer perspective (direct costs only) was modeled for a lifetime horizon, with 2008/2009 as the costing period. Univariate and probabilistic sensitivity analyses were performed. Results: The incremental cost per quality-adjusted life-year (QALY) gained with rosuvastatin 20 mg over simvastatin or atorvastatin 40 mg ranged from SEK88,113 (rosuvastatin 20 mg versus simvastatin 40 mg; Framingham risk ≥30%; net avoidance of 34 events/1000 patients) to SEK497,542 (versus atorvastatin 40 mg; Framingham risk ≥20%; net avoidance of 11 events/1000 patients) over a lifetime horizon. Probabilistic sensitivity analyses indicated that at a willingness-to-pay threshold of SEK500,000/QALY, rosuvastatin 20 mg would be cost-effective for approximately 75%–85%...

  4. Towards large scale stochastic rainfall models for flood risk assessment in trans-national basins

    Science.gov (United States)

    Serinaldi, F.; Kilsby, C. G.

    2012-04-01

    While extensive research has been devoted to rainfall-runoff modelling for risk assessment in small and medium size watersheds, less attention has been paid, so far, to large scale trans-national basins, where flood events have severe societal and economic impacts with magnitudes quantified in billions of Euros. As an example, in the April 2006 flood events along the Danube basin at least 10 people lost their lives and up to 30 000 people were displaced, with overall damages estimated at more than half a billion Euros. In this context, refined analytical methods are fundamental to improve the risk assessment and, in turn, the design of structural and non-structural measures of protection, such as hydraulic works and insurance/reinsurance policies. Since flood events are mainly driven by exceptional rainfall events, suitable characterization and modelling of the space-time properties of rainfall fields is a key issue in performing a reliable flood risk analysis based on alternative precipitation scenarios to be fed into a new generation of large scale rainfall-runoff models. Ultimately, this approach should be extended to a global flood risk model. However, as the need for rainfall models able to account for and simulate spatio-temporal properties of rainfall fields over large areas is rather new, the development of new rainfall simulation frameworks is a challenging task that faces the problem of overcoming the drawbacks of the existing modelling schemes (devised for smaller spatial scales) while keeping their desirable properties. In this study, we critically summarize the most widely used approaches for rainfall simulation. Focusing on stochastic approaches, we stress the importance of introducing suitable climate forcings in these simulation schemes in order to account for the physical coherence of rainfall fields over wide areas. Based on preliminary considerations, we suggest a modelling framework relying on the Generalized Additive Models for Location, Scale and Shape (GAMLSS)...

  5. Decision Making and Risk Evaluation Frameworks for Extreme Space Weather Events

    Science.gov (United States)

    Uritskaya, O.; Robinson, R. M.; Pulkkinen, A. A.

    2017-12-01

    Extreme Space Weather events (ESWE) are in the spotlight nowadays because they can produce a significant impact not only due to their intensity and broad geographical scope, but also because of the widespread levels and the multiple sectors of the economy that could be involved. In the task of evaluating the ESWE consequences, the most problematic and vulnerable aspect is the determination and calculation of the probability of statistically infrequent events and the subsequent assessment of the economic risks. In this work, we conduct a detailed analysis of the available frameworks of general decision-making theory in the presence of uncertainty, in the context of their applicability to the numerical estimation of the risks and losses associated with ESWE. The results of our study demonstrate that, unlike the multiple-criteria decision analysis or minimax approaches to modeling the possible scenarios of ESWE effects, which prevail in the literature, the most suitable concept is the Games Against Nature (GAN). It enables an evaluation of every economically relevant aspect of space weather conditions and yields more detailed results. Choosing the appropriate methods for solving GAN models, i.e., determining the optimal strategy at a given level of uncertainty, requires estimating the conditional probabilities of Space Weather events for each outcome of the possible scenarios of this natural disaster. Due to the specifics of the complex natural and economic systems with which we are dealing in this case, this problem remains unsolved, mainly because of the inevitable loss of information at every stage of the decision-making process. The analysis is illustrated with the deregulated electricity markets of the USA and Canada, whose power grid systems are known to be susceptible to ESWE. The GAN model is more appropriate for identifying potential risks in economic systems. The proposed approach, when applied to the existing database of Space Weather observations and

  6. A Risk Assessment Example for Soil Invertebrates Using Spatially Explicit Agent-Based Models

    DEFF Research Database (Denmark)

    Reed, Melissa; Alvarez, Tania; Chelinho, Sonia

    2016-01-01

    Current risk assessment methods for measuring the toxicity of plant protection products (PPPs) on soil invertebrates use standardized laboratory conditions to determine acute effects on mortality and sublethal effects on reproduction. If an unacceptable risk is identified at the lower tier...... population models for ubiquitous soil invertebrates (collembolans and earthworms) as refinement options in current risk assessment. Both are spatially explicit agent-based models (ABMs), incorporating individual and landscape variability. The models were used to provide refined risk assessments for different...... application scenarios of a hypothetical pesticide applied to potato crops (full-field spray onto the soil surface [termed “overall”], in-furrow, and soil-incorporated pesticide applications). In the refined risk assessment, the population models suggest that soil invertebrate populations would likely recover...

  7. Risk factors for venous thromboembolic events in pediatric surgical patients: Defining indications for prophylaxis.

    Science.gov (United States)

    Cairo, Sarah B; Lautz, Timothy B; Schaefer, Beverly A; Yu, Guan; Naseem, Hibbut-Ur-Rauf; Rothstein, David H

    2017-12-27

    Venous thromboembolism (VTE) in pediatric surgical patients is a rare event. The risk factors for VTE in pediatric general surgery patients undergoing abdominopelvic procedures are unknown. The American College of Surgeons' National Surgical Quality Improvement Program-Pediatric (NSQIP-P) database (2012-2015) was queried for patients with VTE after abdominopelvic general surgery procedures. Patient and operative variables were assessed to identify risk factors associated with VTE and develop a pediatric risk score. From 2012-2015, 68 of 34,813 (0.20%) patients who underwent abdominopelvic general surgery procedures were diagnosed with VTE. On multivariate analysis, there was no increased risk of VTE based on concomitant malignancy, chemotherapy, inflammatory bowel disease, or laparoscopic surgical approach, while a higher rate of VTE was identified among female patients. The odds of experiencing VTE were increased on stepwise regression for patients older than 15 years and those with preexisting renal failure or a diagnosis of septic shock, patients with American Society of Anesthesiologists (ASA) classification ≥ 2, and for anesthesia time longer than 2 h. The combination of age > 15 years, ASA classification ≥ 2, anesthesia time > 2 h, renal failure, and septic shock was included in a model for predicting risk of VTE (AUC = 0.907, sensitivity 84.4%, specificity 88.2%). VTE is rare in pediatric patients, but prediction modeling may help identify those patients at heightened risk. Additional studies are needed to validate the factors identified in this study in a risk assessment model as well as to assess the efficacy and cost-effectiveness of prophylaxis methods. Level III, retrospective comparative study. Copyright © 2018. Published by Elsevier Inc.
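
    A prediction model over the five reported factors can be pictured as a simple logistic risk score. A schematic sketch; the coefficients below are hypothetical illustrations, since the abstract reports only the model's performance (AUC 0.907), not its weights:

    ```python
    import math

    # Hypothetical coefficients for the five reported factors; illustrative
    # only, the study's fitted weights are not given in the abstract.
    COEF = {"intercept": -8.0, "age_gt_15": 1.2, "asa_ge_2": 1.0,
            "anesthesia_gt_2h": 0.9, "renal_failure": 1.5, "septic_shock": 2.0}

    def vte_probability(**factors):
        """Logistic risk score: P(VTE) = 1 / (1 + exp(-linear predictor))."""
        lp = COEF["intercept"] + sum(COEF[k] * int(v) for k, v in factors.items())
        return 1.0 / (1.0 + math.exp(-lp))

    print(vte_probability(age_gt_15=1, asa_ge_2=1, anesthesia_gt_2h=1,
                          renal_failure=0, septic_shock=1))
    ```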

  8. Construction and Quantification of the One Top model of the Fire Events PSA

    International Nuclear Information System (INIS)

    Kang, Dae Il; Lee, Yoon Hwan; Han, Sang Hoon

    2008-01-01

    KAERI constructed the one top model of the fire events PSA for Ulchin Unit 3 and 4 by using the 'mapping technique'. The mapping technique was developed for the construction and quantification of external events PSA models with a one top model for an internal events PSA. With 'AIMS', the mapping technique can be implemented by the construction of mapping tables. The mapping tables include fire rooms, fire ignition frequencies, related initiating events, fire transfer events, and the internal PSA basic events affected by a fire. The constructed one top fire PSA model is based on previously conducted fire PSA results for Ulchin Unit 3 and 4. In this paper, we introduce the construction procedure and quantification results of the one top model of the fire events PSA by using the mapping technique. As the one top model of the fire events PSA developed in this study is based on the previous study, we also introduce the previous fire PSA approach, focused on quantification.

  9. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient, called the hazardous risk coefficient, covers anticipated hazards which may occur in the future; this risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for the consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is resorted to in order to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of the critical items are then estimated. The prioritized ranking of critical items using the developed mathematical model for risk assessment shall be useful in optimizing financial losses and the timing of maintenance actions.
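
    The final ranking step, simulating each coefficient within its controlling range and statistically adding the results, can be sketched as follows (the ranges and the uniform sampling are assumptions for illustration, not the paper's fitted distributions):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Assumed controlling ranges for the three risk coefficients of one item
    # (likelihood, abstract, hazardous); values are illustrative.
    ranges = {"likelihood": (0.2, 0.5),
              "abstract": (0.1, 0.3),
              "hazardous": (0.3, 0.6)}

    # Random-number simulation: draw many plausible values within each range
    # and take the mean as the coefficient's one distinctive value.
    distinct = {k: rng.uniform(lo, hi, 10_000).mean()
                for k, (lo, hi) in ranges.items()}

    # Statistically add the coefficients into a final coefficient for ranking.
    final_risk = sum(distinct.values())
    print(distinct, final_risk)
    ```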

  10. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi...

  11. A state-of-the-art multi-criteria model for drug benefit-risk analysis

    NARCIS (Netherlands)

    Tervonen, T.; Hillege, H.L.; Buskens, E.; Postmus, D.

    2010-01-01

    Drug benefit-risk analysis is based on firm clinical evidence related to various safety and efficacy outcomes, such as tolerability, treatment response, and adverse events. In this paper, we propose a new approach for constructing a supporting multi-criteria model that fully takes into account this evidence.

  12. Future changes in extreme precipitation in the Rhine basin based on global and regional climate model simulations

    NARCIS (Netherlands)

    Pelt, van S.C.; Beersma, J.J.; Buishand, T.A.; Hurk, van den B.J.J.M.; Kabat, P.

    2012-01-01

    Probability estimates of the future change of extreme precipitation events are usually based on a limited number of available global climate model (GCM) or regional climate model (RCM) simulations. Since floods are related to heavy precipitation events, this restricts the assessment of flood risks.

  13. Relationships of different types of event to cardiovascular death in trials of antihypertensive treatment: an aid to definition of total cardiovascular disease risk in hypertension.

    Science.gov (United States)

    Zambon, Antonella; Arfè, Andrea; Corrao, Giovanni; Zanchetti, Alberto

    2014-03-01

    Guidelines for management of cardiovascular diseases stratify absolute cardiovascular risk into categories with a high-risk threshold defined at a 20% cardiovascular event risk in 10 years, but it is unclear whether only major events or the Framingham-extended definition should be considered. The 2013 ESH-ESC hypertension guidelines, instead, define cardiovascular risk as a risk of cardiovascular death in 10 years, as in the SCORE model, setting the threshold for high risk at the 5% level. It would therefore be convenient to know the quantitative relationship between the risks of the different outcomes adopted by the different guidelines, especially because some outcome definitions include serious nonfatal cardiovascular events relevant in cardiovascular prevention. We have therefore analysed these relationships in trials of antihypertensive therapy as an aid to defining total cardiovascular risk in hypertensive patients. Sixty-one trials were identified, and 51 retained for analysis of the relationship of cardiovascular death to the incidence of all-cause death, major cardiovascular events and inclusive (Framingham) cardiovascular events. The relationship between cardiovascular death rates and each type of event rate was explored by fitting flexible regression models. The included trials provided 15164 cardiovascular deaths and 1674427 patient-years. The relation of each event rate to the cardiovascular death rate was best explained by a model considering the logarithm of each event rate as a dependent variable and the logarithm of the cardiovascular death rate as a predictor. Mean patients' age and treatment were also predictors, but to a minor extent. The increase of the incidence rates of all types of events was less steep the higher the CV death rate: the rate ratios of all-cause death to cardiovascular death were 2.2, 1.9 and 1.8 at low-moderate, high and very high cardiovascular death rates, respectively. These relationships can be used to estimate total cardiovascular risk in hypertensive patients whose cardiovascular death risk is calculated by the SCORE model.

  14. A systemic approach for managing extreme risk events-dynamic financial analysis

    Directory of Open Access Journals (Sweden)

    Ph.D.Student Rodica Ianole

    2011-12-01

    Following the Black Swan logic, it often happens that what we do not know becomes more relevant than what we (believe to) know. The management of extreme risks falls under this paradigm in the sense that it cannot be limited to a static approach based only on objective and easily quantifiable variables. Making use of the operational tools developed primarily for the insurance industry, the present paper aims to investigate how dynamic financial analysis (DFA) can be used within the framework of extreme risk events.

  15. Modeling of Ship Collision Risk Index Based on Complex Plane and Its Realization

    OpenAIRE

    Xiaoqin Xu; Xiaoqiao Geng; Yuanqiao Wen

    2016-01-01

    Ship collision risk index is a basic and important concept in the domain of ship collision avoidance. In this paper, the advantages and deficiencies of various calculation methods for the ship collision risk index are pointed out. Then a ship collision risk model based on the complex plane, which can well make up for the deficiencies of the widely-used evaluation models proposed by Kearon J. and Liu Ruru, is proposed. On this basis, the calculation method of the collision risk index under the encounter...

  16. THE FLOOD RISK IN THE LOWER GIANH RIVER: MODELLING AND FIELD VERIFICATION

    Directory of Open Access Journals (Sweden)

    NGUYEN H. D.

    2016-03-01

    Problems associated with flood risk definitely represent a highly topical issue in Vietnam. The case of the lower Gianh River in the central area of Vietnam, with a watershed area of 353 km2, is particularly interesting. In this area, periodically subject to flood risk, the scientific question is strongly linked to risk management. In addition, flood risk is the consequence of the hydrological hazard of an event and the damages related to this event. For this reason, our approach is based on hydrodynamic modelling using Mike Flood to simulate the runoff during a flood event. Unfortunately the data in the studied area are quite limited. Our computation of the flood risk is based on a three-step modelling process, using rainfall data coming from 8 stations, cross sections, the topographic map and the land-use map. The first step consists of creating a 1-D model using Mike 11, in order to simulate the runoff in the minor river bed. In the second step, we use Mike 21 to create a 2-D model to simulate the runoff in the flood plain. The last step allows us to couple the two models in order to precisely describe the variables for the hazard analysis in the flood plain (the water level, the speed, the extent of the flooding). Moreover the model is calibrated and verified using observational data of the water level at hydrologic stations and field control data (on the one hand flood height measurements, on the other hand interviews with the community and with the local councillors). We then generate GIS maps in order to improve flood hazard management, which allows us to create flood hazard maps by coupling the flood plain map and the runoff speed map. Our results show that the flood peak, caused by typhoon Nari, reached more than 6 m on October 16th 2013 at 4 p.m. (the flooded area extended over 149 km²), and that the typhoon constituted an extreme flood hazard for 11.39% of the area, a very high hazard for 10.60%, a high hazard for 30.79%, a medium hazard for 31.91% and a light flood hazard for 15...

  17. Bisphosphonates and risk of cardiovascular events: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Dae Hyun Kim

    Full Text Available Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV events, atrial fibrillation, myocardial infarction (MI, stroke, and CV death in adults with or at risk for low bone mass.A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials with longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs and 95% confidence intervals (CIs of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed.Absolute risks over 25-36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84-1.14]; I2 = 0.0%, atrial fibrillation (41 trials; 1.08 [0.92-1.25]; I2 = 0.0%, MI (10 trials; 0.96 [0.69-1.34]; I2 = 0.0%, stroke (10 trials; 0.99 [0.82-1.19]; I2 = 5.8%, and CV death (14 trials; 0.88 [0.72-1.07]; I2 = 0.0% with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96-1.61]; I2 = 0.0%, not for oral bisphosphonates (26 trials; 1.02 [0.83-1.24]; I2 = 0.0%. The CV effects did not vary by subgroups or study quality.Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large reduction in fractures with bisphosphonates, changes in

  18. Quantifying and comparing dynamic predictive accuracy of joint models for longitudinal marker and time-to-event in presence of censoring and competing risks.

    Science.gov (United States)

    Blanche, Paul; Proust-Lima, Cécile; Loubère, Lucie; Berr, Claudine; Dartigues, Jean-François; Jacqmin-Gadda, Hélène

    2015-03-01

    Thanks to the growing interest in personalized medicine, joint modeling of longitudinal marker and time-to-event data has recently started to be used to derive dynamic individual risk predictions. Individual predictions are called dynamic because they are updated when information on the subject's health profile grows with time. We focus in this work on statistical methods for quantifying and comparing dynamic predictive accuracy of this kind of prognostic models, accounting for right censoring and possibly competing events. Dynamic area under the ROC curve (AUC) and Brier Score (BS) are used to quantify predictive accuracy. Nonparametric inverse probability of censoring weighting is used to estimate dynamic curves of AUC and BS as functions of the time at which predictions are made. Asymptotic results are established and both pointwise confidence intervals and simultaneous confidence bands are derived. Tests are also proposed to compare the dynamic prediction accuracy curves of two prognostic models. The finite sample behavior of the inference procedures is assessed via simulations. We apply the proposed methodology to compare various prediction models using repeated measures of two psychometric tests to predict dementia in the elderly, accounting for the competing risk of death. Models are estimated on the French Paquid cohort and predictive accuracies are evaluated and compared on the French Three-City cohort. © 2014, The International Biometric Society.
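
    The central estimator, an IPCW-weighted dynamic AUC at landmark time s with prediction horizon t, can be sketched compactly. The version below ignores competing risks and takes the censoring survival function G as given (in practice it is estimated by Kaplan-Meier on the censoring distribution):

    ```python
    import numpy as np

    def ipcw_dynamic_auc(time, event, marker, s, t, G):
        """Schematic IPCW estimate of the dynamic AUC(s, t): how well the
        marker separates cases (event in (s, s+t]) from controls (event-free
        at s+t) among subjects still at risk at s."""
        at_risk = time > s
        cases = at_risk & (time <= s + t) & (event == 1)
        ctrls = at_risk & (time > s + t)
        w_case = 1.0 / G(time[cases])                  # reweight observed cases
        w_ctrl = np.full(ctrls.sum(), 1.0 / G(s + t))  # common control weight
        m_case, m_ctrl = marker[cases], marker[ctrls]
        num = den = 0.0
        for mi, wi in zip(m_case, w_case):
            conc = (mi > m_ctrl) + 0.5 * (mi == m_ctrl)  # concordance, ties 0.5
            num += wi * np.sum(w_ctrl * conc)
            den += wi * np.sum(w_ctrl)
        return num / den

    # Toy example: higher marker means earlier event; no censoring, so G is 1.
    rng = np.random.default_rng(0)
    marker = rng.normal(size=500)
    time = rng.exponential(scale=np.exp(-marker))
    event = np.ones(500, dtype=int)
    G = np.vectorize(lambda u: 1.0)
    print(ipcw_dynamic_auc(time, event, marker, s=0.0, t=1.0, G=G))
    ```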

  19. Stochastic modeling of central apnea events in preterm infants

    International Nuclear Information System (INIS)

    Clark, Matthew T; Lake, Douglas E; Randall Moorman, J; Delos, John B; Lee, Hoshik; Fairchild, Karen D; Kattwinkel, John

    2016-01-01

    A near-ubiquitous pathology in very low birth weight infants is neonatal apnea, breathing pauses with slowing of the heart and falling blood oxygen. Events of substantial duration occasionally occur after an infant is discharged from the neonatal intensive care unit (NICU). It is not known whether apneas result from a predictable process or from a stochastic process, but the observation that they occur in seemingly random clusters justifies the use of stochastic models. We use a hidden-Markov model to analyze the distribution of durations of apneas and the distribution of times between apneas. The model suggests the presence of four breathing states, ranging from very stable (with an average lifetime of 12 h) to very unstable (with an average lifetime of 10 s). Although the states themselves are not visible, the mathematical analysis gives estimates of the transition rates among these states. We have obtained these transition rates, and shown how they change with post-menstrual age; as expected, the residence time in the more stable breathing states increases with age. We also extrapolated the model to predict the frequency of very prolonged apnea during the first year of life. This paradigm—stochastic modeling of cardiorespiratory control in neonatal infants to estimate risk for severe clinical events—may be a first step toward personalized risk assessment for life threatening apnea events after NICU discharge. (paper)
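
    The fitted structure, hidden breathing states with exponential lifetimes and jumps between them, is a continuous-time Markov chain. A simulation sketch with assumed mean lifetimes echoing the range quoted above and an illustrative transition structure (the paper's actual fitted rates are not given here):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Four hidden breathing states with assumed mean lifetimes (seconds),
    # from very stable (~12 h) to very unstable (~10 s).
    mean_life = np.array([12 * 3600.0, 3600.0, 120.0, 10.0])
    # Probability of moving to each other state on leaving state i
    # (rows sum to 1; an illustrative embedded chain).
    jump = np.array([[0.0, 1.0, 0.0, 0.0],
                     [0.5, 0.0, 0.5, 0.0],
                     [0.0, 0.5, 0.0, 0.5],
                     [0.0, 0.0, 1.0, 0.0]])

    def simulate(horizon_s, state=0):
        """Simulate the hidden state path of a continuous-time Markov chain:
        exponential holding times, then a jump from the embedded chain."""
        t, path = 0.0, []
        while t < horizon_s:
            dwell = rng.exponential(mean_life[state])
            path.append((t, state, dwell))
            t += dwell
            state = rng.choice(4, p=jump[state])
        return path

    # Fraction of a 24 h record spent in the most unstable (apnea-prone) state.
    path = simulate(24 * 3600.0)
    unstable = sum(dwell for (_, st, dwell) in path if st == 3)
    print(f"unstable fraction: {unstable / (24 * 3600.0):.4f}")
    ```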

  20. Empagliflozin and Cerebrovascular Events in Patients With Type 2 Diabetes Mellitus at High Cardiovascular Risk.

    Science.gov (United States)

    Zinman, Bernard; Inzucchi, Silvio E; Lachin, John M; Wanner, Christoph; Fitchett, David; Kohler, Sven; Mattheus, Michaela; Woerle, Hans J; Broedl, Uli C; Johansen, Odd Erik; Albers, Gregory W; Diener, Hans Christoph

    2017-05-01

    In the EMPA-REG OUTCOME trial (Empagliflozin Cardiovascular Outcome Event Trial in Type 2 Diabetes Mellitus Patients), empagliflozin added to standard of care in patients with type 2 diabetes mellitus and high cardiovascular risk reduced the risk of 3-point major adverse cardiovascular events, driven by a reduction in cardiovascular mortality, with no significant difference between empagliflozin and placebo in risk of myocardial infarction or stroke. In a modified intent-to-treat analysis, the hazard ratio for stroke was 1.18 (95% confidence interval, 0.89-1.56; P=0.26). We further investigated cerebrovascular events. Patients were randomized to empagliflozin 10 mg, empagliflozin 25 mg, or placebo; 7020 patients were treated. Median observation time was 3.1 years. The numeric difference in stroke between empagliflozin and placebo in the modified intent-to-treat analysis was primarily because of 18 patients in the empagliflozin group with a first event >90 days after last intake of study drug (versus 3 on placebo). In a sensitivity analysis based on events during treatment or ≤90 days after last dose of drug, the hazard ratio for stroke with empagliflozin versus placebo was 1.08 (95% confidence interval, 0.81-1.45; P=0.60). There were no differences in risk of recurrent, fatal, or disabling strokes, or transient ischemic attack, with empagliflozin versus placebo. Patients with the largest increases in hematocrit or largest decreases in systolic blood pressure did not have an increased risk of stroke. In patients with type 2 diabetes mellitus and high cardiovascular risk, there was no significant difference in the risk of cerebrovascular events with empagliflozin versus placebo. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01131676. © 2017 The Authors.

  1. Risk based surveillance for vector borne diseases

    DEFF Research Database (Denmark)

    Bødker, Rene

    of samples and hence early detection of outbreaks. Models for vector borne diseases in Denmark have demonstrated dramatic variation in outbreak risk during the season and between years. The Danish VetMap project aims to make these risk based surveillance estimates available on the veterinarians' smart phones... in Northern Europe. This model approach may be used as a basis for risk based surveillance. In risk based surveillance, limited resources for surveillance are targeted at geographical areas most at risk and only when the risk is high. This makes risk based surveillance a cost effective alternative... sample to a diagnostic laboratory. Risk based surveillance models may reduce this delay. An important feature of risk based surveillance models is their ability to continuously communicate the level of risk to veterinarians and hence increase awareness when risk is high. This is essential for submission

  2. Causal Loop-based Modeling on System Dynamics for Risk Communication

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Ju [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kang, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2009-10-15

    It is true that a national policy should be based on public confidence, analyzing the public's recognition of and attitude toward life safety, since the public has very special risk perception characteristics. For achieving effective public consensus regarding a national policy such as nuclear power, we have to utilize a risk communication (hereafter called RiCom) process. However, domestic research models on the RiCom process do not provide a practical guideline, because most of them are still superficial and stick to an administrative aspect. Also, most current models have no record of verification and validation for effective application to diverse stakeholders. This study focuses on the public's dynamic mechanism through modeling on system dynamics, basically utilizing the causal loop diagram (CLD) and stock-flow diagram (SFD), which are regarded as critical techniques for decision making in many industrial RiCom models.

  3. Causal Loop-based Modeling on System Dynamics for Risk Communication

    International Nuclear Information System (INIS)

    Lee, Chang Ju; Kang, Kyung Min

    2009-01-01

    It is true that a national policy should be based on public confidence, analyzing the public's recognition of and attitude toward life safety, since the public has very special risk perception characteristics. For achieving effective public consensus regarding a national policy such as nuclear power, we have to utilize a risk communication (hereafter called RiCom) process. However, domestic research models on the RiCom process do not provide a practical guideline, because most of them are still superficial and stick to an administrative aspect. Also, most current models have no record of verification and validation for effective application to diverse stakeholders. This study focuses on the public's dynamic mechanism through modeling on system dynamics, basically utilizing the causal loop diagram (CLD) and stock-flow diagram (SFD), which are regarded as critical techniques for decision making in many industrial RiCom models

  4. Are markers of inflammation more strongly associated with risk for fatal than for nonfatal vascular events?

    Directory of Open Access Journals (Sweden)

    Naveed Sattar

    2009-06-01

    Full Text Available BACKGROUND: Circulating inflammatory markers may more strongly relate to risk of fatal versus nonfatal cardiovascular disease (CVD events, but robust prospective evidence is lacking. We tested whether interleukin (IL-6, C-reactive protein (CRP, and fibrinogen more strongly associate with fatal compared to nonfatal myocardial infarction (MI and stroke. METHODS AND FINDINGS: In the Prospective Study of Pravastatin in the Elderly at Risk (PROSPER, baseline inflammatory markers in up to 5,680 men and women aged 70-82 y were related to risk for endpoints; nonfatal CVD (i.e., nonfatal MI and nonfatal stroke [n = 672], fatal CVD (n = 190, death from other CV causes (n = 38, and non-CVD mortality (n = 300, over 3.2-y follow-up. Elevations in baseline IL-6 levels were significantly (p = 0.0009; competing risks model analysis more strongly associated with fatal CVD (hazard ratio [HR] for 1 log unit increase in IL-6 1.75, 95% confidence interval [CI] 1.44-2.12 than with risk of nonfatal CVD (1.17, 95% CI 1.04-1.31, in analyses adjusted for treatment allocation. The findings were consistent in a fully adjusted model. These broad trends were similar for CRP and, to a lesser extent, for fibrinogen. The results were also similar in placebo and statin recipients (i.e., no interaction. The C-statistic for fatal CVD using traditional risk factors was significantly (+0.017; p<0.0001 improved by inclusion of IL-6 but not so for nonfatal CVD events (p = 0.20. CONCLUSIONS: In PROSPER, inflammatory markers, in particular IL-6 and CRP, are more strongly associated with risk of fatal vascular events than nonfatal vascular events. These novel observations may have important implications for better understanding aetiology of CVD mortality, and have potential clinical relevance.

  5. Life cycle cost-based risk model for energy performance contracting retrofits

    Science.gov (United States)

    Berghorn, George H.

    Buildings account for 41% of the primary energy consumption in the United States, nearly half of which is accounted for by commercial buildings. Among the greatest energy users are those in the municipalities, universities, schools, and hospitals (MUSH) market. Correctional facilities are in the upper half of all commercial building types for energy intensity. Public agencies have experienced reduced capital budgets to fund retrofits; this has led to the increased use of energy performance contracts (EPC), which are implemented by energy services companies (ESCOs). These companies guarantee a minimum amount of energy savings resulting from the retrofit activities, which in essence transfers performance risk from the owner to the contractor. Building retrofits in the MUSH market, especially correctional facilities, are well-suited to EPC, yet despite this potential and their high energy intensities, efficiency improvements lag behind that of other public building types. Complexities in project execution, lack of support for data requests and sub-metering, and conflicting project objectives have been cited as reasons for this lag effect. As a result, project-level risks must be understood in order to support wider adoption of retrofits in the public market, in particular the correctional facility sub-market. The goal of this research is to understand risks related to the execution of energy efficiency retrofits delivered via EPC in the MUSH market. To achieve this goal, in-depth analysis and improved understanding was sought with regard to ESCO risks that are unique to EPC in this market. The proposed work contributes to this understanding by developing a life cycle cost-based risk model to improve project decision making with regard to risk control and reduction. The specific objectives of the research are: (1) to perform an exploratory analysis of the EPC retrofit process and identify key areas of performance risk requiring in-depth analysis; (2) to construct a

  6. People's Risk Recognition Preceding Evacuation and Its Role in Demand Modeling and Planning.

    Science.gov (United States)

    Urata, Junji; Pel, Adam J

    2018-05-01

    Evacuation planning and management involves estimating the travel demand in the event that such action is required. This is usually done as a function of people's decision to evacuate, which we show is strongly linked to their risk awareness. We use an empirical data set, which shows tsunami evacuation behavior, to demonstrate that risk recognition is not synonymous with objective risk, but is instead determined by a combination of factors including risk education, information, and sociodemographics, and that it changes dynamically over time. Based on these findings, we formulate an ordered logit model to describe risk recognition combined with a latent class model to describe evacuation choices. Our proposed evacuation choice model along with a risk recognition class can evaluate quantitatively the influence of disaster mitigation measures, risk education, and risk information. The results obtained from the risk recognition model show that risk information has a greater impact in the sense that people recognize their high risk. The results of the evacuation choice model show that people who are unaware of their risk take a longer time to evacuate. © 2017 Society for Risk Analysis.
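
    A rough sketch of the two-part idea above, fit on synthetic data: an ordered logit for a three-level risk-recognition outcome. The predictor names and data are hypothetical stand-ins for the study's survey variables, and the statsmodels OrderedModel call is one common way to estimate such a model, not the authors' code.

```python
# Ordered logit for a synthetic 3-level risk-recognition outcome.
# All variables are hypothetical illustrations, not the study's data.
import numpy as np
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
risk_education = rng.integers(0, 2, n)     # received disaster education?
warning_received = rng.integers(0, 2, n)   # heard official risk information?
latent = 0.8 * risk_education + 1.2 * warning_received + rng.logistic(size=n)
risk_recognition = np.digitize(latent, bins=[0.5, 1.5])  # 0=low, 1=med, 2=high

X = np.column_stack([risk_education, warning_received])
res = OrderedModel(risk_recognition, X, distr="logit").fit(method="bfgs", disp=False)
print(res.params)  # two slopes followed by the two threshold parameters
```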

  7. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    Directory of Open Access Journals (Sweden)

    Wooyeon Sunwoo

    2017-01-01

    Full Text Available Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall event. Soil moisture is one of the most important variables in rainfall–runoff modeling, and remotely sensed soil moisture is recognized as an effective way to improve the accuracy of runoff prediction. In this study, the IWC was evaluated based on remotely sensed soil moisture by using the Soil Conservation Service-Curve Number (SCS-CN) method, which is one of the representative event-based models used for reducing the uncertainty of runoff prediction. Four proxy variables for the IWC were determined from the measurements of total rainfall depth (API5), ground-based soil moisture (SSMinsitu), remotely sensed surface soil moisture (SSM), and the soil water index (SWI) provided by the advanced scatterometer (ASCAT). To obtain a robust IWC framework, this study consists of two main parts: the validation of remotely sensed soil moisture, and the evaluation of runoff prediction using the four proxy variables with a set of rainfall–runoff events in the East Asian monsoon region. The results showed an acceptable agreement between remotely sensed soil moisture (SSM and SWI) and ground-based soil moisture data (SSMinsitu). In the proxy variable analysis, the SWI indicated the optimal value among the proposed proxy variables. In the runoff prediction analysis considering various infiltration conditions, the SSM and SWI proxy variables significantly reduced the runoff prediction error as compared with API5, by 60% and 66%, respectively. Moreover, the proposed IWC framework with
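
    The record above centers on the standard SCS-CN event runoff relation, Q = (P - Ia)^2 / (P - Ia + S) with S = 25400/CN - 254 (mm) and Ia = 0.2 S. A minimal sketch follows; conditioning the curve number on a remotely sensed wetness proxy such as the ASCAT SWI is shown only schematically, with assumed thresholds and the textbook AMC I/III conversions rather than the paper's calibrated framework.

```python
# Standard SCS-CN event runoff, with an illustrative wetness adjustment.
def scs_cn_runoff(p_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
    """Direct runoff depth (mm) for event rainfall p_mm and curve number cn."""
    s = 25400.0 / cn - 254.0       # potential maximum retention (mm)
    ia = ia_ratio * s              # initial abstraction (mm)
    return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm - ia + s)

def cn_for_wetness(cn_avg: float, swi: float) -> float:
    """Shift an average-condition CN using a [0, 1] wetness proxy (assumed cuts)."""
    if swi < 0.3:   # dry antecedent condition -> AMC I conversion
        return cn_avg / (2.281 - 0.01281 * cn_avg)
    if swi > 0.7:   # wet antecedent condition -> AMC III conversion
        return cn_avg / (0.427 + 0.00573 * cn_avg)
    return cn_avg

print(scs_cn_runoff(60.0, cn_for_wetness(75.0, swi=0.8)))  # wet-catchment event
```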

  8. Markov modeling and discrete event simulation in health care: a systematic comparison.

    Science.gov (United States)

    Standfield, Lachlan; Comans, Tracy; Scuffham, Paul

    2014-04-01

    The aim of this study was to assess if the use of Markov modeling (MM) or discrete event simulation (DES) for cost-effectiveness analysis (CEA) may alter healthcare resource allocation decisions. A systematic literature search and review of empirical and non-empirical studies comparing MM and DES techniques used in the CEA of healthcare technologies was conducted. Twenty-two pertinent publications were identified. Two publications compared MM and DES models empirically, one presented a conceptual DES and MM, two described a DES consensus guideline, and seventeen drew comparisons between MM and DES through the authors' experience. The primary advantages described for DES over MM were the ability to model queuing for limited resources, capture individual patient histories, accommodate complexity and uncertainty, represent time flexibly, model competing risks, and accommodate multiple events simultaneously. The disadvantages of DES over MM were the potential for model overspecification, increased data requirements, specialized expensive software, and increased model development, validation, and computational time. Where individual patient history is an important driver of future events an individual patient simulation technique like DES may be preferred over MM. Where supply shortages, subsequent queuing, and diversion of patients through other pathways in the healthcare system are likely to be drivers of cost-effectiveness, DES modeling methods may provide decision makers with more accurate information on which to base resource allocation decisions. Where these are not major features of the cost-effectiveness question, MM remains an efficient, easily validated, parsimonious, and accurate method of determining the cost-effectiveness of new healthcare interventions.
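
    For contrast with the review's narrative, a minimal cohort-style Markov model is sketched below with placeholder transition probabilities, costs, and utilities; a DES would instead simulate individual patients, resource queues, and event times.

```python
# Three-state Markov cohort model (healthy / sick / dead); all numbers
# are illustrative placeholders, not values from the reviewed studies.
import numpy as np

P = np.array([[0.90, 0.08, 0.02],   # from healthy
              [0.00, 0.85, 0.15],   # from sick
              [0.00, 0.00, 1.00]])  # dead (absorbing)
cost_per_cycle = np.array([100.0, 1500.0, 0.0])
qaly_per_cycle = np.array([0.95, 0.60, 0.0])

state = np.array([1.0, 0.0, 0.0])   # whole cohort starts healthy
total_cost = total_qaly = 0.0
for _ in range(40):                 # 40 yearly cycles
    total_cost += state @ cost_per_cycle
    total_qaly += state @ qaly_per_cycle
    state = state @ P               # redistribute the cohort each cycle
print(f"cost = {total_cost:.0f}, QALYs = {total_qaly:.2f}")
```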

  9. Methodological Bases for Describing Risks of the Enterprise Business Model in Integrated Reporting

    Directory of Open Access Journals (Sweden)

    Nesterenko Oksana O.

    2017-12-01

    Full Text Available The aim of the article is to substantiate the methodological bases for describing the business and accounting risks of an enterprise business model in integrated reporting for their timely detection and assessment, and to develop methods for their leveling or minimization and possible prevention. It is proposed to consider risks in the process of forming integrated reporting from two sides: first, risks that arise in the business model of an organization and should be disclosed in its integrated report; second, accounting risks of integrated reporting, which should be taken into account by members of the cross-sectoral working group and management personnel in the process of forming and promulgating integrated reporting. To develop an adequate accounting and analytical tool for disclosure of information about the risks of the business model and integrated reporting, their leveling or minimization, a terminological analysis of the essence of entrepreneurial and accounting risks is carried out in the article. The entrepreneurial risk is defined as an objective-subjective economic category that characterizes the probability of negative or positive consequences of economic-social-ecological activity within the framework of the business model of an enterprise under uncertainty. The accounting risk is suggested to be understood as the probability of unfavorable consequences as a result of organizational and methodological errors in the integrated accounting system, which pose a threat to the quality, accuracy, and reliability of the reporting information on economic, social, and environmental activities in integrated reporting, as well as a threat of inappropriate decision-making by stakeholders based on the integrated report. For the timely identification of business risks and maximum leveling of the influence of accounting risks on the process of formation and publication of integrated reporting, in the study the place of entrepreneurial and accounting risks in

  10. Towards a whole-network risk assessment for railway bridge failures caused by scour during flood events

    Directory of Open Access Journals (Sweden)

    Lamb Rob

    2016-01-01

    Full Text Available Localised erosion (scour) during flood flow conditions can lead to costly damage or catastrophic failure of bridges, and in some cases loss of life or significant disruption to transport networks. Here, we take a broad scale view to assess risk associated with bridge scour during flood events over an entire infrastructure network, illustrating the analysis with data from the British railways. There have been 54 recorded events since 1846 in which scour led to the failure of railway bridges in Britain. These events tended to occur during periods of extremely high river flow, although there is uncertainty about the precise conditions under which failures occur, which motivates a probabilistic analysis of the failure events. We show how data from the historical bridge failures, combined with hydrological analysis, have been used to construct fragility curves that quantify the conditional probability of bridge failure as a function of river flow, accompanied by estimates of the associated uncertainty. The new fragility analysis is tested using flood events simulated from a national, spatial joint probability model for extremes in river flows. The combined models appear robust in comparison with historical observations of the expected number of bridge failures in a flood event, and provide an empirical basis for further broad-scale network risk analysis.
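
    A minimal sketch of the fragility-curve idea, assuming a lognormal-CDF fragility form fit by maximum likelihood to binary failure outcomes; the flows and outcomes below are invented, not the historical British railway data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

flows = np.array([50, 80, 120, 150, 200, 260, 300, 400.0])  # peak flow, m^3/s
failed = np.array([0, 0, 0, 1, 0, 1, 1, 1])                 # scour failure?

def neg_log_lik(theta):
    mu, sigma = theta
    p = norm.cdf((np.log(flows) - mu) / sigma)   # lognormal fragility curve
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(failed * np.log(p) + (1 - failed) * np.log(1 - p))

mu, sigma = minimize(neg_log_lik, x0=[np.log(200), 0.5], method="Nelder-Mead").x
print("P(failure | 350 m^3/s) =", norm.cdf((np.log(350) - mu) / sigma))
```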

  11. Evaluating the Risk of Metabolic Syndrome Based on an Artificial Intelligence Model

    Directory of Open Access Journals (Sweden)

    Hui Chen

    2014-01-01

    Full Text Available Metabolic syndrome is a worldwide public health problem and a serious threat to people's health and lives. Understanding the relationship between metabolic syndrome and physical symptoms is a difficult and challenging task, and few studies have been performed in this field. It is important to classify adults who are at high risk of metabolic syndrome without having to use a biochemical index and, likewise, to develop technology with a high economic rate of return that simplifies this detection. In this paper, an artificial intelligence model was developed to identify adults at risk of metabolic syndrome based on physical signs; this artificial intelligence model achieved more powerful classification capacity than the PCLR (principal component logistic regression) model. A case study was performed based on physical signs data, without using a biochemical index, collected from the staff of Lanzhou Grid Company in Gansu province of China. The results show that the developed artificial intelligence model is an effective classification system for identifying individuals at high risk of metabolic syndrome.

  12. Multi-day activity scheduling reactions to planned activities and future events in a dynamic agent-based model of activity-travel behavior

    NARCIS (Netherlands)

    Nijland, E.W.L.; Arentze, T.A.; Timmermans, H.J.P.

    2009-01-01

    Modeling multi-day planning has received scant attention to date in activity-based transport demand modeling. Elaborating and combining previous work on event-driven activity generation, the aim of this paper is to develop and illustrate an extension of a need-based model of activity generation that

  13. Quantifying and estimating the predictive accuracy for censored time-to-event data with competing risks.

    Science.gov (United States)

    Wu, Cai; Li, Liang

    2018-05-15

    This paper focuses on quantifying and estimating the predictive accuracy of prognostic models for time-to-event outcomes with competing events. We consider time-dependent discrimination and calibration metrics, including the receiver operating characteristic curve and the Brier score, in the context of competing risks. To address censoring, we propose a unified nonparametric estimation framework for both discrimination and calibration measures, by weighting the censored subjects with the conditional probability of the event of interest given the observed data. The proposed method can be extended to time-dependent predictive accuracy metrics constructed from a general class of loss functions. We apply the methodology to a data set from the African American Study of Kidney Disease and Hypertension to evaluate the predictive accuracy of a prognostic risk score in predicting end-stage renal disease, accounting for the competing risk of pre-end-stage renal disease death, and evaluate its numerical performance in extensive simulation studies. Copyright © 2018 John Wiley & Sons, Ltd.
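
    The censoring-weighting idea can be sketched for the Brier score at a horizon t: subjects censored before t get zero weight, and the rest are inversely weighted by a Kaplan-Meier estimate of the censoring survival function. This is a generic IPCW construction under assumed independent censoring, not the authors' estimator code.

```python
import numpy as np

def censoring_km(times, any_event):
    """Kaplan-Meier step function for censoring survival G(t); censorings
    (any_event == 0) are treated as the 'events' of this fit."""
    uniq, steps, g, at_risk = np.unique(times), [], 1.0, len(times)
    for u in uniq:
        mask = times == u
        g *= 1.0 - np.sum(any_event[mask] == 0) / at_risk
        at_risk -= mask.sum()
        steps.append((u, g))
    def G(x):
        val = 1.0
        for u, gv in steps:
            if u <= x:
                val = gv
        return val
    return G

def ipcw_brier(times, cause, pred_risk, t):
    """cause: 0 = censored, 1 = event of interest, 2 = competing event.
    pred_risk: predicted P(cause-1 event by time t) for each subject."""
    G = censoring_km(times, (cause > 0).astype(int))
    y = ((times <= t) & (cause == 1)).astype(float)
    w = np.zeros(len(times))
    had_event = (times <= t) & (cause > 0)
    w[had_event] = 1.0 / np.array([G(ti - 1e-9) for ti in times[had_event]])
    w[times > t] = 1.0 / G(t)
    return np.mean(w * (y - pred_risk) ** 2)

times = np.array([2.0, 5.0, 3.0, 8.0, 6.0, 4.0])
cause = np.array([1, 0, 2, 1, 0, 1])
pred = np.array([0.8, 0.2, 0.3, 0.6, 0.1, 0.7])
print(ipcw_brier(times, cause, pred, t=5.0))
```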

  14. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable was below recommended levels in several models. Only 2 models presented recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. In summary, there are numerous HF risk prediction models with sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Atherosclerosis profile and incidence of cardiovascular events: a population-based survey

    Directory of Open Access Journals (Sweden)

    Bullano Michael F

    2009-09-01

    Full Text Available Abstract. Background: Atherosclerosis is a chronic progressive disease often presenting as clinical cardiovascular disease (CVD) events. This study evaluated the characteristics of individuals with a diagnosis of atherosclerosis and estimated the incidence of CVD events to assist in the early identification of high-risk individuals. Methods: Respondents to the US SHIELD baseline survey were followed for 2 years to observe incident self-reported CVD. Respondents had subclinical atherosclerosis if they reported a diagnosis of narrow or blocked arteries/carotid artery disease without a past clinical CVD event (heart attack, stroke, or revascularization). Characteristics of those with atherosclerosis and incident CVD were compared with those who did not report atherosclerosis at baseline but had CVD in the following 2 years using chi-square tests. A logistic regression model identified characteristics associated with atherosclerosis and incident events. Results: Of 17,640 respondents, 488 (2.8%) reported having subclinical atherosclerosis at baseline. Subclinical atherosclerosis was associated with age, male gender, dyslipidemia, circulation problems, hypertension, past smoking, and a cholesterol test in the past year (OR up to 2.2; all p values significant). Conclusion: Self-report of subclinical atherosclerosis identified an extremely high-risk group with a >25% risk of a CVD event in the next 2 years. These characteristics may be useful for identifying individuals for more aggressive diagnostic and therapeutic efforts.

  16. Model-free and model-based reward prediction errors in EEG.

    Science.gov (United States)

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.
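
    The two error signals can be made concrete in a toy simulation: a model-free learner updates action values directly from reward, while a model-based learner derives its expectation from a transition model and state rewards. Parameters and task structure below are illustrative only, not the study's EEG task.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 0.1                           # learning rate
q_mf = np.zeros(2)                    # model-free action values
T = np.array([[0.7, 0.3],             # known transition model:
              [0.3, 0.7]])            # P(second-stage state | action)
r_state = np.zeros(2)                 # learned reward per second-stage state

for trial in range(1000):
    a = rng.integers(2)
    s2 = rng.choice(2, p=T[a])                       # generative transition
    r = rng.normal(loc=(0.2, 0.8)[s2], scale=0.1)    # state-dependent reward
    rpe_mf = r - q_mf[a]                             # model-free RPE
    rpe_mb = r - T[a] @ r_state                      # model-based RPE
    q_mf[a] += alpha * rpe_mf
    r_state[s2] += alpha * (r - r_state[s2])
print(q_mf, T @ r_state)   # both estimates converge toward T @ [0.2, 0.8]
```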

  17. Radiologically isolated syndrome: 5-year risk for an initial clinical event.

    Directory of Open Access Journals (Sweden)

    Darin T Okuda

    Full Text Available OBJECTIVE: To report the 5-year risk and to identify risk factors for the development of a seminal acute or progressive clinical event in a multi-national cohort of asymptomatic subjects meeting 2009 RIS Criteria. METHODS: Retrospectively identified RIS subjects from 22 databases within 5 countries were evaluated. Time to the first clinical event related to demyelination (acute or 12-month progression of neurological deficits) was compared across different groups by univariate and multivariate analyses utilizing a Cox regression model. RESULTS: Data were available in 451 RIS subjects (F: 354 [78.5%]). The mean age at the time of the first brain MRI revealing anomalies suggestive of MS was 37.2 years (y) (median: 37.1 y, range: 11-74 y), with a mean clinical follow-up time of 4.4 y (median: 2.8 y, range: 0.01-21.1 y). Clinical events were identified in 34% (standard error = 3%) of individuals within a 5-year period from the first brain MRI study. Of those who developed symptoms, 9.6% fulfilled criteria for primary progressive MS. In the multivariate model, age [hazard ratio (HR): 0.98 (95% CI: 0.96-0.99); p = 0.03], sex (male) [HR: 1.93 (1.24-2.99); p = 0.004], and lesions within the cervical or thoracic spinal cord [HR: 3.08 (2.06-4.62); p < 0.001] were identified as significant predictors for the development of a first clinical event. INTERPRETATION: These data provide supportive evidence that a meaningful number of RIS subjects evolve to a first clinical symptom. An age <37 y, male sex, and spinal cord involvement appear to be the most important independent predictors of symptom onset.

  18. Distribution of Estimated 10-Year Risk of Recurrent Vascular Events and Residual Risk in a Secondary Prevention Population

    NARCIS (Netherlands)

    Kaasenbrood, Lotte; Boekholdt, S. Matthijs; van der Graaf, Yolanda; Ray, Kausik K.; Peters, Ron J. G.; Kastelein, John J. P.; Amarenco, Pierre; LaRosa, John C.; Cramer, Maarten J. M.; Westerink, Jan; Kappelle, L. Jaap; de Borst, Gert J.; Visseren, Frank L. J.

    2016-01-01

    Among patients with clinically manifest vascular disease, the risk of recurrent vascular events is likely to vary. We assessed the distribution of estimated 10-year risk of recurrent vascular events in a secondary prevention population. We also estimated the potential risk reduction and residual

  19. Distribution of Estimated 10-Year Risk of Recurrent Vascular Events and Residual Risk in a Secondary Prevention Population

    NARCIS (Netherlands)

    Kaasenbrood, Lotte; Boekholdt, S. Matthijs; Van Der Graaf, Yolanda; Ray, Kausik K.; Peters, Ron J G; Kastelein, John J P; Amarenco, Pierre; Larosa, John C.; Cramer, Maarten J M; Westerink, Jan; Kappelle, L. Jaap; De Borst, Gert J.; Visseren, Frank L J

    2016-01-01

    Background: Among patients with clinically manifest vascular disease, the risk of recurrent vascular events is likely to vary. We assessed the distribution of estimated 10-year risk of recurrent vascular events in a secondary prevention population. We also estimated the potential risk reduction and

  20. Risk, individual differences, and environment: an Agent-Based Modeling approach to sexual risk-taking.

    Science.gov (United States)

    Nagoski, Emily; Janssen, Erick; Lohrmann, David; Nichols, Eric

    2012-08-01

    Risky sexual behaviors, including the decision to have unprotected sex, result from interactions between individuals and their environment. The current study explored the use of Agent-Based Modeling (ABM)-a methodological approach in which computer-generated artificial societies simulate human sexual networks-to assess the influence of heterogeneity of sexual motivation on the risk of contracting HIV. The models successfully simulated some characteristics of human sexual systems, such as the relationship between individual differences in sexual motivation (sexual excitation and inhibition) and sexual risk, but failed to reproduce the scale-free distribution of number of partners observed in the real world. ABM has the potential to inform intervention strategies that target the interaction between an individual and his or her social environment.
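
    A stripped-down sketch of the ABM idea: agents carry heterogeneous excitation and inhibition scores, pair at random, and a propensity score sets the chance of an unprotected (and hence potentially transmitting) contact. All parameters and the logistic link are invented for illustration, not the authors' model specification.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 200
excitation = rng.normal(0, 1, N)
inhibition = rng.normal(0, 1, N)
infected = np.zeros(N, dtype=bool)
infected[rng.choice(N, 5, replace=False)] = True

def p_unprotected(i, j):
    z = 0.5 * (excitation[i] - inhibition[i] + excitation[j] - inhibition[j])
    return 1.0 / (1.0 + np.exp(-z))   # assumed logistic link

for _ in range(5000):
    i, j = rng.choice(N, 2, replace=False)            # random mixing
    if infected[i] != infected[j] and rng.random() < p_unprotected(i, j):
        if rng.random() < 0.1:                        # per-act transmission
            infected[i] = infected[j] = True
print("final prevalence:", infected.mean())
```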

  1. A Vulnerability-Based, Bottom-up Assessment of Future Riverine Flood Risk Using a Modified Peaks-Over-Threshold Approach and a Physically Based Hydrologic Model

    Science.gov (United States)

    Knighton, James; Steinschneider, Scott; Walter, M. Todd

    2017-12-01

    There is a chronic disconnection among purely probabilistic flood frequency analysis of flood hazards, flood risks, and hydrological flood mechanisms, which hampers our ability to assess future flood impacts. We present a vulnerability-based approach to estimating riverine flood risk that accommodates a more direct linkage between decision-relevant metrics of risk and the dominant mechanisms that cause riverine flooding. We adapt the conventional peaks-over-threshold (POT) framework to be used with extreme precipitation from different climate processes and rainfall-runoff-based model output. We quantify the probability that at least one adverse hydrologic threshold, potentially defined by stakeholders, will be exceeded within the next N years. This approach allows us to consider flood risk as the summation of risk from separate atmospheric mechanisms, and supports a more direct mapping between hazards and societal outcomes. We perform this analysis within a bottom-up framework to consider the relevance and consequences of information, with varying levels of credibility, on changes to atmospheric patterns driving extreme precipitation events. We demonstrate our proposed approach using a case study for Fall Creek in Ithaca, NY, USA, where we estimate the risk of stakeholder-defined flood metrics from three dominant mechanisms: summer convection, tropical cyclones, and spring rain and snowmelt. Using downscaled climate projections, we determine how flood risk associated with a subset of mechanisms may change in the future, and the resultant shift to annual flood risk. The flood risk approach we propose can provide powerful new insights into future flood threats.
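
    The core probability bookkeeping is simple: if each mechanism m has an (assumed independent) annual probability p_m of exceeding a stakeholder threshold, the chance of at least one exceedance in N years follows directly, as in this sketch with invented numbers rather than the Fall Creek estimates.

```python
import numpy as np

p_mech = {"summer_convection": 0.020,
          "tropical_cyclone": 0.010,
          "spring_rain_snowmelt": 0.015}   # annual P(threshold exceeded)

p_annual = 1.0 - np.prod([1.0 - p for p in p_mech.values()])
N = 30
p_horizon = 1.0 - (1.0 - p_annual) ** N
print(f"annual: {p_annual:.3f};  at least once in {N} y: {p_horizon:.3f}")
```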

  2. Managing the risk of extreme climate events in Australian major wheat production systems

    Science.gov (United States)

    Luo, Qunying; Trethowan, Richard; Tan, Daniel K. Y.

    2018-06-01

    Extreme climate events (ECEs) such as drought, frost risk and heat stress cause significant economic losses in Australia. The risk posed by ECEs in the wheat production systems of Australia could be better managed through the identification of safe flowering windows (SFW) and optimal time of sowing (TOS) windows. To address this issue, three locations (Narrabri, Roseworthy and Merredin), three cultivars (Suntop and Gregory for Narrabri, Mace for both Roseworthy and Merredin) and 20 TOS at 1-week intervals between 1 April and 12 August for the period from 1957 to 2007 were evaluated using the Agricultural Production System sIMulator (APSIM)-Wheat model. Simulation results show that (1) the average frequency of frost events decreased with TOS from 8 to 0 days (d) across the four cases (the combinations of locations and cultivars), (2) the average frequency of heat stress events increased with TOS across all cases from 0 to 10 d, (3) soil moisture stress (SMS) increased with earlier TOS before reaching a plateau and then slightly decreasing for Suntop and Gregory at Narrabri and Mace at Roseworthy, while SMS increased with TOS for Mace at Merredin from 0.1 to 0.8, (4) Mace at Merredin had the earliest and widest SFW (216-260) while Mace at Roseworthy had the latest SFW (257-280), (5) frost risk and heat stress determine SFW at the wetter sites (i.e. Narrabri and Roseworthy) while frost risk and SMS determine SFW at the drier site (i.e. Merredin) and (6) the optimal TOS windows to maximise wheat yield are 6-20 May, 13-27 May and 15 April at Narrabri, Roseworthy and Merredin, respectively. These findings provide important and specific information for wheat growers about the management of ECE risk on farm. Furthermore, the coupling of the APSIM crop models with state-of-the-art seasonal and intra-seasonal climate forecast information provides an important tool for improved management of the risk of ECEs in economically important cropping industries in the foreseeable future.

  3. Evolution of risk assessment strategies for food and feed uses of stacked GM events.

    Science.gov (United States)

    Kramer, Catherine; Brune, Phil; McDonald, Justin; Nesbitt, Monique; Sauve, Alaina; Storck-Weyhermueller, Sabine

    2016-09-01

    Data requirements are not harmonized globally for the regulation of food and feed derived from stacked genetically modified (GM) events, produced by combining individual GM events through conventional breeding. The data required by some regulatory agencies have increased despite the absence of substantiated adverse effects to animals or humans from the consumption of GM crops. Data from studies conducted over a 15-year period for several stacked GM event maize (Zea mays L.) products (Bt11 ×  GA21, Bt11 ×  MIR604, MIR604 ×  GA21, Bt11 ×  MIR604 ×  GA21, Bt11 ×  MIR162 ×  GA21 and Bt11 ×  MIR604 ×  MIR162 ×  GA21), together with their component single events, are presented. These data provide evidence that no substantial changes in composition, protein expression or insert stability have occurred after combining the single events through conventional breeding. An alternative food and feed risk assessment strategy for stacked GM events is suggested based on a problem formulation approach that utilizes (i) the outcome of the single event risk assessments, and (ii) the potential for interactions in the stack, based on an understanding of the mode of action of the transgenes and their products. © 2016 The Authors. Plant Biotechnology Journal published by Society for Experimental Biology and The Association of Applied Biologists and John Wiley & Sons Ltd.

  4. Aspirin and the risk of cardiovascular events in atherosclerosis patients with and without prior ischemic events.

    Science.gov (United States)

    Bavry, Anthony A; Elgendy, Islam Y; Elbez, Yedid; Mahmoud, Ahmed N; Sorbets, Emmanuel; Steg, Philippe Gabriel; Bhatt, Deepak L

    2017-09-01

    The benefit of aspirin among patients with stable atherosclerosis without a prior ischemic event is not well defined. We hypothesized that aspirin would be of benefit in outpatients with atherosclerosis and prior ischemic events, but not in those without prior ischemic events. Subjects from the Reduction of Atherothrombosis for Continued Health registry were divided according to prior ischemic event (n = 21,724) vs stable atherosclerosis, but no prior ischemic event (n = 11,872). Analyses were propensity score matched. Aspirin use was updated at each clinic visit and considered as a time-varying covariate. The primary outcome was the first occurrence of cardiovascular death, myocardial infarction, or stroke. In the group with a prior ischemic event, aspirin use was associated with a marginally lower risk of the primary outcome at a median of 41 months (hazard ratio [HR]: 0.81, 95% confidence interval [CI]: 0.65-1.01, P = 0.06). In the group without a prior ischemic event, aspirin use was not associated with a lower risk of the primary outcome at a median of 36 months (HR: 1.03, 95% CI: 0.73-1.45, P = 0.86). In this observational analysis of outpatients with stable atherosclerosis, aspirin was marginally beneficial among patients with a prior ischemic event; however, there was no apparent benefit among those with no prior ischemic event. © 2017 Wiley Periodicals, Inc.

  5. Predictive event modelling in multicenter clinical trials with waiting time to response.

    Science.gov (United States)

    Anisimov, Vladimir V

    2011-01-01

    A new analytic statistical technique for predictive event modeling in ongoing multicenter clinical trials with waiting time to response is developed. It allows for the predictive mean and predictive bounds for the number of events to be constructed over time, accounting for the newly recruited patients and patients already at risk in the trial, and for different recruitment scenarios. For modeling patient recruitment, an advanced Poisson-gamma model is used, which accounts for the variation in recruitment over time, the variation in recruitment rates between different centers and the opening or closing of some centers in the future. A few models for event appearance allowing for 'recurrence', 'death' and 'lost-to-follow-up' events and using finite Markov chains in continuous time are considered. To predict the number of future events over time for an ongoing trial at some interim time, the parameters of the recruitment and event models are estimated using current data and then the predictive recruitment rates in each center are adjusted using individual data and Bayesian re-estimation. For a typical scenario (continue to recruit during some time interval, then stop recruitment and wait until a particular number of events happens), the closed-form expressions for the predictive mean and predictive bounds of the number of events at any future time point are derived under the assumptions of Markovian behavior of the event progression. The technique is efficiently applied to modeling different scenarios for some ongoing oncology trials. Case studies are considered. Copyright © 2011 John Wiley & Sons, Ltd.
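
    The Poisson-gamma structure lends itself to a simple simulation-based approximation of the predictive distribution (the paper derives closed-form expressions): draw centre rates from a gamma prior, recruit by Poisson counts, and thin by an assumed per-patient event probability. All parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n_centres, horizon_months = 40, 12
alpha, beta = 2.0, 1.5        # gamma parameters of centre rates (pts/month)
p_event = 0.3                  # assumed P(event within window | recruited)

sims = []
for _ in range(10_000):
    rates = rng.gamma(alpha, 1.0 / beta, n_centres)
    recruited = rng.poisson(rates * horizon_months).sum()
    sims.append(rng.binomial(recruited, p_event))
sims = np.array(sims)
print("predictive mean:", sims.mean(),
      "| 90% bounds:", np.percentile(sims, [5, 95]))
```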

  6. A Probabilistic Asteroid Impact Risk Model

    Science.gov (United States)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
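
    In the same spirit (though far cruder than the published model), a Monte Carlo sketch samples impactor properties, converts them to impact energy, and maps energy to affected population through placeholder damage and population assumptions; none of the scaling constants below are the PAIR model's.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
diameter = 10 ** rng.uniform(1, 2.5, n)        # m, log-uniform 10-316 m
density = rng.uniform(1500, 3500, n)           # kg/m^3
velocity = rng.uniform(11e3, 30e3, n)          # m/s
mass = density * (np.pi / 6) * diameter ** 3
energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15   # megatons TNT

damage_radius_km = 2.0 * energy_mt ** (1 / 3)       # placeholder scaling law
pop_density = rng.lognormal(1.0, 2.0, n)            # people/km^2, placeholder
affected = pop_density * np.pi * damage_radius_km ** 2
print("mean affected per impact:", affected.mean())
print("P(affected > 1e5):", (affected > 1e5).mean())
```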

  7. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    OpenAIRE

    Wooyeon Sunwoo; Minha Choi

    2017-01-01

    Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall even...

  8. On-line detection of apnea/hypopnea events using SpO2 signal: a rule-based approach employing binary classifier models.

    Science.gov (United States)

    Koley, Bijoy Laxmi; Dey, Debangshu

    2014-01-01

    This paper presents an online method for automatic detection of apnea/hypopnea events, with the help of oxygen saturation (SpO2) signal, measured at fingertip by Bluetooth nocturnal pulse oximeter. Event detection is performed by identifying abnormal data segments from the recorded SpO2 signal, employing a binary classifier model based on a support vector machine (SVM). Thereafter the abnormal segment is further analyzed to detect different states within the segment, i.e., steady, desaturation, and resaturation, with the help of another SVM-based binary ensemble classifier model. Finally, a heuristically obtained rule-based system is used to identify the apnea/hypopnea events from the time-sequenced decisions of these classifier models. In the developmental phase, a set of 34 time domain-based features was extracted from the segmented SpO2 signal using an overlapped windowing technique. Later, an optimal set of features was selected on the basis of recursive feature elimination technique. A total of 34 subjects were included in the study. The results show average event detection accuracies of 96.7% and 93.8% for the offline and the online tests, respectively. The proposed system provides direct estimation of the apnea/hypopnea index with the help of a relatively inexpensive and widely available pulse oximeter. Moreover, the system can be monitored and accessed by physicians through LAN/WAN/Internet and can be extended to deploy in Bluetooth-enabled mobile phones.
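
    The two-stage pipeline can be sketched with toy SpO2 windows: an SVM labels each window from summary features, then a simple rule over consecutive decisions declares an event. The features, window length, and k-consecutive rule are stand-ins for the paper's 34-feature, recursively selected, multi-classifier setup.

```python
import numpy as np
from sklearn.svm import SVC

def window_features(seg):
    return [seg.mean(), seg.std(), seg.min(), np.percentile(seg, 10)]

rng = np.random.default_rng(0)
normal = [rng.normal(97, 0.5, 60) for _ in range(200)]          # steady SpO2
desat = [np.concatenate([rng.normal(97, 0.5, 30),
                         rng.normal(91, 1.0, 30)]) for _ in range(200)]
X = np.array([window_features(s) for s in normal + desat])
y = np.array([0] * 200 + [1] * 200)
clf = SVC(kernel="rbf").fit(X, y)

# Rule stage: declare an apnea/hypopnea event when k consecutive
# windows are classified abnormal.
decisions = clf.predict(X)
k = 3
event = any(decisions[i:i + k].sum() == k for i in range(len(decisions) - k + 1))
print("event detected:", event)
```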

  9. Constructing Dynamic Event Trees from Markov Models

    International Nuclear Information System (INIS)

    Paolo Bucci; Jason Kirschenbaum; Tunc Aldemir; Curtis Smith; Ted Wood

    2006-01-01

    In the probabilistic risk assessment (PRA) of process plants, Markov models can be used to model accurately the complex dynamic interactions between plant physical process variables (e.g., temperature, pressure, etc.) and the instrumentation and control system that monitors and manages the process. One limitation of this approach that has prevented its use in nuclear power plant PRAs is the difficulty of integrating the results of a Markov analysis into an existing PRA. In this paper, we explore a new approach to the generation of failure scenarios and their compilation into dynamic event trees from a Markov model of the system. These event trees can be integrated into an existing PRA using software tools such as SAPHIRE. To implement our approach, we first construct a discrete-time Markov chain modeling the system of interest by: (a) partitioning the process variable state space into magnitude intervals (cells), (b) using analytical equations or a system simulator to determine the transition probabilities between the cells through the cell-to-cell mapping technique, and, (c) using given failure/repair data for all the components of interest. The Markov transition matrix thus generated can be thought of as a process model describing the stochastic dynamic behavior of the finite-state system. We can therefore search the state space starting from a set of initial states to explore all possible paths to failure (scenarios) with associated probabilities. We can also construct event trees of arbitrary depth by tracing paths from a chosen initiating event and recording the following events while keeping track of the probabilities associated with each branch in the tree. As an example of our approach, we use the simple level control system often used as benchmark in the literature with one process variable (liquid level in a tank), and three control units: a drain unit and two supply units. Each unit includes a separate level sensor to observe the liquid level in the tank
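
    The path-enumeration step can be illustrated directly: depth-first traversal of a small discrete-time transition matrix, recording each branch probability until an absorbing failure state or a depth limit is reached. The three-state matrix is an invented stand-in for the cell-to-cell mapped level-control system.

```python
import numpy as np

P = np.array([[0.90, 0.08, 0.02],   # 0 = nominal liquid level
              [0.20, 0.60, 0.20],   # 1 = high level
              [0.00, 0.00, 1.00]])  # 2 = overflow (absorbing failure)

def enumerate_scenarios(state, prob, path, depth, out):
    if state == 2 or depth == 0:            # failure reached or depth limit
        out.append((path, prob))
        return
    for nxt, p in enumerate(P[state]):
        if p > 0.0:
            enumerate_scenarios(nxt, prob * p, path + [nxt], depth - 1, out)

scenarios = []
enumerate_scenarios(0, 1.0, [0], 4, scenarios)      # event-tree branches
p_fail = sum(pr for pth, pr in scenarios if pth[-1] == 2)
print("P(failure within 4 steps):", round(p_fail, 4))
```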

  10. Analysis of human error and organizational deficiency in events considering risk significance

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

    In this study, we analyzed human and organizational deficiencies in the trip events of Korean nuclear power plants. K-HPES items were used in human error analysis, and the organizational factors by Jacobs and Haber were used for organizational deficiency analysis. We proposed the use of CCDP as a risk measure to consider risk information in prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the process of analysis is necessary for effective enhancement of nuclear power plant safety by focusing on causes of human error and organizational deficiencies that are associated with significant risk

  11. Modeling of Ship Collision Risk Index Based on Complex Plane and Its Realization

    Directory of Open Access Journals (Sweden)

    Xiaoqin Xu

    2016-07-01

    Full Text Available The ship collision risk index is a basic and important concept in the domain of ship collision avoidance. In this paper, the advantages and deficiencies of various calculation methods for the ship collision risk index are pointed out. A ship collision risk model based on the complex plane is then proposed, which compensates for the deficiencies of the widely used evaluation models proposed by Kearon J. and Liu Ruru. On this basis, a calculation method for the collision risk index in multi-ship encounter situations is constructed, and the three-dimensional image and spatial curve of the risk index are derived. Finally, a single-chip microcomputer is used to implement the model; attaching this microcomputer to an ARPA system supports the decision making of marine navigators.
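
    While the record does not spell out the complex-plane formulation, the underlying encounter geometry is standard: with relative position and velocity written as complex numbers, the time and distance at the closest point of approach (TCPA/DCPA), the usual inputs to a collision risk index, fall out of a one-line projection. The sketch below shows only that kinematic step, not the paper's specific risk index.

```python
import numpy as np

def cpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """TCPA (h) and DCPA (nm) from positions/velocities as complex numbers."""
    p = tgt_pos - own_pos                  # relative position
    v = tgt_vel - own_vel                  # relative velocity
    if abs(v) < 1e-9:
        return np.inf, abs(p)
    tcpa = -(p * v.conjugate()).real / abs(v) ** 2
    dcpa = abs(p + v * max(tcpa, 0.0))
    return tcpa, dcpa

# Own ship heading north at 12 kn; target 5 nm away moving west at 8 kn.
tcpa, dcpa = cpa(0 + 0j, 12j, 3 + 4j, -8 + 0j)
print(f"TCPA = {tcpa:.2f} h, DCPA = {dcpa:.2f} nm")
```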

  12. Time-based collision risk modeling for air traffic management

    Science.gov (United States)

    Bell, Alan E.

    Since the emergence of commercial aviation in the early part of last century, economic forces have driven a steadily increasing demand for air transportation. Increasing density of aircraft operating in a finite volume of airspace is accompanied by a corresponding increase in the risk of collision, and in response to a growing number of incidents and accidents involving collisions between aircraft, governments worldwide have developed air traffic control systems and procedures to mitigate this risk. The objective of any collision risk management system is to project conflicts and provide operators with sufficient opportunity to recognize potential collisions and take necessary actions to avoid them. It is therefore the assertion of this research that the currency of collision risk management is time. Future Air Traffic Management Systems are being designed around the foundational principle of four dimensional trajectory based operations, a method that replaces legacy first-come, first-served sequencing priorities with time-based reservations throughout the airspace system. This research will demonstrate that if aircraft are to be sequenced in four dimensions, they must also be separated in four dimensions. In order to separate aircraft in four dimensions, time must emerge as the primary tool by which air traffic is managed. A functional relationship exists between the time-based performance of aircraft, the interval between aircraft scheduled to cross some three dimensional point in space, and the risk of collision. This research models that relationship and presents two key findings. First, a method is developed by which the ability of an aircraft to meet a required time of arrival may be expressed as a robust standard for both industry and operations. Second, a method by which airspace system capacity may be increased while maintaining an acceptable level of collision risk is presented and demonstrated for the purpose of formulating recommendations for procedures

  13. Ontology-based prediction of surgical events in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer assisted surgery devices manually. To this purpose, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-Grams to compute probabilities of followup events, we are able to make sensible predictions of upcoming events in real-time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
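
    The n-gram step reduces to conditional relative frequencies over labeled event sequences; a bigram version with a toy vocabulary is sketched below (the evaluated system pairs this with an ontology-based event description, which is not reproduced here).

```python
from collections import Counter, defaultdict

surgeries = [  # toy labeled event sequences, not the recorded data
    ["incision", "dissection", "clipping", "cutting", "irrigation", "closure"],
    ["incision", "dissection", "coagulation", "clipping", "cutting", "closure"],
    ["incision", "dissection", "clipping", "cutting", "closure"],
]
bigrams = defaultdict(Counter)
for seq in surgeries:
    for prev, nxt in zip(seq, seq[1:]):
        bigrams[prev][nxt] += 1

def predict_next(event):
    counts = bigrams[event]
    total = sum(counts.values())
    return {e: c / total for e, c in counts.items()}

print(predict_next("dissection"))   # -> clipping 2/3, coagulation 1/3
```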

  14. Implementation of PSA models to estimate the probabilities associated with external event combination

    International Nuclear Information System (INIS)

    Burgazzi, Luciano

    2014-01-01

    This note addresses some significant issues revealed by the Fukushima accident in Japan in 2011, such as the analysis of various dependency aspects arising within the external event PSA framework and the treatment of correlated hazards. To this aim, some foundational notions for implementing PSA models of specific aspects, such as external hazard combinations (e.g., earthquake and tsunami, as in the Fukushima accident) and external-hazard-caused internal events (e.g., seismically induced fire), are proposed and discussed for incorporation within the risk assessment structure. Risk assessment of external hazards is required and utilized as an integrated part of PRA for operating and new reactor units. In the light of the Fukushima accident, correlated events are of special interest; their modelling is proposed in the present study in the form of theoretical concepts that lay the foundations for implementation in the PSA framework. An applicative example is presented for illustrative purposes only, since the analysis is carried out on the basis of generic numerical values assigned to an oversimplified model, and results are achieved without any baseline comparison. The first step toward endorsement of the process is the analysis of all available information to determine the level of applicability of observed plant- and site-specific events to the envisaged model, together with statistical correlation analysis of event occurrence data that can be used as part of this process. Despite these drawbacks, which mean the numerical results should not be taken as qualified estimates, the present work represents an exploratory study aimed at open issues in PSA related to unanticipated scenarios: combined external hazards, such as the earthquake and tsunami at Fukushima, and external hazards causing internal events, such as seismically induced fire. These topics are to be resolved among the other ones as emerging from the

  15. The AFFORD clinical decision aid to identify emergency department patients with atrial fibrillation at low risk for 30-day adverse events.

    Science.gov (United States)

    Barrett, Tyler W; Storrow, Alan B; Jenkins, Cathy A; Abraham, Robert L; Liu, Dandan; Miller, Karen F; Moser, Kelly M; Russ, Stephan; Roden, Dan M; Harrell, Frank E; Darbar, Dawood

    2015-03-15

    There is wide variation in the management of patients with atrial fibrillation (AF) in the emergency department (ED). We aimed to derive and internally validate the first prospective, ED-based clinical decision aid to identify patients with AF at low risk for 30-day adverse events. We performed a prospective cohort study at a university-affiliated tertiary-care ED. Patients were enrolled from June 9, 2010, to February 28, 2013, and followed for 30 days. We enrolled a convenience sample of patients presenting to the ED with symptomatic AF. Candidate predictors were based on ED data available in the first 2 hours. The decision aid was derived using model approximation (preconditioning) followed by strong bootstrap internal validation. We used an ordinal outcome hierarchy defined as the incidence of the most severe adverse event within 30 days of the ED evaluation. Of 497 patients enrolled, stroke and AF-related death occurred in 13 (3%) and 4 (0.8%) patients, respectively. The decision aid included the following: age, triage vitals (systolic blood pressure, temperature, respiratory rate, oxygen saturation, supplemental oxygen requirement), medical history (heart failure, home sotalol use, previous percutaneous coronary intervention, electrical cardioversion, cardiac ablation, frequency of AF symptoms), and ED data (2-hour heart rate, chest radiograph results, hemoglobin, creatinine, and brain natriuretic peptide). The decision aid's c-statistic in predicting any 30-day adverse event was 0.7 (95% confidence interval 0.65, 0.76). In conclusion, in patients with AF in the ED, Atrial Fibrillation and Flutter Outcome Risk Determination provides the first evidence-based decision aid for identifying patients who are at low risk for 30-day adverse events and candidates for safe discharge. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Risk assessment of storm surge disaster based on numerical models and remote sensing

    Science.gov (United States)

    Liu, Qingrong; Ruan, Chengqing; Zhong, Shan; Li, Jian; Yin, Zhonghui; Lian, Xihu

    2018-06-01

    Storm surge is one of the most serious ocean disasters in the world. Risk assessment of storm surge disaster for coastal areas has important implications for planning economic development and reducing disaster losses. Based on risk assessment theory, this paper uses coastal hydrological observations, a numerical storm surge model, and multi-source remote sensing data to propose methods for quantifying the hazard and vulnerability of storm surge, and builds a storm surge risk assessment model. Storm surges for different recurrence periods are simulated with the numerical model, and the flooded areas and depths are calculated and used to assess the hazard of storm surge; remote sensing data and GIS technology are used to extract key coastal objects and classify coastal land use, which supports the vulnerability assessment of storm surge disaster. The storm surge risk assessment model is applied to a typical coastal city, and the result shows the reliability and validity of the model. The building and application of the storm surge risk assessment model provide a reference basis for city development planning and strengthen disaster prevention and mitigation.
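
    The combination step at the heart of such a model is a cell-by-cell product of a normalized hazard index (from simulated inundation depth) and a vulnerability index (from classified land use); the sketch below uses invented depths and class weights, not the paper's calibrated values.

```python
import numpy as np

depth = np.array([[0.0, 0.4, 1.2],
                  [0.2, 0.9, 2.5],
                  [0.0, 0.1, 0.6]])    # simulated inundation depth (m)
land_use = np.array([[0, 1, 2],
                     [0, 2, 2],
                     [1, 1, 0]])       # 0=farmland, 1=residential, 2=industrial

hazard = np.clip(depth / 2.0, 0.0, 1.0)       # normalize depth to [0, 1]
vuln_weight = np.array([0.3, 0.8, 1.0])       # assumed weight per class
risk = hazard * vuln_weight[land_use]         # cell-by-cell risk index
print(np.round(risk, 2))
```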

  17. A quantitative risk-based model for reasoning over critical system properties

    Science.gov (United States)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many critical properties, such as security, safety, survivability, fault tolerance, and real-time performance.

  18. Are 12-lead ECG findings associated with the risk of cardiovascular events after ischemic stroke in young adults?

    Science.gov (United States)

    Pirinen, Jani; Putaala, Jukka; Aarnio, Karoliina; Aro, Aapo L; Sinisalo, Juha; Kaste, Markku; Haapaniemi, Elena; Tatlisumak, Turgut; Lehto, Mika

    2016-11-01

    Ischemic stroke (IS) in a young patient is a disaster, and recurrent cardiovascular events could add further impairment. Identifying patients at high risk of such events is therefore important. The prognostic relevance of ECG for this population is unknown. A total of 690 IS patients aged 15-49 years were included. A 12-lead ECG was obtained 1-14 d after the onset of stroke. Adjusting for demographic factors, comorbidities, and stroke characteristics, Cox regression models were used to identify independent ECG parameters associated with long-term risks of (1) any cardiovascular event, (2) cardiac events, and (3) recurrent stroke. Median follow-up time was 8.8 years. Overall, 26.4% of patients experienced a cardiovascular event, 14.5% had cardiac events, and 14.6% recurrent strokes. ECG parameters associated with recurrent cardiovascular events were bundle branch blocks, P-terminal force, left ventricular hypertrophy, and a broader QRS complex. Furthermore, a more leftward P-wave axis, prolonged QTc, and P-wave duration >120 ms were associated with increased risks of cardiac events. No ECG parameters were independently associated with recurrent stroke. A 12-lead ECG can be used for risk prediction of cardiovascular events but not of recurrent stroke in young IS patients. KEY MESSAGES: ECG is an easy, inexpensive, and useful tool for identifying young ischemic stroke patients at high risk for recurrent cardiovascular events, and it has a statistically significant association with these events even after adjusting for confounding factors. Bundle branch blocks, P-terminal force, a broader QRS complex, LVH according to Cornell voltage-duration criteria, a more leftward P-wave axis, prolonged QTc, and P-wave duration >120 ms are predictors of future cardiovascular or cardiac events in these patients. No ECG parameters were independently associated with recurrent stroke.

  19. Construction and Updating of Event Models in Auditory Event Processing

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose that changes in the sensory information trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event…

  20. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO)...

  1. A three-gene expression signature model for risk stratification of patients with neuroblastoma.

    Science.gov (United States)

    Garcia, Idoia; Mayol, Gemma; Ríos, José; Domenech, Gema; Cheung, Nai-Kong V; Oberthuer, André; Fischer, Matthias; Maris, John M; Brodeur, Garrett M; Hero, Barbara; Rodríguez, Eva; Suñol, Mariona; Galvan, Patricia; de Torres, Carmen; Mora, Jaume; Lavarino, Cinzia

    2012-04-01

    Neuroblastoma is an embryonal tumor with contrasting clinical courses. Despite elaborate stratification strategies, precise clinical risk assessment still remains a challenge. The purpose of this study was to develop a PCR-based predictor model to improve clinical risk assessment of patients with neuroblastoma. The model was developed using real-time PCR gene expression data from 96 samples and tested on separate expression data sets obtained from real-time PCR and microarray studies comprising 362 patients. On the basis of our prior study of differentially expressed genes in favorable and unfavorable neuroblastoma subgroups, we identified three genes, CHD5, PAFAH1B1, and NME1, strongly associated with patient outcome. The expression pattern of these genes was used to develop a PCR-based single-score predictor model. The model discriminated patients into two groups with significantly different clinical outcome [set 1: 5-year overall survival (OS) 0.93 ± 0.03 vs. 0.53 ± 0.06; 5-year event-free survival (EFS) 0.85 ± 0.04 vs. 0.42 ± 0.06; both P values significant]. In multivariable analysis, the model was an independent marker for survival, and it robustly classified patients in the total cohort and in different clinically relevant risk subgroups. We propose, for the first time in neuroblastoma, a technically simple PCR-based predictor model that could help refine current risk stratification systems. ©2012 AACR.

  2. Medical Updates Number 5 to the International Space Station Probability Risk Assessment (PRA) Model Using the Integrated Medical Model

    Science.gov (United States)

    Butler, Doug; Bauman, David; Johnson-Throop, Kathy

    2011-01-01

    The Integrated Medical Model (IMM) Project has been developing a probabilistic risk assessment tool, the IMM, to help evaluate in-flight crew health needs and impacts to the mission due to medical events. This package is a follow-up to a data package provided in June 2009. The IMM currently represents 83 medical conditions and associated ISS resources required to mitigate medical events. IMM end state forecasts relevant to the ISS PRA model include evacuation (EVAC) and loss of crew life (LOCL). The current version of the IMM provides the basis for the operational version of IMM expected in the January 2011 timeframe. The objectives of this data package are: 1. To provide a preliminary understanding of medical risk data used to update the ISS PRA Model. The IMM has had limited validation and an initial characterization of maturity has been completed using NASA STD 7009 Standard for Models and Simulation. The IMM has been internally validated by IMM personnel but has not been validated by an independent body external to the IMM Project. 2. To support a continued dialogue between the ISS PRA and IMM teams. To ensure accurate data interpretation, and that IMM output format and content meets the needs of the ISS Risk Management Office and ISS PRA Model, periodic discussions are anticipated between the risk teams. 3. To help assess the differences between the current ISS PRA and IMM medical risk forecasts of EVAC and LOCL. Follow-on activities are anticipated based on the differences between the current ISS PRA medical risk data and the latest medical risk data produced by IMM.

  3. Modeling Psychological Contract Violation using Dual Regime Models: An Event-based Approach.

    Science.gov (United States)

    Hofmans, Joeri

    2017-01-01

    A good understanding of the dynamics of psychological contract violation requires theories, research methods, and statistical models that explicitly recognize that violation feelings follow from an event that violates one's acceptance limits, after which interpretative processes are set into motion, determining the intensity of these violation feelings. Whereas theories (in the form of the dynamic model of the psychological contract) and research methods (in the form of daily diary research and experience sampling research) are available by now, the statistical tools to model such a two-stage process are still lacking. The aim of the present paper is to fill this gap in the literature by introducing two statistical models, the Zero-Inflated model and the Hurdle model, that closely mimic the theoretical process underlying the elicitation of violation feelings via two model components: a binary distribution that models whether violation has occurred or not, and a count distribution that models how severe the negative impact is. Moreover, covariates can be included for both model components separately, which yields insight into their unique and shared antecedents. By doing this, the present paper offers a methodological-substantive synergy, showing how sophisticated methodology can be used to examine an important substantive issue.
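
    A hurdle model of this kind can be fit by maximizing a likelihood with a logistic occurrence part and a zero-truncated Poisson intensity part; the sketch below does exactly that on synthetic diary-style data with a single hypothetical predictor, and is not the paper's recommended software route.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(5)
n = 800
breach = rng.normal(size=n)                          # hypothetical predictor
occurred = rng.random(n) < 1 / (1 + np.exp(1.0 - 1.5 * breach))
y = np.where(occurred, rng.poisson(np.exp(0.5 + 0.8 * breach)) + 1, 0)

def nll(theta):
    b0, b1, g0, g1 = theta
    p = 1 / (1 + np.exp(-(b0 + b1 * breach)))        # hurdle (occurrence) part
    lam = np.exp(g0 + g1 * breach)                   # count (intensity) part
    zero = y == 0
    ll = np.log(1 - p[zero]).sum()
    yp, lp, pp = y[~zero], lam[~zero], p[~zero]
    ll += (np.log(pp) + yp * np.log(lp) - lp         # zero-truncated Poisson
           - gammaln(yp + 1) - np.log(1 - np.exp(-lp))).sum()
    return -ll

est = minimize(nll, x0=np.zeros(4), method="BFGS").x
print("estimates (b0, b1, g0, g1):", np.round(est, 2))
```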

  4. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, S.Y.

    1994-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the US Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities
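
    For orientation, uncertainty propagation with Latin hypercube sampling of the kind RESRAD-PROBABILISTIC is described as using can be sketched as below. The two sampled parameters, their ranges, and the stand-in dose function are hypothetical; RESRAD's actual pathway models are far more detailed.

        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=2, seed=0)
        unit = sampler.random(n=1000)            # stratified samples in [0, 1)^2
        lower = [0.1, 1.0e-4]                    # e.g., soil-to-plant transfer factor,
        upper = [10.0, 1.0e-2]                   # resuspension factor (illustrative)
        params = qmc.scale(unit, lower, upper)   # rescale to the parameter ranges

        # Propagate through a stand-in linear dose model and summarize the output.
        dose = 0.3 * params[:, 0] + 5.0e3 * params[:, 1]
        print("mean dose:", dose.mean(), " 95th percentile:", np.percentile(dose, 95))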

  5. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, Shih-Yew

    1995-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the U.S. Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities. (author)

  6. SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL

    Directory of Open Access Journals (Sweden)

    Jenq-Daw Lee

    2008-07-01

    Full Text Available A setting of a trivariate survival function using the semi-competing risks concept is proposed, in which a terminal event can only occur after other events. The Stanford Heart Transplant data are reanalyzed using a trivariate Weibull distribution model with the proposed survival function.

  7. Applicability and feasibility of systematic review for performing evidence-based risk assessment in food and feed safety

    DEFF Research Database (Denmark)

    Aiassa, E.; Higgins, J.P.T.; Frampton, G. K.

    2015-01-01

    Food and feed safety risk assessment uses multi-parameter models to evaluate the likelihood of adverse events associated with exposure to hazards in human health, plant health, animal health, animal welfare and the environment. Systematic review and meta-analysis are established methods for answering questions in health care, and can be implemented to minimise biases in food and feed safety risk assessment. However, no methodological frameworks exist for refining risk assessment multi-parameter models into questions suitable for systematic review, and use of meta-analysis to estimate all parameters in the risk model. This approach to planning and prioritising systematic review seems to have useful implications for producing evidence-based food and feed safety risk assessment.

  8. Predicting Long-term Ischemic Events Using Routine Clinical Parameters in Patients with Coronary Artery Disease: The OPT-CAD Risk Score.

    Science.gov (United States)

    Han, Yaling; Chen, Jiyan; Qiu, Miaohan; Li, Yi; Li, Jing; Feng, Yingqing; Qiu, Jian; Meng, Liang; Sun, Yihong; Tao, Guizhou; Wu, Zhaohui; Yang, Chunyu; Guo, Jincheng; Pu, Kui; Chen, Shaoliang; Wang, Xiaozeng

    2018-06-05

    The prognosis of patients with coronary artery disease (CAD) varies considerably at hospital discharge, and the post-discharge risk of ischemic events remains a concern. However, risk prediction tools to identify the risk of ischemia for these patients have not yet been reported. We sought to develop a scoring system for predicting long-term ischemic events in CAD patients receiving antiplatelet therapy that would be beneficial in appropriate personalized decision-making for these patients. In this prospective Optimal antiPlatelet Therapy for Chinese patients with Coronary Artery Disease (OPT-CAD, NCT01735305) registry, a total of 14,032 patients with CAD receiving at least one kind of antiplatelet agent were enrolled from 107 centers across China, from January 2012 to March 2014. The risk scoring system was developed in a derivation cohort (the first 10,000 patients enrolled in the database) using a logistic regression model and was subsequently tested in a validation cohort (the last 4,032 patients). Points in the risk score were assigned based on the multivariable odds ratio of each factor. Ischemic events were defined as the composite of cardiac death, myocardial infarction or stroke. Ischemic events occurred in 342 (3.4%) patients in the derivation cohort and 160 (4.0%) patients in the validation cohort during 1-year follow-up. The OPT-CAD score, ranging from 0 to 257 points, consists of 10 independent risk factors, including age (0-71 points), heart rate (0-36 points), hypertension (0-20 points), prior myocardial infarction (16 points), prior stroke (16 points), renal insufficiency (21 points), anemia (19 points), low ejection fraction (22 points), positive cardiac troponin (23 points) and ST-segment deviation (13 points). In predicting 1-year ischemic events, the areas under the receiver operating characteristic curve were 0.73 and 0.72 in the derivation and validation cohorts, respectively. The incidences of ischemic events in low- (0-90 points), medium- (91-150 points) and
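
    The points construction described above, with points proportional to each factor's multivariable log-odds ratio and rescaled to integers, can be sketched as follows. The log-odds values are illustrative placeholders chosen so the rounded points match those quoted in the abstract; they are not the published coefficients.

        log_or = {  # hypothetical multivariable log-odds ratios per factor
            "prior_mi": 0.45, "prior_stroke": 0.45, "anemia": 0.53,
            "renal_insufficiency": 0.59, "positive_troponin": 0.65,
        }
        scale = 16 / log_or["prior_mi"]  # anchor the scale: prior MI = 16 points

        points = {k: round(v * scale) for k, v in log_or.items()}
        patient = {"prior_mi": 1, "prior_stroke": 0, "anemia": 1,
                   "renal_insufficiency": 0, "positive_troponin": 1}
        print(points)                                        # per-factor points
        print(sum(points[k] * patient[k] for k in points))   # patient's total score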

  9. A systematic comparison of recurrent event models for application to composite endpoints.

    Science.gov (United States)

    Ozga, Ann-Kathrin; Kieser, Meinhard; Rauch, Geraldine

    2018-01-04

    Many clinical trials focus on the comparison of the treatment effect between two or more groups concerning a rarely occurring event. In this situation, showing a relevant effect with an acceptable power requires the observation of a large number of patients over a long period of time. For feasibility issues, it is therefore often considered to include several event types of interest, non-fatal or fatal, and to combine them within a composite endpoint. Commonly, a composite endpoint is analyzed with standard survival analysis techniques by assessing the time to the first occurring event. This approach neglects that an individual may experience more than one event, which leads to a loss of information. As an alternative, composite endpoints could be analyzed by models for recurrent events. There exists a number of such models, e.g. regression models based on count data or Cox-based models such as the approaches of Andersen and Gill; Prentice, Williams and Peterson; or Wei, Lin and Weissfeld. Although some of the methods have already been compared in the literature, there exists no systematic investigation for the special requirements regarding composite endpoints. Within this work, a simulation-based comparison of recurrent event models applied to composite endpoints is provided for different realistic clinical trial scenarios. We demonstrate that the Andersen-Gill model and the Prentice-Williams-Peterson models show similar results under various data scenarios, whereas the Wei-Lin-Weissfeld model delivers effect estimators which can deviate considerably under commonly met data scenarios. Based on the conducted simulation study, this paper helps to understand the pros and cons of the investigated methods in the context of composite endpoints and therefore provides recommendations for an adequate statistical analysis strategy and a meaningful interpretation of results.
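
    For reference, the Andersen-Gill approach mentioned above models recurrent events in a counting-process (start, stop] format, which the lifelines package can fit via its time-varying Cox fitter. The miniature data set and column names below are illustrative only, a sketch rather than the paper's simulation setup.

        import pandas as pd
        from lifelines import CoxTimeVaryingFitter

        # One row per at-risk interval; a subject re-enters the risk set after
        # each event, which is how the Andersen-Gill model handles recurrence.
        df = pd.DataFrame({
            "id":    [1, 1, 1, 2, 2, 3],
            "start": [0, 5, 9, 0, 7, 0],
            "stop":  [5, 9, 12, 7, 15, 10],
            "event": [1, 1, 0, 1, 0, 0],
            "treat": [1, 1, 1, 0, 0, 0],
        })

        ctv = CoxTimeVaryingFitter()
        ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
        ctv.print_summary()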

  10. Rheumatoid arthritis disease activity and disability affect the risk of serious infection events in RADIUS 1.

    Science.gov (United States)

    Weaver, Arthur; Troum, Orrin; Hooper, Michele; Koenig, Andrew S; Chaudhari, Sandeep; Feng, Jingyuan; Wenkert, Deborah

    2013-08-01

    To determine whether disease activity and disability independently correlate with serious infection event (SIE) risk in a large rheumatoid arthritis (RA) cohort. The associations between SIE and Clinical Disease Activity Index (CDAI) and Health Assessment Questionnaire-Disability Index (HAQ-DI) in the Rheumatoid Arthritis Disease-Modifying Antirheumatic Drug Intervention and Utilization Study (RADIUS 1) cohort were evaluated using the Andersen-Gill model (a proportional HR model allowing > 1 event per patient). Of 4084 patients with 347 SIE, 271 patients experienced ≥ 1 SIE. A 5-unit CDAI increase and 0.4-unit HAQ-DI increase corresponded to an increase in SIE risk with and without covariate adjustments. A 5-unit CDAI increase corresponded with a 7.7% increased SIE risk (adjusted HR 1.077, 95% CI 1.044-1.112, p < 0.0001) and a 0.4-unit HAQ-DI increase with a 30.1% increased risk (adjusted HR 1.301, 95% CI 1.225-1.381, p < 0.0001). Categorical analysis showed that more severe RA activity (even after controlling for disability) and disability were associated with an increased SIE risk. Increased RA disease activity and disability were each associated with a significantly increased SIE risk in the RADIUS 1 cohort, which could not be completely accounted for by disability.

  11. Lifestyle-based risk model for fall risk assessment

    OpenAIRE

    Sannino, Giovanna; De Falco, Ivanoe; De Pietro, Guiseppe

    2016-01-01

    Purpose: The aim of this study was to identify the explicit relationship between lifestyle and the risk of falling in the form of a mathematical model. Starting from personal and behavioral information about a subject, e.g., weight, height, age, data about physical activity habits, and concern about falling, the model would estimate the score of her/his Mini-Balance Evaluation Systems (Mini-BES) test. This score ranges from 0 to 28, and the lower its value the more likely the subj...

  12. Powering stochastic reliability models by discrete event simulation

    DEFF Research Database (Denmark)

    Kozine, Igor; Wang, Xiaoyun

    2012-01-01

    it difficult to find a solution to the problem. The power of modern computers and recent developments in discrete-event simulation (DES) software enable us to diminish some of the drawbacks of stochastic models. In this paper we describe the insights we have gained based on using both Markov and DES models...

  13. Cabin Environment Physics Risk Model

    Science.gov (United States)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  14. A risk-return based model to measure the performance of portfolio management

    Directory of Open Access Journals (Sweden)

    Hamid Reza Vakili Fard

    2014-10-01

    Full Text Available The primary concern in all portfolio management systems is to find a good tradeoff between risk and expected return, and a good balance between accepted risk and actual return indicates the performance of a particular portfolio. This paper develops the “A-Y Model” to measure the performance of a portfolio and analyzes it during the bull and the bear market. The paper considers daily information from one year before and one year after Iran's 2013 presidential election. The proposed model provides lost profit and unrealized loss measures to assess portfolio performance. The study first ranks the resulting data and then uses some non-parametric methods to see whether market changes affected the performance of the portfolio. The results indicate that despite increasing profitable opportunities in the bull market, the performance of the portfolio did not match the target risk. As a result, using the A-Y Model as a risk- and return-based model to measure portfolio management performance appears to reduce risk and increase portfolio returns.

  15. The incremental value of brachial flow-mediated dilation measurements in risk stratification for incident cardiovascular events: a systematic review.

    Science.gov (United States)

    Peters, Sanne A E; den Ruijter, Hester M; Bots, Michiel L

    2012-06-01

    Adequate risk assessment for cardiovascular disease (CVD) is essential as a guide to initiate drug treatment. Current methods based on traditional risk factors could be improved considerably. Although brachial flow-mediated dilation (FMD) predicts subsequent cardiovascular events, its predictive value on top of traditional risk factors is unknown. We performed a systematic review to evaluate the incremental predictive value of FMD on top of traditional risk factors in asymptomatic individuals. Using PubMed and reference tracking, three studies were identified that reported on the incremental value of FMD using change in the area under the curve (AUC). Two large cohort studies found no improvement in AUC when FMD was added to traditional risk prediction models, whereas one small case-control study found an improvement. One study used the net reclassification improvement (NRI) to assess whether FMD measurement leads to correct risk stratification in risk categories. Although this study did not find an improvement in AUC, the NRI was statistically significant. Based on the reclassification results of this study, FMD measurement might be helpful in risk prediction. Evidence supporting the use of FMD measurement in clinical practice for risk stratification for CVD on top of traditional risk factors is limited, and future studies are needed.

  16. Climate Change Risks – Methodological Framework and Case Study of Damages from Extreme Events in Cambodia

    DEFF Research Database (Denmark)

    Halsnæs, Kirsten; Kaspersen, Per Skougaard; Trærup, Sara Lærke Meltofte

    2016-01-01

    Climate change imposes some special risks on Least Developed Countries, and the chapter presents a methodological framework which can be used to assess the impacts of key assumptions related to damage costs, risks and equity implications on current and future generations. The methodological framework is applied to a case study of severe storms in Cambodia based on statistical information on past storm events, including information about buildings damaged and victims. Although limited data are available on the probability of severe storm events under climate change, as well as on the actual damage costs associated with the events in the case of Cambodia, we use past storm events as proxy data in a sensitivity analysis. It is demonstrated how key assumptions on future climate change, income levels of victims, and income distribution over time, reflected in discount rates...

  17. Task-based dermal exposure models for regulatory risk assessment.

    Science.gov (United States)

    Warren, Nicholas D; Marquart, Hans; Christopher, Yvette; Laitinen, Juha; VAN Hemmen, Joop J

    2006-07-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of new measurements of dermal exposure together with detailed contextual information. This article describes the development of a set of generic task-based models capable of predicting potential dermal exposure to both solids and liquids in a wide range of situations. To facilitate modelling of the wide variety of dermal exposure situations, six separate models were made for groupings of exposure scenarios called Dermal Exposure Operation units (DEO units). These task-based groupings cluster exposure scenarios with regard to the expected routes of dermal exposure and the expected influence of exposure determinants. Within these groupings, linear mixed effect models were used to estimate the influence of various exposure determinants and to estimate components of variance. The models predict median potential dermal exposure rates for the hands and the rest of the body from the values of relevant exposure determinants. These rates are expressed as mg or µl of product per minute. Using these median potential dermal exposure rates and an accompanying geometric standard deviation allows a range of exposure percentiles to be calculated.

  18. Accumulation of Major Life Events in Childhood and Adult Life and Risk of Type 2 Diabetes Mellitus.

    Directory of Open Access Journals (Sweden)

    Jolene Masters Pedersen

    Full Text Available The aim of the study was to estimate the effect of the accumulation of major life events (MLE) in childhood and adulthood, in both the private and working domains, on the risk of type 2 diabetes mellitus (T2DM). Furthermore, we aimed to test the possible interaction between childhood and adult MLE and to investigate modification of these associations by educational attainment. The study was based on 4,761 participants from the Copenhagen City Heart Study free of diabetes at baseline and followed for 10 years. MLE were categorized as 0, 1, 2, or 3 or more events. Multivariate logistic regression models adjusted for age, sex, education and family history of diabetes were used to estimate the association between MLE and T2DM. In childhood, experiencing 3 or more MLE was associated with a 69% higher risk of developing T2DM (Odds Ratio (OR) 1.69; 95% Confidence Interval (CI) 1.60, 3.27). The accumulation of MLE in adult private (p-trend = 0.016) and work life (p-trend = 0.049) was associated with the risk of T2DM in a dose-response manner. There was no evidence that experiencing MLE in both childhood and adult life was more strongly associated with T2DM than experiencing events at only one time point. There was some evidence that being simultaneously exposed to childhood MLE and short education (OR 2.28; 95% CI 1.45, 3.59) and work MLE and short education (OR 2.86; 95% CI 1.62, 5.03) was associated with a higher risk of T2DM, as the joint effects were greater than the sum of their individual effects. Findings from this study suggest that the accumulation of MLE in childhood, private adult life and work life, respectively, are risk factors for developing T2DM.

  19. Value of Progression of Coronary Artery Calcification for Risk Prediction of Coronary and Cardiovascular Events: Result of the HNR Study (Heinz Nixdorf Recall).

    Science.gov (United States)

    Lehmann, Nils; Erbel, Raimund; Mahabadi, Amir A; Rauwolf, Michael; Möhlenkamp, Stefan; Moebus, Susanne; Kälsch, Hagen; Budde, Thomas; Schmermund, Axel; Stang, Andreas; Führer-Sakel, Dagmar; Weimar, Christian; Roggenbuck, Ulla; Dragano, Nico; Jöckel, Karl-Heinz

    2018-02-13

    Computed tomography (CT) allows estimation of coronary artery calcium (CAC) progression. We evaluated several progression algorithms in our unselected, population-based cohort for risk prediction of coronary and cardiovascular events. In 3281 participants (45-74 years of age), free from cardiovascular disease until the second visit, risk factors and CTs at baseline (b) and after a mean of 5.1 years (5y) were measured. Hard coronary and cardiovascular events, and total cardiovascular events including revascularization, were recorded during a follow-up time of 7.8±2.2 years after the second CT. The added predictive value of 10 CAC progression algorithms on top of risk factors including baseline CAC was evaluated by using survival analysis, C-statistics, net reclassification improvement, and integrated discrimination index. A subgroup analysis of risk in CAC categories was performed. We observed 85 (2.6%) hard coronary, 161 (4.9%) hard cardiovascular, and 241 (7.3%) total cardiovascular events. Absolute CAC progression was higher with versus without subsequent coronary events (median, 115 [Q1-Q3, 23-360] versus 8 [0-83], P < 0.0001). CAC progression added to the predictive value of baseline CT and risk assessment in terms of C-statistic or integrated discrimination index, especially for total cardiovascular events. However, CAC progression did not improve models including CAC5y and 5-year risk factors. An excellent prognosis was found for 921 participants with double-zero CACb = CAC5y = 0 (10-year coronary and hard/total cardiovascular risk: 1.4%, 2.0%, and 2.8%); for participants with incident CAC, the corresponding risks were 1.8%, 3.8%, and 6.6%. When CACb between 1 and 399 progressed to CAC5y ≥ 400, coronary and total cardiovascular risk were nearly 2-fold in comparison with subjects who remained below CAC5y = 400. Participants with CACb ≥ 400 had high rates of hard coronary and hard/total cardiovascular events (10-year risk: 12.0%, 13.5%, and 30.9%, respectively). CAC progression is associated with

  20. Probabilistic modelling of drought events in China via 2-dimensional joint copula

    Science.gov (United States)

    Ayantobo, Olusola O.; Li, Yi; Song, Songbai; Javed, Tehseen; Yao, Ning

    2018-04-01

    Probabilistic modelling of drought events is a significant aspect of water resources management and planning. In this study, popularly applied and several relatively new bivariate Archimedean copulas were employed to derive regional and spatially based copula models to appraise drought risk in mainland China over 1961-2013. Drought duration (Dd), severity (Ds), and peak (Dp), as indicated by the Standardized Precipitation Evapotranspiration Index (SPEI), were extracted according to run theory and fitted with suitable marginal distributions. The maximum likelihood estimation (MLE) and curve fitting method (CFM) were used to estimate the copula parameters of nineteen bivariate Archimedean copulas. Drought probabilities and return periods were analysed based on the appropriate bivariate copula in sub-regions I-VII and the entire mainland China. The goodness-of-fit tests as indicated by the CFM showed that copula NN19 in sub-regions III, IV, V, VI and mainland China, NN20 in sub-region I, and NN13 in sub-region VII are the best for modeling drought variables. Bivariate drought probability across mainland China is relatively high, and the highest drought probabilities are found mainly in Northwestern and Southwestern China. The results also showed that different sub-regions may suffer varying drought risks. The drought risks observed in sub-regions III, VI and VII are significantly greater than in other sub-regions. Higher probability of droughts of longer durations in the sub-regions also corresponds to shorter return periods with greater drought severity. These results may imply tremendous challenges for water resources management in different sub-regions, particularly Northwestern and Southwestern China.
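
    The joint probabilities and return periods described above follow from the copula's survival identity. The sketch below works through this with a Gumbel copula; the marginal probabilities, dependence parameter, and mean interarrival time are illustrative, not values fitted in the study.

        import math

        def gumbel_copula(u, v, theta):
            """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1."""
            return math.exp(-((-math.log(u))**theta + (-math.log(v))**theta)**(1.0/theta))

        u = 0.9      # F(d): non-exceedance probability of drought duration d
        v = 0.9      # F(s): non-exceedance probability of drought severity s
        theta = 2.0  # hypothetical dependence parameter

        # Inclusion-exclusion: P(D > d, S > s) = 1 - u - v + C(u, v)
        p_joint = 1.0 - u - v + gumbel_copula(u, v, theta)
        mean_interarrival = 1.5  # hypothetical mean years between drought events
        print("joint 'and' return period:", mean_interarrival / p_joint, "years")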

  1. Physicologically Based Toxicokinetic Models of Tebuconazole and Application in Human Risk Assessment

    DEFF Research Database (Denmark)

    Jonsdottir, Svava Osk; Reffstrup, Trine Klein; Petersen, Annette

    2016-01-01

    A series of physiologically based toxicokinetic (PBTK) models for tebuconazole were developed in four species: rat, rabbit, rhesus monkey, and human. The developed models were analyzed with respect to the application of the models in higher tier human risk assessment, and the prospect of using such models in risk assessment of cumulative and aggregate exposure is discussed. Relatively simple and biologically sound models were developed using available experimental data as parameters for describing the physiology of the species, as well as the absorption, distribution, metabolism, and elimination (ADME) of tebuconazole. The developed models were validated on in vivo half-life data for rabbit with good results, and on plasma and tissue concentration-time course data of tebuconazole after i.v. administration in rabbit. In most cases, the predicted concentration levels were seen to be within...

  2. Identifying risk event in Indonesian fresh meat supply chain

    Science.gov (United States)

    Wahyuni, H. C.; Vanany, I.; Ciptomulyono, U.

    2018-04-01

    The aim of this paper is to identify risk issues in the Indonesian fresh meat supply chain from the farm to the “plate”. The critical points for food safety in the physical fresh meat product flow are also identified. The paper employed a case study of an Indonesian fresh meat company, conducting observations and three stages of in-depth interviews. In the first interview, the players, processes, and activities in the fresh meat industry were identified. In the second interview, critical points for food safety were recognized. The risk events for each player and process were identified in the last interview. The research is conducted in three stages, but this article focuses on the risk identification process (first stage) only. The second stage measures the risks, and the third stage focuses on determining the value of risk priority. The results showed that there are four players in the fresh meat supply chain: livestock (source), slaughter (make), distributor and retail (deliver). Each player has different activities, and 16 risk events were identified in the fresh meat supply chain. Some of the strategies that can be used to reduce the occurrence of such risks include improving the ability of laborers regarding food safety systems and improving cutting equipment and distribution processes.

  3. Mean-variance model for portfolio optimization with background risk based on uncertainty theory

    Science.gov (United States)

    Zhai, Jia; Bai, Manying

    2018-04-01

    The aim of this paper is to develop a mean-variance model for portfolio optimization considering background risk, liquidity and transaction cost based on uncertainty theory. In the portfolio selection problem, returns of securities and asset liquidity are assumed to be uncertain variables because of incidents or lack of historical data, which are common in economic and social environments. We provide crisp forms of the model and a hybrid intelligent algorithm to solve it. Under a mean-variance framework, we analyze the portfolio frontier characteristic considering independently additive background risk. In addition, we discuss some effects of background risk and liquidity constraints on the portfolio selection. Finally, we demonstrate the proposed models by numerical simulations.
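
    As a point of reference for the mean-variance-with-background-risk formulation, here is a minimal numerical sketch in a crisp (deterministic) form: minimize total wealth variance, i.e., portfolio variance plus background-risk terms, subject to a target return. All numbers, and the assumption that background risk enters additively with a known covariance, are illustrative.

        import numpy as np
        from scipy.optimize import minimize

        mu = np.array([0.08, 0.12, 0.10])          # expected security returns
        cov = np.array([[0.04, 0.01, 0.00],
                        [0.01, 0.09, 0.02],
                        [0.00, 0.02, 0.06]])       # covariance of security returns
        cov_bg = np.array([0.005, -0.002, 0.000])  # covariance with background risk
        var_bg = 0.02                              # background-risk variance
        target = 0.10                              # required expected return

        def total_variance(w):
            # Var(w'r + b) = w' cov w + 2 w' cov_bg + var_bg
            return w @ cov @ w + 2.0 * w @ cov_bg + var_bg

        cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},
                {"type": "ineq", "fun": lambda w: w @ mu - target})
        res = minimize(total_variance, x0=np.ones(3) / 3,
                       bounds=[(0.0, 1.0)] * 3, constraints=cons)
        print("weights:", res.x.round(3), "total variance:", round(res.fun, 4))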

  4. Procedures for the external event core damage frequency analyses for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1990-11-01

    This report presents methods which can be used to perform the assessment of risk due to external events at nuclear power plants. These methods were used to perform the external events risk assessments for the Surry and Peach Bottom nuclear power plants as part of the NRC-sponsored NUREG-1150 risk assessments. These methods apply to the full range of hazards, such as earthquakes, fires, floods, etc., which are collectively known as external events. The methods described in this report have been developed under NRC sponsorship and represent, in many cases, both advancements and simplifications over techniques that have been used in past years. They also include the most up-to-date databases on equipment seismic fragilities, fire occurrence frequencies and fire damageability thresholds. The methods described here are based on making full utilization of the power plant systems logic models developed in the internal events analyses. By making full use of the internal events models, one obtains an external event analysis that is consistent both in nomenclature and in level of detail with the internal events analyses and, in addition, automatically includes all the appropriate random and test/maintenance unavailabilities. 50 refs., 9 figs., 11 tabs

  5. Application of Catastrophe Risk Modelling to Evacuation Public Policy

    Science.gov (United States)

    Woo, G.

    2009-04-01

    The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally viewed as a hazard forecasting issue, civil authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane appeared to be heading for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast and a chance of a false alarm. A systematic, coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a
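
    The cost-benefit weighing described above can be made explicit with a simple expected-value rule: call the evacuation when the probability-weighted avoided loss exceeds the cost of evacuating. The function below is a deliberately simplified sketch; all figures are illustrative placeholders rather than values from any actual study.

        def should_evacuate(p_event, expected_fatalities, value_per_life,
                            fraction_saved, evacuation_cost):
            """True if the expected benefit of evacuating exceeds its cost."""
            expected_benefit = p_event * expected_fatalities * value_per_life * fraction_saved
            return expected_benefit > evacuation_cost

        # Example: 10% strike probability, 2,000 expected fatalities if the
        # event hits, 80% of those avoided by evacuating in time.
        print(should_evacuate(p_event=0.10, expected_fatalities=2_000,
                              value_per_life=7.0e6, fraction_saved=0.8,
                              evacuation_cost=5.0e8))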

  6. History of Fire Events in the U.S. Commercial Nuclear Industry

    International Nuclear Information System (INIS)

    Bijan Najafi; Joglar-Biloch, Francisco; Kassawara, Robert P.; Khalil, Yehia

    2002-01-01

    Over the past decade, interest in performance-based fire protection has increased within the nuclear industry. In support of this growing interest, in 1997 the Electric Power Research Institute (EPRI) developed a long-range plan to develop/improve the data and tools needed to support Risk-Informed/Performance-Based fire protection. This plan calls for continued improvement in the collection and use of information obtained from fire events at nuclear plants. The data collection process has the objectives of improving the insights gained from such data and reducing the uncertainty in fire risk and fire modeling methods in order to make them a more reliable basis for performance-based fire protection programs. In keeping with these objectives, EPRI continues to collect, review and analyze fire events in support of the nuclear industry. EPRI collects these records in cooperation with Nuclear Electric Insurance Limited (NEIL), by compiling public fire event reports and by direct solicitation of U.S. nuclear facilities. The EPRI fire data collection project is based on the principle that the understanding of history is one of the cornerstones of improving fire protection technology and practice. Therefore, the goal has been to develop and maintain a comprehensive database of fire events with the flexibility to support various aspects of fire protection engineering. With more than 1850 fire records over a period of three decades and 2400 reactor-years, this is the most comprehensive database of nuclear power industry fire events in existence today. In general, the frequency of fires in the U.S. commercial nuclear industry remains constant. In a few cases, e.g., transient fires and fires in BWR offgas/recombiner systems, where either increasing or decreasing trends are observed, these trends tend to slow after 1980. The key issues in improving the quality of the data remain the consistency of recording and reporting of fire events and the difficulty of collecting records. EPRI has

  7. An RES-Based Model for Risk Assessment and Prediction of Backbreak in Bench Blasting

    Science.gov (United States)

    Faramarzi, F.; Ebrahimi Farsangi, M. A.; Mansouri, H.

    2013-07-01

    Most blasting operations are associated with various forms of energy loss, emerging as environmental side effects of rock blasting, such as flyrock, vibration, airblast, and backbreak. Backbreak is an adverse phenomenon in rock blasting operations, which imposes risk and increases operation expenses because of safety reduction due to the instability of walls, poor fragmentation, and uneven burden in subsequent blasts. In this paper, based on the basic concepts of a rock engineering systems (RES) approach, a new model for the prediction of backbreak and the risk associated with a blast is presented. The newly suggested model involves 16 effective parameters on backbreak due to blasting, while retaining simplicity as well. The data for 30 blasts, carried out at Sungun copper mine, western Iran, were used to predict backbreak and the level of risk corresponding to each blast by the RES-based model. The results obtained were compared with the backbreak measured for each blast, which showed that the level of risk achieved is in consistence with the backbreak measured. The maximum level of risk [vulnerability index (VI) = 60] was associated with blast No. 2, for which the corresponding average backbreak was the highest achieved (9.25 m). Also, for blasts with levels of risk under 40, the minimum average backbreaks (<4 m) were observed. Furthermore, to evaluate the model performance for backbreak prediction, the coefficient of correlation (R²) and root mean square error (RMSE) of the model were calculated (R² = 0.8; RMSE = 1.07), indicating the good performance of the model.
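
    A generic sketch of how an RES-style vulnerability index can be computed: each effective parameter receives a rating, and weights derived from the interaction matrix's cause-effect coordinates combine the ratings into a 0-100 index. The number of parameters shown, the ratings, weights, and the risk bands are illustrative, not the paper's calibration.

        def vulnerability_index(ratings, weights, q_max=4.0):
            """VI = 100 * sum(a_i * Q_i / Q_max) with weights a_i summing to 1."""
            assert abs(sum(weights) - 1.0) < 1e-9
            return 100.0 * sum(a * q / q_max for a, q in zip(weights, ratings))

        # Five of the sixteen parameters, with hypothetical ratings (0..4)
        # and hypothetical interaction-matrix weights.
        ratings = [3, 2, 4, 1, 2]
        weights = [0.25, 0.20, 0.25, 0.15, 0.15]
        vi = vulnerability_index(ratings, weights)
        print("VI =", vi, "->", "higher risk" if vi > 40 else "lower risk")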

  8. Prediction impact curve is a new measure integrating intervention effects in the evaluation of risk models.

    Science.gov (United States)

    Campbell, William; Ganna, Andrea; Ingelsson, Erik; Janssens, A Cecile J W

    2016-01-01

    We propose a new measure of assessing the performance of risk models, the area under the prediction impact curve (auPIC), which quantifies the performance of risk models in terms of their average health impact in the population. Using simulated data, we explain how the prediction impact curve (PIC) estimates the percentage of events prevented when a risk model is used to assign high-risk individuals to an intervention. We apply the PIC to the Atherosclerosis Risk in Communities (ARIC) Study to illustrate its application toward prevention of coronary heart disease. We estimated that if the ARIC cohort received statins at baseline, 5% of events would be prevented when the risk model was evaluated at a cutoff threshold of 20% predicted risk compared to 1% when individuals were assigned to the intervention without the use of a model. By calculating the auPIC, we estimated that an average of 15% of events would be prevented when considering performance across the entire interval. We conclude that the PIC is a clinically meaningful measure for quantifying the expected health impact of risk models that supplements existing measures of model performance. Copyright © 2016 Elsevier Inc. All rights reserved.
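
    The core quantity behind the PIC can be sketched as follows: at a given predicted-risk cutoff, everyone at or above the cutoff receives an intervention with some relative risk reduction, and the expected fraction of all events prevented is computed. The simulated risks, the 20% cutoff and the 25% relative risk reduction are illustrative assumptions.

        import numpy as np

        def events_prevented(pred_risk, cutoff, rrr):
            """Expected fraction of all events prevented at one cutoff."""
            treated = pred_risk >= cutoff
            return rrr * pred_risk[treated].sum() / pred_risk.sum()

        rng = np.random.default_rng(1)
        pred_risk = rng.beta(1, 9, size=10_000)   # hypothetical predicted risks
        print(events_prevented(pred_risk, cutoff=0.20, rrr=0.25))
        # Sweeping the cutoff over [0, 1] traces the PIC; integrating it gives the auPIC.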

  9. A Systems Modeling Approach for Risk Management of Command File Errors

    Science.gov (United States)

    Meshkat, Leila

    2012-01-01

    The main cause of commanding errors is often (but not always) procedural: either a lack of maturity in the processes, incomplete requirements, or a lack of compliance with these procedures. Other causes of commanding errors include lack of understanding of system states, inadequate communication, and making hasty changes in standard procedures in response to an unexpected event. In general, it is important to look at the big picture prior to making corrective actions. In the case of errors traced back to procedures, considering the reliability of the process as a metric during its design may help to reduce risk. This metric is obtained by using data from the nuclear industry regarding human reliability. A structured method for the collection of anomaly data will help the operator think systematically about the anomaly and facilitate risk management. Formal models can be used for risk-based design and risk management. A generic set of models can be customized for a broad range of missions.

  10. Habitual sleep duration and insomnia and the risk of cardiovascular events and all-cause death: report from a community-based cohort.

    Science.gov (United States)

    Chien, Kuo-Liong; Chen, Pei-Chung; Hsu, Hsiu-Ching; Su, Ta-Chen; Sung, Fung-Chang; Chen, Ming-Fong; Lee, Yuan-Teh

    2010-02-01

    To investigate the relationship between sleep duration and insomnia severity and the risk of all-cause death and cardiovascular disease (CVD) events. Prospective cohort study. Community-based. A total of 3,430 adults aged 35 years or older. None. During a median 15.9-year (interquartile range, 13.1 to 16.9) follow-up period, 420 cases developed cardiovascular disease and 901 cases died. A U-shaped association between sleep duration and all-cause death was found: the age- and gender-adjusted relative risks (95% confidence interval [CI]) of all-cause death (with 7 h of daily sleep considered as the reference group) for individuals reporting ≤ 5 h, 6 h, 8 h, and ≥ 9 h were 1.15 (0.91-1.45), 1.02 (0.85-1.25), 1.05 (0.88-1.27), and 1.43 (1.16-1.75); P for trend, 0.019. However, the relationship between sleep duration and risk of CVD was linear. The multivariate-adjusted relative risks (95% CI) for all-cause death (with individuals without insomnia as the reference) were 1.02 (0.86-1.20) for occasional insomnia, 1.15 (0.92-1.42) for frequent insomnia, and 1.70 (1.16-2.49) for nearly everyday insomnia (P for trend, 0.028). The multivariate-adjusted relative risk (95% CI) was 2.53 (1.71-3.76) for all-cause death and 2.07 (1.11-3.85) for CVD events in participants sleeping ≥ 9 h and with frequent insomnia. Sleep duration and insomnia severity were associated with all-cause death and CVD events among ethnic Chinese in Taiwan. Our data indicate that an optimal sleep duration (7-8 h) predicted fewer deaths.

  11. Developing future precipitation events from historic events: An Amsterdam case study.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2016-04-01

    Due to climate change, the frequency and intensity of extreme precipitation events is expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers in order to develop efficient adaptation strategies to reduce the risks from extreme weather events. Current approaches to tailoring climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern to policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis and modifies the observed data for these events so that the outcome shows how the same event would occur in a warmer climate. This concept is introduced as 'Future Weather' and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event, based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands in the summer of 2014, and resulted in blocked highways, disruption of air transportation, and flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a 5- to 15-year return period. The method of projecting a future event follows a non-linear delta transformation that is applied directly to the observed event, assuming a warmer climate, to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of precipitation intensity as a function of the dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, where the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to indicate a warmer climate. The two
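
    A toy version of a delta transformation of this kind, assuming a uniform Clausius-Clapeyron-like scaling of roughly 7% per degree of dew point warming, is shown below. The study's transformation is non-linear in the sense that the scaling rate itself depends on intensity and dew point, so the constant rate, the +2 K warming, and the sample series here are simplifying assumptions.

        import numpy as np

        def future_event(observed_mm, delta_td_K, rate_per_K=0.07):
            """Scale an observed precipitation series to a warmer climate."""
            return observed_mm * (1.0 + rate_per_K) ** delta_td_K

        observed = np.array([2.0, 11.5, 26.0, 8.2, 0.4])  # mm per interval (illustrative)
        print(future_event(observed, delta_td_K=2.0))     # the same event, ~2 K warmer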

  12. A risk based model supporting long term maintenance and reinvestment strategy decision making

    Energy Technology Data Exchange (ETDEWEB)

    Sand, Kjell; Montard, Julien; Tremoen, Tord H.

    2010-02-15

    This Technical Report is a product of the project Risk-Based Distribution System Asset Management (short: RISK DSAM) - Work Package 3, Risk exposure on company/strategic level. In the report, a concept for portfolio distribution system asset management is presented. The approach comprises four main steps: 1. Decide the asset base. 2. Divide the asset base into relevant archetypes. 3. Develop or select relevant maintenance and reinvestment strategies for the different archetypes. 4. Estimate risks and costs for each archetype for the relevant strategies. Guidelines are given for the different steps, and a proposal for implementing the concept is presented in terms of a proposed IT system architecture. To evaluate the feasibility of such a concept, a prototype was developed using Visual Basic macros in Excel with real technical data from a small DSO. The experience from using the prototype shows that the concept is realistic. All assets are included, and depending on the ambition of the risk analysis, both simple and more advanced simulation models might be embedded. Presentations of the concept to utility engineers have received positive feedback, indicating that the concept is regarded as a practical way to develop risk-based asset management strategies for the asset fleet. It should be noted that the concept should be applied on a company strategic level and is thus not designed to be applied to specific project or asset decisions. For these, more detailed models with area-specific information, topology, etc. are needed. (Author)

  13. Modeling Psychological Contract Violation using Dual Regime Models: An Event-based Approach

    Directory of Open Access Journals (Sweden)

    Joeri Hofmans

    2017-11-01

    Full Text Available A good understanding of the dynamics of psychological contract violation requires theories, research methods and statistical models that explicitly recognize that violation feelings follow from an event that violates one's acceptance limits, after which interpretative processes are set into motion, determining the intensity of these violation feelings. Whereas theories—in the form of the dynamic model of the psychological contract—and research methods—in the form of daily diary research and experience sampling research—are available by now, the statistical tools to model such a two-stage process are still lacking. The aim of the present paper is to fill this gap in the literature by introducing two statistical models—the Zero-Inflated model and the Hurdle model—that closely mimic the theoretical process underlying the elicitation of violation feelings via two model components: a binary distribution that models whether violation has occurred or not, and a count distribution that models how severe the negative impact is. Moreover, covariates can be included for both model components separately, which yields insight into their unique and shared antecedents. By doing this, the present paper offers a methodological-substantive synergy, showing how sophisticated methodology can be used to examine an important substantive issue.

  14. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
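
    A stripped-down sketch of the adjustment idea: estimate a daily VaR quantile from P&L data, then scale it toward a benchmark model's quantile, the gap being the model-risk capital add-on. The simulated P&L and the benchmark ratio are assumptions standing in for the paper's benchmark distribution of quantile adjustments.

        import numpy as np

        rng = np.random.default_rng(2)
        pnl = rng.standard_t(df=4, size=1_000) * 1.0e5  # hypothetical daily P&L

        alpha = 0.01
        var_model = -np.quantile(pnl, alpha)            # 99% empirical VaR
        benchmark_ratio = 1.15                          # assumed benchmark/model quantile ratio
        var_adjusted = var_model * benchmark_ratio      # model-risk-adjusted VaR
        print(f"VaR: {var_model:,.0f}, adjusted: {var_adjusted:,.0f}, "
              f"capital add-on: {var_adjusted - var_model:,.0f}")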

  15. Quantitative risk trends deriving from PSA-based event analyses. Analysis of results from U.S.NRC's accident sequence precursor program

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2004-01-01

    The United States Nuclear Regulatory Commission (U.S.NRC) has been carrying out the Accident Sequence Precursor (ASP) Program to identify and categorize precursors to potential severe core damage accident sequences using the probabilistic safety assessment (PSA) technique. The ASP Program has identified many risk-significant events as precursors that occurred at U.S. nuclear power plants. Although the results from the ASP Program include valuable information that could be useful for obtaining and characterizing risk-significant insights and for monitoring risk trends in the nuclear power industry, there have been only a few attempts to determine and develop the trends using the ASP results. The present study examines and discusses quantitative risk trends at the industry level, using two indicators derived from the results of the ASP analyses: the occurrence frequency of precursors and the annual core damage probability. It is shown that the core damage risk at U.S. nuclear power plants has been lowered and that the likelihood of risk-significant events has been decreasing remarkably. As well, the present study demonstrates that the two risk indicators used here can provide quantitative information useful for examining and monitoring the risk trends and/or risk characteristics in the nuclear power industry. (author)

  16. Association of anemia with the risk of cardiovascular adverse events in overweight/obese patients

    DEFF Research Database (Denmark)

    Winther, S. A.; Finer, N.; Sharma, A. M.

    2014-01-01

    Objective: Anemia is associated with increased cardiovascular risks. Obesity may cause anemia in several ways, for example, by low-grade inflammation and a relative iron deficit. The outcomes associated with anemia in overweight/obese patients at high cardiovascular risk are, however, not known. Therefore, we investigated the cardiovascular prognosis in overweight/obese subjects with anemia. Methods: A total of 9,687 overweight/obese cardiovascular high-risk patients from the Sibutramine Cardiovascular OUTcomes trial were studied. Patients were stratified by baseline hemoglobin level and followed for the risk of a primary event (comprising nonfatal myocardial infarction, nonfatal stroke, resuscitated cardiac arrest or cardiovascular death) and all-cause mortality. Risk estimates (hazard ratios (HR) with 95% confidence intervals (CI)) were calculated using Cox regression models. Results: Anemia

  17. Escherichia coli pollution in a Baltic Sea lagoon: a model-based source and spatial risk assessment.

    Science.gov (United States)

    Schippmann, Bianca; Schernewski, Gerald; Gräwe, Ulf

    2013-07-01

    Tourism around the Oder (Szczecin) Lagoon, on the southern Baltic coast, has a long tradition, is an important source of income and shall be further developed. Insufficient bathing water quality and frequent beach closings, especially in the Oder river mouth, hamper tourism development. Monitoring data give only an incomplete picture of Escherichia coli (E. coli) bacteria sources, spatial transport patterns and risks, and support neither efficient bathing water quality management nor decision making. We apply a 3D ocean model and a Lagrangian particle tracking model to analyse pollution events and to obtain spatial E. coli pollution maps based on scenario simulations. Model results suggest that insufficient sewage treatment in the city of Szczecin is the major source of faecal pollution, even for beaches 20 km downstream. The E. coli mortality rate and emission intensity are key parameters for concentration levels downstream. Wind and river discharge play a modifying role. Prevailing southwestern wind conditions cause E. coli transport along the eastern coast and favour high concentration levels at the beaches. Our simulations indicate that the beach closings in 2006 would not have been necessary according to the new EU Bathing Water Quality Directive (2006/7/EC). The implementation of the new directive will, very likely, reduce the number of beach closings, but not the risk for summer tourists. Model results suggest that full sewage treatment in Szczecin would allow the establishment of new beaches closer to the city (north of Dabie lake). Copyright © 2013 Elsevier GmbH. All rights reserved.
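
    In simulations of this kind, each Lagrangian particle typically carries a first-order decay of its E. coli load, with the mortality rate often parameterized through a T90 (the time for a 90% die-off). The sketch below shows that decay in isolation; the T90 value and source concentration are illustrative assumptions, not the study's calibration.

        import numpy as np

        def surviving_fraction(t_hours, t90_hours):
            """First-order decay exp(-k t) with k = ln(10) / T90."""
            k = np.log(10.0) / t90_hours
            return np.exp(-k * t_hours)

        c0 = 1.0e4                # CFU per 100 ml at the source (hypothetical)
        for t in (6, 24, 48):     # hours of transport toward a beach
            print(f"{t:>2} h: {c0 * surviving_fraction(t, t90_hours=24.0):,.0f} CFU/100 ml")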

  18. Radiation risk estimation based on measurement error models

    CERN Document Server

    Masiuk, Sergii; Shklyar, Sergiy; Chepurny, Mykola; Likhtarov, Illya

    2017-01-01

    This monograph discusses statistics and risk estimates applied to radiation damage under the presence of measurement errors. The first part covers nonlinear measurement error models, with a particular emphasis on efficiency of regression parameter estimators. In the second part, risk estimation in models with measurement errors is considered. Efficiency of the methods presented is verified using data from radio-epidemiological studies.

  19. THE USE OF KNOWLEDGE MANAGEMENT SYSTEMS AND EVENT-B MODELLING IN A LEAN ENTERPRISE

    Directory of Open Access Journals (Sweden)

    Ladislav Buřita

    2018-03-01

    Full Text Available This paper provides a case study describing an approach to improving the efficiency of an information system (IS) by supporting processes outside the IS, using an ontology-driven knowledge management system (KMS) as a mini-application in the area of the so-called lean enterprise. A lean enterprise is focused on creating maximum value for final customers while eliminating all kinds of waste and unnecessary costs, which significantly helps to increase the level of its competitiveness. It is about managerial decision-making, which can in some cases be contradictory (solving a local problem can cause a problem in another place). In this paper, we describe the KMS ATOM, which supports the innovation process in a lean enterprise. We show how the risk of wrong decisions due to contradictory effects can be eliminated by implementing a safety-critical system into the traditional IS. Our model is supported by Event-B modelling, a refinement-based formal modelling method which is successfully used in important areas such as infrastructure, medicine, nuclear engineering and transportation (fire alarm systems, robotic surgery machines, braking systems in transportation, etc.). Nowadays, Event-B modelling is starting to be used for various management decision-making activities, and it is becoming a powerful competitiveness tool. This paper introduces a simple example of how Event-B modelling and its proof obligations can help improve and automate the decision-making process by eliminating potential threats of inefficient decisions.

  20. Performance of joint modelling of time-to-event data with time-dependent predictors: an assessment based on transition to psychosis data

    Directory of Open Access Journals (Sweden)

    Hok Pan Yuen

    2016-10-01

    Full Text Available Joint modelling has emerged as a potential tool to analyse data with a time-to-event outcome and longitudinal measurements collected over a series of time points. Joint modelling involves the simultaneous modelling of the two components, namely the time-to-event component and the longitudinal component. The main challenges of joint modelling are its mathematical and computational complexity. Recent advances in joint modelling have seen the emergence of several software packages which have implemented some of the computational requirements to run joint models. These packages have opened the door for more routine use of joint modelling. Through simulations and real data based on transition-to-psychosis research, we compared joint model analysis of a time-to-event outcome with conventional Cox regression analysis. We also compared a number of packages for fitting joint models. Our results suggest that joint modelling does have advantages over conventional analysis despite its potential complexity. Our results also suggest that the results of analyses may depend on how the methodology is implemented.

  1. Development of innovative methods for risk assessment in high-rise construction based on clustering of risk factors

    Science.gov (United States)

    Okolelova, Ella; Shibaeva, Marina; Shalnev, Oleg

    2018-03-01

    The article analyses risks in high-rise construction in terms of investment value, taking into account the maximum probable loss in case of a risk event. The authors scrutinized the risks of high-rise construction in regions with various geographic, climatic and socio-economic conditions that may influence the project environment. Risk classification is presented in general terms, including aggregated characteristics of risks common to many regions. Cluster analysis tools, which allow considering generalized groups of risk depending on their qualitative and quantitative features, were used to model the influence of the risk factors on the implementation of the investment project. For convenience of further calculations, each type of risk is assigned a separate code containing the number of the cluster and the subtype of risk. This approach to coding risk factors makes it possible to build a risk matrix, which greatly facilitates the task of determining the degree of impact of each risk. The authors clarified and expanded the concept of the price of risk, defined as the expected value of the event, which extends the capabilities of the model and allows estimating an interval for the probability of occurrence as well as using other probabilistic methods of calculation.
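
    As an illustration of the clustering-plus-coding idea described above (not the study's actual data or tooling), the following sketch groups invented risk factors by their quantitative features with scikit-learn and prices each one as an expected loss:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # rows: risk factors; columns: probability of occurrence, maximum probable loss (M$)
    risks = np.array([
        [0.30,  2.0],   # construction delay
        [0.20,  5.0],   # currency fluctuation
        [0.05, 40.0],   # seismic event
        [0.02, 60.0],   # major design failure
    ])

    features = StandardScaler().fit_transform(risks)        # put both features on one scale
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)

    for i, ((p, loss), cluster) in enumerate(zip(risks, labels)):
        code = f"C{cluster}.{i}"        # cluster number + risk subtype, as in the risk matrix
        expected_loss = p * loss        # "price of risk" as an expected value
        print(f"{code}: E[loss] = {expected_loss:.2f} M$")
    ```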

  2. Extreme weather events in southern Germany - Climatological risk and development of a large-scale identification procedure

    Science.gov (United States)

    Matthies, A.; Leckebusch, G. C.; Rohlfing, G.; Ulbrich, U.

    2009-04-01

    Extreme weather events such as thunderstorms, hail and heavy rain or snowfall can pose a threat to human life and to considerable tangible assets. Yet there is a lack of knowledge about the present-day climatological risk, its economic effects, and its changes due to rising greenhouse gas concentrations. Therefore, parts of the economy particularly sensitive to extreme weather events, such as insurance companies and airports, require regional risk analyses, early warning and prediction systems to cope with such events. Such an attempt is made for southern Germany, in close cooperation with stakeholders. Comparing ERA40 and station data with impact records of Munich Re and Munich Airport, the 90th percentile was found to be a suitable threshold for extreme, impact-relevant precipitation events. Different methods for the classification of the causing synoptic situations have been tested on ERA40 reanalyses. An objective scheme for the classification of Lamb's circulation weather types (CWTs) proved most suitable for correct classification of the large-scale flow conditions. Certain CWTs turned out to be prone to heavy precipitation or, conversely, to carry a very low risk of such events. Other large-scale parameters are tested in connection with CWTs to find the combination with the highest skill in identifying extreme precipitation events in climate model data (ECHAM5 and CLM). For example, vorticity advection at 700 hPa shows good results, but assumes knowledge of regional orographic particularities. Therefore, ongoing work focuses on additional testing of parameters that indicate deviations from a basic state of the atmosphere, like the Eady growth rate or the newly developed Dynamic State Index. Evaluation results will be used to estimate the skill of the regional climate model CLM concerning the simulation of frequency and intensity of the extreme weather events. Data of the A1B scenario (2000-2050) will be examined for a possible climate change
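
    The thresholding step mentioned above is simple to state in code. A minimal sketch on a synthetic daily precipitation series (all values invented) follows; real applications would work per station and often restrict the percentile to wet days:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    precip = rng.gamma(shape=0.4, scale=6.0, size=365 * 40)   # synthetic daily series (mm)

    p90 = np.percentile(precip, 90)          # station-specific extreme-event threshold
    extreme_days = precip > p90
    print(f"threshold = {p90:.1f} mm, {extreme_days.sum()} days flagged as extreme")
    ```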

  3. WRF-based fire risk modelling and evaluation for years 2010 and 2012 in Poland

    Science.gov (United States)

    Stec, Magdalena; Szymanowski, Mariusz; Kryza, Maciej

    2016-04-01

    Wildfires are one of the main disturbances of forested, seminatural and agricultural areas. They generate significant economic loss, especially in forest management and agriculture. Forest fire risk modelling is therefore essential, e.g. for the forestry administration. In August 2015, a new method of forest fire risk forecasting entered into force in Poland. The method predicts the fire risk level on a 4-degree scale (0 - no risk, 3 - highest risk) and consists of a set of linearized regression equations. Meteorological information is used as predictors in the regression equations, namely air temperature, relative humidity, average wind speed, cloudiness and rainfall. The equations also include pine litter humidity as a measure of potential fuel characteristics. All these parameters are measured routinely in Poland at 42 basic and 94 auxiliary sites. The fire risk level is estimated for the current day (based on morning measurements) or the next day (based on midday measurements). The entire country is divided into 42 prognostic zones, and the fire risk level for each zone is taken from the closest measuring site. The first goal of this work is to assess whether the measurements needed for fire risk forecasting may be replaced by data from a mesoscale meteorological model. Additionally, the use of a meteorological model would allow a much more realistic spatial differentiation of the weather elements determining the fire risk level to be taken into account, instead of discrete point measurements. Meteorological data have been calculated using the Weather Research and Forecasting model (WRF). For the purpose of this study, the WRF model is run in reanalysis mode, allowing all required meteorological data to be estimated on a 5-kilometre grid. The only parameter that cannot be directly calculated using WRF is the litter humidity, which has been estimated using an empirical formula developed by Sakowska (2007). The experiments are carried out for two selected years: 2010 and 2012. The

  4. Construction and updating of event models in auditory event processing.

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-02-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose that changes in the sensory information trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event boundaries. Evidence from reading-time studies (increased reading times with increasing amount of change) suggests that updating of event models is incremental. We present results from 5 experiments that studied event processing (including memory formation processes and reading times) using an audio drama as well as a transcript thereof as stimulus material. Experiments 1a and 1b replicated the event boundary advantage effect for memory. In contrast to recent evidence from studies using visual stimulus material, Experiments 2a and 2b found no support for incremental updating with normally sighted and blind participants for recognition memory. In Experiment 3, we replicated Experiment 2a using a written transcript of the audio drama as stimulus material, allowing us to disentangle encoding and retrieval processes. Our results indicate incremental updating processes at encoding (as measured with reading times). At the same time, we again found recognition performance to be unaffected by the amount of change. We discuss these findings in light of current event cognition theories. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  5. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    Directory of Open Access Journals (Sweden)

    Moiz Mumtaz

    2012-01-01

    Full Text Available Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose-response assessment, and risk characterization. This process is increasingly adopting "in silico" tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application—health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR-funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The "human PBPK model toolkit" is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures.
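
    To make the notion of an internal dose concrete, here is a deliberately tiny sketch of the kind of kinetics a PBPK model resolves. A real PBPK model couples many organ compartments through blood flows and partition coefficients; this is a single well-stirred compartment with first-order absorption and elimination, and all rate constants are invented:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp, trapezoid

    ka, ke = 1.0, 0.2    # absorption and elimination rate constants (1/h), invented

    def pk(t, y):
        gut, plasma = y
        return [-ka * gut, ka * gut - ke * plasma]

    # 100 units deposited in the gut at t = 0, followed for 24 h
    sol = solve_ivp(pk, (0.0, 24.0), y0=[100.0, 0.0], dense_output=True)
    t = np.linspace(0.0, 24.0, 97)
    auc = trapezoid(sol.sol(t)[1], t)    # AUC of the plasma curve as an internal-dose metric
    print(f"plasma AUC over 24 h ≈ {auc:.1f} (amount·h)")
    ```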

  6. Modeling the recurrent failure to thrive in less than two-year children: recurrent events survival analysis.

    Science.gov (United States)

    Saki Malehi, Amal; Hajizadeh, Ebrahim; Ahmadi, Kambiz; Kholdi, Nahid

    2014-01-01

    This study aimed to evaluate recurrent failure to thrive (FTT) events over time. This longitudinal study was conducted from February 2007 to July 2009. The primary outcome was growth failure. The analysis used data on 1283 children who had experienced FTT several times, based on recurrent events analysis. Fifty-nine percent of the children had experienced FTT at least once, and 5.3% of them had experienced it up to four times. The Prentice-Williams-Peterson (PWP) model revealed significant relationships between diarrhea (HR=1.26), respiratory infections (HR=1.25), urinary tract infections (HR=1.51), discontinuation of breast-feeding (HR=1.96), teething (HR=1.18), and initiation age of complementary feeding (HR=1.11) and the hazard rate of the first FTT event. The recurrent nature of FTT is a central issue; taking it into account increases the accuracy of the analysis of the FTT event process and can help identify different risk factors for each FTT recurrence.
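
    A PWP-type analysis of the kind reported above can be sketched as a Cox model on gap times, stratified by the episode number. The following illustration (not the authors' code) uses the Python lifelines library on synthetic data with one built-in covariate effect:

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(3)
    rows = []
    for child in range(300):
        diarrhea = rng.integers(0, 2)
        for episode in (1, 2, 3):                       # up to three FTT episodes
            hazard = 0.10 * (1.3 if diarrhea else 1.0)  # built-in HR ≈ 1.3 for diarrhea
            gap = rng.exponential(1.0 / hazard)         # months to the next episode
            observed = gap < 12.0                       # censor gaps longer than a year
            rows.append((min(gap, 12.0), int(observed), episode, diarrhea))
            if not observed:
                break

    df = pd.DataFrame(rows, columns=["gap_time", "event", "episode", "diarrhea"])
    cph = CoxPHFitter()
    cph.fit(df, duration_col="gap_time", event_col="event", strata=["episode"])
    cph.print_summary()   # recovers a hazard ratio near the built-in 1.3
    ```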

  7. Modeling the acute health effects of astronauts from exposure to large solar particle events.

    Science.gov (United States)

    Hu, Shaowen; Kim, Myung-Hee Y; McClellan, Gene E; Cucinotta, Francis A

    2009-04-01

    Radiation exposure from Solar Particle Events (SPE) presents a significant health concern for astronauts on exploration missions outside the protection of the Earth's magnetic field, as it could impair their performance and possibly result in failure of the mission. Assessing the potential for early radiation effects under such adverse conditions is of prime importance. Here we apply a biologically based mathematical model that describes the dose- and time-dependent early human responses that constitute the prodromal syndromes to consider acute risks from SPEs. We examine the possible early effects on crews from exposure to some historically large solar events on lunar and/or Mars missions. The doses and dose rates of specific organs were calculated using the Baryon radiation transport (BRYNTRN) code and a computerized anatomical man model, while the hazard of the early radiation effects and performance reduction were calculated using the Radiation-Induced Performance Decrement (RIPD) code. Based on model assumptions we show that exposure to these historical events would cause moderate early health effects to crew members inside a typical spacecraft or during extra-vehicular activities, if effective shielding and medical countermeasures were not provided. We also calculate possibly even worse cases (double intensity, multiple occurrences in a short period of time, etc.) to estimate the severity, onset and duration of various types of early illness. Uncertainties in the calculation due to limited data on relative biological effectiveness and dose-rate modifying factors for protons and secondary radiation, and the identification of sensitive sites in critical organs, are discussed.

  8. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    Science.gov (United States)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, an organized evidence base for assessing in-flight crew health risk. The database is a relational database accessible to many people. It quantifies the model inputs by ranking the data with a Level of Evidence (LOE) and a Quality of Evidence (QOE) score that together provide an assessment of the evidence base for each medical condition. The IMM evidence base has already provided invaluable information for designers, and for other uses.

  9. Preventing Medication Error Based on Knowledge Management Against Adverse Event

    OpenAIRE

    Hastuti, Apriyani Puji; Nursalam, Nursalam; Triharini, Mira

    2017-01-01

    Introduction: Medication error is one of many types of errors that can decrease the quality and safety of healthcare. An increasing number of adverse events (AE) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. This model is expected to improve the knowledge and skill of nurses to prevent medication error, which is characterized by a decrease in adverse events (AE). Methods: This study consisted of two sta...

  10. Modeling operational risks of the nuclear industry with Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Wieland, Patricia [Pontificia Univ. Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil). Dept. de Engenharia Industrial; Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)], e-mail: pwieland@cnen.gov.br; Lustosa, Leonardo J. [Pontificia Univ. Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil). Dept. de Engenharia Industrial], e-mail: ljl@puc-rio.br

    2009-07-01

    Basically, planning a new industrial plant requires information on the industrial management, regulations, site selection, definition of initial and planned capacity, and the estimation of potential demand. However, this is far from enough to assure the success of an industrial enterprise. Unexpected and extremely damaging events may occur that deviate from the original plan. The so-called operational risks lie not only in system, equipment, process or human (technical or managerial) failures. They also lie in intentional events such as fraud and sabotage, in extreme events like terrorist attacks or radiological accidents, and even in public reaction to perceived environmental or future-generation impacts. For the nuclear industry, it is a challenge to identify and to assess the operational risks and their various sources. Early identification of operational risks can help in preparing contingency plans and in delaying the decision to invest in or to approve a project that can, at an extreme, affect the public perception of nuclear energy. A major problem in modeling operational risk losses is the lack of internal data that are essential, for example, to apply the loss distribution approach. As an alternative, methods that consider qualitative and subjective information can be applied, for example, fuzzy logic, neural networks, system dynamics or Bayesian networks. An advantage of applying Bayesian networks to model operational risk is the possibility to include expert opinions and variables of interest, to structure the model via causal dependencies among these variables, and to specify subjective prior and conditional probability distributions at each step or network node. This paper suggests a classification of operational risks in industry and discusses the benefits and obstacles of the Bayesian network approach to modeling those risks. (author)
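
    As a concrete, minimal illustration of the approach the authors advocate (the paper names no software; pgmpy is just one option), the sketch below builds a toy operational-risk network with expert-style conditional probability tables and queries it. The structure and all numbers are invented:

    ```python
    # Depending on the pgmpy version, this class may be named BayesianModel
    # (older) or DiscreteBayesianNetwork (newer).
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    model = BayesianNetwork([("Sabotage", "Incident"), ("EquipmentFailure", "Incident")])

    # Expert-elicited marginals and conditionals (state 0 = no, state 1 = yes)
    cpd_s = TabularCPD("Sabotage", 2, [[0.99], [0.01]])
    cpd_e = TabularCPD("EquipmentFailure", 2, [[0.95], [0.05]])
    cpd_i = TabularCPD(
        "Incident", 2,
        # columns enumerate (Sabotage, EquipmentFailure) parent states
        [[0.999, 0.7, 0.8, 0.3],     # Incident = no
         [0.001, 0.3, 0.2, 0.7]],    # Incident = yes
        evidence=["Sabotage", "EquipmentFailure"], evidence_card=[2, 2],
    )
    model.add_cpds(cpd_s, cpd_e, cpd_i)
    assert model.check_model()

    infer = VariableElimination(model)
    print(infer.query(["Incident"], evidence={"Sabotage": 1}))
    ```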

  11. Modeling operational risks of the nuclear industry with Bayesian networks

    International Nuclear Information System (INIS)

    Wieland, Patricia; Lustosa, Leonardo J.

    2009-01-01

    Basically, planning a new industrial plant requires information on the industrial management, regulations, site selection, definition of initial and planned capacity, and the estimation of potential demand. However, this is far from enough to assure the success of an industrial enterprise. Unexpected and extremely damaging events may occur that deviate from the original plan. The so-called operational risks lie not only in system, equipment, process or human (technical or managerial) failures. They also lie in intentional events such as fraud and sabotage, in extreme events like terrorist attacks or radiological accidents, and even in public reaction to perceived environmental or future-generation impacts. For the nuclear industry, it is a challenge to identify and to assess the operational risks and their various sources. Early identification of operational risks can help in preparing contingency plans and in delaying the decision to invest in or to approve a project that can, at an extreme, affect the public perception of nuclear energy. A major problem in modeling operational risk losses is the lack of internal data that are essential, for example, to apply the loss distribution approach. As an alternative, methods that consider qualitative and subjective information can be applied, for example, fuzzy logic, neural networks, system dynamics or Bayesian networks. An advantage of applying Bayesian networks to model operational risk is the possibility to include expert opinions and variables of interest, to structure the model via causal dependencies among these variables, and to specify subjective prior and conditional probability distributions at each step or network node. This paper suggests a classification of operational risks in industry and discusses the benefits and obstacles of the Bayesian network approach to modeling those risks. (author)

  12. Revenue Risk Modelling and Assessment on BOT Highway Project

    Science.gov (United States)

    Novianti, T.; Setyawan, H. Y.

    2018-01-01

    An infrastructure project delivered through a public-private partnership under a BOT (Build-Operate-Transfer) arrangement, such as a highway, is risky. Therefore, assessment of risk factors is essential, as the project has a concession period and is influenced by macroeconomic factors. In this study, pre-construction risks of a highway were examined by using a Delphi method to create a space for offline expert discussions; a fault tree analysis to map the intuition of experts and to create a model from the underlying risk events; and fuzzy logic to interpret the linguistic data of the risk models. The losses of revenue from tariff risk, traffic volume risk, force majeure, and non-revenue events were then measured. The results showed that the loss of revenue caused by the tariff risk was 10.5% of the normal total revenue. The loss of revenue caused by the risk of traffic volume was 21.0% of total revenue. The loss of revenue caused by force majeure was 12.2% of the normal income. The loss of income caused by the non-revenue events was 6.9% of the normal revenue. It was also found that traffic volume was the major risk of a highway project because it relates to customer preferences.

  13. Software for occupational health and safety risk analysis based on a fuzzy model.

    Science.gov (United States)

    Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan

    2012-01-01

    Risk and safety management are very important issues in healthcare systems. These are complex systems with many entities, hazards and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyze different types of hazards in healthcare systems and introduce a new fuzzy model for evaluating and ranking hazards. Finally, we present a software solution, based on the suggested fuzzy model, for evaluating and monitoring risk.

  14. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  15. Blood pressure-lowering treatment strategies based on cardiovascular risk versus blood pressure: A meta-analysis of individual participant data.

    Science.gov (United States)

    Karmali, Kunal N; Lloyd-Jones, Donald M; van der Leeuw, Joep; Goff, David C; Yusuf, Salim; Zanchetti, Alberto; Glasziou, Paul; Jackson, Rodney; Woodward, Mark; Rodgers, Anthony; Neal, Bruce C; Berge, Eivind; Teo, Koon; Davis, Barry R; Chalmers, John; Pepine, Carl; Rahimi, Kazem; Sundström, Johan

    2018-03-01

    Clinical practice guidelines have traditionally recommended blood pressure treatment based primarily on blood pressure thresholds. In contrast, using predicted cardiovascular risk has been advocated as a more effective strategy to guide treatment decisions for cardiovascular disease (CVD) prevention. We aimed to compare outcomes from a blood pressure-lowering treatment strategy based on predicted cardiovascular risk with one based on systolic blood pressure (SBP) level. We used individual participant data from the Blood Pressure Lowering Treatment Trialists' Collaboration (BPLTTC) from 1995 to 2013. Trials randomly assigned participants to either blood pressure-lowering drugs versus placebo or more intensive versus less intensive blood pressure-lowering regimens. We estimated 5-y risk of CVD events using a multivariable Weibull model previously developed in this dataset. We compared the two strategies at specific SBP thresholds and across the spectrum of risk and blood pressure levels studied in BPLTTC trials. The primary outcome was number of CVD events avoided per persons treated. We included data from 11 trials (47,872 participants). During a median of 4.0 y of follow-up, 3,566 participants (7.5%) experienced a major cardiovascular event. Areas under the curve comparing the two treatment strategies throughout the range of possible thresholds for CVD risk and SBP demonstrated that, on average, a greater number of CVD events would be avoided for a given number of persons treated with the CVD risk strategy compared with the SBP strategy (area under the curve 0.71 [95% confidence interval (CI) 0.70-0.72] for the CVD risk strategy versus 0.54 [95% CI 0.53-0.55] for the SBP strategy). Compared with treating everyone with SBP ≥ 150 mmHg, a CVD risk strategy would require treatment of 29% (95% CI 26%-31%) fewer persons to prevent the same number of events or would prevent 16% (95% CI 14%-18%) more events for the same number of persons treated. Compared with treating
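
    The trade-off quantified above rests on simple number-needed-to-treat arithmetic: NNT is the reciprocal of the absolute risk reduction, so targeting treatment by predicted risk concentrates it where the absolute reduction is largest. A back-of-envelope sketch with illustrative inputs (not the trial data):

    ```python
    # NNT = 1 / (baseline risk x relative risk reduction). The risk strata and
    # the 25% relative risk reduction below are assumptions for illustration.
    def nnt(baseline_risk, relative_risk_reduction):
        return 1.0 / (baseline_risk * relative_risk_reduction)

    for risk in (0.05, 0.10, 0.20):          # hypothetical 5-y CVD risk strata
        print(f"5-y risk {risk:.0%}: NNT ≈ {nnt(risk, 0.25):.0f}")
    ```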

  16. Blood pressure-lowering treatment strategies based on cardiovascular risk versus blood pressure: A meta-analysis of individual participant data.

    Directory of Open Access Journals (Sweden)

    Kunal N Karmali

    2018-03-01

    Full Text Available Clinical practice guidelines have traditionally recommended blood pressure treatment based primarily on blood pressure thresholds. In contrast, using predicted cardiovascular risk has been advocated as a more effective strategy to guide treatment decisions for cardiovascular disease (CVD) prevention. We aimed to compare outcomes from a blood pressure-lowering treatment strategy based on predicted cardiovascular risk with one based on systolic blood pressure (SBP) level. We used individual participant data from the Blood Pressure Lowering Treatment Trialists' Collaboration (BPLTTC) from 1995 to 2013. Trials randomly assigned participants to either blood pressure-lowering drugs versus placebo or more intensive versus less intensive blood pressure-lowering regimens. We estimated 5-y risk of CVD events using a multivariable Weibull model previously developed in this dataset. We compared the two strategies at specific SBP thresholds and across the spectrum of risk and blood pressure levels studied in BPLTTC trials. The primary outcome was number of CVD events avoided per persons treated. We included data from 11 trials (47,872 participants). During a median of 4.0 y of follow-up, 3,566 participants (7.5%) experienced a major cardiovascular event. Areas under the curve comparing the two treatment strategies throughout the range of possible thresholds for CVD risk and SBP demonstrated that, on average, a greater number of CVD events would be avoided for a given number of persons treated with the CVD risk strategy compared with the SBP strategy (area under the curve 0.71 [95% confidence interval (CI) 0.70-0.72] for the CVD risk strategy versus 0.54 [95% CI 0.53-0.55] for the SBP strategy). Compared with treating everyone with SBP ≥ 150 mmHg, a CVD risk strategy would require treatment of 29% (95% CI 26%-31%) fewer persons to prevent the same number of events or would prevent 16% (95% CI 14%-18%) more events for the same number of persons treated. Compared with

  17. Comparative performance of diabetes-specific and general population-based cardiovascular risk assessment models in people with diabetes mellitus.

    Science.gov (United States)

    Echouffo-Tcheugui, J-B; Kengne, A P

    2013-10-01

    Multivariable models for estimating cardiovascular disease (CVD) risk in people with diabetes comprise general population-based models and those derived from diabetic cohorts. Whether one set of models should receive preference is unclear. We evaluated the evidence from direct comparisons of the performance of general population vs diabetes-specific CVD risk models in people with diabetes. MEDLINE and EMBASE databases were searched up to March 2013. Two reviewers independently identified studies that compared the performance of general CVD models vs diabetes-specific ones in the same group of people with diabetes. Independent, dual data extraction on study design, risk models, outcomes, and measures of performance was conducted. Eleven articles reporting on 22 pairwise comparisons of a diabetes-specific model (UKPDS, ADVANCE and DCS risk models) with a general population model (three variants of the Framingham model, the Prospective Cardiovascular Münster [PROCAM] score, CardioRisk Manager [CRM], the Joint British Societies Coronary Risk Chart [JBSRC], the Progetto Cuore algorithm and the CHD-Riskard algorithm) were eligible. Absolute differences in C-statistic of diabetes-specific vs general population-based models varied from -0.13 to 0.09. Comparisons on other performance measures were rare. Outcome definitions were congruent with those applied during model development. In 14 comparisons, the UKPDS, ADVANCE or DCS diabetes-specific models were superior to the general population CVD risk models. Authors reported better C-statistics for models they had developed themselves. The limited existing evidence suggests a possible discriminatory advantage of diabetes-specific over general population-based models for CVD risk stratification in diabetes. More robust head-to-head comparisons are needed to confirm this trend and strengthen recommendations. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  18. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of these phenomena. In these forecasts, the probability of the event over some lead time is estimated based on model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Given this requirement of reliable forecasts for effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier score, reliability diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and an unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and an unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided by the use of the Poisson-Binomial distribution.
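
    The core of the proposed verification idea is easy to reproduce: under reliability, the number of events observed across n forecasts with stated probabilities p1..pn follows a Poisson-Binomial distribution, whose pmf can be built by convolution and used for an exact test. A minimal sketch with invented forecast probabilities:

    ```python
    import numpy as np

    def poisson_binomial_pmf(probs):
        """pmf of the number of successes among independent Bernoulli trials."""
        pmf = np.array([1.0])
        for p in probs:
            pmf = np.convolve(pmf, [1.0 - p, p])   # add one trial at a time
        return pmf

    forecast_probs = [0.1, 0.3, 0.8, 0.55, 0.2, 0.9]   # issued event probabilities
    observed_events = 5                                 # events that actually occurred

    pmf = poisson_binomial_pmf(forecast_probs)
    # exact two-sided p-value: total mass of outcomes no more likely than observed
    p_value = pmf[pmf <= pmf[observed_events]].sum()
    print(f"P(K = {observed_events}) = {pmf[observed_events]:.4f}, p = {p_value:.4f}")
    ```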

  19. Probabilistic short-term risk modeling for back-end fuel cycle and waste management facilities

    International Nuclear Information System (INIS)

    Kjellbert, N.A.

    1980-03-01

    This study of probabilistic short-term risk modeling of back-end fuel cycle and waste management facilities represents the continuation of work started in 1977. The purpose of the report is to present a more detailed survey of models and analysis techniques that may be applicable. The definition of the risk concept and the nature of the facilities and events to be analyzed are described. The most important criteria are that the model or method shall be quantitative, logically/scientifically based, and able to handle systems of some complexity. Several formalized analysis methods are described, most of them emanating from reliability theory. No single model fulfills all criteria simultaneously to the degree desired. Nevertheless, fault tree analysis seems to be an efficient tool in many applications, although it must probably be used together with other models in most cases. Other methodologies described can also be useful, such as failure modes and effects analysis, renewal theory and Markov chains, reliability block diagrams, event trees and cause/consequence diagrams, the GO methodology, Monte Carlo simulation, and, often necessarily, various consequence modeling techniques. (author)
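
    To make the fault-tree option concrete, here is a toy sketch of the basic arithmetic: basic-event probabilities combined through AND/OR gates under an independence assumption. The tree and the numbers are invented for illustration:

    ```python
    def and_gate(*ps):                 # all inputs must fail
        out = 1.0
        for p in ps:
            out *= p
        return out

    def or_gate(*ps):                  # any single input failing is enough
        out = 1.0
        for p in ps:
            out *= 1.0 - p
        return 1.0 - out

    p_valve_leak = 1e-3                # hypothetical basic-event probabilities
    p_pump_a, p_pump_b = 5e-3, 5e-3

    p_cooling_lost = and_gate(p_pump_a, p_pump_b)        # redundant pumps
    p_top_event = or_gate(p_valve_leak, p_cooling_lost)  # either path reaches the top event
    print(f"P(top event) ≈ {p_top_event:.2e}")
    ```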

  20. Validated Competing Event Model for the Stage I-II Endometrial Cancer Population

    Energy Technology Data Exchange (ETDEWEB)

    Carmona, Ruben; Gulaya, Sachin; Murphy, James D. [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States); Rose, Brent S. [Harvard Radiation Oncology Program, Harvard Medical School, Boston, Massachusetts (United States); Wu, John; Noticewala, Sonal [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States); McHale, Michael T. [Department of Reproductive Medicine, Division of Gynecologic Oncology, University of California San Diego, La Jolla, California (United States); Yashar, Catheryn M. [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States); Vaida, Florin [Department of Family and Preventive Medicine, Biostatistics and Bioinformatics, University of California San Diego Medical Center, San Diego, California (United States); Mell, Loren K., E-mail: lmell@ucsd.edu [Department of Radiation Medicine and Applied Sciences, University of California San Diego, La Jolla, California (United States)

    2014-07-15

    Purpose/Objective(s): Early-stage endometrial cancer patients are at higher risk of noncancer mortality than of cancer mortality. Competing event models incorporating comorbidity could help identify women most likely to benefit from treatment intensification. Methods and Materials: 67,397 women with stage I-II endometrioid adenocarcinoma after total hysterectomy diagnosed from 1988 to 2009 were identified in Surveillance, Epidemiology, and End Results (SEER) and linked SEER-Medicare databases. Using demographic and clinical information, including comorbidity, we sought to develop and validate a risk score to predict the incidence of competing mortality. Results: In the validation cohort, increasing competing mortality risk score was associated with increased risk of noncancer mortality (subdistribution hazard ratio [SDHR], 1.92; 95% confidence interval [CI], 1.60-2.30) and decreased risk of endometrial cancer mortality (SDHR, 0.61; 95% CI, 0.55-0.78). Controlling for other variables, Charlson Comorbidity Index (CCI) = 1 (SDHR, 1.62; 95% CI, 1.45-1.82) and CCI >1 (SDHR, 3.31; 95% CI, 2.74-4.01) were associated with increased risk of noncancer mortality. The 10-year cumulative incidences of competing mortality within low-, medium-, and high-risk strata were 27.3% (95% CI, 25.2%-29.4%), 34.6% (95% CI, 32.5%-36.7%), and 50.3% (95% CI, 48.2%-52.6%), respectively. With increasing competing mortality risk score, we observed a significant decline in omega (ω), indicating a diminishing likelihood of benefit from treatment intensification. Conclusion: Comorbidity and other factors influence the risk of competing mortality among patients with early-stage endometrial cancer. Competing event models could improve our ability to identify patients likely to benefit from treatment intensification.
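
    Cumulative incidences of the kind reported above are typically estimated nonparametrically in the presence of competing events. The sketch below (not the authors' analysis) uses the Aalen-Johansen estimator from the Python lifelines library on synthetic data:

    ```python
    import numpy as np
    from lifelines import AalenJohansenFitter

    rng = np.random.default_rng(0)
    durations = rng.exponential(8.0, size=500)            # years of follow-up, synthetic
    # 0 = censored, 1 = endometrial cancer death, 2 = competing (noncancer) death;
    # the mixing proportions below are invented
    event_type = rng.choice([0, 1, 2], size=500, p=[0.5, 0.15, 0.35])

    ajf = AalenJohansenFitter()
    ajf.fit(durations, event_type, event_of_interest=2)
    print(ajf.cumulative_density_.tail())   # cumulative incidence of noncancer death
    ```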

  1. Validated Competing Event Model for the Stage I-II Endometrial Cancer Population

    International Nuclear Information System (INIS)

    Carmona, Ruben; Gulaya, Sachin; Murphy, James D.; Rose, Brent S.; Wu, John; Noticewala, Sonal; McHale, Michael T.; Yashar, Catheryn M.; Vaida, Florin; Mell, Loren K.

    2014-01-01

    Purpose/Objective(s): Early-stage endometrial cancer patients are at higher risk of noncancer mortality than of cancer mortality. Competing event models incorporating comorbidity could help identify women most likely to benefit from treatment intensification. Methods and Materials: 67,397 women with stage I-II endometrioid adenocarcinoma after total hysterectomy diagnosed from 1988 to 2009 were identified in Surveillance, Epidemiology, and End Results (SEER) and linked SEER-Medicare databases. Using demographic and clinical information, including comorbidity, we sought to develop and validate a risk score to predict the incidence of competing mortality. Results: In the validation cohort, increasing competing mortality risk score was associated with increased risk of noncancer mortality (subdistribution hazard ratio [SDHR], 1.92; 95% confidence interval [CI], 1.60-2.30) and decreased risk of endometrial cancer mortality (SDHR, 0.61; 95% CI, 0.55-0.78). Controlling for other variables, Charlson Comorbidity Index (CCI) = 1 (SDHR, 1.62; 95% CI, 1.45-1.82) and CCI >1 (SDHR, 3.31; 95% CI, 2.74-4.01) were associated with increased risk of noncancer mortality. The 10-year cumulative incidences of competing mortality within low-, medium-, and high-risk strata were 27.3% (95% CI, 25.2%-29.4%), 34.6% (95% CI, 32.5%-36.7%), and 50.3% (95% CI, 48.2%-52.6%), respectively. With increasing competing mortality risk score, we observed a significant decline in omega (ω), indicating a diminishing likelihood of benefit from treatment intensification. Conclusion: Comorbidity and other factors influence the risk of competing mortality among patients with early-stage endometrial cancer. Competing event models could improve our ability to identify patients likely to benefit from treatment intensification.

  2. Existential risks: exploring a robust risk reduction strategy.

    Science.gov (United States)

    Jebari, Karim

    2015-06-01

    A small but growing number of studies have aimed to understand, assess and reduce existential risks, or risks that threaten the continued existence of mankind. However, most attention has been focused on known and tangible risks. This paper proposes a heuristic for reducing the risk of black swan extinction events. These events are, as the name suggests, stochastic and unforeseen when they happen. Decision theory based on a fixed model of possible outcomes cannot properly deal with this kind of event. Neither can probabilistic risk analysis. This paper will argue that the approach referred to as engineering safety could be applied to reducing the risk from black swan extinction events. It will also propose a conceptual sketch of how such a strategy may be implemented: isolated, self-sufficient, and continuously manned underground refuges. Some characteristics of such refuges are also described, in particular the psychosocial aspects. Furthermore, it is argued that this implementation of the engineering safety strategy, based on safety barriers, would be effective and plausible and could reduce the risk of an extinction event in a wide range of possible (known and unknown) scenarios. Considering the staggering opportunity cost of an existential catastrophe, such strategies ought to be explored more vigorously.

  3. Credit Risk Modeling

    DEFF Research Database (Denmark)

    Lando, David

    Credit risk is today one of the most intensely studied topics in quantitative finance. This book provides an introduction and overview for readers who seek an up-to-date reference to the central problems of the field and to the tools currently used to analyze them. The book is aimed at researchers...... and students in finance, at quantitative analysts in banks and other financial institutions, and at regulators interested in the modeling aspects of credit risk. David Lando considers the two broad approaches to credit risk analysis: that based on classical option pricing models on the one hand...

  4. Conceptual Modeling of Events as Information Objects and Change Agents

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    as a totality of an information object and a change agent. When an event is modeled as an information object it is comparable to an entity that exists only at a specific point in time. It has attributes and can be used for querying and specification of constraints. When an event is modeled as a change agent...... it is comparable to an executable transaction schema. Finally, we briefly compare our approach to object-oriented approaches based on encapsulated objects....

  5. Diablo Canyon internal events PRA [Probabilistic Risk Assessment] review: Methodology and findings

    International Nuclear Information System (INIS)

    Fitzpatrick, R.G.; Bozoki, G.; Sabek, M.

    1990-01-01

    The review of the Diablo Canyon Probabilistic Risk Assessment (DCRPA) incorporated some new and innovative approaches. These were necessitated by the unprecedented size, scope and level of detail of the DCRPA, which was submitted to the NRC for licensing purposes. This paper outlines the elements of the internal events portion of the review citing selected findings to illustrate the various approaches employed. The paper also provides a description of the extensive and comprehensive importance analysis applied by BNL to the DCRPA model. Importance calculations included: top event/function level; individual split fractions; pair importances between frontline-support and support-support systems; system importance by initiator; and others. The paper concludes with a brief discussion of the effectiveness of the applied methodology. 3 refs., 5 tabs

  6. An interplay model for authorities' actions and rumor spreading in emergency event

    Science.gov (United States)

    Huo, Liang-an; Huang, Peiqing; Fang, Xing

    2011-10-01

    Rumor spreading influences how rational individuals assess risks and evaluate needs; in particular, it affects how authorities make decisions in emergency-affected environments. Conversely, authorities' response to an emergency will influence public opinion as well. In this paper, we present a simple model to describe the interplay between rumor spreading and authorities' actions in emergency situations, based on utility theory. Drawing on differential equations, we found that it is possible to minimize the negative social utility of rumor spreading by controlling the situation. At the same time, authorities' proactive actions can improve rumor management in emergency situations and yield positive social utility. Finally, we outline strategies for authorities that can contribute to rumor management in an emergency event.
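
    The abstract does not reproduce the model's equations, so the sketch below integrates a generic ignorant-spreader-stifler system of the same differential-equation flavor, with a single invented parameter u standing in for the dampening effect of authorities' actions:

    ```python
    from scipy.integrate import solve_ivp

    def rumor(t, y, beta, delta, u):
        i, s, r = y                          # ignorants, spreaders, stiflers (fractions)
        spread = (1.0 - u) * beta * i * s    # authorities' actions scale spreading down
        stifle = delta * s * (s + r)         # spreaders give up after meeting the informed
        return [-spread, spread - stifle, stifle]

    # all rates invented; u = 0.4 models moderately effective official communication
    sol = solve_ivp(rumor, (0.0, 50.0), [0.99, 0.01, 0.0], args=(0.8, 0.2, 0.4))
    print(f"final stifler fraction ≈ {sol.y[2, -1]:.2f}")
    ```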

  7. A coupled physical and economic model of the response of coastal real estate to climate risk

    Science.gov (United States)

    McNamara, Dylan E.; Keeler, Andrew

    2013-06-01

    Barring an unprecedented large-scale effort to raise island elevation, barrier-island communities common along the US East Coast are likely to eventually face inundation of the existing built environment on a timescale that depends on uncertain climatic forcing. Between the present and when a combination of sea-level rise and erosion renders these areas uninhabitable, communities must choose levels of defensive expenditures to reduce risks and individual residents must assess whether and when risk levels are unacceptably high to justify investment in housing. We model the dynamics of coastal adaptation as the interplay of underlying climatic risks, collective actions to mitigate those risks, and individual risk assessments based on beliefs in model predictions and processing of past climate events. Efforts linking physical and behavioural models to explore shoreline dynamics have not yet brought together this set of essential factors. We couple a barrier-island model with an agent-based model of real-estate markets to show that, relative to people with low belief in model predictions about climate change, informed property owners invest heavily in defensive expenditures in the near term and then abandon coastal real estate at some critical risk threshold that presages a period of significant price volatility.

  8. Are Masking-Based Models of Risk Useful?

    Science.gov (United States)

    Gisiner, Robert C

    2016-01-01

    As our understanding of directly observable effects from anthropogenic sound exposure has improved, concern about "unobservable" effects such as stress and masking has received greater attention. Equal-energy models of masking, such as power spectrum models, have the appeal of simplicity, but do they offer biologically realistic assessments of the risk of masking? Data relevant to masking, such as critical ratios, critical bandwidths, temporal resolution, and directional resolution, along with what is known about general mammalian antimasking mechanisms, all argue for a much more complicated view of masking when making decisions about the risk of masking inherent in a given anthropogenic sound exposure scenario.

  9. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...... exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Büchi automata.

  10. A technology path to tactical agent-based modeling

    Science.gov (United States)

    James, Alex; Hanratty, Timothy P.

    2017-05-01

    Wargaming is a process of thinking through and visualizing events that could occur during a possible course of action. Over the past 200 years, wargaming has matured into a set of formalized processes. One area of growing interest is the application of agent-based modeling. Agent-based modeling and its supporting technologies have the potential to introduce a third-generation wargaming capability to the Army, creating a positive overmatch in decision-making capability. In its simplest form, agent-based modeling is a computational technique that helps the modeler understand and simulate how the "whole of a system" responds to change over time. It provides a decentralized method of looking at situations in which individual agents are instantiated within an environment, interact with each other, and are empowered to make their own decisions. However, this technology is not without its own risks and limitations. This paper explores a technology roadmap, identifying research topics that could realize agent-based modeling within a tactical wargaming context.
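
    In its simplest form, the technique can be shown in a few lines: each agent observes its neighbours and updates its own state, and the whole-of-system response emerges from the loop. The sketch below is plain Python with invented rules, implying nothing about actual wargaming models:

    ```python
    import random

    random.seed(1)
    N, STEPS = 50, 20
    state = [random.choice([0, 1]) for _ in range(N)]   # e.g. hold (0) / advance (1)

    for _ in range(STEPS):
        nxt = []
        for i in range(N):
            neighbours = [state[(i - 1) % N], state[(i + 1) % N]]
            # each agent independently imitates the local majority, with a little noise
            majority = 1 if sum(neighbours) + state[i] >= 2 else 0
            nxt.append(majority if random.random() > 0.05 else 1 - majority)
        state = nxt

    print(f"fraction advancing after {STEPS} steps: {sum(state) / N:.2f}")
    ```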

  11. Understanding the NSAID related risk of vascular events

    NARCIS (Netherlands)

    Vonkeman, Harald Erwin; Brouwers, Jacobus R.B.J.; van de Laar, Mart A F J

    2006-01-01

    Concern is growing about an increased risk of thrombotic events (including myocardial infarction and stroke) during the use of non-steroidal anti-inflammatory drugs (NSAIDs), in particular the so called selective cyclo-oxygenase-2 (COX 2) inhibitors. Although clinical trials give conflicting results

  12. Job loss and broken partnerships: do the number of stressful life events influence the risk of ischemic heart disease in men?

    DEFF Research Database (Denmark)

    Kriegbaum, Margit; Christensen, Ulla; Lund, Rikke

    2008-01-01

    % confidence interval 1.02-1.85). We found no indication of a dose-response relationship between the number of events and the risk of IHD. CONCLUSION: In this study of middle-aged men, we found only weak support for an effect of psychosocial stress on IHD measured with register-based life events; we found that IHD... was associated with broken partnerships but not with job loss. We did not find that the risk of incident IHD varied with the number of these stressful life events.

  13. Breast cancer risks and risk prediction models.

    Science.gov (United States)

    Engel, Christoph; Fischer, Christine

    2015-02-01

    BRCA1/2 mutation carriers have a considerably increased risk of developing breast and ovarian cancer. The personalized clinical management of carriers and other at-risk individuals depends on precise knowledge of the cancer risks. In this report, we give an overview of the present literature on empirical cancer risks, and we describe risk prediction models that are currently used for individual risk assessment in clinical practice. Cancer risks show large variability between studies. Breast cancer risks are at 40-87% for BRCA1 mutation carriers and 18-88% for BRCA2 mutation carriers. For ovarian cancer, the risk estimates are in the range of 22-65% for BRCA1 and 10-35% for BRCA2. The contralateral breast cancer risk is high (10-year risk after first cancer 27% for BRCA1 and 19% for BRCA2). Risk prediction models have been proposed to provide more individualized risk prediction, using additional knowledge on family history, mode of inheritance of major genes, and other genetic and non-genetic risk factors. User-friendly software tools have been developed that serve as a basis for decision-making in family counseling units. In conclusion, further assessment of cancer risks and model validation is needed, ideally based on prospective cohort studies. To obtain such data, clinical management of carriers and other at-risk individuals should always be accompanied by standardized scientific documentation.

  14. Lessons learnt from past Flash Floods and Debris Flow events to propose future strategies on risk management

    Science.gov (United States)

    Cabello, Angels; Velasco, Marc; Escaler, Isabel

    2010-05-01

    Floods, including flash flood and debris flow events, are among the most important hazards in Europe in terms of both economic and life loss. Moreover, changes in precipitation patterns and intensity are very likely due to the observed and predicted global warming, raising the risk in areas that are already vulnerable to floods. Therefore, it is very important to develop new strategies to improve flood protection, but it is also crucial to take into account historical data to identify high-risk areas. The main objective of this paper is to present a comparative analysis of the flood risk management information compiled in four test-bed basins (Llobregat, Guadalhorce, Gardon d'Anduze and Linth basins) from three different European countries (Spain, France and Switzerland) and to identify the lessons learnt from their past experiences in order to propose future strategies for risk management. This work is part of the EU 7th FP project IMPRINTS, which aims at reducing loss of life and economic damage through the improvement of the preparedness and the operational risk management of flash flood and debris flow (FF & DF) events. The methodology followed includes these steps: a specific survey on the effectiveness of the implemented emergency plans and risk management procedures, sent to the test-bed basin authorities that participate in the project; analysis of the answers to the questionnaire and further research on their methodologies for risk evaluation; compilation of available follow-up studies carried out after major flood events in the four test-bed basins analyzed; collection of the lessons learnt through a comparative analysis of the previous information; and recommendations for future strategies on risk management based on the lessons learnt and the management gaps detected through the process. As the Floods Directive (FD) already states, the flood risks associated with FF & DF events should be assessed through the elaboration of Flood Risk

  15. Genetic risk, coronary heart disease events, and the clinical benefit of statin therapy: an analysis of primary and secondary prevention trials.

    Science.gov (United States)

    Mega, J L; Stitziel, N O; Smith, J G; Chasman, D I; Caulfield, M; Devlin, J J; Nordio, F; Hyde, C; Cannon, C P; Sacks, F; Poulter, N; Sever, P; Ridker, P M; Braunwald, E; Melander, O; Kathiresan, S; Sabatine, M S

    2015-06-06

    Genetic variants have been associated with the risk of coronary heart disease. In this study, we tested whether or not a composite of these variants could ascertain the risk of both incident and recurrent coronary heart disease events and identify those individuals who derive greater clinical benefit from statin therapy. A community-based cohort study (the Malmo Diet and Cancer Study) and four randomised controlled trials of both primary prevention (JUPITER and ASCOT) and secondary prevention (CARE and PROVE IT-TIMI 22) with statin therapy, comprising a total of 48,421 individuals and 3477 events, were included in these analyses. We studied the association of a genetic risk score based on 27 genetic variants with incident or recurrent coronary heart disease, adjusting for traditional clinical risk factors. We then investigated the relative and absolute risk reductions in coronary heart disease events with statin therapy stratified by genetic risk. We combined data from the different studies using a meta-analysis. When individuals were divided into low (quintile 1), intermediate (quintiles 2-4), and high (quintile 5) genetic risk categories, a significant gradient in risk for incident or recurrent coronary heart disease was shown. Compared with the low genetic risk category, the multivariable-adjusted hazard ratio for coronary heart disease was 1·34 (95% CI 1·22-1·47, p < 0·0001) for the intermediate genetic risk category and 1·72 (1·55-1·92, p < 0·0001) for the high genetic risk category. We noted a gradient of increasing relative risk reductions with statin therapy across the low, intermediate, and high genetic risk categories. Similarly, we noted greater absolute risk reductions in those individuals in higher genetic risk categories (p=0·0101), resulting in a roughly threefold decrease in the number needed to treat to prevent one coronary heart disease event in the primary prevention trials. Specifically, in the primary prevention trials, the number needed to treat to prevent one such event in 10 years was 66 in people at low genetic risk, 42 in those at intermediate genetic risk, and 25 in those at high

  16. Risk Level Based Management System: a control banding model for occupational health and safety risk management in a highly regulated environment

    Energy Technology Data Exchange (ETDEWEB)

    Zalk, D; Kamerzell, R; Paik, S; Kapp, J; Harrington, D; Swuste, P

    2009-05-27

    The Risk Level Based Management System (RLBMS) is an occupational risk management (ORM) model that focuses occupational safety, hygiene, and health (OSHH) resources on the highest-risk procedures at work. This article demonstrates the model's simplicity through an implementation within a heavily regulated research institution. The model utilizes control banding strategies with a stratification of four risk levels (RLs) for many commonly performed maintenance and support activities, characterizing risk consistently for comparable tasks. RLBMS creates an auditable tracking of activities, maximizes OSHH professionals' field time, and standardizes documentation and controls commensurate with a given task's RL. Validation of RLs and their exposure control effectiveness is collected in a traditional quantitative collection regime for regulatory auditing. However, qualitative risk assessment methods are also used within this validation process. Participatory approaches are used throughout the RLBMS process. Workers are involved in all phases of building, maintaining, and improving this model. This worker participation also improves the implementation of established controls.

  17. Evaluation of three physiologically based pharmacokinetic (PBPK) modeling tools for emergency risk assessment after acute dichloromethane exposure

    NARCIS (Netherlands)

    Boerleider, R. Z.; Olie, J. D N; van Eijkeren, J. C H; Bos, P. M J; Hof, B. G H; de Vries, I.; Bessems, J. G M; Meulenbelt, J.; Hunault, C. C.

    2015-01-01

    Introduction: Physiologically based pharmacokinetic (PBPK) models may be useful in emergency risk assessment after acute exposure to chemicals such as dichloromethane (DCM). We evaluated the applicability of three PBPK models for human risk assessment following a single exposure to DCM: one model

  18. Risk of Death in Infants Who Have Experienced a Brief Resolved Unexplained Event: A Meta-Analysis.

    Science.gov (United States)

    Brand, Donald A; Fazzari, Melissa J

    2018-06-01

    To estimate an upper bound on the risk of death after a brief resolved unexplained event (BRUE), a sudden alteration in an infant's breathing, color, tone, or responsiveness, previously labeled "apparent life-threatening event" (ALTE). The meta-analysis incorporated observational studies of patients with ALTE that included data on in-hospital and post-discharge deaths, with at least 1 week of follow-up after hospital discharge. Pertinent studies were identified from a published review of the literature from 1970 through 2014 and a supplementary PubMed query through February 2017. The 12 included studies (n = 3005) reported 12 deaths, of which 8 occurred within 4 months of the event. Applying a Poisson-normal random effects model to the 8 proximate deaths over a 4-month time horizon yielded a post-ALTE mortality rate of about 1 in 800, which constitutes an upper bound on the risk of death after a BRUE. This risk is about the same as the baseline risk of death during the first year of life. The meta-analysis therefore supports the return-home approach (not routine hospitalization) advocated in a recently published clinical practice guideline for BRUE patients who have been evaluated in the emergency department and determined to be at lower risk. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Advantages of a multi-state approach in surgical research: how intermediate events and risk factor profile affect the prognosis of a patient with locally advanced rectal cancer.

    Science.gov (United States)

    Manzini, G; Ettrich, T J; Kremer, M; Kornmann, M; Henne-Bruns, D; Eikema, D A; Schlattmann, P; de Wreede, L C

    2018-02-13

    Standard survival analysis fails to give insight into what happens to a patient after a first outcome event (like first relapse of a disease). Multi-state models are a useful tool for analyzing survival data when different treatments and results (intermediate events) can occur. Aim of this study was to implement a multi-state model on data of patients with rectal cancer to illustrate the advantages of multi-state analysis in comparison to standard survival analysis. We re-analyzed data from the RCT FOGT-2 study by using a multi-state model. Based on the results we defined a high and low risk reference patient. Using dynamic prediction, we estimated how the survival probability changes as more information about the clinical history of the patient becomes available. A patient with stage UICC IIIc (vs UICC II) has a higher risk to develop distant metastasis (DM) or both DM and local recurrence (LR) if he/she discontinues chemotherapy within 6 months or between 6 and 12 months, as well as after the completion of 12 months CTx with HR 3.55 (p = 0.026), 5.33 (p = 0.001) and 3.37 (p start of CTx, whereas for a low risk patient this is 79%. After the development of DM 1 year later, the high risk patient has an estimated 5-year survival probability of 11% and the low risk patient one of 21%. Multi-state models help to gain additional insight into the complex events after start of treatment. Dynamic prediction shows how survival probabilities change by progression of the clinical history.

  20. Estimated burden of cardiovascular disease and value-based price range for evolocumab in a high-risk, secondary-prevention population in the US payer context.

    Science.gov (United States)

    Toth, Peter P; Danese, Mark; Villa, Guillermo; Qian, Yi; Beaubrun, Anne; Lira, Armando; Jansen, Jeroen P

    2017-06-01

    To estimate real-world cardiovascular disease (CVD) burden and value-based price range of evolocumab for a US-context, high-risk, secondary-prevention population. Burden of CVD was assessed using the UK-based Clinical Practice Research Datalink (CPRD) in order to capture complete CV burden including CV mortality. Patients on standard of care (SOC; high-intensity statins) in CPRD were selected based on eligibility criteria of FOURIER, a phase 3 CV outcomes trial of evolocumab, and categorized into four cohorts: high-risk prevalent atherosclerotic CVD (ASCVD) cohort (n = 1448), acute coronary syndrome (ACS) (n = 602), ischemic stroke (IS) (n = 151), and heart failure (HF) (n = 291) incident cohorts. The value-based price range for evolocumab was assessed using a previously published economic model. The model incorporated CPRD CV event rates and considered CV event reduction rate ratios per 1 mmol/L reduction in low-density lipoprotein-cholesterol (LDL-C) from a meta-analysis of statin trials by the Cholesterol Treatment Trialists Collaboration (CTTC), i.e. CTTC relationship. Multiple-event rates of composite CV events (ACS, IS, or coronary revascularization) per 100 patient-years were 12.3 for the high-risk prevalent ASCVD cohort, and 25.7, 13.3, and 23.3, respectively, for incident ACS, IS, and HF cohorts. Approximately one-half (42%) of the high-risk ASCVD patients with a new CV event during follow-up had a subsequent CV event. Combining these real-world event rates and the CTTC relationship in the economic model, the value-based price range (credible interval) under a willingness-to-pay threshold of $150,000/quality-adjusted life-year gained for evolocumab was $11,990 ($9,341-$14,833) to $16,856 ($12,903-$20,678) in ASCVD patients with baseline LDL-C levels ≥70 mg/dL and ≥100 mg/dL, respectively. Real-world CVD burden is substantial. Using the observed CVD burden in CPRD and the CTTC relationship, the cost-effectiveness analysis showed

  1. Emerging risk – Conceptual definition and a relation to black swan type of events

    International Nuclear Information System (INIS)

    Flage, R.; Aven, T.

    2015-01-01

    The concept of emerging risk has gained increasing attention in recent years. The term has an intuitive appeal and meaning but a consistent and agreed definition is missing. We perform an in-depth analysis of this concept, in particular its relation to black swan type of events, and show that these can be considered meaningful and complementary concepts by relating emerging risk to known unknowns and black swans to unknown knowns, unknown unknowns and a subset of known knowns. The former is consistent with saying that we face emerging risk related to an activity when the background knowledge is weak but contains indications/justified beliefs that a new type of event (new in the context of that activity) could occur in the future and potentially have severe consequences to something humans value. The weak background knowledge among other things results in difficulty specifying consequences and possibly also in fully specifying the event itself; i.e. in difficulty specifying scenarios. Here knowledge becomes the key concept for both emerging risk and black swan type of events, allowing for taking into consideration time dynamics since knowledge develops over time. Some implications of our findings in terms of risk assessment and risk management are pointed out. - Highlights: • We perform an in-depth analysis of the concept of emerging risk. • Emerging risk and black swan type of events are shown to be complementary concepts. • We propose a definition of emerging risk where knowledge becomes the key term. • Some implications for risk assessment and risk management are pointed out.

  2. A Risk-Based Interval Two-Stage Programming Model for Agricultural System Management under Uncertainty

    Directory of Open Access Journals (Sweden)

    Ye Xu

    2016-01-01

    Full Text Available Nonpoint source (NPS pollution caused by agricultural activities is main reason that water quality in watershed becomes worse, even leading to deterioration. Moreover, pollution control is accompanied with revenue’s fall for agricultural system. How to design and generate a cost-effective and environmentally friendly agricultural production pattern is a critical issue for local managers. In this study, a risk-based interval two-stage programming model (RBITSP was developed. Compared to general ITSP model, significant contribution made by RBITSP model was that it emphasized importance of financial risk under various probabilistic levels, rather than only being concentrated on expected economic benefit, where risk is expressed as the probability of not meeting target profit under each individual scenario realization. This way effectively avoided solutions’ inaccuracy caused by traditional expected objective function and generated a variety of solutions through adjusting weight coefficients, which reflected trade-off between system economy and reliability. A case study of agricultural production management with the Tai Lake watershed was used to demonstrate superiority of proposed model. Obtained results could be a base for designing land-structure adjustment patterns and farmland retirement schemes and realizing balance of system benefit, system-failure risk, and water-body protection.

  3. Modelling domestic stock energy use and heat-related health risk : a GIS-based bottom-up modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Mavrogianni, A.; Davies, M. [Univ. College London, London (United Kingdom). Bartlett School of Graduate Studies; Chalabi, Z.; Wilkinson, P. [London School of Hygiene and Tropical Medecine, London (United Kingdom); Kolokotroni, M. [Brunel Univ., London (United Kingdom). School of Engineering Design

    2009-07-01

    Approximately 8 per cent of the carbon dioxide (CO{sub 2}) emissions produced in the United Kingdom are produced in London, one of the fastest growing cities worldwide. Based on the projected rates of population and economic growth, a 15 per cent increase of emissions is predicted. In addition to the national target to cut emissions by 80 per cent by 2050, the Mayor of London Climate Change Action Plan set a target to reduce London's CO{sub 2} emissions by 60 per cent by 2025. Significant carbon savings can be achieved in the building sector, particularly since 38 per cent of the total delivered energy in London is associated with domestic energy use. This paper demonstrated a systematic approach towards exploring the impact of urban built form and the combined effect of climate change and the urban heat island (UHI) phenomenon on the levels of domestic energy consumption and heat-related health risk in London. It presented work in progress on the development of a GIS-based energy consumption model and heat vulnerability index of the Greater London Area domestic stock. Comparison of the model output for 10 case study areas with topdown energy statistics revealed that the model successfully ranks areas based on their domestic space heating demand. The health module can be used to determine environments prone to higher risk of heat stress by investigating urban texture factors. A newly developed epidemiological model will be feed into the health module to examine the influence on risk of heat-related mortality of local urban built form characteristics. The epidemiological model is based on multi-variable analysis of deaths during heat wave and non-heat wave days. 29 refs., 1 tab., 7 figs.

  4. Risk assessment and model for community-based construction ...

    African Journals Online (AJOL)

    It, therefore, becomes necessary to systematically manage uncertainty in community-based construction in order to increase the likelihood of meeting project objectives using necessary risk management strategies. Risk management, which is an iterative process due to the dynamic nature of many risks, follows three main ...

  5. Plasma B-type natriuretic peptide as a predictor of cardiovascular events in subjects with atrial fibrillation: a community-based study.

    Directory of Open Access Journals (Sweden)

    Motoyuki Nakamura

    Full Text Available OBJECTIVES: Atrial fibrillation (AF is a significant public health issue due to its high prevalence in the general population, and is associated with an increased risk of cardiovascular (CV events including systemic thrombo-embolism, heart failure, and coronary artery disease. The relationship between plasma B-type natriuretic peptide (BNP and CV risk in real world AF subjects remains unknown. METHODS: The subject of the study (n = 228; mean age = 69 years was unselected individuals with AF in a community-based population (n = 15,394; AF prevalence rate = 1.5%. The CV event free rate within each BNP tertile was estimated, and Cox regression analysis was performed to examine the relative risk of the onset of CV events among the tertiles. The prognostic ability of BNP was compared to an established risk score for embolic events (CHADS2 score. In addition, to determine the usefulness of BNP as a predictor in addition to CHADS2 score, we calculated Net Reclassification Improvement (NRI and Integrated Discrimination Improvement (IDI indices. RESULTS: During the follow-up period 58 subjects experienced CV events (52 per 1,000 person-years. The event-free ratio was significantly lower in the highest tertile (p < 0.02. After adjustment for established CV risk factors, the hazard ratio (HR of the highest tertile was significantly higher than that of the lowest tertile (HR = 2.38; p < 0.02. The predictive abilities of plasma BNP in terms of sensitivity and specificity for general CV events were comparable to those of CHADS2 score. Adding BNP to the CHADS2 score only model improved the NRI (0.319; p < 0.05 and the IDI (0.046; p < 0.05. CONCLUSION: Plasma BNP is a valuable biomarker both singly or in combination with an established scoring system for assessing general CV risk including stroke, heart failure and acute coronary syndrome in real-world AF subjects.

  6. Model of MSD Risk Assessment at Workplace

    OpenAIRE

    K. Sekulová; M. Šimon

    2015-01-01

    This article focuses on upper-extremity musculoskeletal disorders risk assessment model at workplace. In this model are used risk factors that are responsible for musculoskeletal system damage. Based on statistic calculations the model is able to define what risk of MSD threatens workers who are under risk factors. The model is also able to say how MSD risk would decrease if these risk factors are eliminated.

  7. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Sezen, Halil [The Ohio State Univ., Columbus, OH (United States). Dept. of Civil, Environmental and Geodetic Engineering; Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States). College of Engineering, Nuclear Engineering Program, Dept. of Mechanical and Aerospace Engineering; Denning, R. [The Ohio State Univ., Columbus, OH (United States); Vaidya, N. [Rizzo Associates, Pittsburgh, PA (United States)

    2017-12-29

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  8. safety risk management based on fuzzy logic at underground projects

    Directory of Open Access Journals (Sweden)

    Farhad Taherkhani

    2017-11-01

    Conclusion: In the present article, a new model was developed to calculate the probability of occurrence of the event, which so far has not been addressed in other studies. Finally, effective measures can be taken to reduce the risk of a project by eliminating the high risk factors.

  9. Environmental risk assessment of selected organic chemicals based on TOC test and QSAR estimation models.

    Science.gov (United States)

    Chi, Yulang; Zhang, Huanteng; Huang, Qiansheng; Lin, Yi; Ye, Guozhu; Zhu, Huimin; Dong, Sijun

    2018-02-01

    Environmental risks of organic chemicals have been greatly determined by their persistence, bioaccumulation, and toxicity (PBT) and physicochemical properties. Major regulations in different countries and regions identify chemicals according to their bioconcentration factor (BCF) and octanol-water partition coefficient (Kow), which frequently displays a substantial correlation with the sediment sorption coefficient (Koc). Half-life or degradability is crucial for the persistence evaluation of chemicals. Quantitative structure activity relationship (QSAR) estimation models are indispensable for predicting environmental fate and health effects in the absence of field- or laboratory-based data. In this study, 39 chemicals of high concern were chosen for half-life testing based on total organic carbon (TOC) degradation, and two widely accepted and highly used QSAR estimation models (i.e., EPI Suite and PBT Profiler) were adopted for environmental risk evaluation. The experimental results and estimated data, as well as the two model-based results were compared, based on the water solubility, Kow, Koc, BCF and half-life. Environmental risk assessment of the selected compounds was achieved by combining experimental data and estimation models. It was concluded that both EPI Suite and PBT Profiler were fairly accurate in measuring the physicochemical properties and degradation half-lives for water, soil, and sediment. However, the half-lives between the experimental and the estimated results were still not absolutely consistent. This suggests deficiencies of the prediction models in some ways, and the necessity to combine the experimental data and predicted results for the evaluation of environmental fate and risks of pollutants. Copyright © 2016. Published by Elsevier B.V.

  10. Methods and Models of Market Risk Stress-Testing of the Portfolio of Financial Instruments

    Directory of Open Access Journals (Sweden)

    Alexander M. Karminsky

    2015-01-01

    Full Text Available Amid instability of financial markets and macroeconomic situation the necessity of improving bank risk-management instrument arises. New economic reality defines the need for searching for more advanced approaches of estimating banks vulnerability to exceptional, but plausible events. Stress-testing belongs to such instruments. The paper reviews and compares the models of market risk stress-testing of the portfolio of different financial instruments. These days the topic of the paper is highly acute due to the fact that now stress-testing is becoming an integral part of anticrisis risk-management amid macroeconomic instability and appearance of new risks together with close interest to the problem of risk-aggregation. The paper outlines the notion of stress-testing and gives coverage of goals, functions of stress-tests and main criteria for market risk stress-testing classification. The paper also stresses special aspects of scenario analysis. Novelty of the research is explained by elaborating the programme of aggregated complex multifactor stress-testing of the portfolio risk based on scenario analysis. The paper highlights modern Russian and foreign models of stress-testing both on solo-basis and complex. The paper lays emphasis on the results of stress-testing and revaluations of positions for all three complex models: methodology of the Central Bank of stress-testing portfolio risk, model relying on correlations analysis and copula model. The models of stress-testing on solo-basis are different for each financial instrument. Parametric StressVaR model is applicable to shares and options stress-testing;model based on "Grek" indicators is used for options; for euroobligation regional factor model is used. Finally some theoretical recommendations about managing market risk of the portfolio are given.

  11. Calcium Channel Blockers in Secondary Cardiovascular Prevention and Risk of Acute Events: Real-World Evidence from Nested Case-Control Studies on Italian Hypertensive Elderly.

    Science.gov (United States)

    Bettiol, Alessandra; Lucenteforte, Ersilia; Vannacci, Alfredo; Lombardi, Niccolò; Onder, Graziano; Agabiti, Nera; Vitale, Cristiana; Trifirò, Gianluca; Corrao, Giovanni; Roberto, Giuseppe; Mugelli, Alessandro; Chinellato, Alessandro

    2017-12-01

    Antihypertensive treatment with calcium channel blockers (CCBs) is consolidated in clinical practice; however, different studies observed increased risks of acute events for short-acting CCBs. This study aimed to provide real-world evidence on risks of acute cardiovascular (CV) events, hospitalizations and mortality among users of different CCB classes in secondary CV prevention. Three case-control studies were nested in a cohort of Italian elderly hypertensive CV-compromised CCBs users. Cases were subjects with CV events (n = 25,204), all-cause hospitalizations (n = 19,237), or all-cause mortality (n = 17,996) during the follow-up. Up to four controls were matched for each case. Current or past exposition to CCBs at index date was defined based on molecule, formulation and daily doses of the last CCB delivery. The odds ratio (OR) and 95% confidence intervals (CI) were estimated using conditional logistic regression models. Compared to past users, current CCB users had significant reductions in risks of CV events [OR 0.88 (95% CI: 0.84-0.91)], hospitalization [0.90 (0.88-0.93)] and mortality [0.48 (0.47-0.49)]. Current users of long-acting dihydropyridines (DHPs) had the lowest risk [OR 0.87 (0.84-0.90), 0.86 (0.83-0.90), 0.55 (0.54-0.56) for acute CV events, hospitalizations and mortality], whereas current users of short-acting CCBs had an increased risk of acute CV events [OR 1.77 (1.13-2.78) for short-acting DHPs; 1.19 (1.07-1.31) for short-acting non-DHPs] and hospitalizations [OR 1.84 (0.96-3.51) and 1.23 (1.08-1.42)]. The already-existing warning on short-acting CCBs should be potentiated, addressing clinicians towards the choice of long-acting formulations.

  12. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. Existence of nonsusceptible individuals is particularly relevant under challenge testing with specific...... pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study...... survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection...

  13. Fracture risk assessment for the pressurized water reactor pressure vessel under pressurized thermal shock events

    International Nuclear Information System (INIS)

    Chou, Hsoung-Wei; Huang, Chin-Cheng

    2016-01-01

    Highlight: • The PTS loading conditions consistent with the USNRC's new PTS rule are applied as the loading condition for a Taiwan domestic PWR. • The state-of-the-art PFM technique is employed to analyze a reactor pressure vessel. • Novel flaw model and embrittlement correlation are considered in the study. • The RT-based regression formula of NUREG-1874 was also utilized to evaluate the failure risks of RPV. • For slightly embrittled RPV, the SO-1 type PTSs play more important role than other types of PTS. - Abstract: The fracture risk of the pressurized water reactor pressure vessel of a Taiwan domestic nuclear power plant has been evaluated according to the technical basis of the U.S.NRC's new pressurized thermal shock (PTS) screening criteria. The ORNL's FAVOR code and the PNNL's flaw models were employed to perform the probabilistic fracture mechanics analysis associated with plant specific parameters of the domestic reactor pressure vessel. Meanwhile, the PTS thermal hydraulic and probabilistic risk assessment data analyzed from a similar nuclear power plant in the United States for establishing the new PTS rule were applied as the loading conditions. Besides, an RT-based regression formula derived by the U.S.NRC was also utilized to verify the through-wall cracking frequencies. It is found that the through-wall cracking of the analyzed reactor pressure vessel only occurs during the PTS events resulted from the stuck-open primary safety relief valves that later reclose, but with only an insignificant failure risk. The results indicate that the Taiwan domestic PWR pressure vessel has sufficient structural margin for the PTS attack until either the current license expiration dates or during the proposed extended operation periods.

  14. Use of Hypoprothrombinemia-Inducing Cephalosporins and the Risk of Hemorrhagic Events: A Nationwide Nested Case-Control Study

    Science.gov (United States)

    Shen, Li-Jiuan; Wu, Fe-Lin Lin; Tsay, Woei; Hung, Chien-Ching; Lin, Shu-Wen

    2016-01-01

    Objective Existing data regarding the risk of hemorrhagic events associated with exposure to hypoprothrombinemia-inducing cephalosporins are limited by the small sample size. This population-based study aimed to examine the association between exposure to hypoprothrombinemia-inducing cephalosporins and hemorrhagic events using National Health Insurance Research Database in Taiwan. Design A nationwide nested case-control study. Setting National Health Insurance Research database. Participants We conducted a nested case-control study within a cohort of 6191 patients who received hypoprothrombinemia-inducing cephalosporins and other antibiotics for more than 48 hours. Multivariable conditional logistic regressions were used to calculate the adjusted odds ratio (aOR) and 95% confidence interval (CI) for hemorrhagic events associated with exposure to hypoprothrombinemia-inducing cephalosporins (overall, cumulative dose measured as defined daily dose (DDD), and individual cephalosporins). Results Within the cohort, we identified 704 patients with hemorrhagic events and 2816 matched controls. Use of hypoprothrombinemia-inducing cephalosporins was associated with increased risk of hemorrhagic events (aOR, 1.71; 95% CI, 1.42–2.06), which increased with higher cumulative doses (5 DDDs, aOR 1.89). The aOR for individual cephalosporin was 2.88 (95% CI, 2.08–4.00), 1.35 (1.09–1.67) and 4.57 (2.63–7.95) for cefmetazole, flomoxef, and cefoperazone, respectively. Other risk factors included use of anticoagulants (aOR 2.08 [95% CI, 1.64–2.63]), liver failure (aOR 1.69 [1.30–2.18]), poor nutritional status (aOR 1.41 [1.15–1.73]), and history of hemorrhagic events (aOR 2.57 [1.94–3.41]) 6 months prior to the index date. Conclusions Use of hypoprothrombinemia-inducing cephalosporins increases risk of hemorrhagic events. Close watch for hemorrhagic events is recommended when prescribing these cephalosporins, especially in patients who are at higher risk. PMID:27463687

  15. Recognition of risk situations based on endoscopic instrument tracking and knowledge based situation modeling

    Science.gov (United States)

    Speidel, Stefanie; Sudra, Gunther; Senemaud, Julien; Drentschew, Maximilian; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger

    2008-03-01

    Minimally invasive surgery has gained significantly in importance over the last decade due to the numerous advantages on patient-side. The surgeon has to adapt special operation-techniques and deal with difficulties like the complex hand-eye coordination, limited field of view and restricted mobility. To alleviate these constraints we propose to enhance the surgeon's capabilities by providing a context-aware assistance using augmented reality (AR) techniques. In order to generate a context-aware assistance it is necessary to recognize the current state of the intervention using intraoperatively gained sensor data and a model of the surgical intervention. In this paper we present the recognition of risk situations, the system warns the surgeon if an instrument gets too close to a risk structure. The context-aware assistance system starts with an image-based analysis to retrieve information from the endoscopic images. This information is classified and a semantic description is generated. The description is used to recognize the current state and launch an appropriate AR visualization. In detail we present an automatic vision-based instrument tracking to obtain the positions of the instruments. Situation recognition is performed using a knowledge representation based on a description logic system. Two augmented reality visualization programs are realized to warn the surgeon if a risk situation occurs.

  16. Change of flood risk under climate change based on Discharge Probability Index in Japan

    Science.gov (United States)

    Nitta, T.; Yoshimura, K.; Kanae, S.; Oki, T.

    2010-12-01

    Water-related disasters under the climate change have recently gained considerable interest, and there have been many studies referring to flood risk at the global scale (e.g. Milly et al., 2002; Hirabayashi et al., 2008). In order to build adaptive capacity, however, regional impact evaluation is needed. We thus focus on the flood risk over Japan in the present study. The output from the Regional Climate Model 20 (RCM20), which was developed by the Meteorological Research Institute, was used. The data was first compared with observed data based on Automated Meteorological Data Acquisition System and ground weather observations, and the model biases were corrected using the ratio and difference of the 20-year mean values. The bias-corrected RCM20 atmospheric data were then forced to run a land surface model and a river routing model (Yoshimura et al., 2007; Ngo-Duc, T. et al. 2007) to simulate river discharge during 1981-2000, 2031-2050, and 2081-2100. Simulated river discharge was converted to Discharge Probability Index (DPI), which was proposed by Yoshimura et al based on a statistical approach. The bias and uncertainty of the models are already taken into account in the concept of DPI, so that DPI serves as a good indicator of flood risk. We estimated the statistical parameters for DPI using the river discharge for 1981-2000 with an assumption that the parameters stay the same in the different climate periods. We then evaluated the occurrence of flood events corresponding to DPI categories in each 20 years and averaged them in 9 regions. The results indicate that low DPI flood events (return period of 2 years) will become more frequent in 2031-2050 and high DPI flood events (return period of 200 years) will become more frequent in 2081-2100 compared with the period of 1981-2000, though average precipitation will become larger during 2031-2050 than during 2081-2100 in most regions. It reflects the increased extreme precipitation during 2081-2100.

  17. Statistical Prediction of Solar Particle Event Frequency Based on the Measurements of Recent Solar Cycles for Acute Radiation Risk Analysis

    Science.gov (United States)

    Myung-Hee, Y. Kim; Shaowen, Hu; Cucinotta, Francis A.

    2009-01-01

    Large solar particle events (SPEs) present significant acute radiation risks to the crew members during extra-vehicular activities (EVAs) or in lightly shielded space vehicles for space missions beyond the protection of the Earth's magnetic field. Acute radiation sickness (ARS) can impair performance and result in failure of the mission. Improved forecasting capability and/or early-warning systems and proper shielding solutions are required to stay within NASA's short-term dose limits. Exactly how to make use of observations of SPEs for predicting occurrence and size is a great challenge, because SPE occurrences themselves are random in nature even though the expected frequency of SPEs is strongly influenced by the time position within the solar activity cycle. Therefore, we developed a probabilistic model approach, where a cumulative expected occurrence curve of SPEs for a typical solar cycle was formed from a non-homogeneous Poisson process model fitted to a database of proton fluence measurements of SPEs that occurred during the past 5 solar cycles (19 - 23) and those of large SPEs identified from impulsive nitrate enhancements in polar ice. From the fitted model, the expected frequency of SPEs was estimated at any given proton fluence threshold (Phi(sub E)) with energy (E) >30 MeV during a defined space mission period. Corresponding Phi(sub E) (E=30, 60, and 100 MeV) fluence distributions were simulated with a random draw from a gamma distribution, and applied for SPE ARS risk analysis for a specific mission period. It has been found that the accurate prediction of deep-seated organ doses was more precisely predicted at high energies, Phi(sub 100), than at lower energies such as Phi(sub 30) or Phi(sub 60), because of the high penetration depth of high energy protons. Estimates of ARS are then described for 90th and 95th percentile events for several mission lengths and for several likely organ dose-rates. The ability to accurately measure high energy protons

  18. A Simple Model to Rank Shellfish Farming Areas Based on the Risk of Disease Introduction and Spread.

    Science.gov (United States)

    Thrush, M A; Pearce, F M; Gubbins, M J; Oidtmann, B C; Peeler, E J

    2017-08-01

    The European Union Council Directive 2006/88/EC requires that risk-based surveillance (RBS) for listed aquatic animal diseases is applied to all aquaculture production businesses. The principle behind this is the efficient use of resources directed towards high-risk farm categories, animal types and geographic areas. To achieve this requirement, fish and shellfish farms must be ranked according to their risk of disease introduction and spread. We present a method to risk rank shellfish farming areas based on the risk of disease introduction and spread and demonstrate how the approach was applied in 45 shellfish farming areas in England and Wales. Ten parameters were used to inform the risk model, which were grouped into four risk themes based on related pathways for transmission of pathogens: (i) live animal movement, (ii) transmission via water, (iii) short distance mechanical spread (birds) and (iv) long distance mechanical spread (vessels). Weights (informed by expert knowledge) were applied both to individual parameters and to risk themes for introduction and spread to reflect their relative importance. A spreadsheet model was developed to determine quantitative scores for the risk of pathogen introduction and risk of pathogen spread for each shellfish farming area. These scores were used to independently rank areas for risk of introduction and for risk of spread. Thresholds were set to establish risk categories (low, medium and high) for introduction and spread based on risk scores. Risk categories for introduction and spread for each area were combined to provide overall risk categories to inform a risk-based surveillance programme directed at the area level. Applying the combined risk category designation framework for risk of introduction and spread suggested by European Commission guidance for risk-based surveillance, 4, 10 and 31 areas were classified as high, medium and low risk, respectively. © 2016 Crown copyright.

  19. Crisis and emergency risk communication as an integrative model.

    Science.gov (United States)

    Reynolds, Barbara; W Seeger, Matthew

    2005-01-01

    This article describes a model of communication known as crisis and emergency risk communication (CERC). The model is outlined as a merger of many traditional notions of health and risk communication with work in crisis and disaster communication. The specific kinds of communication activities that should be called for at various stages of disaster or crisis development are outlined. Although crises are by definition uncertain, equivocal, and often chaotic situations, the CERC model is presented as a tool health communicators can use to help manage these complex events.

  20. IMPROVING THE EFFICIENCY OF THE FINANCIAL CONTROL SYSTEM IN TERMS OF THE RISK-ORIENTED MODEL

    Directory of Open Access Journals (Sweden)

    M. N. Ponkratova

    2015-01-01

    Full Text Available The imperfection of the legislative and methodological basis of fi nancial control in the Russian Federation shall determine the list of issues on illicit and misuse of budget funds, including with the use of schemes for the withdrawal of capital abroad through fi nancial instruments (cash, Bank transfers and deposits, securities and bills of exchange operations which are common practice in our country. In this regard, of particular importance is the use of a risk-oriented model of financial control, adapted to the conditions of the Russian Federation.The article discusses the risk-based model as part of a system of financial control. The concept model aimed at check point, implying the identification of reference points in the organization's and customers' risk event.

  1. Effect of intravitreal anti-vascular endothelial growth factor therapy on the risk of arterial thromboembolic events: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Jin-Wei Cheng

    Full Text Available Intravitreal anti-vascular endothelial growth factor (VEGF monoclonal antibodies are used in ocular neovascular diseases. A consensus has emerged that intravenous anti-VEGF can increase the risk of arterial thromboembolic events. However, the role of intravitreal anti-VEGF in arterial thromboembolism is controversial. Therefore, we did a systematic review and meta-analysis to investigate the effects of intravitreal anti-VEGF on the risk of arterial thromboembolic events.Electronic databases were searched to identify relevant randomized clinical trials comparing intravitreal anti-VEGF with controls. Criteria for inclusion in our meta-analysis included a study duration of no less than 12 months, the use of a randomized control group not receiving any intravitreal active agent, and the availability of outcome data for arterial thromboembolic events, myocardial infarction, cerebrovascular accidents, and vascular death. The risk ratios and 95% CIs were calculated using a fixed-effects or random-effects model, depending on the heterogeneity of the included studies.A total of 4942 patients with a variety of ocular neovascular diseases from 13 randomized controlled trials were identified and included for analysis. There was no significant difference between intravitreal anti-VEGF and control in the risk of all events, with risk ratios of 0.87 (95% CI, 0.64 to 1.19 for arterial thromboembolic events, 0.96 (95% CI, 0.55-1.68 for cerebrovascular accidents, 0.69 (95% CI 0.40-1.21 for myocardial infarctions, and 0.68 (95% CI, 0.37-1.27 for vascular death.The strength evidence suggests that the intravitreal use of anti-VEGF antibodies is not associated with an increased risk of arterial thromboembolic events.

  2. Effect of risk-based payment model on caries inequalities in preschool children assessed by geo-mapping.

    Science.gov (United States)

    Holmén, Anders; Strömberg, Ulf; Håkansson, Gunnel; Twetman, Svante

    2018-01-05

    To describe, with aid of geo-mapping, the effects of a risk-based capitation model linked to caries-preventive guidelines on the polarization of caries in preschool children living in the Halland region of Sweden. The new capitation model was implemented in 2013 in which more money was allocated to Public Dental Clinics surrounded by administrative parishes inhabited by children with increased caries risk, while a reduced capitation was allocated to those clinics with a low burden of high risk children. Regional geo-maps of caries risk based on caries prevalence, level of education and the families purchasing power were produced for 3-6-year-old children in 2010 (n = 10,583) and 2016 (n = 7574). Newly migrated children to the region (n = 344 in 2010 and n = 522 in 2016) were analyzed separately. A regional caries polarization index was calculated as the ratio between the maximum and minimum estimates of caries frequency on parish-level, based on a Bayesian hierarchical mapping model. Overall, the total caries prevalence (dmfs > 0) remained unchanged from 2010 (10.6%) to 2016 (10.5%). However, the polarization index decreased from 7.0 in 2010 to 5.6 in 2016. Newly arrived children born outside Sweden had around four times higher caries prevalence than their Swedish-born peers. A risk-based capitation model could reduce the socio-economic inequalities in dental caries among preschool children living in Sweden. Although updated evidence-based caries-preventive guidelines were released, the total prevalence of caries on dentin surface level was unaffected 4 years after the implementation.

  3. Discrete event simulation model of sudden cardiac death predicts high impact of preventive interventions.

    Science.gov (United States)

    Andreev, Victor P; Head, Trajen; Johnson, Neil; Deo, Sapna K; Daunert, Sylvia; Goldschmidt-Clermont, Pascal J

    2013-01-01

    Sudden Cardiac Death (SCD) is responsible for at least 180,000 deaths a year and incurs an average cost of $286 billion annually in the United States alone. Herein, we present a novel discrete event simulation model of SCD, which quantifies the chains of events associated with the formation, growth, and rupture of atheroma plaques, and the subsequent formation of clots, thrombosis and on-set of arrhythmias within a population. The predictions generated by the model are in good agreement both with results obtained from pathological examinations on the frequencies of three major types of atheroma, and with epidemiological data on the prevalence and risk of SCD. These model predictions allow for identification of interventions and importantly for the optimal time of intervention leading to high potential impact on SCD risk reduction (up to 8-fold reduction in the number of SCDs in the population) as well as the increase in life expectancy.

  4. Event-based aquifer-to-atmosphere modeling over the European CORDEX domain

    Science.gov (United States)

    Keune, J.; Goergen, K.; Sulis, M.; Shrestha, P.; Springer, A.; Kusche, J.; Ohlwein, C.; Kollet, S. J.

    2014-12-01

    Despite the fact that recent studies focus on the impact of soil moisture on climate and especially land-energy feedbacks, groundwater dynamics are often neglected or conceptual groundwater flow models are used. In particular, in the context of climate change and the occurrence of droughts and floods, a better understanding and an improved simulation of the physical processes involving groundwater on continental scales is necessary. This requires the implementation of a physically consistent terrestrial modeling system, which explicitly incorporates groundwater dynamics and the connection with shallow soil moisture. Such a physics-based system enables simulations and monitoring of groundwater storage and enhanced representations of the terrestrial energy and hydrologic cycles over long time periods. On shorter timescales, the prediction of groundwater-related extremes, such as floods and droughts, are expected to improve, because of the improved simulation of components of the hydrological cycle. In this study, we present a fully coupled aquifer-to-atmosphere modeling system over the European CORDEX domain. The integrated Terrestrial Systems Modeling Platform, TerrSysMP, consisting of the three-dimensional subsurface model ParFlow, the Community Land Model CLM3.5 and the numerical weather prediction model COSMO of the German Weather Service, is used. The system is set up with a spatial resolution of 0.11° (12.5km) and closes the terrestrial water and energy cycles from aquifers into the atmosphere. Here, simulations of the fully coupled system are performed over events, such as the 2013 flood in Central Europe and the 2003 European heat wave, and over extended time periods on the order of 10 years. State and flux variables of the terrestrial hydrologic and energy cycle are analyzed and compared to both in situ (e.g. stream and water level gauge networks, FLUXNET) and remotely sensed observations (e.g. GRACE, ESA ICC ECV soil moisture and SMOS). Additionally, the

  5. Developing a phenological model for grapevine to assess future frost risk in Luxembourg

    Science.gov (United States)

    Caffarra, A.; Molitor, D.; Pertot, I.; Sinigoy, P.; Junk, J.

    2012-04-01

    Late frost damage represents a significant hazard to grape production in cool climate viticulture regions such as Luxembourg. The main aim of our study is to analyze the frequency of these events for the Luxembourg's winegrowing region in the future. Spring frost injuries on grape may occur when young green parts are exposed to air temperature below 0°C. The potential risk is determined by: (i) minimum air temperature conditions and the (ii) the timing of bud burst. Therefore, we developed and validated a model for budburst of the grapevine (*Vitis vinifera)* cultivar Rivaner, the most grown local variety, based on multi-annual data from 7 different sites across Europe and the US. An advantage of this approach is, that it could be applied to a wide range of climate conditions. Higher spring temperatures were projected for the future and could lead to earlier dates of budburst as well as earlier dates of last frost events in the season. However, so far it is unknown if this will increase or decrease the risk of severe late frost damages for Luxembourg's winegrowing region. To address this question results of 10 regional climate change projections from the FP6 ENSEMBLES project (spatial resolution = 25km; A1B emission scenario) were combined with the new bud burst model. The use of a multi model ensemble of climate change projections allows for a better quantification of the uncertainties. A bias corrections scheme, based on local observations, was applied to the model output. Projected daily minimum air temperatures, up to 2098, were compared to the projected date of bud burst in order to quantify the future frost risk for Luxembourg.

  6. External Events PSA for the Paks NPP

    International Nuclear Information System (INIS)

    Bareith, Attila; Karsa, Zoltan; Siklossy, Tamas; Vida, Zoltan

    2014-01-01

    quantification and interpretation of results. The risk of core damage induced by natural external hazards was quantified to the extent seen feasible. In addition to risk quantification, unresolved issues and necessary follow-on analyses were identified and proposed. At present an action plan is being developed for these analyses. Core damage risk has been assessed quantitatively for wind, snow and frost hazards. Detailed importance, sensitivity and uncertainty analyses were conducted. Moreover the main risk contributors induced by these external events were also identified. Additional follow-on analyses were proposed to enable an improved risk quantification by means of reducing uncertainties, establishing a better technical basis for the applied analytical assumptions, or decreasing unnecessarily high conservatism. Based on the findings of hazard assessment and plant response analysis, the core damage risk induced by extreme rainfall and lightning was found to be insignificant. However, some follow-on analyses were proposed and safety enhancement measures were conceptualised to fully underpin this conclusion. Due to lack of appropriate data and supporting analysis on the capacity of plant systems and components no PSA model has been developed yet for extreme temperatures. Follow-on analyses necessary for quantifying the risk of core damage induced by extreme temperatures have been identified. (authors)

  7. Individual-based model for radiation risk assessment

    Science.gov (United States)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, the respective mathematical model is used too. This approach is applied to describe the effects of low level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on mortality dynamics of those in the absence of radiation are utilized. The life span probability and life span shortening predicted by the model agree with corresponding experimental data. Modeling results show the significance of ac- counting the variability of the individual radiosensitivity of critical system cells when estimating the radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the Chernobyl catastrophe after- effects. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  8. Wildfire risk for main vegetation units in a biodiversity hotspot: modeling approach in New Caledonia, South Pacific.

    Science.gov (United States)

    Gomez, Céline; Mangeas, Morgan; Curt, Thomas; Ibanez, Thomas; Munzinger, Jérôme; Dumas, Pascal; Jérémy, André; Despinoy, Marc; Hély, Christelle

    2015-01-01

    Wildfire has been recognized as one of the most ubiquitous disturbance agents to impact on natural environments. In this study, our main objective was to propose a modeling approach to investigate the potential impact of wildfire on biodiversity. The method is illustrated with an application example in New Caledonia where conservation and sustainable biodiversity management represent an important challenge. Firstly, a biodiversity loss index, including the diversity and the vulnerability indexes, was calculated for every vegetation unit in New Caledonia and mapped according to its distribution over the New Caledonian mainland. Then, based on spatially explicit fire behavior simulations (using the FLAMMAP software) and fire ignition probabilities, two original fire risk assessment approaches were proposed: a one-off event model and a multi-event burn probability model. The spatial distribution of fire risk across New Caledonia was similar for both indices with very small localized spots having high risk. The patterns relating to highest risk are all located around the remaining sclerophyll forest fragments and are representing 0.012% of the mainland surface. A small part of maquis and areas adjacent to dense humid forest on ultramafic substrates should also be monitored. Vegetation interfaces between secondary and primary units displayed high risk and should represent priority zones for fire effects mitigation. Low fire ignition probability in anthropogenic-free areas decreases drastically the risk. A one-off event associated risk allowed localizing of the most likely ignition areas with potential for extensive damage. Emergency actions could aim limiting specific fire spread known to have high impact or consist of on targeting high risk areas to limit one-off fire ignitions. Spatially explicit information on burning probability is necessary for setting strategic fire and fuel management planning. Both risk indices provide clues to preserve New Caledonia hot spot of

  9. Transcription-based model for the induction of chromosomal exchange events by ionising radiation

    International Nuclear Information System (INIS)

    Radford, I.A.

    2003-01-01

    The mechanistic basis for chromosomal aberration formation, following exposure of mammalian cells to ionising radiation, has long been debated. Although chromosomal aberrations are probably initiated by DNA double-strand breaks (DSB), little is understood about the mechanisms that generate and modulate DNA rearrangement. Based on results from our laboratory and data from the literature, a novel model of chromosomal aberration formation has been suggested (Radford 2002). The basic postulates of this model are that: (1) DSB, primarily those involving multiple individual damage sites (i.e. complex DSB), are the critical initiating lesion; (2) only those DSB occurring in transcription units that are associated with transcription 'factories' (complexes containing multiple transcription units) induce chromosomal exchange events; (3) such DSB are brought into contact with a DNA topoisomerase I molecule through RNA polymerase II catalysed transcription and give rise to trapped DNA-topo I cleavage complexes; and (4) trapped complexes interact with another topo I molecule on a temporarily inactive transcription unit at the same transcription factory leading to DNA cleavage and subsequent strand exchange between the cleavage complexes. We have developed a method using inverse PCR that allows the detection and sequencing of putative ionising radiation-induced DNA rearrangements involving different regions of the human genome (Forrester and Radford 1998). The sequences detected by inverse PCR can provide a test of the prediction of the transcription-based model that ionising radiation-induced DNA rearrangements occur between sequences in active transcription units. Accordingly, reverse transcriptase PCR was used to determine if sequences involved in rearrangements were transcribed in the test cells. Consistent with the transcription-based model, nearly all of the sequences examined gave a positive result to reverse transcriptase PCR (Forrester and Radford unpublished)

  10. Is albuminuria a myocardial infarction risk equivalent for atherothrombotic events?

    Science.gov (United States)

    Rein, Philipp; Saely, Christoph H; Vonbank, Alexander; Fraunberger, Peter; Drexel, Heinz

    2015-05-01

    People with chronic kidney disease frequently experience cardiovascular events. This study sought to investigate whether the presence of albuminuria displays a vascular risk equivalent to that in patients with prior myocardial infarction. Albuminuria was defined as a urinary albumin to creatinine ratio of 30 μg/mg or greater in 852 consecutive patients undergoing coronary angiography. Prospectively, we recorded vascular events over 3.2±1.2 years. From our patients, 513 (60.2%) had neither albuminuria nor a history of MI, 126 (14.8%) had albuminuria without prior MI, 137 (16.1%) did not have albuminuria but had a history of MI, and 76 (8.9%) had both, albuminuria and prior MI. Compared with the incidence of the composite endpoint among normoalbuminuric patients with no prior MI (11.9%), event rates nearly doubled both in patients with albuminuria without prior MI (24.6%; p=0.003) and in normoalbuminuric patients with a history of prior MI (21.2%; p=0.004) and were highest in patients with both, albuminuria and prior MI (36.8%; p<0.001). Importantly, event rates were not significantly different between patients with albuminuria and no prior history of MI and those with normoalbuminuria but prior MI (p=0.972). Moreover, the event rate in patients with both, albuminuria and history of MI, was significantly higher (p<0.05) than in the two groups exhibiting only one of the two conditions. This is the first study demonstrating that albuminuria is a CAD risk equivalent. Thus, cardiovascular risk factors in albuminuric patients should be treated as aggressively as in patients with prior MI. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Traditional Cardiovascular Risk Factors as Predictors of Cardiovascular Events in the U.S. Astronaut Corps

    Science.gov (United States)

    Halm, M. K.; Clark, A.; Wear, M. L.; Murray, J. D.; Polk, J. D.; Amirian, E.

    2009-01-01

    Risk prediction equations from the Framingham Heart Study are commonly used to predict the absolute risk of myocardial infarction (MI) and coronary heart disease (CHD) related death. Predicting CHD-related events in the U.S. astronaut corps presents a monumental challenge, both because astronauts tend to live healthier lifestyles and because of the unique cardiovascular stressors associated with being trained for and participating in space flight. Traditional risk factors may not hold enough predictive power to provide a useful indicator of CHD risk in this unique population. It is important to be able to identify individuals who are at higher risk for CHD-related events so that appropriate preventive care can be provided. This is of special importance when planning long duration missions since the ability to provide advanced cardiac care and perform medical evacuation is limited. The medical regimen of the astronauts follows a strict set of clinical practice guidelines in an effort to ensure the best care. The purpose of this study was to evaluate the utility of the Framingham risk score (FRS), low-density lipoprotein (LDL) and high-density lipoprotein levels, blood pressure, and resting pulse as predictors of CHD-related death and MI in the astronaut corps, using Cox regression. Of these factors, only two, LDL and pulse at selection, were predictive of CHD events (HR(95% CI)=1.12 (1.00-1.25) and HR(95% CI)=1.70 (1.05-2.75) for every 5-unit increase in LDL and pulse, respectively). Since traditional CHD risk factors may lack the specificity to predict such outcomes in astronauts, the development of a new predictive model, using additional measures such as electron-beam computed tomography and carotid intima-media thickness ultrasound, is planned for the future.

  12. Probabilistic risk assessment using event tables and the BNL [Brookhaven National Laboratory] event-tree analyzer

    International Nuclear Information System (INIS)

    Fullwood, R.R.; Shier, W.G.

    1989-01-01

    Probabilistic risk analysis (PRA) is being used to study design alternatives for the Advanced Neutron Source research reactor being designed at Oak Ridge National Laboratory for operation in the 1990s. Major communication paths between the designers and the safety analysts are accident discussions supported by event tables, event-tree graphics, and accident sequence probabilities. The BETA code, used in conjunction with a word processor, provides this linkage. This paper describes the process, the features of BETA, how it works, and some examples of its usage.

  13. A comprehensive Network Security Risk Model for process control networks.

    Science.gov (United States)

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example.

  14. Korean risk assessment model for breast cancer risk prediction.

    Science.gov (United States)

    Park, Boyoung; Ma, Seung Hyun; Shin, Aesun; Chang, Myung-Chul; Choi, Ji-Yeob; Kim, Sungwan; Han, Wonshik; Noh, Dong-Young; Ahn, Sei-Hyun; Kang, Daehee; Yoo, Keun-Young; Park, Sue K

    2013-01-01

    We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT) based upon equations developed for the Gail model for predicting breast cancer risk. Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risks produced by the modified Gail model, which applied Korean incidence and mortality data and the parameter estimators from the original Gail model, with those produced by the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and National Cancer Center (NCC) cohort. The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risk for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risks for the cases than for the controls. These findings suggest that the KoBCRAT is a suitable risk assessment tool for Korean women, especially urban women.
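
    The following sketch shows, under illustrative numbers only, the standard Gail-type projection that converts age-specific baseline incidence hazards, competing mortality hazards, and an individual relative risk into a 5-year absolute risk. It is a generic reconstruction of this class of model, not the KoBCRAT equations or Korean rates.

```python
# Sketch of a Gail-type 5-year absolute risk projection from age-specific
# baseline incidence hazards h1, competing mortality hazards h2, and an
# individual relative risk rr. All numbers are invented for illustration.
import numpy as np

h1 = np.array([0.0010, 0.0011, 0.0012, 0.0013, 0.0014])  # incidence, ages a..a+4
h2 = np.array([0.0030, 0.0032, 0.0034, 0.0036, 0.0038])  # competing mortality
rr = 1.8                                                 # from risk-factor model

surv = 1.0   # probability of being alive and cancer-free at the interval start
risk = 0.0
for inc, mort in zip(h1, h2):
    hazard = inc * rr
    # probability of developing cancer this year, given survival to its start
    risk += surv * (hazard / (hazard + mort)) * (1 - np.exp(-(hazard + mort)))
    surv *= np.exp(-(hazard + mort))
print(f"5-year absolute risk: {risk:.4f}")
```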

  15. Brain Arterial Diameters as a Risk Factor for Vascular Events.

    Science.gov (United States)

    Gutierrez, Jose; Cheung, Ken; Bagci, Ahmet; Rundek, Tatjana; Alperin, Noam; Sacco, Ralph L; Wright, Clinton B; Elkind, Mitchell S V

    2015-08-06

    Arterial luminal diameters are routinely used to assess for vascular disease. Although small diameters are typically considered pathological, arterial dilatation has also been associated with disease. We hypothesize that extreme arterial diameters are biomarkers of the risk of vascular events. Participants in the Northern Manhattan Study who underwent time-of-flight magnetic resonance angiography were included in this analysis (N=1034). A global arterial Z-score, called the brain arterial remodeling (BAR) score, was obtained by averaging the measured diameters within each individual. Individuals with a BAR score below -2 SDs had the smallest diameters, and those with a BAR score above 2 SDs had the largest diameters. All vascular events were recorded prospectively after the brain magnetic resonance imaging. Spline curves and incidence rates were used to test our hypothesis. The association of the BAR score with death (P=0.001), vascular death (P=0.02), any vascular event (P=0.05), and myocardial infarction (P=0.10) was U-shaped except for ischemic stroke (P=0.74). Consequently, incidence rates for death, vascular death, myocardial infarction, and any vascular event were higher in individuals with the largest diameters, whereas individuals with the smallest diameters had a higher incidence of death, vascular death, any vascular event, and ischemic stroke compared with individuals with average diameters. The risk of death, vascular death, and any vascular event increased at both extremes of brain arterial diameters. The pathophysiology linking brain arterial remodeling to systemic vascular events needs further research. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
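
    A minimal sketch of how a global arterial Z-score of this kind can be computed: per-artery diameters are standardized across subjects and then averaged within each subject. The diameters are invented, and the paper's exact normalization may differ.

```python
# Sketch of a global brain arterial remodeling (BAR) score: standardize each
# artery's diameter across subjects, then average within each subject.
import numpy as np

# rows = subjects, columns = arteries (hypothetical diameters in mm)
diam = np.array([[3.1, 2.2, 1.9, 2.8],
                 [4.2, 3.0, 2.6, 3.9],
                 [2.4, 1.8, 1.5, 2.2]])
z = (diam - diam.mean(axis=0)) / diam.std(axis=0)  # per-artery Z-scores
bar_score = z.mean(axis=1)                         # average within each subject
print(bar_score)  # |BAR| > 2 SD would flag extreme global diameters
```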

  16. Cardiovascular risk factors and events in women with androgen excess.

    Science.gov (United States)

    Macut, D; Antić, I B; Bjekić-Macut, J

    2015-03-01

    Androgen excess (AE) is estimated to be present in approximately 7% of adult women. Polycystic ovary syndrome (PCOS) is the most prevalent cause, followed by idiopathic hirsutism (IH), congenital adrenal hyperplasia (CAH), hyperandrogenic insulin-resistant acanthosis nigricans (HAIRAN) syndrome, and androgen-secreting neoplasms (ASNs). Increased cardiovascular risk has been implicated in women with AE. Serum testosterone independently increases risk for cardiovascular disease (CVD) and correlates even with indices of subclinical atherosclerosis in various populations of postmenopausal women. Hyperandrogenism in PCOS is closely related to the aggravation of abdominal obesity and, together with insulin resistance, forms the metabolic core for the development of CVD. However, the phenotypic variability of PCOS exerts a significant influence on cardiometabolic risks. Numerous risk factors in PCOS lead to a 5-7 times higher risk for CVD and an over 2-fold higher risk for coronary heart disease and stroke. Nevertheless, the question of cardiometabolic risk in postmenopausal women with a hyperandrogenic history remains challenging. There is significant overlap in the CVD characteristics of women with PCOS and variants of CAH. Relevant clinical data on the prevalence and cardiometabolic risk and events in women with IH, HAIRAN syndrome, or ASNs are scarce. The effects of various oral contraceptives (OCs) and antiandrogenic compounds on metabolic profile vary, which could relate to the selected populations and the different therapy regimens conducted mainly in women with PCOS. A relation between OCs containing antiandrogenic progestins and an increased risk of cardiovascular and thromboembolic events has been assumed.

  17. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Full Text Available Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  18. Event Recognition Based on Deep Learning in Chinese Texts.

    Science.gov (United States)

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  19. A model of pathways to artificial superintelligence catastrophe for risk and decision analysis

    Science.gov (United States)

    Barrett, Anthony M.; Baum, Seth D.

    2017-03-01

    An artificial superintelligence (ASI) is an artificial intelligence that is significantly more intelligent than humans in all respects. Whilst ASI does not currently exist, some scholars propose that it could be created sometime in the future, and furthermore that its creation could cause a severe global catastrophe, possibly even resulting in human extinction. Given the high stakes, it is important to analyze ASI risk and factor the risk into decisions related to ASI research and development. This paper presents a graphical model of major pathways to ASI catastrophe, focusing on ASI created via recursive self-improvement. The model uses the established risk and decision analysis modelling paradigms of fault trees and influence diagrams in order to depict combinations of events and conditions that could lead to AI catastrophe, as well as intervention options that could decrease risks. The events and conditions include select aspects of the ASI itself as well as the human process of ASI research, development and management. Model structure is derived from published literature on ASI risk. The model offers a foundation for rigorous quantitative evaluation and decision-making on the long-term risk of ASI catastrophe.
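
    As a generic illustration of the fault-tree half of such a model, the sketch below combines basic-event probabilities through AND/OR gates into a top-event probability. The event names and probabilities are hypothetical and are not taken from the paper.

```python
# Minimal fault-tree sketch: basic events combine through AND/OR gates
# (assuming independence) into a top-level catastrophe probability.
def AND(*ps):  # independent events: all must occur
    out = 1.0
    for p in ps:
        out *= p
    return out

def OR(*ps):   # independent events: at least one occurs
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

p_seed_ai      = 0.1           # hypothetical: self-improving seed AI is built
p_containment  = 0.5           # hypothetical: containment fails
p_goals_unsafe = OR(0.2, 0.3)  # unsafe goals from either of two causes
p_catastrophe  = AND(p_seed_ai, p_containment, p_goals_unsafe)
print(f"P(catastrophe) = {p_catastrophe:.3f}")
```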

  20. Restructuring of an Event Tree for a Loss of Coolant Accident in a PSA model

    International Nuclear Information System (INIS)

    Lim, Ho-Gon; Han, Sang-Hoon; Park, Jin-Hee; Jang, Seong-Chul

    2015-01-01

    Conventional risk models using PSA (Probabilistic Safety Assessment) for an NPP consider two types of accident initiators for internal events: LOCA (Loss of Coolant Accident) and transient events such as loss of electric power, loss of cooling, and so on. Traditionally, a LOCA is divided into three initiating event (IE) categories depending on the break size: small, medium, and large LOCA. In each IE group, the safety functions or systems modeled in the accident sequences are considered to be applicable regardless of the break size. However, since the safety systems or functions are not designed around a single break size, there are many mismatches between safety systems/functions and an IE, which may make the risk model conservative or, in some cases, optimistic. The present paper proposes a new methodology for accident sequence analysis for LOCA. We suggest constructing an integrated single ET for LOCA by incorporating each safety system/function and its applicable break spectrum into the ET. Integrated accident sequence analysis in terms of an ET for LOCA was proposed in the present paper. A safety function/system can be properly assigned if its applicable range is given by break set points. Also, using simple Boolean algebra with subsets of the break spectrum, the final accident sequences are expressed properly in terms of Boolean multiplication, the occurrence frequency, and the success/failure of safety systems. The accident sequence results show that the sequences are described in more detail compared with the conventional results. Unfortunately, quantitative results in terms of MCS (Minimal Cut Sets) were not given, because a system fault tree was not constructed for this analysis and the break set points for all 7 points were not given as specified numerical quantities. Further study may be needed to fix the break set points and to develop the system fault tree.

  1. Restructuring of an Event Tree for a Loss of Coolant Accident in a PSA model

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Ho-Gon; Han, Sang-Hoon; Park, Jin-Hee; Jang, Seong-Chul [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Conventional risk models using PSA (Probabilistic Safety Assessment) for an NPP consider two types of accident initiators for internal events: LOCA (Loss of Coolant Accident) and transient events such as loss of electric power, loss of cooling, and so on. Traditionally, a LOCA is divided into three initiating event (IE) categories depending on the break size: small, medium, and large LOCA. In each IE group, the safety functions or systems modeled in the accident sequences are considered to be applicable regardless of the break size. However, since the safety systems or functions are not designed around a single break size, there are many mismatches between safety systems/functions and an IE, which may make the risk model conservative or, in some cases, optimistic. The present paper proposes a new methodology for accident sequence analysis for LOCA. We suggest constructing an integrated single ET for LOCA by incorporating each safety system/function and its applicable break spectrum into the ET. Integrated accident sequence analysis in terms of an ET for LOCA was proposed in the present paper. A safety function/system can be properly assigned if its applicable range is given by break set points. Also, using simple Boolean algebra with subsets of the break spectrum, the final accident sequences are expressed properly in terms of Boolean multiplication, the occurrence frequency, and the success/failure of safety systems. The accident sequence results show that the sequences are described in more detail compared with the conventional results. Unfortunately, quantitative results in terms of MCS (Minimal Cut Sets) were not given, because a system fault tree was not constructed for this analysis and the break set points for all 7 points were not given as specified numerical quantities. Further study may be needed to fix the break set points and to develop the system fault tree.
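
    A small sketch of the integrated event-tree idea described in these two records, under invented numbers: each safety function is credited only on the subset of the break spectrum where it applies, and sequence frequencies follow from multiplying the initiating-event frequency of that subset by the function's failure probability.

```python
# Sketch of the integrated LOCA event-tree idea: safety functions are mapped
# to Boolean subsets of the break spectrum, and sequence frequencies come
# from multiplying the IE frequency over that subset by failure probability.
# All frequencies, names, and probabilities are hypothetical.
segments = {"small": 1e-3, "medium": 1e-4, "large": 1e-5}  # IE freq per year

# Applicability of each safety function over the spectrum (Boolean subset).
applies = {
    "high_pressure_injection": {"small", "medium"},
    "low_pressure_injection":  {"medium", "large"},
}
p_fail = {"high_pressure_injection": 1e-3, "low_pressure_injection": 1e-2}

for func, subset in applies.items():
    for seg in sorted(subset):
        freq = segments[seg] * p_fail[func]  # sequence: IE and function fails
        print(f"{seg} LOCA with {func} failure: {freq:.2e} /yr")
```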

  2. Estimating the value of a Country's built assets: investment-based exposure modelling for global risk assessment

    Science.gov (United States)

    Daniell, James; Pomonis, Antonios; Gunasekera, Rashmin; Ishizawa, Oscar; Gaspari, Maria; Lu, Xijie; Aubrecht, Christoph; Ungar, Joachim

    2017-04-01

    In order to quantify disaster risk, there is a need to determine consistent and reliable economic values of the built assets exposed to natural hazards at national or subnational level. The value of the built stock in the context of a city or a country is critical for risk modelling applications, as it allows the upper bound on potential losses to be established. Under the World Bank probabilistic disaster risk assessment - Country Disaster Risk Profiles (CDRP) Program and rapid post-disaster loss analyses in CATDAT, key methodologies have been developed that quantify the asset exposure of a country. In this study, we assess two complementary methods for determining the value of the building stock: capital investment data versus aggregated ground-up values based on built area and unit construction costs. Different approaches to modelling exposure around the world have resulted in estimated values of built assets of some countries differing by order(s) of magnitude. Using the aforementioned methodology of comparing investment-based capital stock and bottom-up unit-construction-cost values per square meter of assets, a suitable range of capital stock estimates for built assets has been created. A blind test format was undertaken to compare the two types of approaches, top-down (investment) and bottom-up (construction cost per unit). In many cases, census, demographic, engineering, and construction cost data are key for bottom-up calculations from previous years. Similarly, for the top-down investment approach, distributed GFCF (Gross Fixed Capital Formation) data is also required. Over the past few years, numerous studies have been undertaken through the World Bank Caribbean and Central America disaster risk assessment program adopting this methodology, initially developed by Gunasekera et al. (2015). The range of values of the building stock is tested for around 15 countries. In addition, three types of costs - Reconstruction cost
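
    The contrast between the two exposure estimates can be sketched as follows; all figures are invented. The top-down path accumulates construction investment (GFCF) with depreciation, while the bottom-up path multiplies built area by a unit replacement cost.

```python
# Sketch contrasting top-down (investment-based) and bottom-up (built-area)
# exposure estimates. Figures are invented for illustration only.

# Top-down: perpetual-inventory style accumulation of construction investment
gfcf_construction = [4.0e9] * 30       # annual construction GFCF, USD
depreciation = 0.02
stock = 0.0
for inv in gfcf_construction:
    stock = stock * (1 - depreciation) + inv
print(f"Top-down capital stock:  {stock:.3e} USD")

# Bottom-up: built area times unit replacement cost per square meter
built_area_m2 = 250e6
unit_cost_per_m2 = 400.0
print(f"Bottom-up replacement:   {built_area_m2 * unit_cost_per_m2:.3e} USD")
# A large disagreement between the two flags exposure-model uncertainty.
```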

  3. Structured Event-B Models and Proofs

    DEFF Research Database (Denmark)

    Hallerstede, Stefan

    2010-01-01

    Event-B does not provide specific support for the modelling of problems that require some structuring, such as, local variables or sequential ordering of events. All variables need to be declared globally and sequential ordering of events can only be achieved by abstract program counters. This ha...

  4. Pathway index models for construction of patient-specific risk profiles.

    Science.gov (United States)

    Eng, Kevin H; Wang, Sijian; Bradley, William H; Rader, Janet S; Kendziorski, Christina

    2013-04-30

    Statistical methods for variable selection, prediction, and classification have proven extremely useful in moving personalized genomics medicine forward, in particular, leading to a number of genomic-based assays now in clinical use for predicting cancer recurrence. Although invaluable in individual cases, the information provided by these assays is limited. Most often, a patient is classified into one of very few groups (e.g., recur or not), limiting the potential for truly personalized treatment. Furthermore, although these assays provide information on which individuals are at most risk (e.g., those for which recurrence is predicted), they provide no information on the aberrant biological pathways that give rise to the increased risk. We have developed an approach to address these limitations. The approach models a time-to-event outcome as a function of known biological pathways, identifies important genomic aberrations, and provides pathway-based patient-specific assessments of risk. As we demonstrate in a study of ovarian cancer from The Cancer Genome Atlas project, the patient-specific risk profiles are powerful and efficient characterizations useful in addressing a number of questions related to identifying informative patient subtypes and predicting survival. Copyright © 2012 John Wiley & Sons, Ltd.

  5. Joint two-part Tobit models for longitudinal and time-to-event data.

    Science.gov (United States)

    Dagne, Getachew A

    2017-11-20

    In this article, we show how Tobit models can address problems of identifying characteristics of subjects having left-censored outcomes in the context of developing a method for jointly analyzing time-to-event and longitudinal data. There are some methods for handling these types of data separately, but they may not be appropriate when time to event is dependent on the longitudinal outcome, and a substantial portion of values are reported to be below the limits of detection. An alternative approach is to develop a joint model for the time-to-event outcome and a two-part longitudinal outcome, linking them through random effects. This proposed approach is implemented to assess the association between the risk of decline of CD4/CD8 ratio and rates of change in viral load, along with discriminating between patients who are potentially progressors to AIDS from patients who do not. We develop a fully Bayesian approach for fitting joint two-part Tobit models and illustrate the proposed methods on simulated and real data from an AIDS clinical study. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Preventable coronary heart disease events from control of cardiovascular risk factors in US adults with diabetes (projections from utilizing the UKPDS risk engine).

    Science.gov (United States)

    Wong, Nathan D; Patao, Christopher; Malik, Shaista; Iloeje, Uchenna

    2014-04-15

    Type 2 diabetes mellitus (T2DM) carries significant risks for coronary heart disease (CHD). We examined the potential US population impact of single and composite risk factor control. Among US adults with diagnosed T2DM aged≥30 years in the National Health and Nutrition Examination Survey 2007 to 2012, we assessed CHD events preventable using the United Kingdom Prospective Diabetes Study CHD risk engine. We examined in all those not at goal the impact of statistical control of smoking, glycated hemoglobin, systolic blood pressure, and total and high-density lipoprotein cholesterol, according to the predefined criteria setting risk factors at different levels of control representing (1) "All to Goal," (2) at "Nominal Control," or (3) at "Aggressive Control." Preventable CHD events represented the difference between the number of events estimated from the control of these risk factors versus current levels of the risk factors. Of 606 men (representing 6.2 million) and 603 women (6.3 million) with DM and no previous CHD, 1.3 million men and 0.7 million women would develop a CHD event within 10 years if left uncontrolled. Controlling all risk factors to goal was projected to prevent 35% and 45% of CHD events in men and women, respectively. Nominal risk factor control was projected to prevent 36% and 38% and aggressive control 51% and 61% of CHD events, respectively. In conclusion, a significant proportion of CHD events in adults with T2DM could be prevented from composite control of risk factors often not at goal. Copyright © 2014 Elsevier Inc. All rights reserved.
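
    A schematic version of the preventable-events calculation, with a placeholder risk function standing in for the UKPDS engine (the published equations are not reproduced here) and invented patient data:

```python
# Sketch of preventable events: apply a 10-year CHD risk engine to each
# person's current vs. controlled risk-factor levels and sum the difference.
import numpy as np

def ukpds_risk(hba1c, sbp, tc_hdl_ratio, smoker):
    """Placeholder 10-year CHD risk; NOT the published UKPDS equations."""
    score = (0.03 * (hba1c - 6) + 0.002 * (sbp - 120)
             + 0.04 * (tc_hdl_ratio - 3.5) + 0.3 * smoker)
    return float(np.clip(0.10 + score, 0, 1))

cohort = [  # (HbA1c %, systolic BP, TC/HDL ratio, smoker) - hypothetical
    (8.5, 150, 5.0, 1),
    (7.2, 138, 4.2, 0),
]
controlled = [(7.0, 130, 3.5, 0)] * len(cohort)  # "all to goal" scenario

current = sum(ukpds_risk(*p) for p in cohort)
at_goal = sum(ukpds_risk(*p) for p in controlled)
print(f"Preventable 10-year CHD events per {len(cohort)} patients: "
      f"{current - at_goal:.2f}")
```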

  7. Foundations for Streaming Model Transformations by Complex Event Processing.

    Science.gov (United States)

    Dávid, István; Ráth, István; Varró, Dániel

    2018-01-01

    Streaming model transformations represent a novel class of transformations to manipulate models whose elements are continuously produced or modified in high volume and with rapid rate of change. Executing streaming transformations requires efficient techniques to recognize activated transformation rules over a live model and a potentially infinite stream of events. In this paper, we propose foundations of streaming model transformations by innovatively integrating incremental model query, complex event processing (CEP) and reactive (event-driven) transformation techniques. Complex event processing allows to identify relevant patterns and sequences of events over an event stream. Our approach enables event streams to include model change events which are automatically and continuously populated by incremental model queries. Furthermore, a reactive rule engine carries out transformations on identified complex event patterns. We provide an integrated domain-specific language with precise semantics for capturing complex event patterns and streaming transformations together with an execution engine, all of which is now part of the Viatra reactive transformation framework. We demonstrate the feasibility of our approach with two case studies: one in an advanced model engineering workflow; and one in the context of on-the-fly gesture recognition.
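
    A toy sketch of the complex-event-processing ingredient, in Python rather than the Viatra DSL: detect the pattern "create(x) followed by delete(x) within a short window" over a stream of model-change events. The event encoding and window semantics are simplifications, not the paper's language.

```python
# Toy CEP sketch: match "create(x) followed by delete(x)" within a sliding
# window over a stream of model-change events.
from collections import deque

def detect(stream, window=3):
    recent = deque(maxlen=window)             # last `window` events seen
    for event in stream:                      # event = (kind, element_id)
        for kind, elem in recent:
            if kind == "create" and event == ("delete", elem):
                yield ("create-then-delete", elem)
        recent.append(event)

changes = [("create", "n1"), ("update", "n1"), ("delete", "n1"), ("create", "n2")]
for match in detect(changes):
    print(match)   # -> ('create-then-delete', 'n1')
```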

  8. Risks of cardiovascular adverse events and death in patients with previous stroke undergoing emergency noncardiac, nonintracranial surgery

    DEFF Research Database (Denmark)

    Christiansen, Mia N.; Andersson, Charlotte; Gislason, Gunnar H.

    2017-01-01

    Background: The outcomes of emergent noncardiac, nonintracranial surgery in patients with previous stroke remain unknown. Methods: All emergency surgeries performed in Denmark (2005 to 2011) were analyzed according to the time elapsed between previous ischemic stroke and surgery. The risks of 30-day mortality and major adverse cardiovascular events were estimated as odds ratios (ORs) and 95% CIs using adjusted logistic regression models in a priori defined groups (reference was no previous stroke). In patients undergoing surgery immediately (within 1 to 3 days) or early after stroke (within 4 to 14 days) ... and general anesthesia was less frequent in patients with previous stroke (all P values significant). Risks of major adverse cardiovascular events and mortality were high for patients with stroke less than 3 months before surgery (20.7 and 16.4% events; OR = 4.71 [95% CI, 4.18 to 5.32] and 1.65 [95% CI, 1.45 to 1.88]), and remained...

  9. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    Science.gov (United States)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research, we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors

  10. Semiparametric accelerated failure time cure rate mixture models with competing risks.

    Science.gov (United States)

    Choi, Sangbum; Zhu, Liang; Huang, Xuelin

    2018-01-15

    Modern medical treatments have substantially improved survival rates for many chronic diseases and have generated considerable interest in developing cure fraction models for survival data with a non-ignorable cured proportion. Statistical analysis of such data may be further complicated by competing risks that involve multiple types of endpoints. Regression analysis of competing risks is typically undertaken via a proportional hazards model adapted on cause-specific hazard or subdistribution hazard. In this article, we propose an alternative approach that treats competing events as distinct outcomes in a mixture. We consider semiparametric accelerated failure time models for the cause-conditional survival function that are combined through a multinomial logistic model within the cure-mixture modeling framework. The cure-mixture approach to competing risks provides a means to determine the overall effect of a treatment and insights into how this treatment modifies the components of the mixture in the presence of a cure fraction. The regression and nonparametric parameters are estimated by a nonparametric kernel-based maximum likelihood estimation method. Variance estimation is achieved through resampling methods for the kernel-smoothed likelihood function. Simulation studies show that the procedures work well in practical settings. Application to a sarcoma study demonstrates the use of the proposed method for competing risk data with a cure fraction. Copyright © 2017 John Wiley & Sons, Ltd.
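
    In generic notation (not copied from the paper), the cure-mixture decomposition underlying this approach can be written as:

```latex
% Cure-mixture representation of competing risks (standard form; the
% notation here is generic, not taken verbatim from the paper):
\[
  S(t) = \pi_0 + \sum_{k=1}^{K} \pi_k \, S_k(t), \qquad
  \sum_{k=0}^{K} \pi_k = 1,
\]
% where \pi_0 is the cured fraction, \pi_k (from a multinomial logistic
% model) is the probability of eventually failing from cause k, and
% S_k(t) is the cause-conditional survival, modeled as an accelerated
% failure time regression:
\[
  \log T_k = \mathbf{x}^\top \boldsymbol{\beta}_k + \varepsilon_k .
\]
```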

  11. Evaluation model for safety capacity of chemical industrial park based on acceptable regional risk

    Institute of Scientific and Technical Information of China (English)

    Guohua Chen; Shukun Wang; Xiaoqun Tan

    2015-01-01

    The paper defines the Safety Capacity of a Chemical Industrial Park (SCCIP) from the perspective of acceptable regional risk. To explore an evaluation model for the SCCIP, a method based on quantitative risk assessment was adopted to evaluate transport risk and to determine a reasonable safe transport capacity for a chemical industrial park; combined with the safe storage capacity, an SCCIP evaluation model was then put forward. The SCCIP is the smaller of the maximum safe storage capacity and the maximum safe transport capacity; otherwise, the regional risk of the park would exceed the acceptable level. The developed method was applied to a chemical industrial park in Guangdong province to obtain the maximum safe transport capacity and the SCCIP. The results can be used effectively in regional risk control of the park.
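
    The stated decision rule reduces to taking a minimum, as in this trivial sketch with invented capacities:

```python
# Sketch of the stated decision rule: the park's safety capacity is the
# smaller of the maximum safe storage and transport capacities, both derived
# from keeping regional risk below an acceptable threshold. Values invented.
max_safe_storage_t = 12_000.0    # tonnes, from storage QRA
max_safe_transport_t = 9_500.0   # tonnes, from transport QRA

sccip = min(max_safe_storage_t, max_safe_transport_t)
print(f"SCCIP = {sccip} tonnes")  # exceeding this breaches acceptable risk
```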

  12. Development of Toxicological Risk Assessment Models for Acute and Chronic Exposure to Pollutants

    Directory of Open Access Journals (Sweden)

    Elke S. Reichwaldt

    2016-08-01

    Full Text Available Alert level frameworks advise agencies on a sequence of monitoring and management actions, and are implemented so as to reduce the risk of the public coming into contact with hazardous substances. Their effectiveness relies on the detection of the hazard, but with many systems not receiving any regular monitoring, pollution events often go undetected. We developed toxicological risk assessment models for acute and chronic exposure to pollutants that incorporate the probabilities that the public will come into contact with undetected pollution events, to identify the level of risk a system poses with regard to the pollutant. As a proof of concept, we successfully demonstrated that the models could be applied to determine probabilities of acute and chronic illness types related to recreational activities in waterbodies containing cyanotoxins. Using the acute model, we identified lakes that present a 'high' risk of developing Day Away From Work illness, and lakes that present a 'low' or 'medium' risk of developing First Aid Cases when used for swimming. The developed risk models succeeded in categorising lakes according to their risk level to the public in an objective way. Modelling by how much the probability of public exposure has to decrease to lower the risks to acceptable levels will enable authorities to identify suitable control measures and monitoring strategies. We suggest broadening the application of these models to other contaminants.

  13. Increased long-term risk of major adverse cardiovascular events in patients with carbon monoxide poisoning: A population-based study in Taiwan.

    Directory of Open Access Journals (Sweden)

    Chung-Shun Wong

    Full Text Available Carbon monoxide (CO) poisoning may cause toxicity to the cardiovascular system. However, the association between CO poisoning and the risk of major adverse cardiovascular events (MACE) remains unestablished. We investigated the incidence of MACE after CO poisoning in Taiwan and evaluated whether CO-poisoned individuals had a higher risk of MACE than did the general population. Using Taiwan's National Health Insurance Research Database (NHIRD) during 2005-2013, a nationwide population-based cohort study was conducted among patients who experienced CO poisoning between 2005 and 2013. CO poisoning was defined according to the International Classification of Diseases, Ninth Revision, Clinical Modification codes. The study cohort comprised patients with CO poisoning between 2005 and 2010 (N = 13,939). Each patient was matched according to age, sex and index date with four randomly selected controls from the comparison cohort (N = 55,756). All patients were followed from the study date until MACE development, death, or the end of 2013. The hazard ratios for MACE were compared between the two cohorts using Cox proportional hazards regression analyses. Incident cases of MACE were identified from the NHIRD. After adjustment for potential confounders, the study cohort was independently associated with a higher MACE risk (adjusted hazard ratio, 2.00; 95% confidence interval, 1.83-2.18). This population-based cohort study indicated that patients with CO poisoning have a higher risk of MACE than do individuals without CO poisoning.

  14. Modeling of Flood Risk for the Continental United States

    Science.gov (United States)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, flood, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We will pay particular attention to correlation (spatial and temporal), simulation, and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. US Midwest flood 2008, US Northeast flood 2010) have increased concern about flood risk. Consequently, there is a growing need to adequately assess flood risk. The RMS flood hazard model comprises three major components. (1) Stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) Rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, this allows us to correlate streamflow, and hence flooding, from different rivers, as well as low and high return periods across the continental US. (3) Flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison with observations and published data. Output from

  15. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    Science.gov (United States)

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for the best covariate to split on from that of the best split point search for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and it consists of categorical covariates with most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extremely drug resistant tuberculosis (XDR TB), which consists of mainly categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data that consists of covariates with many split-points, based on the values of the bootstrap cross-validated estimates for integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods in analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of covariates of the dataset in question.

  16. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data

    Directory of Open Access Journals (Sweden)

    Justine B. Nasejje

    2017-07-01

    Full Text Available Abstract Background Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for the best covariate to split on from that of the best split point search for the selected covariate. Methods In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and it consists of categorical covariates with most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extremely drug resistant tuberculosis (XDR TB), which consists of mainly categorical covariates with two levels (few split-points). Results The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data that consists of covariates with many split-points, based on the values of the bootstrap cross-validated estimates for integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Conclusion Although survival forests are promising methods in analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of covariates of the dataset in question.

  17. Omega-3 dietary supplements and the risk of cardiovascular events: a systematic review.

    Science.gov (United States)

    Marik, Paul E; Varon, Joseph

    2009-07-01

    Epidemiologic data suggest that omega-3 fatty acids derived from fish oil reduce cardiovascular disease. The clinical benefit of dietary fish oil supplementation in preventing cardiovascular events in both high and low risk patients is unclear. To assess whether dietary supplements of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) decrease cardiovascular events across a spectrum of patients. MEDLINE, Embase, the Cochrane Database of Systematic Reviews, and citation review of relevant primary and review articles. Prospective, randomized, placebo-controlled clinical trials that evaluated clinical cardiovascular end points (cardiovascular death, sudden death, and nonfatal cardiovascular events) and all-cause mortality in patients randomized to EPA/DHA or placebo. We only included studies that used dietary supplements of EPA/DHA which were administered for at least 1 year. Data were abstracted on study design, study size, type and dose of omega-3 supplement, cardiovascular events, all-cause mortality, and duration of follow-up. Studies were grouped according to the risk of cardiovascular events (high risk and moderate risk). Meta-analytic techniques were used to analyze the data. We identified 11 studies that included a total of 39 044 patients. The studies included patients after recent myocardial infarction, those with an implanted cardioverter defibrillator, and patients with heart failure, peripheral vascular disease, and hypercholesterolemia. The average dose of EPA/DHA was 1.8 +/- 1.2 g/day and the mean duration of follow-up was 2.2 +/- 1.2 years. Dietary supplementation with omega-3 fatty acids significantly reduced the risk of cardiovascular deaths (odds ratio [OR]: 0.87, 95% confidence interval [CI]: 0.79-0.95, p = 0.002), sudden cardiac death (OR: 0.87, 95% CI: 0.76-0.99, p = 0.04), all-cause mortality (OR: 0.92, 95% CI: 0.85-0.99, p = 0.02), and nonfatal cardiovascular events (OR: 0.92, 95% CI: 0.85-0.99, p = 0.02). The mortality benefit was
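
    A minimal sketch of the fixed-effect inverse-variance pooling that underlies pooled odds ratios of this kind; the study-level ORs and confidence intervals below are invented, not the trials in this review.

```python
# Sketch of fixed-effect inverse-variance meta-analysis of odds ratios:
# back out the SE of each log-OR from its 95% CI, weight by 1/SE^2, pool.
import math

studies = [  # (OR, lower 95% CI, upper 95% CI) - hypothetical trials
    (0.85, 0.70, 1.03),
    (0.90, 0.78, 1.04),
    (0.82, 0.65, 1.03),
]

num = den = 0.0
for or_, lo, hi in studies:
    log_or = math.log(or_)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
    w = 1.0 / se**2
    num += w * log_or
    den += w

log_pooled = num / den
se_pooled = math.sqrt(1.0 / den)
lo95 = math.exp(log_pooled - 1.96 * se_pooled)
hi95 = math.exp(log_pooled + 1.96 * se_pooled)
print(f"Pooled OR {math.exp(log_pooled):.2f} (95% CI {lo95:.2f}-{hi95:.2f})")
```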

  18. A model-based approach to operational event groups ranking

    Energy Technology Data Exchange (ETDEWEB)

    Simic, Zdenko [European Commission Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport; Maqua, Michael [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Surete Nucleaire (IRSN), Fontenay-aux-Roses (France)

    2014-04-15

    Operational experience (OE) feedback provides improvements in all industrial activities. Identification of the most important and valuable groups of events within the accumulated experience is important in order to focus detailed investigation on those events. The paper describes the new ranking method and compares it with three others. The methods are described and applied to twenty years of OE events from nuclear power plants in France and Germany. The results show that the different ranking methods only roughly agree on which of the event groups are the most important ones. In the new ranking method, the analytic hierarchy process is applied in order to ensure consistent and comprehensive weight determination for the ranking indexes. The proposed method allows transparent and flexible ranking of event groups and identification of the most important OE for further, more detailed investigation in order to complete the feedback. (orig.)
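
    A short sketch of the analytic-hierarchy-process step: priority weights for the ranking indexes are taken from the principal eigenvector of a pairwise-comparison matrix, with a consistency index as a sanity check. The comparison values are illustrative, not the paper's.

```python
# AHP sketch: derive index weights from a Saaty-style pairwise-comparison
# matrix via its principal eigenvector; check consistency.
import numpy as np

# A[i, j] = how much more important index i is than index j (Saaty scale)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                     # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                    # normalized priority weights
ci = (eigvals[k].real - len(A)) / (len(A) - 1)  # consistency index
print("weights:", np.round(w, 3), "CI:", round(ci, 3))
```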

  19. Polytomous diagnosis of ovarian tumors as benign, borderline, primary invasive or metastatic: development and validation of standard and kernel-based risk prediction models

    Directory of Open Access Journals (Sweden)

    Testa Antonia C

    2010-10-01

    Full Text Available Abstract Background Hitherto, risk prediction models for preoperative ultrasound-based diagnosis of ovarian tumors were dichotomous (benign versus malignant). We develop and validate polytomous models (models that predict more than two events) to diagnose ovarian tumors as benign, borderline, primary invasive or metastatic invasive. The main focus is on how different types of models perform and compare. Methods A multi-center dataset containing 1066 women was used for model development and internal validation, whilst another multi-center dataset of 1938 women was used for temporal and external validation. Models were based on standard logistic regression and on penalized kernel-based algorithms (least squares support vector machines and kernel logistic regression). We used true polytomous models as well as combinations of dichotomous models based on the 'pairwise coupling' technique to produce polytomous risk estimates. Careful variable selection was performed, based largely on cross-validated c-index estimates. Model performance was assessed with the dichotomous c-index (i.e. the area under the ROC curve) and a polytomous extension, and with calibration graphs. Results For all models, between 9 and 11 predictors were selected. Internal validation was successful with polytomous c-indexes between 0.64 and 0.69. For the best model, dichotomous c-indexes were between 0.73 (primary invasive vs metastatic) and 0.96 (borderline vs metastatic). On temporal and external validation, overall discrimination performance was good with polytomous c-indexes between 0.57 and 0.64. However, discrimination between primary and metastatic invasive tumors decreased to near random levels. Standard logistic regression performed well in comparison with advanced algorithms, and combining dichotomous models performed well in comparison with true polytomous models. The best model was a combination of dichotomous logistic regression models. This model is available online.
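
    A sketch of the 'pairwise coupling' step: pairwise class probabilities are combined into a single polytomous distribution. This uses the simple closed-form method of Price et al.; the paper's exact variant may differ, and the pairwise probabilities below are invented.

```python
# Pairwise coupling sketch: turn pairwise probabilities
# r[i, j] = P(class i | class is i or j) into one polytomous distribution
# using the closed form p_i = 1 / (sum_{j != i} 1/r_ij - (K - 2)).
import numpy as np

K = 4  # benign, borderline, primary invasive, metastatic
r = np.array([            # hypothetical, with r[j, i] = 1 - r[i, j]
    [0.0, 0.8, 0.7, 0.9],
    [0.2, 0.0, 0.4, 0.6],
    [0.3, 0.6, 0.0, 0.6],
    [0.1, 0.4, 0.4, 0.0],
])

p = np.empty(K)
for i in range(K):
    inv_sum = sum(1.0 / r[i, j] for j in range(K) if j != i)
    p[i] = 1.0 / (inv_sum - (K - 2))
p /= p.sum()              # normalize to a proper distribution
print(np.round(p, 3))
```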

  20. Advanced uncertainty modelling for container port risk analysis.

    Science.gov (United States)

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to traditional port risk analysis, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Pharmacogenetics-based area-under-curve model can predict efficacy and adverse events from axitinib in individual patients with advanced renal cell carcinoma.

    Science.gov (United States)

    Yamamoto, Yoshiaki; Tsunedomi, Ryouichi; Fujita, Yusuke; Otori, Toru; Ohba, Mitsuyoshi; Kawai, Yoshihisa; Hirata, Hiroshi; Matsumoto, Hiroaki; Haginaka, Jun; Suzuki, Shigeo; Dahiya, Rajvir; Hamamoto, Yoshihiko; Matsuyama, Kenji; Hazama, Shoichi; Nagano, Hiroaki; Matsuyama, Hideyasu

    2018-03-30

    We investigated the relationship between axitinib pharmacogenetics and clinical efficacy/adverse events in advanced renal cell carcinoma (RCC) and established a model to predict clinical efficacy and adverse events using pharmacokinetics and gene polymorphisms related to drug metabolism and efflux in a phase II trial. We prospectively evaluated the area under the plasma concentration-time curve (AUC) of axitinib, the objective response rate, and adverse events in 44 consecutive advanced RCC patients treated with axitinib. To establish a model for predicting clinical efficacy and adverse events, polymorphisms in genes including ABC transporters (ABCB1 and ABCG2), UGT1A, and OR2B11 were analyzed by whole-exome sequencing, Sanger sequencing, and DNA microarray. To validate this prediction model, the AUC calculated from 6 gene polymorphisms was prospectively compared with the actual AUC in 16 additional consecutive patients. Actual AUC significantly correlated with the objective response rate (P = 0.0002) and adverse events (hand-foot syndrome, P = 0.0055; and hypothyroidism, P = 0.0381). Calculated AUC significantly correlated with actual AUC, and the AUC calculated before treatment precisely predicted the actual AUC after axitinib treatment (P = 0.0066). Our pharmacogenetics-based AUC prediction model may determine the optimal initial dose of axitinib, and thus facilitate better treatment of patients with advanced RCC.

  2. Quantitative Risk Modeling of Fire on the International Space Station

    Science.gov (United States)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  3. Selection of low-risk design guidelines for energetic events

    International Nuclear Information System (INIS)

    Ferguson, D.; Marchaterre, J.; Graham, J.

    1982-01-01

    This paper recommends the establishment of specific design guidelines for protection against potential, but low-probability, energetic events. These guidelines recognize the plant protective features incorporated to prevent such events, as well as the inherent capability of the plant to accommodate a certain level of energy release. Further, their application is recommended within the context of necessary standardized and agreed-upon acceptance criteria which are less restrictive than ASME code requirements. The paper provides the background upon which the selection of the design is made, including the characterization of energetic events dependent on various core-design parameters, and including the necessity of a low-risk design balanced between prevention of accidents and the mitigation of consequences

  4. Modeling the risk of water pollution by pesticides from imbalanced data.

    Science.gov (United States)

    Trajanov, Aneta; Kuzmanovski, Vladimir; Real, Benoit; Perreau, Jonathan Marks; Džeroski, Sašo; Debeljak, Marko

    2018-04-30

    The pollution of ground and surface waters with pesticides is a serious ecological issue that requires adequate treatment. Most of the existing water pollution models are mechanistic mathematical models. While they have made a significant contribution to understanding the transfer processes, they face the problem of validation because of their complexity, the user subjectivity in their parameterization, and the lack of empirical data for validation. In addition, the data describing water pollution with pesticides are, in most cases, very imbalanced. This is due to strict regulations for pesticide applications, which lead to only a few pollution events. In this study, we propose the use of data mining to build models for assessing the risk of water pollution by pesticides in field-drained outflow water. Unlike the mechanistic models, the models generated by data mining are based on easily obtainable empirical data, while the parameterization of the models is not influenced by the subjectivity of ecological modelers. We used empirical data from field trials at the La Jaillière experimental site in France and applied the random forests algorithm to build predictive models that predict "risky" and "not-risky" pesticide application events. To address the problems of the imbalanced classes in the data, cost-sensitive learning and different measures of predictive performance were used. Despite the high imbalance between risky and not-risky application events, we managed to build predictive models that make reliable predictions. The proposed modeling approach can be easily applied to other ecological modeling problems where we encounter empirical data with highly imbalanced classes.
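
    A minimal sketch of cost-sensitive random forests for such imbalanced "risky" vs "not-risky" data, using scikit-learn's class weighting and imbalance-aware metrics; the data are synthetic, not the La Jaillière trials.

```python
# Sketch of cost-sensitive random forests for imbalanced binary data:
# class weights penalize missing the rare "risky" events, and evaluation
# uses metrics that are robust to class imbalance.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score, f1_score

rng = np.random.default_rng(42)
n = 2000
X = rng.normal(size=(n, 5))   # stand-ins for, e.g., rainfall, dose, soil state
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 2.5).astype(int)  # rare

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(
    n_estimators=300, class_weight="balanced", random_state=0
).fit(X_tr, y_tr)

pred = clf.predict(X_te)
print("balanced accuracy:", round(balanced_accuracy_score(y_te, pred), 3))
print("F1 (risky class): ", round(f1_score(y_te, pred), 3))
```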

  5. Knowledge base about earthquakes as a tool to minimize strong events consequences

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of a knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used for the calibration of near-real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings, and casualty estimates. Such calibration compensates for factors that influence the reliability of expected damage and loss assessments in "emergency" mode. The knowledge base contains descriptions of the consequences of past earthquakes for the area under study. It also includes the distribution of the built environment and population at the time of event occurrence. Computer simulation of the events recorded in the knowledge base allows determination of sets of regional calibration coefficients, including ratings of seismological surveys, peculiarities of shaking intensity attenuation, and changes in building stock and population distribution, in order to minimize the error of damaging-earthquake loss estimates in "emergency" mode. References 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used in Worldwide Systems for Earthquake Loss Estimation in Emergency Mode: Wenchuan Earthquake. In: Proc. TIEMS2010 Conference, Beijing, China, 2010. 3. Frolova, N. I., Larionov, V. I., Bonnin, J., Sushchev, S. P., Ugarov, A. N., Kozlov, M. A.: Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards, vol. 84, ISSN 0921-030, DOI 10.1007/s11069-016-2653

  6. The Effect of Task Duration on Event-Based Prospective Memory: A Multinomial Modeling Approach

    Directory of Open Access Journals (Sweden)

    Hongxia Zhang

    2017-11-01

    Full Text Available Remembering to perform an action when a specific event occurs is referred to as Event-Based Prospective Memory (EBPM). This study investigated how EBPM performance is affected by task duration by having university students (n = 223) perform an EBPM task embedded within an ongoing computer-based color-matching task. We separated the overall task duration into the filler task duration and the ongoing task duration. The filler task duration is the length of time between forming the intention and the beginning of the ongoing task, and the ongoing task duration is the length of time between the beginning of the ongoing task and the appearance of the first Prospective Memory (PM) cue. Each duration was divided into three levels (3, 6, and 9 min), and the two factors were orthogonally manipulated between subjects. A multinomial processing tree model was used to separate the effects of the task durations on the two EBPM components, and a mediation model was created to verify whether task duration influences EBPM via self-reminding or discrimination. The results reveal three points. (1) Lengthening the ongoing task duration had a negative effect on EBPM performance, while lengthening the filler task duration had no significant effect. (2) As the filler task was lengthened, both the prospective and retrospective components showed a decreasing and then increasing trend; when the ongoing task duration was lengthened, the prospective component decreased while the retrospective component significantly increased. (3) The mediating effect of discrimination between task duration and EBPM performance was significant. We conclude that different task durations influence EBPM performance through different components, with discrimination mediating the relation between task duration and EBPM performance.
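    As a rough, hypothetical illustration of how a multinomial processing tree separates a prospective component (remembering that an intention exists) from a retrospective component (recognizing the PM cue), the toy two-parameter model below is fitted by maximum likelihood. The tree, the counts, and the use of a separate cue-recognition test to identify the retrospective parameter are all simplifying assumptions, not the study's published model.

    ```python
    # Toy two-parameter MPT: a PM hit requires both the prospective
    # component p and the retrospective component r, so P(hit) = p * r.
    # A later cue-recognition test isolates r, making p identifiable.
    # All counts are invented for illustration.
    import numpy as np
    from scipy.optimize import minimize

    pm_hit, pm_miss = 62, 38   # hypothetical PM-cue trial outcomes
    rec_yes, rec_no = 85, 15   # hypothetical cue-recognition outcomes

    def neg_log_lik(params):
        p, r = params
        q = p * r  # probability of a PM hit under the tree
        ll = (pm_hit * np.log(q) + pm_miss * np.log1p(-q)
              + rec_yes * np.log(r) + rec_no * np.log1p(-r))
        return -ll

    res = minimize(neg_log_lik, x0=[0.7, 0.8],
                   bounds=[(1e-6, 1 - 1e-6)] * 2)
    print("prospective p, retrospective r:", res.x)
    ```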

  7. Concurrency Models with Causality and Events as Psi-calculi

    Directory of Open Access Journals (Sweden)

    Håkon Normann

    2014-10-01

    Full Text Available Psi-calculi are a parametric framework for nominal calculi, where standard calculi are found as instances, like the pi-calculus, or the cryptographic spi-calculus and applied-pi. Psi-calculi have an interleaving operational semantics, with a strong foundation in the theory of nominal sets and process algebras. Much of the expressive power of psi-calculi comes from their logical part, i.e., assertions, conditions, and entailment, which are left quite open, thus accommodating a wide range of logics. We are interested in how this expressiveness can deal with event-based models of concurrency. We thus take the popular prime event structures model and give an encoding into an instance of psi-calculi. We also take the recent and expressive model of Dynamic Condition Response Graphs (in which event structures are strictly included) and give an encoding into another corresponding instance of psi-calculi. The encodings that we achieve look rather natural and intuitive. Additional results about these encodings give us more confidence in their correctness.
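    For readers unfamiliar with the source model of the encoding: a prime event structure is just a set of events with a causality order and a conflict relation. The sketch below is an illustrative toy, unrelated to the psi-calculi encoding itself; it shows the two conditions a valid configuration must satisfy.

    ```python
    # Minimal prime event structure: events, a causality relation, and a
    # symmetric conflict relation, plus a configuration check.
    # The example events and relations are invented.
    from itertools import combinations

    events   = {"a", "b", "c"}
    causes   = {("a", "b")}             # a must occur before b
    conflict = {frozenset({"b", "c"})}  # b and c can never co-occur

    def is_configuration(config):
        # Causally closed: every cause of a member is also a member.
        closed = all(x in config for (x, y) in causes if y in config)
        # Conflict-free: no two members are in conflict.
        free = all(frozenset(pair) not in conflict
                   for pair in combinations(config, 2))
        return closed and free

    print(is_configuration({"a", "b"}))  # True
    print(is_configuration({"b"}))       # False: cause a is missing
    print(is_configuration({"b", "c"}))  # False: missing cause, and b # c
    ```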

  8. A Nonparametric Operational Risk Modeling Approach Based on Cornish-Fisher Expansion

    Directory of Open Access Journals (Sweden)

    Xiaoqian Zhu

    2014-01-01

    Full Text Available It is generally accepted that the choice of severity distribution in the loss distribution approach has a significant effect on operational risk capital estimation. However, the commonly used parametric approaches with predefined distribution assumptions might not be able to fit the severity distribution accurately. The objective of this paper is to propose a nonparametric operational risk modeling approach based on the Cornish-Fisher expansion. In this approach, samples of severity are generated by the Cornish-Fisher expansion and then used in a Monte Carlo simulation to sketch the annual operational loss distribution. In the experiment, the proposed approach is employed to calculate the operational risk capital charge for the overall Chinese banking sector. The experiment dataset is, as far as we know, the most comprehensive operational risk dataset in China. The results show that the proposed approach is able to use the information of higher-order moments and might be more effective and stable than the usual parametric approach.
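    The severity-sampling step can be sketched as follows: map uniform draws through a Cornish-Fisher-adjusted normal quantile built from the first four empirical moments, then compound with an assumed Poisson frequency in a Monte Carlo loop. All moment and frequency values below are illustrative, not the Chinese banking dataset.

    ```python
    # Nonparametric severity sampling via the fourth-order Cornish-Fisher
    # expansion, aggregated into an annual loss distribution by Monte Carlo.
    import numpy as np
    from scipy.stats import norm

    # Empirical moments of log-severity (illustrative numbers).
    mu, sigma, skew, exkurt = 10.0, 2.0, 0.8, 1.5

    def cornish_fisher_quantile(u):
        """Map uniform u to a severity via the CF-adjusted quantile."""
        z = norm.ppf(u)
        z_cf = (z
                + (z**2 - 1) * skew / 6
                + (z**3 - 3 * z) * exkurt / 24
                - (2 * z**3 - 5 * z) * skew**2 / 36)
        return np.exp(mu + sigma * z_cf)   # severities kept positive

    rng = np.random.default_rng(1)
    n_years = 100_000
    annual_losses = np.empty(n_years)
    for i in range(n_years):
        n_events = rng.poisson(3.0)        # assumed frequency distribution
        annual_losses[i] = cornish_fisher_quantile(rng.random(n_events)).sum()

    # Capital charge as a high quantile of the annual loss distribution.
    print("99.9% OpVaR:", np.quantile(annual_losses, 0.999))
    ```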

  9. Risk of Suicidal Events With Atomoxetine Compared to Stimulant Treatment: A Cohort Study.

    Science.gov (United States)

    Linden, Stephan; Bussing, Regina; Kubilis, Paul; Gerhard, Tobias; Segal, Richard; Shuster, Jonathan J; Winterstein, Almut G

    2016-05-01

    Antidepressant effects on increased suicidality in children have raised public concern in recent years. Approved in 2002 for attention-deficit/hyperactivity disorder treatment, the selective noradrenalin-reuptake-inhibitor atomoxetine was initially investigated for the treatment of depression. In post-hoc analyses of clinical trial data, atomoxetine has been associated with an increased risk of suicidal ideation in children and adolescents. We analyzed whether the observed increased risk of suicidal ideation in clinical trials translates into an increased risk of suicidal events in pediatric patients treated with atomoxetine compared with stimulants in 26 Medicaid programs. Employing a retrospective cohort design, we used propensity score-adjusted Cox proportional hazard models to evaluate the risk of suicide and suicide attempt in pediatric patients initiating treatment with atomoxetine compared with stimulants from 2002 to 2006. The first-line treatment cohort included 279 315 patients. During the first year of follow-up, the adjusted hazard ratio for current atomoxetine use compared with current stimulant use was 0.95 (95% CI 0.47-1.92, P = .88). The second-line treatment cohort included 220 215 patients. During the first year of follow-up, the adjusted hazard ratio for current atomoxetine use compared with current stimulant use was 0.71 (95% CI 0.30-1.67, P = .43). First- and second-line treatment of youths age 5 to 18 with atomoxetine compared with stimulants was not significantly associated with an increased risk of suicidal events. The low incidence of suicide and suicide attempt resulted in wide confidence intervals and did not allow stratified analysis of high-risk groups or assessment of suicidal risk associated with long-term use of atomoxetine. Copyright © 2016 by the American Academy of Pediatrics.
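    A propensity-score-adjusted Cox model of the kind described can be sketched with the lifelines and scikit-learn libraries. The synthetic columns, covariates, and event rate below are placeholders, not the Medicaid data.

    ```python
    # Sketch: propensity score for treatment assignment, then a Cox
    # proportional hazards model adjusted for that score. Data invented.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 5000
    df = pd.DataFrame({
        "atomoxetine": rng.integers(0, 2, n),   # vs stimulant
        "age": rng.integers(5, 19, n),
        "prior_depression": rng.integers(0, 2, n),
    })
    # Propensity score for receiving atomoxetine.
    ps = LogisticRegression().fit(
        df[["age", "prior_depression"]], df["atomoxetine"])
    df["pscore"] = ps.predict_proba(df[["age", "prior_depression"]])[:, 1]

    # Synthetic follow-up times and rare event indicator.
    df["time"] = rng.exponential(365, n)
    df["event"] = (rng.random(n) < 0.01).astype(int)

    cph = CoxPHFitter()
    cph.fit(df[["atomoxetine", "pscore", "time", "event"]],
            duration_col="time", event_col="event")
    cph.print_summary()  # hazard ratio = exp(coef) for atomoxetine
    ```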

  10. A new methodology for dynamic modelling of health risks arising from wastewater influenced urban flooding

    Science.gov (United States)

    Jørgensen, Claus; Mark, Ole; Djordjevic, Slobodan; Hammond, Michael; Khan, David M.; Erichsen, Anders; Dorrit Enevoldsen, Ann; Heinicke, Gerald; Helwigh, Birgitte

    2015-04-01

    flood water, based on either measured wastewater pathogen concentrations or on assumptions regarding the prevalence of infections in the population. The exposure (dosage) to pathogens was estimated by multiplying the concentration with literature values for the ingestion of water for different exposure groups (e.g., children, adults). The probability of infection was determined by applying dose-response relations and Monte Carlo simulation. The methodology is demonstrated on two cases, i.e., one case from a developing country with poor sanitation and one case from a developed country where climate adaptation is the main issue: the risk of cholera in the city of Dhaka, Bangladesh, during a flood event in 2004, and the risk of bacterial and viral infections during a flood event in Copenhagen, Denmark, in 2011. Results: The historical flood events in Dhaka (2004) and Copenhagen (2011) were successfully modelled, and the urban flood model was successfully coupled to QMRA. An example of the results of the quantitative microbial risk assessment is the average estimated risk of cholera infection for children below 5 years living in slum areas in Dhaka; similarly, the risk of infection during the flood event in Copenhagen is presented in the article. Conclusions: We have developed a methodology for the dynamic modelling of the risk of infection during wastewater-influenced urban flooding. The outcome of the modelling exercise indicates that direct contact with polluted flood water is a likely route of transmission of cholera in Dhaka, and of bacterial and viral infectious diseases in Copenhagen. It demonstrates the applicability and potential of linking urban flood models with QMRA in order to identify interventions to reduce the burden of disease on the populations of Dhaka City and Copenhagen.
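    The exposure and dose-response steps translate into a short Monte Carlo loop. The sketch below assumes an exponential dose-response model and invented concentration and ingestion distributions, purely for illustration.

    ```python
    # Minimal QMRA sketch: Monte Carlo over pathogen concentration in
    # flood water and ingested volume, pushed through an exponential
    # dose-response model P(inf) = 1 - exp(-r * dose). All parameter
    # values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    # Pathogen concentration in flood water [organisms/L], lognormal guess.
    conc = rng.lognormal(mean=np.log(100), sigma=1.0, size=n)
    # Ingested volume per exposure [L]; children assumed to swallow more.
    vol_children = rng.uniform(0.01, 0.05, size=n)

    dose = conc * vol_children
    r = 2.5e-4                      # illustrative pathogen-specific parameter
    p_inf = 1.0 - np.exp(-r * dose)

    print("mean infection risk per event:", p_inf.mean())
    print("95th percentile risk:", np.quantile(p_inf, 0.95))
    ```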

  11. Age-specific risks, severity, time course, and outcome of bleeding on long-term antiplatelet treatment after vascular events: a population-based cohort study.

    Science.gov (United States)

    Li, Linxin; Geraghty, Olivia C; Mehta, Ziyah; Rothwell, Peter M

    2017-07-29

    Lifelong antiplatelet treatment is recommended after ischaemic vascular events, on the basis of trials done mainly in patients younger than 75 years. Upper gastrointestinal bleeding is a serious complication, but had low case fatality in trials of aspirin and is not generally thought to cause long-term disability. Consequently, although co-prescription of proton-pump inhibitors (PPIs) reduces upper gastrointestinal bleeds by 70-90%, uptake is low and guidelines are conflicting. We aimed to assess the risk, time course, and outcomes of bleeding on antiplatelet treatment for secondary prevention in patients of all ages. We did a prospective population-based cohort study in patients with a first transient ischaemic attack, ischaemic stroke, or myocardial infarction treated with antiplatelet drugs (mainly aspirin based, without routine PPI use) after the event in the Oxford Vascular Study from 2002 to 2012, with follow-up until 2013. We determined type, severity, outcome (disability or death), and time course of bleeding requiring medical attention by face-to-face follow-up for 10 years. We estimated age-specific numbers needed to treat (NNT) to prevent upper gastrointestinal bleeding with routine PPI co-prescription on the basis of Kaplan-Meier risk estimates and relative risk reduction estimates from previous trials. 3166 patients (1582 [50%] aged ≥75 years) had 405 first bleeding events (n=218 gastrointestinal, n=45 intracranial, and n=142 other) during 13 509 patient-years of follow-up. Of the 314 patients (78%) with bleeds admitted to hospital, 117 (37%) were missed by administrative coding. Risk of non-major bleeding was unrelated to age, but major bleeding increased steeply with age (≥75 years hazard ratio [HR] 3·10, 95% CI 2·27-4·24; p<0·0001), particularly fatal bleeds (5·53, 2·65-11·54; p<0·0001), as did major upper gastrointestinal bleeds (≥75 years HR 4·13, 2·60-6·57; p<0·0001). At older ages, major upper gastrointestinal bleeds were mostly disabling or fatal (45 [62%] of 73 patients vs 101 [47%] of 213 patients with recurrent ischaemic stroke), and outnumbered
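    The NNT logic used in the study reduces to simple arithmetic: combine an age-specific absolute bleeding risk with a trial-derived relative risk reduction. The numbers in this sketch are illustrative assumptions, not the paper's estimates.

    ```python
    # Back-of-envelope NNT: baseline Kaplan-Meier risk times the relative
    # risk reduction gives the absolute risk reduction; NNT is its inverse.
    baseline_risk_5y = 0.05   # assumed 5-year upper-GI bleed risk at age >= 75
    rrr_ppi = 0.80            # assumed relative risk reduction from PPIs

    absolute_risk_reduction = baseline_risk_5y * rrr_ppi
    nnt = 1.0 / absolute_risk_reduction
    print(f"NNT over 5 years: {nnt:.0f}")   # ~25 with these inputs
    ```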

  12. Study of a risk-based piping inspection guideline system.

    Science.gov (United States)

    Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung

    2007-02-01

    A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts: building a risk-based inspection model for piping and constructing a risk-based piping inspection guideline model. Field visits to the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model was built in accordance with international standards and local government regulations, and the rational unified process was applied to reduce discrepancies in the development of the models. The models were designed to analyze damage factors, damage models, and potential damage positions of piping in petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections and, hence, to enable effective prediction of potential piping risks and to enhance the degree of safety that petrochemical plant operations can be expected to achieve. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risk resulted from a small number of pipelines.
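    The core of such a guideline system is a likelihood-times-consequence ranking. The toy scoring below, with invented pipelines and thresholds, shows the shape of the computation rather than the study's actual knowledge base.

    ```python
    # Toy risk ranking for piping inspection planning: each line gets a
    # likelihood and consequence score; their product drives priority.
    pipelines = {
        # name: (likelihood 1-5, consequence 1-5) -- invented scores
        "reactor feed line":   (4, 5),
        "cooling water riser": (2, 2),
        "flare header":        (3, 4),
    }

    def risk_rank(likelihood, consequence):
        score = likelihood * consequence
        if score >= 15:
            return "high"
        if score >= 8:
            return "medium"
        return "low"

    # Inspect the highest-risk lines first.
    for name, (l, c) in sorted(pipelines.items(),
                               key=lambda kv: -kv[1][0] * kv[1][1]):
        print(f"{name}: score={l * c}, rank={risk_rank(l, c)}")
    ```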

  13. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    Science.gov (United States)

    Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.

    2016-02-01

    The technological evolution in computational capacity, data acquisition systems, numerical modelling, and operational oceanography is creating opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention, and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time- and space-variable individual vessel accident risk levels and shoreline contamination risk from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking from automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area - the Portuguese continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching the shoreline and its environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels over time or, as an alternative, with a correction factor based on vessel distance from the coast. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate risk with proper sensitivity to dynamic metocean conditions and to oil transport behaviour. The integration of meteo-oceanic and oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances maritime situational awareness and decision support, providing a more realistic approach to the assessment of shoreline impacts. The risk assessment from historical data can help find typical risk patterns ("hot spots") or develop sensitivity analyses for specific conditions, whereas real
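    The risk rating can be sketched as the product of a weather-inflated spill likelihood and a distance-damped shoreline consequence (mirroring the paper's simpler alternative to running an oil spill model per vessel). Every coefficient below is an illustrative assumption.

    ```python
    # Toy dynamic per-vessel risk rating: likelihood scaled by sea state,
    # consequence damped by distance from the coast. Coefficients invented.
    import math

    def vessel_risk(base_accident_rate, wave_height_m, dist_to_coast_km,
                    shoreline_vulnerability):
        # Likelihood: historical accident rate inflated by sea state.
        likelihood = base_accident_rate * (1.0 + 0.5 * wave_height_m)
        # Consequence: vulnerability damped by distance from shore.
        consequence = shoreline_vulnerability * math.exp(-dist_to_coast_km / 50.0)
        return likelihood * consequence

    print(vessel_risk(1e-5, wave_height_m=4.0, dist_to_coast_km=10.0,
                      shoreline_vulnerability=0.9))
    ```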

  14. Risk of Incident Coronary Heart Disease Events in Men Compared to Women by Menopause Type and Race

    Science.gov (United States)

    Kim, Catherine; Cushman, Mary; Khodneva, Yulia; Lisabeth, Lynda D; Judd, Suzanne; Kleindorfer, Dawn O; Howard, Virginia J; Safford, Monika M

    2015-01-01

    Background We examined whether type of menopause affects sex differences in coronary heart disease (CHD) events and whether the impact is similar in blacks and whites. Methods and Results Participants were enrolled in the Reasons for Geographic and Racial Differences in Stroke (REGARDS) cohort between 2003 and 2007 without CHD at baseline (n=23 086). Cox regression models were used to calculate the hazard of incident nonfatal CHD (definite or probable myocardial infarction) and acute CHD death, adjusting for age, age at last menstrual period, and CHD risk factors. White women in natural menopause (hazard ratio [HR], 0.45; 95% confidence interval [CI], 0.31, 0.66) and surgical menopause (HR, 0.65; 95% CI, 0.42, 0.99) had a reduced hazard of nonfatal events, compared to white men. Black women in natural menopause (HR, 0.69; 95% CI, 0.47, 1.03), but not surgical menopause (HR, 0.81; 95% CI, 0.51, 1.29), had a marginally reduced hazard of nonfatal events, compared to black men. Women had a lower risk of acute CHD death than men regardless of their menopause type and race. Conclusions Sex differences in the risk of incident CHD events were larger among whites than blacks and varied by type of menopause. Women consistently had a lower risk of incident CHD death than men, but the magnitude of sex differences was greater in whites than blacks for nonfatal events, regardless of menopause type. PMID:26133958

  15. Integrated analyzing method for the progress event based on subjects and predicates in events

    International Nuclear Information System (INIS)

    Minowa, Hirotsugu; Munesawa, Yoshiomi

    2014-01-01

    It is expected that knowledge extracted by analyzing past mistakes can be used to prevent the recurrence of accidents. Currently, the main analytic style is for experts to examine accident cases in depth, but cross-analysis has so far gone no further than extracting the factors common to the accident cases. In this study, we propose an integrated analyzing method for progress events that analyzes across accidents. Our method integrates many accident cases by connecting the common keywords, called 'Subjects' and 'Predicates', that are extracted from each progress event in accident or near-miss cases. From the integrated accident data, our method can analyze and visualize partial risks, the frequency of accident-causing factors, and the corresponding risk assessments. Applying our method to the PEC-SAFER accident cases identified 8 hazardous factors that can again arise from tanks, and visualized the most frequent factors (damage of tank, 26%; corrosion, 21%) and the highest risks (damage, 3.3 × 10⁻² [risk rank/year]; destruction, 2.5 × 10⁻² [risk rank/year]). (author)
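    A minimal sketch of the integration idea, with invented (case, subject, predicate) triples: events from different cases are linked through shared keywords, after which factor frequencies fall out of simple counting.

    ```python
    # Integrate progress events across accident cases by linking events
    # that share a "subject" keyword, then count predicate (factor)
    # frequencies. Event data are invented placeholders.
    from collections import Counter, defaultdict

    # (case id, subject, predicate) triples extracted from progress events.
    events = [
        ("case1", "tank", "corrosion"),
        ("case1", "tank", "damage"),
        ("case2", "tank", "damage"),
        ("case2", "pipe", "leak"),
        ("case3", "tank", "corrosion"),
    ]

    # Integration step: group cases through shared subjects.
    by_subject = defaultdict(set)
    for case, subj, pred in events:
        by_subject[subj].add(case)

    # Frequency of predicates (hazardous factors) for a given subject.
    tank_factors = Counter(pred for _, subj, pred in events if subj == "tank")
    print("cases linked via 'tank':", by_subject["tank"])
    print("factor frequencies:", tank_factors)  # e.g. corrosion vs damage
    ```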

  16. Risk of All-cause Mortality Associated with Non-fatal AIDS and Serious Non-AIDS Events among Adults Infected with HIV

    Science.gov (United States)

    NEUHAUS, Jacqueline; ANGUS, Brian; KOWALSKA, Justyna D.; LA ROSA, Alberto; SAMPSON, Jim; WENTWORTH, Deborah; MOCROFT, Amanda

    2010-01-01

    Objectives Among patients with HIV, the risk of death associated with different AIDS events has been quantified, but the risk of death associated with non-AIDS events has not been examined. We compared the risk of all-cause mortality following AIDS versus serious non-AIDS (SNA) events in SMART and ESPRIT. Design Data from 9,583 HIV-infected participants, 5,472 with CD4+ >350 cells/mm3 enrolled in SMART and 4,111 with CD4+ ≥300 cells/mm3 enrolled in ESPRIT, were analyzed. Methods Cumulative mortality 6 months after AIDS and SNA events (cardiovascular, renal, and hepatic disease and malignancies) was estimated using the Kaplan-Meier method. Cox models were used to estimate hazard ratios (HRs) associated with AIDS and SNA events on the risk of death overall and by treatment group within each study. Results AIDS and SNA events occurred in 286 and 435 participants, with 47 (16%) and 115 (26%) subsequent deaths, respectively. Six-month cumulative mortality was 4.7% (95% CI: 2.8-8.0) after experiencing an AIDS event and 13.4% (95% CI: 10.5-17.0) after experiencing an SNA event. The adjusted HR for all-cause mortality for those who experienced AIDS versus those who did not was 4.9 (95% CI: 3.6-6.8). The corresponding HR for SNA events was 11.4 (95% CI: 9.0-14.5) (p<0.001), with similar results in SMART and ESPRIT. Conclusions Among HIV-infected persons with higher CD4+ counts, SNA events occur more frequently and are associated with a greater risk of death than AIDS events. Future research should be aimed at comparing strategies to reduce the morbidity and mortality associated with SNA events for HIV-infected persons. PMID:20177360

  17. The Generation of a Stochastic Flood Event Catalogue for Continental USA

    Science.gov (United States)

    Quinn, N.; Wing, O.; Smith, A.; Sampson, C. C.; Neal, J. C.; Bates, P. D.

    2017-12-01

    Recent advances in the acquisition of spatiotemporal environmental data and improvements in computational capabilities have enabled the generation of large-scale, even global, flood hazard layers, which serve as a critical decision-making tool for a range of end users. However, these datasets are designed to indicate only the probability and depth of inundation at a given location and are unable to describe the likelihood of concurrent flooding across multiple sites. Recent research has highlighted that although the estimation of large, widespread flood events is of great value to the flood mitigation and insurance industries, to date it has been difficult to deal with this spatial dependence structure in flood risk over relatively large scales. Many existing approaches have been restricted to empirical estimates of risk based on historic events, limiting their capability to assess risk over the full range of plausible scenarios. This research therefore utilises a recently developed model-based approach to describe the multisite joint distribution of extreme river flows across continental USA river gauges. Given an extreme event at a site, the model characterises the likelihood that neighbouring sites are also impacted. This information is used to simulate an ensemble of plausible synthetic extreme event footprints, from which flood depths are extracted from an existing global flood hazard catalogue. Expected economic losses are then estimated by overlaying flood depths with national datasets defining asset locations, characteristics, and depth-damage functions. The ability of this approach to quantify probabilistic economic risk and rare threshold-exceeding events is expected to be of value to those in the flood mitigation and insurance sectors. This work describes the methodological steps taken to create the flood loss catalogue over a national scale and highlights the uncertainty in the expected annual economic vulnerability within the USA from extreme river flows
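    The spatial-dependence step can be sketched with a Gaussian copula, one plausible stand-in for the model-based approach the abstract cites: draw correlated uniforms, map them through per-gauge extreme-value margins, and push the resulting footprints through a toy damage curve. The correlations, margins, and damage curve below are all assumptions.

    ```python
    # Correlated multisite extremes via a Gaussian copula, converted to
    # losses with a toy depth-damage-style curve. All parameters invented.
    import numpy as np
    from scipy.stats import norm, genextreme

    rng = np.random.default_rng(4)
    n_events, n_sites = 10_000, 3

    # Dependence between gauges (Gaussian copula correlation).
    corr = np.array([[1.0, 0.7, 0.4],
                     [0.7, 1.0, 0.5],
                     [0.4, 0.5, 1.0]])
    z = rng.multivariate_normal(np.zeros(n_sites), corr, size=n_events)
    u = norm.cdf(z)                            # correlated uniform margins

    # GEV margins per gauge -> synthetic flow footprints.
    flows = genextreme.ppf(u, c=-0.1, loc=100, scale=30)

    # Toy damage proxy: loss grows with exceedance above a threshold.
    losses = np.clip(flows - 150, 0, None).sum(axis=1)
    print("expected annual loss (toy units):", losses.mean())
    print("1-in-1000 event loss:", np.quantile(losses, 0.999))
    ```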

  18. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

    Full Text Available Abnormal event detection is one of the vital tasks in wireless sensor networks. However, node faults and poor deployment environments have brought great challenges to abnormal event detection. In a typical event detection technique, spatiotemporal correlations are collected to detect an event, which is susceptible to noise and errors. To improve the quality of detection results, we propose a novel approach for abnormal event detection in wireless sensor networks. This approach considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of observed attributes is constructed based on a Bayesian network. In this model, the dependency structure of the observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new concept, named attribute correlation confidence, to evaluate the fitting degree between a sensor reading and an abnormal event pattern. On the basis of time correlation detection and space correlation detection, the abnormal events are identified. Experimental results show that the proposed algorithm can effectively reduce the impact of interference factors and the false alarm rate; it can also improve the accuracy of event detection.
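    Parameter learning and the attribute correlation confidence can be sketched for a single two-node edge of the dependency model; the data and the two-attribute structure below are invented stand-ins for a learned Bayesian network.

    ```python
    # Learn a conditional probability table P(humidity | temperature) by
    # counting, then score how well a new reading fits the learned
    # dependency ("attribute correlation confidence"). Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(5)
    # Discretised sensor attributes: temperature level -> humidity level.
    temp = rng.integers(0, 3, 1000)
    humid = (temp + (rng.random(1000) < 0.2)).clip(0, 2).astype(int)

    # Parameter learning: CPT by counting co-occurrences.
    cpt = np.zeros((3, 3))
    for t, h in zip(temp, humid):
        cpt[t, h] += 1
    cpt = cpt / cpt.sum(axis=1, keepdims=True)

    def correlation_confidence(t, h):
        """Fit between a reading and the learned dependency model."""
        return cpt[t, h]

    print(correlation_confidence(0, 0))  # typical reading: high confidence
    print(correlation_confidence(0, 2))  # unusual combination: low -> event?
    ```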

  19. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size in order to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting the performance of an event-based algorithm that are not captured by our model, it nevertheless provides insight into factors that may be limiting in a real implementation.
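    The constant-execution-time case can be explored with a toy simulation: particles in a bank each require a random number of events, and vector utilization decays once the number of live particles drops below the vector width. The geometric lifetime and all parameters are illustrative assumptions, not the OpenMC measurements.

    ```python
    # Toy model of event-based vector efficiency: each event-iteration
    # fills up to `vector_width` lanes with particles that still have
    # work left; efficiency is busy lanes over total lane-slots.
    import numpy as np

    rng = np.random.default_rng(6)

    def vector_efficiency(bank_size, vector_width, mean_events=20.0):
        remaining = rng.geometric(1.0 / mean_events, size=bank_size)
        busy_lanes = total_slots = 0
        while (alive := int((remaining > 0).sum())) > 0:
            lanes = min(alive, vector_width)
            busy_lanes += lanes
            total_slots += vector_width
            # One event for up to `lanes` particles with work left.
            idx = np.flatnonzero(remaining > 0)[:lanes]
            remaining[idx] -= 1
        return busy_lanes / total_slots

    # Efficiency improves as the bank grows relative to the vector width.
    for bank in (64, 256, 1280):
        print(bank, round(vector_efficiency(bank, vector_width=64), 3))
    ```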

  20. Preterm Versus Term Children: Analysis of Sedation/Anesthesia Adverse Events and Longitudinal Risk.

    Science.gov (United States)

    Havidich, Jeana E; Beach, Michael; Dierdorf, Stephen F; Onega, Tracy; Suresh, Gautham; Cravero, Joseph P

    2016-03-01

    Preterm and former preterm children frequently require sedation/anesthesia for diagnostic and therapeutic procedures. Our objective was to determine the age at which children who are born preterm are no longer at increased risk for sedation/anesthesia adverse events. Our secondary objective was to describe the nature and incidence of adverse events. This is a prospective observational study of children receiving sedation/anesthesia for diagnostic and/or therapeutic procedures outside of the operating room by the Pediatric Sedation Research Consortium. A total of 57,227 patients 0 to 22 years of age were eligible for this study. All adverse events and descriptive terms were predefined. Logistic regression and locally weighted scatterplot smoothing (LOWESS) regression were used for analysis. Preterm and former preterm children had higher adverse event rates (14.7% vs 8.5%) compared with children born at term. Our analysis revealed a biphasic pattern for the development of adverse sedation/anesthesia events. Airway and respiratory adverse events were most commonly reported. MRI scans were the most commonly performed procedures in both categories of patients. Patients born preterm are nearly twice as likely to develop sedation/anesthesia adverse events, and this risk continues up to 23 years of age. We recommend obtaining a birth history during the formulation of an anesthetic/sedation plan, with heightened awareness that preterm and former preterm children may be at increased risk. Further prospective studies focusing on the etiology and prevention of adverse events in former preterm patients are warranted. Copyright © 2016 by the American Academy of Pediatrics.
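    The type of analysis described (logistic regression for the preterm effect plus a LOWESS smooth for the nonlinear age pattern) can be sketched with statsmodels on synthetic data; the coefficients below are invented, not the consortium's results.

    ```python
    # Logistic regression of adverse-event occurrence on birth status and
    # age, plus a LOWESS curve for the age trend. Data are synthetic.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    n = 20_000
    age = rng.uniform(0, 22, n)
    preterm = rng.integers(0, 2, n)

    # Synthetic risk: preterm children given roughly double the odds.
    logit = -2.5 + 0.7 * preterm - 0.02 * age
    event = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = sm.add_constant(np.column_stack([preterm, age]))
    fit = sm.Logit(event, X).fit(disp=0)
    print(fit.params)             # log-odds; exp() gives odds ratios
    print(np.exp(fit.params[1]))  # odds ratio for preterm birth

    # Nonparametric age trend in the event rate (for the biphasic shape).
    smooth = sm.nonparametric.lowess(event.astype(float), age, frac=0.3)
    print(smooth[:5])  # (age, smoothed rate) pairs
    ```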