WorldWideScience

Sample records for risk models based

  1. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST-funded research project CORAS, where Institutt for energiteknikk takes part as responsible for the work package for Risk Analysis. The main objective of the CORAS project is to develop a framework to support risk assessment of security critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to perform a risk assessment, one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions, such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identify and assess security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  2. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    To meet commercial banks' objectives of liquidity, safety and profitability, loan portfolio optimization decisions based on risk analysis support the rational allocation of assets. Risk analysis and asset allocation are key technologies of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. Constraining the loan portfolio rate of return with Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) reflects the bank's risk tolerance and gives the bank direct control over potential losses. The paper analyzes a general risk management model applied to portfolio problems with VaR and CVaR risk measures using the Lagrangian algorithm, and solves the resulting problem through matrix operations. The analysis also shows that the set of optimal portfolios under the VaR and CVaR risk model traces a hyperbola in mean-standard deviation space, and that the proposed method is easy to compute.
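
    The abstract does not reproduce the underlying optimization problem. Below is a minimal sketch of the standard Rockafellar-Uryasev scenario formulation of CVaR minimization that this kind of model refers to, assuming simulated loan-return scenarios, an arbitrary return target and a 95% confidence level; none of these values or names come from the paper.

```python
# Minimal sketch of CVaR minimization for a loan portfolio via the Rockafellar-Uryasev
# linear program (illustrative data, not the paper's model or calibration).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_loans, n_scen, beta = 4, 500, 0.95                    # assumed sizes and confidence level
returns = rng.normal(0.06, 0.04, (n_scen, n_loans))     # simulated loan returns per scenario
target = 0.05                                           # assumed minimum expected return

# Decision vector x = [w_1..w_n, alpha, u_1..u_m]; minimize alpha + (1/((1-beta)m)) * sum(u)
c = np.concatenate([np.zeros(n_loans), [1.0], np.full(n_scen, 1.0 / ((1 - beta) * n_scen))])

# u_j >= loss_j - alpha with loss_j = -returns_j . w   ->   -r_j.w - alpha - u_j <= 0
A_ub = np.hstack([-returns, -np.ones((n_scen, 1)), -np.eye(n_scen)])
b_ub = np.zeros(n_scen)
# expected-return constraint: -mean(returns).w <= -target
A_ub = np.vstack([A_ub, np.concatenate([-returns.mean(axis=0), [0.0], np.zeros(n_scen)])])
b_ub = np.append(b_ub, -target)

A_eq = np.concatenate([np.ones(n_loans), [0.0], np.zeros(n_scen)]).reshape(1, -1)  # sum(w) = 1
b_eq = [1.0]
bounds = [(0, None)] * n_loans + [(None, None)] + [(0, None)] * n_scen

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
w, var_level, cvar = res.x[:n_loans], res.x[n_loans], res.fun   # alpha ~ VaR at the optimum
print("weights:", w.round(3), "VaR:", round(var_level, 4), "CVaR:", round(cvar, 4))
```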

  3. A Knowledge-Based Model of Audit Risk

    OpenAIRE

    Dhar, Vasant; Lewis, Barry; Peters, James

    1988-01-01

    Within the academic and professional auditing communities, there has been growing concern about how to accurately assess the various risks associated with performing an audit. These risks are difficult to conceptualize in terms of numeric estimates. This article discusses the development of a prototype computational model (computer program) that assesses one of the major audit risks -- inherent risk. This program bases most of its inferencing activities on a qualitative model of a typical bus...

  4. Risk based modelling

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Baker, A.E.

    1993-01-01

    Risk based analysis is a tool becoming available to both engineers and managers to aid decision making concerning plant matters such as In-Service Inspection (ISI). In order to develop a risk based method, some form of Structural Reliability Risk Assessment (SRRA) needs to be performed to provide a probability of failure ranking for all sites around the plant. A Probabilistic Risk Assessment (PRA) can then be carried out to combine these possible events with the capability of plant safety systems and procedures, to establish the consequences of failure for the sites. In this way the probabilities of failure are converted into a risk based ranking which can be used to assist the process of deciding which sites should be included in an ISI programme. This paper reviews the technique and typical results of a risk based ranking assessment carried out for nuclear power plant pipework. (author)

  5. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial driven approach within the nuclear field. The tool supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  6. Risk Based Milk Pricing Model at Dairy Farmers Level

    Directory of Open Access Journals (Sweden)

    W. Septiani

    2017-12-01

    The milk price paid by a cooperative institution to farmers does not fully cover the production cost, even though dairy farmers face various risks and uncertainties in conducting their business. The highest risk in the milk supply lies in the activities at the farm. This study was designed to formulate a model for calculating the milk price at the farmer's level based on risk. Risks that occur on farms include the risks of cow breeding, sanitation, health care, cattle feed management, milking and milk sales. The research was conducted on farms in the West Java region. There were five main stages in the preparation of the model: (1) identification and analysis of influential factors, (2) development of a conceptual model, (3) structural analysis and determination of production costs, (4) calculation of production costs with risk factors, and (5) formulation of the risk based milk pricing model. The research established a relationship between risks on smallholder dairy farms and the production costs to be incurred by the farmers, and derived a risk adjustment factor for the variable costs of production in dairy cattle farming. The difference between production costs with risk and total production costs without risk was about 8% to 10%. It could be concluded that the basic price of milk proposed based on the research was around IDR 4,250-IDR 4,350/L for ownership of 3 to 4 cows. Including this risk value in the calculation of production costs is expected to increase farmer income.

  7. LIFETIME LUNG CANCER RISKS ASSOCIATED WITH INDOOR RADON EXPOSURE BASED ON VARIOUS RADON RISK MODELS FOR CANADIAN POPULATION.

    Science.gov (United States)

    Chen, Jing

    2017-04-01

    This study calculates and compares the lifetime lung cancer risks associated with indoor radon exposure based on well-known risk models in the literature; two risk models are from joint studies among miners and the other three models were developed from pooling studies on residential radon exposure from China, Europe and North America respectively. The aim of this article is to make clear that the various models are mathematical descriptions of epidemiologically observed real risks in different environmental settings. The risk from exposure to indoor radon is real and it is normal that variations could exist among different risk models even when they were applied to the same dataset. The results show that lifetime risk estimates vary significantly between the various risk models considered here: the model based on the European residential data provides the lowest risk estimates, while models based on the European miners and Chinese residential pooling with complete dosimetry give the highest values. The lifetime risk estimates based on the EPA/BEIR-VI model lie within this range and agree reasonably well with the averages of risk estimates from the five risk models considered in this study. © Crown copyright 2016.
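
    The abstract does not spell out how a lifetime risk is computed from a given risk model; in generic life-table form (an assumption here, since details such as latency periods and exposure-age windows differ between the models compared), the lifetime excess lung cancer risk from an exposure history $w$ is

        \mathrm{LAR}(w) \approx \sum_{a} \mathrm{ERR}(a, w)\, m_0(a)\, S(a),

    where $\mathrm{ERR}(a,w)$ is the model-specific excess relative risk at age $a$, $m_0(a)$ the baseline age-specific lung cancer mortality rate and $S(a)$ the probability of surviving to age $a$; the choice of $\mathrm{ERR}$ is what distinguishes the five models compared in the study.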

  8. Bayesian risk-based decision method for model validation under uncertainty

    International Nuclear Information System (INIS)

    Jiang Xiaomo; Mahadevan, Sankaran

    2007-01-01

    This paper develops a decision-making methodology for computational model validation, considering the risk of using the current model, data support for the current model, and cost of acquiring new information to improve the model. A Bayesian decision theory-based method is developed for this purpose, using a likelihood ratio as the validation metric for model assessment. An expected risk or cost function is defined as a function of the decision costs, and the likelihood and prior of each hypothesis. The risk is minimized through correctly assigning experimental data to two decision regions based on the comparison of the likelihood ratio with a decision threshold. A Bayesian validation metric is derived based on the risk minimization criterion. Two types of validation tests are considered: pass/fail tests and system response value measurement tests. The methodology is illustrated for the validation of reliability prediction models in a tension bar and an engine blade subjected to high cycle fatigue. The proposed method can effectively integrate optimal experimental design into model validation to simultaneously reduce the cost and improve the accuracy of reliability model assessment
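
    A minimal sketch of the risk-minimizing decision rule the abstract describes, in standard Bayes hypothesis-testing notation (the notation is assumed, not taken from the paper): with hypotheses $H_0$ (model acceptable) and $H_1$ (model not acceptable), priors $\pi_0, \pi_1$ and decision costs $C_{ij}$ for deciding $H_i$ when $H_j$ is true, the expected risk

        R = \sum_{i,j} C_{ij}\, P(\text{decide } H_i \mid H_j)\, \pi_j

    is minimized by comparing the likelihood ratio with a threshold,

        \Lambda(D) = \frac{p(D \mid H_1)}{p(D \mid H_0)} \;\gtrless\; \frac{(C_{10}-C_{00})\,\pi_0}{(C_{01}-C_{11})\,\pi_1},

    so experimental data $D$ are assigned to the acceptance or rejection region according to which side of the threshold the likelihood ratio falls on.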

  9. Coronary risk assessment by point-based vs. equation-based Framingham models: significant implications for clinical care.

    Science.gov (United States)

    Gordon, William J; Polansky, Jesse M; Boscardin, W John; Fung, Kathy Z; Steinman, Michael A

    2010-11-01

    US cholesterol guidelines use original and simplified versions of the Framingham model to estimate future coronary risk and thereby classify patients into risk groups with different treatment strategies. We sought to compare risk estimates and risk group classification generated by the original, complex Framingham model and the simplified, point-based version. We assessed 2,543 subjects age 20-79 from the 2001-2006 National Health and Nutrition Examination Surveys (NHANES) for whom Adult Treatment Panel III (ATP-III) guidelines recommend formal risk stratification. For each subject, we calculated the 10-year risk of major coronary events using the original and point-based Framingham models, and then compared differences in these risk estimates and whether these differences would place subjects into different ATP-III risk groups (<10%, 10-20%, or >20% risk). Using standard procedures, all analyses were adjusted for survey weights, clustering, and stratification to make our results nationally representative. Among 39 million eligible adults, the original Framingham model categorized 71% of subjects as having "moderate" (<10%) risk, with the remainder falling into the "moderately high" (10-20%) and "high" (>20%) risk groups. Estimates of coronary risk by the original and point-based models often differed substantially. The point-based system classified 15% of adults (5.7 million) into different risk groups than the original model, with 10% (3.9 million) misclassified into higher risk groups and 5% (1.8 million) into lower risk groups, for a net impact of classifying 2.1 million adults into higher risk groups. These risk group misclassifications would impact guideline-recommended drug treatment strategies for 25-46% of affected subjects. Patterns of misclassifications varied significantly by gender, age, and underlying CHD risk. Compared to the original Framingham model, the point-based version misclassifies millions of Americans into risk groups for which guidelines recommend different treatment strategies.
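
    For context, the equation-based Framingham model referred to above has the general Cox proportional-hazards form (the sex-specific coefficients and baseline survival are published elsewhere and not reproduced here):

        \hat{p}_{10} = 1 - S_0(10)^{\exp\left(\sum_k \beta_k x_k - \sum_k \beta_k \bar{x}_k\right)},

    where $\hat{p}_{10}$ is the 10-year risk of a major coronary event, $S_0(10)$ the baseline 10-year survival, $x_k$ the risk factors (age, total and HDL cholesterol, blood pressure, smoking, diabetes) and $\bar{x}_k$ their population means; the point-based version approximates the exponent by assigning integer points to categorized risk-factor levels, which is the source of the discrepancies reported in the study.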

  10. Canadian population risk of radon induced lung cancer variation range assessment based on various radon risk models

    International Nuclear Information System (INIS)

    Chen, Jing

    2017-01-01

    To address public concerns regarding radon risk and variations in risk estimates based on various risk models available in the literature, lifetime lung cancer risks were calculated with five well-known risk models using more recent Canadian vital statistics (5-year averages from 2008 to 2012). Variations in population risk estimation among various models were assessed. The results showed that the Canadian population risk of radon induced lung cancer can vary from 5.0 to 17% for men and 5.1 to 18% for women based on different radon risk models. Averaged over the estimates from various risk models with better radon dosimetry, 13% of lung cancer deaths among Canadian males and 14% of lung cancer deaths among Canadian females were attributable to long-term indoor radon exposure. (authors)

  11. Stochastic Watershed Models for Risk Based Decision Making

    Science.gov (United States)

    Vogel, R. M.

    2017-12-01

    Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology providing stochastic streamflow models (SSMs), which could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed `stochastic watershed models' (SWMs) useful as input to nearly all modern risk based water resource decision making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors, to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation
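
    A minimal sketch of the SWM idea described above: a deterministic watershed model driven by stochastic meteorology, stochastic parameters and model error to generate streamflow ensembles. The single-linear-reservoir model and all parameter values are illustrative assumptions, not taken from the commentary.

```python
# Stochastic watershed model (SWM) sketch: deterministic rainfall-runoff model run under
# stochastic forcing, parameters and multiplicative error to produce an ensemble of traces.
import numpy as np

rng = np.random.default_rng(1)

def watershed_model(precip, k=0.3, storage0=10.0):
    """Toy deterministic watershed model: single linear reservoir, daily time step."""
    storage, flows = storage0, []
    for p in precip:
        storage += p
        q = k * storage            # outflow proportional to storage
        storage -= q
        flows.append(q)
    return np.array(flows)

n_days, n_traces = 365, 100
ensemble = []
for _ in range(n_traces):
    precip = rng.gamma(shape=0.4, scale=8.0, size=n_days)      # stochastic meteorology
    k = max(rng.normal(0.3, 0.05), 0.05)                       # stochastic model parameter
    error = rng.lognormal(mean=0.0, sigma=0.15, size=n_days)   # multiplicative model error
    ensemble.append(watershed_model(precip, k=k) * error)

ensemble = np.array(ensemble)        # n_traces x n_days synthetic streamflow traces
annual = ensemble.sum(axis=1)
print("median annual flow:", np.median(annual).round(1))
print("5th-95th percentile of annual flow:", np.percentile(annual, [5, 95]).round(1))
```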

  12. Model based climate information on drought risk in Africa

    Science.gov (United States)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is the setup of a drought impact modeling platform called Africa RiskView that aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based on the Water Requirement Satisfaction Index (WRSI) as the fundamental indicator of agricultural performance, and uses historical records of food assistance operations to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa RiskView it is possible, in principle, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step to this challenging task is the exploration of the sources of uncertainty affecting the assessment based on modeled climate change scenarios. For this purpose, a limited set of climate models has been selected in order to verify the relevance of using climate model output data with Africa RiskView and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and has relied on model output available at the time. In particular, only one regional downscaling was available for the entire African continent from the ENSEMBLES project. The analysis shows that current coarse-resolution global climate models cannot directly feed into the Africa RiskView risk-analysis tool. However, regional downscaling may help correct the inherent biases observed in the datasets. Further analysis is performed by using the first data available under the CORDEX framework. In particular, we consider a set of simulations driven by boundary conditions from the ERA-Interim reanalysis to evaluate the skill drought

  13. The use of biologically based cancer risk models in radiation epidemiology

    International Nuclear Information System (INIS)

    Krewski, D.; Zielinski, J.M.; Hazelton, W.D.; Garner, M.J.; Moolgavkar, S.H.

    2003-01-01

    Biologically based risk projection models for radiation carcinogenesis seek to describe the fundamental biological processes involved in neoplastic transformation of somatic cells into malignant cancer cells. A validated biologically based model, whose parameters have a direct biological interpretation, can also be used to extrapolate cancer risks to different exposure conditions with some confidence. In this article, biologically based models for radiation carcinogenesis, including the two-stage clonal expansion (TSCE) model and its extensions, are reviewed. The biological and mathematical bases for such models are described, and the implications of key model parameters for cancer risk assessment examined. Specific applications of versions of the TSCE model to important epidemiologic datasets are discussed, including the Colorado uranium miners' cohort; a cohort of Chinese tin miners; the lifespan cohort of atomic bomb survivors in Hiroshima and Nagasaki; and a cohort of over 200,000 workers included in the National Dose Registry (NDR) of Canada. (author)
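
    As background, a commonly quoted approximation to the hazard of the two-stage clonal expansion model (valid while the tumor probability is small and with time-constant parameters; the exact solution used in the applications above is more involved) is

        h(t) \approx \frac{\nu\,\mu}{\gamma}\left(e^{\gamma t} - 1\right), \qquad \gamma = \alpha - \beta,

    where $\nu$ is the initiation rate of intermediate cells from the normal cell pool, $\mu$ the malignant conversion rate, and $\alpha$, $\beta$ the division and death/differentiation rates of intermediate cells; radiation exposure enters by making one or more of these parameters dose dependent.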

  14. Physics-based Entry, Descent and Landing Risk Model

    Science.gov (United States)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on risk of failure were analyzed. A comparison of the direct computation and response surface approach was undertaken.
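
    The Monte Carlo risk logic described above can be sketched as follows. The thermal response surrogate, the distributions and the temperature limit below are hypothetical placeholders chosen only to illustrate the sampling-and-counting structure; they are not the NASA tools or data.

```python
# Monte Carlo sketch: sample uncertain entry state, heating and material properties,
# evaluate a placeholder bondline-temperature surrogate, count limit exceedances.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

heat_load = rng.normal(25.0, 2.5, N)         # integrated heat load, kJ/cm^2 (assumed)
conductivity = rng.normal(1.0, 0.08, N)      # TPS conductivity multiplier (assumed)
thickness = rng.normal(5.0, 0.15, N)         # TPS thickness, cm (assumed)
mmod_damage = rng.random(N) < 0.01           # 1% chance of an MMOD cavity (assumed)
effective_thickness = np.where(mmod_damage, thickness - 1.0, thickness)

def bondline_temperature(q, k_mult, thick):
    """Placeholder surrogate standing in for the 1-D thermal response tool."""
    return 150.0 + 12.0 * q * k_mult / np.maximum(thick, 0.5)

T_bond = bondline_temperature(heat_load, conductivity, effective_thickness)
T_limit = 260.0                               # assumed bondline temperature limit, deg C
p_fail = np.mean(T_bond > T_limit)
print(f"Estimated probability of bondline over-temperature: {p_fail:.2e}")
```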

  15. An Agent-Based Model of Evolving Community Flood Risk.

    Science.gov (United States)

    Tonn, Gina L; Guikema, Seth D

    2017-11-17

    Although individual behavior plays a major role in community flood risk, traditional flood risk models generally do not capture information on how community policies and individual decisions impact the evolution of flood risk over time. The purpose of this study is to improve the understanding of the temporal aspects of flood risk through a combined analysis of the behavioral, engineering, and physical hazard aspects of flood risk. Additionally, the study aims to develop a new modeling approach for integrating behavior, policy, flood hazards, and engineering interventions. An agent-based model (ABM) is used to analyze the influence of flood protection measures, individual behavior, and the occurrence of floods and near-miss flood events on community flood risk. The ABM focuses on the following decisions and behaviors: dissemination of flood management information, installation of community flood protection, elevation of household mechanical equipment, and elevation of homes. The approach is place based, with a case study area in Fargo, North Dakota, but is focused on generalizable insights. Generally, community mitigation results in reduced future damage, and individual action, including mitigation and movement into and out of high-risk areas, can have a significant influence on community flood risk. The results of this study provide useful insights into the interplay between individual and community actions and how it affects the evolution of flood risk. This study lends insight into priorities for future work, including the development of more in-depth behavioral and decision rules at the individual and community level. © 2017 Society for Risk Analysis.

  16. Measuring the coupled risks: A copula-based CVaR model

    Science.gov (United States)

    He, Xubiao; Gong, Pu

    2009-01-01

    Integrated risk management for financial institutions requires an approach for aggregating risk types (such as market and credit) whose distributional shapes vary considerably. Financial institutions often ignore the coupling influence of risks and thus underestimate the overall financial risk. We constructed a copula-based Conditional Value-at-Risk (CVaR) model for market and credit risks. This technique allows us to incorporate realistic marginal distributions that capture essential empirical features of these risks, such as skewness and fat tails, while allowing for a rich dependence structure. Finally, the numerical simulation method is used to implement the model. Our results indicate that the coupled risks for a listed company's stock may be undervalued if credit risk is ignored, especially for listed companies with poor credit quality.
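
    A minimal sketch of the aggregation idea: couple market and credit losses through a Gaussian copula, give each a skewed or fat-tailed marginal, and compare VaR/CVaR of the total loss with and without the dependence. The marginals, correlation and scales are illustrative assumptions, not the paper's calibration.

```python
# Copula-based CVaR sketch: Gaussian copula over Student-t (market) and lognormal (credit)
# loss marginals, then empirical VaR/CVaR of the aggregated loss.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n, rho = 200_000, 0.4                                  # assumed sample size and dependence
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = stats.norm.cdf(z)                                  # Gaussian copula: dependent uniforms

market_loss = stats.t.ppf(u[:, 0], df=4)               # fat-tailed market loss (assumed)
credit_loss = stats.lognorm.ppf(u[:, 1], s=0.8, scale=0.5)   # skewed credit loss (assumed)
total_loss = market_loss + credit_loss

def var_cvar(losses, alpha=0.99):
    var = np.quantile(losses, alpha)
    return var, losses[losses >= var].mean()

ind_z = rng.standard_normal((n, 2))                    # benchmark: independent risks
ind_loss = (stats.t.ppf(stats.norm.cdf(ind_z[:, 0]), df=4)
            + stats.lognorm.ppf(stats.norm.cdf(ind_z[:, 1]), s=0.8, scale=0.5))

print("VaR/CVaR(99%) ignoring coupling  :", [round(v, 2) for v in var_cvar(ind_loss)])
print("VaR/CVaR(99%) with Gaussian copula:", [round(v, 2) for v in var_cvar(total_loss)])
```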

  17. Formal safety assessment based on relative risks model in ship navigation

    Energy Technology Data Exchange (ETDEWEB)

    Hu Shenping [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: sphu@mmc.shmtu.edu.cn; Fang Quangen [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: qgfang@mmc.shmtu.edu.cn; Xia Haibo [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: hbxia@mmc.shmtu.edu.cn; Xi Yongtao [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: xiyt@mmc.shmtu.edu.cn

    2007-03-15

    Formal safety assessment (FSA) is a structured and systematic methodology aiming at enhancing maritime safety. It has been gradually and broadly used in the shipping industry nowadays around the world. On the basis of analysis and conclusion of FSA approach, this paper discusses quantitative risk assessment and generic risk model in FSA, especially frequency and severity criteria in ship navigation. Then it puts forward a new model based on relative risk assessment (MRRA). The model presents a risk-assessment approach based on fuzzy functions and takes five factors into account, including detailed information about accident characteristics. It has already been used for the assessment of pilotage safety in Shanghai harbor, China. Consequently, it can be proved that MRRA is a useful method to solve the problems in the risk assessment of ship navigation safety in practice.

  18. Formal safety assessment based on relative risks model in ship navigation

    International Nuclear Information System (INIS)

    Hu Shenping; Fang Quangen; Xia Haibo; Xi Yongtao

    2007-01-01

    Formal safety assessment (FSA) is a structured and systematic methodology aiming at enhancing maritime safety. It has been gradually and broadly used in the shipping industry nowadays around the world. On the basis of analysis and conclusion of FSA approach, this paper discusses quantitative risk assessment and generic risk model in FSA, especially frequency and severity criteria in ship navigation. Then it puts forward a new model based on relative risk assessment (MRRA). The model presents a risk-assessment approach based on fuzzy functions and takes five factors into account, including detailed information about accident characteristics. It has already been used for the assessment of pilotage safety in Shanghai harbor, China. Consequently, it can be proved that MRRA is a useful method to solve the problems in the risk assessment of ship navigation safety in practice

  19. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    Science.gov (United States)

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
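
    A minimal sketch of the household decision step under the three economic models named in the abstract, for a single flood-proofing investment. The flood probability, damage, cost, wealth and behavioural parameters are illustrative assumptions (the value and weighting functions follow the standard Tversky-Kahneman forms), and the "learning" step is a simple stand-in for the paper's Bayesian updating process.

```python
# Expected utility vs. prospect theory (with updated risk perception) for one mitigation decision.
p_flood = 0.01          # annual flood probability (assumed)
damage = 50_000         # flood damage without the measure, euro (assumed)
reduction = 0.6         # fraction of damage avoided by the measure (assumed)
cost = 1_500            # annualised cost of the measure (assumed)

# (1) Expected utility theory with CRRA utility over wealth
def eu_invests(wealth=100_000, crra=1.5):
    u = lambda w: (w ** (1 - crra) - 1) / (1 - crra)
    eu_no = p_flood * u(wealth - damage) + (1 - p_flood) * u(wealth)
    eu_yes = (p_flood * u(wealth - cost - (1 - reduction) * damage)
              + (1 - p_flood) * u(wealth - cost))
    return eu_yes > eu_no

# (2) Prospect theory: loss-side value function and probability weighting
def pt_value(x, alpha=0.88, lam=2.25):
    return -lam * (-x) ** alpha if x < 0 else x ** alpha

def pt_weight(p, gamma=0.69):
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def pt_invests(p_perceived):
    v_no = pt_weight(p_perceived) * pt_value(-damage)
    v_yes = (pt_weight(p_perceived) * pt_value(-cost - (1 - reduction) * damage)
             + (1 - pt_weight(p_perceived)) * pt_value(-cost))
    return v_yes > v_no

# (3) PT with updated risk perception after experiencing a flood (simple stand-in)
def updated_probability(prior=0.01, floods_observed=1, years=10):
    a, b = 1 + floods_observed, 1 + years - floods_observed   # Beta-Binomial experience term
    return (prior + a / (a + b)) / 2.0                        # blend of prior and experience

print("EU household invests:                    ", eu_invests())
print("PT household invests (perceived p = 1%): ", pt_invests(0.01))
print("PT household invests after a recent flood:", pt_invests(updated_probability()))
```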

  20. Blended Risk Approach in Applying PSA Models to Risk-Based Regulations

    International Nuclear Information System (INIS)

    Dimitrijevic, V. B.; Chapman, J. R.

    1996-01-01

    In this paper, the authors will discuss a modern approach in applying PSA models in risk-based regulation. The Blended Risk Approach is a combination of traditional and probabilistic processes. It is receiving increased attention in different industries in the U. S. and abroad. The use of the deterministic regulations and standards provides a proven and well understood basis on which to assess and communicate the impact of change to plant design and operation. Incorporation of traditional values into risk evaluation is working very well in the blended approach. This approach is very application specific. It includes multiple risk attributes, qualitative risk analysis, and basic deterministic principles. In blending deterministic and probabilistic principles, this approach ensures that the objectives of the traditional defense-in-depth concept are not compromised and the design basis of the plant is explicitly considered. (author)

  1. An Integrated Risk Index Model Based on Hierarchical Fuzzy Logic for Underground Risk Assessment

    Directory of Open Access Journals (Sweden)

    Muhammad Fayaz

    2017-10-01

    Available space in congested cities is becoming scarce due to growing urbanization. The utilization of underground space is considered a solution to the limited space in smart cities, and the number of underground facilities is growing day by day in the developing world. Typical underground facilities include transit subways, parking lots, electric lines, and water supply and sewer lines. The occurrence of accidents involving underground facilities is a random phenomenon, so to avoid accidental losses a risk assessment method is required that performs continuous risk assessment and reports any abnormality before it happens. In this paper, we propose a hierarchical fuzzy inference based model for underground risk assessment. The proposed hierarchical fuzzy inference architecture reduces the total number of rules in the rule base. Rule reduction is important because the curse of dimensionality damages transparency and interpretability: it is very hard to understand and justify hundreds or thousands of fuzzy rules, and computation time grows as the number of rules increases. The proposed model needs 175 rules for eight input parameters to compute the risk index, whereas conventional (flat) fuzzy logic with the same number of input parameters requires 390,625 rules; the proposed model therefore significantly mitigates the curse of dimensionality. Rule design for fuzzy logic is also a tedious task, so we also introduce new rule schemes, namely maximum rule-based and average rule-based; the two schemes can be used interchangeably according to the logic needed for rule design. The experimental results show that the proposed method is a sound choice for risk index calculation when the number of input variables is large.
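
    The rule counts quoted above can be checked directly: with five membership functions per input, a flat rule base over eight inputs needs $5^8 = 390{,}625$ rules, while a binary-tree hierarchy that combines the inputs two at a time through seven two-input inference modules needs only $7 \times 5^2 = 175$ rules. (The specific two-input, five-membership-function decomposition is an assumption used here to reproduce the figure; the paper's exact hierarchy may differ.)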

  2. Intelligent judgements over health risks in a spatial agent-based model.

    Science.gov (United States)

    Abdulkareem, Shaheen A; Augustijn, Ellen-Wien; Mustafa, Yaseen T; Filatova, Tatiana

    2018-03-20

    Millions of people worldwide are exposed to deadly infectious diseases on a regular basis; breaking news of the Zika outbreak, for instance, made the main media headlines internationally. Perceived disease risks motivate people to adapt their behavior toward a safer and more protective lifestyle. Computational science is instrumental in exploring patterns of disease spread emerging from many individual decisions and interactions among agents and their environment by means of agent-based models. Yet current disease models rarely simulate the dynamics of risk perception and its impact on adaptive protective behavior. Social sciences offer insights into individual risk perception and corresponding protective actions, while machine learning provides algorithms and methods to capture these learning processes. This article presents an innovative approach to extend agent-based disease models by capturing behavioral aspects of decision making in a risky context using machine learning techniques. We illustrate it with a case of cholera in Kumasi, Ghana, accounting for spatial and social risk factors that affect intelligent behavior and corresponding disease incidents. The results of computational experiments comparing intelligent with zero-intelligent representations of agents in a spatial disease agent-based model are discussed. We present a spatial disease agent-based model (ABM) with agents' behavior grounded in Protection Motivation Theory. Spatial and temporal patterns of disease diffusion among zero-intelligent agents are compared to those produced by a population of intelligent agents. Two Bayesian Networks (BNs) were designed and coded in R and integrated with the NetLogo-based cholera ABM: the first is a one-tier BN1 (risk perception only), the second a two-tier BN2 (risk perception and coping behavior). We run three experiments (zero-intelligent agents, BN1 intelligence and BN2 intelligence) and report the results per experiment in terms of
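
    A minimal sketch of the two-tier idea (risk perception first, coping behaviour second) using hand-written conditional probability tables in plain Python. The structure, states and numbers are illustrative assumptions, not the study's R/NetLogo networks.

```python
# Two-tier risk perception / coping sketch with simple conditional probability tables.
import random

def risk_perception(distance_to_outbreak, household_cases):
    """Tier 1: P(perceive high risk | spatial and social evidence) -- assumed CPT."""
    base = 0.2
    if distance_to_outbreak == "near":
        base += 0.4
    if household_cases == "yes":
        base += 0.3
    return min(base, 0.95)

def coping_probability(perceived_high, knows_protection):
    """Tier 2: P(adopt protective behaviour | risk perception, knowledge) -- assumed CPT."""
    table = {
        (True, True): 0.8,
        (True, False): 0.4,
        (False, True): 0.3,
        (False, False): 0.05,
    }
    return table[(perceived_high, knows_protection)]

random.seed(3)
agents = [{"distance": random.choice(["near", "far"]),
           "cases": random.choice(["yes", "no"]),
           "knows": random.choice([True, False])} for _ in range(1000)]

protective = 0
for a in agents:
    perceived_high = random.random() < risk_perception(a["distance"], a["cases"])
    if random.random() < coping_probability(perceived_high, a["knows"]):
        protective += 1
print("share of agents taking protective action:", protective / len(agents))
```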

  3. Physics-Based Fragment Acceleration Modeling for Pressurized Tank Burst Risk Assessments

    Science.gov (United States)

    Manning, Ted A.; Lawrence, Scott L.

    2014-01-01

    As part of comprehensive efforts to develop physics-based risk assessment techniques for space systems at NASA, coupled computational fluid and rigid body dynamic simulations were carried out to investigate the flow mechanisms that accelerate tank fragments in bursting pressurized vessels. Simulations of several configurations were compared to analyses based on the industry-standard Baker explosion model, and were used to formulate an improved version of the model. The standard model, which neglects an external fluid, was found to agree best with simulation results only in configurations where the internal-to-external pressure ratio is very high and fragment curvature is small. The improved model introduces terms that accommodate an external fluid and better account for variations based on circumferential fragment count. Physics-based analysis was critical in increasing the model's range of applicability. The improved tank burst model can be used to produce more accurate risk assessments of space vehicle failure modes that involve high-speed debris, such as exploding propellant tanks and bursting rocket engines.

  4. Risk-based systems analysis for emerging technologies: Applications of a technology risk assessment model to public decision making

    International Nuclear Information System (INIS)

    Quadrel, M.J.; Fowler, K.M.; Cameron, R.; Treat, R.J.; McCormack, W.D.; Cruse, J.

    1995-01-01

    The risk-based systems analysis model was designed to establish funding priorities among competing technologies for tank waste remediation. The model addresses a gap in the Department of Energy's (DOE's) "toolkit" for establishing funding priorities among emerging technologies by providing disciplined risk and cost assessments of candidate technologies within the context of a complete remediation system. The model is comprised of a risk and cost assessment and a decision interface. The former assesses the potential reductions in risk and cost offered by new technology relative to the baseline risk and cost of an entire system. The latter places this critical information in context of other values articulated by decision makers and stakeholders in the DOE system. The risk assessment portion of the model is demonstrated for two candidate technologies for tank waste retrieval (arm-based mechanical retrieval -- the "long reach arm") and subsurface barriers (close-coupled chemical barriers). Relative changes from the base case in cost and risk are presented for these two technologies to illustrate how the model works. The model and associated software build on previous work performed for DOE's Office of Technology Development and the former Underground Storage Tank Integrated Demonstration, and complement a decision making tool presented at Waste Management 1994 for integrating technical judgements and non-technical (stakeholder) values when making technology funding decisions

  5. Addressing dependability by applying an approach for model-based risk assessment

    International Nuclear Information System (INIS)

    Gran, Bjorn Axel; Fredriksen, Rune; Thunem, Atoosa P.-J.

    2007-01-01

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how methodology can be applied as a part of a trust case development

  6. Addressing dependability by applying an approach for model-based risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

    2007-11-15

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how methodology can be applied as a part of a trust case development.

  7. Risk Assessment of Engineering Project Financing Based on PPP Model

    Directory of Open Access Journals (Sweden)

    Ma Qiuli

    2017-01-01

    At present, project financing channels are limited and urban facilities are in short supply, so the risk assessment and prevention mechanisms for financing should be further improved to reduce the risk of project financing. In view of this, a fuzzy comprehensive evaluation model of project financing risk is established, combining the fuzzy comprehensive evaluation method with the analytic hierarchy process. The soundness and effectiveness of the model are verified with the example of the world port project in Luohe city, and the model provides a basis and reference for engineering project financing based on the PPP model.
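
    A minimal sketch of AHP-weighted fuzzy comprehensive evaluation as described above: weights from pairwise comparison are composed with a factor-by-grade membership matrix and the overall rating is read off by maximum membership. The risk factors, weights and membership matrix are illustrative assumptions, not the paper's data.

```python
# Fuzzy comprehensive evaluation sketch: B = W o R with a weighted-average operator.
import numpy as np

# AHP-derived weights for four illustrative PPP financing risk factors
w = np.array([0.40, 0.25, 0.20, 0.15])    # e.g. policy, market, construction, operation risk

# Membership of each factor in the grades (low, medium, high), e.g. from expert scoring
R = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.4, 0.2],
              [0.1, 0.6, 0.3],
              [0.5, 0.3, 0.2]])

B = w @ R                                  # composition of weights with the membership matrix
B = B / B.sum()                            # normalise
grades = ["low", "medium", "high"]
print("membership in each risk grade:", dict(zip(grades, B.round(3))))
print("overall rating (maximum membership):", grades[int(np.argmax(B))])
```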

  8. WORKSHOP ON APPLICATION OF STATISTICAL METHODS TO BIOLOGICALLY-BASED PHARMACOKINETIC MODELING FOR RISK ASSESSMENT

    Science.gov (United States)

    Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...

  9. Large-scale model-based assessment of deer-vehicle collision risk.

    Directory of Open Access Journals (Sweden)

    Torsten Hothorn

    Full Text Available Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer-vehicle collisions and as browsers of palatable trees have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high efforts and costs associated with attempts to estimate population sizes of free-living ungulates living in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on >74,000 deer-vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer-vehicle collisions and to investigate the relationship between deer-vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer-vehicle collisions, which allows nonlinear environment-deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types on the scale of Bavarian municipalities. Based on this risk model, we propose a new "deer-vehicle collision index" for deer management. We show that the risk of deer-vehicle collisions is positively correlated to browsing intensity and to harvest numbers. Overall, our results demonstrate that the number of deer-vehicle collisions can be predicted with high precision on the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer-vehicle collisions provides a cost-efficient instrument for deer management on the landscape scale. The measures derived from our model provide valuable information for planning road protection and defining

  10. Risk-based safety indicators

    International Nuclear Information System (INIS)

    Sedlak, J.

    2001-12-01

    The report is structured as follows: 1. Risk-based safety indicators: Typology of risk-based indicators (RBIs); Tools for defining RBIs; Requirements for the PSA model; Data sources for RBIs; Types of risks monitored; RBIs and operational safety indicators; Feedback from operating experience; PSA model modification for RBIs; RBI categorization; RBI assessment; RBI applications; Suitable RBI applications. 2. Proposal for risk-based indicators: Acquiring information from operational experience; Method of acquiring safety relevance coefficients for the systems from a PSA model; Indicator definitions; On-line indicators. 3. Annex: Application of RBIs worldwide. (P.A.)

  11. Integrating adaptive behaviour in large-scale flood risk assessments: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Aerts, Jeroen

    2015-04-01

    Between 1998 and 2009, Europe suffered over 213 major damaging floods, causing 1126 deaths, displacing around half a million people. In this period, floods caused at least 52 billion euro in insured economic losses, making floods the most costly natural hazard faced in Europe. In many low-lying areas, the main strategy to cope with floods is to reduce the risk of the hazard through flood defence structures, like dikes and levees. However, it is suggested that part of the responsibility for flood protection needs to shift to households and businesses in areas at risk, and that governments and insurers can effectively stimulate the implementation of individual protective measures. However, adaptive behaviour towards flood risk reduction and the interaction between the government, insurers, and individuals has hardly been studied in large-scale flood risk assessments. In this study, a European Agent-Based Model is developed, including agent representatives for the administrative stakeholders of European Member states, insurer and reinsurer markets, and individuals following complex behaviour models. The Agent-Based Modelling approach allows for an in-depth analysis of the interaction between heterogeneous autonomous agents and the resulting (non-)adaptive behaviour. Existing flood damage models are part of the European Agent-Based Model to allow for a dynamic response of both the agents and the environment to changing flood risk and protective efforts. By following an Agent-Based Modelling approach this study is a first contribution to overcome the limitations of traditional large-scale flood risk models in which the influence of individual adaptive behaviour towards flood risk reduction is often lacking.

  12. Task-based dermal exposure models for regulatory risk assessment

    NARCIS (Netherlands)

    Warren, N.D.; Marquart, H.; Christopher, Y.; Laitinen, J.; Hemmen, J.J. van

    2006-01-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of

  13. A Risk Prediction Model for Sporadic CRC Based on Routine Lab Results.

    Science.gov (United States)

    Boursi, Ben; Mamtani, Ronac; Hwang, Wei-Ting; Haynes, Kevin; Yang, Yu-Xiao

    2016-07-01

    Current risk scores for colorectal cancer (CRC) are based on demographic and behavioral factors and have limited predictive values. To develop a novel risk prediction model for sporadic CRC using clinical and laboratory data in electronic medical records, we conducted a nested case-control study in a UK primary care database. Cases included those with a diagnostic code of CRC, aged 50-85. Each case was matched with four controls using incidence density sampling. CRC predictors were examined using univariate conditional logistic regression, and variables significant in the univariate analysis were carried forward into multivariable prediction models. A demographic and behavioral model which included age, sex, height, obesity, ever smoking, alcohol dependence, and previous screening colonoscopy had an AUC of 0.58 (0.57-0.59) with poor goodness of fit. A laboratory-based model including hematocrit, MCV, lymphocytes, and neutrophil-lymphocyte ratio (NLR) had an AUC of 0.76 (0.76-0.77) and a McFadden's R2 of 0.21 with a NRI of 47.6 %. A combined model including sex, hemoglobin, MCV, white blood cells, platelets, NLR, and oral hypoglycemic use had an AUC of 0.80 (0.79-0.81) with a McFadden's R2 of 0.27 and a NRI of 60.7 %. Similar results were shown in an internal validation set. A laboratory-based risk model had good predictive power for sporadic CRC risk.
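
    A minimal sketch of the kind of laboratory-based risk model described above: a logistic regression on routine lab values, evaluated by AUC on held-out data. The synthetic data generator and its coefficients are purely illustrative assumptions; this is not the UK primary-care dataset or the paper's model.

```python
# Laboratory-based risk model sketch: logistic regression on routine lab values with AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000
hemoglobin = rng.normal(14.0, 1.5, n)
mcv = rng.normal(90.0, 5.0, n)
nlr = rng.lognormal(0.7, 0.4, n)                       # neutrophil-lymphocyte ratio

# Assumed association: lower hemoglobin/MCV and higher NLR raise CRC risk
logit = -5.0 - 0.3 * (hemoglobin - 14.0) - 0.05 * (mcv - 90.0) + 0.5 * (nlr - 2.0)
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([hemoglobin, mcv, nlr])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC of the laboratory-based model on held-out data: {auc:.2f}")
```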

  14. Component Degradation Susceptibilities As The Bases For Modeling Reactor Aging Risk

    International Nuclear Information System (INIS)

    Unwin, Stephen D.; Lowry, Peter P.; Toyooka, Michael Y.

    2010-01-01

    The extension of nuclear power plant operating licenses beyond 60 years in the United States will be necessary if we are to meet national energy needs while addressing the issues of carbon and climate. Characterizing the operating risks associated with aging reactors is problematic because the principal tool for risk-informed decision-making, Probabilistic Risk Assessment (PRA), is not ideally-suited to addressing aging systems. The components most likely to drive risk in an aging reactor - the passives - receive limited treatment in PRA, and furthermore, standard PRA methods are based on the assumption of stationary failure rates: a condition unlikely to be met in an aging system. A critical barrier to modeling passives aging on the wide scale required for a PRA is that there is seldom sufficient field data to populate parametric failure models, and nor is there the availability of practical physics models to predict out-year component reliability. The methodology described here circumvents some of these data and modeling needs by using materials degradation metrics, integrated with conventional PRA models, to produce risk importance measures for specific aging mechanisms and component types. We suggest that these measures have multiple applications, from the risk-screening of components to the prioritization of materials research.

  15. Forewarning model for water pollution risk based on Bayes theory.

    Science.gov (United States)

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce losses from water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. The model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen the index system. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the samples' features better reflect and represent the population. The forewarning level is judged by the maximum-probability rule and then combined with local conditions to propose management strategies intended to downgrade heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model against actual and simulated water pollution risk data from 2000 to 2009, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is theoretically rigorous and methodologically flexible, with reasonable results and a simple structure, and that it has strong logical superiority and regional adaptability, providing a new way of forewarning water pollution risk.
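
    The Bayes updating step described above has the generic form (notation assumed, not taken from the paper):

        P(\text{level}_i \mid x) = \frac{P(x \mid \text{level}_i)\, P(\text{level}_i)}{\sum_j P(x \mid \text{level}_j)\, P(\text{level}_j)}, \qquad \text{warning} = \arg\max_i P(\text{level}_i \mid x),

    where $P(\text{level}_i)$ is the prior probability of forewarning level $i$, $P(x \mid \text{level}_i)$ the likelihood of the simulated index values $x$ under that level, and the maximum-probability rule selects the level with the largest posterior.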

  16. Dealing with project complexity by matrix-based propagation modelling for project risk analysis

    OpenAIRE

    Fang , Chao; Marle , Franck

    2012-01-01

    Engineering projects are facing a growing complexity and are thus exposed to numerous and interdependent risks. In this paper, we present a quantitative method for modelling propagation behaviour in the project risk network. The construction of the network requires the involvement of the project manager and related experts using the Design Structure Matrix (DSM) method. A matrix-based risk propagation model is introduced to calculate risk propagation and thus to re-eva...

  17. A Risk Assessment Example for Soil Invertebrates Using Spatially Explicit Agent-Based Models

    DEFF Research Database (Denmark)

    Reed, Melissa; Alvarez, Tania; Chelinho, Sonia

    2016-01-01

    Current risk assessment methods for measuring the toxicity of plant protection products (PPPs) on soil invertebrates use standardized laboratory conditions to determine acute effects on mortality and sublethal effects on reproduction. If an unacceptable risk is identified at the lower tier...... population models for ubiquitous soil invertebrates (collembolans and earthworms) as refinement options in current risk assessment. Both are spatially explicit agent-based models (ABMs), incorporating individual and landscape variability. The models were used to provide refined risk assessments for different...... application scenarios of a hypothetical pesticide applied to potato crops (full-field spray onto the soil surface [termed “overall”], in-furrow, and soil-incorporated pesticide applications). In the refined risk assessment, the population models suggest that soil invertebrate populations would likely recover...

  18. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for quantifying risk and ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient is called the hazardous risk coefficient; it covers anticipated hazards which may occur in the future, with the risk deduced from criteria for consequences on safety, environment, maintenance and economic risks together with the corresponding cost of consequences. The characteristic values of all three risk coefficients are obtained from a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of critical items are then estimated. The prioritization of critical items using the developed mathematical model for risk assessment should be useful in optimizing financial losses and the timing of maintenance actions.
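
    A minimal sketch of the aggregation step described above: simulate each of the three risk coefficients within its controlling range, reduce each to one distinctive value, and statistically add them to rank the critical items. The item names and ranges are illustrative assumptions, not values from the paper.

```python
# Random-simulation aggregation of three risk coefficients and ranking of critical items.
import numpy as np

rng = np.random.default_rng(11)

# Controlling range (low, high) of (likelihood, abstract, hazardous) coefficients per item
items = {
    "bearing":  [(0.3, 0.6), (0.2, 0.5), (0.1, 0.4)],
    "gearbox":  [(0.5, 0.8), (0.3, 0.6), (0.2, 0.5)],
    "coupling": [(0.1, 0.3), (0.1, 0.3), (0.1, 0.2)],
}

final = {}
for item, ranges in items.items():
    # one distinctive value per coefficient from repeated random sampling within its range
    distinctive = [rng.uniform(lo, hi, 10_000).mean() for lo, hi in ranges]
    final[item] = sum(distinctive)            # statistical addition of the three coefficients

for rank, (item, coeff) in enumerate(sorted(final.items(), key=lambda kv: -kv[1]), start=1):
    print(rank, item, round(coeff, 3))
```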

  19. Modeling of Ship Collision Risk Index Based on Complex Plane and Its Realization

    OpenAIRE

    Xiaoqin Xu; Xiaoqiao Geng; Yuanqiao Wen

    2016-01-01

    Ship collision risk index is a basic and important concept in the domain of ship collision avoidance. In this paper, the advantages and deficiencies of the various calculation methods for the ship collision risk index are pointed out. A ship collision risk model based on the complex plane, which makes up for the deficiencies of the widely used evaluation models proposed by Kearon J. and Liu Ruru, is then proposed. On this basis, the calculation method of the collision risk index under the encount...

  20. Risk based surveillance for vector borne diseases

    DEFF Research Database (Denmark)

    Bødker, Rene

    of samples and hence early detection of outbreaks. Models for vector borne diseases in Denmark have demonstrated dramatic variation in outbreak risk during the season and between years. The Danish VetMap project aims to make these risk based surveillance estimates available on the veterinarians smart phones...... in Northern Europe. This model approach may be used as a basis for risk based surveillance. In risk based surveillance limited resources for surveillance are targeted at geographical areas most at risk and only when the risk is high. This makes risk based surveillance a cost effective alternative...... sample to a diagnostic laboratory. Risk based surveillance models may reduce this delay. An important feature of risk based surveillance models is their ability to continuously communicate the level of risk to veterinarians and hence increase awareness when risk is high. This is essential for submission...

  1. Causal Loop-based Modeling on System Dynamics for Risk Communication

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Ju [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kang, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2009-10-15

    It is true that a national policy should be based on public confidence, analyzing the public's recognition of and attitude toward life safety, since the public has very particular risk perception characteristics. For achieving effective public consensus regarding a national policy such as nuclear power, we have to utilize a risk communication (hereafter called RiCom) process. However, domestic research models on the RiCom process do not provide a practical guideline, because most of them are still superficial and stick to an administrative aspect. Also, most current models have no experience of verification and validation for effective application to diverse stakeholders. This study focuses on the public's dynamic mechanisms through modeling on system dynamics, basically utilizing causal loop diagrams (CLD) and stock flow diagrams (SFD), which are regarded as critical techniques for decision making in many industrial RiCom models.
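
    The record above only names CLD/SFD modelling; the sketch below is a minimal, generic stock-and-flow simulation (one "public confidence" stock, an inflow driven by communication effort and an erosion outflow), not the authors' model. Every coefficient and the feedback structure are invented for illustration.

```python
# Minimal stock-flow (system dynamics) simulation: one stock, two flows.
dt, steps = 0.1, 200
confidence = 0.5            # stock: public confidence (0..1), initial value assumed
communication_effort = 0.3  # exogenous risk-communication input, assumed constant

history = []
for _ in range(steps):
    inflow = communication_effort * (1.0 - confidence)   # diminishing returns (assumption)
    outflow = 0.1 * confidence                            # gradual erosion of trust (assumption)
    confidence += (inflow - outflow) * dt                 # Euler integration of the stock
    history.append(confidence)

print(f"final confidence = {history[-1]:.2f}")
```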

  2. Causal Loop-based Modeling on System Dynamics for Risk Communication

    International Nuclear Information System (INIS)

    Lee, Chang Ju; Kang, Kyung Min

    2009-01-01

    It is true that a national policy should be based on public confidence, analyzing the public's recognition of and attitude toward life safety, since the public has very particular risk perception characteristics. For achieving effective public consensus regarding a national policy such as nuclear power, we have to utilize a risk communication (hereafter called RiCom) process. However, domestic research models on the RiCom process do not provide a practical guideline, because most of them are still superficial and stick to an administrative aspect. Also, most current models have no experience of verification and validation for effective application to diverse stakeholders. This study focuses on the public's dynamic mechanisms through modeling on system dynamics, basically utilizing causal loop diagrams (CLD) and stock flow diagrams (SFD), which are regarded as critical techniques for decision making in many industrial RiCom models.

  3. Life cycle cost-based risk model for energy performance contracting retrofits

    Science.gov (United States)

    Berghorn, George H.

    Buildings account for 41% of the primary energy consumption in the United States, nearly half of which is accounted for by commercial buildings. Among the greatest energy users are those in the municipalities, universities, schools, and hospitals (MUSH) market. Correctional facilities are in the upper half of all commercial building types for energy intensity. Public agencies have experienced reduced capital budgets to fund retrofits; this has led to the increased use of energy performance contracts (EPC), which are implemented by energy services companies (ESCOs). These companies guarantee a minimum amount of energy savings resulting from the retrofit activities, which in essence transfers performance risk from the owner to the contractor. Building retrofits in the MUSH market, especially correctional facilities, are well-suited to EPC, yet despite this potential and their high energy intensities, efficiency improvements lag behind that of other public building types. Complexities in project execution, lack of support for data requests and sub-metering, and conflicting project objectives have been cited as reasons for this lag effect. As a result, project-level risks must be understood in order to support wider adoption of retrofits in the public market, in particular the correctional facility sub-market. The goal of this research is to understand risks related to the execution of energy efficiency retrofits delivered via EPC in the MUSH market. To achieve this goal, in-depth analysis and improved understanding was sought with regard to ESCO risks that are unique to EPC in this market. The proposed work contributes to this understanding by developing a life cycle cost-based risk model to improve project decision making with regard to risk control and reduction. The specific objectives of the research are: (1) to perform an exploratory analysis of the EPC retrofit process and identify key areas of performance risk requiring in-depth analysis; (2) to construct a

  4. Methodological Bases for Describing Risks of the Enterprise Business Model in Integrated Reporting

    Directory of Open Access Journals (Sweden)

    Nesterenko Oksana O.

    2017-12-01

    Full Text Available The aim of the article is to substantiate the methodological bases for describing the business and accounting risks of an enterprise business model in integrated reporting for their timely detection and assessment, and to develop methods for their leveling or minimizing and possible prevention. It is proposed to consider risks in the process of forming integrated reporting from two sides: first, risks that arise in the business model of an organization and should be disclosed in its integrated report; second, accounting risks of integrated reporting, which should be taken into account by members of the cross-sectoral working group and management personnel in the process of forming and promulgating integrated reporting. To develop an adequate accounting and analytical tool for disclosing information about the risks of the business model and integrated reporting, and for their leveling or minimization, a terminological analysis of the essence of entrepreneurial and accounting risks is carried out in the article. The entrepreneurial risk is defined as an objective-subjective economic category that characterizes the probability of negative or positive consequences of economic-social-ecological activity within the framework of the business model of an enterprise under uncertainty. The accounting risk is suggested to be understood as the probability of unfavorable consequences resulting from organizational and methodological errors in the integrated accounting system, which pose a threat to the quality, accuracy and reliability of the reporting information on economic, social and environmental activities in integrated reporting, as well as a threat of inappropriate decision-making by stakeholders based on the integrated report. For the timely identification of business risks and maximum leveling of the influence of accounting risks on the process of formation and publication of integrated reporting, in the study the place of entrepreneurial and accounting risks in

  5. Evaluating the Risk of Metabolic Syndrome Based on an Artificial Intelligence Model

    Directory of Open Access Journals (Sweden)

    Hui Chen

    2014-01-01

    Full Text Available Metabolic syndrome is a worldwide public health problem and a serious threat to people's health and lives. Understanding the relationship between metabolic syndrome and physical symptoms is a difficult and challenging task, and few studies have been performed in this field. It is important to classify adults who are at high risk of metabolic syndrome without having to use a biochemical index and, likewise, to develop technology with a high economic rate of return that simplifies this detection. In this paper, an artificial intelligence model was developed to identify adults at risk of metabolic syndrome based on physical signs; this model achieved more powerful classification capacity than the PCLR (principal component logistic regression) model. A case study was performed on physical signs data, collected without using a biochemical index, from the staff of Lanzhou Grid Company in Gansu province of China. The results show that the developed artificial intelligence model is an effective classification system for identifying individuals at high risk of metabolic syndrome.
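
    The study's physical-sign data and model architecture are not reproduced in the record, so the sketch below only illustrates the kind of comparison described: a small neural network standing in for the artificial intelligence model versus principal component logistic regression (PCLR), trained on synthetic data.

```python
# Illustrative comparison of a small neural network ("artificial intelligence model")
# against principal component logistic regression (PCLR) on synthetic data;
# the real study used physical-sign data that are not reproduced here.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=1000, n_features=8, n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pclr = make_pipeline(PCA(n_components=4), LogisticRegression(max_iter=1000)).fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

print("PCLR accuracy:", round(pclr.score(X_te, y_te), 3))
print("ANN  accuracy:", round(ann.score(X_te, y_te), 3))
```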

  6. Risk, individual differences, and environment: an Agent-Based Modeling approach to sexual risk-taking.

    Science.gov (United States)

    Nagoski, Emily; Janssen, Erick; Lohrmann, David; Nichols, Eric

    2012-08-01

    Risky sexual behaviors, including the decision to have unprotected sex, result from interactions between individuals and their environment. The current study explored the use of Agent-Based Modeling (ABM), a methodological approach in which computer-generated artificial societies simulate human sexual networks, to assess the influence of heterogeneity of sexual motivation on the risk of contracting HIV. The models successfully simulated some characteristics of human sexual systems, such as the relationship between individual differences in sexual motivation (sexual excitation and inhibition) and sexual risk, but failed to reproduce the scale-free distribution of number of partners observed in the real world. ABM has the potential to inform intervention strategies that target the interaction between an individual and his or her social environment.

  7. Modeling of Ship Collision Risk Index Based on Complex Plane and Its Realization

    Directory of Open Access Journals (Sweden)

    Xiaoqin Xu

    2016-07-01

    Full Text Available Ship collision risk index is a basic and important concept in the domain of ship collision avoidance. In this paper, the advantages and deficiencies of the various calculation methods of the ship collision risk index are pointed out. Then a ship collision risk model based on the complex plane is proposed, which can well make up for the deficiencies of the widely used evaluation model proposed by Kearon J. and Liu Ruru. On this basis, the calculation method of the collision risk index under multi-ship encounter situations is constructed, and the three-dimensional image and spatial curve of the risk index are derived. Finally, a single-chip microcomputer is used to realize the model; attaching this microcomputer to ARPA supports the decision-making of marine navigators.

  8. Time-based collision risk modeling for air traffic management

    Science.gov (United States)

    Bell, Alan E.

    Since the emergence of commercial aviation in the early part of last century, economic forces have driven a steadily increasing demand for air transportation. Increasing density of aircraft operating in a finite volume of airspace is accompanied by a corresponding increase in the risk of collision, and in response to a growing number of incidents and accidents involving collisions between aircraft, governments worldwide have developed air traffic control systems and procedures to mitigate this risk. The objective of any collision risk management system is to project conflicts and provide operators with sufficient opportunity to recognize potential collisions and take necessary actions to avoid them. It is therefore the assertion of this research that the currency of collision risk management is time. Future Air Traffic Management Systems are being designed around the foundational principle of four dimensional trajectory based operations, a method that replaces legacy first-come, first-served sequencing priorities with time-based reservations throughout the airspace system. This research will demonstrate that if aircraft are to be sequenced in four dimensions, they must also be separated in four dimensions. In order to separate aircraft in four dimensions, time must emerge as the primary tool by which air traffic is managed. A functional relationship exists between the time-based performance of aircraft, the interval between aircraft scheduled to cross some three dimensional point in space, and the risk of collision. This research models that relationship and presents two key findings. First, a method is developed by which the ability of an aircraft to meet a required time of arrival may be expressed as a robust standard for both industry and operations. Second, a method by which airspace system capacity may be increased while maintaining an acceptable level of collision risk is presented and demonstrated for the purpose of formulating recommendations for procedures
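
    As a toy illustration of the asserted relationship between time-based performance, the scheduled crossing interval and collision risk (not the thesis's model), one can assume independent Gaussian time-of-arrival errors for two aircraft and compute the probability that their actual crossing interval falls below a minimum separation time. All numbers below are assumptions.

```python
from math import erf, sqrt

def prob_loss_of_separation(interval_s, sigma1_s, sigma2_s, min_sep_s):
    """P(|actual interval| < min_sep) when each aircraft's time-of-arrival error
    is an independent zero-mean Gaussian with the given standard deviation."""
    sigma = sqrt(sigma1_s**2 + sigma2_s**2)          # std of the interval error
    phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2)))   # standard normal CDF
    return phi((min_sep_s - interval_s) / sigma) - phi((-min_sep_s - interval_s) / sigma)

# Example: 90 s scheduled interval, 20 s and 25 s arrival-time errors, 60 s minimum separation.
print(f"{prob_loss_of_separation(90, 20, 25, 60):.4f}")
```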

  9. Risk assessment of storm surge disaster based on numerical models and remote sensing

    Science.gov (United States)

    Liu, Qingrong; Ruan, Chengqing; Zhong, Shan; Li, Jian; Yin, Zhonghui; Lian, Xihu

    2018-06-01

    Storm surge is one of the most serious ocean disasters in the world. Risk assessment of storm surge disaster for coastal areas has important implications for planning economic development and reducing disaster losses. Based on risk assessment theory, this paper uses coastal hydrological observations, a numerical storm surge model and multi-source remote sensing data to propose methods for valuing the hazard and vulnerability of storm surge and to build a storm surge risk assessment model. Storm surges for different recurrence periods are simulated in the numerical model, and the flooded areas and depths are calculated and used to assess the storm surge hazard; remote sensing data and GIS technology are used to extract key coastal objects and classify coastal land use, which is used for the vulnerability assessment of storm surge disaster. The storm surge risk assessment model is applied to a typical coastal city, and the result shows the reliability and validity of the model. The building and application of the storm surge risk assessment model provides a basic reference for city development planning and strengthens disaster prevention and mitigation.
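
    A minimal sketch of the hazard-times-vulnerability combination described above, using invented grids: simulated flood depth provides the hazard index, land-use classes from remote sensing provide vulnerability weights, and the product is classified into risk levels. The depth normalisation, class weights and thresholds are assumptions, not values from the paper.

```python
import numpy as np

# Flood depth (m) from the numerical surge model and a vulnerability weight per
# land-use class from remote sensing / GIS; both grids and class weights are invented.
depth = np.array([[0.0, 0.3, 1.2],
                  [0.1, 0.8, 2.0],
                  [0.0, 0.0, 0.5]])
land_use = np.array([[0, 1, 2],
                     [0, 2, 2],
                     [1, 0, 1]])          # 0 = farmland, 1 = residential, 2 = industrial
vulnerability = np.array([0.3, 0.8, 1.0])[land_use]

hazard = np.clip(depth / 2.0, 0, 1)        # normalise depth to a 0-1 hazard index (assumption)
risk = hazard * vulnerability              # risk = hazard x vulnerability

# Classify into three risk levels for mapping: 0 = low, 1 = medium, 2 = high.
levels = np.digitize(risk, [0.2, 0.6])
print(levels)
```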

  10. A quantitative risk-based model for reasoning over critical system properties

    Science.gov (United States)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many critical properties such as security, safety, survivability, fault tolerance, and real-time performance.

  11. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, S.Y.

    1994-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the US Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities
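
    A minimal sketch of the Latin Hypercube sampling idea mentioned for RESRAD-PROBABILISTIC, not the actual code: each input parameter's range is split into equally probable strata, one value is drawn per stratum, and the strata are shuffled independently across parameters. The parameter names and ranges are placeholders.

```python
import random

def latin_hypercube(n_samples, param_ranges):
    """One sample per equally probable stratum for each parameter, with the
    strata shuffled independently across parameters."""
    samples = {}
    for name, (lo, hi) in param_ranges.items():
        strata = list(range(n_samples))
        random.shuffle(strata)
        width = (hi - lo) / n_samples
        samples[name] = [lo + (s + random.random()) * width for s in strata]
    # Transpose into a list of parameter dictionaries, one per model run.
    return [{name: samples[name][i] for name in param_ranges} for i in range(n_samples)]

# Placeholder parameters for a dose-assessment run (not RESRAD's real inputs).
ranges = {"soil_concentration": (1.0, 10.0), "ingestion_rate": (0.01, 0.1)}
for sample in latin_hypercube(5, ranges):
    print(sample)
```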

  12. Development of risk-based computer models for deriving criteria on residual radioactivity and recycling

    International Nuclear Information System (INIS)

    Chen, Shih-Yew

    1995-01-01

    Argonne National Laboratory (ANL) is developing multimedia environmental pathway and health risk computer models to assess radiological risks to human health and to derive cleanup guidelines for environmental restoration, decommissioning, and recycling activities. These models are based on the existing RESRAD code, although each has a separate design and serves different objectives. Two such codes are RESRAD-BUILD and RESRAD-PROBABILISTIC. The RESRAD code was originally developed to implement the U.S. Department of Energy's (DOE's) residual radioactive materials guidelines for contaminated soils. RESRAD has been successfully used by DOE and its contractors to assess health risks and develop cleanup criteria for several sites selected for cleanup or restoration programs. RESRAD-BUILD analyzes human health risks from radioactive releases during decommissioning or rehabilitation of contaminated buildings. Risks to workers are assessed for dismantling activities; risks to the public are assessed for occupancy. RESRAD-BUILD is based on a room compartmental model analyzing the effects on room air quality of contaminant emission and resuspension (as well as radon emanation), the external radiation pathway, and other exposure pathways. RESRAD-PROBABILISTIC, currently under development, is intended to perform uncertainty analysis for RESRAD by using the Monte Carlo approach based on the Latin-Hypercube sampling scheme. The codes being developed at ANL are tailored to meet a specific objective of human health risk assessment and require specific parameter definition and data gathering. The combined capabilities of these codes satisfy various risk assessment requirements in environmental restoration and remediation activities. (author)

  13. Lifestyle-based risk model for fall risk assessment

    OpenAIRE

    Sannino, Giovanna; De Falco, Ivanoe; De Pietro, Guiseppe

    2016-01-01

    Purpose: The aim of this study was to identify the explicit relationship between lifestyle and the risk of falling in the form of a mathematical model. Starting from personal and behavioral information about a subject, e.g., weight, height, age, data about physical activity habits, and concern about falling, the model would estimate the score of her/his Mini-Balance Evaluation Systems (Mini-BES) test. This score ranges from 0 to 28, and the lower its value the more likely the subj...

  14. A risk-return based model to measure the performance of portfolio management

    Directory of Open Access Journals (Sweden)

    Hamid Reza Vakili Fard

    2014-10-01

    Full Text Available The primary concern in all portfolio management systems is to find a good tradeoff between risk and expected return, and a good balance between accepted risk and actual return indicates the performance of a particular portfolio. This paper develops the “A-Y Model” to measure the performance of a portfolio and analyze it during the bull and the bear market. The paper considers daily information from one year before and one year after Iran's 2013 presidential election. The proposed model provides lost profit and unrealized loss as measures of portfolio performance. The study first ranks the resulting data and then uses some non-parametric methods to see whether changes in the markets caused any change in the performance of the portfolio. The results indicate that despite increasing profitable opportunities in the bull market, the performance of the portfolio did not match the target risk. As a result, using the A-Y Model as a risk-and-return-based model to measure portfolio management performance appears to reduce risk and increase portfolio return.

  15. Task-based dermal exposure models for regulatory risk assessment.

    Science.gov (United States)

    Warren, Nicholas D; Marquart, Hans; Christopher, Yvette; Laitinen, Juha; VAN Hemmen, Joop J

    2006-07-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of new measurements of dermal exposure together with detailed contextual information. This article describes the development of a set of generic task-based models capable of predicting potential dermal exposure to both solids and liquids in a wide range of situations. To facilitate modelling of the wide variety of dermal exposure situations, six separate models were made for groupings of exposure scenarios called Dermal Exposure Operation units (DEO units). These task-based groupings cluster exposure scenarios with regard to the expected routes of dermal exposure and the expected influence of exposure determinants. Within these groupings linear mixed effect models were used to estimate the influence of various exposure determinants and to estimate components of variance. The models predict median potential dermal exposure rates for the hands and the rest of the body from the values of relevant exposure determinants. These rates are expressed as mg or µl of product per minute. Using these median potential dermal exposure rates and an accompanying geometric standard deviation allows a range of exposure percentiles to be calculated.
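
    If the exposure rates are taken to be lognormally distributed, a percentile can be derived from a median rate and a geometric standard deviation (GSD) as the median times GSD raised to the standard normal quantile. The sketch below illustrates that calculation with placeholder numbers; it does not reproduce the RISKOFDERM model outputs.

```python
from statistics import NormalDist

def exposure_percentile(median_rate, gsd, percentile):
    """Percentile of a lognormal exposure distribution given its median rate
    (mg or microlitre product per minute) and geometric standard deviation."""
    z = NormalDist().inv_cdf(percentile / 100.0)
    return median_rate * gsd ** z

# Placeholder values: median 2.0 mg/min, GSD 3.0.
for p in (50, 75, 90, 95):
    print(f"P{p}: {exposure_percentile(2.0, 3.0, p):.2f} mg/min")
```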

  16. Physicologically Based Toxicokinetic Models of Tebuconazole and Application in Human Risk Assessment

    DEFF Research Database (Denmark)

    Jonsdottir, Svava Osk; Reffstrup, Trine Klein; Petersen, Annette

    2016-01-01

    A series of physiologically based toxicokinetic (PBTK) models for tebuconazole were developed in four species: rat, rabbit, rhesus monkey, and human. The developed models were analyzed with respect to the application of the models in higher tier human risk assessment, and the prospect of using such models in risk assessment of cumulative and aggregate exposure is discussed. Relatively simple and biologically sound models were developed using available experimental data as parameters for describing the physiology of the species, as well as the absorption, distribution, metabolism, and elimination (ADME) of tebuconazole. The developed models were validated on in vivo half-life data for rabbit with good results, and on plasma and tissue concentration-time course data of tebuconazole after i.v. administration in rabbit. In most cases, the predicted concentration levels were seen to be within ...

  17. Mean-variance model for portfolio optimization with background risk based on uncertainty theory

    Science.gov (United States)

    Zhai, Jia; Bai, Manying

    2018-04-01

    The aim of this paper is to develop a mean-variance model for portfolio optimization that considers background risk, liquidity and transaction cost based on uncertainty theory. In the portfolio selection problem, the returns of securities and the liquidity of assets are treated as uncertain variables because of incidents or a lack of historical data, which are common in economic and social environments. We provide crisp forms of the model and a hybrid intelligent algorithm to solve it. Under a mean-variance framework, we analyze the portfolio frontier characteristics considering independently additive background risk. In addition, we discuss some effects of background risk and the liquidity constraint on portfolio selection. Finally, we demonstrate the proposed models by numerical simulations.

  18. An RES-Based Model for Risk Assessment and Prediction of Backbreak in Bench Blasting

    Science.gov (United States)

    Faramarzi, F.; Ebrahimi Farsangi, M. A.; Mansouri, H.

    2013-07-01

    Most blasting operations are associated with various forms of energy loss, emerging as environmental side effects of rock blasting, such as flyrock, vibration, airblast, and backbreak. Backbreak is an adverse phenomenon in rock blasting operations, which imposes risk and increases operation expenses because of safety reduction due to the instability of walls, poor fragmentation, and uneven burden in subsequent blasts. In this paper, based on the basic concepts of a rock engineering systems (RES) approach, a new model for the prediction of backbreak and the risk associated with a blast is presented. The newly suggested model involves 16 parameters affecting backbreak due to blasting, while retaining simplicity as well. The data for 30 blasts, carried out at Sungun copper mine, western Iran, were used to predict backbreak and the level of risk corresponding to each blast by the RES-based model. The results obtained were compared with the backbreak measured for each blast, which showed that the level of risk achieved is consistent with the measured backbreak. The maximum level of risk [vulnerability index (VI) = 60] was associated with blast No. 2, for which the corresponding average backbreak was the highest achieved (9.25 m). Also, for blasts with levels of risk under 40, the minimum average backbreaks (<4 m) were observed. Furthermore, to evaluate the model performance for backbreak prediction, the coefficient of correlation (R²) and root mean square error (RMSE) of the model were calculated (R² = 0.8; RMSE = 1.07), indicating the good performance of the model.
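
    A small sketch of the two reported performance measures, computed for made-up predicted versus measured backbreak values (the 30-blast Sungun data set is not reproduced here).

```python
# Coefficient of determination (R^2) and RMSE between measured and predicted backbreak.
measured  = [9.25, 3.1, 4.8, 6.0, 2.5]   # metres, illustrative values only
predicted = [8.90, 3.5, 4.2, 6.4, 2.9]

n = len(measured)
mean_m = sum(measured) / n
ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
ss_tot = sum((m - mean_m) ** 2 for m in measured)

r2 = 1 - ss_res / ss_tot
rmse = (ss_res / n) ** 0.5
print(f"R^2 = {r2:.2f}, RMSE = {rmse:.2f} m")
```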

  19. A risk based model supporting long term maintenance and reinvestment strategy decision making

    Energy Technology Data Exchange (ETDEWEB)

    Sand, Kjell; Montard, Julien; Tremoen, Tord H.

    2010-02-15

    This Technical Report is a product of the project Risk-Based Distribution System Asset Management (short: RISK DSAM) - Work Package 3, Risk exposure on the company/strategic level. In the report a concept for portfolio distribution system asset management is presented. The approach comprises four main steps: 1. Decide the asset base. 2. Divide the asset base into relevant archetypes. 3. Develop or select relevant maintenance and reinvestment strategies for the different archetypes. 4. Estimate risks and costs for each archetype under the relevant strategies. For the different steps guidelines are given, and a proposal for implementation of the concept is given in terms of a proposed IT system architecture. To evaluate the feasibility of such a concept, a prototype was developed using Visual Basic macros in Excel with real technical data from a small DSO. The experience from using the prototype shows that the concept is realistic. All assets are included, and depending on the ambition of the risk analysis both simple and more advanced simulation models might be embedded. Presentations of the concept to utility engineers have received positive feedback, indicating that the concept is regarded as a practical way to develop risk based asset management strategies for the asset fleet. It should be noted that the concept is meant to be applied at a company strategic level and is thus not designed to be applied to specific project or asset decisions. For this, more detailed models with area specific information, topology etc. are needed. (Author)

  20. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
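
    The authors' benchmark-based adjustment framework is not reproduced here; the sketch below only illustrates the underlying quantities: a 99% historical-simulation Value-at-Risk estimated as an empirical quantile of a daily profit-and-loss series, with a bootstrap interval showing how uncertain that quantile estimate is. The P&L series is synthetic.

```python
import random

random.seed(1)
# Synthetic daily profit-and-loss series (placeholder for a bank's real data).
pnl = [random.gauss(0.0, 1.0e6) for _ in range(500)]

def hist_var(series, level=0.99):
    """Historical-simulation VaR: the loss exceeded with probability 1 - level."""
    return -sorted(series)[int((1.0 - level) * len(series))]

point_estimate = hist_var(pnl)
# Bootstrap the quantile to illustrate how uncertain the estimate itself is.
boot = sorted(hist_var(random.choices(pnl, k=len(pnl))) for _ in range(1000))
print(f"99% VaR point estimate: {point_estimate:,.0f}")
print(f"bootstrap 90% interval: {boot[50]:,.0f} .. {boot[949]:,.0f}")
```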

  1. Radiation risk estimation based on measurement error models

    CERN Document Server

    Masiuk, Sergii; Shklyar, Sergiy; Chepurny, Mykola; Likhtarov, Illya

    2017-01-01

    This monograph discusses statistics and risk estimates applied to radiation damage under the presence of measurement errors. The first part covers nonlinear measurement error models, with a particular emphasis on efficiency of regression parameter estimators. In the second part, risk estimation in models with measurement errors is considered. Efficiency of the methods presented is verified using data from radio-epidemiological studies.

  2. WRF-based fire risk modelling and evaluation for years 2010 and 2012 in Poland

    Science.gov (United States)

    Stec, Magdalena; Szymanowski, Mariusz; Kryza, Maciej

    2016-04-01

    Wildfires are one of the main disturbances of forested, seminatural and agricultural ecosystems. They generate significant economic loss, especially in forest management and agriculture. Forest fire risk modeling is therefore essential, e.g. for forestry administration. In August 2015 a new method of forest fire risk forecasting entered into force in Poland. The method allows the prediction of a fire risk level on a 4-degree scale (0 - no risk, 3 - highest risk) and consists of a set of linearized regression equations. Meteorological information is used as predictors in the regression equations: air temperature, relative humidity, average wind speed, cloudiness and rainfall. The equations also include pine litter humidity as a measure of potential fuel characteristics. All these parameters are measured routinely in Poland at 42 basic and 94 auxiliary sites. The fire risk level is estimated for the current day (based on morning measurements) or the next day (based on midday measurements). The entire country is divided into 42 prognostic zones, and the fire risk level for each zone is taken from the closest measuring site. The first goal of this work is to assess whether the measurements needed for fire risk forecasting may be replaced by data from a mesoscale meteorological model. Additionally, the use of a meteorological model would allow a much more realistic spatial differentiation of the weather elements determining the fire risk level to be taken into account, instead of discrete point measurements. Meteorological data have been calculated using the Weather Research and Forecasting model (WRF). For the purpose of this study the WRF model is run in reanalysis mode, allowing all required meteorological data to be estimated on a 5-kilometer grid. The only parameter that cannot be directly calculated using WRF is the litter humidity, which has been estimated using an empirical formula developed by Sakowska (2007). The experiments are carried out for two selected years: 2010 and 2012. The
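
    The operational Polish regression equations are not reproduced in the record, so the sketch below only illustrates the structure described: a linearized regression mapping meteorological predictors and litter humidity onto a 0-3 fire risk level. All coefficients are invented.

```python
def fire_risk_level(temp_c, rel_hum, wind_ms, cloud_okta, rain_mm, litter_hum):
    """Toy linearized regression mapping predictors to a 0-3 fire risk level.
    Coefficients are illustrative only, not the official Polish equations."""
    score = (0.05 * temp_c - 0.03 * rel_hum - 0.04 * litter_hum
             + 0.02 * wind_ms - 0.05 * cloud_okta - 0.10 * rain_mm + 3.0)
    return max(0, min(3, round(score)))

# Hot, dry, windy afternoon with dry litter: high predicted risk.
print(fire_risk_level(temp_c=30, rel_hum=30, wind_ms=6, cloud_okta=1, rain_mm=0, litter_hum=12))
```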

  3. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    Directory of Open Access Journals (Sweden)

    Moiz Mumtaz

    2012-01-01

    Full Text Available Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose response assessment, and risk characterization. This process is increasingly adopting “in silico” tools such as physiologically based pharmacokinetic (PBPK models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application—health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The “human PBPK model toolkit” is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures.

  4. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    Science.gov (United States)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, which is an organized evidence base for assessing in-flight crew health risk. The database is a relational database accessible to many users. It quantifies the model inputs by ranking the data according to a Level of Evidence (LOE), based on the highest value of the data, and a quality of evidence (QOE) score that provides an assessment of the evidence base for each medical condition. The IMM evidence base has already been able to provide invaluable information for designers, and for other uses.

  5. Software for occupational health and safety risk analysis based on a fuzzy model.

    Science.gov (United States)

    Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan

    2012-01-01

    Risk and safety management are very important issues in healthcare systems. These are complex systems with many entities, hazards and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyzed different types of hazards in healthcare systems and introduced a new fuzzy model for evaluating and ranking hazards. Finally, we presented the developed software solution, based on the suggested fuzzy model, for evaluating and monitoring risk.

  6. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  7. Modelling and Simulating of Risk Behaviours in Virtual Environments Based on Multi-Agent and Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Linqin Cai

    2013-11-01

    Full Text Available Due to safety and ethical issues, traditional experimental approaches to modelling underground risk behaviours can be costly, dangerous and even impossible to realize. Based on multi-agent technology, a virtual coalmine platform for risk behaviour simulation is presented to model and simulate the human-machine-environment related risk factors in underground coalmines. To reveal mine workers' risk behaviours, a fuzzy emotional behaviour model is proposed to simulate underground miners' responding behaviours to potential hazardous events based on cognitive appraisal theories and fuzzy logic techniques. The proposed emotion model can generate more believable behaviours for virtual miners according to personalized emotion states, internal motivation needs and behaviour selection thresholds. Finally, typical accident cases of underground hazard spotting and locomotive transport were implemented. The behaviour believability of virtual miners was evaluated with a user assessment method. Experimental results show that the proposed models can create more realistic and reasonable behaviours in virtual coalmine environments, which can improve miners' risk awareness and further train miners' emergent decision-making ability when facing unexpected underground situations.

  8. Comparative performance of diabetes-specific and general population-based cardiovascular risk assessment models in people with diabetes mellitus.

    Science.gov (United States)

    Echouffo-Tcheugui, J-B; Kengne, A P

    2013-10-01

    Multivariable models for estimating cardiovascular disease (CVD) risk in people with diabetes comprise general population-based models and those from diabetic cohorts. Whether one set of models should receive preference is unclear. We evaluated the evidence on direct comparisons of the performance of general population vs diabetes-specific CVD risk models in people with diabetes. MEDLINE and EMBASE databases were searched up to March 2013. Two reviewers independently identified studies that compared the performance of general CVD models vs diabetes-specific ones in the same group of people with diabetes. Independent, dual data extraction on study design, risk models, outcomes, and measures of performance was conducted. Eleven articles reporting on 22 pairwise comparisons of a diabetes-specific model (UKPDS, ADVANCE and DCS risk models) to a general population model (three variants of the Framingham model, Prospective Cardiovascular Münster [PROCAM] score, CardioRisk Manager [CRM], Joint British Societies Coronary Risk Chart [JBSRC], Progetto Cuore algorithm and the CHD-Riskard algorithm) were eligible. Absolute differences in the C-statistic of diabetes-specific vs general population-based models varied from -0.13 to 0.09. Comparisons for other performance measures were uncommon. Outcome definitions were congruent with those applied during model development. In 14 comparisons, the UKPDS, ADVANCE or DCS diabetes-specific models were superior to the general population CVD risk models. Authors reported better C-statistics for models they developed. The limited existing evidence suggests a possible discriminatory advantage of diabetes-specific over general population-based models for CVD risk stratification in diabetes. More robust head-to-head comparisons are needed to confirm this trend and strengthen recommendations. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  9. Credit Risk Modeling

    DEFF Research Database (Denmark)

    Lando, David

    Credit risk is today one of the most intensely studied topics in quantitative finance. This book provides an introduction and overview for readers who seek an up-to-date reference to the central problems of the field and to the tools currently used to analyze them. The book is aimed at researchers...... and students in finance, at quantitative analysts in banks and other financial institutions, and at regulators interested in the modeling aspects of credit risk. David Lando considers the two broad approaches to credit risk analysis: that based on classical option pricing models on the one hand...

  10. Are Masking-Based Models of Risk Useful?

    Science.gov (United States)

    Gisiner, Robert C

    2016-01-01

    As our understanding of directly observable effects from anthropogenic sound exposure has improved, concern about "unobservable" effects such as stress and masking have received greater attention. Equal energy models of masking such as power spectrum models have the appeal of simplicity, but do they offer biologically realistic assessments of the risk of masking? Data relevant to masking such as critical ratios, critical bandwidths, temporal resolution, and directional resolution along with what is known about general mammalian antimasking mechanisms all argue for a much more complicated view of masking when making decisions about the risk of masking inherent in a given anthropogenic sound exposure scenario.

  11. Breast cancer risks and risk prediction models.

    Science.gov (United States)

    Engel, Christoph; Fischer, Christine

    2015-02-01

    BRCA1/2 mutation carriers have a considerably increased risk to develop breast and ovarian cancer. The personalized clinical management of carriers and other at-risk individuals depends on precise knowledge of the cancer risks. In this report, we give an overview of the present literature on empirical cancer risks, and we describe risk prediction models that are currently used for individual risk assessment in clinical practice. Cancer risks show large variability between studies. Breast cancer risks are at 40-87% for BRCA1 mutation carriers and 18-88% for BRCA2 mutation carriers. For ovarian cancer, the risk estimates are in the range of 22-65% for BRCA1 and 10-35% for BRCA2. The contralateral breast cancer risk is high (10-year risk after first cancer 27% for BRCA1 and 19% for BRCA2). Risk prediction models have been proposed to provide more individualized risk prediction, using additional knowledge on family history, mode of inheritance of major genes, and other genetic and non-genetic risk factors. User-friendly software tools have been developed that serve as basis for decision-making in family counseling units. In conclusion, further assessment of cancer risks and model validation is needed, ideally based on prospective cohort studies. To obtain such data, clinical management of carriers and other at-risk individuals should always be accompanied by standardized scientific documentation.

  12. Risk Level Based Management System: a control banding model for occupational health and safety risk management in a highly regulated environment

    Energy Technology Data Exchange (ETDEWEB)

    Zalk, D; Kamerzell, R; Paik, S; Kapp, J; Harrington, D; Swuste, P

    2009-05-27

    The Risk Level Based Management System (RLBMS) is an occupational risk management (ORM) model that focuses occupational safety, hygiene, and health (OSHH) resources on the highest risk procedures at work. This article demonstrates the model's simplicity through an implementation within a heavily regulated research institution. The model utilizes control banding strategies with a stratification of four risk levels (RLs) for many commonly performed maintenance and support activities, characterizing risk consistently for comparable tasks. RLBMS creates an auditable tracking of activities, maximizes OSHH professional field time, and standardizes documentation and controls commensurate with a given task's RL. Validation data for the RLs and their exposure control effectiveness are collected in a traditional quantitative collection regime for regulatory auditing. However, qualitative risk assessment methods are also used within this validation process. Participatory approaches are used throughout the RLBMS process. Workers are involved in all phases of building, maintaining, and improving this model. This worker participation also improves the implementation of established controls.

  13. Evaluation of three physiologically based pharmacokinetic (PBPK) modeling tools for emergency risk assessment after acute dichloromethane exposure

    NARCIS (Netherlands)

    Boerleider, R. Z.; Olie, J. D N; van Eijkeren, J. C H; Bos, P. M J; Hof, B. G H; de Vries, I.; Bessems, J. G M; Meulenbelt, J.; Hunault, C. C.

    2015-01-01

    Introduction: Physiologically based pharmacokinetic (PBPK) models may be useful in emergency risk assessment, after acute exposure to chemicals, such as dichloromethane (DCM). We evaluated the applicability of three PBPK models for human risk assessment following a single exposure to DCM: one model

  14. A point-based prediction model for cardiovascular risk in orthotopic liver transplantation: The CAR-OLT score.

    Science.gov (United States)

    VanWagner, Lisa B; Ning, Hongyan; Whitsett, Maureen; Levitsky, Josh; Uttal, Sarah; Wilkins, John T; Abecassis, Michael M; Ladner, Daniela P; Skaro, Anton I; Lloyd-Jones, Donald M

    2017-12-01

    Cardiovascular disease (CVD) complications are important causes of morbidity and mortality after orthotopic liver transplantation (OLT). There is currently no preoperative risk-assessment tool that allows physicians to estimate the risk for CVD events following OLT. We sought to develop a point-based prediction model (risk score) for CVD complications after OLT, the Cardiovascular Risk in Orthotopic Liver Transplantation risk score, among a cohort of 1,024 consecutive patients aged 18-75 years who underwent first OLT in a tertiary-care teaching hospital (2002-2011). The main outcome measures were major 1-year CVD complications, defined as death from a CVD cause or hospitalization for a major CVD event (myocardial infarction, revascularization, heart failure, atrial fibrillation, cardiac arrest, pulmonary embolism, and/or stroke). The bootstrap method yielded bias-corrected 95% confidence intervals for the regression coefficients of the final model. Among 1,024 first OLT recipients, major CVD complications occurred in 329 (32.1%). Variables selected for inclusion in the model (using model optimization strategies) included preoperative recipient age, sex, race, employment status, education status, history of hepatocellular carcinoma, diabetes, heart failure, atrial fibrillation, pulmonary or systemic hypertension, and respiratory failure. The discriminative performance of the point-based score (C statistic = 0.78, bias-corrected C statistic = 0.77) was superior to other published risk models for postoperative CVD morbidity and mortality, and it had appropriate calibration (Hosmer-Lemeshow P = 0.33). The point-based risk score can identify patients at risk for CVD complications after OLT surgery (available at www.carolt.us); this score may be useful for identification of candidates for further risk stratification or other management strategies to improve CVD outcomes after OLT. (Hepatology 2017;66:1968-1979). © 2017 by the American Association for the Study of Liver
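
    The published CAR-OLT point values are available at www.carolt.us and are not reproduced here; the sketch below only shows the mechanics of a point-based score of this kind: points are summed for the risk factors present and the total is mapped to a risk band. The point values, factor names and thresholds are invented placeholders.

```python
# Mechanics of a point-based preoperative risk score; the point values below are
# invented placeholders, not the published CAR-OLT weights.
POINTS = {
    "age_over_60": 3,
    "male_sex": 1,
    "diabetes": 2,
    "heart_failure": 4,
    "atrial_fibrillation": 3,
    "pulmonary_hypertension": 3,
}

def total_points(patient_factors):
    """Sum the points for the risk factors present in the patient."""
    return sum(POINTS[f] for f in patient_factors if f in POINTS)

def risk_band(points):
    """Map a point total onto a coarse risk band (thresholds are assumptions)."""
    if points >= 8:
        return "high"
    if points >= 4:
        return "intermediate"
    return "low"

score = total_points({"age_over_60", "diabetes", "atrial_fibrillation"})
print(score, risk_band(score))
```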

  15. A Risk-Based Interval Two-Stage Programming Model for Agricultural System Management under Uncertainty

    Directory of Open Access Journals (Sweden)

    Ye Xu

    2016-01-01

    Full Text Available Nonpoint source (NPS) pollution caused by agricultural activities is a main reason that water quality in a watershed deteriorates. Moreover, pollution control is accompanied by falling revenue for the agricultural system. How to design and generate a cost-effective and environmentally friendly agricultural production pattern is a critical issue for local managers. In this study, a risk-based interval two-stage programming model (RBITSP) was developed. Compared to the general ITSP model, the significant contribution of the RBITSP model is that it emphasizes the importance of financial risk under various probabilistic levels, rather than concentrating only on expected economic benefit, where risk is expressed as the probability of not meeting the target profit under each individual scenario realization. This effectively avoids the inaccuracy of solutions caused by a traditional expected-value objective function and generates a variety of solutions through adjusting weight coefficients, reflecting the trade-off between system economy and reliability. A case study of agricultural production management in the Tai Lake watershed was used to demonstrate the superiority of the proposed model. The obtained results could serve as a basis for designing land-structure adjustment patterns and farmland retirement schemes, and for balancing system benefit, system-failure risk, and water-body protection.

  16. Modelling domestic stock energy use and heat-related health risk : a GIS-based bottom-up modelling approach

    Energy Technology Data Exchange (ETDEWEB)

    Mavrogianni, A.; Davies, M. [Univ. College London, London (United Kingdom). Bartlett School of Graduate Studies; Chalabi, Z.; Wilkinson, P. [London School of Hygiene and Tropical Medecine, London (United Kingdom); Kolokotroni, M. [Brunel Univ., London (United Kingdom). School of Engineering Design

    2009-07-01

    Approximately 8 per cent of the carbon dioxide (CO2) emissions produced in the United Kingdom are produced in London, one of the fastest growing cities worldwide. Based on the projected rates of population and economic growth, a 15 per cent increase in emissions is predicted. In addition to the national target to cut emissions by 80 per cent by 2050, the Mayor of London Climate Change Action Plan set a target to reduce London's CO2 emissions by 60 per cent by 2025. Significant carbon savings can be achieved in the building sector, particularly since 38 per cent of the total delivered energy in London is associated with domestic energy use. This paper demonstrated a systematic approach towards exploring the impact of urban built form and the combined effect of climate change and the urban heat island (UHI) phenomenon on the levels of domestic energy consumption and heat-related health risk in London. It presented work in progress on the development of a GIS-based energy consumption model and heat vulnerability index of the Greater London Area domestic stock. Comparison of the model output for 10 case study areas with top-down energy statistics revealed that the model successfully ranks areas based on their domestic space heating demand. The health module can be used to determine environments prone to higher risk of heat stress by investigating urban texture factors. A newly developed epidemiological model will be fed into the health module to examine the influence of local urban built form characteristics on the risk of heat-related mortality. The epidemiological model is based on multi-variable analysis of deaths during heat wave and non-heat wave days. 29 refs., 1 tab., 7 figs.

  17. Risk assessment and model for community-based construction ...

    African Journals Online (AJOL)

    It, therefore, becomes necessary to systematically manage uncertainty in community-based construction in order to increase the likelihood of meeting project objectives using necessary risk management strategies. Risk management, which is an iterative process due to the dynamic nature of many risks, follows three main ...

  18. Model of MSD Risk Assessment at Workplace

    OpenAIRE

    K. Sekulová; M. Šimon

    2015-01-01

    This article focuses on a risk assessment model for upper-extremity musculoskeletal disorders (MSD) at the workplace. The model uses risk factors that are responsible for damage to the musculoskeletal system. Based on statistical calculations, the model is able to define what risk of MSD threatens workers who are exposed to the risk factors. The model is also able to say how the MSD risk would decrease if these risk factors were eliminated.

  19. Environmental risk assessment of selected organic chemicals based on TOC test and QSAR estimation models.

    Science.gov (United States)

    Chi, Yulang; Zhang, Huanteng; Huang, Qiansheng; Lin, Yi; Ye, Guozhu; Zhu, Huimin; Dong, Sijun

    2018-02-01

    Environmental risks of organic chemicals are largely determined by their persistence, bioaccumulation, and toxicity (PBT) and by their physicochemical properties. Major regulations in different countries and regions identify chemicals according to their bioconcentration factor (BCF) and octanol-water partition coefficient (Kow), which frequently displays a substantial correlation with the sediment sorption coefficient (Koc). Half-life or degradability is crucial for the persistence evaluation of chemicals. Quantitative structure activity relationship (QSAR) estimation models are indispensable for predicting environmental fate and health effects in the absence of field- or laboratory-based data. In this study, 39 chemicals of high concern were chosen for half-life testing based on total organic carbon (TOC) degradation, and two widely accepted and highly used QSAR estimation models (i.e., EPI Suite and PBT Profiler) were adopted for environmental risk evaluation. The experimental results and the estimates from the two models were compared on the basis of water solubility, Kow, Koc, BCF and half-life. Environmental risk assessment of the selected compounds was achieved by combining the experimental data and the estimation models. It was concluded that both EPI Suite and PBT Profiler were fairly accurate in estimating the physicochemical properties and degradation half-lives for water, soil, and sediment. However, the experimental and estimated half-lives were still not fully consistent. This suggests deficiencies of the prediction models in some respects, and the necessity of combining experimental data and predicted results for the evaluation of the environmental fate and risks of pollutants. Copyright © 2016. Published by Elsevier B.V.
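
    Half-life testing based on TOC degradation typically assumes first-order decay, under which the half-life is ln(2)/k with k fitted from log-transformed TOC measurements. The sketch below shows that calculation with invented measurements; it is not the study's data or protocol.

```python
import math

# Time (days) and measured TOC remaining (%) for a single chemical (invented values).
t = [0, 2, 5, 10, 20]
toc = [100, 86, 70, 48, 24]

# Least-squares fit of ln(TOC) = ln(TOC0) - k*t, assuming first-order degradation.
x_mean = sum(t) / len(t)
y = [math.log(v) for v in toc]
y_mean = sum(y) / len(y)
num = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(t, y))
den = sum((xi - x_mean) ** 2 for xi in t)
k = -num / den                      # first-order rate constant (1/day)

half_life = math.log(2) / k
print(f"k = {k:.3f} 1/day, half-life = {half_life:.1f} days")
```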

  20. Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island

    Science.gov (United States)

    Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark

    2015-04-01

    Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas combined with climate change and poor governance can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They, therefore, require holistic approaches to help understand their complexity in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Being located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualize socio-technical systems with an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and

  1. Recognition of risk situations based on endoscopic instrument tracking and knowledge based situation modeling

    Science.gov (United States)

    Speidel, Stefanie; Sudra, Gunther; Senemaud, Julien; Drentschew, Maximilian; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger

    2008-03-01

    Minimally invasive surgery has gained significantly in importance over the last decade due to its numerous advantages for the patient. The surgeon has to adopt special operating techniques and deal with difficulties such as complex hand-eye coordination, a limited field of view, and restricted mobility. To alleviate these constraints, we propose to enhance the surgeon's capabilities by providing context-aware assistance using augmented reality (AR) techniques. In order to generate context-aware assistance, it is necessary to recognize the current state of the intervention using intraoperatively gained sensor data and a model of the surgical intervention. In this paper we present the recognition of risk situations: the system warns the surgeon if an instrument gets too close to a risk structure. The context-aware assistance system starts with an image-based analysis to retrieve information from the endoscopic images. This information is classified and a semantic description is generated. The description is used to recognize the current state and launch an appropriate AR visualization. In detail, we present an automatic vision-based instrument tracking method to obtain the positions of the instruments. Situation recognition is performed using a knowledge representation based on a description logic system. Two augmented reality visualization programs are realized to warn the surgeon if a risk situation occurs.
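
    As a minimal sketch of the proximity check behind such a warning (not the authors' implementation), the tracked instrument tip can be compared against a segmented risk structure and a warning raised below an assumed distance threshold; all positions and the threshold below are hypothetical.

import numpy as np

def min_distance(tip, risk_structure_points):
    """Smallest Euclidean distance between an instrument tip and a risk structure
    given as a point cloud (e.g., a segmented vessel surface)."""
    d = np.linalg.norm(risk_structure_points - tip, axis=1)
    return d.min()

# Hypothetical tracked tip position (mm) and a hypothetical segmented risk structure.
tip = np.array([12.0, 40.5, 88.2])
structure = np.random.default_rng(0).uniform(0.0, 100.0, size=(500, 3))

WARN_MM = 5.0   # assumed warning threshold
if min_distance(tip, structure) < WARN_MM:
    print("risk situation: instrument close to risk structure")
else:
    print("no warning")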

  2. A Simple Model to Rank Shellfish Farming Areas Based on the Risk of Disease Introduction and Spread.

    Science.gov (United States)

    Thrush, M A; Pearce, F M; Gubbins, M J; Oidtmann, B C; Peeler, E J

    2017-08-01

    The European Union Council Directive 2006/88/EC requires that risk-based surveillance (RBS) for listed aquatic animal diseases is applied to all aquaculture production businesses. The principle behind this is the efficient use of resources directed towards high-risk farm categories, animal types and geographic areas. To achieve this requirement, fish and shellfish farms must be ranked according to their risk of disease introduction and spread. We present a method to risk rank shellfish farming areas based on the risk of disease introduction and spread and demonstrate how the approach was applied in 45 shellfish farming areas in England and Wales. Ten parameters were used to inform the risk model, which were grouped into four risk themes based on related pathways for transmission of pathogens: (i) live animal movement, (ii) transmission via water, (iii) short distance mechanical spread (birds) and (iv) long distance mechanical spread (vessels). Weights (informed by expert knowledge) were applied both to individual parameters and to risk themes for introduction and spread to reflect their relative importance. A spreadsheet model was developed to determine quantitative scores for the risk of pathogen introduction and risk of pathogen spread for each shellfish farming area. These scores were used to independently rank areas for risk of introduction and for risk of spread. Thresholds were set to establish risk categories (low, medium and high) for introduction and spread based on risk scores. Risk categories for introduction and spread for each area were combined to provide overall risk categories to inform a risk-based surveillance programme directed at the area level. Applying the combined risk category designation framework for risk of introduction and spread suggested by European Commission guidance for risk-based surveillance, 4, 10 and 31 areas were classified as high, medium and low risk, respectively. © 2016 Crown copyright.
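
    A minimal sketch of the weighted-score ranking described above is given below; the parameters, weights, scores, and category thresholds are illustrative placeholders rather than the values used for the 45 English and Welsh areas.

# Weighted introduction-risk score for one shellfish farming area (illustrative only).
themes = {   # theme -> (theme weight, {parameter: (parameter weight, score 0-3)})
    "live_animal_movement": (0.4, {"imports_from_infected_areas": (0.6, 3),
                                   "movements_between_areas":     (0.4, 1)}),
    "water_transmission":   (0.3, {"hydrographic_connection":     (1.0, 2)}),
    "birds":                (0.1, {"bird_density":                (1.0, 1)}),
    "vessels":              (0.2, {"vessel_traffic":              (1.0, 2)}),
}

introduction_score = sum(
    theme_weight * sum(p_weight * score for p_weight, score in params.values())
    for theme_weight, params in themes.values()
)

# Assumed thresholds for the low/medium/high categories.
category = "high" if introduction_score >= 2.0 else "medium" if introduction_score >= 1.0 else "low"
print(f"introduction risk score: {introduction_score:.2f} -> {category}")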

  3. Effect of risk-based payment model on caries inequalities in preschool children assessed by geo-mapping.

    Science.gov (United States)

    Holmén, Anders; Strömberg, Ulf; Håkansson, Gunnel; Twetman, Svante

    2018-01-05

    To describe, with the aid of geo-mapping, the effects of a risk-based capitation model linked to caries-preventive guidelines on the polarization of caries in preschool children living in the Halland region of Sweden. The new capitation model was implemented in 2013: more money was allocated to Public Dental Clinics surrounded by administrative parishes inhabited by children with increased caries risk, while reduced capitation was allocated to clinics with a low burden of high-risk children. Regional geo-maps of caries risk, based on caries prevalence, level of education, and the families' purchasing power, were produced for 3-6-year-old children in 2010 (n = 10,583) and 2016 (n = 7574). Children newly migrated to the region (n = 344 in 2010 and n = 522 in 2016) were analyzed separately. A regional caries polarization index was calculated as the ratio between the maximum and minimum estimates of caries frequency at parish level, based on a Bayesian hierarchical mapping model. Overall, the total caries prevalence (dmfs > 0) remained unchanged from 2010 (10.6%) to 2016 (10.5%). However, the polarization index decreased from 7.0 in 2010 to 5.6 in 2016. Newly arrived children born outside Sweden had around four times higher caries prevalence than their Swedish-born peers. A risk-based capitation model could thus reduce socio-economic inequalities in dental caries among preschool children living in Sweden. Although updated evidence-based caries-preventive guidelines were released, the total prevalence of caries at the dentin surface level was unaffected 4 years after the implementation.

  4. Individual-based model for radiation risk assessment

    Science.gov (United States)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions to the statistical and dynamic characteristics of an organism's critical system; the dynamics of the latter are calculated with a separate mathematical model. This approach is applied to describe the effects of low-level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on the mortality dynamics of mice in the absence of radiation, are utilized. The life span probability and life span shortening predicted by the model agree with the corresponding experimental data. Modeling results show the importance of accounting for the variability of the individual radiosensitivity of critical system cells when estimating radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the after-effects of the Chernobyl catastrophe. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined using the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  5. Korean risk assessment model for breast cancer risk prediction.

    Science.gov (United States)

    Park, Boyoung; Ma, Seung Hyun; Shin, Aesun; Chang, Myung-Chul; Choi, Ji-Yeob; Kim, Sungwan; Han, Wonshik; Noh, Dong-Young; Ahn, Sei-Hyun; Kang, Daehee; Yoo, Keun-Young; Park, Sue K

    2013-01-01

    We evaluated the performance of the Gail model for a Korean population and developed a Korean breast cancer risk assessment tool (KoBCRAT), based upon the equations developed for the Gail model, for predicting breast cancer risk. Using 3,789 sets of cases and controls, risk factors for breast cancer among Koreans were identified. Individual probabilities were projected using Gail's equations and Korean hazard data. We compared the 5-year and lifetime risks produced using the modified Gail model, which applied Korean incidence and mortality data and the parameter estimators from the original Gail model, with those produced using the KoBCRAT. We validated the KoBCRAT based on the expected/observed breast cancer incidence and the area under the curve (AUC) using two Korean cohorts: the Korean Multicenter Cancer Cohort (KMCC) and the National Cancer Center (NCC) cohort. The major risk factors under the age of 50 were family history, age at menarche, age at first full-term pregnancy, menopausal status, breastfeeding duration, oral contraceptive usage, and exercise, while those at and over the age of 50 were family history, age at menarche, age at menopause, pregnancy experience, body mass index, oral contraceptive usage, and exercise. The modified Gail model produced lower 5-year risks for the cases than for the controls (p = 0.017), while the KoBCRAT produced higher 5-year and lifetime risks for the cases than for the controls, suggesting that the KoBCRAT is the more appropriate risk assessment tool for Korean women, especially urban women.
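
    As a simplified illustration of how a Gail-type tool projects an individualized 5-year absolute risk from a relative risk and baseline hazards while allowing for competing mortality, consider the discrete-time sketch below. The hazards and the relative risk are invented for illustration, and the attributable-risk adjustment used in the actual Gail equations is omitted.

import numpy as np

# Hypothetical age-specific baseline breast cancer hazards and competing
# mortality hazards for five one-year age intervals (per year).
h_bc  = np.array([0.0018, 0.0019, 0.0020, 0.0021, 0.0022])
h_oth = np.array([0.0040, 0.0043, 0.0046, 0.0050, 0.0054])
rr = 1.8   # individual relative risk from the risk-factor model

# Discrete 5-year absolute risk with competing mortality:
#   sum_t RR*h_bc(t) * exp(-sum_{s<=t} (RR*h_bc(s) + h_oth(s)))
cum = np.cumsum(rr * h_bc + h_oth)
abs_risk = np.sum(rr * h_bc * np.exp(-cum))
print(f"5-year absolute risk: {abs_risk:.3%}")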

  6. Evaluation model for safety capacity of chemical industrial park based on acceptable regional risk

    Institute of Scientific and Technical Information of China (English)

    Guohua Chen; Shukun Wang; Xiaoqun Tan

    2015-01-01

    The paper defines the Safety Capacity of Chemical Industrial Park (SCCIP) from the perspective of acceptable regional risk. To explore an evaluation model for the SCCIP, a method based on quantitative risk assessment was adopted to evaluate transport risk and to confirm a reasonable safety transport capacity for a chemical industrial park; by combining this with the safety storage capacity, a SCCIP evaluation model was put forward. The SCCIP is determined by the smaller of the maximum safety storage capacity and the maximum safety transport capacity; otherwise, the regional risk of the park would exceed the acceptable level. The developed method was applied to a chemical industrial park in Guangdong province to obtain the maximum safety transport capacity and the SCCIP. The results can be applied effectively to the regional risk control of the park.

  7. A Nonparametric Operational Risk Modeling Approach Based on Cornish-Fisher Expansion

    Directory of Open Access Journals (Sweden)

    Xiaoqian Zhu

    2014-01-01

    Full Text Available It is generally accepted that the choice of severity distribution in the loss distribution approach has a significant effect on the operational risk capital estimate. However, the commonly used parametric approaches with predefined distribution assumptions might not be able to fit the severity distribution accurately. The objective of this paper is to propose a nonparametric operational risk modeling approach based on the Cornish-Fisher expansion. In this approach, samples of the severity are generated by the Cornish-Fisher expansion and then used in a Monte Carlo simulation to sketch the annual operational loss distribution. In the experiment, the proposed approach is employed to calculate the operational risk capital charge for the overall Chinese banking sector. The experiment dataset is, as far as we know, the most comprehensive operational risk dataset in China. The results show that the proposed approach is able to use the information in higher-order moments and may be more effective and stable than the commonly used parametric approaches.
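
    The core of the approach, generating severities from a Cornish-Fisher quantile expansion of the first four moments and compounding them with a frequency distribution in a Monte Carlo loop, can be sketched as follows; the moment values and the Poisson frequency are assumptions for illustration, not estimates from the Chinese loss data.

import numpy as np

rng = np.random.default_rng(42)

# Assumed moments of the (unknown) severity distribution, e.g. estimated from loss data.
mu, sigma, skew, ex_kurt = 1.0e5, 8.0e4, 1.5, 4.0

def cornish_fisher_sample(n):
    """Severity samples from a Cornish-Fisher quantile expansion of the first four moments."""
    z = rng.standard_normal(n)
    w = (z
         + (z**2 - 1) * skew / 6
         + (z**3 - 3 * z) * ex_kurt / 24
         - (2 * z**3 - 5 * z) * skew**2 / 36)
    return np.maximum(mu + sigma * w, 0.0)   # losses cannot be negative

# Monte Carlo of the annual aggregate loss: Poisson frequency + Cornish-Fisher severities.
lam, years = 12, 100_000
annual = np.array([cornish_fisher_sample(rng.poisson(lam)).sum() for _ in range(years)])
capital = np.percentile(annual, 99.9)        # operational risk capital at the 99.9% level
print(f"99.9% annual loss quantile: {capital:,.0f}")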

  8. Study of a risk-based piping inspection guideline system.

    Science.gov (United States)

    Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung

    2007-02-01

    A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts: the building of a risk-based inspection model for piping and the construction of a risk-based piping inspection guideline model. Field visits to the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model was built in accordance with international standards and local government regulations, and the rational unified process was applied to reduce discrepancies in the development of the models. The models were designed to analyze damage factors, damage models, and potential damage positions of piping in petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections, enabling effective prediction of potential piping risks and enhancing the degree of safety that petrochemical plants can be expected to achieve in their operations. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risk resulted from a small number of pipelines.

  9. Modeling Research Project Risks with Fuzzy Maps

    Science.gov (United States)

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was developed especially for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…

  10. Fuzzy audit risk modeling algorithm

    Directory of Open Access Journals (Sweden)

    Zohreh Hajihaa

    2011-07-01

    Full Text Available Fuzzy logic has created suitable mathematics for making decisions in uncertain environments, including professional judgments. One such situation is the assessment of auditee risks. During recent years, risk-based audit (RBA) has been regarded as one of the main tools to fight fraud. The main issue in RBA is to determine the overall audit risk an auditor accepts, which impacts the efficiency of an audit. The primary objective of this research is to redesign the audit risk model (ARM) proposed by auditing standards. The proposed model of this paper uses fuzzy inference systems (FIS) based on the judgments of audit experts. The implementation of the proposed fuzzy technique uses triangular fuzzy numbers to express the inputs, and the Mamdani method along with the center-of-gravity method is incorporated for defuzzification. The proposed model uses three FISs for audit, inherent, and control risks, with five levels of linguistic variables for the outputs. The FISs include 25, 25, and 81 if-then rules, respectively, all of which were confirmed by Iranian audit experts.
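
    A minimal, self-contained sketch of the Mamdani machinery named above (triangular memberships, rule aggregation, centroid defuzzification) is shown below, using one invented input and two invented rules rather than the paper's 25/25/81 rule bases.

import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with vertices a < b < c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Output universe for "audit risk" (0-100) and two output fuzzy sets.
y = np.linspace(0.0, 100.0, 1001)
risk_low, risk_high = trimf(y, 0.0, 25.0, 50.0), trimf(y, 50.0, 75.0, 100.0)

# One crisp input: expert judgement of internal-control weakness (0-10 scale, hypothetical).
weakness = 6.5
mu_low  = trimf(weakness, 0.0, 2.0, 5.0)    # membership in "weakness is low"
mu_high = trimf(weakness, 5.0, 8.0, 10.0)   # membership in "weakness is high"

# Two Mamdani rules: IF weakness low THEN risk low; IF weakness high THEN risk high.
aggregated = np.maximum(np.minimum(mu_low, risk_low), np.minimum(mu_high, risk_high))

# Center-of-gravity (centroid) defuzzification.
crisp_risk = np.sum(y * aggregated) / np.sum(aggregated)
print(f"defuzzified audit risk: {crisp_risk:.1f} / 100")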

  11. Risk-based technical specifications: Development and application of an approach to the generation of a plant specific real-time risk model

    International Nuclear Information System (INIS)

    Puglia, B.; Gallagher, D.; Amico, P.; Atefi, B.

    1992-10-01

    This report describes a process developed to convert an existing PRA into a model amenable to real-time, risk-based technical specification calculations. In earlier studies (culminating in NUREG/CR-5742), several risk-based approaches to technical specifications were evaluated. A real-time approach using a plant-specific PRA capable of modeling plant configurations as they change was identified as the most comprehensive approach to controlling plant risk. A master fault tree logic model representative of all of the core damage sequences was developed. Portions of the system fault trees were modularized, and supercomponents comprising component failures with similar effects were developed to reduce the size of the model and the quantification times. Modifications to the master fault tree logic were made to properly model the effect of maintenance and recovery actions. Fault trees representing several actuation systems not modeled in detail in the existing PRA were added to the master fault tree logic. This process was applied to the Surry NUREG-1150 Level 1 PRA. The master logic model was confirmed. The model was then used to evaluate the frequencies associated with several plant configurations using the IRRAS code. For all cases analyzed, the computational time was less than three minutes. This document, Volume 2, contains Appendices A, B, and C. These provide, respectively: the Surry Technical Specifications Model Database, the Surry Technical Specifications Model, and a list of supercomponents used in the Surry Technical Specifications Model.
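
    The configuration-dependent quantification such a model supports can be illustrated generically: given minimal cut sets and basic-event probabilities reflecting the current plant line-up (a component out for maintenance is set to probability 1), the top-event probability is recomputed on demand. The cut sets and probabilities below are invented and the sketch is not the IRRAS calculation itself.

from itertools import count  # (unused placeholder removed below; only stdlib needed)

# Hypothetical basic-event failure probabilities (per demand) for the current configuration.
p = {"PUMP_A": 1.0e-3, "PUMP_B": 1.0e-3, "VALVE_C": 5.0e-4, "DG_1": 2.0e-2}

# Minimal cut sets of an illustrative master fault tree.
cut_sets = [("PUMP_A", "PUMP_B"), ("PUMP_A", "VALVE_C"), ("DG_1",)]

def cut_set_prob(cs):
    prob = 1.0
    for event in cs:
        prob *= p[event]
    return prob

# Rare-event approximation and min-cut upper bound for the top event.
rare = sum(cut_set_prob(cs) for cs in cut_sets)
upper = 1.0
for cs in cut_sets:
    upper *= (1.0 - cut_set_prob(cs))
upper = 1.0 - upper
print(f"top event (rare-event approx.): {rare:.3e}, min-cut upper bound: {upper:.3e}")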

  12. P2P Lending Risk Contagion Analysis Based on a Complex Network Model

    Directory of Open Access Journals (Sweden)

    Qi Wei

    2016-01-01

    Full Text Available This paper analyzes two major channels of P2P lending risk contagion in China: direct risk contagion between platforms and indirect risk contagion with other financial organizations as the contagion medium. Based on this analysis, the current study constructs a complex network model of P2P lending risk contagion in China and performs dynamic simulations in order to analyze the general characteristics of direct risk contagion among China's online P2P lending platforms. It then examines how the contagion characteristics change when other financial organizations act as the contagion medium and when there is significant information asymmetry in Internet lending. The results indicate that, compared with direct risk contagion among platforms, both financial organizations acting as the contagion medium and information asymmetry magnify the effect of risk contagion. It is also found that the superposition of media effects and information asymmetry is even more likely to magnify the risk contagion effect.
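
    A stripped-down sketch of direct contagion on a platform network is given below: a random exposure network, one initial default, and a simple neighbour-fraction threshold rule. The network, threshold, and seed are illustrative assumptions and the rule is far simpler than the paper's dynamics.

import random
import networkx as nx

random.seed(1)
# Hypothetical inter-platform exposure network: 200 P2P platforms with random links.
G = nx.erdos_renyi_graph(n=200, p=0.03, seed=1)

failed = {random.randrange(200)}          # one platform defaults initially
threshold = 0.25                          # a platform fails if >25% of its neighbours have failed

# Simple threshold contagion: iterate until no new platform fails.
changed = True
while changed:
    changed = False
    for node in G.nodes:
        if node in failed:
            continue
        nbrs = list(G.neighbors(node))
        if nbrs and sum(n in failed for n in nbrs) / len(nbrs) > threshold:
            failed.add(node)
            changed = True

print(f"{len(failed)} of {G.number_of_nodes()} platforms affected")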

  13. Adequacy of relative and absolute risk models for lifetime risk estimate of radiation-induced cancer

    International Nuclear Information System (INIS)

    McBride, M.; Coldman, A.J.

    1988-03-01

    This report examines the applicability of the relative (multiplicative) and absolute (additive) models in predicting lifetime risk of radiation-induced cancer. A review of the epidemiologic literature, and a discussion of the mathematical models of carcinogenesis and their relationship to these models of lifetime risk, are included. Based on the available data, the relative risk model for the estimation of lifetime risk is preferred for non-sex-specific epithelial tumours. However, because of lack of knowledge concerning other determinants of radiation risk and of background incidence rates, considerable uncertainty in modelling lifetime risk still exists. Therefore, it is essential that follow-up of exposed cohorts be continued so that population-based estimates of lifetime risk are available
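
    The distinction between the two projection models can be written compactly; the formulation below is a standard textbook form rather than a quotation from the report, with the baseline rate, dose, and excess-risk coefficients as labelled in the comments.

% Relative (multiplicative) vs. absolute (additive) lifetime-risk projection,
% with \lambda_0(a) the baseline age-specific rate, D the accumulated dose,
% and \beta_m, \beta_a the excess-risk coefficients.
\begin{align*}
  \text{relative risk model:} \quad \lambda(a, D) &= \lambda_0(a)\,[1 + \beta_m D],\\
  \text{absolute risk model:} \quad \lambda(a, D) &= \lambda_0(a) + \beta_a D.
\end{align*}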

  14. Use of an ecologically relevant modelling approach to improve remote sensing-based schistosomiasis risk profiling

    Directory of Open Access Journals (Sweden)

    Yvonne Walz

    2015-11-01

    Full Text Available Schistosomiasis is a widespread water-based disease that puts close to 800 million people at risk of infection, with more than 250 million infected, mainly in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and by the frequency, duration, and extent of human contact with infested water sources. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. However, such risk profiling inherits a conceptual drawback if school-based disease prevalence data are related directly to remote sensing measurements extracted at the location of the school, because disease transmission usually does not occur at the school itself. We therefore took the local environment around the schools into account by explicitly linking ecologically relevant environmental information on potential disease transmission sites to survey measurements of disease prevalence. Our models were validated at two sites with different landscapes in Côte d'Ivoire using high- and moderate-resolution remote sensing data, based on random forest and partial least squares regression. We found that the ecologically relevant modelling approach explained up to 70% of the variation in Schistosoma infection prevalence and performed better than a purely pixel-based modelling approach. Furthermore, our study showed that model performance increased as a function of enlarging the school catchment area, confirming the hypothesis that suitable environments for schistosomiasis transmission rarely occur at the location of the survey measurements.
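
    One of the two regression techniques named above, random forest, can be sketched with synthetic per-school covariates aggregated over a catchment buffer; the feature names, their relation to prevalence, and the sample size are invented for illustration only.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)

# Hypothetical per-school features aggregated over a catchment buffer:
# [mean NDVI, distance to open water (m), mean land-surface temperature (deg C)].
X = rng.uniform([0.1, 0.0, 20.0], [0.9, 5000.0, 40.0], size=(120, 3))
# Hypothetical Schistosoma infection prevalence per surveyed school (0-1).
y = 0.6 * X[:, 0] - 0.00005 * X[:, 1] + 0.01 * (X[:, 2] - 20.0) + rng.normal(0, 0.05, 120)
y = np.clip(y, 0.0, 1.0)

model = RandomForestRegressor(n_estimators=300, random_state=0)
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.2f}")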

  15. An example of population-level risk assessments for small mammals using individual-based population models.

    Science.gov (United States)

    Schmitt, Walter; Auteri, Domenica; Bastiansen, Finn; Ebeling, Markus; Liu, Chun; Luttik, Robert; Mastitsky, Sergey; Nacci, Diane; Topping, Chris; Wang, Magnus

    2016-01-01

    This article presents a case study demonstrating the application of 3 individual-based, spatially explicit population models (IBMs, also known as agent-based models) in ecological risk assessments to predict the long-term effects of a pesticide on populations of small mammals. The 3 IBMs each used a hypothetical fungicide (FungicideX) in a different scenario: spraying in cereals (common vole, Microtus arvalis), spraying in orchards (field vole, Microtus agrestis), and cereal seed treatment (wood mouse, Apodemus sylvaticus). Each scenario used existing model landscapes, which differed greatly in size and structural complexity. The toxicological profile of FungicideX was defined so that the deterministic long-term first-tier risk assessment would result in high risk to small mammals, thus providing the opportunity to use the IBMs for risk assessment refinement (i.e., higher-tier risk assessment). Despite differing internal model designs and scenarios, results indicated in all 3 cases low population sensitivity unless FungicideX was applied at very high (×10) rates. Recovery from local population impacts was generally fast. Only when patch extinctions occurred in simulations of intentionally high acute toxic effects were recovery periods, then determined by recolonization, of any concern. Conclusions include recommendations for the most important input considerations, including the selection of exposure levels, duration of simulations, statistically robust numbers of replicates, and endpoints to report. However, further investigation and agreement are needed to develop recommendations for landscape attributes such as size, structure, and crop rotation to define appropriate regulatory risk assessment scenarios. Overall, the application of IBMs provides multiple advantages for higher-tier ecological risk assessments for small mammals, including consistent and transparent direct links to specific protection goals and the consideration of more realistic scenarios. © 2015 SETAC.

  16. A Bayesian network model for predicting type 2 diabetes risk based on electronic health records

    Science.gov (United States)

    Xie, Jiang; Liu, Yan; Zeng, Xu; Zhang, Wu; Mei, Zhen

    2017-07-01

    An extensive, in-depth study of diabetes risk factors (DBRF) is of crucial importance for preventing (or reducing) the chance of suffering from type 2 diabetes (T2D). The accumulation of electronic health records (EHRs) makes it possible to model nonlinear relationships between risk factors and diabetes. However, current DBRF research focuses mainly on qualitative analyses, and the inconsistency of physical examination items means that risk factors are easily lost, which motivates the study of novel machine learning approaches for risk model development. In this paper, we use Bayesian networks (BNs) to analyze the relationship between physical examination information and T2D, and to quantify the link between risk factors and T2D. Furthermore, building on the quantitative analyses of DBRF, we adopt EHRs and propose a machine learning approach based on BNs to predict the risk of T2D. The experiments demonstrate that our approach can lead to better predictive performance than the classical risk model.
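
    To make the Bayesian network idea concrete, the tiny two-parent network below evaluates P(T2D | BMI) by marginalising over an unobserved glucose node. The structure, states, and conditional probabilities are illustrative assumptions, not values learned from EHR data.

# A minimal two-parent Bayesian network P(T2D | BMI, FPG) evaluated by enumeration.
p_bmi = {"normal": 0.6, "high": 0.4}                      # P(BMI)
p_fpg = {"normal": 0.7, "impaired": 0.3}                  # P(fasting plasma glucose)
p_t2d = {                                                  # P(T2D = yes | BMI, FPG)
    ("normal", "normal"): 0.03,
    ("normal", "impaired"): 0.15,
    ("high", "normal"): 0.08,
    ("high", "impaired"): 0.35,
}

def risk_given_bmi(bmi):
    """P(T2D = yes | BMI = bmi), marginalising over the unobserved FPG node."""
    return sum(p_fpg[f] * p_t2d[(bmi, f)] for f in p_fpg)

print(f"P(T2D | BMI high)   = {risk_given_bmi('high'):.3f}")
print(f"P(T2D | BMI normal) = {risk_given_bmi('normal'):.3f}")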

  17. Combined prediction model for supply risk in nuclear power equipment manufacturing industry based on support vector machine and decision tree

    International Nuclear Information System (INIS)

    Shi Chunsheng; Meng Dapeng

    2011-01-01

    A prediction index for supply risk is developed based on factor identification in the nuclear power equipment manufacturing industry. A supply risk prediction model is then established using support vector machines and decision trees, based on an investigation of 3 important nuclear power equipment manufacturing enterprises and 60 suppliers. A final case study demonstrates that the combined model is better than either single prediction model and confirms the feasibility and reliability of the approach, which provides a method to evaluate suppliers and measure supply risk. (authors)
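
    A generic sketch of combining the two classifiers is shown below using a soft-voting ensemble on synthetic supplier data; the features, sample size, and the voting scheme are assumptions for illustration, not the paper's combination method or data.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Hypothetical supplier data: each row is one supplier described by indicators
# such as on-time delivery rate, quality-audit score, financial health, etc.
X, y = make_classification(n_samples=60, n_features=8, n_informative=5,
                           random_state=0)   # y: 1 = high supply risk

combined = VotingClassifier(
    estimators=[("svm", SVC(probability=True, random_state=0)),
                ("tree", DecisionTreeClassifier(max_depth=4, random_state=0))],
    voting="soft")

for name, clf in [("SVM only", SVC(random_state=0)),
                  ("tree only", DecisionTreeClassifier(max_depth=4, random_state=0)),
                  ("combined", combined)]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name:9s} accuracy: {acc:.2f}")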

  18. Multilevel joint competing risk models

    Science.gov (United States)

    Karunarathna, G. H. S.; Sooriyarachchi, M. R.

    2017-09-01

    Joint modeling approaches are often needed when competing-risk time-to-event outcomes and count outcomes arise together in biomedical and epidemiological studies in the presence of cluster effects. Hospital length of stay (LOS) is a widely used outcome measure of hospital utilization, as it provides a benchmark for multiple terminations such as discharge, transfer, death, and patients who have not completed the event of interest by the end of the follow-up period (censored) during hospitalization. Competing risk models provide a method of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple events. In this study, the concept of joint modeling has been applied to dengue epidemiology data from Sri Lanka, 2006-2008, to assess the relationship between the different LOS outcomes and the platelet count of dengue patients, with a district-level cluster effect. Two key approaches were applied to build the joint scenario. In the first approach, each competing risk is modeled separately using a binary logistic model, treating all other events as censored, under a multilevel discrete time-to-event model, while platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results than fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).

  19. Risk of the Maritime Supply Chain System Based on Interpretative Structural Model

    Directory of Open Access Journals (Sweden)

    Jiang He

    2017-11-01

    Full Text Available Maritime transport is the most important mode of transport in international trade, but the maritime supply chain faces many risks. At present, most research on maritime supply chain risk focuses on risk identification and risk management and rarely carries out quantitative analysis of the logical structure of the individual influencing factors. This paper uses the interpretative structural model (ISM) to analyze the maritime supply chain risk system. On the basis of a comprehensive literature analysis and expert opinion, 16 factors of the maritime supply chain risk system are put forward. The interpretative structural model is used to construct the maritime supply chain risk system, and the model is then optimized. The model analyzes the structure of the maritime supply chain risk system and its formation process, provides a scientific basis for controlling maritime supply chain risk, and puts forward corresponding suggestions for the prevention and control of maritime supply chain risk.
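
    The mechanics of an ISM analysis (building the reachability matrix from a direct-influence matrix and partitioning factors into hierarchy levels) can be sketched as follows. The 5-factor adjacency matrix is a made-up stand-in for the paper's 16 factors.

import numpy as np

# Hypothetical adjacency matrix A for 5 risk factors: A[i, j] = 1 if factor i
# directly influences factor j (the study itself uses 16 factors).
A = np.array([[0, 1, 0, 0, 1],
              [0, 0, 1, 0, 0],
              [0, 0, 0, 1, 0],
              [0, 0, 0, 0, 0],
              [0, 0, 1, 0, 0]])

n = A.shape[0]
# Reachability matrix: Boolean powers of (A + I) until the matrix stabilises.
M = ((A + np.eye(n, dtype=int)) > 0).astype(int)
while True:
    M_next = ((M @ M) > 0).astype(int)
    if np.array_equal(M_next, M):
        break
    M = M_next

# ISM level partition: a factor belongs to the current level when its reachability
# set equals the intersection of its reachability and antecedent sets.
remaining, levels = set(range(n)), []
while remaining:
    level = []
    for i in remaining:
        reach = {j for j in remaining if M[i, j]}
        ante = {j for j in remaining if M[j, i]}
        if reach == reach & ante:
            level.append(i)
    levels.append(level)
    remaining -= set(level)

print("ISM levels (top to bottom):", levels)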

  20. A real-time, dynamic early-warning model based on uncertainty analysis and risk assessment for sudden water pollution accidents.

    Science.gov (United States)

    Hou, Dibo; Ge, Xiaofan; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2014-01-01

    A real-time, dynamic, early-warning model (EP-risk model) is proposed to cope with sudden water quality pollution accidents affecting downstream areas with raw-water intakes (denoted as EPs). The EP-risk model outputs the risk level of water pollution at the EP by calculating the likelihood of pollution and evaluating the impact of pollution. A generalized form of the EP-risk model for river pollution accidents based on Monte Carlo simulation, the analytic hierarchy process (AHP) method, and the risk matrix method is proposed. The likelihood of water pollution at the EP is calculated by the Monte Carlo method, which is used for uncertainty analysis of pollutants' transport in rivers. The impact of water pollution at the EP is evaluated by expert knowledge and the results of Monte Carlo simulation based on the analytic hierarchy process. The final risk level of water pollution at the EP is determined by the risk matrix method. A case study of the proposed method is illustrated with a phenol spill accident in China.
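
    A compressed sketch of the chain described above (Monte Carlo uncertainty analysis of pollutant transport, an exceedance likelihood at the EP, and a risk-matrix lookup) is given below. The river geometry, spill parameters, water-quality limit, and matrix entries are all invented for illustration, and the impact category is taken as given rather than derived via the AHP.

import numpy as np

rng = np.random.default_rng(3)

# Hypothetical phenol spill upstream of a raw-water intake (EP).
mass  = 2000.0        # spilled mass (kg)
area  = 150.0         # river cross-section (m^2)
x_ep  = 20_000.0      # distance from spill to the EP (m)
t     = 12 * 3600.0   # time of interest after the spill (s)
limit = 0.002         # assumed water-quality limit at the intake (kg/m^3)

# Uncertain transport parameters sampled in the Monte Carlo step.
n = 20_000
u = rng.normal(0.45, 0.08, n)             # flow velocity (m/s)
D = rng.lognormal(np.log(30.0), 0.4, n)   # longitudinal dispersion (m^2/s)

# 1-D advection-dispersion solution for an instantaneous release.
conc = mass / (area * np.sqrt(4 * np.pi * D * t)) * np.exp(-(x_ep - u * t) ** 2 / (4 * D * t))
likelihood = np.mean(conc > limit)        # probability of exceeding the limit at the EP

# Risk matrix: combine the likelihood category with an (assumed) impact category.
lik_cat = "high" if likelihood > 0.5 else "medium" if likelihood > 0.1 else "low"
imp_cat = "high"                          # e.g., from an AHP-weighted impact score
matrix = {("low", "high"): "medium", ("medium", "high"): "high", ("high", "high"): "high",
          ("low", "low"): "low", ("medium", "low"): "low", ("high", "low"): "medium"}
print(f"exceedance probability: {likelihood:.2f} -> risk level: {matrix[(lik_cat, imp_cat)]}")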

  1. Home-Based Risk of Falling Assessment Test Using a Closed-Loop Balance Model.

    Science.gov (United States)

    Ayena, Johannes C; Zaibi, Helmi; Otis, Martin J-D; Menelas, Bob-Antoine J

    2016-12-01

    The aim of this study is to improve and facilitate the methods used to assess the risk of falling at home among older people through the computation of a risk of falling in real time during daily activities. To enable real-time computation of the risk of falling, a closed-loop balance model is proposed and compared with the One-Leg Standing Test (OLST). This balance model allows the postural response of a person to an unpredictable perturbation to be studied. Twenty-nine volunteers participated in this study to evaluate the effectiveness of the proposed system, including seventeen elderly participants: ten healthy elderly (68.4 ± 5.5 years), seven Parkinson's disease (PD) subjects (66.28 ± 8.9 years), and twelve healthy young adults (28.27 ± 3.74 years). Our work suggests that there is a relationship between the OLST score and the risk of falling based on center-of-pressure measurements from four low-cost force sensors located inside an instrumented insole, and that this risk could be predicted using our suggested closed-loop balance model. For long-term monitoring at home, this system could be included in a medical electronic record and could be useful as a diagnostic aid tool.

  2. Agent-Based Modelling for Security Risk Assessment

    NARCIS (Netherlands)

    Janssen, S.A.M.; Sharpans'kykh, Alexei; Bajo, J.; Vale, Z.; Hallenborg, K.; Rocha, A.P.; Mathieu, P.; Pawlewski, P.; Del Val, E.; Novais, P.; Lopes, F.; Duque Méndez, N.D.; Julián, V.; Holmgren, J.

    2017-01-01

    Security Risk Assessment is commonly performed by using traditional methods based on linear probabilistic tools and informal expert judgements. These methods lack the capability to take the inherent dynamic and intelligent nature of attackers into account. To partially address the limitations,

  3. Collision risk in white-tailed eagles. Modelling kernel-based collision risk using satellite telemetry data in Smoela wind-power plant

    Energy Technology Data Exchange (ETDEWEB)

    May, Roel; Nygaard, Torgeir; Dahl, Espen Lie; Reitan, Ole; Bevanger, Kjetil

    2011-05-15

    Large soaring birds of prey, such as the white-tailed eagle, are recognized to be perhaps the most vulnerable bird group regarding risk of collisions with turbines in wind-power plants. Their mortalities have called for methods capable of modelling collision risks in connection with the planning of new wind-power developments. The so-called 'Band model' estimates collision risk based on the number of birds flying through the rotor swept zone and the probability of being hit by the passing rotor blades. In the calculations for the expected collision mortality a correction factor for avoidance behaviour is included. The overarching objective of this study was to use satellite telemetry data and recorded mortality to back-calculate the correction factor for white-tailed eagles. The Smoela wind-power plant consists of 68 turbines, over an area of approximately 18 km2. Since autumn 2006 the number of collisions has been recorded on a weekly basis. The analyses were based on satellite telemetry data from 28 white-tailed eagles equipped with backpack transmitters since 2005. The correction factor (i.e. 'avoidance rate') including uncertainty levels used within the Band collision risk model for white-tailed eagles was 99% (94-100%) for spring and 100% for the other seasons. The year-round estimate, irrespective of season, was 98% (95-99%). Although the year-round estimate was similar, the correction factor for spring was higher than the correction factor of 95% derived earlier from vantage point data. The satellite telemetry data may provide an alternative way to provide insight into relative risk among seasons, and help identify periods or areas with increased risk either in a pre- or post construction situation. (Author)
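
    The structure of the Band-style calculation into which the back-calculated correction factor enters can be sketched in a few lines; every numeric value below is an illustrative placeholder rather than a figure from the Smoela study.

# Band-style collision estimate: expected collisions =
#   transits through the rotor-swept zone x hit probability per transit x (1 - avoidance rate),
# summed over turbines. All numbers are illustrative assumptions.

transits_per_year = 50          # eagle passes through the rotor-swept volume, per turbine
p_hit_per_transit = 0.08        # geometric collision probability for a non-avoiding bird
avoidance_rate = 0.98           # back-calculated correction factor (e.g., 98%)
n_turbines = 68

collisions = transits_per_year * p_hit_per_transit * (1 - avoidance_rate) * n_turbines
print(f"expected collisions per year across the plant: {collisions:.1f}")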

  4. Application of discriminant analysis-based model for prediction of risk of low back disorders due to workplace design in industrial jobs.

    Science.gov (United States)

    Ganga, G M D; Esposto, K F; Braatz, D

    2012-01-01

    The occupational exposure limits of different risk factors for the development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown. The industrial ergonomist's role is further complicated because the potential risk factors that may contribute to the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among jobs that place workers at high or low risk of LBDs. The purpose of this paper was to compare predictions based on the neural network model proposed by Zurada, Karwowski & Marras (1997) with those of a linear discriminant analysis model for classifying industrial jobs according to their potential risk of low back disorders due to workplace design. The results obtained by applying the discriminant analysis-based model proved that it is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous in terms of cost and time savings for future data gathering.
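
    A generic sketch of the discriminant-analysis classification step is shown below on synthetic job data; the feature set loosely mirrors commonly cited lifting risk factors, but the values, class separation, and sample size are all invented and do not reproduce the study's dataset.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical job-level risk factors:
# [lift rate (lifts/h), peak load moment (Nm), trunk flexion (deg),
#  twisting velocity (deg/s), lateral velocity (deg/s)].
low  = rng.normal([120, 20, 15, 10,  8], [30,  8,  6, 4, 3], size=(60, 5))
high = rng.normal([250, 75, 35, 25, 18], [40, 15, 10, 6, 4], size=(60, 5))
X = np.vstack([low, high])
y = np.array([0] * 60 + [1] * 60)          # 0 = low-risk job, 1 = high-risk job

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated classification accuracy: {acc:.2f}")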

  5. Population-based absolute risk estimation with survey data

    Science.gov (United States)

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614

  6. Toward risk assessment 2.0: Safety supervisory control and model-based hazard monitoring for risk-informed safety interventions

    International Nuclear Information System (INIS)

    Favarò, Francesca M.; Saleh, Joseph H.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a staple in the engineering risk community, and it has become to some extent synonymous with the entire quantitative risk assessment undertaking. Limitations of PRA continue to occupy researchers, and workarounds are often proposed. After a brief review of this literature, we propose to address some of PRA's limitations by developing a novel framework and analytical tools for model-based system safety, or safety supervisory control, to guide safety interventions and support a dynamic approach to risk assessment and accident prevention. Our work shifts the emphasis from the pervading probabilistic mindset in risk assessment toward the notions of danger indices and hazard temporal contingency. The framework and tools here developed are grounded in Control Theory and make use of the state-space formalism in modeling dynamical systems. We show that the use of state variables enables the definition of metrics for accident escalation, termed hazard levels or danger indices, which measure the “proximity” of the system state to adverse events, and we illustrate the development of such indices. Monitoring of the hazard levels provides diagnostic information to support both on-line and off-line safety interventions. For example, we show how the application of the proposed tools to a rejected takeoff scenario provides new insight to support pilots’ go/no-go decisions. Furthermore, we augment the traditional state-space equations with a hazard equation and use the latter to estimate the times at which critical thresholds for the hazard level are (b)reached. This estimation process provides important prognostic information and produces a proxy for a time-to-accident metric or advance notice for an impending adverse event. The ability to estimate these two hazard coordinates, danger index and time-to-accident, offers many possibilities for informing system control strategies and improving accident prevention and risk mitigation

  7. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    Directory of Open Access Journals (Sweden)

    Ninna Reitzel Jensen

    2015-06-01

    Full Text Available Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance and unit-linked insurance. By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our model by conducting scenario analysis based on Monte Carlo simulation, but the model applies to scenarios in general and to worst-case and best-estimate scenarios in particular. In addition to easy computations, our model offers a common framework for the valuation of life insurance payments across product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account the Markov model for the state of the policyholder and, hereby, facilitating event risk.

  8. Surface water flood risk and management strategies for London: An Agent-Based Model approach

    Directory of Open Access Journals (Sweden)

    Jenkins Katie

    2016-01-01

    Full Text Available Flooding is recognised as one of the most common and costliest natural disasters in England. Flooding in urban areas during heavy rainfall is known as ‘surface water flooding’, considered to be the most likely cause of flood events and one of the greatest short-term climate risks for London. In this paper we present results from a novel Agent-Based Model designed to assess the interplay between different adaptation options, different agents, and the role of flood insurance and the flood insurance pool, Flood Re, in the context of climate change. The model illustrates how investment in adaptation options could reduce London’s surface water flood risk, today and in the future. However, benefits can be outweighed by continued development in high risk areas and the effects of climate change. Flood Re is beneficial in its function to provide affordable insurance, even under climate change. However, it offers no additional benefits in terms of overall risk reduction, and will face increasing pressure due to rising surface water flood risk in the future. The modelling approach and findings are highly relevant for reviewing the proposed Flood Re scheme, as well as for wider discussions on the potential of insurance schemes, and broader multi-sectoral partnerships, to incentivise flood risk management in the UK and internationally.

  9. Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Science.gov (United States)

    These model-based estimates use two surveys, the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS). The two surveys are combined using novel statistical methodology.

  10. Application of REVEAL-W to risk-based configuration control

    International Nuclear Information System (INIS)

    Dezfuli, H.; Meyer, J.; Modarres, M.

    1994-01-01

    Over the past two years, the concept of risk-based configuration control has been introduced to the US Nuclear Regulatory Commission and the nuclear industry. Converting much of the current, deterministically based regulation of nuclear power plants to risk-based regulation can result in lower levels of risk while relieving unnecessary burdens on power plant operators and regulatory staff. To achieve the potential benefits of risk-based configuration control, the risk models developed for nuclear power plants should be (1) flexible enough to effectively support necessary risk calculations, and (2) transparent enough to encourage their use by all parties. To address these needs, SCIENTECH, Inc., has developed the PC-based REVEAL W (formerly known as SMART). This graphic-oriented and user-friendly application software allows the user to develop transparent complex logic models based on the concept of the master plant logic diagram. The logic model is success-oriented and compact. The analytical capability built into REVEAL W is generic, so the software can support different types of risk-based evaluations, such as probabilistic safety assessment, accident sequence precursor analysis, design evaluation and configuration management. In this paper, we focus on the application of REVEAL W to support risk-based configuration control of nuclear power plants. (author)

  11. A Risk Metric Assessment of Scenario-Based Market Risk Measures for Volatility and Risk Estimation: Evidence from Emerging Markets

    Directory of Open Access Journals (Sweden)

    Sitima Innocent

    2015-03-01

    Full Text Available The study evaluated the sensitivity of Value-at-Risk (VaR) and Expected Shortfall (ES) with respect to portfolio allocation in emerging markets combined with an index portfolio of a developed market. The study employed various scenario-based models for the VaR and ES techniques, such as the covariance method, historical simulation, and the GARCH(1,1) model, and assessed the predictive ability of these models in both relatively stable market conditions and extreme market conditions. The results showed that Expected Shortfall has less risk tolerance than VaR based on the same scenario-based market risk measures.
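
    One of the techniques named above, historical simulation, can be sketched in a few lines: VaR is a tail quantile of the return history and ES is the average loss beyond it. The returns below are synthetic (a fat-tailed draw standing in for an emerging-market series), not data from the study.

import numpy as np

rng = np.random.default_rng(11)
# Hypothetical daily portfolio returns; a Student-t draw mimics fat tails.
returns = 0.0003 + 0.015 * rng.standard_t(df=4, size=2500)

alpha = 0.99
var = -np.quantile(returns, 1 - alpha)             # historical-simulation VaR
es = -returns[returns <= -var].mean()              # Expected Shortfall beyond VaR
print(f"1-day {alpha:.0%} VaR: {var:.2%}, ES: {es:.2%}")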

  12. Fuzzy Comprehensive Evaluation of Ecological Risk Based on Cloud Model: Taking Chengchao Iron Mine as Example

    Science.gov (United States)

    Ruan, Jinghua; Chen, Yong; Xiao, Xiao; Yong, Gan; Huang, Ranran; Miao, Zuohua

    2018-01-01

    To address the fuzziness and randomness of the evaluation process, this paper constructs a fuzzy comprehensive evaluation method based on the cloud model. The evaluation index system is established from the inherent risk, the present level, and the control situation, which has been shown to convey the main contradictions of ecological risk in a mine at the macro level and to facilitate comparisons among mines. The comment sets and membership functions improved by the cloud model can reflect ambiguity and randomness in a unified way. In addition, the concept of fuzzy entropy is introduced to further characterize the fuzziness of the assessment results and the complexity of the ecological problems in the target mine. A practical example at the Chengchao Iron Mine shows that the assessment results reflect the actual situation appropriately and provide new theoretical guidance for the comprehensive ecological risk evaluation of underground iron mines.

  13. 12 CFR Appendix B to Part 3 - Risk-Based Capital Guidelines; Market Risk Adjustment

    Science.gov (United States)

    2010-01-01

    ...) The bank must have a risk control unit that reports directly to senior management and is independent... management systems at least annually. (c) Market risk factors. The bank's internal model must use risk.... Section 4. Internal Models (a) General. For risk-based capital purposes, a bank subject to this appendix...

  14. Risk Management Technologies With Logic and Probabilistic Models

    CERN Document Server

    Solozhentsev, E D

    2012-01-01

    This book presents intellectual, innovative, information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. The volume describes the following components of risk management technologies: LP-calculus; classes of LP-models of risk and efficiency; procedures for different classes; special software for different classes; examples of applications; methods for the estimation of probabilities of events based on expert information. Also described are a variety of training courses in these topics. The classes of risk models treated here are: LP-modeling, LP-classification, LP-efficiency, and LP-forecasting. Particular attention is paid to LP-models of risk of failure to resolve difficult economic and technical problems. Amongst the discussed procedures of I3-technologies are the construction of LP-models,...

  15. Lung cancer risk models from experimental animals

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1988-03-01

    The objective of this paper is to present analyses of data based on methods that adequately account for time-related factors and competing risks, and that yield results expressed in a form comparable to the results obtained from recent analyses of epidemiological studies of humans exposed to radon and radon daughters. These epidemiological analyses have modeled the hazard, or age-specific death rate, as a function of factors such as dose and dose rate, time from exposure, and time from cessation of exposure. The starting point for many of the analyses of human data has been the constant relative risk model, in which the age-specific death rates are assumed to be a function of cumulative dose and the risks due to exposure are assumed to be proportional to the age-specific baseline death rates. However, departures from this initial model, such as the dependence of risks on age at risk and/or time from exposure, have been investigated. These analyses have frequently been based on a non-parametric model that requires minimal assumptions regarding the baseline risks and their dependence on age.

  16. Model-Based Estimation of Collision Risks of Predatory Birds with Wind Turbines

    Directory of Open Access Journals (Sweden)

    Marcus Eichhorn

    2012-06-01

    Full Text Available The expansion of renewable energies, such as wind power, is a promising way of mitigating climate change. Because of the risk of collision with rotor blades, wind turbines have negative effects on local bird populations, particularly on raptors such as the Red Kite (Milvus milvus. Appropriate assessment tools for these effects have been lacking. To close this gap, we have developed an agent-based, spatially explicit model that simulates the foraging behavior of the Red Kite around its aerie in a landscape consisting of different land-use types. We determined the collision risk of the Red Kite with the turbine as a function of the distance between the wind turbine and the aerie and other parameters. The impact function comprises the synergistic effects of species-specific foraging behavior and landscape structure. The collision risk declines exponentially with increasing distance. The strength of this decline depends on the raptor's foraging behavior, its ability to avoid wind turbines, and the mean wind speed in the region. The collision risks, which are estimated by the simulation model, are in the range of values observed in the field. The derived impact function shows that the collision risk can be described as an aggregated function of distance between the wind turbine and the raptor's aerie. This allows an easy and rapid assessment of the ecological impacts of (existing or planned wind turbines in relation to their spatial location. Furthermore, it implies that minimum buffer zones for different landscapes can be determined in a defensible way. This modeling approach can be extended to other bird species with central-place foraging behavior. It provides a helpful tool for landscape planning aimed at minimizing the impacts of wind power on biodiversity.

  17. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models...... from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based...

  18. A global airport-based risk model for the spread of dengue infection via the air transport network.

    Directory of Open Access Journals (Sweden)

    Lauren Gardner

    Full Text Available The number of travel-acquired dengue infections has seen a consistent global rise over the past decade. An increased volume of international passenger air traffic originating from regions with endemic dengue has contributed to a rise in the number of dengue cases in both areas of endemicity and elsewhere. This paper reports results from a network-based risk assessment model which uses international passenger travel volumes, travel routes, travel distances, regional populations, and predictive species distribution models (for the two vector species, Aedes aegypti and Aedes albopictus) to quantify the relative risk posed by each airport in importing passengers with travel-acquired dengue infections. Two risk attributes are evaluated: (i) the risk posed by through traffic at each stopover airport and (ii) the risk posed by incoming travelers to each destination airport. The model results prioritize optimal locations (i.e., airports) for targeted dengue surveillance. The model is easily extendible to other vector-borne diseases.

  19. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    Science.gov (United States)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover changes) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Research Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms, and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios and a LUCC simulation model to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and to implement land use strategies with local stakeholders for risk management. Four contrasting scenarios are developed, exhibiting contrasting trajectories of socio-economic development. The prospective scenarios are based on national and international socio-economic contexts and rely on existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, i.e. the SYLVACCESS model, is used to identify accessible areas for forestry in scenario projecting logging

  20. Projecting Sexual and Injecting HIV Risks into Future Outcomes with Agent-Based Modeling

    Science.gov (United States)

    Bobashev, Georgiy V.; Morris, Robert J.; Zule, William A.

    Longitudinal studies of health outcomes for HIV can be very costly, cumbersome, and not representative of the at-risk population. Conversely, cross-sectional approaches can be representative but rely on retrospective information to estimate prevalence and incidence. We present an agent-based modeling (ABM) approach in which we use behavioral data from a cross-sectional representative study and project the behavior into the future, so that the risks of acquiring HIV can be studied in a dynamic, temporal sense. We show how the blend of behavioral and contact network factors (sexual, injecting) plays a role in the risk of future HIV acquisition and the time until HIV is acquired. We show which subjects are the persons most likely to acquire HIV in the next year, and whom they are likely to infect. We examine how different behaviors are related to increases or decreases in HIV risk and how to estimate quantifiable risk measures such as HIV-free survival.

  1. A Corrosion Risk Assessment Model for Underground Piping

    Science.gov (United States)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project: a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model for pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground-based pressure systems and applying the risk models to risk-based inspection strategies for the underground pipes at ARC.

  2. A risk-based model for predicting the impact of using condoms on the spread of sexually transmitted infections

    Directory of Open Access Journals (Sweden)

    Asma Azizi

    2017-02-01

    Full Text Available We create and analyze a mathematical model to understand the impact of condom use and sexual behavior on the prevalence and spread of sexually transmitted infections (STIs). STIs remain a significant public health challenge globally, with a high burden of some sexually transmitted diseases (STDs) in both developed and developing countries. Although condom use is known to reduce the transmission of STIs, there are few quantitative population-based studies on the protective role of condom use in reducing the incidence of STIs. The number of concurrent partners is correlated with the risk of being infected with an STI such as chlamydia, gonorrhea, or syphilis. We develop a Susceptible-Infectious-Susceptible (SIS) model that stratifies the population based on the number of concurrent partners. The model captures multi-level heterogeneous mixing through a combination of biased (preferential) and random (proportional) mixing processes between individuals with distinct risk levels, and accounts for differences in condom use in the low- and high-risk populations. We use sensitivity analysis to assess the relative impact of high-risk people using condoms as a prophylactic intervention to reduce their chance of becoming infected, or of infecting others. The model predicts the STI prevalence as a function of the number of partners of an individual, and quantifies how this distribution of effective partners changes as a function of condom use. Our results show that when the mixing is random, increasing condom use in the high-risk population is more effective in reducing the prevalence than when many of the partners of high-risk people are themselves high risk. The model quantifies how the risk of being infected increases for people who have more partners, and the need for high-risk people to consistently use condoms to reduce their risk of infection. Keywords: Mathematical modeling, Sexually transmitted infection (STI), Biased (preferential) mixing, Random
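
    As a rough illustration of the kind of risk-stratified SIS dynamics described above, the sketch below simulates two activity groups (low and high partner-change rates) under purely proportional (random) mixing, with condom use scaling the force of infection; all parameter values and variable names are illustrative assumptions, not those of the paper.

    # Minimal two-group SIS sketch with proportional (random) mixing and a
    # condom-use factor; parameter values are illustrative, not from the paper.
    import numpy as np
    from scipy.integrate import solve_ivp

    beta = 0.5                         # transmission probability per partnership (assumed)
    gamma = 0.2                        # recovery rate (assumed)
    c = np.array([2.0, 10.0])          # partner-change rates: low- and high-risk group
    N = np.array([0.9, 0.1])           # group sizes as fractions of the population
    condom_use = np.array([0.3, 0.6])  # fraction of protected contacts per group
    condom_eff = 0.9                   # efficacy of condoms when used (assumed)

    def rhs(t, I):
        # Proportional mixing: probability that a random partner comes from group j
        pi = c * N / np.sum(c * N)
        partner_prevalence = np.sum(pi * I / N)
        # Force of infection on group i, reduced by that group's condom use
        lam = beta * c * (1 - condom_eff * condom_use) * partner_prevalence
        return lam * (N - I) - gamma * I

    sol = solve_ivp(rhs, (0, 200), y0=[0.001, 0.005])
    print("endemic prevalence by group:", sol.y[:, -1] / N)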

  3. A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis

    NARCIS (Netherlands)

    Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L.; Postmus, Douwe

    2011-01-01

    Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multicriteria model that fully takes into account the evidence on efficacy and adverse drug

  4. Information risk and security modeling

    Science.gov (United States)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper analyzes technical-level risk and security metrics from Common Criteria/ISO 15408, the Center for Internet Security guidelines, and NSA configuration guidelines, together with the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO 13335 and ITIL/ITMS, and of architectural guidelines such as ISO 7498-2, will be explained. Business-process-level standards such as ISO 17799, COSO, and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO 21827, the NSA Infosec Assessment, and CobiT will be explored and reviewed. For each defined level of security metrics, the presentation will explore the appropriate usage of these standards and discuss the standards' approaches to producing risk and security metrics. The research findings demonstrate the need for a common baseline for both risk and security metrics. The paper shows the relation between an attribute-based common baseline and corporate assets and controls for risk and security metrics, and it is shown that such an approach spans all of the standards mentioned. The proposed 3D visual presentation and the development of the Information Security Model will be analyzed and postulated, clearly demonstrating the benefits of the proposed attribute-based approach and the defined risk and security space for modeling and measurement.

  5. The determination of risk areas for muddy floods based on a worst-case erosion modelling

    Science.gov (United States)

    Saathoff, Ulfert; Schindewolf, Marcus; Annika Arévalo, Sarah

    2013-04-01

    Soil erosion and muddy floods are a frequently occurring hazard in the German state of Saxony because of the topography, the high relief energy, and the high proportion of arable land. Still, the events are rather heterogeneously distributed and it is not known where damage is likely to occur. The goal of this study is to locate hot spots for the risk of muddy floods, with the objective of preventing high economic damage in the future. We applied a soil erosion and deposition map of Saxony calculated with the process-based soil erosion model EROSION 3D. This map shows the potential soil erosion and transported sediment for worst-case soil conditions and a 10-year rain storm event. Furthermore, a map of the current land use in the state is used. From the land use map, we extracted those areas that are especially vulnerable to muddy floods, such as residential and industrial areas, infrastructural facilities (e.g. power plants, hospitals) and highways. In combination with the output of the soil erosion model, the amount of sediment that enters each land use entity is calculated. Based on these data, a state-wide map with classified risks is created. The results are furthermore used to identify the risk of muddy floods for each municipality in Saxony. The results are evaluated against muddy flood events with documented locations that occurred during the period between 2000 and 2010. Additionally, plausibility tests are performed for selected areas (examination of land use, topography and soil). The results prove to be plausible and most of the documented events can be explained by the modelled risk map. The created map can be used by different institutions, such as city and traffic planners, to estimate the risk of muddy flood occurrence at specific locations. Furthermore, the risk map can help insurance companies evaluate the insurance risk of a building. To make it easily accessible, the risk map will be published online via a web GIS.

  6. Risk-based decision making for terrorism applications.

    Science.gov (United States)

    Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas

    2009-03-01

    This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application.

  7. Incentivising flood risk adaptation through risk based insurance premiums : Trade-offs between affordability and risk reduction

    NARCIS (Netherlands)

    Hudson, Paul F.; Botzen, W.J.W.; Feyen, L.; Aerts, Jeroen C.J.H.

    2016-01-01

    The financial incentives offered by the risk-based pricing of insurance can stimulate policyholder adaptation to flood risk while potentially conflicting with affordability. We examine the trade-off between risk reduction and affordability in a model of public-private flood insurance in France and

  8. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

    Full Text Available The prediction and risk classification of collapse is an important issue in highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to screen the indexes influencing collapse activity and extracted the nine main indexes as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, the relationship between the weakness face and the free face, vegetation cover rate, and degree of rock weathering). We employ post-earthquake collapse data relating to the construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for the analysis. The results were checked using the back-substitution estimation method, showing high accuracy and no misclassifications, and agreeing with the prediction results of the uncertainty measure method. The results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
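
    A minimal sketch of the two ingredients named above, entropy weighting to screen indexes and a Mahalanobis-distance discriminant between risk classes, is given below; the synthetic data, the number of indexes retained, and the two-class setup are assumptions for illustration rather than the paper's nine-factor model.

    # Hedged sketch: entropy weighting to screen indexes, then Mahalanobis-distance
    # discrimination between two collapse-risk classes; all data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    X_low = rng.uniform(0.2, 0.5, size=(30, 6))    # training samples, low-risk class
    X_high = rng.uniform(0.5, 0.9, size=(30, 6))   # training samples, high-risk class
    X_all = np.vstack([X_low, X_high])

    # Entropy weights: indexes that vary more across samples carry more information
    P = X_all / X_all.sum(axis=0)
    entropy = -(P * np.log(P)).sum(axis=0) / np.log(len(X_all))
    weights = (1 - entropy) / (1 - entropy).sum()
    keep = np.argsort(weights)[-4:]                # retain the most informative indexes

    def mahalanobis_sq(x, sample):
        mu = sample.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(sample, rowvar=False))
        d = x - mu
        return float(d @ cov_inv @ d)

    x_new = rng.uniform(0.4, 0.9, size=6)          # a new slope to classify
    d_low = mahalanobis_sq(x_new[keep], X_low[:, keep])
    d_high = mahalanobis_sq(x_new[keep], X_high[:, keep])
    print("assigned class:", "high risk" if d_high < d_low else "low risk")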

  9. Operational risk quantification and modelling within Romanian insurance industry

    Directory of Open Access Journals (Sweden)

    Tudor Răzvan

    2017-07-01

    Full Text Available This paper aims at describing the shortcomings of the various models used to quantify and model operational risk within the insurance industry, with a particular focus on Romanian regulation: Norm 6/2015 concerning the operational risk arising from IT systems. While most local insurers are focusing on implementing the standard model to compute the operational risk solvency capital requirement, the local regulator has issued a norm that requires the identification and assessment of IT-based operational risks from an ISO 27001 perspective. The challenges raised by the correlations assumed in the standard model are substantially increased by this new regulation, which requires only the identification and quantification of IT operational risks. The solvency capital requirement stipulated by the implementation of Solvency II does not recommend a model or formula for integrating the newly identified risks into the operational risk capital requirements. In this context, we assess the academic and practitioner understanding of the frequency-severity approach, Bayesian estimation techniques, scenario analysis, and risk accounting based on risk units, and of how they could support the modelling of IT-based operational risk. Developing an internal model solely for the operational risk capital requirement has so far proved costly and not necessarily beneficial for local insurers. As the IT component will play a key role in the future of the insurance industry, the result of this analysis provides a specific approach to operational risk modelling that can be implemented in the context of Solvency II, particularly when (internal or external) operational risk databases are scarce or unavailable.

  10. Risk Prediction Model for Severe Postoperative Complication in Bariatric Surgery.

    Science.gov (United States)

    Stenberg, Erik; Cao, Yang; Szabo, Eva; Näslund, Erik; Näslund, Ingmar; Ottosson, Johan

    2018-01-12

    Factors associated with risk for adverse outcome are important considerations in the preoperative assessment of patients for bariatric surgery. As yet, prediction models based on preoperative risk factors have not been able to predict adverse outcome sufficiently. This study aimed to identify preoperative risk factors and to construct a risk prediction model based on these. Patients who underwent a bariatric surgical procedure in Sweden between 2010 and 2014 were identified from the Scandinavian Obesity Surgery Registry (SOReg). Associations between preoperative potential risk factors and severe postoperative complications were analysed using a logistic regression model. A multivariate model for risk prediction was created and validated in the SOReg for patients who underwent bariatric surgery in Sweden, 2015. Revision surgery (standardized OR 1.19, 95% confidence interval (CI) 1.14-0.24, p prediction model. Despite high specificity, the sensitivity of the model was low. Revision surgery, high age, low BMI, large waist circumference, and dyspepsia/GERD were associated with an increased risk for severe postoperative complication. The prediction model based on these factors, however, had a sensitivity that was too low to predict risk in the individual patient case.

  11. Comparison of models used for ecological risk assessment and human health risk assessment

    International Nuclear Information System (INIS)

    Ryti, R.T.; Gallegos, A.F.

    1994-01-01

    Models are used to derive action levels for site screening, or to estimate potential ecological or human health risks posed by potentially hazardous sites. At the Los Alamos National Laboratory (LANL), which is RCRA-regulated, the human-health screening action levels are based on the hazardous constituents described in RCRA Subpart S and on RESRAD-derived soil guidelines (based on 10 mRem/year) for radiological constituents. Also, an ecological risk screening model was developed for a former firing site, where the primary constituents include depleted uranium, beryllium and lead. Sites that fail the screening models are evaluated with site-specific human risk assessment (using RESRAD and other approaches) and a detailed ecological effects model (ECOTRAN). ECOTRAN is based on pharmacokinetic transport modeling within a multitrophic-level biological growth dynamics model. ECOTRAN provides detailed temporal records of contaminant concentrations in biota, and annual averages of these body burdens are compared to equivalent site-specific runs of the RESRAD model. The results show that thoughtful interpretation of the results of these models is required before they can be used to evaluate the current risk posed by sites and the benefits of various remedial options. This presentation compares the concentrations in biological media from the RESRAD screening runs to the concentrations in ecological endpoints predicted by the ecological screening model. The assumptions and limitations of these screening models, and the decision process in which they are applied, are discussed.

  12. Psychosocial Modeling of Insider Threat Risk Based on Behavioral and Word Use Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Greitzer, Frank L.; Kangas, Lars J.; Noonan, Christine F.; Brown, Christopher R.; Ferryman, Thomas A.

    2013-10-01

    In many insider crimes, managers and other coworkers observed that the offenders had exhibited signs of stress, disgruntlement, or other issues, but no alarms were raised. Barriers to using such psychosocial indicators include the inability to recognize the signs and the failure to record the behaviors so that they can be assessed. A psychosocial model was developed to assess whether an employee's behavior is associated with an increased risk of insider abuse. The model is based on case studies and research literature on factors and correlates associated with precursor behavioral manifestations of individuals committing insider crimes. A complementary Personality Factor modeling approach was developed to derive relevant personality characteristics from word use. Several implementations of the psychosocial model were evaluated by comparing their agreement with the judgments of human resources and management professionals; the Personality Factor modeling approach was examined using email samples. If implemented in an operational setting, these models should be part of a set of management tools for employee assessment to identify employees who pose a greater insider threat.

  13. Challenges of using HIV as a primary risk indicator: Need for integrated blood donor risk management model

    NARCIS (Netherlands)

    Mapako, T.; Parirewa, J.J.; Emmanuel, J.C.; Mvere, D.A.; Massundah, E.; Mavunganidze, G.; Marowa, L.M.; Postma, M.J.; Van Hulst, M.

    2015-01-01

    Background: The use of risk modelling in blood safety is increasingly gaining momentum. NBSZ initiated blood donor risk profiling based on donation frequency (r-coding) in 1994, and in 2006 a generic risk classification model (including age and donation venue) was developed, which was mainly based on

  14. Risk assessment model for nuclear accident emergency protection countermeasure based on fuzzy matter-element analysis

    International Nuclear Information System (INIS)

    Xin Jing; Tang Huaqing; Zhang Yinghua; Zhang Limin

    2009-01-01

    A risk assessment model for nuclear accident emergency protection countermeasures, based on fuzzy matter-element analysis and the Euclid approach degree, is proposed in this paper. The weight of each assessed index is determined by information entropy and expert scoring, which not only makes full use of the inherent information of the indexes but also effectively reduces subjective assumptions in the course of the assessment. The applied result shows that it is reasonable to adopt the model for the risk assessment of nuclear accident emergency protective countermeasures, and that it can serve as an effective analytical method and decision-making basis for choosing the optimum protection countermeasure. (authors)

  15. Risk Evaluation of Railway Coal Transportation Network Based on Multi Level Grey Evaluation Model

    Science.gov (United States)

    Niu, Wei; Wang, Xifu

    2018-01-01

    Rail transport is currently the most important mode of coal transportation, and China's railway coal transportation network has become increasingly complete; however, problems such as insufficient capacity and lines close to saturation remain. In this paper, the theory and methods of risk assessment, the analytic hierarchy process, and a multi-level grey evaluation model are applied to the risk evaluation of the coal railway transportation network in China. An example analysis of the Shanxi railway coal transportation network shows how the results can be used to improve the internal structure of the network and its market competitiveness.

  16. Integrated Monitoring and Modeling of Carbon Dioxide Leakage Risk Using Remote Sensing, Ground-Based Monitoring, Atmospheric Models and Risk-Indexing Tools

    Science.gov (United States)

    Burton, E. A.; Pickles, W. L.; Gouveia, F. J.; Bogen, K. T.; Rau, G. H.; Friedmann, J.

    2006-12-01

    estimating its associated risk, spatially and temporally. This requires integration of subsurface, surface and atmospheric data and models. To date, we have developed techniques to map risk based on predicted atmospheric plumes and GIS/MT (meteorologic- topographic) risk-indexing tools. This methodology was derived from study of large CO2 releases from an abandoned well penetrating a natural CO2 reservoir at Crystal Geyser, Utah. This integrated approach will provide a powerful tool to screen for high-risk zones at proposed sequestration sites, to design and optimize surface networks for site monitoring and/or to guide setting science-based regulatory compliance requirements for monitoring sequestration sites, as well as to target critical areas for first responders should a catastrophic-release event occur. This work was performed under the auspices of the U.S. Dept. of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48.

  17. Innovative Models of Dental Care Delivery and Coverage: Patient-Centric Dental Benefits Based on Digital Oral Health Risk Assessment.

    Science.gov (United States)

    Martin, John; Mills, Shannon; Foley, Mary E

    2018-04-01

    Innovative models of dental care delivery and coverage are emerging across oral health care systems causing changes to treatment and benefit plans. A novel addition to these models is digital risk assessment, which offers a promising new approach that incorporates the use of a cloud-based technology platform to assess an individual patient's risk for oral disease. Risk assessment changes treatment by including risk as a modifier of treatment and as a determinant of preventive services. Benefit plans are being developed to use risk assessment to predetermine preventive benefits for patients identified at elevated risk for oral disease. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Risk-based modelling of surface water quality: a case study of the Charles River, Massachusetts

    Science.gov (United States)

    McIntyre, Neil R.; Wagener, Thorsten; Wheater, Howard S.; Chapra, Steven C.

    2003-04-01

    A model of phytoplankton, dissolved oxygen and nutrients is presented and applied to the Charles River, Massachusetts within a framework of Monte Carlo simulation. The model parameters are conditioned using data from eight sampling stations along a 40 km stretch of the Charles River, during a (supposed) steady-state period in the summer of 1996, and the conditioned model is evaluated using data from later in the same year. Regional multi-objective sensitivity analysis is used to identify the parameters and pollution sources most affecting the various model outputs under the conditions observed during that summer. The effects of Monte Carlo sampling error are included in this analysis, and the observations which have least contributed to model conditioning are indicated. It is shown that the sensitivity analysis can be used to speculate about the factors responsible for undesirable levels of eutrophication, and to speculate about the risk of failure of nutrient reduction interventions at a number of strategic control sections. The analysis indicates that phosphorus stripping at the CRPCD wastewater treatment plant on the Charles River would be a high-risk intervention, especially for controlling eutrophication at the control sections further downstream. However, as the risk reflects the perceived scope for model error, it can only be recommended that more resources are invested in data collection and model evaluation. Furthermore, as the risk is based solely on water quality criteria, rather than broader environmental and economic objectives, the results need to be supported by detailed and extensive knowledge of the Charles River problem.

  19. A carbon risk prediction model for Chinese heavy-polluting industrial enterprises based on support vector machine

    International Nuclear Information System (INIS)

    Zhou, Zhifang; Xiao, Tian; Chen, Xiaohong; Wang, Chang

    2016-01-01

    Chinese heavy-polluting industrial enterprises, especially in the petrochemical and chemical industries, are characterized by low carbon efficiency and high emission loads, and they face tremendous pressure to reduce emissions against the background of a global shortage of energy supply and constraints on carbon emissions. However, due to the limited amount of theoretical and practical research in this field, problems such as the lack of prediction indicators or models and of a quantified standard of carbon risk remain unsolved. In this paper, the concept of carbon risk and an assessment index system for Chinese heavy-polluting industrial enterprises (e.g. coal, petrochemical, and chemical enterprises), based on a support vector machine, are presented. Using data from several heavy-polluting industrial enterprises, the SVM model is trained to predict the carbon risk level of a specific enterprise, which allows the enterprise to identify and manage its carbon risks. The results show that this method can predict an enterprise's carbon risk level in an efficient and accurate way, with high practical application and generalization value.
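
    For orientation, the following is a minimal, hedged sketch of training an SVM classifier on enterprise-level indicators; the synthetic features and labels merely stand in for the paper's carbon-risk index system.

    # Illustrative sketch only: an SVM classifier for enterprise carbon-risk levels.
    # The synthetic features and labels stand in for the paper's index system.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    n = 300
    X = rng.normal(size=(n, 4))      # e.g. emission intensity, energy mix, abatement cost, ...
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    model.fit(X_tr, y_tr)
    print("held-out accuracy:", model.score(X_te, y_te))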

  20. Data Sources for the Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Science.gov (United States)

    The model-based estimates of important cancer risk factors and screening behaviors are obtained by combining the responses to the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS).

  1. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
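
    As a hedged illustration of pooling log-normal expert judgements, the sketch below combines each expert's median and 95% error factor into a single distribution using a weighted average in log space; the weights and numbers are invented, and the report's actual classical and Bayesian procedures treat bias, dispersion, and dependency in more detail.

    # Hedged sketch of pooling log-normal expert judgements about an event frequency:
    # each expert gives a median and a 95% error factor (95th percentile / median);
    # estimates are combined by a weighted average in log space. Numbers are invented.
    import numpy as np

    medians = np.array([1e-4, 3e-4, 5e-5])      # expert medians (events/year)
    error_factors = np.array([3.0, 10.0, 5.0])  # expert 95% error factors
    weights = np.array([0.5, 0.2, 0.3])         # e.g. from calibration questions

    mu = np.log(medians)                        # log-normal location per expert
    sigma = np.log(error_factors) / 1.645       # 95% error factor -> log-normal scale

    mu_pool = np.sum(weights * mu)
    sigma_pool = np.sqrt(np.sum(weights * (sigma**2 + (mu - mu_pool) ** 2)))

    print("pooled median (events/year):", np.exp(mu_pool))
    print("pooled 95% error factor:", np.exp(1.645 * sigma_pool))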

  2. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed

  3. Using Cutting-Edge Tree-Based Stochastic Models to Predict Credit Risk

    Directory of Open Access Journals (Sweden)

    Khaled Halteh

    2018-05-01

    Full Text Available Credit risk is a critical issue that affects banks and companies on a global scale. Possessing the ability to accurately predict the level of credit risk has the potential to help both the lender and the borrower, by reducing the number of loans provided to borrowers with poor financial health, thereby reducing the number of failed businesses and, in effect, preventing economies from collapsing. This paper uses state-of-the-art stochastic models, namely decision trees, random forests, and stochastic gradient boosting, to add to the current literature on credit-risk modelling. The Australian mining industry has been selected to test our methodology. Mining in Australia generates around $138 billion annually, making up more than half of the total goods and services. This paper uses publicly available financial data from 750 risky and non-risky Australian mining companies as variables in our models. Our results indicate that stochastic gradient boosting was the superior model at correctly classifying the good and bad credit-rated companies within the mining sector. Our model showed that 'Property, Plant, & Equipment (PPE) turnover', 'Invested Capital Turnover', and 'Price over Earnings Ratio (PER)' were the variables with the best explanatory power pertaining to predicting credit risk in the Australian mining sector.
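
    A minimal sketch of the gradient-boosting step is shown below; the synthetic ratios only stand in for the financial variables named above, and the hyperparameters are illustrative rather than those used in the paper.

    # Minimal gradient-boosting sketch with synthetic ratios standing in for the
    # financial variables named above; hyperparameters are illustrative.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)
    X = rng.normal(size=(750, 3))     # e.g. PPE turnover, capital turnover, PER
    y = (0.8 * X[:, 0] - 0.4 * X[:, 2] + rng.normal(scale=0.7, size=750) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    gbm = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=3)
    gbm.fit(X_tr, y_tr)

    print("AUC:", roc_auc_score(y_te, gbm.predict_proba(X_te)[:, 1]))
    print("feature importances:", gbm.feature_importances_)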

  4. Stimulating household flood risk mitigation investments through insurance and subsidies: an Agent-Based Modelling approach

    Science.gov (United States)

    Haer, Toon; Botzen, Wouter; de Moel, Hans; Aerts, Jeroen

    2015-04-01

    In the period 1998-2009, floods triggered roughly 52 billion euro in insured economic losses, making floods the most costly natural hazard in Europe. Climate change and socio-economic trends are expected to further aggravate flood losses in many regions. Research shows that flood risk can be significantly reduced if households install protective measures, and that the implementation of such measures can be stimulated through flood insurance schemes and subsidies. However, the effectiveness of such incentives in stimulating the implementation of loss-reducing measures depends greatly on the decision processes of individuals and has hardly been studied. In our study, we developed an agent-based model that integrates flood damage models, insurance mechanisms, subsidies, and household behaviour models to assess the effectiveness of different economic tools in stimulating households to invest in loss-reducing measures. Since this effectiveness depends on the decision-making process of individuals, the study compares different household decision models, ranging from standard economic models, to economic models for decision making under risk, to more complex decision models integrating economic models with risk perceptions, opinion dynamics, and the influence of flood experience. The results show the effectiveness of incentives to stimulate investment in loss-reducing measures for different household behaviour types under climate change scenarios, and they show how complex decision models can better reproduce observed real-world behaviour than traditional economic models. Furthermore, since flood events are included in the simulations, the results provide an analysis of the dynamics of insured and uninsured losses for households, the costs of reducing risk by implementing loss-reducing measures, the capacity of the insurance market, and the cost of government subsidies under different scenarios. The model has been applied to the City of Rotterdam in The Netherlands.
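
    The core of such a household decision rule can be illustrated with a toy expected-cost comparison, shown below; all monetary values, probabilities, and the premium-discount mechanism are invented for illustration and are far simpler than the behavioural models in the study.

    # Toy version of one risk-neutral agent's decision rule: compare expected annual
    # cost with and without a loss-reducing measure, where the insurer rewards the
    # measure with a premium discount. All numbers are invented.
    flood_prob = 0.01            # annual flood probability
    damage = 50_000              # expected damage per flood without measures (EUR)
    damage_reduction = 0.4       # fraction of damage avoided by the measure
    deductible_share = 0.15      # share of the damage the household bears itself
    measure_annual_cost = 250    # annualised cost of the measure (EUR/year)
    premium = 600                # risk-based premium without the measure (EUR/year)
    premium_discount = 0.3       # discount granted when the measure is installed

    cost_without = premium + flood_prob * deductible_share * damage
    cost_with = (premium * (1 - premium_discount) + measure_annual_cost
                 + flood_prob * deductible_share * damage * (1 - damage_reduction))

    print(f"expected annual cost, no measure: {cost_without:.0f} EUR")
    print(f"expected annual cost, measure:    {cost_with:.0f} EUR")
    print("invest" if cost_with < cost_without else "do not invest")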

  5. Mechanistic modeling for mammography screening risks

    International Nuclear Information System (INIS)

    Bijwaard, Harmen

    2008-01-01

    Full text: Western populations show a very high incidence of breast cancer, and in many countries mammography screening programs have been set up for the early detection of these cancers. Through these programs, large numbers of women (in the Netherlands, 700,000 per year) are exposed to low but not insignificant X-ray doses. ICRP-based risk estimates indicate that the number of breast cancer casualties due to mammography screening can be as high as 50 per year in the Netherlands. The number of lives saved is estimated to be much higher, but for an accurate calculation of the benefits of screening a better estimate of these risks is indispensable. Here, an attempt is made to better quantify the radiological risks of mammography screening through the application of a biologically based model for breast tumor induction by X-rays. The model is applied to data obtained from the National Institutes of Health in the U.S. These concern epidemiological data on female TB patients who received high X-ray breast doses in the period 1930-1950 through frequent fluoroscopy of their lungs. The mechanistic model used to describe the increased breast cancer incidence is based on an earlier study by Moolgavkar et al. (1980), in which the natural background incidence of breast cancer was modeled. The model allows for a more sophisticated extrapolation of risks to the low X-ray doses that are common in mammography screening and to the higher ages that are usually involved. Furthermore, it allows for risk transfer to other (non-western) populations. The results have implications for decisions on the frequency of screening, the number of mammograms taken at each screening, minimum and maximum ages for screening, and the transfer to digital equipment. (author)

  6. IT Operational Risk Measurement Model Based on Internal Loss Data of Banks

    Science.gov (United States)

    Hao, Xiaoling

    Business operations of banks rely increasingly on information technology (IT), and the most important role of IT is to guarantee the operational continuity of business processes. Therefore, IT risk management efforts need to be seen from the perspective of operational continuity. Traditional IT risk studies have focused on IT asset-based risk analysis and risk-matrix-based qualitative risk evaluation. In practice, the IT risk management practices of the banking industry are still limited to the IT department and are not integrated into business risk management, which causes the two departments to work in isolation. This paper presents an improved methodology for dealing with IT operational risk. It adopts a quantitative measurement method based on internal business loss data about IT events and uses Monte Carlo simulation to predict potential losses. We establish the correlation between IT resources and business processes to ensure that risk management of IT and of the business can work synergistically.
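
    A common way to operationalise such a quantitative, loss-data-driven approach is a frequency-severity Monte Carlo simulation; the sketch below assumes Poisson event counts and log-normal single-event losses with made-up parameters, and is not the bank-specific model of the paper.

    # Hedged frequency-severity sketch: annual IT-event counts drawn from a Poisson
    # distribution and single-event losses from a log-normal (as might be fitted to
    # internal loss data), combined by Monte Carlo into an annual loss distribution.
    # Parameter values are placeholders, not calibrated bank data.
    import numpy as np

    rng = np.random.default_rng(42)
    lam = 12.0             # mean number of IT loss events per year (assumed)
    mu, sigma = 9.0, 1.2   # log-normal parameters of a single-event loss (assumed)

    n_sims = 100_000
    annual_losses = np.empty(n_sims)
    for i in range(n_sims):
        k = rng.poisson(lam)                               # events in this simulated year
        annual_losses[i] = rng.lognormal(mu, sigma, size=k).sum()

    print("expected annual loss:", annual_losses.mean())
    print("99.5% quantile of annual loss:", np.quantile(annual_losses, 0.995))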

  7. [Non-linear System Dynamics Simulation Modeling of Adolescent Obesity: Using Korea Youth Risk Behavior Web-based Survey].

    Science.gov (United States)

    Lee, Hanna; Park, Eun Suk; Yu, Jae Kook; Yun, Eun Kyoung

    2015-10-01

    The purpose of this study was to develop a system dynamics model of adolescent obesity in Korea that could be used for obesity policy analysis. On the basis of the causal loop diagram, a model was developed by converting it into a stock-and-flow diagram. The Vensim DSS 5.0 program was used for the model development. We used the method of moments to calibrate the model with data from the Korea Youth Risk Behavior Web-based Survey, 2005 to 2013, and then ran scenario simulations. This model can be used to understand the current adolescent obesity rate, to predict the future obesity rate, and as a tool for controlling the risk factors. The results of the model simulation match the data well, indicating that a proper model, able to predict obesity probability, was established. The stock-and-flow modeling of adolescent obesity can help policy planners and other stakeholders better anticipate the multiple effects of interventions in both the short and the long term. For future work, we suggest the development of an expanded model based on this adolescent obesity model.
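
    The stock-and-flow idea can be illustrated with a toy one-stock model integrated by Euler steps, as below; the rates are invented and the structure is far simpler than the Vensim model calibrated in the study.

    # Toy stock-and-flow sketch: one "obese adolescents" stock with an inflow
    # (new cases) and an outflow (recovery), integrated with Euler steps. The
    # rates are illustrative assumptions, not survey-calibrated values.
    population = 1_000_000     # adolescent population (assumed constant)
    obese = 80_000             # initial stock of obese adolescents
    incidence_rate = 0.015     # fraction of non-obese becoming obese per year
    recovery_rate = 0.10       # fraction of obese returning to non-obese per year

    dt, years = 0.25, 10
    for _ in range(int(years / dt)):
        inflow = incidence_rate * (population - obese)
        outflow = recovery_rate * obese
        obese += (inflow - outflow) * dt

    print(f"simulated obesity rate after {years} years: {obese / population:.1%}")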

  8. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.

  9. Physics-Based Identification, Modeling and Risk Management for Aeroelastic Flutter and Limit-Cycle Oscillations (LCO), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed research program will develop a physics-based identification, modeling and risk management infrastructure for aeroelastic transonic flutter and...

  10. Methodology for the Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Science.gov (United States)

    This model-based approach uses data from both the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS) to produce estimates of the prevalence rates of cancer risk factors and screening behaviors at the state, health service area, and county levels.

  11. Improvement of the projection models for radiogenic cancer risk

    International Nuclear Information System (INIS)

    Tong Jian

    2005-01-01

    Calculations of radiogenic cancer risk are based on risk projection models for specific cancer sites. Improvements have been made to the parameters used in the previous models, including the introduction of mortality and morbidity risk coefficients and of age- and gender-specific risk coefficients. These coefficients have been applied to calculate the radiogenic cancer risks for specific organs and radionuclides under different exposure scenarios. (authors)

  12. Physiologically Based Toxicokinetic Modelling as a Tool to Support Risk Assessment: Three Case Studies

    Directory of Open Access Journals (Sweden)

    Hans Mielke

    2012-01-01

    Full Text Available In this contribution we present three case studies of physiologically based toxicokinetic (PBTK) modelling in regulatory risk assessment. (1) Age-dependent lower enzyme expression in the newborn leads to bisphenol A (BPA) blood levels that are near the levels of the tolerable daily intake (TDI) at the oral exposure calculated by EFSA. (2) Dermal exposure to BPA via receipts, car park tickets, and so forth contributes to the overall exposure to BPA; however, at the present levels of dermal exposure there is no risk for the adult. (3) Dermal exposure to coumarin via cosmetic products leads to external exposures of twice the TDI. PBTK modelling helped to identify liver peak concentration as the metric for liver toxicity. After dermal exposure of twice the TDI, the liver peak concentration was lower than that present after oral exposure at the TDI dose. In the presented cases, PBTK modelling was useful for reaching scientifically sound regulatory decisions.

  13. Relative risk estimation of Chikungunya disease in Malaysia: An analysis based on Poisson-gamma model

    Science.gov (United States)

    Samat, N. A.; Ma'arof, S. H. Mohd Imam

    2015-05-01

    Disease mapping is a method to display the geographical distribution of disease occurrence, which generally involves the use and interpretation of a map to show the incidence of certain diseases. Relative risk (RR) estimation is one of the most important issues in disease mapping. This paper begins by providing a brief overview of Chikungunya disease. This is followed by a review of the classical model used in disease mapping, based on the standardized morbidity ratio (SMR), which we then apply to our Chikungunya data. We then fit an extension of the classical model, which we refer to as a Poisson-Gamma model, in which prior distributions for the relative risks are assumed known. Both sets of results are displayed and compared using maps, and the Poisson-Gamma model yields a smoother map with fewer extreme values of estimated relative risk. Extensions of this work will consider other methods that overcome the drawbacks of the existing ones, in order to inform and direct government strategy for monitoring and controlling Chikungunya disease.
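
    The two estimators compared above can be sketched in a few lines: the classical SMR and the Poisson-Gamma posterior-mean relative risk; the counts and the gamma prior parameters below are invented for illustration.

    # Sketch of the two estimators: the classical SMR (observed/expected) and a
    # Poisson-Gamma posterior-mean relative risk, where in practice the gamma
    # prior (alpha, beta) would be estimated from all areas. Counts are invented.
    import numpy as np

    observed = np.array([2, 15, 7, 0, 30])            # case counts per district
    expected = np.array([5.0, 12.0, 6.5, 3.0, 22.0])  # expected counts per district

    smr = observed / expected                         # unstable for small areas

    alpha, beta = 2.0, 2.0                            # assumed gamma prior on relative risk
    rr_smoothed = (observed + alpha) / (expected + beta)  # posterior mean

    for o, e, s, r in zip(observed, expected, smr, rr_smoothed):
        print(f"obs={o:2d} exp={e:5.1f}  SMR={s:4.2f}  Poisson-Gamma RR={r:4.2f}")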

  14. Predictive risk modelling under different data access scenarios: who is identified as high risk and for how long?

    Science.gov (United States)

    Johnson, Tracy L; Kaldor, Jill; Sutherland, Kim; Humphries, Jacob; Jorm, Louisa R; Levesque, Jean-Frederic

    2018-01-01

    Objective This observational study critically explored the performance of different predictive risk models simulating three data access scenarios, comparing: (1) sociodemographic and clinical profiles; (2) consistency in high-risk designation across models; and (3) persistence of high-risk status over time. Methods Cross-sectional health survey data (2006–2009) for more than 260 000 Australian adults 45+ years were linked to longitudinal individual hospital, primary care, pharmacy and mortality data. Three risk models predicting acute emergency hospitalisations were explored, simulating conditions where data are accessed through primary care practice management systems, or through hospital-based electronic records, or through a hypothetical ‘full’ model using a wider array of linked data. High-risk patients were identified using different risk score thresholds. Models were reapplied monthly for 24 months to assess persistence in high-risk categorisation. Results The three models displayed similar statistical performance. Three-quarters of patients in the high-risk quintile from the ‘full’ model were also identified using the primary care or hospital-based models, with the remaining patients differing according to age, frailty, multimorbidity, self-rated health, polypharmacy, prior hospitalisations and imminent mortality. The use of higher risk prediction thresholds resulted in lower levels of agreement in high-risk designation across models and greater morbidity and mortality in identified patient populations. Persistence of high-risk status varied across approaches according to updated information on utilisation history, with up to 25% of patients reassessed as lower risk within 1 year. Conclusion/implications Small differences in risk predictors or risk thresholds resulted in comparatively large differences in who was classified as high risk and for how long. Pragmatic predictive risk modelling design decisions based on data availability or projected

  15. Quantitative occupational risk model: Single hazard

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Aneziris, O.N.; Bellamy, L.J.; Ale, B.J.M.; Oh, J.

    2017-01-01

    A model for the quantification of the occupational risk of a worker exposed to a single hazard is presented. The model connects working conditions and worker behaviour to the probability of an accident resulting in one of three types of consequence: recoverable injury, permanent injury, and death. Working conditions and the safety barriers in place to reduce the likelihood of an accident are included. Logical connections are modelled through an influence diagram. Quantification of the model is based on two sources of information: (a) the number of accidents observed over a period of time and (b) an assessment of exposure data on activities and working conditions over the same period and for the same working population. The effectiveness of risk-reducing measures affecting the working conditions, worker behaviour and/or safety barriers can be quantified through the effect of these measures on occupational risk. - Highlights: • Quantification of occupational risk from a single hazard. • An influence diagram connects working conditions, worker behaviour and safety barriers. • The necessary data include the number of accidents and the total exposure of workers. • The effectiveness of risk-reducing measures is quantified through its impact on the risk. • An example illustrates the methodology.

  16. A semi-quantitative model for risk appreciation and risk weighing

    DEFF Research Database (Denmark)

    Bos, Peter M.J.; Boon, Polly E.; van der Voet, Hilko

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk in order to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model provides quantitative information on these three parameters. A semi-quantitative tool is presented that combines information on these parameters into easily readable charts that will facilitate risk evaluations of exposure situations and decisions on risk reduction measures. This tool is based on a concept ... detailed information on the estimated health impact in a given exposure situation. These graphs will facilitate the discussions on appropriate risk reduction measures to be taken.

  17. Developing points-based risk-scoring systems in the presence of competing risks.

    Science.gov (United States)

    Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P

    2016-09-30

    Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
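
    The basic mechanics of converting regression coefficients into integer points can be sketched as below; the coefficients and the choice of base risk unit are invented, and the paper's contribution of handling competing risks (e.g. via Fine-Gray models) is not reproduced here.

    # Illustration of turning regression coefficients into integer points
    # (Framingham-style): each coefficient is divided by a chosen base risk unit
    # and rounded. Coefficients are invented; the competing-risks machinery of
    # the paper (e.g. Fine-Gray regression) is not reproduced here.
    coefficients = {             # assumed log-hazard-ratio per unit of each predictor
        "age_per_5yr": 0.22,
        "diabetes": 0.45,
        "prior_mi": 0.60,
        "sbp_per_10mmHg": 0.11,
    }
    base_unit = 0.11             # risk assigned to 1 point (here: 10 mmHg of SBP)

    points = {name: round(beta / base_unit) for name, beta in coefficients.items()}
    print(points)

    # Score a hypothetical 70-year-old (reference age 50) diabetic with SBP 150 (reference 120)
    score = 4 * points["age_per_5yr"] + points["diabetes"] + 3 * points["sbp_per_10mmHg"]
    print("total points:", score)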

  18. Globally-Applicable Predictive Wildfire Model - a Temporal-Spatial GIS Based Risk Analysis Using Data Driven Fuzzy Logic Functions

    Science.gov (United States)

    van den Dool, G.

    2017-11-01

    This study (van den Dool, 2017) is a proof of concept for a global predictive wildfire model in which the temporal-spatial characteristics of wildfires are placed in a Geographical Information System (GIS) and the risk analysis is based on data-driven fuzzy logic functions. The data sources used in this model are available as global datasets, but are subdivided into three pilot areas: North America (California/Nevada), Europe (Spain), and Asia (Mongolia), and are downscaled to the highest resolution (3 arc-seconds). The GIS is constructed around three themes: topography, fuel availability and climate. From the topographical data, six derived sub-themes are created and converted to a fuzzy membership based on catchment-area statistics. The fuel availability score is a composite of four data layers: land cover, wood loads, biomass, and biovolumes. As input for the climatological sub-model, reanalysed daily averaged weather-related data are used, accumulated to a global weekly time window (to account for the uncertainty within the climatological model); this forms the temporal component of the model. The final product is a weekly wildfire risk score (from 0 to 1) representing the average wildfire risk in an area. To compute the potential wildfire risk, the sub-models are combined using a multi-criteria approach, and the model results are validated against the area under the Receiver Operating Characteristic curve.
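
    The fuzzy-membership and multi-criteria combination steps can be sketched as below; the breakpoints, weights, and sub-model scores are placeholders rather than the study's data-driven functions.

    # Sketch of the fuzzy-logic and multi-criteria steps: raw sub-model values are
    # mapped to [0, 1] memberships with simple linear functions and combined with
    # weights into a single weekly risk score. Breakpoints, weights and cell values
    # are placeholders, not the study's data-driven functions.
    import numpy as np

    def linear_membership(x, lo, hi):
        """0 below lo, 1 above hi, linear in between."""
        return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

    slope_deg, fuel_t_ha, dryness_index = 18.0, 25.0, 0.7   # example cell values

    memberships = {
        "topography": linear_membership(slope_deg, 5.0, 35.0),
        "fuel": linear_membership(fuel_t_ha, 5.0, 40.0),
        "climate": linear_membership(dryness_index, 0.2, 0.9),
    }
    weights = {"topography": 0.25, "fuel": 0.35, "climate": 0.40}

    risk = sum(weights[k] * memberships[k] for k in memberships)
    print(f"weekly wildfire risk score: {risk:.2f}")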

  19. An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework.

    Science.gov (United States)

    Kandiah, Venu; Binder, Andrew R; Berglund, Emily Z

    2017-10-01

    Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the "risk publics" model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community-level parameters-including social groups, relationships, and communication variables, also from survey data-are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks. © 2017 Society for Risk Analysis.

  20. Knowledge-Based Energy Damage Model for Evaluating Industrialised Building Systems (IBS) Occupational Health and Safety (OHS) Risk

    Directory of Open Access Journals (Sweden)

    Abas Nor Haslinda

    2016-01-01

    Full Text Available Malaysia's construction industry has long been considered hazardous, owing to its poor health and safety record. It is proposed that one way to improve safety and health in the construction industry is through the implementation of 'off-site' systems, commonly termed 'industrialised building systems' (IBS) in Malaysia. These are deemed safer based on the risk concept of reduced exposure, brought about by the reduction in on-site workers; however, no method yet exists for determining the relative safety of various construction methods, including IBS. This study presents a comparative evaluation of the occupational health and safety (OHS) risk presented by different construction approaches, namely IBS and traditional methods. The evaluation involved developing a model based on the concept of 'argumentation theory', which helps construction designers integrate the management of OHS risk into the design process. In addition, an 'energy damage model' was used as an underpinning framework. Development of the model was achieved through three phases, namely Phase I – knowledge acquisition, Phase II – argument-tree mapping, and Phase III – validation of the model. The research revealed that different approaches/methods of construction projects carry different levels of energy damage, depending on how the activities are carried out. A study of the way in which the risks change from one construction process to another shows that there is a difference in the OHS risk profile between IBS construction and traditional methods. Therefore, whether the option is an IBS or a traditional approach, the fundamental idea of the model is to motivate construction designers or decision-makers to address safety in the design process and to encourage them to examine carefully the probable OHS risk variables surrounding an action, thus preventing accidents in construction.

  1. Area-based assessment of extinction risk.

    Science.gov (United States)

    Hei, Fangliang

    2012-05-01

    Underpinning the International Union for Conservation of Nature (IUCN) Red List is the assessment of extinction risk as determined by the size and degree of loss of populations. The IUCN system lists a species as Critically Endangered, Endangered, or Vulnerable if its population size declines 80%, 50%, or 30% within a given time frame. However, effective implementation of the system faces substantial challenges and uncertainty because geographic scale data on population size and long-term dynamics are scarce. I develop a model to quantify extinction risk using a measure based on a species' distribution, a much more readily obtained quantity. The model calculates the loss of the area of occupancy that is equivalent to the loss of a given proportion of a population. It is a very simple yet general model that has no free parameters and is independent of scale. The model predicted well the distributions of 302 tree species at a local scale and the distributions of 348 species of North American land birds. This area-based model provides a solution to the long-standing problem for IUCN assessments of lack of data on population sizes, and thus it will contribute to facilitating the quantification of extinction risk worldwide.

  2. Estimating radiation-induced cancer risk using MVK two-stage model for carcinogenesis

    International Nuclear Information System (INIS)

    Kai, M.; Kusama, T.; Aoki, Y.

    1993-01-01

    Based on the carcinogenesis model proposed by Moolgavkar et al., time-dependent relative risk models were derived for projecting the time variation in excess relative risk. If it is assumed that each process is described by a time-independent linear dose-response relationship, the time variation in excess relative risk is influenced by the parameter related to the promotion process. A risk model based on carcinogenesis theory would play a marked role in estimating radiation-induced cancer risk when constructing a projection model or transfer model.

  3. A Graphical Adversarial Risk Analysis Model for Oil and Gas Drilling Cybersecurity

    OpenAIRE

    Vieira, Aitor Couce; Houmb, Siv Hilde; Insua, David Rios

    2014-01-01

    Oil and gas drilling is based, increasingly, on operational technology, whose cybersecurity is complicated by several challenges. We propose a graphical model for cybersecurity risk assessment based on Adversarial Risk Analysis to face those challenges. We also provide an example of the model in the context of an offshore drilling rig. The proposed model provides a more formal and comprehensive analysis of risks, still using the standard business language based on decisions, risks, and value.

  4. Prototype Biology-Based Radiation Risk Module Project

    Science.gov (United States)

    Terrier, Douglas; Clayton, Ronald G.; Patel, Zarana; Hu, Shaowen; Huff, Janice

    2015-01-01

    Biological effects of space radiation and risk mitigation are strategic knowledge gaps for the Evolvable Mars Campaign. The current epidemiology-based NASA Space Cancer Risk (NSCR) model contains large uncertainties (HAT #6.5a) due to lack of information on the radiobiology of galactic cosmic rays (GCR) and lack of human data. The use of experimental models that most accurately replicate the response of human tissues is critical for precision in risk projections. Our proposed study will compare DNA damage, histological, and cell kinetic parameters after irradiation in normal 2D human cells versus 3D tissue models, and it will use a multi-scale computational model (CHASTE) to investigate various biological processes that may contribute to carcinogenesis, including radiation-induced cellular signaling pathways. This cross-disciplinary work, with biological validation of an evolvable mathematical computational model, will help reduce uncertainties within NSCR and aid risk mitigation for radiation-induced carcinogenesis.

  5. The risk management of perishable supply chain based on coloured Petri Net modeling

    Directory of Open Access Journals (Sweden)

    Lu Liu

    2018-03-01

    Full Text Available The supply chain of perishable products combines information organization, sharing and integration. An information model of the supply chain is constructed to abstract key quality information, including environmental conditions, processing procedures and product quality assessments, based on quality safety factors and the product's decay rate. A coloured Petri Net is applied to give an integrated description of the classified information, with the aim of supporting risk identification and a risk management framework. Risk grading and a decision-making system are then established according to the quality deterioration tendency. As a practical case, the circulation system of aquatic products is studied to describe the full process. Simulation experiments on environmental, processing and product quality information are carried out with the coloured Petri Net. The results show that the coloured Petri Net is suitable for information classification and transmission, and that integrated information management enables efficient risk identification and decision-making in the supply chain of perishable products. The approach also supports evaluation of management measures and shelf-life estimation of perishable products.

  6. Development of a diagnosis- and procedure-based risk model for 30-day outcome after pediatric cardiac surgery.

    Science.gov (United States)

    Crowe, Sonya; Brown, Kate L; Pagel, Christina; Muthialu, Nagarajan; Cunningham, David; Gibbs, John; Bull, Catherine; Franklin, Rodney; Utley, Martin; Tsang, Victor T

    2013-05-01

    The study objective was to develop a risk model incorporating diagnostic information to adjust for case-mix severity during routine monitoring of outcomes for pediatric cardiac surgery. Data from the Central Cardiac Audit Database for all pediatric cardiac surgery procedures performed in the United Kingdom between 2000 and 2010 were included: 70% for model development and 30% for validation. Units of analysis were 30-day episodes after the first surgical procedure. We used logistic regression for 30-day mortality. Risk factors considered included procedural information based on Central Cardiac Audit Database "specific procedures," diagnostic information defined by 24 "primary" cardiac diagnoses and "univentricular" status, and other patient characteristics. Of the 27,140 30-day episodes in the development set, 25,613 were survivals, 834 were deaths, and 693 were of unknown status (mortality, 3.2%). The risk model includes procedure, cardiac diagnosis, univentricular status, age band (neonate, infant, child), continuous age, continuous weight, presence of non-Down syndrome comorbidity, bypass, and year of operation 2007 or later (because of decreasing mortality). A risk score was calculated for 95% of cases in the validation set (weight missing in 5%). The model discriminated well; the C-index for validation set was 0.77 (0.81 for post-2007 data). Removal of all but procedural information gave a reduced C-index of 0.72. The model performed well across the spectrum of predicted risk, but there was evidence of underestimation of mortality risk in neonates undergoing operation from 2007. The risk model performs well. Diagnostic information added useful discriminatory power. A future application is risk adjustment during routine monitoring of outcomes in the United Kingdom to assist quality assurance. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
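
    A minimal sketch of the workflow described above: fit a logistic regression risk model on a development set and check discrimination on a held-out validation set via the C-index, which for a binary outcome equals the area under the ROC curve. The data here are synthetic and the predictors (age, weight, bypass) are only illustrative stand-ins for the registry variables.

```python
# Illustrative only: synthetic data standing in for registry records.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
age_years = rng.uniform(0, 16, n)                   # continuous age
weight_kg = 3 + 3.5 * age_years + rng.normal(0, 2, n)
bypass = rng.integers(0, 2, n)                      # cardiopulmonary bypass used

# Synthetic 30-day mortality, with higher risk for younger patients and bypass cases
logit = -3.5 - 0.15 * age_years + 0.6 * bypass
death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([age_years, weight_kg, bypass])
# 70% development / 30% validation split, mirroring the study design
X_dev, X_val, y_dev, y_val = train_test_split(X, death, test_size=0.3, random_state=1)

model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)
risk_val = model.predict_proba(X_val)[:, 1]

# For a binary outcome the C-index equals the ROC AUC
print("C-index on validation set:", round(roc_auc_score(y_val, risk_val), 3))
```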

  7. A risk-based model for maintenance decision support of civil structures using RAMS

    NARCIS (Netherlands)

    Viana Da Rocha, T. C.; Stipanovic, I.; Hartmann, A.; Bakker, J.

    2017-01-01

    As a cornerstone of transportation asset management, risk-based approaches have been used to support maintenance decisions for civil structures. However, ambiguous and subjective risk criteria and inconsistency in the use of risk-based approaches can lead to a fuzzy understanding of the risks

  8. A risk-based microbiological criterion that uses the relative risk as the critical limit

    DEFF Research Database (Denmark)

    Andersen, Jens Kirk; Nørrung, Birgit; da Costa Alves Machado, Simone

    2015-01-01

    A risk-based microbiological criterion is described, that is based on the relative risk associated to the analytical result of a number of samples taken from a food lot. The acceptable limit is a specific level of risk and not a specific number of microorganisms, as in other microbiological...... criteria. The approach requires the availability of a quantitative microbiological risk assessment model to get risk estimates for food products from sampled food lots. By relating these food lot risk estimates to the mean risk estimate associated to a representative baseline data set, a relative risk...... estimate can be obtained. This relative risk estimate then can be compared with a critical value, defined by the criterion. This microbiological criterion based on a relative risk limit is particularly useful when quantitative enumeration data are available and when the prevalence of the microorganism...
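
    A toy numerical sketch of the relative-risk idea: a QMRA-style dose-response model (here a generic exponential model, purely as a stand-in) converts enumeration results from a sampled lot into a lot-level risk estimate, which is divided by the mean risk of a baseline data set and compared with a critical relative-risk limit. All numbers and the dose-response parameter are hypothetical.

```python
# Hypothetical numbers throughout; the exponential dose-response is a generic stand-in.
import numpy as np

def infection_risk(concentration_cfu_per_g, serving_g=25.0, r=1e-4):
    """Generic exponential dose-response: P(ill) = 1 - exp(-r * dose)."""
    dose = concentration_cfu_per_g * serving_g
    return 1.0 - np.exp(-r * dose)

# Enumeration results (CFU/g) from samples taken from the food lot under assessment
lot_samples = np.array([12.0, 35.0, 8.0, 50.0, 20.0])
lot_risk = infection_risk(lot_samples).mean()

# Mean risk from a representative baseline data set
baseline_samples = np.array([10.0, 15.0, 25.0, 30.0, 18.0, 22.0])
baseline_risk = infection_risk(baseline_samples).mean()

relative_risk = lot_risk / baseline_risk
critical_limit = 2.0   # acceptable relative risk defined by the criterion (hypothetical)

print(f"Relative risk of lot: {relative_risk:.2f}")
print("Lot acceptable" if relative_risk <= critical_limit else "Lot rejected")
```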

  9. A Graphical Adversarial Risk Analysis Model for Oil and Gas Drilling Cybersecurity

    Directory of Open Access Journals (Sweden)

    Aitor Couce Vieira

    2014-04-01

    Full Text Available Oil and gas drilling is based, increasingly, on operational technology, whose cybersecurity is complicated by several challenges. We propose a graphical model for cybersecurity risk assessment based on Adversarial Risk Analysis to face those challenges. We also provide an example of the model in the context of an offshore drilling rig. The proposed model provides a more formal and comprehensive analysis of risks, still using the standard business language based on decisions, risks, and value.

  10. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    Full Text Available The article deals with the problem of assessing information security risks in ERP-systems. ERP-system functions and architecture are studied. A model of malicious impacts on the levels of the ERP-system architecture is composed. A model-based risk assessment, combining quantitative and qualitative approaches, is developed, built on the partial unification of three methods for studying information security risks: security models with full overlapping, the CRAMM technique and the FRAP technique.

  11. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on the field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  12. Predictive Accuracy of the PanCan Lung Cancer Risk Prediction Model -External Validation based on CT from the Danish Lung Cancer Screening Trial

    DEFF Research Database (Denmark)

    Winkler Wille, Mathilde M.; van Riel, Sarah J.; Saghir, Zaigham

    2015-01-01

    Objectives: Lung cancer risk models should be externally validated to test generalizability and clinical usefulness. The Danish Lung Cancer Screening Trial (DLCST) is a population-based prospective cohort study, used to assess the discriminative performances of the PanCan models. Methods: From...... the DLCST database, 1,152 nodules from 718 participants were included. Parsimonious and full PanCan risk prediction models were applied to DLCST data, and also coefficients of the model were recalculated using DLCST data. Receiver operating characteristics (ROC) curves and area under the curve (AUC) were...... used to evaluate risk discrimination. Results: AUCs of 0.826–0.870 were found for DLCST data based on PanCan risk prediction models. In the DLCST, age and family history were significant predictors (p = 0.001 and p = 0.013). Female sex was not confirmed to be associated with higher risk of lung cancer...

  13. A geographical information system-based web model of arbovirus transmission risk in the continental United States of America

    Directory of Open Access Journals (Sweden)

    Sarah K. Konrad

    2012-11-01

    Full Text Available A degree-day (DD) model of West Nile virus capable of forecasting real-time transmission risk in the continental United States of America up to one week in advance using a 50-km grid is available online at https://sites.google.com/site/arbovirusmap/. Daily averages of historical risk based on temperatures for 1994-2003 are available at 10-km resolution. Transmission risk maps can be downloaded from 2010 to the present. The model can be adapted to work with any arbovirus for which the temperature-related parameters are known, e.g. Rift Valley fever virus. To more effectively assess virus establishment and transmission, the model incorporates “compound risk” maps and forecasts, which include livestock density as a parameter.
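
    The degree-day accumulation underlying such a model can be sketched in a few lines: daily mean temperatures above a developmental threshold contribute their excess to a running total, and risk is flagged once the total needed for the extrinsic incubation period is reached. The threshold and target values below are placeholders, not the parameters used by the published model.

```python
# Placeholder parameters; the published model's values are not reproduced here.
def accumulated_degree_days(daily_mean_temps_c, base_temp_c=14.0):
    """Sum of daily exceedances of the developmental threshold (simple DD method)."""
    return sum(max(0.0, t - base_temp_c) for t in daily_mean_temps_c)

# One week of forecast daily mean temperatures (deg C) for a grid cell
forecast = [18.2, 21.5, 24.0, 26.3, 25.1, 22.8, 19.9]

dd = accumulated_degree_days(forecast)
dd_required = 109.0   # hypothetical degree-day total needed for incubation in the vector

print(f"Accumulated degree-days this week: {dd:.1f}")
print("Transmission risk elevated" if dd >= dd_required else "Transmission risk low")
```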

  14. Calibration plots for risk prediction models in the presence of competing risks.

    Science.gov (United States)

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-08-15

    A predicted risk of 17% can be called reliable if it can be expected that the event will occur to about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks such as death due to other causes. For personalized medicine and patient counseling, it is necessary to check that the model is calibrated in the sense that it provides reliable predictions for all subjects. There are three often encountered practical problems when the aim is to display or test if a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with these problems, we propose to estimate calibration curves for competing risks models based on jackknife pseudo-values that are combined with a nearest neighborhood smoother and a cross-validation approach to deal with all three problems. Copyright © 2014 John Wiley & Sons, Ltd.
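
    A simplified, uncensored illustration of the jackknife pseudo-value construction mentioned above: each subject's pseudo-value is n times the full-sample estimate minus (n-1) times the leave-one-out estimate, and the resulting values can be regressed (with smoothing) on predicted risks to draw a calibration curve. A real competing-risks analysis would use the Aalen-Johansen estimator to handle right censoring; the simple proportion used here is only a stand-in.

```python
# Uncensored toy example; real applications replace cuminc() with an Aalen-Johansen estimate.
import numpy as np

rng = np.random.default_rng(2)
n = 200
event_time = rng.exponential(10.0, n)
cause = rng.choice([1, 2], n, p=[0.6, 0.4])   # 1 = event of interest, 2 = competing event
t_star = 5.0                                   # evaluation horizon

def cuminc(times, causes, t):
    """Cumulative incidence of cause 1 by time t (no censoring in this toy example)."""
    return np.mean((times <= t) & (causes == 1))

theta_full = cuminc(event_time, cause, t_star)

# Jackknife pseudo-value for subject i: n*theta_hat - (n-1)*theta_hat_(-i)
pseudo = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    theta_minus_i = cuminc(event_time[mask], cause[mask], t_star)
    pseudo[i] = n * theta_full - (n - 1) * theta_minus_i

# The pseudo-values average back to the full-sample estimate
print("Full-sample cumulative incidence:", round(theta_full, 3))
print("Mean of pseudo-values:          ", round(pseudo.mean(), 3))
```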

  15. Risk-Based Operation and Maintenance of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Nielsen, Jannie Sønderkær

    to oil and gas structures. In addition, condition monitoring systems are often available, and the information should be taken into account when making decisions. In this thesis, methods for risk-based maintenance planning using Bayesian methods are investigated, with the aim of making optimal decisions......, but presently maintenance is not planned using advanced methods taking all available information into account in a consistent manner. Maintenance decisions can be made based on risk-based methods, where the total expected life cycle costs are minimized. Methods have been developed for assessing the corrective...... considering all available information. First, a theoretical damage model is formulated, the model is then updated using condition monitoring data, and the updated model is used as basis for risk-based decision making. Several approaches for solving the decision problems have been considered: various types...

  16. Mode of action based risk assessment of the botanical food-borne alkenylbenzene apiol from parsley using physiologically based kinetic (PBK) modelling and read-across from safrole

    NARCIS (Netherlands)

    Alajlouni, A.M.; Al-Malahmeh, A.J.; Kiwamoto, Reiko; Wesseling, Sebastiaan; Soffers, A.E.M.F.; Al-Subeihi, A.A.A.; Vervoort, Jacques; Rietjens, I.M.C.M.

    2016-01-01

    The present study developed physiologically-based kinetic (PBK) models for the alkenylbenzene apiol in order to facilitate risk assessment based on read-across from the related alkenylbenzene safrole. Model predictions indicate that in rat liver the formation of the 1'-sulfoxy metabolite is about

  17. Risk management model of winter navigation operations

    International Nuclear Information System (INIS)

    Valdez Banda, Osiris A.; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-01-01

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish–Swedish Winter Navigation System. This establishes the requirements and limitations for the vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps provided in the Formal Safety Assessment (FSA) by the International Maritime Organization (IMO) and adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are most probable, while major oil spills were found to be very unlikely but possible. - Highlights: •A model to assess and manage the risk of winter navigation operations is proposed. •The risks of oil spills in winter navigation in the Gulf of Finland are analysed. •The model assesses and prioritizes actions to control the risk of the operations. •The model suggests navigational training as the most efficient risk control option.

  18. An in-depth assessment of a diagnosis-based risk adjustment model based on national health insurance claims: the application of the Johns Hopkins Adjusted Clinical Group case-mix system in Taiwan.

    Science.gov (United States)

    Chang, Hsien-Yen; Weiner, Jonathan P

    2010-01-18

    Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Group) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Cluster). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. Given the widespread availability of claims data and the superior explanatory

  19. Uncertainty and sensitivity analysis of flood risk management decisions based on stationary and nonstationary model choices

    Directory of Open Access Journals (Sweden)

    Rehan Balqis M.

    2016-01-01

    Full Text Available Current practice in flood frequency analysis assumes that the stochastic properties of extreme floods follow those of stationary conditions. As human intervention and anthropogenic climate change influences on hydrometeorological variables are becoming evident in some places, there have been suggestions that nonstationary statistics would better represent the stochastic properties of extreme floods. The probabilistic estimation of non-stationary models, however, is surrounded by uncertainty related to the scarcity of observations and modelling complexities, and hence the difficulty of projecting future conditions. In the face of an uncertain future and the subjectivity of model choices, this study attempts to demonstrate the practical implications of applying a nonstationary model and compares it with a stationary model in flood risk assessment. A fully integrated framework to simulate decision makers’ behaviour in flood frequency analysis is thereby developed. The framework is applied to hypothetical flood risk management decisions and the outcomes are compared with those of known underlying future conditions. Uncertainty of the economic performance of the risk-based decisions is assessed through Monte Carlo simulations. Sensitivity of the results is also tested by varying the possible magnitude of future changes. The application provides quantitative and qualitative comparative results that satisfy a preliminary analysis of whether the nonstationary model complexity should be applied to improve the economic performance of decisions. Results obtained from the case study show that the relative differences between competing models for all considered possible future changes are small, suggesting that stationary assumptions are preferred over a shift to nonstationary statistics for practical application of flood risk management. Nevertheless, the nonstationary assumption should also be considered during the planning stage in addition to the stationary assumption

  20. Use of a risk-based hydrogeologic model to set remedial goals in a Puget Sound basin watershed

    International Nuclear Information System (INIS)

    Pascoe, G.; Gould, L.; Martin, J.; Riley, M.; Floyd, T.

    1995-01-01

    The Port of Seattle is redeveloping industrial land for a container terminal along the southwest Seattle waterfront. Concrete, asphalt, ballast, and a landfill geomembrane will cover the site and prevent direct contact with surface soils, so remedial goals focused on groundwater contamination from subsurface soils. Groundwater at the site flows along an old stormwater drain, in a filled estuary of a small creek, to Elliott Bay. Remedial goals for a variety of organic chemicals, metals, and TPH in subsurface soils were identified to protect marine receptors in the bay and their consumers. Washington State and federal marine water quality criteria were the starting points in the risk-based model, and corresponding concentrations of chemicals in groundwater were back-calculated through a hydrogeologic model. The hydrogeologic model included a mixing zone component in the bay and dilution/attenuation factors along the groundwater transport pathway that were determined from onsite groundwater and surface water chemical concentrations. A rearranged Summers equation was then applied in a second back-calculation to determine subsurface soil concentrations corresponding to the back calculated groundwater concentrations. The equation was based on calculated aquifer flow rates for the small creek watershed and rates of infiltration through surface materials calculated for each redevelopment soil cover type by the HELP model. Results of the risk-based hydrogeologic back-calculation model indicate that, depending on soil cover type at the site, concentrations in subsurface soils of PCBs from 2 to 1,000 mg/kg and of TPH up to free phase concentration would not result in risks to marine organisms or their consumers in Elliott Bay

  1. Developing a suitable model for supplier selection based on supply chain risks: an empirical study from Iranian pharmaceutical companies.

    Science.gov (United States)

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between the development of new products and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies in scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the factors that influence supplier selection and their priorities. This paper is based on the results of a literature review, expert opinion acquisition, statistical analysis and MADM models applied to data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined; based on these, a framework is constructed; then, according to the results of the statistical analysis and MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry.

  2. Developing a Suitable Model for Supplier Selection Based on Supply Chain Risks: An Empirical Study from Iranian Pharmaceutical Companies

    Science.gov (United States)

    Mehralian, Gholamhossein; Rajabzadeh Gatari, Ali; Morakabati, Mohadese; Vatanpour, Hossein

    2012-01-01

    The supply chain represents the critical link between the development of new products and the market in the pharmaceutical industry. Over the years, improvements made in supply chain operations have focused largely on ways to reduce cost and gain efficiencies in scale. In addition, powerful regulatory and market forces have provided new incentives for pharmaceutical firms to fundamentally rethink the way they produce and distribute products, and also to re-imagine the role of the supply chain in driving strategic growth, brand differentiation and economic value in the health continuum. The purpose of this paper is to formulate the basic factors involved in risk analysis of the pharmaceutical industry, and also to determine the factors that influence supplier selection and their priorities. This paper is based on the results of a literature review, expert opinion acquisition, statistical analysis and MADM models applied to data gathered from distributed questionnaires. The model consists of the following steps and components: first, the factors involved in supply chain risks are determined; based on these, a framework is constructed; then, according to the results of the statistical analysis and MADM models, the risk factors are formulated. The paper determines the main components and influential factors involved in supply chain risks. Results showed that delivery risk can make an important contribution to mitigating the risk of the pharmaceutical industry. PMID:24250442

  3. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection...... can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate...... the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  4. A risk explicit interval linear programming model for uncertainty-based environmental economic optimization in the Lake Fuxian watershed, China.

    Science.gov (United States)

    Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict of water environment protection and economic development has brought severe water pollution and restricted the sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of optimization schemes. Decision makers' preferences for risk levels can be expressed through inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. Through balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.

  5. A Risk Explicit Interval Linear Programming Model for Uncertainty-Based Environmental Economic Optimization in the Lake Fuxian Watershed, China

    Directory of Open Access Journals (Sweden)

    Xiaoling Zhang

    2013-01-01

    Full Text Available The conflict of water environment protection and economic development has brought severe water pollution and restricted the sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve the integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of optimization schemes. Decision makers’ preferences for risk levels can be expressed through inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. Through balancing the optimal system returns and corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of “low risk and high return efficiency” in the trade-off curve. The representative schemes at the turning points of two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.

  6. A stochastic multicriteria model for evidence-based decision making in drug benefit-risk analysis.

    Science.gov (United States)

    Tervonen, Tommi; van Valkenhoef, Gert; Buskens, Erik; Hillege, Hans L; Postmus, Douwe

    2011-05-30

    Drug benefit-risk (BR) analysis is based on firm clinical evidence regarding various safety and efficacy outcomes. In this paper, we propose a new and more formal approach for constructing a supporting multi-criteria model that fully takes into account the evidence on efficacy and adverse drug reactions. Our approach is based on the stochastic multi-criteria acceptability analysis methodology, which allows us to compute the typical value judgments that support a decision, to quantify decision uncertainty, and to compute a comprehensive BR profile. We construct a multi-criteria model for the therapeutic group of second-generation antidepressants. We assess fluoxetine and venlafaxine together with placebo according to incidence of treatment response and three common adverse drug reactions by using data from a published study. Our model shows that there are clear trade-offs among the treatment alternatives. Copyright © 2011 John Wiley & Sons, Ltd.
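
    A compact sketch of the stochastic multi-criteria acceptability analysis idea: criterion values carry sampling uncertainty, preference weights are left unspecified and drawn uniformly from the weight simplex, and each alternative's rank-acceptability index is the share of draws in which it comes out best. The alternatives, criteria and numbers below are invented for illustration and are not taken from the study.

```python
# Invented alternatives and criterion values, purely to illustrate the SMAA mechanics.
import numpy as np

rng = np.random.default_rng(3)
alternatives = ["drug A", "drug B", "placebo"]

# Mean criterion values per alternative (benefit first, then three adverse reactions,
# expressed so that higher is always better, e.g. 1 - incidence for harms).
means = np.array([[0.62, 0.80, 0.85, 0.90],
                  [0.65, 0.75, 0.80, 0.88],
                  [0.35, 0.95, 0.97, 0.98]])
ses = np.full_like(means, 0.03)   # standard errors of the criterion measurements

n_draws = 20000
first_rank_count = np.zeros(len(alternatives))

for _ in range(n_draws):
    values = rng.normal(means, ses)                    # sample uncertain measurements
    weights = rng.dirichlet(np.ones(means.shape[1]))   # uniform draw from the weight simplex
    scores = values @ weights                          # additive value model
    first_rank_count[np.argmax(scores)] += 1

for name, count in zip(alternatives, first_rank_count):
    print(f"{name}: first-rank acceptability = {count / n_draws:.2%}")
```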

  7. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    Science.gov (United States)

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  8. Model-based mitigation of availability risks

    NARCIS (Netherlands)

    Zambon, E.; Bolzoni, D.; Etalle, S.; Salvato, M.

    2007-01-01

    The assessment and mitigation of risks related to the availability of the IT infrastructure is becoming increasingly important in modern organizations. Unfortunately, present standards for risk assessment and mitigation show limitations when evaluating and mitigating availability risks. This is due

  9. Model-Based Mitigation of Availability Risks

    NARCIS (Netherlands)

    Zambon, Emmanuele; Bolzoni, D.; Etalle, Sandro; Salvato, Marco

    2007-01-01

    The assessment and mitigation of risks related to the availability of the IT infrastructure is becoming increasingly important in modern organizations. Unfortunately, present standards for Risk Assessment and Mitigation show limitations when evaluating and mitigating availability risks. This is due

  10. Risk analysis of urban gas pipeline network based on improved bow-tie model

    Science.gov (United States)

    Hao, M. J.; You, Q. J.; Yue, Z.

    2017-11-01

    The gas pipeline network is a major hazard source in urban areas, and an accident could have grave consequences. In order to understand more clearly the causes and consequences of gas pipeline network accidents, and to develop prevention and mitigation measures, the author puts forward the application of an improved bow-tie model to analyze the risks of urban gas pipeline networks. The improved bow-tie model analyzes accident causes from four aspects: human, materials, environment and management; it also analyzes the consequences from four aspects: casualty, property loss, environment and society. It then quantifies the causes and consequences. Risk identification, risk analysis, risk assessment, risk control, and risk management are clearly shown in the model figures. Prevention and mitigation measures can then be suggested accordingly to help reduce the accident rate of the gas pipeline network. The results show that the whole process of an accident can be visually investigated using the bow-tie model, which can also provide reasons for, and predict the consequences of, an unfortunate event. It is of great significance for analyzing leakage failures of gas pipeline networks.
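
    A minimal numeric sketch of how a quantified bow-tie can be read: basic causes on the left are combined through OR/AND gates to give the top-event probability, and consequence branch probabilities on the right split that probability into outcome likelihoods. The cause names, gate structure and probabilities below are hypothetical, not taken from the paper.

```python
# Hypothetical probabilities; the gate structure is a generic bow-tie illustration.
from math import prod

def or_gate(ps):
    """Probability that at least one independent cause occurs."""
    return 1.0 - prod(1.0 - p for p in ps)

def and_gate(ps):
    """Probability that all independent causes occur together."""
    return prod(ps)

# Left side: causes grouped by human / material / environment / management factors
human = or_gate([0.02, 0.01])          # e.g. unauthorized excavation, operating error
material = or_gate([0.005, 0.003])     # e.g. corroded pipe, faulty valve
environment = 0.004                    # e.g. ground movement
management = 0.008                     # e.g. missed patrol inspection

p_top = or_gate([human, material, environment, management])   # leak (top event)

# Right side: conditional probabilities of consequences given the top event
consequences = {"no ignition / minor loss": 0.80,
                "property damage": 0.15,
                "casualties": 0.05}

print(f"P(top event: gas leak) = {p_top:.4f}")
for outcome, p_cond in consequences.items():
    print(f"P({outcome}) = {p_top * p_cond:.5f}")
```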

  11. Adoption of Building Information Modelling in project planning risk management

    Science.gov (United States)

    Mering, M. M.; Aminudin, E.; Chai, C. S.; Zakaria, R.; Tan, C. S.; Lee, Y. Y.; Redzuan, A. A.

    2017-11-01

    Efficient and effective risk management requires a systematic and proper methodology in addition to knowledge and experience. However, if risk management is not addressed from the start of the project, the task becomes considerably more complicated and less efficient. This paper presents the adoption of Building Information Modelling (BIM) in project planning risk management. The objectives are to identify traditional risk management practices and their functions, to determine the most useful functions of BIM in risk management, and to investigate the efficiency of adopting BIM-based risk management during the project planning phase. A quantitative approach was adopted to obtain data. Based on the data analysis, the risks that occur when traditional risk management is implemented are a lack of compliance with project requirements and a failure to recognise risks and develop responses to opportunities. When BIM is used in project planning, the tracking of cost control and cash flow helps the project cycle to be completed on time, and 5D cost estimation and cash flow modelling benefit risk management in planning, controlling and managing budget and cost reasonably. Two factors benefited most from BIM-based technology: the formwork plan with an integrated fall plan, and the design-for-safety model check. By adopting risk management, the potential risks linked to a project can be identified and responses developed to reduce them to an acceptable extent; this means recognising potential risks and avoiding threats by reducing their negative effects. BIM-based risk management can enhance the planning process of construction projects and benefits construction players in various aspects. It is important to understand the application of BIM-based risk management, as it can serve as a lesson learnt for others implementing BIM and can increase the quality of the project.

  12. Risk-based safety indicators

    International Nuclear Information System (INIS)

    Szikszai, T.

    1997-01-01

    The presentation discusses the following issues: The objectives of the risk-based indicator programme. The characteristics of the risk-based indicators. The objectives of risk-based safety indicators - in monitoring safety; in PSA applications. What indicators? How to produce the risk-based indicators? PSA requirements

  13. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    International Nuclear Information System (INIS)

    Qiu, Zeyang; Liang, Wei; Lin, Yang; Zhang, Meng; Wang, Xue

    2017-01-01

    As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in case of accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage in order to reduce the number of pipeline operation accidents. Because third party damage accidents are characterized by diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, the risk sources of third party damage are identified precisely; the weights of the factors are then determined via an improved AHP, and finally the importance of each factor is calculated with a fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor. (paper)
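
    A short sketch of the two-stage calculation described: factor weights come from the principal eigenvector of a pairwise-comparison matrix (AHP, with a consistency check), and the weight vector is then combined with a fuzzy membership matrix to give the comprehensive evaluation vector (FCE). The comparison judgements, membership grades and risk levels below are invented for illustration.

```python
# Invented judgement matrix and membership grades, to show the AHP + FCE mechanics only.
import numpy as np

# AHP pairwise comparison of four third-party-damage factors (hypothetical judgements)
A = np.array([[1.0, 3.0, 2.0, 5.0],
              [1/3, 1.0, 1/2, 2.0],
              [1/2, 2.0, 1.0, 3.0],
              [1/5, 1/2, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()                      # normalized factor weights

# Consistency ratio check (random index RI = 0.90 for a 4x4 matrix)
ci = (eigvals.real[principal] - len(A)) / (len(A) - 1)
cr = ci / 0.90

# FCE: membership of each factor in risk grades (low, medium, high) - hypothetical
R = np.array([[0.1, 0.4, 0.5],
              [0.3, 0.5, 0.2],
              [0.2, 0.4, 0.4],
              [0.5, 0.4, 0.1]])

B = weights @ R                               # comprehensive evaluation vector
grades = ["low", "medium", "high"]

print("Factor weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
print("Evaluation vector:", np.round(B, 3), "-> overall risk grade:", grades[int(np.argmax(B))])
```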

  14. Predictive accuracy of the PanCan lung cancer risk prediction model - external validation based on CT from the Danish Lung Cancer Screening Trial

    International Nuclear Information System (INIS)

    Winkler Wille, Mathilde M.; Dirksen, Asger; Riel, Sarah J. van; Jacobs, Colin; Scholten, Ernst T.; Ginneken, Bram van; Saghir, Zaigham; Pedersen, Jesper Holst; Hohwue Thomsen, Laura; Skovgaard, Lene T.

    2015-01-01

    Lung cancer risk models should be externally validated to test generalizability and clinical usefulness. The Danish Lung Cancer Screening Trial (DLCST) is a population-based prospective cohort study, used to assess the discriminative performances of the PanCan models. From the DLCST database, 1,152 nodules from 718 participants were included. Parsimonious and full PanCan risk prediction models were applied to DLCST data, and also coefficients of the model were recalculated using DLCST data. Receiver operating characteristics (ROC) curves and area under the curve (AUC) were used to evaluate risk discrimination. AUCs of 0.826-0.870 were found for DLCST data based on PanCan risk prediction models. In the DLCST, age and family history were significant predictors (p = 0.001 and p = 0.013). Female sex was not confirmed to be associated with higher risk of lung cancer; in fact opposing effects of sex were observed in the two cohorts. Thus, female sex appeared to lower the risk (p = 0.047 and p = 0.040) in the DLCST. High risk discrimination was validated in the DLCST cohort, mainly determined by nodule size. Age and family history of lung cancer were significant predictors and could be included in the parsimonious model. Sex appears to be a less useful predictor. (orig.)

  15. Predictive accuracy of the PanCan lung cancer risk prediction model - external validation based on CT from the Danish Lung Cancer Screening Trial

    Energy Technology Data Exchange (ETDEWEB)

    Winkler Wille, Mathilde M.; Dirksen, Asger [Gentofte Hospital, Department of Respiratory Medicine, Hellerup (Denmark); Riel, Sarah J. van; Jacobs, Colin; Scholten, Ernst T.; Ginneken, Bram van [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Saghir, Zaigham [Herlev Hospital, Department of Respiratory Medicine, Herlev (Denmark); Pedersen, Jesper Holst [Copenhagen University Hospital, Department of Thoracic Surgery, Rigshospitalet, Koebenhavn Oe (Denmark); Hohwue Thomsen, Laura [Hvidovre Hospital, Department of Respiratory Medicine, Hvidovre (Denmark); Skovgaard, Lene T. [University of Copenhagen, Department of Biostatistics, Koebenhavn Oe (Denmark)

    2015-10-15

    Lung cancer risk models should be externally validated to test generalizability and clinical usefulness. The Danish Lung Cancer Screening Trial (DLCST) is a population-based prospective cohort study, used to assess the discriminative performances of the PanCan models. From the DLCST database, 1,152 nodules from 718 participants were included. Parsimonious and full PanCan risk prediction models were applied to DLCST data, and also coefficients of the model were recalculated using DLCST data. Receiver operating characteristics (ROC) curves and area under the curve (AUC) were used to evaluate risk discrimination. AUCs of 0.826-0.870 were found for DLCST data based on PanCan risk prediction models. In the DLCST, age and family history were significant predictors (p = 0.001 and p = 0.013). Female sex was not confirmed to be associated with higher risk of lung cancer; in fact opposing effects of sex were observed in the two cohorts. Thus, female sex appeared to lower the risk (p = 0.047 and p = 0.040) in the DLCST. High risk discrimination was validated in the DLCST cohort, mainly determined by nodule size. Age and family history of lung cancer were significant predictors and could be included in the parsimonious model. Sex appears to be a less useful predictor. (orig.)

  16. A multiparametric magnetic resonance imaging-based risk model to determine the risk of significant prostate cancer prior to biopsy.

    Science.gov (United States)

    van Leeuwen, Pim J; Hayen, Andrew; Thompson, James E; Moses, Daniel; Shnier, Ron; Böhm, Maret; Abuodha, Magdaline; Haynes, Anne-Maree; Ting, Francis; Barentsz, Jelle; Roobol, Monique; Vass, Justin; Rasiah, Krishan; Delprado, Warick; Stricker, Phillip D

    2017-12-01

    To develop and externally validate a predictive model for detection of significant prostate cancer. Development of the model was based on a prospective cohort including 393 men who underwent multiparametric magnetic resonance imaging (mpMRI) before biopsy. External validity of the model was then examined retrospectively in 198 men from a separate institution who underwent mpMRI followed by biopsy for abnormal prostate-specific antigen (PSA) level or digital rectal examination (DRE). A model was developed with age, PSA level, DRE, prostate volume, previous biopsy, and Prostate Imaging Reporting and Data System (PIRADS) score as predictors for significant prostate cancer (Gleason 7 with >5% grade 4, ≥20% cores positive or ≥7 mm of cancer in any core). Probability was studied via logistic regression. Discriminatory performance was quantified by concordance statistics and internally validated with bootstrap resampling. In all, 393 men had complete data and 149 (37.9%) had significant prostate cancer. While the clinical-variable model had good accuracy in predicting significant prostate cancer, with an area under the curve (AUC) of 0.80, the advanced model (incorporating mpMRI) had a significantly higher AUC of 0.88 (P prostate cancer. Individualised risk assessment of significant prostate cancer using a predictive model that incorporates mpMRI PIRADS score and clinical data allows a considerable reduction in unnecessary biopsies and reduction of the risk of over-detection of insignificant prostate cancer at the cost of a very small increase in the number of significant cancers missed. © 2017 The Authors BJU International © 2017 BJU International Published by John Wiley & Sons Ltd.

  17. Risk Evaluation of a UHV Power Transmission Construction Project Based on a Cloud Model and FCE Method for Sustainability

    Directory of Open Access Journals (Sweden)

    Huiru Zhao

    2015-03-01

    Full Text Available In order to achieve the sustainable development of energy, Ultra High Voltage (UHV) power transmission construction projects are currently being established in China. Their high-tech nature, the massive amount of money involved, the need for multi-agent collaboration, and complex construction environments bring many challenges and risks. Risk management, therefore, is critical to reduce the risks and realize the sustainable development of projects. Unfortunately, many traditional risk assessment methods may not perform well due to the great uncertainty and randomness inherent in UHV power construction projects. This paper, therefore, proposes a risk evaluation index system and a hybrid risk evaluation model to evaluate the risk of UHV projects and find the key risk factors. This model, based on a cloud model and the fuzzy comprehensive evaluation (FCE) method, combines the superiority of the cloud model in reflecting randomness and discreteness with the advantages of the fuzzy comprehensive evaluation method in handling uncertain and vague issues. To demonstrate our framework, an empirical study of the “Zhejiang-Fuzhou” UHV power transmission construction project is presented. As key contributions, we find the risk of this project lies at a “middle” to “high” level and closer to a “middle” level; the “management risk” and “social risk” are identified as the most important risk factors requiring more attention; and some risk control recommendations are proposed. This article demonstrates the value of our approach in risk identification, which seeks to improve the risk control level and the sustainable development of UHV power transmission construction projects.

  18. An in-depth assessment of a diagnosis-based risk adjustment model based on national health insurance claims: the application of the Johns Hopkins Adjusted Clinical Group case-mix system in Taiwan

    Directory of Open Access Journals (Sweden)

    Weiner Jonathan P

    2010-01-01

    Full Text Available Abstract Background Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. Methods A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. Results The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Group) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Cluster). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. Conclusions Given the

  19. Risk of the Maritime Supply Chain System Based on Interpretative Structural Model

    OpenAIRE

    Jiang He; Xiong Wei; Cao Yonghui

    2017-01-01

    Marine transportation is the most important transport mode in international trade, but the maritime supply chain faces many risks. At present, most research on maritime supply chain risk focuses on risk identification and risk management, and rarely carries out quantitative analysis of the logical structure of the influencing factors. This paper uses the interpretative structural model to analyse the maritime supply chain risk system. On the basis of com...
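
    A brief sketch of the core ISM computation referred to above: starting from a binary adjacency matrix of direct influences among risk factors, the reachability matrix is obtained by Boolean transitive closure, after which factors are partitioned into hierarchy levels. The factor names and influence matrix here are illustrative only; in practice they are elicited from experts.

```python
# Illustrative factors and influence matrix; real ISM studies elicit these from experts.
import numpy as np

factors = ["port congestion", "piracy", "weather", "documentation error", "cargo damage"]

# adjacency[i][j] = 1 if factor i directly influences factor j (hypothetical)
adjacency = np.array([[0, 0, 0, 0, 1],
                      [0, 0, 0, 0, 1],
                      [1, 0, 0, 0, 1],
                      [1, 0, 0, 0, 0],
                      [0, 0, 0, 0, 0]])

n = len(factors)
reach = ((adjacency + np.eye(n, dtype=int)) > 0).astype(int)

# Boolean transitive closure: square the matrix until it stops changing
while True:
    nxt = ((reach @ reach) > 0).astype(int)
    if np.array_equal(nxt, reach):
        break
    reach = nxt

# Level partition: a factor is top-level if its reachability set (within the remaining
# factors) is contained in its antecedent set
remaining = set(range(n))
level = 1
while remaining:
    top = {i for i in remaining
           if {j for j in remaining if reach[i, j]} <=
              {j for j in remaining if reach[j, i]}}
    print(f"Level {level}: {[factors[i] for i in sorted(top)]}")
    remaining -= top
    level += 1
```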

  20. Statistical and RBF NN models : providing forecasts and risk assessment

    OpenAIRE

    Marček, Milan

    2009-01-01

    Forecast accuracy of economic and financial processes is a popular measure for quantifying the risk in decision making. In this paper, we develop forecasting models based on statistical (stochastic) methods, sometimes called hard computing, and on a soft method using granular computing. We consider the accuracy of forecasting models as a measure for risk evaluation. It is found that the risk estimation process based on soft methods is simplified and less critical to the question w...

  1. Assessing surface water flood risk and management strategies under future climate change: Insights from an Agent-Based Model.

    Science.gov (United States)

    Jenkins, K; Surminski, S; Hall, J; Crick, F

    2017-10-01

    Climate change and increasing urbanization are projected to result in an increase in surface water flooding and consequential damages in the future. In this paper, we present insights from a novel Agent Based Model (ABM), applied to a London case study of surface water flood risk, designed to assess the interplay between different adaptation options; how risk reduction could be achieved by homeowners and government; and the role of flood insurance and the new flood insurance pool, Flood Re, in the context of climate change. The analysis highlights that while combined investment in property-level flood protection and sustainable urban drainage systems reduce surface water flood risk, the benefits can be outweighed by continued development in high risk areas and the effects of climate change. In our simulations, Flood Re is beneficial in its function to provide affordable insurance, even under climate change. However, the scheme does face increasing financial pressure due to rising surface water flood damages. If the intended transition to risk-based pricing is to take place then a determined and coordinated strategy will be needed to manage flood risk, which utilises insurance incentives, limits new development, and supports resilience measures. Our modelling approach and findings are highly relevant for the ongoing regulatory and political approval process for Flood Re as well as for wider discussions on the potential of insurance schemes to incentivise flood risk management and climate adaptation in the UK and internationally. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Dynamic occupational risk model for offshore operations in harsh environments

    International Nuclear Information System (INIS)

    Song, Guozheng; Khan, Faisal; Wang, Hangzhou; Leighton, Shelly; Yuan, Zhi; Liu, Hanwen

    2016-01-01

    The expansion of offshore oil exploitation into remote areas (e.g., Arctic) with harsh environments has significantly increased occupational risks. Among occupational accidents, slips, trips and falls from height (STFs) account for a significant portion. Thus, a dynamic risk assessment of the three main occupational accidents is meaningful to decrease offshore occupational risks. Bow-tie Models (BTs) were established in this study for the risk analysis of STFs considering extreme environmental factors. To relax the limitations of BTs, Bayesian networks (BNs) were developed based on BTs to dynamically assess risks of STFs. The occurrence and consequence probabilities of STFs were respectively calculated using BTs and BNs, and the obtained probabilities verified BNs' rationality and advantage. Furthermore, the probability adaptation for STFs was accomplished in a specific scenario with BNs. Finally, posterior probabilities of basic events were achieved through diagnostic analysis, and critical basic events were analyzed based on their posterior likelihood to cause occupational accidents. The highlight is systematically analyzing STF accidents for offshore operations and dynamically assessing their risks considering the harsh environmental factors. This study can guide the allocation of prevention resources and benefit the safety management of offshore operations. - Highlights: • A novel dynamic risk model for occupational accidents. • First time consideration of harsh environment in occupational accident modeling. • A Bayesian network based model for risk management strategies.
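
    A stripped-down illustration of the kind of updating a Bayesian network adds over a static bow-tie: a prior probability of a fall-from-height event is revised once evidence about a harsh-environment state (say, icy deck surfaces) is observed, and the environment node can itself be updated diagnostically after an accident, using Bayes' rule. The probabilities are invented, and a full BN would chain many such nodes.

```python
# Invented probabilities for a two-node fragment: environment state -> fall event.
p_icy = 0.30                         # prior probability the deck is icy (harsh environment)
p_fall_given_icy = 0.020             # probability of a fall from height if icy
p_fall_given_not_icy = 0.004         # probability of a fall otherwise

# Marginal (prior) probability of a fall, before any observation
p_fall = p_fall_given_icy * p_icy + p_fall_given_not_icy * (1 - p_icy)
print(f"Prior P(fall) = {p_fall:.4f}")

# Predictive update: a weather report says conditions are icy -> condition on that evidence
print(f"P(fall | icy observed) = {p_fall_given_icy:.4f}")

# Diagnostic update: a fall occurred -> posterior probability the deck was icy (Bayes' rule)
p_icy_given_fall = p_fall_given_icy * p_icy / p_fall
print(f"P(icy | fall occurred) = {p_icy_given_fall:.4f}")
```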

  3. Development of a risk prediction model for lung cancer: The Japan Public Health Center-based Prospective Study.

    Science.gov (United States)

    Charvat, Hadrien; Sasazuki, Shizuka; Shimazu, Taichi; Budhathoki, Sanjeev; Inoue, Manami; Iwasaki, Motoki; Sawada, Norie; Yamaji, Taiki; Tsugane, Shoichiro

    2018-03-01

    Although the impact of tobacco consumption on the occurrence of lung cancer is well-established, risk estimation could be improved by risk prediction models that consider various smoking habits, such as quantity, duration, and time since quitting. We constructed a risk prediction model using a population of 59 161 individuals from the Japan Public Health Center (JPHC) Study Cohort II. A parametric survival model was used to assess the impact of age, gender, and smoking-related factors (cumulative smoking intensity measured in pack-years, age at initiation, and time since cessation). Ten-year cumulative probability of lung cancer occurrence estimates were calculated with consideration of the competing risk of death from other causes. Finally, the model was externally validated using 47 501 individuals from JPHC Study Cohort I. A total of 1210 cases of lung cancer occurred during 986 408 person-years of follow-up. We found a dose-dependent effect of tobacco consumption with hazard ratios for current smokers ranging from 3.78 (2.00-7.16) for cumulative consumption ≤15 pack-years to 15.80 (9.67-25.79) for >75 pack-years. Risk decreased with time since cessation. Ten-year cumulative probability of lung cancer occurrence estimates ranged from 0.04% to 11.14% in men and 0.07% to 6.55% in women. The model showed good predictive performance regarding discrimination (cross-validated c-index = 0.793) and calibration (cross-validated χ2 = 6.60; P-value = .58). The model still showed good discrimination in the external validation population (c-index = 0.772). In conclusion, we developed a prediction model to estimate the probability of developing lung cancer based on age, gender, and tobacco consumption. This model appears useful in encouraging high-risk individuals to quit smoking and undergo increased surveillance. © 2018 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.
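
    A numerical sketch of the quantity reported above: with a cause-specific lung cancer hazard and a competing hazard of death from other causes, the 10-year cumulative probability of lung cancer is the probability of developing cancer while still alive and cancer-free, which is smaller than the naive estimate that ignores competing mortality. Constant hazards are used here purely for illustration; the study fits a parametric survival model with covariates.

```python
# Constant hazards chosen for illustration only; the study estimates covariate-dependent ones.
import numpy as np

h_cancer = 0.004      # cause-specific hazard of lung cancer per year (hypothetical)
h_other = 0.020       # hazard of death from other causes per year (hypothetical)
horizon = 10.0        # years

# For constant hazards the cumulative incidence of cancer by time t has a closed form:
# F_cancer(t) = h_cancer / (h_cancer + h_other) * (1 - exp(-(h_cancer + h_other) * t))
total = h_cancer + h_other
cum_inc_cancer = (h_cancer / total) * (1 - np.exp(-total * horizon))

# Naive estimate that ignores the competing risk of other-cause death
naive = 1 - np.exp(-h_cancer * horizon)

print(f"10-year probability of lung cancer (competing risks): {cum_inc_cancer:.4f}")
print(f"Naive 10-year probability (no competing risk):        {naive:.4f}")
```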

  4. Analytical Modeling for Underground Risk Assessment in Smart Cities

    Directory of Open Access Journals (Sweden)

    Israr Ullah

    2018-06-01

    Full Text Available In the developed world, underground facilities are increasing day by day, as they are considered an improved utilization of available space in smart cities. Typical facilities include underground railway lines, electricity lines, parking lots, water supply systems, sewerage networks, etc. Besides their utility, these facilities also pose serious threats to citizens and property. To pre-empt accidental loss of precious human lives and property, a real-time monitoring system is highly desirable for conducting risk assessment on a continuous basis and reporting any abnormality in time. In this paper, we present an analytical formulation to model system behavior for risk analysis and assessment based on various risk contributing factors. Based on the proposed analytical model, we evaluate three approximation techniques for computing the final risk index: (a) a simple linear approximation based on multiple linear regression analysis; (b) a hierarchical fuzzy logic based technique in which related risk factors are combined in a tree-like structure; and (c) a hybrid approximation approach which is a combination of (a) and (b). Experimental results show that the simple linear approximation fails to accurately estimate the final risk index compared to the hierarchical fuzzy logic based system, indicating that the latter provides an efficient method for monitoring and forecasting critical issues in underground facilities and may assist in maintenance efficiency as well. Estimation results based on the hybrid approach also fail to accurately estimate the final risk index. However, the hybrid scheme reveals some interesting and detailed information by performing automatic clustering based on the location risk index.

  5. Prevention of oil spill from shipping by modelling of dynamic risk.

    Science.gov (United States)

    Eide, Magnus S; Endresen, Oyvind; Breivik, Oyvind; Brude, Odd Willy; Ellingsen, Ingrid H; Røang, Kjell; Hauge, Jarle; Brett, Per Olaf

    2007-10-01

    This paper presents a new dynamic environmental risk model, with intended use within a new, dynamical approach for risk based ship traffic prioritisation. The philosophy behind this newly developed approach is that shipping risk can be reduced by directing efforts towards ships and areas that have been identified as high priority (high risk), prior to a potential accident. The risk model proposed in this paper separates itself from previous models by drawing on available information on dynamic factors and by focusing on the ship's surroundings. The model estimates the environmental risk of drift grounding accidents for oil tankers in real time and in forecast mode, combining the probability of grounding with oil spill impact on the coastline. Results show that the inherent dynamic risk introduced by an oil tanker sailing along the North Norwegian coast depends, not surprisingly, significantly upon wind and ocean currents, as well as tug position and cargo oil type. Results of this study indicate that the risk model is well suited for real time risk assessment, and effectively separates low risk and high risk situations. The model is well suited as a tool to prioritise oil tankers and coastal segments. This enables dynamic risk based positioning of tugs, using both real-time and projected risk, for effective support in case of a drifting ship situation.

  6. Construction of risk prediction model of type 2 diabetes mellitus based on logistic regression

    Directory of Open Access Journals (Sweden)

    Li Jian

    2017-01-01

    Full Text Available Objective: to construct a multi-factor prediction model for the individual risk of T2DM, and to explore new ideas for early warning, prevention and personalized health services for T2DM. Methods: logistic regression techniques were used to screen the risk factors for T2DM and construct the risk prediction model of T2DM. Results: Male risk prediction model logistic regression equation: logit(P) = BMI × 0.735 + vegetables × (−0.671) + age × 0.838 + diastolic pressure × 0.296 + physical activity × (−2.287) + sleep × (−0.009) + smoking × 0.214; Female risk prediction model logistic regression equation: logit(P) = BMI × 1.979 + vegetables × (−0.292) + age × 1.355 + diastolic pressure × 0.522 + physical activity × (−2.287) + sleep × (−0.010). The area under the ROC curve for males was 0.83, with sensitivity 0.72 and specificity 0.86; the area under the ROC curve for females was 0.84, with sensitivity 0.75 and specificity 0.90. Conclusion: The data for this model come from a nested case-control study; the risk prediction model was established using the mature logistic regression technique, and the model shows high predictive sensitivity, specificity and stability.
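
    As a minimal illustration, the reported male equation can be converted to a predicted probability through the logistic transform. The abstract reports no intercept and does not specify how the predictors are coded, so the coefficients below are taken from the abstract while the example input values are purely hypothetical placeholders.

    ```python
    import math

    # Coefficients from the male equation reported in the abstract (no intercept reported)
    MALE_COEFFS = {
        "BMI": 0.735, "vegetables": -0.671, "age": 0.838,
        "diastolic_pressure": 0.296, "physical_activity": -2.287,
        "sleep": -0.009, "smoking": 0.214,
    }

    def t2dm_risk(features, coeffs):
        """Predicted probability from a logistic risk equation: P = 1 / (1 + exp(-logit))."""
        logit = sum(coeffs[name] * value for name, value in features.items())
        return 1.0 / (1.0 + math.exp(-logit))

    # Hypothetical, arbitrarily coded inputs (the abstract does not give the coding scheme)
    example = {"BMI": 1, "vegetables": 0, "age": 2, "diastolic_pressure": 1,
               "physical_activity": 0, "sleep": 1, "smoking": 1}
    print(f"Predicted T2DM risk: {t2dm_risk(example, MALE_COEFFS):.1%}")
    ```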

  7. Risk-based management of invading plant disease.

    Science.gov (United States)

    Hyatt-Twynam, Samuel R; Parnell, Stephen; Stutt, Richard O J H; Gottwald, Tim R; Gilligan, Christopher A; Cunniffe, Nik J

    2017-05-01

    Effective control of plant disease remains a key challenge. Eradication attempts often involve removal of host plants within a certain radius of detection, targeting asymptomatic infection. Here we develop and test potentially more effective, epidemiologically motivated, control strategies, using a mathematical model previously fitted to the spread of citrus canker in Florida. We test risk-based control, which preferentially removes hosts expected to cause a high number of infections in the remaining host population. Removals then depend on past patterns of pathogen spread and host removal, which might be nontransparent to affected stakeholders. This motivates a variable radius strategy, which approximates risk-based control via removal radii that vary by location, but which are fixed in advance of any epidemic. Risk-based control outperforms variable radius control, which in turn outperforms constant radius removal. This result is robust to changes in disease spread parameters and initial patterns of susceptible host plants. However, efficiency degrades if epidemiological parameters are incorrectly characterised. Risk-based control including additional epidemiology can be used to improve disease management, but it requires good prior knowledge for optimal performance. This focuses attention on gaining maximal information from past epidemics, on understanding model transferability between locations and on adaptive management strategies that change over time. © 2017 The Authors. New Phytologist © 2017 New Phytologist Trust.

  8. Quantitative microbial risk assessment for spray irrigation of dairy manure based on an empirical fate and transport model

    Science.gov (United States)

    Burch, Tucker R; Spencer, Susan K.; Stokdyk, Joel; Kieke, Burney A; Larson, Rebecca A; Firnstahl, Aaron; Rule, Ana M; Borchardt, Mark A.

    2017-01-01

    BACKGROUND: Spray irrigation for land-applying livestock manure is increasing in the United States as farms become larger and economies of scale make manure irrigation affordable. Human health risks from exposure to zoonotic pathogens aerosolized during manure irrigation are not well understood. OBJECTIVES: We aimed to a) estimate human health risks due to aerosolized zoonotic pathogens downwind of spray-irrigated dairy manure; and b) determine which factors (e.g., distance, weather conditions) have the greatest influence on risk estimates. METHODS: We sampled downwind air concentrations of manure-borne fecal indicators and zoonotic pathogens during 21 full-scale dairy manure irrigation events at three farms. We fit these data to hierarchical empirical models and used model outputs in a quantitative microbial risk assessment (QMRA) to estimate risk [probability of acute gastrointestinal illness (AGI)] for individuals exposed to spray-irrigated dairy manure containing Campylobacter jejuni, enterohemorrhagic Escherichia coli (EHEC), or Salmonella spp. RESULTS: Median risk estimates from Monte Carlo simulations ranged from 10⁻⁵ to 10⁻² and decreased with distance from the source. Risk estimates for Salmonella or EHEC-related AGI were most sensitive to the assumed level of pathogen prevalence in dairy manure, while risk estimates for C. jejuni were not sensitive to any single variable. Airborne microbe concentrations were negatively associated with distance and positively associated with wind speed, both of which were retained in models as a significant predictor more often than relative humidity, solar irradiation, or temperature. CONCLUSIONS: Our model-based estimates suggest that reducing pathogen prevalence and concentration in source manure would reduce the risk of AGI from exposure to manure irrigation, and that increasing the distance from irrigated manure (i.e., setbacks) and limiting irrigation to times of low wind speed may also reduce risk.
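
    A QMRA of this type typically propagates a distribution of inhaled dose through a dose-response model inside a Monte Carlo loop. The sketch below uses an exponential dose-response form with entirely hypothetical concentration, breathing-rate and dose-response parameters; it illustrates the structure of the calculation, not the authors' fitted hierarchical models.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # Monte Carlo iterations

    # Hypothetical inputs: airborne pathogen concentration declines with distance,
    # exposure = concentration * breathing rate * duration (all values illustrative).
    distance_m = 100.0
    conc = rng.lognormal(mean=np.log(0.5) - 0.01 * distance_m, sigma=1.0, size=N)  # organisms/m^3
    breathing_rate = 0.83  # m^3/h, adult light activity
    duration_h = 1.0
    dose = conc * breathing_rate * duration_h

    # Exponential dose-response: P(infection) = 1 - exp(-r * dose), hypothetical r
    r = 0.019
    p_infection = 1.0 - np.exp(-r * dose)
    p_illness = 0.3 * p_infection  # hypothetical probability of AGI given infection

    print(f"Median risk of AGI per event: {np.median(p_illness):.2e}")
    print(f"95th percentile risk:         {np.percentile(p_illness, 95):.2e}")
    ```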

  9. Health-Based Capitation Risk Adjustment in Minnesota Public Health Care Programs

    Science.gov (United States)

    Gifford, Gregory A.; Edwards, Kevan R.; Knutson, David J.

    2004-01-01

    This article documents the history and implementation of health-based capitation risk adjustment in Minnesota public health care programs, and identifies key implementation issues. Capitation payments in these programs are risk adjusted using an historical, health plan risk score, based on concurrent risk assessment. Phased implementation of capitation risk adjustment for these programs began January 1, 2000. Minnesota's experience with capitation risk adjustment suggests that: (1) implementation can accelerate encounter data submission, (2) administrative decisions made during implementation can create issues that impact payment model performance, and (3) changes in diagnosis data management during implementation may require changes to the payment model. PMID:25372356

  10. The air emissions risk assessment model (AERAM)

    International Nuclear Information System (INIS)

    Gratt, L.B.

    1991-01-01

    AERAM is an environmental analysis and power generation station investment decision support tool. AERAM calculates the public health risk (in terms of lifetime cancers) in the nearby population from pollutants released into the air. AERAM consists of four main subroutines: Emissions, Air, Exposure and Risk. The Emissions subroutine uses power plant parameters to calculate the expected release of the pollutants. Coal-fired and oil-fired power plant models are currently available; a gas-fired plant model is under preparation. The release of the pollutants into the air is followed by their dispersal in the environment. The dispersion in the Air Subroutine uses the Environmental Protection Agency's model, Industrial Source Complex-Long Term. Additional dispersion models (Industrial Source Complex - Short Term and Cooling Tower Drift) are being implemented for future AERAM versions. The Exposure Subroutine uses the ambient concentrations to compute population exposures for the pollutants of concern. The exposures are used with corresponding dose-response models in the Risk Subroutine to estimate both the total population risk and individual risk. The risk for the dispersion receptor-population centroid with the maximum concentration is also calculated for regulatory purposes. In addition, automated interfaces with AirTox (an air risk decision model) have been implemented to extend AERAM's steady-state single solution to the decision-under-uncertainty domain. AERAM was used for public health risks, the investment decision for additional pollution control systems based on health risk reductions, and the economics of fuel vs. health risk tradeoffs. AERAM provides a state-of-the-art capability for evaluating the public health impact of airborne toxic substances in response to regulations and public concern

  11. Risk Decision Making Model for Reservoir Floodwater resources Utilization

    Science.gov (United States)

    Huang, X.

    2017-12-01

    Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but there are risks. In order to safely and efficiently utilize floodwater resources, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safety discharge are estimated. Based on the principle of the minimum risk and the maximum benefit of FRU, a multi-objective risk decision making model for FRU is constructed. Probability theory and mathematical statistics methods are selected to calculate the risk rate; the C-D production function method and the emergy analysis method are selected to calculate the risk benefit; the risk loss is related to flood inundation area and unit area loss; the multi-objective decision making problem of the model is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution of FRU of the Shilianghe reservoir is found by using the risk decision making model, and the validity and applicability of the model are verified.

  12. Acute Myocardial Infarction Readmission Risk Prediction Models: A Systematic Review of Model Performance.

    Science.gov (United States)

    Smith, Lauren N; Makam, Anil N; Darden, Douglas; Mayo, Helen; Das, Sandeep R; Halm, Ethan A; Nguyen, Oanh Kieu

    2018-01-01

    Hospitals are subject to federal financial penalties for excessive 30-day hospital readmissions for acute myocardial infarction (AMI). Prospectively identifying patients hospitalized with AMI at high risk for readmission could help prevent 30-day readmissions by enabling targeted interventions. However, the performance of AMI-specific readmission risk prediction models is unknown. We systematically searched the published literature through March 2017 for studies of risk prediction models for 30-day hospital readmission among adults with AMI. We identified 11 studies of 18 unique risk prediction models across diverse settings primarily in the United States, of which 16 models were specific to AMI. The median overall observed all-cause 30-day readmission rate across studies was 16.3% (range, 10.6%-21.0%). Six models were based on administrative data; 4 on electronic health record data; 3 on clinical hospital data; and 5 on cardiac registry data. Models included 7 to 37 predictors, of which demographics, comorbidities, and utilization metrics were the most frequently included domains. Most models, including the Centers for Medicare and Medicaid Services AMI administrative model, had modest discrimination (median C statistic, 0.65; range, 0.53-0.79). Of the 16 reported AMI-specific models, only 8 models were assessed in a validation cohort, limiting generalizability. Observed risk-stratified readmission rates ranged from 3.0% among the lowest-risk individuals to 43.0% among the highest-risk individuals, suggesting good risk stratification across all models. Current AMI-specific readmission risk prediction models have modest predictive ability and uncertain generalizability given methodological limitations. No existing models provide actionable information in real time to enable early identification and risk-stratification of patients with AMI before hospital discharge, a functionality needed to optimize the potential effectiveness of readmission reduction interventions

  13. Assessment of credit risk based on fuzzy relations

    Science.gov (United States)

    Tsabadze, Teimuraz

    2017-06-01

    The purpose of this paper is to develop a new approach for an assessment of the credit risk to corporate borrowers. There are different models for borrowers' risk assessment. These models are divided into two groups: statistical and theoretical. When assessing the credit risk for corporate borrowers, statistical model is unacceptable due to the lack of sufficiently large history of defaults. At the same time, we cannot use some theoretical models due to the lack of stock exchange. In those cases, when studying a particular borrower given that statistical base does not exist, the decision-making process is always of expert nature. The paper describes a new approach that may be used in group decision-making. An example of the application of the proposed approach is given.

  14. Methods and models used in comparative risk studies

    International Nuclear Information System (INIS)

    Devooght, J.

    1983-01-01

    Comparative risk studies make use of a large number of methods and models based upon a set of assumptions incompletely formulated or of value judgements. Owing to the multidimensionality of risks and benefits, the economic and social context may notably influence the final result. Five classes of models are briefly reviewed: accounting of fluxes of effluents, radiation and energy; transport models and health effects; systems reliability and bayesian analysis; economic analysis of reliability and cost-risk-benefit analysis; decision theory in presence of uncertainty and multiple objectives. Purpose and prospect of comparative studies are assessed in view of probable diminishing returns for large generic comparisons [fr

  15. Modifying EPA radiation risk models based on BEIR VII

    International Nuclear Information System (INIS)

    Pawel, D.; Puskin, J.

    2007-01-01

    This paper summarizes a 'draft White Paper' that provides details on proposed changes in EPA's methodology for estimating radiogenic cancer risks. Many of the changes are based on the contents of a recent National Academy of Sciences (NAS) report (BEIR VII), that addresses cancer and genetic risks from low doses of low-LET radiation. The draft White Paper was prepared for a meeting with the EPA's Science Advisory Board's Radiation Advisory Committee (RAC) in September for seeking advice on the application of BEIR VII and on issues relating to these modifications and expansions. After receiving the Advisory review, we plan to implement the changes by publishing the new methodology in an EPA report, which we expect to submit to the RAC for final review. The revised methodology could then be applied to update the cancer risk coefficients for over 800 radionuclides that are published in EPA's Federal Guidance Report 13. (author)

  16. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty of the vulnerability and risk assessment activities of web-based applications, while providing users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on the work done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork; it was tested in vulnerability assessment activities on real production systems and, in theory, by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
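
    The general idea of replacing point guesses for each risk attribute with distributions and simulating can be sketched as follows. The discrete uniform assumptions and 1.5 million rounds mirror the test described in the abstract, but the attribute names and the likelihood-times-impact aggregation are only the generic OWASP risk-rating structure, not the author's modified method.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 1_500_000  # rounds of Monte Carlo simulation, as in the abstract

    # OWASP-style risk attributes scored 0-9; discrete uniform when nothing better is known.
    likelihood_factors = ["skill_level", "motive", "opportunity", "ease_of_discovery"]
    impact_factors = ["loss_of_confidentiality", "loss_of_integrity",
                      "financial_damage", "reputation_damage"]

    def sample_factors(names):
        """One 0-9 discrete uniform sample per attribute and simulation round."""
        return np.stack([rng.integers(0, 10, size=N) for _ in names])

    likelihood = sample_factors(likelihood_factors).mean(axis=0)  # 0-9 scale
    impact = sample_factors(impact_factors).mean(axis=0)          # 0-9 scale
    risk = likelihood * impact                                    # 0-81 overall score

    print(f"Mean risk score:        {risk.mean():.1f}")
    print(f"90th percentile:        {np.percentile(risk, 90):.1f}")
    print(f"P(risk >= 45, 'high'):  {(risk >= 45).mean():.1%}")
    ```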

  17. A risk-based multi-objective model for optimal placement of sensors in water distribution system

    Science.gov (United States)

    Naserizade, Sareh S.; Nikoo, Mohammad Reza; Montaseri, Hossein

    2018-02-01

    In this study, a new stochastic model based on Conditional Value at Risk (CVaR) and multi-objective optimization methods is developed for optimal placement of sensors in water distribution system (WDS). This model determines minimization of risk which is caused by simultaneous multi-point contamination injection in WDS using CVaR approach. The CVaR considers uncertainties of contamination injection in the form of probability distribution function and calculates low-probability extreme events. In this approach, extreme losses occur at tail of the losses distribution function. Four-objective optimization model based on NSGA-II algorithm is developed to minimize losses of contamination injection (through CVaR of affected population and detection time) and also minimize the two other main criteria of optimal placement of sensors including probability of undetected events and cost. Finally, to determine the best solution, Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), as a subgroup of Multi Criteria Decision Making (MCDM) approach, is utilized to rank the alternatives on the trade-off curve among objective functions. Also, sensitivity analysis is done to investigate the importance of each criterion on PROMETHEE results considering three relative weighting scenarios. The effectiveness of the proposed methodology is examined through applying it to Lamerd WDS in the southwestern part of Iran. The PROMETHEE suggests 6 sensors with suitable distribution that approximately cover all regions of WDS. Optimal values related to CVaR of affected population and detection time as well as probability of undetected events for the best optimal solution are equal to 17,055 persons, 31 mins and 0.045%, respectively. The obtained results of the proposed methodology in Lamerd WDS show applicability of CVaR-based multi-objective simulation-optimization model for incorporating the main uncertainties of contamination injection in order to evaluate extreme value
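
    CVaR at level alpha is the expected loss in the worst (1 - alpha) fraction of scenarios, which on a finite set of simulated contamination scenarios reduces to an average over the tail beyond the Value at Risk. A minimal sketch follows; the scenario data and the 95% level are assumptions, not values from the study.

    ```python
    import numpy as np

    def var_cvar(losses, alpha=0.95):
        """Empirical Value-at-Risk and Conditional Value-at-Risk of a loss sample."""
        losses = np.sort(np.asarray(losses))
        var = np.quantile(losses, alpha)     # loss threshold exceeded with probability 1 - alpha
        tail = losses[losses >= var]
        return var, tail.mean()              # CVaR = mean loss beyond VaR

    # Hypothetical affected-population losses for one candidate sensor layout
    rng = np.random.default_rng(1)
    affected_population = rng.lognormal(mean=8.0, sigma=1.0, size=10_000)
    var95, cvar95 = var_cvar(affected_population)
    print(f"VaR95 = {var95:,.0f} persons, CVaR95 = {cvar95:,.0f} persons")
    ```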

  18. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and had been used in numerous areas of studies to complement and fulfilled clinical reasoning and decision making nowadays. The adoption of such models assist physician's decision making, individual's behavior, and consequently improve individual outcomes and the cost-effectiveness of care. The objective of this paper is to reviewed articles related to risk prediction model in order to understand the suitable approach, development and the validation process of risk prediction model. A qualitative review of the aims, methods and significant main outcomes of the nineteen published articles that developed risk prediction models from numerous fields were done. This paper also reviewed on how researchers develop and validate the risk prediction models based on statistical and artificial neural network approach. From the review done, some methodological recommendation in developing and validating the prediction model were highlighted. According to studies that had been done, artificial neural network approached in developing the prediction model were more accurate compared to statistical approach. However currently, only limited published literature discussed on which approach is more accurate for risk prediction model development.

  19. Contribution to modeling and dynamic risk hedging in energy markets

    International Nuclear Information System (INIS)

    Noufel, Frikha

    2010-12-01

    This thesis is concerned with probabilistic numerical problems of modeling, risk control and risk hedging motivated by applications to energy markets. The main tools are stochastic approximation and simulation methods. This thesis consists of three parts. The first one is devoted to the computation of two risk measures of the portfolio loss distribution L: the Value-at-Risk (VaR) and the Conditional Value-at-Risk (CVaR). This computation uses a stochastic algorithm combined with an adaptive variance reduction technique. The first part of this chapter deals with the finite-dimensional case, the second part extends the results of the first part to the case of a path-dependent process, and the last one deals with low-discrepancy sequences. The second chapter is devoted to risk-minimizing hedging strategies in an incomplete market operating in discrete time, using quantization-based stochastic approximation. Theoretical results on CVaR hedging are presented, then numerical aspects are addressed in a Markovian framework. The last part deals with joint modeling of gas and electricity spot prices. The multi-factor model presented is based on a stationary Ornstein-Uhlenbeck process with a parameterized diffusion coefficient. (author)
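
    Stochastic-approximation computation of VaR and CVaR is usually based on the Rockafellar-Uryasev representation, with a Robbins-Monro recursion updating the VaR estimate and a running average converging to the CVaR. A bare-bones sketch under a toy Gaussian loss follows; the step-size rule and the loss distribution are assumptions, and the adaptive variance reduction described in the thesis is omitted.

    ```python
    import numpy as np

    def sa_var_cvar(sample_loss, alpha=0.95, n_iter=200_000, seed=0):
        """Robbins-Monro recursion for VaR and a running estimate of CVaR at level alpha."""
        rng = np.random.default_rng(seed)
        xi, cvar = 0.0, 0.0
        for n in range(1, n_iter + 1):
            loss = sample_loss(rng)
            step = 1.0 / n**0.75
            # Stochastic gradient of the Rockafellar-Uryasev objective w.r.t. xi (the VaR estimate)
            xi -= step * (1.0 - (loss > xi) / (1.0 - alpha))
            # Running average of xi + (loss - xi)^+ / (1 - alpha), which converges to CVaR
            cvar += (xi + max(loss - xi, 0.0) / (1.0 - alpha) - cvar) / n
        return xi, cvar

    # Toy example: standard Gaussian loss; exact VaR95 ~ 1.645, CVaR95 ~ 2.063
    var95, cvar95 = sa_var_cvar(lambda rng: rng.normal())
    print(f"VaR95 ~ {var95:.3f}, CVaR95 ~ {cvar95:.3f}")
    ```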

  20. Designing an integrated model based on the indicators Quality and Earned Value for risk management in Information Technology Projects

    OpenAIRE

    TATLARI, Mohammad Reza; KAZEMİPOOR, Hamed

    2015-01-01

    Quality and earned value are two factors that affect the risk of Information Technology (IT) projects, so that by controlling these two factors and increasing their level in IT projects, the corresponding risk can be decreased. Therefore, in the present study, an integrated model based on quality and earned value indicators was designed for risk management in IT projects using a new and efficient approach. The proposed algorithm included steps such as preparing a list of several indicator...

  1. On Modeling Risk Shocks

    OpenAIRE

    Dorofeenko, Victor; Lee, Gabriel; Salyer, Kevin; Strobel, Johannes

    2016-01-01

    Within the context of a financial accelerator model, we model time-varying uncertainty (i.e. risk shocks) through the use of a mixture Normal model with time variation in the weights applied to the underlying distributions characterizing entrepreneur productivity. Specifically, we model capital producers (i.e. the entrepreneurs) as either low-risk (relatively small second moment for productivity) or high-risk (relatively large second moment for productivity) and the fraction of both types is...

  2. The Application of Asymmetric Liquidity Risk Measure in Modelling the Risk of Investment

    Directory of Open Access Journals (Sweden)

    Garsztka Przemysław

    2015-06-01

    Full Text Available The article analyses the relationship between investment risk (as measured by the variance or standard deviation of returns) and liquidity risk. The paper presents a method for calculating a new measure of liquidity risk, based on the characteristic line. In addition, the impact of liquidity risk on the volatility of daily returns is examined. Dynamic econometric models were used to describe this relationship. It was found that there was an econometric relationship between the proposed liquidity risk measure and the variance of returns.

  3. The Earnings/Price Risk Factor in Capital Asset Pricing Models

    Directory of Open Access Journals (Sweden)

    Rafael Falcão Noda

    2015-01-01

    Full Text Available This article integrates the ideas from two major lines of research on cost of equity and asset pricing: multi-factor models and ex ante accounting models. The earnings/price ratio is used as a proxy for the ex ante cost of equity, in order to explain realized returns of Brazilian companies within the period from 1995 to 2013. The initial finding was that stocks with high (low) earnings/price ratios have higher (lower) risk-adjusted realized returns, already controlled by the capital asset pricing model's beta. The results show that selecting stocks based on high earnings/price ratios has led to significantly higher risk-adjusted returns in the Brazilian market, with average abnormal returns close to 1.3% per month. We design asset pricing models including an earnings/price risk factor, i.e. high earnings minus low earnings, based on the Fama and French three-factor model. We conclude that such a risk factor is significant to explain returns on portfolios, even when controlled by size and market/book ratios. Models including the high earnings minus low earnings risk factor were better to explain stock returns in Brazil when compared to the capital asset pricing model and to the Fama and French three-factor model, having the lowest number of significant intercepts. These findings may be due to the impact of historically high inflation rates, which reduce the information content of book values, thus making the models based on earnings/price ratios better than those based on market/book ratios. Such results are different from those obtained in more developed markets and the superiority of the earnings/price ratio for asset pricing may also exist in other emerging markets.

  4. Construction of Site Risk Model using Individual Unit Risk Model in a NPP Site

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Ho Gon; Han, Sang Hoon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    Since the Fukushima accident, the need to estimate site risk has increased strongly, in order to identify the possibility of re-occurrence of such a tremendous disaster and to prevent it. Especially at a site that hosts a large fleet of nuclear power plants, reliable site risk assessment is urgently needed to confirm safety. In Korea, there are several nuclear power plant sites with more than 6 NPPs. In general, the risk model of a single NPP in terms of PSA is very complicated, and the site risk model is expected to be more complex still. In this paper, a method for constructing a site risk model from individual unit risk models is proposed. A procedure for the development of the site damage (risk) model is presented. Since the site damage model is complicated in terms of the scale of the system and the dependencies among its components, conventional methods may not be applicable to many aspects of the problem.

  5. A comparative review of radiation-induced cancer risk models

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Hee; Kim, Ju Youl [FNC Technology Co., Ltd., Yongin (Korea, Republic of); Han, Seok Jung [Risk and Environmental Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-06-15

    With the need for a domestic level 3 probabilistic safety assessment (PSA), it is essential to develop a Korea-specific code. Health effect assessments study radiation-induced impacts; in particular, long-term health effects are evaluated in terms of cancer risk. The objective of this study was to analyze the latest cancer risk models developed by foreign organizations and to compare the methodology of how they were developed. This paper also provides suggestions regarding the development of Korean cancer risk models. A review of cancer risk models was carried out targeting the latest models: the NUREG model (1993), the BEIR VII model (2006), the UNSCEAR model (2006), the ICRP 103 model (2007), and the U.S. EPA model (2011). The methodology of how each model was developed is explained, and the cancer sites, dose and dose rate effectiveness factor (DDREF) and mathematical models are also described in the sections presenting differences among the models. The NUREG model was developed by assuming that the risk was proportional to the risk coefficient and dose, while the BEIR VII, UNSCEAR, ICRP, and U.S. EPA models were derived from epidemiological data, principally from Japanese atomic bomb survivors. The risk coefficient does not consider individual characteristics, as the values were calculated in terms of population-averaged cancer risk per unit dose. However, the models derived by epidemiological data are a function of sex, exposure age, and attained age of the exposed individual. Moreover, the methodologies can be used to apply the latest epidemiological data. Therefore, methodologies using epidemiological data should be considered first for developing a Korean cancer risk model, and the cancer sites and DDREF should also be determined based on Korea-specific studies. This review can be used as a basis for developing a Korean cancer risk model in the future.

  6. MATHEMATICAL RISK ANALYSIS: VIA NICHOLAS RISK MODEL AND BAYESIAN ANALYSIS

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-07-01

    Full Text Available The objective of this second part of a two-phased study was to explore the predictive power of the quantitative risk analysis (QRA) method and process within a Higher Education Institution (HEI). The method and process investigated used impact analysis via the Nicholas risk model and Bayesian analysis, with a sample of one hundred (100) risk analysts in a historically black South African university in the greater Eastern Cape Province. The first finding supported and confirmed previous literature (King III report, 2009; Nicholas and Steyn, 2008; Stoney, 2007; COSA, 2004) that there was a direct relationship between a risk factor, its likelihood and its impact, ceteris paribus. The second finding, in relation to controlling either the likelihood or the impact of occurrence of risk (Nicholas risk model), was that to obtain a better risk reward it was more important to control the likelihood of occurrence of risks than their impact, so as to have a direct effect on the entire university. On the Bayesian analysis, the third finding was that the impact of risk should be predicted along three aspects: the human impact (decisions made), the property impact (students and infrastructure) and the business impact. Lastly, although business cycles vary considerably depending on the industry and/or the institution, the study revealed that most impacts in the HEI (university) occurred within the period of one academic year. The recommendation was that the application of quantitative risk analysis should be related to the current legislative framework that affects HEIs.

  7. Incentivising flood risk adaptation through risk based insurance premiums: trade-offs between affordability and risk reduction

    NARCIS (Netherlands)

    Hudson, P.G.M.B.; Botzen, W.J.W.; Feyen, L.; Aerts, J.C.J.H.

    2016-01-01

    The financial incentives offered by the risk-based pricing of insurance can stimulate policyholder adaptation to flood risk while potentially conflicting with affordability. We examine the trade-off between risk reduction and affordability in a model of public-private flood insurance in France and

  8. Synthetic biology between challenges and risks: suggestions for a model of governance and a regulatory framework, based on fundamental rights.

    Science.gov (United States)

    Colussi, Ilaria Anna

    2013-01-01

    This paper deals with the emerging synthetic biology, its challenges and risks, and tries to design a model for the governance and regulation of the field. The model is called of "prudent vigilance" (inspired by the report about synthetic biology, drafted by the U.S. Presidential Commission on Bioethics, 2010), and it entails (a) an ongoing and periodically revised process of assessment and management of all the risks and concerns, and (b) the adoption of policies - taken through "hard law" and "soft law" sources - that are based on the principle of proportionality (among benefits and risks), on a reasonable balancing between different interests and rights at stake, and are oriented by a constitutional frame, which is represented by the protection of fundamental human rights emerging in the field of synthetic biology (right to life, right to health, dignity, freedom of scientific research, right to environment). After the theoretical explanation of the model, its operability is "checked", by considering its application with reference to only one specific risk brought up by synthetic biology - biosecurity risk, i.e. the risk of bioterrorism.

  9. A risk-based sensor placement methodology

    International Nuclear Information System (INIS)

    Lee, Ronald W.; Kulesz, James J.

    2008-01-01

    A risk-based sensor placement methodology is proposed to solve the problem of optimal location of sensors to protect population against the exposure to, and effects of, known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing population at risk from exposure at standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats detected by sensors placed in prior iterations are removed from consideration in subsequent iterations. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor. This is the fraction of the total risk accounted for by placement of the sensor. Thus, the criteria for halting the iterative process can be the number of sensors available, a threshold marginal utility value, and/or a minimum cumulative utility achieved with all sensors
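
    The iterative placement idea described here, choosing at each step the location that covers the most remaining risk, removing the threats it detects, and recording its marginal utility, can be sketched generically as follows; the coverage matrix and threat risks are hypothetical.

    ```python
    import numpy as np

    def place_sensors(coverage, threat_risk, max_sensors=5, min_marginal_utility=0.01):
        """Greedy risk-based placement.

        coverage[i, j] = 1 if a sensor at candidate location i detects threat scenario j.
        threat_risk[j] = population at risk from scenario j (weighted by its frequency).
        Returns chosen locations and the fraction of total risk each one accounts for.
        """
        remaining = threat_risk.astype(float).copy()
        total = remaining.sum()
        chosen, utilities = [], []
        for _ in range(max_sensors):
            gains = coverage @ remaining                # risk removed by each candidate location
            best = int(np.argmax(gains))
            marginal_utility = gains[best] / total
            if marginal_utility < min_marginal_utility:
                break                                   # stopping criterion: utility threshold
            chosen.append(best)
            utilities.append(marginal_utility)
            remaining[coverage[best] == 1] = 0.0        # detected threats drop out next iteration
        return chosen, utilities

    # Toy example: 4 candidate locations, 6 threat scenarios
    coverage = np.array([[1, 1, 0, 0, 0, 0],
                         [0, 1, 1, 1, 0, 0],
                         [0, 0, 0, 1, 1, 0],
                         [0, 0, 0, 0, 1, 1]])
    risk = np.array([50., 120., 30., 80., 60., 10.])
    print(place_sensors(coverage, risk))
    ```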

  10. Effects of a risk-based online mammography intervention on accuracy of perceived risk and mammography intentions.

    Science.gov (United States)

    Seitz, Holli H; Gibson, Laura; Skubisz, Christine; Forquer, Heather; Mello, Susan; Schapira, Marilyn M; Armstrong, Katrina; Cappella, Joseph N

    2016-10-01

    This experiment tested the effects of an individualized risk-based online mammography decision intervention. The intervention employs exemplification theory and the Elaboration Likelihood Model of persuasion to improve the match between breast cancer risk and mammography intentions. 2918 women ages 35-49 were stratified into two levels of 10-year breast cancer risk (<1.5%; ≥1.5%) then randomly assigned to one of eight conditions: two comparison conditions and six risk-based intervention conditions that varied according to a 2 (amount of content: brief vs. extended) x 3 (format: expository vs. untailored exemplar [example case] vs. tailored exemplar) design. Outcomes included mammography intentions and accuracy of perceived breast cancer risk. Risk-based intervention conditions improved the match between objective risk estimates and perceived risk, especially for high-numeracy women with a 10-year breast cancer risk ≤1.5%. For women with a risk≤1.5%, exemplars improved accuracy of perceived risk and all risk-based interventions increased intentions to wait until age 50 to screen. A risk-based mammography intervention improved accuracy of perceived risk and the match between objective risk estimates and mammography intentions. Interventions could be applied in online or clinical settings to help women understand risk and make mammography decisions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Development and Evaluation of a Simple, Multifactorial Model Based on Landing Performance to Indicate Injury Risk in Surfing Athletes.

    Science.gov (United States)

    Lundgren, Lina E; Tran, Tai T; Nimphius, Sophia; Raymond, Ellen; Secomb, Josh L; Farley, Oliver R L; Newton, Robert U; Steele, Julie R; Sheppard, Jeremy M

    2015-11-01

    To develop and evaluate a multifactorial model based on landing performance to estimate injury risk for surfing athletes. Five measures were collected from 78 competitive surfing athletes and used to create a model to serve as a screening tool for landing tasks and potential injury risk. In the second part of the study, the model was evaluated using junior surfing athletes (n = 32) with a longitudinal follow-up of their injuries over 26 wk. Two models were compared based on the collected data, and magnitude-based inferences were applied to determine the likelihood of differences between injured and noninjured groups. The study resulted in a model based on 5 measures--ankle-dorsiflexion range of motion, isometric midthigh-pull lower-body strength, time to stabilization during a drop-and-stick (DS) landing, relative peak force during a DS landing, and frontal-plane DS-landing video analysis--for male and female professional surfers and male and female junior surfers. Evaluation of the model showed that a scaled probability score was more likely to detect injuries in junior surfing athletes and reported a correlation of r = .66, P = .001, with a model of equal variable importance. The injured (n = 7) surfers had a lower probability score (0.18 ± 0.16) than the noninjured group (n = 25, 0.36 ± 0.15), with 98% likelihood, Cohen d = 1.04. The proposed model seems sensitive and easy to implement and interpret. Further research is recommended to show full validity for potential adaptations for other sports.

  12. Malaria in Africa: vector species' niche models and relative risk maps.

    Directory of Open Access Journals (Sweden)

    Alexander Moffett

    2007-09-01

    Full Text Available A central theoretical goal of epidemiology is the construction of spatial models of disease prevalence and risk, including maps for the potential spread of infectious disease. We provide three continent-wide maps representing the relative risk of malaria in Africa based on ecological niche models of vector species and risk analysis at a spatial resolution of 1 arc-minute (9 185 275 cells of approximately 4 sq km). Using a maximum entropy method we construct niche models for 10 malaria vector species based on species occurrence records since 1980, 19 climatic variables, altitude, and land cover data (in 14 classes). For seven vectors (Anopheles coustani, A. funestus, A. melas, A. merus, A. moucheti, A. nili, and A. paludis) these are the first published niche models. We predict that Central Africa has poor habitat for both A. arabiensis and A. gambiae, and that A. quadriannulatus and A. arabiensis have restricted habitats in Southern Africa as claimed by field experts in criticism of previous models. The results of the niche models are incorporated into three relative risk models which assume different ecological interactions between vector species. The "additive" model assumes no interaction; the "minimax" model assumes maximum relative risk due to any vector in a cell; and the "competitive exclusion" model assumes the relative risk that arises from the most suitable vector for a cell. All models include variable anthropophilicity of vectors and spatial variation in human population density. Relative risk maps are produced from these models. All models predict that human population density is the critical factor determining malaria risk. Our method of constructing relative risk maps is equally general. We discuss the limits of the relative risk maps reported here, and the additional data that are required for their improvement. The protocol developed here can be used for any other vector-borne disease.

  13. Custom v. Standardized Risk Models

    Directory of Open Access Journals (Sweden)

    Zura Kakushadze

    2015-05-01

    Full Text Available We discuss when and why custom multi-factor risk models are warranted and give source code for computing some risk factors. Pension/mutual funds do not require customization but standardization. However, using standardized risk models in quant trading with much shorter holding horizons is suboptimal: (1) longer horizon risk factors (value, growth, etc.) increase noise trades and trading costs; (2) arbitrary risk factors can neutralize alpha; (3) “standardized” industries are artificial and insufficiently granular; (4) normalization of style risk factors is lost for the trading universe; (5) diversifying risk models lowers P&L correlations, reduces turnover and market impact, and increases capacity. We discuss various aspects of custom risk model building.

  14. Model of personalised risk assessment of phytoestrogen intake based on 11 SNP in ESR1 and ESR2 genes

    Directory of Open Access Journals (Sweden)

    Radoslav Zidek

    2016-12-01

    Full Text Available Phytoestrogens can induce biological responses in vertebrates by mimicking or modulating the action or production of endogenous hormones, and because of their structural similarity with estradiol they have the ability to cause estrogenic or anti-estrogenic effects. Risk assessment of phytoestrogen intake may therefore provide important information useful in the adjustment of nutrient composition, as one of the nutrigenomics approaches. Proper risk assessment is an essential part of good nutrient composition. The current risk assessment procedures use an additive effect of genes, but the accumulation of relevant factors does not account for the distribution of risk in the European population. A combination of approaches based on a genetic score, along with the use of databases like 1000 Genomes and dbSNP, is a powerful tool for population risk modelling that would provide reasonable results without the need to test a representative number of individual genetic profiles.

  15. Using risk based tools in emergency response

    International Nuclear Information System (INIS)

    Dixon, B.W.; Ferns, K.G.

    1987-01-01

    Probabilistic Risk Assessment (PRA) techniques are used by the nuclear industry to model the potential response of a reactor subjected to unusual conditions. The knowledge contained in these models can aid in emergency response decision making. This paper presents requirements for a PRA based emergency response support system to date. A brief discussion of published work provides background for a detailed description of recent developments. A rapid deep assessment capability for specific portions of full plant models is presented. The program uses a screening rule base to control search space expansion in a combinational algorithm

  16. A comparison of radiological risk assessment models: Risk assessment models used by the BEIR V Committee, UNSCEAR, ICRP, and EPA (for NESHAP)

    International Nuclear Information System (INIS)

    Wahl, L.E.

    1994-03-01

    Radiological risk assessments and resulting risk estimates have been developed by numerous national and international organizations, including the National Research Council's fifth Committee on the Biological Effects of Ionizing Radiations (BEIR V), the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR), and the International Commission on Radiological Protection (ICRP). A fourth organization, the Environmental Protection Agency (EPA), has also performed a risk assessment as a basis for the National Emission Standards for Hazardous Air Pollutants (NESHAP). This paper compares the EPA's model of risk assessment with the models used by the BEIR V Committee, UNSCEAR, and ICRP. Comparison is made of the values chosen by each organization for several model parameters: populations used in studies and population transfer coefficients, dose-response curves and dose-rate effects, risk projection methods, and risk estimates. This comparison suggests that the EPA has based its risk assessment on outdated information and that the organization should consider adopting the method used by the BEIR V Committee, UNSCEAR, or ICRP

  17. Quantifying population-level risks using an individual-based model: sea otters, Harlequin Ducks, and the Exxon Valdez oil spill.

    Science.gov (United States)

    Harwell, Mark A; Gentile, John H; Parker, Keith R

    2012-07-01

    Ecological risk assessments need to advance beyond evaluating risks to individuals that are largely based on toxicity studies conducted on a few species under laboratory conditions, to assessing population-level risks to the environment, including considerations of variability and uncertainty. Two individual-based models (IBMs), recently developed to assess current risks to sea otters and seaducks in Prince William Sound more than 2 decades after the Exxon Valdez oil spill (EVOS), are used to explore population-level risks. In each case, the models had previously shown that there were essentially no remaining risks to individuals from polycyclic aromatic hydrocarbons (PAHs) derived from the EVOS. New sensitivity analyses are reported here in which hypothetical environmental exposures to PAHs were heuristically increased until assimilated doses reached toxicity reference values (TRVs) derived at the no-observed-adverse-effects and lowest-observed-adverse-effects levels (NOAEL and LOAEL, respectively). For the sea otters, this was accomplished by artificially increasing the number of sea otter pits that would intersect remaining patches of subsurface oil residues by orders of magnitude over actual estimated rates. Similarly, in the seaduck assessment, the PAH concentrations in the constituents of diet, sediments, and seawater were increased in proportion to their relative contributions to the assimilated doses by orders of magnitude over measured environmental concentrations, to reach the NOAEL and LOAEL thresholds. The stochastic IBMs simulated millions of individuals. From these outputs, frequency distributions were derived of assimilated doses for populations of 500,000 sea otters or seaducks in each of 7 or 8 classes, respectively. Doses to several selected quantiles were analyzed, ranging from the 1-in-1000th most-exposed individuals (99.9% quantile) to the median-exposed individuals (50% quantile). The resulting families of quantile curves provide the basis for

  18. Method for assessing coal-floor water-inrush risk based on the variable-weight model and unascertained measure theory

    Science.gov (United States)

    Wu, Qiang; Zhao, Dekang; Wang, Yang; Shen, Jianjun; Mu, Wenping; Liu, Honglei

    2017-11-01

    Water inrush from coal-seam floors greatly threatens mining safety in North China and is a complex process controlled by multiple factors. This study presents a mathematical assessment system for coal-floor water-inrush risk based on the variable-weight model (VWM) and unascertained measure theory (UMT). In contrast to the traditional constant-weight model (CWM), which assigns a fixed weight to each factor, the VWM varies with the factor-state value. The UMT employs the confidence principle, which is more effective in ordered partition problems than the maximum membership principle adopted in the former mathematical theory. The method is applied to the Datang Tashan Coal Mine in North China. First, eight main controlling factors are selected to construct the comprehensive evaluation index system. Subsequently, an incentive-penalty variable-weight model is built to calculate the variable weights of each factor. Then, the VWM-UMT model is established using the quantitative risk-grade divide of each factor according to the UMT. On this basis, the risk of coal-floor water inrush in Tashan Mine No. 8 is divided into five grades. For comparison, the CWM is also adopted for the risk assessment, and a differences distribution map is obtained between the two methods. Finally, the verification of water-inrush points indicates that the VWM-UMT model is powerful and more feasible and reasonable. The model has great potential and practical significance in future engineering applications.
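
    The contrast with a constant-weight model is that each factor's weight is scaled up or down as its own state value moves into a dangerous range, and the weights are then renormalised before aggregation. A generic incentive-penalty sketch is shown below; the exponential weighting function, the factor states and the base weights are assumptions, not the paper's calibrated VWM-UMT model.

    ```python
    import numpy as np

    def variable_weight_index(states, base_weights, alpha=1.0):
        """Risk index with state-dependent (incentive-penalty) weights.

        states: factor state values normalised to [0, 1] (1 = most dangerous).
        base_weights: constant weights that a constant-weight model would use.
        alpha: strength of the penalty applied to factors in a dangerous state.
        """
        states = np.asarray(states, dtype=float)
        base = np.asarray(base_weights, dtype=float)
        # Penalty-type state variable weight: grows exponentially with the state value
        raw = base * np.exp(alpha * states)
        weights = raw / raw.sum()              # renormalise so the weights sum to 1
        return float(weights @ states), weights

    # Eight hypothetical controlling factors (e.g. aquifer pressure, fault density, ...)
    states = [0.9, 0.2, 0.4, 0.1, 0.7, 0.3, 0.5, 0.2]
    base_w = [0.20, 0.10, 0.15, 0.05, 0.20, 0.10, 0.10, 0.10]
    index, w = variable_weight_index(states, base_w)
    print(f"Variable-weight risk index: {index:.3f}")
    print(f"Constant-weight risk index: {np.dot(base_w, states):.3f}")
    ```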

  19. A risk-model for hospital mortality among patients with severe sepsis or septic shock based on German national administrative claims data.

    Science.gov (United States)

    Schwarzkopf, Daniel; Fleischmann-Struzek, Carolin; Rüddel, Hendrik; Reinhart, Konrad; Thomas-Rüddel, Daniel O

    2018-01-01

    Sepsis is a major cause of preventable deaths in hospitals. Feasible and valid methods for comparing quality of sepsis care between hospitals are needed. The aim of this study was to develop a risk-adjustment model suitable for comparing sepsis-related mortality between German hospitals. We developed a risk-model using national German claims data. Since these data are available with a time-lag of 1.5 years only, the stability of the model across time was investigated. The model was derived from inpatient cases with severe sepsis or septic shock treated in 2013 using logistic regression with backward selection and generalized estimating equations to correct for clustering. It was validated among cases treated in 2015. Finally, the model development was repeated in 2015. To investigate secular changes, the risk-adjusted trajectory of mortality across the years 2010-2015 was analyzed. The 2013 derivation sample consisted of 113,750 cases; the 2015 validation sample consisted of 134,851 cases. The model developed in 2013 showed good validity regarding discrimination (AUC = 0.74), calibration (observed mortality in 1st and 10th risk-decile: 11%-78%), and fit (R2 = 0.16). Validity remained stable when the model was applied to 2015 (AUC = 0.74, 1st and 10th risk-decile: 10%-77%, R2 = 0.17). There was no indication of overfitting of the model. The final model developed in year 2015 contained 40 risk-factors. Between 2010 and 2015 hospital mortality in sepsis decreased from 48% to 42%. Adjusted for risk-factors the trajectory of decrease was still significant. The risk-model shows good predictive validity and stability across time. The model is suitable to be used as an external algorithm for comparing risk-adjusted sepsis mortality among German hospitals or regions based on administrative claims data, but secular changes need to be taken into account when interpreting risk-adjusted mortality.
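
    The reported validity checks, discrimination via the AUC and calibration via observed mortality across risk deciles, can be reproduced for any such model along the following lines. The sketch uses simulated predictions and outcomes rather than the German claims data.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(7)
    n = 50_000
    predicted_risk = rng.beta(2, 3, size=n)          # stand-in for the model's predicted mortality
    died = rng.random(n) < predicted_risk            # simulated outcomes consistent with them

    # Discrimination
    print(f"AUC = {roc_auc_score(died, predicted_risk):.3f}")

    # Calibration: observed mortality within deciles of predicted risk
    deciles = np.digitize(predicted_risk, np.quantile(predicted_risk, np.linspace(0.1, 0.9, 9)))
    for d in range(10):
        mask = deciles == d
        print(f"decile {d + 1}: predicted {predicted_risk[mask].mean():.2f}, "
              f"observed {died[mask].mean():.2f}")
    ```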

  20. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    Science.gov (United States)

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the
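
    A TK refinement of this kind typically rests on a one-compartment model in which the internal dose builds up from a time-varying intake and is eliminated at a first-order rate. In the sketch below, the feeding pattern, uptake and elimination constants are hypothetical placeholders, not the calibrated skylark or wood mouse parameters.

    ```python
    import numpy as np

    def internal_dose(intake_rate, k_elim, t_end=10.0, dt=0.01):
        """One-compartment TK model: dC/dt = intake(t) - k_elim * C, solved with Euler steps.

        intake_rate: function of time in days (mg a.i./kg bw/day ingested with treated seed).
        k_elim: first-order elimination rate constant (1/day).
        """
        times = np.arange(0.0, t_end + dt, dt)
        conc = np.zeros_like(times)
        for i in range(1, len(times)):
            conc[i] = conc[i - 1] + dt * (intake_rate(times[i - 1]) - k_elim * conc[i - 1])
        return times, conc

    # Hypothetical feeding pattern: birds feed on treated seed only during daylight hours
    def intake(t):
        hour_of_day = (t % 1.0) * 24.0
        return 2.0 if 6.0 <= hour_of_day <= 18.0 else 0.0   # mg/kg bw/day while feeding

    times, conc = internal_dose(intake, k_elim=0.8)
    print(f"Peak internal dose: {conc.max():.2f} mg/kg bw")
    ```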

  1. Risk-based performance indicators

    International Nuclear Information System (INIS)

    Azarm, M.A.; Boccio, J.L.; Vesely, W.E.; Lofgren, E.

    1987-01-01

    The purpose of risk-based indicators is to monitor plant safety. Safety is measured by monitoring the potential for core melt (core-melt frequency) and the public risk. Targets for these measures can be set consistent with NRC safety goals. In this process, the performance of safety systems, support systems, major components, and initiating events can be monitored using measures such as unavailability, failure or occurrence frequency. The changes in performance measures and their trends are determined from the time behavior of monitored measures by differentiating between stochastic and actual variations. Therefore, degradation, as well as improvement in the plant safety performance, can be determined. The development of risk-based performance indicators will also provide the means to trace a change in the safety measures to specific problem areas which are amenable to root cause analysis and inspection audits. In addition, systematic methods will be developed to identify specific improvement policies using the plant information system for the identified problem areas. The final product of the performance indicator project will be a methodology, and an integrated and validated set of software packages which, if properly interfaced with the logic model software of a plant, can monitor the plant performance as plant information is provided as input

  2. A new active portfolio risk management for an electricity retailer based on a drawdown risk preference

    International Nuclear Information System (INIS)

    Charwand, Mansour; Gitizadeh, Mohsen; Siano, Pierluigi

    2017-01-01

    This paper addresses decision making under uncertainty by an electricity retailer seeking to maximise its total expected rate of return. The developed methodology is based on modelling the stochastic evolution of zonal prices in order to manage a portfolio of different contracts. The retailer's load and the price at each zone are forecasted using the seasonal autoregressive integrated moving average (SARIMA) model, and a clustering technique is used for scenario reduction. Supply sources include the pool, self-production facilities, forward and bilateral contracts. The risk of cost fluctuation due to uncertainties is explicitly modelled using the multi-scenario drawdown methodology. This risk function quantifies in aggregated form the frequency and magnitude of the portfolio drawdowns over the planning horizon. In-sample and out-of-sample runs are performed for a portfolio allocation problem. Experimental results carried out on the basis of realistic data show that imposing risk constraints improves the “real” performance of portfolio management in out-of-sample runs. - Highlights: • A new drawdown-based method is introduced for retailer decision making under uncertainty. • This tool is used to assess the risk levels regarding retailer midterm strategies. • The methodology is based on the modeling of the stochastic evolution of zonal prices. • The risk function quantifies the frequency and magnitude of the portfolio drawdowns. • In-sample and out-of-sample runs are performed for a portfolio allocation problem.
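
    Drawdown-based risk control works on the path of cumulative portfolio value rather than a single end-of-horizon loss: the drawdown at any time is the drop from the running maximum, and the risk function aggregates the size and frequency of such drops across price scenarios. A minimal sketch follows; the scenario generator is a placeholder, not the SARIMA model of the paper.

    ```python
    import numpy as np

    def drawdowns(cumulative_value):
        """Drawdown path: drop from the running maximum of cumulative portfolio value."""
        running_max = np.maximum.accumulate(cumulative_value)
        return running_max - cumulative_value

    def average_max_drawdown(scenarios):
        """Aggregate drawdown risk across scenarios (here: mean of per-scenario maxima)."""
        return np.mean([drawdowns(path).max() for path in scenarios])

    # Hypothetical zonal-price-driven profit scenarios for a retailer portfolio
    rng = np.random.default_rng(3)
    weekly_pnl = rng.normal(loc=0.05, scale=1.0, size=(500, 52))   # 500 paths, 52 weeks
    scenarios = np.cumsum(weekly_pnl, axis=1)                      # cumulative value paths
    print(f"Average maximum drawdown over the planning horizon: {average_max_drawdown(scenarios):.2f}")
    ```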

  3. Fuzzy logic model to quantify risk perception

    International Nuclear Information System (INIS)

    Bukh, Julia; Dickstein, Phineas

    2008-01-01

    The aim of this study is to quantify public risk perception of the nuclear field so that it can be considered in decision making whenever public involvement is sought. The proposed model includes both qualitative factors influencing risk perception, such as familiarity and voluntariness, and numerical factors, such as probability of occurrence and severity of consequence. Since some of these factors can be characterized only by qualitative expressions, and their determination is linked with vagueness, imprecision and uncertainty, the most suitable method for risk level assessment is Fuzzy Logic, which models qualitative aspects of knowledge and reasoning processes without employing precise quantitative analyses. This work, then, offers a Fuzzy-Logic-based means of representing risk perception by a single numerical feature, which can be weighted and accounted for in decision-making procedures. (author)
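
    A minimal sketch of the fuzzy-logic machinery in this spirit, assuming simple shoulder (ramp) membership functions and two illustrative if-then rules; the factor names (familiarity, severity), breakpoints and rule consequents are assumptions, not the authors' rule base.

```python
# A hedged sketch of the fuzzy-logic idea: two fuzzy input factors are combined
# by if-then rules and defuzzified into a single perceived-risk score.
# Factor names, breakpoints and rule consequents are illustrative assumptions.

def ramp_down(x, a, b):
    """1 below a, falling linearly to 0 at b (a 'low' shoulder)."""
    return max(0.0, min(1.0, (b - x) / (b - a)))

def ramp_up(x, a, b):
    """0 below a, rising linearly to 1 at b (a 'high' shoulder)."""
    return max(0.0, min(1.0, (x - a) / (b - a)))

def perceived_risk(familiarity, severity):
    # Rule 1: IF familiarity is LOW AND severity is HIGH THEN risk is HIGH (0.9)
    r1 = min(ramp_down(familiarity, 0.0, 0.5), ramp_up(severity, 0.5, 1.0))
    # Rule 2: IF familiarity is HIGH AND severity is LOW THEN risk is LOW (0.2)
    r2 = min(ramp_up(familiarity, 0.5, 1.0), ramp_down(severity, 0.0, 0.5))
    fired = r1 + r2
    return 0.5 if fired == 0 else (0.9 * r1 + 0.2 * r2) / fired  # weighted defuzzification

print(perceived_risk(familiarity=0.2, severity=0.8))  # unfamiliar and severe -> high risk
```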

  4. Big data based fraud risk management at Alibaba

    OpenAIRE

    Chen, Jidong; Tao, Ye; Wang, Haoran; Chen, Tao

    2015-01-01

    With development of mobile internet and finance, fraud risk comes in all shapes and sizes. This paper is to introduce the Fraud Risk Management at Alibaba under big data. Alibaba has built a fraud risk monitoring and management system based on real-time big data processing and intelligent risk models. It captures fraud signals directly from huge amount data of user behaviors and network, analyzes them in real-time using machine learning, and accurately predicts the bad users and transactions....

  5. A Fuzzy Comprehensive Evaluation Model for Sustainability Risk Evaluation of PPP Projects

    Directory of Open Access Journals (Sweden)

    Libiao Bai

    2017-10-01

    Full Text Available Evaluating the sustainability risk level of public–private partnership (PPP) projects can reduce project risk incidents and achieve the sustainable development of the organization. However, existing studies on PPP project risk management mainly focus on exploring the impact of financial and revenue risks but ignore sustainability risks, so the concept of “sustainability” is missing when the risk level of PPP projects is evaluated. To evaluate the sustainability risk level and achieve the most important objective of providing a reference for the public and private sectors when making decisions on PPP project management, this paper constructs a factor system of sustainability risk of PPP projects based on an extensive literature review and develops a mathematical model based on the fuzzy comprehensive evaluation model (FCEM) and failure mode, effects and criticality analysis (FMECA) for evaluating the sustainability risk level of PPP projects. In addition, this paper conducts a computational experiment based on a questionnaire survey to verify the effectiveness and feasibility of the proposed model. The results suggest that this model is reasonable for evaluating the sustainability risk level of PPP projects. To our knowledge, this paper is the first study to evaluate the sustainability risk of PPP projects, which would not only enrich the theories of project risk management, but also serve as a reference for the public and private sectors for sustainable planning and development. Keywords: sustainability risk eva
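
    The core FCEM step can be sketched as a criteria weight vector composed with a membership matrix over risk grades. The weights, grades and membership degrees below are illustrative assumptions, not the paper's questionnaire results.

```python
# A hedged sketch of the core fuzzy comprehensive evaluation step. The criteria
# weights, risk grades and membership degrees are illustrative assumptions.
import numpy as np

weights = np.array([0.4, 0.35, 0.25])        # e.g. economic, social, environmental criteria
grades = ["low", "medium", "high"]
membership = np.array([                      # rows: criteria, columns: risk grades
    [0.5, 0.3, 0.2],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

evaluation = weights @ membership            # weighted-average fuzzy composition
evaluation = evaluation / evaluation.sum()   # normalise to a grade distribution
print(dict(zip(grades, evaluation.round(3))), "->", grades[int(evaluation.argmax())])
```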

  6. APPLICATION OF KMV MODEL TO ASSESS CREDIT RISK OF INDIVIDUAL ENTREPRENEURS

    Directory of Open Access Journals (Sweden)

    Taishin A. A.

    2014-09-01

    Full Text Available The problem of credit risk is relevant for banks. The purpose of this research is to develop a technique for adapting and applying the KMV model to evaluate the credit risk of Russian entrepreneurs. The proposed KMV-based method for evaluating the credit risk of Russian entrepreneurs has many advantages. Automation of the calculations, based on plausible assumptions, will significantly reduce the time needed to process a customer's request. The article contains an analysis of the KMV model based on up-to-date theoretical results. The author investigates possible modifications and generalizations of the model and its practical implementation for estimating an entrepreneur's default risk, using the Visual Basic for Applications software package on the example of the entrepreneur's management reporting. The features of its application are shown in light of modern achievements in the theory and practice of financial analysis. The article presents a completed KMV-based credit risk evaluation for Russian entrepreneurs and offers more precise recommendations for the practical use of KMV as a basic risk assessment tool.
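
    For orientation, a minimal sketch of the distance-to-default calculation at the heart of the KMV/Merton approach, assuming the asset value and asset volatility have already been estimated from equity data. All figures are illustrative, not the article's data.

```python
# A hedged sketch of the distance-to-default step of the KMV/Merton approach.
# Asset value, debt, volatility and drift below are illustrative assumptions.
from math import log, sqrt
from statistics import NormalDist

def distance_to_default(asset_value, debt, asset_vol, drift=0.05, horizon=1.0):
    dd = (log(asset_value / debt) + (drift - 0.5 * asset_vol ** 2) * horizon) / (
        asset_vol * sqrt(horizon)
    )
    edf = NormalDist().cdf(-dd)   # expected default frequency under a normal mapping
    return dd, edf

dd, edf = distance_to_default(asset_value=1_200_000, debt=800_000, asset_vol=0.35)
print(round(dd, 2), round(edf, 4))
```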

  7. Study of operational risk-based configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Vesely, W E [Science Applications International Corp., Dublin, OH (United States); Samanta, P K; Kim, I S [Brookhaven National Lab., Upton, NY (United States)

    1991-08-01

    This report studies aspects of a risk-based configuration control system to detect and control plant configurations from a risk perspective. Configuration control, as the term is used here, is the management of component configurations to achieve specific objectives. One important objective is to control risk and safety. Another is to operate efficiently and make effective use of available resources. PSA-based evaluations are performed to study the contributions of configurations to core-melt frequency and core-melt probability for two plants. Some equipment configurations can cause a large core-melt frequency, and a number of such configurations are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small and the core-melt probability contributions are also generally small. The insights from this evaluation are used to develop the framework for an effective risk-based configuration control system. The focal points of such a system and the requirements for tool development for implementing the system are defined. The requirements of risk models needed for the system, and the uses of plant-specific data, are also discussed. 18 refs., 25 figs., 10 tabs.

  8. Comparison of additive (absolute) risk projection models and multiplicative (relative) risk projection models in estimating radiation-induced lifetime cancer risk

    International Nuclear Information System (INIS)

    Kai, Michiaki; Kusama, Tomoko

    1990-01-01

    Lifetime cancer risk estimates depend on risk projection models. While the increasing lengths of follow-up observation periods of atomic bomb survivors in Hiroshima and Nagasaki bring about changes in cancer risk estimates, the validity of the two risk projection models, the additive risk projection model (AR) and multiplicative risk projection model (MR), comes into question. This paper compares the lifetime risk or loss of life-expectancy between the two projection models on the basis of BEIR-III report or recently published RERF report. With Japanese cancer statistics the estimates of MR were greater than those of AR, but a reversal of these results was seen when the cancer hazard function for India was used. When we investigated the validity of the two projection models using epidemiological human data and animal data, the results suggested that MR was superior to AR with respect to temporal change, but there was little evidence to support its validity. (author)
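
    The contrast between the two projection models can be sketched numerically: the additive model adds a dose-dependent excess rate to every age band, while the multiplicative model scales the baseline rate. The baseline rates and excess-risk coefficients below are illustrative assumptions, not BEIR-III or RERF values.

```python
# A hedged sketch of the two projection ideas: the additive model adds a
# dose-dependent excess rate to each age band, the multiplicative model scales
# the baseline rate. Baseline rates and coefficients are illustrative only.

baseline_rate = {age: 0.001 * 1.09 ** (age - 40) for age in range(40, 85, 5)}  # per year
band_width = 5  # years per age band

def lifetime_excess(dose_sv, additive_per_sv=5e-4, relative_per_sv=0.5):
    additive = multiplicative = 0.0
    for rate in baseline_rate.values():
        additive += additive_per_sv * dose_sv * band_width               # independent of baseline
        multiplicative += relative_per_sv * dose_sv * rate * band_width  # proportional to baseline
    return round(additive, 5), round(multiplicative, 5)

# With a low baseline cancer rate the additive estimate dominates; with a high
# baseline the multiplicative one does, the reversal effect noted in the abstract.
print(lifetime_excess(dose_sv=0.1))
```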

  9. Erosion risk assessment in the southern Amazon - Data Preprocessing, data base application and process based modelling

    Science.gov (United States)

    Schindewolf, Marcus; Herrmann, Marie-Kristin; Herrmann, Anne-Katrin; Schultze, Nico; Amorim, Ricardo S. S.; Schmidt, Jürgen

    2015-04-01

    The study region along the BR 16 highway belongs to the "Deforestation Arc" at the southern border of the Amazon rainforest. At the same time, it incorporates a land use gradient, as colonization started in 1975-1990 in central Mato Grosso, in 1990 in northern Mato Grosso, and most recently in 2004-2005 in southern Pará. Based on present knowledge, soil erosion is one of the key drivers of soil degradation. Hence, there is a strong need to implement soil erosion control measures in eroding landscapes. Planning and dimensioning of such measures require reliable and detailed information on the temporal and spatial distribution of soil loss, sediment transport and deposition. Soil erosion models are increasingly used to simulate the physical processes involved and to predict the effects of soil erosion control measures. The process-based EROSION 3D simulation model is used for surveying soil erosion and deposition in regional catchments. Although EROSION 3D is a widespread, extensively validated model, its application on the regional scale remains challenging due to the enormous data requirements and complex data processing operations. In this context the study includes the compilation, validation and generalisation of existing land use and soil data in order to generate a consistent EROSION 3D input dataset. As part of this process a GIS-linked database application allows the original soil and land use data to be transferred into model-specific parameter files. This combined methodology provides different risk assessment maps for specific demands on the regional scale. Besides soil loss and sediment transport, sediment pass-over points into surface water bodies and particle enrichment can be simulated using the EROSION 3D model. Thus the estimation of particle-bound nutrient and pollutant inputs into surface water bodies becomes possible. The study resulted in a user-friendly, timesaving and improved software package for the simulation of soil loss and

  10. Modeling risk assessment for nuclear processing plants with LAVA

    International Nuclear Information System (INIS)

    Smith, S.T.; Tisinger, R.M.

    1988-01-01

    Using the Los Alamos Vulnerability and Risk Assessment (LAVA) methodology, the authors developed a model for assessing risks associated with nuclear processing plants. LAVA is a three-part systematic approach to risk assessment. The first part is the mathematical methodology; the second is the general personal computer-based software engine; and the third is the application itself. The methodology provides a framework for creating applications for the software engine to operate upon; all application-specific information is data. Using LAVA, the authors build knowledge-based expert systems to assess risks in applications systems comprising a subject system and a safeguards system. The subject system model is sets of threats, assets, and undesirable outcomes. The safeguards system model is sets of safeguards functions for protecting the assets from the threats by preventing or ameliorating the undesirable outcomes, sets of safeguards subfunctions whose performance determine whether the function is adequate and complete, and sets of issues, appearing as interactive questionnaires, whose measures (in both monetary and linguistic terms) define both the weaknesses in the safeguards system and the potential costs of an undesirable outcome occurring

  11. Fuzzy rule-based modelling for human health risk from naturally occurring radioactive materials in produced water

    International Nuclear Information System (INIS)

    Shakhawat, Chowdhury; Tahir, Husain; Neil, Bose

    2006-01-01

    Produced water, discharged from offshore oil and gas operations, contains chemicals from formation water, condensed water, and any chemical added down hole or during the oil/water separation process. Although most of the contaminants fall below the detection limits within a short distance from the discharge port, a few of the remaining contaminants including naturally occurring radioactive materials (NORM) are of concern due to their bioavailability in the media and bioaccumulation characteristics in finfish and shellfish species used for human consumption. In the past, several initiatives have been taken to model human health risk from NORM in produced water. The parameters of the available risk assessment models are imprecise and sparse in nature. In this study, a fuzzy possibilistic evaluation using fuzzy rule-based modeling is presented. Being conservative in nature, the possibilistic approach considers possible input parameter values and thus provides better environmental prediction than the Monte Carlo (MC) calculation. The uncertainties of the input parameters were captured with fuzzy triangular membership functions (TFNs). Fuzzy if-then rules were applied to the input concentrations of two isotopes of radium available in produced water, namely 226Ra and 228Ra, and to the bulk dilution, in order to evaluate the radium concentration in fish tissue used for human consumption. The bulk dilution was predicted using four input parameters: produced water discharge rate, ambient seawater velocity, depth of discharge port and density gradient. The evaluated cancer risk shows compliance with the regulatory guidelines; thus minimal risk to human health is expected from NORM components in produced water

  12. Operational Risk Measurement of Chinese Commercial Banks Based on Extreme Value Theory

    Science.gov (United States)

    Song, Jiashan; Li, Yong; Ji, Feng; Peng, Cheng

    Financial institutions and supervisory institutions have all agreed on strengthening the measurement and management of operational risks. This paper attempts to build a model of operational risk losses based on the Peaks Over Threshold model, emphasizing a weighted least squares refinement of Hill's estimation method, discussing the small-sample situation, and fixing the sample threshold more objectively based on media-published data on the operational risk losses of major banks from 1994 to 2007.
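
    A minimal sketch of the Peaks-Over-Threshold idea referred to above, using the classical (unweighted) Hill estimator and a Weissman-type high quantile rather than the paper's weighted-least-squares refinement; the synthetic loss data and the threshold are assumptions.

```python
# A hedged sketch of a Peaks-Over-Threshold tail fit with the classical Hill
# estimator and a Weissman-type high quantile. Data and threshold are synthetic.
import numpy as np

def hill_tail_index(losses, threshold):
    exceedances = losses[losses > threshold]
    return np.mean(np.log(exceedances / threshold))            # estimate of xi = 1/alpha

def pot_quantile(losses, threshold, p=0.999):
    losses = np.asarray(losses, dtype=float)
    xi = hill_tail_index(losses, threshold)
    n, k = len(losses), int(np.sum(losses > threshold))
    return threshold * ((n / k) * (1.0 - p)) ** (-xi)           # Weissman extrapolation

rng = np.random.default_rng(0)
simulated_losses = rng.pareto(2.5, size=2000) * 10.0            # heavy-tailed synthetic losses
print(round(pot_quantile(simulated_losses, threshold=20.0), 1)) # 99.9% loss quantile
```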

  13. 77 FR 53059 - Risk-Based Capital Guidelines: Market Risk

    Science.gov (United States)

    2012-08-30

    ...'' framework that includes (1) Risk-based capital requirements for credit risk, market risk, and operational... default and credit quality migration risk for non-securitization credit products. With respect to... securitization positions, the revisions assign a specific risk- weighting factor based on the credit rating of a...

  14. Mode of action based risk assessment of the botanical food-borne alkenylbenzene apiol from parsley using physiologically based kinetic (PBK) modelling and read-across from safrole.

    Science.gov (United States)

    Alajlouni, Abdalmajeed M; Al Malahmeh, Amer J; Kiwamoto, Reiko; Wesseling, Sebastiaan; Soffers, Ans E M F; Al-Subeihi, Ala A A; Vervoort, Jacques; Rietjens, Ivonne M C M

    2016-03-01

    The present study developed physiologically-based kinetic (PBK) models for the alkenylbenzene apiol in order to facilitate risk assessment based on read-across from the related alkenylbenzene safrole. Model predictions indicate that in rat liver the formation of the 1'-sulfoxy metabolite is about 3 times lower for apiol than for safrole. These data support that the lower confidence limit of the benchmark dose resulting in a 10% extra cancer incidence (BMDL10) that would be obtained in a rodent carcinogenicity study with apiol may be 3-fold higher for apiol than for safrole. These results enable a preliminary risk assessment for apiol, for which tumor data are not available, using a BMDL10 value of 3 times the BMDL10 for safrole. Based on an estimated BMDL10 for apiol of 5.7-15.3 mg/kg body wt per day and an estimated daily intake of 4 × 10(-5) mg/kg body wt per day, the margin of exposure (MOE) would amount to 140,000-385,000. This indicates a low priority for risk management. The present study shows how PBK modelling can contribute to the development of alternatives for animal testing, facilitating read-across from compounds for which in vivo toxicity studies on tumor formation are available to compounds for which these data are unavailable. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
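
    The margin-of-exposure arithmetic reported above can be reproduced directly; the sketch below simply divides the quoted BMDL10 range by the estimated daily intake, and notes the commonly used interpretation threshold.

```python
# A hedged sketch reproducing the margin-of-exposure arithmetic quoted above:
# MOE = BMDL10 / estimated daily intake. An MOE above 10,000 is commonly taken
# to indicate a low priority for risk management.

bmdl10_range = (5.7, 15.3)   # mg/kg bw per day, read-across estimate for apiol
daily_intake = 4e-5          # mg/kg bw per day, estimated exposure

moe = tuple(round(bmdl10 / daily_intake) for bmdl10 in bmdl10_range)
print(moe)                   # about (142500, 382500), i.e. roughly 140,000-385,000
```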

  15. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  16. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process and reasons are given for using probabilities-based modeling with the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present

  17. Agent based models for testing city evacuation strategies under a flood event as strategy to reduce flood risk

    Science.gov (United States)

    Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran

    2016-04-01

    This research explores the use of Agent Based Models (ABM) and their potential to test large-scale evacuation strategies in coastal cities at risk from flooding due to extreme hydro-meteorological events, with the final purpose of disaster risk reduction by decreasing human exposure to the hazard. The first part of the paper covers the theory used to build the models, such as complex adaptive systems (CAS) and the principles and uses of ABM in this field; it also outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scales. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioural model and the framework used in this research, the PECS reference model; it also covers the main attributes and characteristics of human beings used to describe the agents. The third part of the paper presents the methodology used to build and implement the ABM using Repast Simphony, an open-source agent-based modelling and simulation platform. The preliminary results of the first implementation, in a region of Sint-Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city

  18. Expert judgment based multi-criteria decision model to address uncertainties in risk assessment of nanotechnology-enabled food products

    International Nuclear Information System (INIS)

    Flari, Villie; Chaudhry, Qasim; Neslo, Rabin; Cooke, Roger

    2011-01-01

    Currently, risk assessment of nanotechnology-enabled food products is considered difficult due to the large number of uncertainties involved. We developed an approach which could address some of the main uncertainties through the use of expert judgment. Our approach employs a multi-criteria decision model, based on probabilistic inversion that enables capturing experts’ preferences in regard to safety of nanotechnology-enabled food products, and identifying their opinions in regard to the significance of key criteria that are important in determining the safety of such products. An advantage of these sample-based techniques is that they provide out-of-sample validation and therefore a robust scientific basis. This validation in turn adds predictive power to the model developed. We achieved out-of-sample validation in two ways: (1) a portion of the expert preference data was excluded from the model’s fitting and was then predicted by the model fitted on the remaining rankings and (2) a (partially) different set of experts generated new scenarios, using the same criteria employed in the model, and ranked them; their ranks were compared with ranks predicted by the model. The degree of validation in each method was less than perfect but reasonably substantial. The validated model we applied captured and modelled experts’ preferences regarding safety of hypothetical nanotechnology-enabled food products. It appears therefore that such an approach can provide a promising route to explore further for assessing the risk of nanotechnology-enabled food products.

  19. A Probabilistic Typhoon Risk Model for Vietnam

    Science.gov (United States)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces of Vietnam, from the low-lying Mekong River delta region in the southwest to the Red River Delta region in northern Vietnam, are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with a maximum sustained wind speed of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed Central Vietnam as a category 2 typhoon causing significant damage to properties. As tropical cyclone risk is expected to increase with increasing exposure and population growth along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  20. Liver stiffness value-based risk estimation of late recurrence after curative resection of hepatocellular carcinoma: development and validation of a predictive model.

    Directory of Open Access Journals (Sweden)

    Kyu Sik Jung

    Full Text Available Preoperative liver stiffness (LS) measurement using transient elastography (TE) is useful for predicting late recurrence after curative resection of hepatocellular carcinoma (HCC). We developed and validated a novel LS value-based predictive model for late recurrence of HCC. Patients who were due to undergo curative resection of HCC between August 2006 and January 2010 were prospectively enrolled and TE was performed prior to operations by study protocol. The predictive model of late recurrence was constructed based on a multiple logistic regression model. Discrimination and calibration were used to validate the model. Among a total of 139 patients who were finally analyzed, late recurrence occurred in 44 patients, with a median follow-up of 24.5 months (range, 12.4-68.1). We developed a predictive model for late recurrence of HCC using LS value, activity grade II-III, presence of multiple tumors, and indocyanine green retention rate at 15 min (ICG R15), which showed fairly good discrimination capability with an area under the receiver operating characteristic curve (AUROC) of 0.724 (95% confidence intervals [CIs], 0.632-0.816). In the validation, using a bootstrap method to assess discrimination, the AUROC remained largely unchanged between iterations, with an average AUROC of 0.722 (95% CIs, 0.718-0.724). When we plotted a calibration chart for predicted and observed risk of late recurrence, the predicted risk of late recurrence correlated well with observed risk, with a correlation coefficient of 0.873 (P<0.001). A simple LS value-based predictive model could estimate the risk of late recurrence in patients who underwent curative resection of HCC.
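
    A minimal sketch of the workflow described above (multiple logistic regression followed by a bootstrap check of discrimination), run on synthetic data; the covariate distributions and coefficients are assumptions, not the study's cohort.

```python
# A hedged sketch: logistic regression on synthetic covariates mimicking those
# in the abstract, with a bootstrap estimate of the AUROC. Data are not the
# study's cohort; all distributions and coefficients are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 139
X = np.column_stack([
    rng.normal(12, 6, n),      # liver stiffness (kPa)
    rng.integers(0, 2, n),     # activity grade II-III (0/1)
    rng.integers(0, 2, n),     # multiple tumors (0/1)
    rng.normal(12, 5, n),      # ICG R15 (%)
])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.1 * X[:, 0] + 0.8 * X[:, 1] - 3.0))))

model = LogisticRegression(max_iter=1000).fit(X, y)
scores = model.predict_proba(X)[:, 1]

aucs = []
for _ in range(200):                            # bootstrap resampling of the AUROC
    idx = rng.integers(0, n, n)
    if len(set(y[idx])) == 2:
        aucs.append(roc_auc_score(y[idx], scores[idx]))
print(round(float(np.mean(aucs)), 3), np.round(np.percentile(aucs, [2.5, 97.5]), 3))
```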

  1. The Global Earthquake Model and Disaster Risk Reduction

    Science.gov (United States)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  2. Breast cancer screening in an era of personalized regimens: a conceptual model and National Cancer Institute initiative for risk-based and preference-based approaches at a population level.

    Science.gov (United States)

    Onega, Tracy; Beaber, Elisabeth F; Sprague, Brian L; Barlow, William E; Haas, Jennifer S; Tosteson, Anna N A; D Schnall, Mitchell; Armstrong, Katrina; Schapira, Marilyn M; Geller, Berta; Weaver, Donald L; Conant, Emily F

    2014-10-01

    Breast cancer screening holds a prominent place in public health, health care delivery, policy, and women's health care decisions. Several factors are driving shifts in how population-based breast cancer screening is approached, including advanced imaging technologies, health system performance measures, health care reform, concern for "overdiagnosis," and improved understanding of risk. Maximizing benefits while minimizing the harms of screening requires moving from a "1-size-fits-all" guideline paradigm to more personalized strategies. A refined conceptual model for breast cancer screening is needed to align women's risks and preferences with screening regimens. A conceptual model of personalized breast cancer screening is presented herein that emphasizes key domains and transitions throughout the screening process, as well as multilevel perspectives. The key domains of screening awareness, detection, diagnosis, and treatment and survivorship are conceptualized to function at the level of the patient, provider, facility, health care system, and population/policy arena. Personalized breast cancer screening can be assessed across these domains with both process and outcome measures. Identifying, evaluating, and monitoring process measures in screening is a focus of a National Cancer Institute initiative entitled PROSPR (Population-based Research Optimizing Screening through Personalized Regimens), which will provide generalizable evidence for a risk-based model of breast cancer screening. The model presented builds on prior breast cancer screening models and may serve to identify new measures to optimize benefits-to-harms tradeoffs in population-based screening, which is a timely goal in the era of health care reform. © 2014 American Cancer Society.

  3. Statistical models for competing risk analysis

    International Nuclear Information System (INIS)

    Sather, H.N.

    1976-08-01

    Research results are presented on three new models for potential applications in competing risks problems. One section covers the basic statistical relationships underlying the subsequent competing risks model development. Another discusses the problem of comparing cause-specific risk structure by competing risks theory in two homogeneous populations, P1 and P2. Weibull models, which allow more generality than the Berkson and Elveback models, are studied for the effect of time on the hazard function. The use of concomitant information for modeling single-risk survival is extended to the multiple failure mode domain of competing risks. The model used to illustrate the use of this methodology is a life table model which has constant hazards within pre-designated intervals of the time scale. Two parametric models for bivariate dependent competing risks, which provide interesting alternatives, are proposed and examined.

  4. Development of good modelling practice for physiologically based pharmacokinetic models for use in risk assessment: The first steps

    Science.gov (United States)

    The increasing use of tissue dosimetry estimated using pharmacokinetic models in chemical risk assessments in multiple countries necessitates the development of internationally recognized good modelling practices. These practices would facilitate sharing of models and model eva...

  5. A Risk-Based Approach for Asset Allocation with A Defaultable Share

    Directory of Open Access Journals (Sweden)

    Yang Shen

    2018-02-01

    Full Text Available This paper presents a novel risk-based approach for an optimal asset allocation problem with default risk, where a money market account, an ordinary share and a defaultable security are investment opportunities in a general non-Markovian economy incorporating random market parameters. The objective of an investor is to select an optimal mix of these securities such that a risk metric of an investment portfolio is minimized. By adopting a sub-additive convex risk measure, which takes into account interest rate risk, as a measure for risk, the investment problem is discussed mathematically in a form of a two-player, zero-sum, stochastic differential game between the investor and the market. A backward stochastic differential equation approach is used to provide a flexible and theoretically sound way to solve the game problem. Closed-form expressions for the optimal strategies of the investor and the market are obtained when the penalty function is a quadratic function and when the risk measure is a sub-additive coherent risk measure. An important case of the general non-Markovian model, namely the self-exciting threshold diffusion model with time delay, is considered. Numerical examples based on simulations for the self-exciting threshold diffusion model with and without time delay are provided to illustrate how the proposed model can be applied in this important case. The proposed model can be implemented using Excel spreadsheets.

  6. A Macroeconomic Model of Credit Risk in Uruguay

    Directory of Open Access Journals (Sweden)

    Gabriel Illanes

    Full Text Available In this paper we evaluate the credit risk of the economy as a whole, aiming at the study of financial stability. This analysis uses as a proxy the credit granted by the banking system. We use a non-linear parametric model based on Merton's structural framework for the analysis of the risk associated with a loan portfolio. In this model, default occurs when the return of an economic agent falls under a certain threshold which depends on different macroeconomic variables. We use this model to assess the credit risk module in stress tests for the local banking system. We also estimate the "elasticities" of credit categories corresponding to corporate credit and consumer credit, both in national currency and US dollars. We obtain the parameters for the model using maximum likelihood, where the likelihood function contains a random latent factor which is assumed to follow a normal distribution.

  7. STRUCTURE OF MODELS FOR AGGREGATE ASSESSMENT OF FINANCIAL RISK COMMERCIAL BANKS

    OpenAIRE

    G. Kryshtal

    2016-01-01

    Conceptual approaches use a structural model for the assessment of the financial risk of commercial banks, namely risk measurement that combines a comparison of the bank's capital calculated under the standardised approach of Basel II, the advanced approaches of Basel II, and the structural model. The application of the model in an economic crisis situation, such as the capital adequacy of commercial banks, is analysed. The paper deals with a unified approach to the choice of a risk measure and its parameters to measur...

  8. Risk Analysis Method Based on FMEA for Transmission Line in Lightning Hazards

    Directory of Open Access Journals (Sweden)

    You-Yuan WANG

    2014-05-01

    Full Text Available The failure rate of transmission lines and the reliability of the power system are significantly affected by lightning meteorological factors. In view of the complexity and variability of lightning meteorological factors, this paper presents a lightning trip-out rate model for transmission lines that considers the distribution of ground flash density and lightning-day hours. It also presents a failure rate model for transmission lines under different conditions, and a risk analysis method for transmission lines that considers multiple risk factors based on risk quantification. This method takes the lightning meteorological factor as the main evaluation standard and establishes a risk degree evaluation system for transmission lines that includes five further evaluation standards. Risk indicators are put forward by quantifying the risk factors based on in-service operational experience data of transmission lines. A comprehensive evaluation is conducted based on the risk indexes, and an evaluation result closer to practice is achieved, providing a basis for transmission line risk warning and maintenance strategy. Through the risk analysis of a 220 kV transmission line in a certain power supply bureau, the effectiveness of the proposed method is validated.

  9. A risk assessment model based on fuzzy logic for electricity distribution system asset management

    Directory of Open Access Journals (Sweden)

    Alireza Yazdani

    2014-06-01

    Full Text Available Electricity distribution systems are considered among the most critical sectors in countries because of the essentiality of power supply security, socioeconomic security, and way of life. Given the central role of electricity distribution systems, risk analysis helps decision makers determine the most serious risk items and allocate the optimal amount of resources and time. The probability-impact (PI) matrix is one of the most popular methods for assessing the risks involved in a system. However, the traditional PI matrix is criticized for its inability to take into account the inherent uncertainty imposed by real-world systems. On the other hand, fuzzy sets are capable of handling this uncertainty. Thus, in this paper, a fuzzy risk assessment model is developed to assess and manage risk for electricity distribution system asset protection. Finally, a comparison analysis is conducted to show the effectiveness and the capability of the new risk assessment model.

  10. Predicting the cumulative risk of death during hospitalization by modeling weekend, weekday and diurnal mortality risks.

    Science.gov (United States)

    Coiera, Enrico; Wang, Ying; Magrabi, Farah; Concha, Oscar Perez; Gallego, Blanca; Runciman, William

    2014-05-21

    Current prognostic models factor in patient and disease specific variables but do not consider cumulative risks of hospitalization over time. We developed risk models of the likelihood of death associated with cumulative exposure to hospitalization, based on time-varying risks of hospitalization over any given day, as well as day of the week. Model performance was evaluated alone, and in combination with simple disease-specific models. Patients admitted between 2000 and 2006 from 501 public and private hospitals in NSW, Australia were used for training and 2007 data for evaluation. The impact of hospital care delivered over different days of the week and or times of the day was modeled by separating hospitalization risk into 21 separate time periods (morning, day, night across the days of the week). Three models were developed to predict death up to 7-days post-discharge: 1/a simple background risk model using age, gender; 2/a time-varying risk model for exposure to hospitalization (admission time, days in hospital); 3/disease specific models (Charlson co-morbidity index, DRG). Combining these three generated a full model. Models were evaluated by accuracy, AUC, Akaike and Bayesian information criteria. There was a clear diurnal rhythm to hospital mortality in the data set, peaking in the evening, as well as the well-known 'weekend-effect' where mortality peaks with weekend admissions. Individual models had modest performance on the test data set (AUC 0.71, 0.79 and 0.79 respectively). The combined model which included time-varying risk however yielded an average AUC of 0.92. This model performed best for stays up to 7-days (93% of admissions), peaking at days 3 to 5 (AUC 0.94). Risks of hospitalization vary not just with the day of the week but also time of the day, and can be used to make predictions about the cumulative risk of death associated with an individual's hospitalization. Combining disease specific models with such time varying- estimates appears to

  11. Model risk analysis for risk management and option pricing

    NARCIS (Netherlands)

    Kerkhof, F.L.J.

    2003-01-01

    Due to the growing complexity of products in financial markets, market participants rely more and more on quantitative models for trading and risk management decisions. This introduces a fairly new type of risk, namely, model risk. In the first part of this thesis we investigate the quantitative

  12. Challenges of Modeling Flood Risk at Large Scales

    Science.gov (United States)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public sector or in the private sector is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale such that all inter-dependencies in a river network are well understood. From an insurance perspective the jury is still out there on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at the scale suitable for a viable insurance market for flood peril. In order to serve the insurance market a model has to be event-simulation based and has to provide financial risk estimation that forms the basis for risk pricing, risk transfer and risk management at all levels of insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain for which a model has been developed. The model comprises of several, physically motivated components so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing

  13. Prediction of Banking Systemic Risk Based on Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Shouwei Li

    2013-01-01

    Full Text Available Banking systemic risk is a complex nonlinear phenomenon, and the recent financial crisis has shed light on the importance of safeguarding financial stability. Given the complex nonlinear characteristics of banking systemic risk, in this paper we apply the support vector machine (SVM) to the prediction of banking systemic risk in an attempt to suggest a new model with better explanatory power and stability. We conduct a case study of an SVM-based prediction model for Chinese banking systemic risk and find that the experimental results show that the support vector machine is an efficient method in this case.
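
    A minimal sketch of an SVM-based prediction of a binary systemic-stress flag, in the spirit of the model described above; the indicator names, synthetic data and the labelling rule are assumptions, not the paper's data set.

```python
# A hedged sketch of an SVM classifier for a binary systemic-stress flag built
# from a few banking indicators; the indicators, data and labelling rule are
# illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 300
X = np.column_stack([
    rng.normal(0.08, 0.03, n),   # non-performing loan ratio
    rng.normal(0.12, 0.04, n),   # capital adequacy ratio
    rng.normal(0.02, 0.01, n),   # interbank exposure growth
])
# synthetic "systemic stress" label from a noisy linear combination of the indicators
y = (X[:, 0] - X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 0.02, n) > 0).astype(int)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(round(float(cross_val_score(model, X, y, cv=5).mean()), 3))
```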

  14. Predictive Accuracy of a Cardiovascular Disease Risk Prediction Model in Rural South India – A Community Based Retrospective Cohort Study

    Directory of Open Access Journals (Sweden)

    Farah N Fathima

    2015-03-01

    Full Text Available Background: Identification of individuals at risk of developing cardiovascular diseases by risk stratification is the first step in primary prevention. Aims & Objectives: To assess the five year risk of developing a cardiovascular event from retrospective data and to assess the predictive accuracy of the non-laboratory-based National Health and Nutrition Examination Survey (NHANES) risk prediction model among individuals in a rural South Indian population. Materials & Methods: A community based retrospective cohort study was conducted in three villages where risk stratification was done for all eligible adults aged between 35-74 years at the time of initial assessment using the NHANES risk prediction charts. Household visits were made after a period of five years by trained doctors to determine cardiovascular outcomes. Results: 521 people fulfilled the eligibility criteria of whom 486 (93.3%) could be traced after five years. 56.8% were in low risk, 36.6% were in moderate risk and 6.6% were in high risk categories. 29 persons (5.97%) had had cardiovascular events over the last five years, of which 24 events (82.7%) were nonfatal and five (17.25%) were fatal. The mean age of the people who developed cardiovascular events was 57.24 ± 9.09 years. The odds ratios for the three levels of risk showed a linear trend, with the odds ratios for the moderate risk and high risk categories being 1.35 and 1.94 respectively, with the low risk category as baseline. Conclusion: The non-laboratory-based NHANES charts did not accurately predict the occurrence of cardiovascular events in any of the risk categories.

  15. NONLINEAR GARCH (NGARCH) MODEL FOR ESTIMATING THE VALUE AT RISK (VaR) OF THE IHSG

    Directory of Open Access Journals (Sweden)

    I KOMANG TRY BAYU MAHENDRA

    2015-06-01

    Full Text Available In investment, risk measurement is important. One risk measure is Value at Risk (VaR). There are many methods that can be used to estimate risk within the VaR framework, one of them being the nonlinear GARCH (NGARCH) model. In this research, VaR is determined using the NGARCH model. The NGARCH model allows for asymmetric behaviour in volatility between "good news" (positive returns) and "bad news" (negative returns). Based on the VaR calculations, the higher the confidence level and the longer the investment period, the greater the risk. The VaR determined using the NGARCH model was smaller than that obtained with the GARCH model.
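
    A minimal sketch of the NGARCH(1,1) volatility recursion and the resulting one-day VaR under a normal assumption; the parameter values and return series are illustrative assumptions, not estimates fitted to the IHSG.

```python
# A hedged sketch of an NGARCH(1,1) volatility recursion and a one-day VaR
# under a normal assumption; parameters and returns are illustrative only.
import numpy as np
from statistics import NormalDist

def ngarch_var(returns, omega=1e-6, alpha=0.08, beta=0.85, theta=0.7, confidence=0.99):
    var_t = float(np.var(returns))                 # initialise the conditional variance
    for r in returns:
        # asymmetric update: negative shocks raise volatility more than positive ones
        var_t = omega + alpha * (r - theta * np.sqrt(var_t)) ** 2 + beta * var_t
    z = NormalDist().inv_cdf(1.0 - confidence)     # about -2.33 at the 99% level
    return -z * float(np.sqrt(var_t))              # one-day VaR as a positive loss fraction

rng = np.random.default_rng(3)
daily_returns = rng.normal(0.0004, 0.012, 500)     # synthetic index returns
print(round(ngarch_var(daily_returns, confidence=0.99), 4))
```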

  16. Big data based fraud risk management at Alibaba

    Directory of Open Access Journals (Sweden)

    Jidong Chen

    2015-12-01

    Full Text Available With development of mobile internet and finance, fraud risk comes in all shapes and sizes. This paper is to introduce the Fraud Risk Management at Alibaba under big data. Alibaba has built a fraud risk monitoring and management system based on real-time big data processing and intelligent risk models. It captures fraud signals directly from huge amount data of user behaviors and network, analyzes them in real-time using machine learning, and accurately predicts the bad users and transactions. To extend the fraud risk prevention ability to external customers, Alibaba also built up a big data based fraud prevention product called AntBuckler. AntBuckler aims to identify and prevent all flavors of malicious behaviors with flexibility and intelligence for online merchants and banks. By combining large amount data of Alibaba and customers', AntBuckler uses the RAIN score engine to quantify risk levels of users or transactions for fraud prevention. It also has a user-friendly visualization UI with risk scores, top reasons and fraud connections.

  17. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in development of an integrative risk models toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), GCR event-based risk model code (GERMcode), and relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunity for hands on demonstrations. The brief descriptions of each tools are: ARRBOD for Organ dose projection and acute radiation risk calculation from exposure to solar particle event; NSCR for Projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation by using multi-type blood cell counts; GERMcode for basic physical and biophysical properties for an ion beam, and biophysical and radiobiological properties for a beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI for modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for the automated count; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  18. Choosing where to work at work - towards a theoretical model of benefits and risks of activity-based flexible offices.

    Science.gov (United States)

    Wohlers, Christina; Hertel, Guido

    2017-04-01

    Although there is a trend in today's organisations to implement activity-based flexible offices (A-FOs), only a few studies examine consequences of this new office type. Moreover, the underlying mechanisms why A-FOs might lead to different consequences as compared to cellular and open-plan offices are still unclear. This paper introduces a theoretical framework explaining benefits and risks of A-FOs based on theories from work and organisational psychology. After deriving working conditions specific for A-FOs (territoriality, autonomy, privacy, proximity and visibility), differences in working conditions between A-FOs and alternative office types are proposed. Further, we suggest how these differences in working conditions might affect work-related consequences such as well-being, satisfaction, motivation and performance on the individual, the team and the organisational level. Finally, we consider task-related (e.g. task variety), person-related (e.g. personality) and organisational (e.g. leadership) moderators. Based on this model, future research directions as well as practical implications are discussed. Practitioner Summary: Activity-based flexible offices (A-FOs) are popular in today's organisations. This article presents a theoretical model explaining why and when working in an A-FO evokes benefits and risks for individuals, teams and organisations. According to the model, A-FOs are beneficial when management encourages employees to use the environment appropriately and supports teams.

  19. Risk analysis for rumor propagation in metropolises based on improved 8-state ICSAR model and dynamic personal activity trajectories

    Science.gov (United States)

    Zhang, N.; Huang, H.; Duarte, M.; Zhang, J.

    2016-06-01

    Social media has developed extremely fast in metropolises in recent years resulting in more and more rumors disturbing our daily lives. Knowing the characteristics of rumor propagation in metropolises can help the government make efficient rumor refutation plans. In this paper, we established a dynamic spatio-temporal comprehensive risk assessment model for rumor propagation based on an improved 8-state ICSAR model (Ignorant, Information Carrier, Information Spreader, Advocate, Removal), large personal activity trajectory data, and governmental rumor refutation (anti-rumor) scenarios. Combining these relevant data with the 'big' traffic data on the use of subways, buses, and taxis, we simulated daily oral communications among inhabitants in Beijing. In order to analyze rumor and anti-rumor competition in the actual social network, personal resistance, personal preference, conformity, rumor intensity, government rumor refutation and other influencing factors were considered. Based on the developed risk assessment model, a long-term dynamic rumor propagation simulation for a seven day period was conducted and a comprehensive rumor propagation risk distribution map was obtained. A set of the sensitivity analyses were conducted for different social media and propagation routes. We assessed different anti-rumor coverage ratios and the rumor-spreading thresholds at which the government started to launch anti-rumor actions. The results we obtained provide worthwhile references useful for governmental decision making towards control of social-disrupting rumors.

  20. Demonstration of a modelling-based multi-criteria decision analysis procedure for prioritisation of occupational risks from manufactured nanomaterials.

    Science.gov (United States)

    Hristozov, Danail; Zabeo, Alex; Alstrup Jensen, Keld; Gottardo, Stefania; Isigonis, Panagiotis; Maccalman, Laura; Critto, Andrea; Marcomini, Antonio

    2016-11-01

    Several tools to facilitate the risk assessment and management of manufactured nanomaterials (MN) have been developed. Most of them require input data on physicochemical properties, toxicity and scenario-specific exposure information. However, such data are yet not readily available, and tools that can handle data gaps in a structured way to ensure transparent risk analysis for industrial and regulatory decision making are needed. This paper proposes such a quantitative risk prioritisation tool, based on a multi-criteria decision analysis algorithm, which combines advanced exposure and dose-response modelling to calculate margins of exposure (MoE) for a number of MN in order to rank their occupational risks. We demonstrated the tool in a number of workplace exposure scenarios (ES) involving the production and handling of nanoscale titanium dioxide, zinc oxide (ZnO), silver and multi-walled carbon nanotubes. The results of this application demonstrated that bag/bin filling, manual un/loading and dumping of large amounts of dry powders led to high emissions, which resulted in high risk associated with these ES. The ZnO MN revealed considerable hazard potential in vivo, which significantly influenced the risk prioritisation results. In order to study how variations in the input data affect our results, we performed probabilistic Monte Carlo sensitivity/uncertainty analysis, which demonstrated that the performance of the proposed model is stable against changes in the exposure and hazard input variables.

  1. Risk prediction models for selection of lung cancer screening candidates: A retrospective validation study.

    Directory of Open Access Journals (Sweden)

    Kevin Ten Haaf

    2017-04-01

    Full Text Available Selection of candidates for lung cancer screening based on individual risk has been proposed as an alternative to criteria based on age and cumulative smoking exposure (pack-years). Nine previously established risk models were assessed for their ability to identify those most likely to develop or die from lung cancer. All models considered age and various aspects of smoking exposure (smoking status, smoking duration, cigarettes per day, pack-years smoked, time since smoking cessation) as risk predictors. In addition, some models considered factors such as gender, race, ethnicity, education, body mass index, chronic obstructive pulmonary disease, emphysema, personal history of cancer, personal history of pneumonia, and family history of lung cancer. Retrospective analyses were performed on 53,452 National Lung Screening Trial (NLST) participants (1,925 lung cancer cases and 884 lung cancer deaths) and 80,672 Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO) ever-smoking participants (1,463 lung cancer cases and 915 lung cancer deaths). Six-year lung cancer incidence and mortality risk predictions were assessed for (1) calibration (graphically, by comparing the agreement between the predicted and the observed risks), (2) discrimination (area under the receiver operating characteristic curve [AUC]) between individuals with and without lung cancer (death), and (3) clinical usefulness (net benefit) in decision curve analysis, by identifying risk thresholds at which applying risk-based eligibility would improve lung cancer screening efficacy. To further assess performance, risk model sensitivities and specificities in the PLCO were compared to those based on the NLST eligibility criteria. Calibration was satisfactory, but discrimination ranged widely (AUCs from 0.61 to 0.81). The models outperformed the NLST eligibility criteria over a substantial range of risk thresholds in decision curve analysis, with a higher sensitivity for all models and a

  2. Distributionally Robust Return-Risk Optimization Models and Their Applications

    Directory of Open Access Journals (Sweden)

    Li Yang

    2014-01-01

    Full Text Available Based on the risk control of conditional value-at-risk, distributionally robust return-risk optimization models with box constraints on the random vector are proposed. They describe uncertainty in both the distribution form and the moments (mean and covariance matrix) of the random vector. It is difficult to solve them directly. Using conic duality theory and the minimax theorem, the models are reformulated as semidefinite programming problems, which can be solved by interior point algorithms in polynomial time. An important theoretical basis is therefore provided for applications of the models. Moreover, an application of the models to a practical example of portfolio selection is considered, and the example is evaluated using a historical data set of four stocks. Numerical results show that the proposed methods are robust and the investment strategy is safe.
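
    For reference, a minimal sketch of the sample-based conditional value-at-risk that the robust models above control in the worst case over a moment-based ambiguity set; the scenario returns and portfolio weights are illustrative assumptions, and this is not the paper's semidefinite reformulation.

```python
# A hedged sketch of sample-based VaR/CVaR for a fixed portfolio over simulated
# scenarios; scenario returns and weights are synthetic assumptions.
import numpy as np

def var_cvar(losses, alpha=0.95):
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)               # Value-at-Risk at level alpha
    return var, losses[losses >= var].mean()       # CVaR: mean loss beyond VaR

rng = np.random.default_rng(11)
scenario_returns = rng.multivariate_normal(
    mean=[0.05, 0.04, 0.06, 0.03], cov=0.02 * np.eye(4), size=5000
)
weights = np.array([0.3, 0.3, 0.2, 0.2])
losses = -(scenario_returns @ weights)             # loss = negative portfolio return
print(tuple(round(v, 4) for v in var_cvar(losses, alpha=0.95)))
```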

  3. A scenario-based procedure for seismic risk analysis

    International Nuclear Information System (INIS)

    Kluegel, J.-U.; Mualchin, L.; Panza, G.F.

    2006-12-01

    A new methodology for seismic risk analysis based on probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required for safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest as well as advanced physical modelling techniques to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA) as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA) which relies on the projections of Gutenberg-Richter (G-R) equation. The problems in the validity of G-R projections, because of incomplete to total absence of data for making the projections, are still unresolved. Consequently, the information from G-R must not be used in decisions for design of critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data and complemented by physical modelling techniques, which can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited amount of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)

  4. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  5. Flexible competing risks regression modeling and goodness-of-fit

    DEFF Research Database (Denmark)

    Scheike, Thomas; Zhang, Mei-Jie

    2008-01-01

    In this paper we consider different approaches for estimation and assessment of covariate effects for the cumulative incidence curve in the competing risks model. The classic approach is to model all cause-specific hazards and then estimate the cumulative incidence curve based on these cause...... models that is easy to fit and contains the Fine-Gray model as a special case. One advantage of this approach is that our regression modeling allows for non-proportional hazards. This leads to a new simple goodness-of-fit procedure for the proportional subdistribution hazards assumption that is very easy...... of the flexible regression models to analyze competing risks data when non-proportionality is present in the data....

  6. Predicting the Risk of Attrition for Undergraduate Students with Time Based Modelling

    Science.gov (United States)

    Chai, Kevin E. K.; Gibson, David

    2015-01-01

    Improving student retention is an important and challenging problem for universities. This paper reports on the development of a student attrition model for predicting which first year students are most at-risk of leaving at various points in time during their first semester of study. The objective of developing such a model is to assist…

  7. Model Risk in Portfolio Optimization

    Directory of Open Access Journals (Sweden)

    David Stefanovits

    2014-08-01

    Full Text Available We consider a one-period portfolio optimization problem under model uncertainty. For this purpose, we introduce a measure of model risk. We derive analytical results for this measure of model risk in the mean-variance problem assuming we have observations drawn from a normal variance mixture model. This model allows for heavy tails, tail dependence and leptokurtosis of marginals. The results show that mean-variance optimization is seriously compromised by model uncertainty, in particular, for non-Gaussian data and small sample sizes. To mitigate these shortcomings, we propose a method to adjust the sample covariance matrix in order to reduce model risk.
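
    One common, simple way to "adjust the sample covariance matrix" against estimation error, in the spirit of this record, is linear shrinkage towards a scaled identity target. The sketch below is a generic illustration with a fixed shrinkage intensity and synthetic heavy-tailed returns; it is not the adjustment procedure proposed in the cited paper.

```python
# Hedged sketch: shrinking the sample covariance matrix to mitigate model risk
# in mean-variance style optimization (fixed shrinkage intensity, synthetic data).
import numpy as np

def shrink_covariance(returns: np.ndarray, delta: float = 0.3) -> np.ndarray:
    sample = np.cov(returns, rowvar=False)
    target = np.eye(sample.shape[0]) * np.trace(sample) / sample.shape[0]
    return (1.0 - delta) * sample + delta * target

rng = np.random.default_rng(2)
returns = rng.standard_t(df=5, size=(60, 8)) * 0.01   # short, heavy-tailed return sample

sigma = shrink_covariance(returns)
weights = np.linalg.solve(sigma, np.ones(8))           # minimum-variance style weights
weights /= weights.sum()
print(np.round(weights, 3))
```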

  8. Criterion of Semi-Markov Dependent Risk Model

    Institute of Scientific and Technical Information of China (English)

    Xiao Yun MO; Xiang Qun YANG

    2014-01-01

    A rigorous definition of the semi-Markov dependent risk model is given. This model is a generalization of the Markov dependent risk model. A criterion and necessary conditions for the semi-Markov dependent risk model are obtained. The results clarify the relations between the elements of the semi-Markov dependent risk model and are also applicable to the Markov dependent risk model.

  9. Using Geographic Information System-based Ecologic Niche Models to Forecast the Risk of Hantavirus Infection in Shandong Province, China

    Science.gov (United States)

    Wei, Lan; Qian, Quan; Wang, Zhi-Qiang; Glass, Gregory E.; Song, Shao-Xia; Zhang, Wen-Yi; Li, Xiu-Jun; Yang, Hong; Wang, Xian-Jun; Fang, Li-Qun; Cao, Wu-Chun

    2011-01-01

    Hemorrhagic fever with renal syndrome (HFRS) is an important public health problem in Shandong Province, China. In this study, we combined ecologic niche modeling with geographic information systems (GIS) and remote sensing techniques to identify the risk factors and affected areas of hantavirus infections in rodent hosts. Land cover and elevation were found to be closely associated with the presence of hantavirus-infected rodent hosts. The averaged area under the receiver operating characteristic curve was 0.864, implying good performance. The predicted risk maps based on the model were validated, with a good fit, against both the distribution of hantavirus-infected rodents and the localities of human HFRS cases. These findings have applications for targeting control and prevention efforts. PMID:21363991

  10. Risk assessment and food allergy: the probabilistic model applied to allergens

    NARCIS (Netherlands)

    Spanjersberg, M.Q.I.; Kruizinga, A.G.; Rennen, M.A.J.; Houben, G.F.

    2007-01-01

    In order to assess the risk of unintended exposure to food allergens, traditional deterministic risk assessment is usually applied, leading to inconsequential conclusions as 'an allergic reaction cannot be excluded'. TNO therefore developed a quantitative risk assessment model for allergens based on

  11. Aquaplaning : Development of a Risk Pond Model from Road Surface Measurements

    OpenAIRE

    Nygårdhs, Sara

    2003-01-01

    Aquaplaning accidents are relatively rare, but could have fatal effects. The task of this master’s thesis is to use data from the Laser Road Surface Tester to detect road sections with risk of aquaplaning. A three-dimensional model based on data from road surface measurements is created using MATLAB (version 6.1). From this general geometrical model of the road, a pond model is produced from which the theoretical risk ponds are detected. A risk pond indication table is further created. The...

  12. Probabilistic Modeling of Seismic Risk Based Design for a Dual System Structure

    OpenAIRE

    Sidi, Indra Djati

    2017-01-01

    The dual system structure concept has gained popularity in the construction of high-rise buildings over the last decades. Meanwhile, earthquake engineering design provisions for buildings have moved from the uniform hazard concept to the uniform risk concept upon recognizing the uncertainties involved in the earthquake resistance of concrete structures. In this study, a probabilistic model for the evaluation of such risk is proposed for a dual system structure consisting of shear walls or cor...

  13. Development and Sensitivity Analysis of a Frost Risk model based primarily on freely distributed Earth Observation data

    Science.gov (United States)

    Louka, Panagiota; Petropoulos, George; Papanikolaou, Ioannis

    2015-04-01

    The ability to map the spatiotemporal distribution of extreme climatic conditions, such as frost, is a significant tool in successful agricultural management and decision making. Nowadays, with the development of Earth Observation (EO) technology, it is possible to obtain accurate, timely and cost-effective information on the spatiotemporal distribution of frost conditions, particularly over large and otherwise inaccessible areas. The present study aimed at developing and evaluating a frost risk prediction model, exploiting primarily EO data from MODIS and ASTER sensors and ancillary ground observation data. For the evaluation of our model, a region in north-western Greece was selected as the test site and a detailed sensitivity analysis was implemented. The agreement between the model predictions and the observed (remotely sensed) frost frequency obtained by the MODIS sensor was evaluated thoroughly. Also, detailed comparisons of the model predictions were performed against reference frost ground observations acquired from the Greek Agricultural Insurance Organization (ELGA) over a period of 10 years (2000-2010). Overall, the results evidenced the ability of the model to reproduce the frost conditions reasonably well, following largely explainable patterns with respect to the characteristics of the study site and the local weather conditions. Implementation of our proposed frost risk model is based primarily on satellite imagery analysis provided nowadays globally at no cost. It is also straightforward and computationally inexpensive, requiring much less effort than, for example, field surveying. Finally, the method is adjustable and can potentially be integrated with other high resolution data available from both commercial and non-commercial vendors. Keywords: Sensitivity analysis, frost risk mapping, GIS, remote sensing, MODIS, Greece

  14. Strengthening air traffic safety management by moving from outcome-based towards risk-based evaluation of runway incursions

    International Nuclear Information System (INIS)

    Stroeve, Sybert H.; Som, Pradip; Doorn, Bas A. van; Bakker, G.J.

    2016-01-01

    Current safety management of aerodrome operations uses judgements of severity categories to evaluate runway incursions. Incident data show a small minority of severe incursions and a large majority of less severe incursions. We show that these severity judgements are mainly based upon the outcomes of runway incursions, in particular on the closest distances attained. As such, the severity-based evaluation leads to coincidental safety management feedback, wherein causes and risk implications of runway incursions are not well considered. In this paper we present a new framework for the evaluation of runway incursions, which effectively uses all runway incursions, which judges same types of causes similarly, and which structures causes and risk implications. The framework is based on risks of scenarios associated with the initiation of runway incursions. As a basis an inventory of scenarios is provided, which can represent almost all runway incursions involving a conflict with an aircraft. A main step in the framework is the assessment of the conditional probability of a collision given a runway incursion scenario. This can be effectively achieved for large sets of scenarios by agent-based dynamic risk modelling. The results provide detailed feedback on risks of runway incursion scenarios, thus enabling effective safety management. - Highlights: • Current evaluation of runway incursions is primarily based on their outcomes. • A new framework assesses collision risk given initiation of runway incursions. • Agent-based dynamic risk modelling can evaluate the risks of many scenarios. • A developed scenario inventory can represent almost all runway incursions. • The framework provides detailed feedback to safety management.

  15. Stochastic multi-objective model for optimal energy exchange optimization of networked microgrids with presence of renewable generation under risk-based strategies.

    Science.gov (United States)

    Gazijahani, Farhad Samadi; Ravadanegh, Sajad Najafi; Salehi, Javad

    2018-02-01

    The inherent volatility and unpredictable nature of renewable generation and load demand pose considerable challenges for energy exchange optimization of microgrids (MG). To address these challenges, this paper proposes a new risk-based multi-objective energy exchange optimization for networked MGs from economic and reliability standpoints under load consumption and renewable power generation uncertainties. In so doing, three different risk-based strategies are distinguished using the conditional value at risk (CVaR) approach. The proposed model is formulated with two distinct objective functions. The first function minimizes the operation and maintenance costs, the cost of power transactions between the upstream network and MGs, as well as the power loss cost, whereas the second function minimizes the energy not supplied (ENS) value. Furthermore, a stochastic scenario-based approach is incorporated into the model in order to handle the uncertainty. Also, the Kantorovich distance scenario reduction method has been implemented to reduce the computational burden. Finally, the non-dominated sorting genetic algorithm (NSGA-II) is applied to minimize the objective functions simultaneously, and the best solution is extracted by a fuzzy satisfying method with respect to the risk-based strategies. To demonstrate the performance of the proposed model, it is applied to the modified IEEE 33-bus distribution system, and the obtained results show that the presented approach can be considered an efficient tool for optimal energy exchange among MGs. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  16. Risk-based configuration control

    International Nuclear Information System (INIS)

    Szikszai, T.

    1997-01-01

    The presentation discusses the following issues: The Configuration Control; The Risk-based Configuration Control (during power operation mode, and during shutdown mode). PSA requirements. Use of Risk-based Configuration Control System. Configuration Management (basic elements, benefits, information requirements)

  17. Development and validation of risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team.

    Science.gov (United States)

    Harrison, David A; Patel, Krishna; Nixon, Edel; Soar, Jasmeet; Smith, Gary B; Gwinnutt, Carl; Nolan, Jerry P; Rowan, Kathryn M

    2014-08-01

    The National Cardiac Arrest Audit (NCAA) is the UK national clinical audit for in-hospital cardiac arrest. To make fair comparisons among health care providers, clinical indicators require case mix adjustment using a validated risk model. The aim of this study was to develop and validate risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team in UK hospitals. Risk models for two outcomes-return of spontaneous circulation (ROSC) for greater than 20min and survival to hospital discharge-were developed and validated using data for in-hospital cardiac arrests between April 2011 and March 2013. For each outcome, a full model was fitted and then simplified by testing for non-linearity, combining categories and stepwise reduction. Finally, interactions between predictors were considered. Models were assessed for discrimination, calibration and accuracy. 22,479 in-hospital cardiac arrests in 143 hospitals were included (14,688 development, 7791 validation). The final risk model for ROSC>20min included: age (non-linear), sex, prior length of stay in hospital, reason for attendance, location of arrest, presenting rhythm, and interactions between presenting rhythm and location of arrest. The model for hospital survival included the same predictors, excluding sex. Both models had acceptable performance across the range of measures, although discrimination for hospital mortality exceeded that for ROSC>20min (c index 0.81 versus 0.72). Validated risk models for ROSC>20min and hospital survival following in-hospital cardiac arrest have been developed. These models will strengthen comparative reporting in NCAA and support local quality improvement. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  18. Development and validation of risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team☆

    Science.gov (United States)

    Harrison, David A.; Patel, Krishna; Nixon, Edel; Soar, Jasmeet; Smith, Gary B.; Gwinnutt, Carl; Nolan, Jerry P.; Rowan, Kathryn M.

    2014-01-01

    Aim The National Cardiac Arrest Audit (NCAA) is the UK national clinical audit for in-hospital cardiac arrest. To make fair comparisons among health care providers, clinical indicators require case mix adjustment using a validated risk model. The aim of this study was to develop and validate risk models to predict outcomes following in-hospital cardiac arrest attended by a hospital-based resuscitation team in UK hospitals. Methods Risk models for two outcomes—return of spontaneous circulation (ROSC) for greater than 20 min and survival to hospital discharge—were developed and validated using data for in-hospital cardiac arrests between April 2011 and March 2013. For each outcome, a full model was fitted and then simplified by testing for non-linearity, combining categories and stepwise reduction. Finally, interactions between predictors were considered. Models were assessed for discrimination, calibration and accuracy. Results 22,479 in-hospital cardiac arrests in 143 hospitals were included (14,688 development, 7791 validation). The final risk model for ROSC > 20 min included: age (non-linear), sex, prior length of stay in hospital, reason for attendance, location of arrest, presenting rhythm, and interactions between presenting rhythm and location of arrest. The model for hospital survival included the same predictors, excluding sex. Both models had acceptable performance across the range of measures, although discrimination for hospital mortality exceeded that for ROSC > 20 min (c index 0.81 versus 0.72). Conclusions Validated risk models for ROSC > 20 min and hospital survival following in-hospital cardiac arrest have been developed. These models will strengthen comparative reporting in NCAA and support local quality improvement. PMID:24830872

  19. A Vulnerability-Based, Bottom-up Assessment of Future Riverine Flood Risk Using a Modified Peaks-Over-Threshold Approach and a Physically Based Hydrologic Model

    Science.gov (United States)

    Knighton, James; Steinschneider, Scott; Walter, M. Todd

    2017-12-01

    There is a chronic disconnection among purely probabilistic flood frequency analysis of flood hazards, flood risks, and hydrological flood mechanisms, which hamper our ability to assess future flood impacts. We present a vulnerability-based approach to estimating riverine flood risk that accommodates a more direct linkage between decision-relevant metrics of risk and the dominant mechanisms that cause riverine flooding. We adapt the conventional peaks-over-threshold (POT) framework to be used with extreme precipitation from different climate processes and rainfall-runoff-based model output. We quantify the probability that at least one adverse hydrologic threshold, potentially defined by stakeholders, will be exceeded within the next N years. This approach allows us to consider flood risk as the summation of risk from separate atmospheric mechanisms, and supports a more direct mapping between hazards and societal outcomes. We perform this analysis within a bottom-up framework to consider the relevance and consequences of information, with varying levels of credibility, on changes to atmospheric patterns driving extreme precipitation events. We demonstrate our proposed approach using a case study for Fall Creek in Ithaca, NY, USA, where we estimate the risk of stakeholder-defined flood metrics from three dominant mechanisms: summer convection, tropical cyclones, and spring rain and snowmelt. Using downscaled climate projections, we determine how flood risk associated with a subset of mechanisms may change in the future, and the resultant shift to annual flood risk. The flood risk approach we propose can provide powerful new insights into future flood threats.
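
    The headline metric in this record, the probability that at least one adverse hydrologic threshold is exceeded within the next N years, has a compact form if threshold exceedances from each mechanism are treated as independent Poisson arrivals: P = 1 - exp(-λN), and mechanism rates add. The sketch below uses made-up exceedance rates purely to illustrate the arithmetic; it is not the Fall Creek analysis.

```python
# Illustrative "at least one exceedance in N years" calculation for a peaks-over-threshold
# analysis split by flood-generating mechanism (hypothetical rates, Poisson assumption).
import math

rates_per_year = {
    "summer convection": 0.020,
    "tropical cyclones": 0.010,
    "rain and snowmelt": 0.015,
}

def prob_at_least_one(rate: float, years: int) -> float:
    return 1.0 - math.exp(-rate * years)

N = 30
for mechanism, lam in rates_per_year.items():
    print(f"{mechanism:18s}: {prob_at_least_one(lam, N):.1%} over {N} years")

total_rate = sum(rates_per_year.values())
print(f"{'all mechanisms':18s}: {prob_at_least_one(total_rate, N):.1%} over {N} years")
```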

  20. Risk-based cost-benefit analysis for evaluating microbial risk mitigation in a drinking water system.

    Science.gov (United States)

    Bergion, Viktor; Lindhe, Andreas; Sokolova, Ekaterina; Rosén, Lars

    2018-04-01

    Waterborne outbreaks of gastrointestinal diseases can cause large costs to society. Risk management needs to be holistic and transparent in order to reduce these risks in an effective manner. Microbial risk mitigation measures in a drinking water system were investigated using a novel approach combining probabilistic risk assessment and cost-benefit analysis. Lake Vomb in Sweden was used to exemplify and illustrate the risk-based decision model. Four mitigation alternatives were compared, where the first three alternatives, A1-A3, represented connecting 25, 50 and 75%, respectively, of on-site wastewater treatment systems in the catchment to the municipal wastewater treatment plant. The fourth alternative, A4, represented installing a UV-disinfection unit in the drinking water treatment plant. Quantitative microbial risk assessment was used to estimate the positive health effects in terms of quality adjusted life years (QALYs), resulting from the four mitigation alternatives. The health benefits were monetised using a unit cost per QALY. For each mitigation alternative, the net present value of health and environmental benefits and investment, maintenance and running costs was calculated. The results showed that only A4 can reduce the risk (probability of infection) below the World Health Organization guidelines of 10⁻⁴ infections per person per year (looking at the 95th percentile). Furthermore, all alternatives resulted in a negative net present value. However, the net present value would be positive (looking at the 50th percentile using a 1% discount rate) if non-monetised benefits (e.g. increased property value divided evenly over the studied time horizon and reduced microbial risks posed to animals), estimated at 800-1200 SEK (€100-150) per connected on-site wastewater treatment system per year, were included. This risk-based decision model creates a robust and transparent decision support tool. It is flexible enough to be tailored and applied to local
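
    The comparison against the WHO guideline of 10⁻⁴ infections per person per year rests on rolling a per-day infection probability up to an annual figure. The sketch below shows that arithmetic with an exponential dose-response model; the daily dose and dose-response parameter are invented for illustration and are not values from the Lake Vomb study.

```python
# Hedged sketch: annual infection risk from a daily dose via an exponential
# dose-response model, compared with the WHO 1e-4 per-person-per-year guideline.
import math

daily_dose = 0.002   # hypothetical ingested pathogens per person per day after treatment
r = 0.02             # illustrative exponential dose-response parameter

p_daily = 1.0 - math.exp(-r * daily_dose)
p_annual = 1.0 - (1.0 - p_daily) ** 365

print(f"daily infection probability : {p_daily:.2e}")
print(f"annual infection probability: {p_annual:.2e}")
print("meets the 1e-4 guideline" if p_annual <= 1e-4 else "exceeds the 1e-4 guideline")
```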

  1. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin; Ulrich, Thomas; Groth, Katrina; Smith, Curtis

    2016-01-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as framework that ties together different HRA methods to model dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals: • Integration with a high fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients • Consideration of a PRA context • Incorporation of a solid psychological basis for operator performance • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  2. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rasmussen, Martin [Norwegian Univ. of Science and Technology, Trondheim (Norway). Social Research; Herberger, Sarah [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ulrich, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as framework that ties together different HRA methods to model dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals: • Integration with a high fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients • Consideration of a PRA context • Incorporation of a solid psychological basis for operator performance • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  3. Probabilistic risk assessment of gold nanoparticles after intravenous administration by integrating in vitro and in vivo toxicity with physiologically based pharmacokinetic modeling.

    Science.gov (United States)

    Cheng, Yi-Hsien; Riviere, Jim E; Monteiro-Riviere, Nancy A; Lin, Zhoumeng

    2018-04-14

    This study aimed to conduct an integrated and probabilistic risk assessment of gold nanoparticles (AuNPs) based on recently published in vitro and in vivo toxicity studies coupled to a physiologically based pharmacokinetic (PBPK) model. Dose-response relationships were characterized based on cell viability assays in various human cell types. A previously well-validated human PBPK model for AuNPs was applied to quantify internal concentrations in liver, kidney, skin, and venous plasma. By applying a Bayesian-based probabilistic risk assessment approach incorporating Monte Carlo simulation, probable human cell death fractions were characterized. Additionally, we implemented in vitro to in vivo and animal-to-human extrapolation approaches to independently estimate external exposure levels of AuNPs that cause minimal toxicity. Our results suggest that under the highest dosing level employed in existing animal studies (worst-case scenario), AuNPs coated with branched polyethylenimine (BPEI) would likely induce ∼90-100% cellular death, implying high cytotoxicity compared to risk prediction, and point of departure estimation of AuNP exposure for humans and illustrate an approach that could be applied to other NPs when sufficient data are available.

  4. [A model list of high risk drugs].

    Science.gov (United States)

    Cotrina Luque, J; Guerrero Aznar, M D; Alvarez del Vayo Benito, C; Jimenez Mesa, E; Guzman Laura, K P; Fernández Fernández, L

    2013-12-01

    «High-risk drugs» are those that have a very high «risk» of causing death or serious injury if an error occurs during their use. The Institute for Safe Medication Practices (ISMP) has prepared a high-risk drugs list applicable to the general population (with no differences between the pediatric and adult population). Thus, there is a lack of information for the pediatric population. The main objective of this work is to develop a high-risk drug list adapted to the neonatal or pediatric population as a reference model for the pediatric hospital health workforce. We conducted a literature search in May 2012 to identify any published lists or references in relation to pediatric and/or neonatal high-risk drugs. A total of 15 studies were found, from which 9 were selected. A model list was developed mainly based on the ISMP one, adding strongly perceived pediatric risk drugs and removing those where the pediatric use was anecdotal. There is no published list that suits pediatric risk management. The list of pediatric and neonatal high-risk drugs presented here could be a «reference list of high-risk drugs» for pediatric hospitals. Using this list and training will help to prevent medication errors in each drug supply chain (prescribing, transcribing, dispensing and administration). Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier Espana. All rights reserved.

  5. Health risks maps. Modelling of air quality as a tool to map health risks

    International Nuclear Information System (INIS)

    Van Doorn, R.; Hegger, C.

    2000-01-01

    Environmental departments consider geographical maps with information on air quality as the final product of a complicated process of measuring, modelling and presentation. Municipal health departments consider such maps a useful starting point to solve the problem whether air pollution causes health risks for citizens. The answer to this question cannot be reduced to checking if threshold limit values are exceeded. Based on the results of measurements and modelling of concentrations of nitrogen dioxide in air, the health significance of air pollution caused by nitrogen dioxide is illuminated. A proposal is presented to map health risks of air pollution by using the results of measurements and modelling of air pollution. 7 refs

  6. A Probabilistic Asteroid Impact Risk Model

    Science.gov (United States)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
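
    The Monte Carlo structure described here, sampling uncertain impactor properties and aggregating the consequences, can be sketched in a few lines. The distributions, the energy conversion and the damage threshold below are placeholders chosen only to show the sampling-and-aggregation pattern; they are not the PAIR model's inputs.

```python
# Illustrative Monte Carlo sampling of asteroid impact energies (placeholder distributions).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

diameter_m = rng.lognormal(mean=np.log(30.0), sigma=0.6, size=n)   # uncertain size
density = rng.uniform(1500.0, 3500.0, size=n)                      # kg/m^3
velocity = rng.normal(20_000.0, 3_000.0, size=n)                   # entry speed, m/s

mass = density * (np.pi / 6.0) * diameter_m ** 3                   # spherical body assumption
energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15                  # joules -> megatons TNT

damage_threshold_mt = 5.0                                          # hypothetical ground-damage threshold
print(f"P(impact energy > {damage_threshold_mt} Mt) = {np.mean(energy_mt > damage_threshold_mt):.3f}")
```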

  7. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    Science.gov (United States)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2017-12-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from the HG. 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories with a high risk to present geomorphologic processes that have a high degree of occurrence and represent a useful tool in the process of spatial planning.

  8. Quantitative evaluation of the risk induced by dominant geomorphological processes on different land uses, based on GIS spatial analysis models

    Science.gov (United States)

    Ştefan, Bilaşco; Sanda, Roşca; Ioan, Fodorean; Iuliu, Vescan; Sorin, Filip; Dănuţ, Petrea

    2018-06-01

    Maramureş Land is mostly characterized by agricultural and forestry land use due to its specific configuration of topography and its specific pedoclimatic conditions. Taking into consideration the trend of the last century from the perspective of land management, a decrease in the surface of agricultural lands to the advantage of built-up and grass lands, as well as an accelerated decrease in the forest cover due to uncontrolled and irrational forest exploitation, has become obvious. The field analysis performed on the territory of Maramureş Land has highlighted a high frequency of two geomorphologic processes — landslides and soil erosion — which have a major negative impact on land use due to their rate of occurrence. The main aim of the present study is the GIS modeling of the two geomorphologic processes, determining a state of vulnerability (the USLE model for soil erosion and a quantitative model based on the morphometric characteristics of the territory, derived from the HG. 447/2003) and their integration in a complex model of cumulated vulnerability identification. The modeling of the risk exposure was performed using a quantitative approach based on models and equations of spatial analysis, which were developed with modeled raster data structures and primary vector data, through a matrix highlighting the correspondence between vulnerability and land use classes. The quantitative analysis of the risk was performed by taking into consideration the exposure classes as modeled databases and the land price as a primary alphanumeric database using spatial analysis techniques for each class by means of the attribute table. The spatial results highlight the territories with a high risk to present geomorphologic processes that have a high degree of occurrence and represent a useful tool in the process of spatial planning.

  9. Modeling and managing risk early in software development

    Science.gov (United States)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  10. Risk Factors Analysis and Death Prediction in Some Life-Threatening Ailments Using Chi-Square Case-Based Reasoning (χ2 CBR) Model.

    Science.gov (United States)

    Adeniyi, D A; Wei, Z; Yang, Y

    2018-01-30

    A wealth of data are available within the health care system; however, effective analysis tools for exploring the hidden patterns in these datasets are lacking. To alleviate this limitation, this paper proposes a simple but promising hybrid predictive model by suitably combining the Chi-square distance measurement with the case-based reasoning technique. The study presents the realization of an automated risk calculator and death prediction in some life-threatening ailments using a Chi-square case-based reasoning (χ²CBR) model. The proposed predictive engine is capable of reducing runtime and speeding up the execution process through the use of a critical χ² distribution value. This work also showcases the development of a novel feature selection method referred to as the frequent item based rule (FIBR) method. This FIBR method is used for selecting the best feature for the proposed χ²CBR model at the preprocessing stage of the predictive procedures. The implementation of the proposed risk calculator is achieved through the use of an in-house developed PHP program experimented with the XAMP/Apache HTTP server as the hosting server. The process of data acquisition and case-based development is implemented using the MySQL application. Performance comparison between our system, the NBY, the ED-KNN, the ANN, the SVM, the Random Forest and the traditional CBR techniques shows that the quality of predictions produced by our system outperformed the baseline methods studied. The result of our experiment shows that the precision rate and predictive quality of our system in most cases are equal to or greater than 70%. Our result also shows that the proposed system executes faster than the baseline methods studied. Therefore, the proposed risk calculator is capable of providing useful, consistent, faster, accurate and efficient risk level prediction to both the patients and the physicians at any time, online and on a real-time basis.
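
    Retrieval in a chi-square distance based CBR engine reduces to comparing a new case vector against stored cases and keeping the closest one. The toy sketch below shows that step only, with synthetic normalized feature vectors; it does not reproduce the FIBR feature selection or the PHP/MySQL implementation described in the record.

```python
# Toy sketch of case retrieval with a chi-square distance (synthetic case base).
import numpy as np

def chi_square_distance(x: np.ndarray, y: np.ndarray, eps: float = 1e-9) -> float:
    return 0.5 * float(np.sum((x - y) ** 2 / (x + y + eps)))

case_base = {
    "low_risk_case":  np.array([0.2, 0.1, 0.3, 0.1]),
    "med_risk_case":  np.array([0.4, 0.3, 0.2, 0.4]),
    "high_risk_case": np.array([0.8, 0.7, 0.6, 0.9]),
}

new_case = np.array([0.7, 0.6, 0.5, 0.8])
nearest = min(case_base, key=lambda k: chi_square_distance(case_base[k], new_case))
print("retrieved case:", nearest)
```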

  11. Risk-based audit selection of dairy farms.

    Science.gov (United States)

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select farms for an audit that present higher risks, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all conducted laboratory tests of bulk milk samples 12 mo before the audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, on the basis of the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., audit outcome of rejected), respectively, only 8, 20, or 47% of the population had to be sampled based on a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  12. Risk Estimation for Lung Cancer in Libya: Analysis Based on Standardized Morbidity Ratio, Poisson-Gamma Model, BYM Model and Mixture Model

    Science.gov (United States)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-03-01

    Cancer is the most rapidly spreading disease in the world, especially in developing countries, including Libya. Cancer represents a significant burden on patients, families, and their societies. This disease can be controlled if detected early. Therefore, disease mapping has recently become an important method in the fields of public health research and disease epidemiology. The correct choice of statistical model is a very important step towards producing a good map of a disease. Libya was selected for this work, to examine its geographical variation in the incidence of lung cancer. The objective of this paper is to estimate the relative risk for lung cancer. Four statistical models to estimate the relative risk for lung cancer, together with population censuses of the study area for the time period 2006 to 2011, were used in this work. They are the Standardized Morbidity Ratio (SMR), which is the most popular statistic used in the field of disease mapping; the Poisson-gamma model, which is one of the earliest applications of Bayesian methodology; the Besag, York and Mollie (BYM) model; and the Mixture model. As an initial step, this study begins by providing a review of all proposed models, which we then apply to lung cancer data in Libya. Maps, tables, graphs and goodness-of-fit (GOF) criteria were used to compare and present the preliminary results; GOF criteria are commonly used in statistical modelling to compare fitted models. The main general results presented in this study show that the Poisson-gamma model, the BYM model, and the Mixture model can overcome the problem of the first model (SMR) when there is no observed lung cancer case in certain districts. Results show that the Mixture model is the most robust and provides better relative risk estimates across a range of models.
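
    The contrast drawn in this record between the raw SMR and the Poisson-gamma model is easy to see numerically: where a district records zero cases the SMR collapses to zero, while the Poisson-gamma posterior mean is pulled towards the prior. The counts and prior parameters below are synthetic, chosen only to illustrate that behaviour.

```python
# Sketch: SMR versus Poisson-gamma smoothed relative risk (synthetic district counts).
observed = [0, 3, 12, 7]          # observed cases per district (illustrative)
expected = [1.2, 2.8, 9.5, 6.1]   # expected cases from standardised rates (illustrative)

a, b = 1.0, 1.0                   # gamma(a, b) prior on the relative risk (assumed)

for o, e in zip(observed, expected):
    smr = o / e
    rr = (o + a) / (e + b)        # posterior mean relative risk under the Poisson-gamma model
    print(f"O={o:2d}  E={e:4.1f}  SMR={smr:4.2f}  Poisson-gamma RR={rr:4.2f}")
```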

  13. Modeling Crossing Behavior of Drivers at Unsignalized Intersections with Consideration of Risk Perception

    Directory of Open Access Journals (Sweden)

    Liu Miaomiao

    2016-01-01

    Full Text Available Drivers’ risk perception is vital to driving behavior and traffic safety. In the dynamic interaction of a driver-vehicle-environment system, drivers’ risk perception changes dynamically. This study focused on drivers’ risk perception at unsignalized intersections in China and analyzed drivers’ crossing behavior. Based on cognitive psychology theory and an adaptive neuro-fuzzy inference system, quantitative models of drivers’ risk perception were established for the crossing processes between two straight-moving vehicles from the orthogonal direction. The acceptable risk perception levels of drivers were identified using a self-developed data analysis method. Based on game theory, the relationship among the quantitative value of drivers’ risk perception, acceptable risk perception level, and vehicle motion state was analyzed. The models of drivers’ crossing behavior were then established. Finally, the behavior models were validated using data collected from real-world vehicle movements and driver decisions. The results showed that the developed behavior models had both high accuracy and good applicability. This study would provide theoretical and algorithmic references for the microscopic simulation and active safety control system of vehicles.

  14. A discriminant analysis prediction model of non-syndromic cleft lip with or without cleft palate based on risk factors.

    Science.gov (United States)

    Li, Huixia; Luo, Miyang; Luo, Jiayou; Zheng, Jianfei; Zeng, Rong; Du, Qiyun; Fang, Junqun; Ouyang, Na

    2016-11-23

    A risk prediction model of non-syndromic cleft lip with or without cleft palate (NSCL/P) was established by a discriminant analysis to predict the individual risk of NSCL/P in pregnant women. A hospital-based case-control study was conducted with 113 cases of NSCL/P and 226 controls without NSCL/P. The cases and the controls were obtained from 52 birth defects' surveillance hospitals in Hunan Province, China. A questionnaire was administered in person to collect the variables relevant to NSCL/P by face-to-face interviews. Logistic regression models were used to analyze the influencing factors of NSCL/P, and a stepwise Fisher discriminant analysis was subsequently used to construct the prediction model. In the univariate analysis, 13 influencing factors were related to NSCL/P, of which the following 8 factors were retained as predictors in the discriminant prediction model: family income, maternal occupational hazards exposure, premarital medical examination, housing renovation, milk/soymilk intake in the first trimester of pregnancy, paternal occupational hazards exposure, paternal strong tea drinking, and family history of NSCL/P. The model was statistically significant (lambda = 0.772, chi-square = 86.044, df = 8, P < 0.001). Self-verification showed that 83.8% of the participants were correctly predicted to be NSCL/P cases or controls, with a sensitivity of 74.3% and a specificity of 88.5%. The area under the receiver operating characteristic curve (AUC) was 0.846. The prediction model that was established using the risk factors of NSCL/P can be useful for predicting the risk of NSCL/P. Further research is needed to improve the model and to confirm its validity and reliability.

  15. Research on Knowledge-Oriented Supply ChainRisk Management System Model

    OpenAIRE

    Yingchun Guo

    2011-01-01

    Based on an analysis of the characteristics of supply chain risk management under the influence of knowledge, this paper integrates basic theories and methods of knowledge management into the process of risk management, builds a knowledge-oriented supply chain risk management system model, and proposes relevant strategies, providing a reference for the practical application of knowledge-oriented supply chain risk management. By means of acquiring, storing, sharing, and transferring supply chain ri...

  16. Option-Based Estimation of the Price of Co-Skewness and Co-Kurtosis Risk

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Fournier, Mathieu; Jacobs, Kris

    We show that the prices of risk for factors that are nonlinear in the market return are readily obtained using index option prices. We apply this insight to the price of co-skewness and co-kurtosis risk. The price of co-skewness risk corresponds to the spread between the physical and the risk-neutral second moments, and the price of co-kurtosis risk corresponds to the spread between the physical and the risk-neutral third moments. The option-based estimates of the prices of risk lead to reasonable values of the associated risk premia. An out-of-sample analysis of factor models with co-skewness and co-kurtosis risk indicates that the new estimates of the price of risk improve the models' performance. Models with higher-order market moments also robustly outperform standard competitors such as the CAPM and the Fama-French model.

  17. Option-Based Estimation of the Price of Co-Skewness and Co-Kurtosis Risk

    DEFF Research Database (Denmark)

    Christoffersen, Peter; Fournier, Mathieu; Jacobs, Kris

    We show that the prices of risk for factors that are nonlinear in the market return are readily obtained using index option prices. We apply this insight to the price of co-skewness and co-kurtosis risk. The price of co-skewness risk corresponds to the spread between the physical and the risk-neutral second moments, and the price of co-kurtosis risk corresponds to the spread between the physical and the risk-neutral third moments. The option-based estimates of the prices of risk lead to reasonable values of the associated risk premia. An out-of-sample analysis of factor models with co-skewness and co-kurtosis risk indicates that the new estimates of the price of risk improve the models' performance. Models with higher-order market moments also robustly outperform standard competitors such as the CAPM and the Fama-French model.

  18. Regulatory activity based risk model identifies survival of stage II and III colorectal carcinoma.

    Science.gov (United States)

    Liu, Gang; Dong, Chuanpeng; Wang, Xing; Hou, Guojun; Zheng, Yu; Xu, Huilin; Zhan, Xiaohui; Liu, Lei

    2017-11-17

    Clinical and pathological indicators are inadequate for prognosis of stage II and III colorectal carcinoma (CRC). In this study, we utilized the activity of regulatory factors, univariate Cox regression and random forest for variable selection and developed a multivariate Cox model to predict the overall survival of Stage II/III colorectal carcinoma in GSE39582 datasets (469 samples). Patients in low-risk group showed a significant longer overall survival and recurrence-free survival time than those in high-risk group. This finding was further validated in five other independent datasets (GSE14333, GSE17536, GSE17537, GSE33113, and GSE37892). Besides, associations between clinicopathological information and risk score were analyzed. A nomogram including risk score was plotted to facilitate the utilization of risk score. The risk score model is also demonstrated to be effective on predicting both overall and recurrence-free survival of chemotherapy received patients. After performing Gene Set Enrichment Analysis (GSEA) between high and low risk groups, we found that several cell-cell interaction KEGG pathways were identified. Funnel plot results showed that there was no publication bias in these datasets. In summary, by utilizing the regulatory activity in stage II and III colorectal carcinoma, the risk score successfully predicts the survival of 1021 stage II/III CRC patients in six independent datasets.

  19. Development of Health Parameter Model for Risk Prediction of CVD Using SVM

    Directory of Open Access Journals (Sweden)

    P. Unnikrishnan

    2016-01-01

    Full Text Available Current methods of cardiovascular risk assessment are performed using health factors which are often based on the Framingham study. However, these methods have significant limitations due to their poor sensitivity and specificity. We have compared the parameters from the Framingham equation with linear regression analysis to establish the effect of training the model on the local database. A support vector machine was used to determine the effectiveness of a machine learning approach with the Framingham health parameters for risk assessment of cardiovascular disease (CVD). The result shows that while the linear model trained on the local database was an improvement on the Framingham model, the SVM-based risk assessment model had high sensitivity and specificity in predicting CVD. This indicates that, using the health parameters identified in the Framingham study, the machine learning approach overcomes the low sensitivity and specificity of the Framingham model.
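
    A generic version of the machine-learning step described here, an SVM classifier trained on Framingham-style health parameters and scored by sensitivity and specificity, can be sketched with scikit-learn. The synthetic data generator, the feature set and all coefficients below are assumptions for illustration; they are not the study's local database or its trained model.

```python
# Hedged sketch: SVM risk classifier on synthetic Framingham-style features.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(4)
n = 2_000
X = np.column_stack([
    rng.normal(55, 10, n),    # age
    rng.normal(130, 20, n),   # systolic blood pressure
    rng.normal(5.2, 1.0, n),  # total cholesterol (mmol/L)
    rng.binomial(1, 0.3, n),  # current smoker
])
logit = -9.0 + 0.06 * X[:, 0] + 0.02 * X[:, 1] + 0.3 * X[:, 2] + 0.7 * X[:, 3]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # synthetic CVD outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", class_weight="balanced"))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

print("sensitivity:", round(recall_score(y_te, pred), 3))
print("specificity:", round(recall_score(y_te, pred, pos_label=0), 3))
```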

  20. State-of-the-art risk-based approach to spill contingency planning and risk management

    International Nuclear Information System (INIS)

    Schmidt Etkin, Dagmar; Reilly, Timothy; French McCay, Deborah

    2011-01-01

    The paper proposes incorporating a comprehensive examination of spill risk into risk management and contingency planning, and applying state-of-the-art modeling tools to evaluate various alternatives for appropriate spill response measures and optimize protective responses. The approach allows spill contingency planners and decision-makers to determine the types of spill scenarios that may occur in a particular location or from a particular source and calculate the probability distribution of the various scenarios. The spill probability information is useful in assessing and putting into perspective the various costs options for spill control systems that will be recommended ultimately. Using advanced modeling tools helps in estimating the potential environmental and socioeconomic consequences of each spill scenario based on location-specific factors over a range of stochastic possibilities, simulating spill scenarios and determining optimal responses and protection strategies. The benefits and costs of various response alternatives and variations in response time can be calculated and modeling tools for training and risk allocation/transfer purposes used.

  1. Validation of risk-based performance indicators: Safety system function trends

    International Nuclear Information System (INIS)

    Boccio, J.L.; Vesely, W.E.; Azarm, M.A.; Carbonaro, J.F.; Usher, J.L.; Oden, N.

    1989-10-01

    This report describes and applies a process for validating a model for a risk-based performance indicator. The purpose of the risk-based indicator evaluated, Safety System Function Trend (SSFT), is to monitor the unavailability of selected safety systems. Interim validation of this indicator is based on three aspects: a theoretical basis, an empirical basis relying on statistical correlations, and case studies employing 25 plant years of historical data collected from five plants for a number of safety systems. Results using the SSFT model are encouraging. Application of the model through case studies dealing with the performance of important safety systems shows that statistically significant trends in, and levels of, system performance can be discerned which thereby can provide leading indications of degrading and/or improving performances. Methods for developing system performance tolerance bounds are discussed and applied to aid in the interpretation of the trends in this risk-based indicator. Some additional characteristics of the SSFT indicator, learned through the data-collection efforts and subsequent data analyses performed, are also discussed. The usefulness and practicality of other data sources for validation purposes are explored. Further validation of this indicator is noted. Also, additional research is underway in developing a more detailed estimator of system unavailability. 9 refs., 18 figs., 5 tabs

  2. Risk-based high-throughput chemical screening and prioritization using exposure models and in vitro bioactivity assays

    DEFF Research Database (Denmark)

    Shin, Hyeong-Moo; Ernstoff, Alexi; Arnot, Jon

    2015-01-01

    We present a risk-based high-throughput screening (HTS) method to identify chemicals for potential health concerns or for which additional information is needed. The method is applied to 180 organic chemicals as a case study. We first obtain information on how the chemical is used and identify....../oral contact, or dermal exposure. The method provides high-throughput estimates of exposure and important input for decision makers to identify chemicals of concern for further evaluation with additional information or more refined models....

  3. Data analyses and modelling for risk based monitoring of mycotoxins in animal feed

    NARCIS (Netherlands)

    Ine van der Fels-Klerx, H.J.; Adamse, Paulien; Punt, Ans; Asselt, van Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as for mycotoxins, in feed and food. These programs need to be risk based implying the checks are regular and proportional to the estimated risk for animal and human health. This study

  4. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    Science.gov (United States)

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.
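
    To make the fault-tree idea concrete, a toy calculation is sketched below in which a collision requires the conjunction of an encounter, failed avoidance, and a blade strike; all probabilities are hypothetical illustrations, not values from the paper:

```python
# Toy fault-tree-style collision risk calculation (illustrative parameters only,
# not those of the paper): collision = encounter AND failed avoidance AND strike.
def collision_probability(p_encounter, p_avoid, p_strike_given_passage):
    """Probability of collision for a single turbine passage."""
    return p_encounter * (1.0 - p_avoid) * p_strike_given_passage

# Hypothetical values: small fish detect and avoid the rotor well; large fish
# in low-visibility water avoid it poorly.
p_small = collision_probability(0.10, 0.95, 0.30)
p_large = collision_probability(0.10, 0.40, 0.45)

print(f"P(collision), small fish:                 {p_small:.4f}")
print(f"P(collision), large fish, low visibility: {p_large:.4f}")
print(f"Expected collisions per 10,000 passages (large fish): {10_000 * p_large:.0f}")
```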

  5. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    Directory of Open Access Journals (Sweden)

    Linus Hammar

    Full Text Available A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.

  6. Risk-based regulation: Challenges and opportunities

    International Nuclear Information System (INIS)

    Bari, R.A.

    1995-01-01

    Over the last twenty years there has been a gradual but steady movement toward increased usage of risk-based methods and results in the regulatory process. The ''risk perspective'' as a supportive view to existing (non-risk-based or deterministic) information used in decision making now has a firm foothold in most countries that regulate nuclear power. Furthermore, in areas outside the nuclear power field, such as health risk assessment, risk-based information is used increasingly to make decisions on the potential impacts of chemical, biological, and radiological exposures. Some of the principal concepts and issues pertinent to risk-based regulation are reviewed. There is growing interest in most countries in the use of risk-based methods and results to facilitate decision making associated with regulatory processes. A summary is presented of the challenges and opportunities related to expanded use of risk-based regulation

  7. Some computer simulations based on the linear relative risk model

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1991-10-01

    This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustrations of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score statistic and likelihood ratio statistic were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs
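
    A minimal sketch of the kind of simulation the report describes, assuming a linear relative risk model RR(d) = 1 + beta*d with Poisson case counts; the dose groups, person-years and true beta are hypothetical, and the baseline rate is treated as known for brevity:

```python
# Minimal sketch (not the report's code) of the linear relative risk model
# RR(d) = 1 + beta*d with Poisson case counts, and a likelihood-ratio-based
# confidence interval for the excess relative risk coefficient beta.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)

# Hypothetical dose groups (Sv), person-years, and baseline rate.
dose = np.array([0.0, 0.05, 0.1, 0.2, 0.5])
pyr = np.array([50_000, 20_000, 10_000, 5_000, 1_000])
baseline_rate = 2e-3      # cases per person-year at zero dose (treated as known)
true_beta = 0.5           # excess relative risk per Sv

cases = rng.poisson(baseline_rate * pyr * (1 + true_beta * dose))

def log_lik(beta):
    mu = baseline_rate * pyr * (1 + beta * dose)
    return np.sum(cases * np.log(mu) - mu)

# Profile the likelihood over a grid, respecting the lower constraint
# beta > -1/max(dose) so all fitted rates stay positive.
grid = np.linspace(-1.9, 10.0, 4000)
ll = np.array([log_lik(b) for b in grid])
beta_hat = grid[np.argmax(ll)]

# 95% likelihood-ratio confidence set: points within chi2_{1,0.95}/2 of the maximum.
inside = ll >= ll.max() - chi2.ppf(0.95, df=1) / 2
print(f"beta_hat={beta_hat:.2f}, 95% LR CI=({grid[inside].min():.2f}, {grid[inside].max():.2f})")
```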

  8. Analysis of uncertainty in modeling perceived risks

    International Nuclear Information System (INIS)

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)

  9. Estimating the value of a Country's built assets: investment-based exposure modelling for global risk assessment

    Science.gov (United States)

    Daniell, James; Pomonis, Antonios; Gunasekera, Rashmin; Ishizawa, Oscar; Gaspari, Maria; Lu, Xijie; Aubrecht, Christoph; Ungar, Joachim

    2017-04-01

    In order to quantify disaster risk, there is a demand and need to determine a consistent and reliable economic value of the built assets exposed to natural hazards at national or sub-national level. The value of the built stock in the context of a city or a country is critical for risk modelling applications, as it allows the upper bound on potential losses to be established. Under the World Bank probabilistic disaster risk assessment - Country Disaster Risk Profiles (CDRP) Program and rapid post-disaster loss analyses in CATDAT, key methodologies have been developed that quantify the asset exposure of a country. In this study, we assess the complementary methods of determining the value of building stock through capital investment data versus aggregated ground-up values based on built area and unit cost of construction analyses. Different approaches to modelling exposure around the world have resulted in estimated values of built assets of some countries differing by order(s) of magnitude. Using the aforementioned methodology of comparing investment-based capital stock and bottom-up unit-cost-of-construction values per square meter of assets, a suitable range of capital stock estimates for built assets has been created. A blind test format was undertaken to compare the two types of approaches, top-down (investment) and bottom-up (construction cost per unit). In many cases, census data and demographic, engineering and construction cost data are key for bottom-up calculations from previous years. Similarly, for the top-down investment approach, distributed GFCF (Gross Fixed Capital Formation) data is also required. Over the past few years, numerous studies have been undertaken through the World Bank Caribbean and Central America disaster risk assessment program adopting this methodology initially developed by Gunasekera et al. (2015). The range of values of the building stock is tested for around 15 countries. In addition, three types of costs - Reconstruction cost

  10. A School-Based Violence Prevention Model for At-Risk Eighth Grade Youth.

    Science.gov (United States)

    Rollin, Stephen A.; Kaiser-Ulrey, Cheryl; Potts, Isabelle; Creason, Alia Haque

    2003-01-01

    Examines the effectiveness of a school and community-based violence prevention program for at-risk eighth-grade students. School officials matched intervention students with community-based mentors in an employment setting. Findings suggest that mentored students had significant reductions in total number and days of suspensions, days of sanction,…

  11. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
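
    The tutorial's examples are in MATLAB and R; the sketch below is an analogous, simplified illustration in Python of an embarrassingly parallel Monte Carlo risk simulation, with a made-up loss model:

```python
# Analogous Python sketch (the tutorial itself uses MATLAB and R): an
# embarrassingly parallel Monte Carlo risk simulation, with independent
# replications farmed out to worker processes.
import numpy as np
from multiprocessing import Pool

def one_replication(seed):
    """One independent replication: total annual loss from random events."""
    rng = np.random.default_rng(seed)
    n_events = rng.poisson(3.0)                              # loss events in a year
    losses = rng.lognormal(mean=10.0, sigma=1.0, size=n_events)
    return losses.sum()

if __name__ == "__main__":
    n_reps = 100_000
    with Pool() as pool:                                     # uses all available cores
        totals = pool.map(one_replication, range(n_reps), chunksize=1000)
    totals = np.asarray(totals)
    print(f"mean annual loss: {totals.mean():,.0f}")
    print(f"99th percentile:  {np.percentile(totals, 99):,.0f}")
```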

  12. A multi-reservoir based water-hydroenergy management model for identifying the risk horizon of regional resources-energy policy under uncertainties

    International Nuclear Information System (INIS)

    Zeng, X.T.; Zhang, S.J.; Feng, J.; Huang, G.H.; Li, Y.P.; Zhang, P.; Chen, J.P.; Li, K.L.

    2017-01-01

    Highlights: • A multi-reservoir system can handle water/energy deficit, flood and sediment damage. • A MWH model is developed for planning a water allocation and energy generation issue. • A mixed fuzzy-stochastic risk analysis method (MFSR) can handle uncertainties in MWH. • A hybrid MWH model can plan human-recourse-energy with a robust and effective manner. • Results can support adjusting water-energy policy to satisfy increasing demands. - Abstract: In this study, a multi-reservoir based water-hydroenergy management (MWH) model is developed for planning water allocation and hydroenergy generation (WAHG) under uncertainties. A mixed fuzzy-stochastic risk analysis method (MFSR) is introduced to handle objective and subjective uncertainties in MWH model, which can couple fuzzy credibility programming and risk management within a general two-stage context, with aim to reflect the infeasibility risks between expected targets and random second-stage recourse costs. The developed MWH model (embedded by MFSR method) can be applied to a practical study of WAHG issue in Jing River Basin (China), which encounters conflicts between human activity and resource/energy crisis. The construction of water-energy nexus (WEN) is built to reflect integrity of economic development and resource/energy conservation, as well as confronting natural and artificial damages such as water deficit, electricity insufficient, floodwater, high sedimentation deposition contemporarily. Meanwhile, the obtained results with various credibility levels and target-violated risk levels can support generating a robust plan associated with risk control for identification of the optimized water-allocation and hydroenergy-generation alternatives, as well as flood controls. Moreover, results can be beneficial for policymakers to discern the optimal water/sediment release routes, reservoirs’ storage variations (impacted by sediment deposition), electricity supply schedules and system benefit

  13. 76 FR 1889 - Risk-Based Capital Guidelines: Market Risk

    Science.gov (United States)

    2011-01-11

    ... ``three-pillar'' framework that includes (i) risk-based capital requirements for credit risk, market risk... incremental risk capital requirement to capture default and credit quality migration risk for non... (advanced approaches rules) (collectively, the credit risk capital rules) \\8\\ by requiring any bank subject...

  14. Risk modelling in portfolio optimization

    Science.gov (United States)

    Lam, W. H.; Jaaman, Saiful Hafizah Hj.; Isa, Zaidi

    2013-09-01

    Risk management is very important in portfolio optimization. The mean-variance model has been used in portfolio optimization to minimize investment risk. The objective of the mean-variance model is to minimize portfolio risk while achieving a target rate of return, with variance used as the risk measure. The purpose of this study is to compare the portfolio composition and performance of the optimal portfolio from the mean-variance model with those of an equally weighted portfolio, in which the proportions invested in each asset are equal. The results show that the compositions of the mean-variance optimal portfolio and the equally weighted portfolio differ. Moreover, the mean-variance optimal portfolio gives better performance because it yields a higher performance ratio than the equally weighted portfolio.
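
    A minimal sketch of the comparison described above, using illustrative expected returns and covariances: the minimum-variance portfolio for a target return (closed-form Lagrangian solution, short positions allowed) versus an equally weighted portfolio:

```python
# Minimal sketch of the mean-variance vs equal-weight comparison (inputs are
# illustrative): minimize w'Σw subject to w'μ = target and w'1 = 1.
import numpy as np

mu = np.array([0.08, 0.10, 0.12])                 # expected annual returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])              # covariance of returns
target = 0.10

ones = np.ones_like(mu)
inv = np.linalg.inv(cov)
A = ones @ inv @ ones
B = ones @ inv @ mu
C = mu @ inv @ mu
D = A * C - B**2
lam = (C - B * target) / D
gam = (A * target - B) / D
w_opt = inv @ (lam * ones + gam * mu)             # mean-variance optimal weights

w_eq = ones / len(mu)                             # equally weighted portfolio

for name, w in [("mean-variance", w_opt), ("equal-weight", w_eq)]:
    ret, vol = w @ mu, np.sqrt(w @ cov @ w)
    print(f"{name:>13}: weights={np.round(w, 3)}, return={ret:.3f}, "
          f"risk={vol:.3f}, return/risk={ret/vol:.2f}")
```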

  15. Model Development for Risk Assessment of Driving on Freeway under Rainy Weather Conditions.

    Directory of Open Access Journals (Sweden)

    Xiaonan Cai

    Full Text Available Rainy weather conditions can have significantly negative impacts on freeway driving. However, due to a lack of historical data and monitoring facilities, many regions are unable to establish reliable risk assessment models to identify such impacts. Given this situation, this paper provides an alternative solution in which the risk assessment procedure is developed from drivers' subjective questionnaires and its performance is validated using actual crash data. First, an ordered logit model was developed, based on questionnaire data collected from Freeway G15 in China, to estimate the relationship between drivers' perceived risk and factors including vehicle type, rain intensity, traffic volume, and location. Then, the weighted driving risk for different conditions was obtained from the model and further divided into four levels of early warning (specified by colors) using a rank-order cluster analysis. After that, a risk matrix was established to determine which warning color should be disseminated to drivers for a given condition. Finally, to validate the proposed procedure, actual crash data from Freeway G15 were compared with the safety prediction based on the risk matrix. The results show that the risk matrix obtained in the study is able to predict driving risk consistent with actual safety implications under rainy weather conditions.
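
    The sketch below illustrates only the final mapping step, from a weighted risk score to one of four warning colors arranged in a risk matrix; the weights and cut points are hypothetical placeholders, not the paper's fitted ordered logit coefficients or cluster thresholds:

```python
# Sketch with hypothetical weights and thresholds (not the paper's fitted model):
# combine condition factors into a weighted driving-risk score and map it to
# one of four early-warning colors, forming a simple risk matrix.
import numpy as np

# Hypothetical weights for rain intensity (0=light..2=heavy), traffic volume
# (0=low..2=high), and heavy-vehicle indicator, e.g. derived from an ordered logit.
W_RAIN, W_VOLUME, W_TRUCK = 1.0, 0.6, 0.8

def risk_score(rain_level, volume_level, heavy_vehicle):
    return W_RAIN * rain_level + W_VOLUME * volume_level + W_TRUCK * heavy_vehicle

# Hypothetical cut points between the four warning levels
# (in practice these would come from a rank-order cluster analysis).
CUTS = [1.0, 2.0, 3.0]
COLORS = ["green", "yellow", "orange", "red"]

def warning_color(score):
    return COLORS[int(np.searchsorted(CUTS, score, side="right"))]

# Risk matrix: warning color for each rain level x traffic volume (passenger cars).
for rain in range(3):
    row = [warning_color(risk_score(rain, vol, heavy_vehicle=0)) for vol in range(3)]
    print(f"rain={rain}: {row}")
```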

  16. Validation of a model for ranking aquaculture facilities for risk-based disease surveillance.

    Science.gov (United States)

    Diserens, Nicolas; Falzon, Laura Cristina; von Siebenthal, Beat; Schüpbach-Regula, Gertraud; Wahli, Thomas

    2017-09-15

    A semi-quantitative model for risk ranking of aquaculture facilities in Switzerland with regard to the introduction and spread of Viral Haemorrhagic Septicaemia (VHS) and Infectious Haematopoietic Necrosis (IHN) was developed in a previous study (Diserens et al., 2013). The objective of the present study was to validate this model using data collected during field visits to aquaculture sites in four Swiss cantons, compared with data collected through a questionnaire in the previous study. A discrepancy between the values obtained with the two methods was found in 32.8% of the parameters, resulting in a statistically significant difference between them. The model nonetheless appeared suitable for the risk ranking of Swiss aquaculture facilities according to their risk of becoming infected with or spreading VHS and IHN, as the five facilities that tested positive for these diseases in the last ten years were ranked as medium or high risk. Moreover, because the seven fish farms that were infected with Infectious Pancreatic Necrosis (IPN) during the same period also belonged to the medium and high risk categories, the classification appeared to correlate with the occurrence of this third viral fish disease. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Risk-based remediation of polluted sites: A critical perspective.

    Science.gov (United States)

    Kuppusamy, Saranya; Venkateswarlu, Kadiyala; Megharaj, Mallavarapu; Mayilswami, Srinithi; Lee, Yong Bok

    2017-11-01

    Sites contaminated with chemical pollutants represent a growing challenge, and remediation of such lands is of international concern. Risk-based land management (RBLM) is an emerging approach that integrates risk assessment practices with more traditional site-specific investigations and remediation activities. Developing countries are yet to adopt RBLM strategies for remediation. RBLM is considered practical, scientifically defensible and cost-efficient. However, it is inherently limited by, firstly, the accuracy of the risk assessment models used; secondly, the ramifications of the fact that it is more likely to leave contamination in place; and thirdly, the uncertainties involved in considering the total concentrations of all contaminants in soils, which overestimate the potential risks from exposure to the contaminants. Consideration of contaminant bioavailability as the underlying basis for risk assessment and for setting remediation goals for those contaminated lands that pose a risk to environmental and human health may lead to the development of a more sophisticated risk-based approach. However, employing the bioavailability concept in RBLM has not been extensively studied and/or legalized. This review highlights the extent of global land contamination and the concept of risk-based assessment and management of contaminated sites, including its advantages and disadvantages. Furthermore, the concept of a bioavailability-based RBLM strategy is proposed, and the challenges of RBLM and the priority areas for future research are summarized. Thus, the present review may help achieve a better understanding and successful implementation of a sustainable bioavailability-based RBLM strategy. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable fell below recommended levels in several models. Only 2 models presented the recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. In summary, there are several incident HF risk prediction models with sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Conceptualizing a Dynamic Fall Risk Model Including Intrinsic Risks and Exposures.

    Science.gov (United States)

    Klenk, Jochen; Becker, Clemens; Palumbo, Pierpaolo; Schwickert, Lars; Rapp, Kilan; Helbostad, Jorunn L; Todd, Chris; Lord, Stephen R; Kerse, Ngaire

    2017-11-01

    Falls are a major cause of injury and disability in older people, leading to serious health and social consequences including fractures, poor quality of life, loss of independence, and institutionalization. To design and provide adequate prevention measures, accurate understanding and identification of a person's individual fall risk are important. However, to date, the performance of fall risk models is weak compared with models estimating, for example, cardiovascular risk. This deficiency may result from 2 factors. First, current models consider risk factors to be stable for each person and not to change over time, an assumption that does not reflect real-life experience. Second, current models do not consider the interplay of individual exposure, including the type of activity (eg, walking, undertaking transfers) and the environmental risks (eg, lighting, floor conditions) in which the activity is performed. Therefore, we posit a dynamic fall risk model consisting of intrinsic risk factors that vary over time and exposure (activity in context). eHealth sensor technology (eg, smartphones) is beginning to enable the continuous measurement of both of the above factors. We illustrate our model with examples of real-world falls from the FARSEEING database. This dynamic framework for fall risk adds important aspects that may improve understanding of fall mechanisms, fall risk models, and the development of fall prevention interventions. Copyright © 2017 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  20. Predicting the risk of rheumatoid arthritis and its age of onset through modelling genetic risk variants with smoking.

    Directory of Open Access Journals (Sweden)

    Ian C Scott

    Full Text Available The improved characterisation of risk factors for rheumatoid arthritis (RA) suggests they could be combined to identify individuals at increased disease risk in whom preventive strategies may be evaluated. We aimed to develop an RA prediction model capable of generating clinically relevant predictive data and to determine if it better predicted younger onset RA (YORA). Our novel modelling approach combined odds ratios for 15 four-digit/10 two-digit HLA-DRB1 alleles, 31 single nucleotide polymorphisms (SNPs) and ever-smoking status in males to determine risk using computer simulation and confidence interval based risk categorisation. Only males were evaluated in our models incorporating smoking, as ever-smoking is a significant risk factor for RA in men but not women. We developed multiple models to evaluate each risk factor's impact on prediction. Each model's ability to discriminate anti-citrullinated protein antibody (ACPA)-positive RA from controls was evaluated in two cohorts: Wellcome Trust Case Control Consortium (WTCCC: 1,516 cases; 1,647 controls) and UK RA Genetics Group Consortium (UKRAGG: 2,623 cases; 1,500 controls). HLA and smoking provided the strongest prediction, with good discrimination evidenced by an HLA-smoking model area under the curve (AUC) value of 0.813 in both WTCCC and UKRAGG. SNPs provided minimal prediction (AUC 0.660 WTCCC/0.617 UKRAGG). Whilst high individual risks were identified, with some cases having estimated lifetime risks of 86%, only a minority overall had substantially increased odds for RA. High risks from the HLA model were associated with YORA (P<0.0001); ever-smoking was associated with older onset disease. This latter finding suggests smoking's impact on RA risk manifests later in life. Our modelling demonstrates that combining risk factors provides clinically informative RA prediction; additionally, HLA and smoking status can be used to predict the risk of younger and older onset RA, respectively.

  1. Model of Axiological Dimension Risk Management

    Directory of Open Access Journals (Sweden)

    Kulińska Ewa

    2016-01-01

    Full Text Available On the basis of the obtained results, which identify the key prerequisites for integrating the management of logistics processes, the management of the value creation process, and risk management, the methodological basis for constructing the axiological dimension of risk management (ADRM) model of logistics processes was determined. Taking into account the contribution of the individual concepts to the new research area, its essence was defined as an integrated, structured instrumentation aimed at the identification and implementation of logistics processes supporting the creation of added value, as well as the identification and elimination of risk factors disturbing the process of value creation for internal and external customers. The basis for the ADRM concept of logistics processes is the use of the potential inherent in synergistic effects, which are obtained by using the prerequisites for integrating the management of logistics processes, value creation and risk management as the key determinants of value creation.

  2. Credit Risk Analysis Using Machine and Deep Learning Models

    Directory of Open Access Journals (Sweden)

    Peter Martey Addo

    2018-04-01

    Full Text Available Due to the advanced technology associated with Big Data, data availability and computing power, most banks or lending institutions are renewing their business models. Credit risk predictions, monitoring, model reliability and effective loan processing are key to decision-making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data in predicting loan default probability. The top 10 important features from these models are selected and then used in the modeling process to test the stability of binary classifiers by comparing their performance on separate data. We observe that the tree-based models are more stable than the models based on multilayer artificial neural networks. This opens several questions relative to the intensive use of deep learning systems in enterprises.
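
    A minimal sketch of the workflow described above, on synthetic data rather than the authors' loan dataset: fit a tree-based default classifier, keep the ten most important features, and refit to compare performance:

```python
# Minimal sketch (synthetic, imbalanced default data): train a tree-based
# classifier, select the top-10 features by importance, and re-fit to check
# how stable the performance is with the reduced feature set.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=5000, n_features=40, n_informative=12,
                           weights=[0.9, 0.1], random_state=0)   # ~10% defaults
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

full = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
auc_full = roc_auc_score(y_te, full.predict_proba(X_te)[:, 1])

top10 = np.argsort(full.feature_importances_)[::-1][:10]         # top-10 features
reduced = RandomForestClassifier(n_estimators=300, random_state=0)
reduced.fit(X_tr[:, top10], y_tr)
auc_reduced = roc_auc_score(y_te, reduced.predict_proba(X_te[:, top10])[:, 1])

print(f"AUC all features: {auc_full:.3f}  |  AUC top-10 features: {auc_reduced:.3f}")
```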

  3. Electricity market pricing, risk hedging and modeling

    Science.gov (United States)

    Cheng, Xu

    In this dissertation, we investigate pricing, price risk hedging/arbitrage, and simplified system modeling for a centralized LMP-based electricity market. In an LMP-based market model, the full AC power flow model and the DC power flow model are most widely used to represent the transmission system. We investigate the differences in dispatching results, congestion patterns, and LMPs for the two power flow models. An appropriate LMP decomposition scheme to quantify the marginal costs of congestion and real power losses is critical for the implementation of financial risk hedging markets. However, the traditional LMP decomposition depends heavily on the slack bus selection. In this dissertation we propose a slack-independent scheme to break LMP down into energy, congestion, and marginal loss components by analyzing the actual marginal cost of each bus at the optimal solution point. The physical and economic meanings of the marginal effect at each bus provide accurate price information for both congestion and losses, and thus the slack-dependency of the traditional scheme is eliminated. With electricity priced at the margin instead of the average value, the market operator typically collects more revenue from power sellers than is paid to power buyers. According to the LMP decomposition results, this revenue surplus is then divided into two parts: congestion charge surplus and marginal loss revenue surplus. We apply the LMP decomposition results to financial tools, such as the financial transmission right (FTR) and loss hedging right (LHR), which have been introduced to hedge against price risks associated with congestion and losses, to construct a full price risk hedging portfolio. The two-settlement market structure and the introduction of financial tools inevitably create market manipulation opportunities. We investigate several possible market manipulation behaviors by virtual bidding and propose a market monitor approach to identify and quantify such

  4. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    DEFF Research Database (Denmark)

    Jensen, Ninna Reitzel; Schomacker, Kristian Juul

    2015-01-01

    Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death......, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance...... product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account...

  5. Risk-based classification system of nanomaterials

    International Nuclear Information System (INIS)

    Tervonen, Tommi; Linkov, Igor; Figueira, Jose Rui; Steevens, Jeffery; Chappell, Mark; Merad, Myriam

    2009-01-01

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
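
    The sketch below is a simplified illustration of the Monte Carlo idea behind SMAA-TRI rather than the method itself: criterion weights are sampled uniformly from the simplex, each material is scored and categorized, and the share of draws per category indicates how robust the grouping is; the criteria, scores and thresholds are hypothetical:

```python
# Simplified illustration of the Monte Carlo robustness idea (not SMAA-TRI
# itself): sample feasible criterion weights, score each material, assign a
# risk category by fixed thresholds, and report how stable the assignment is.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical criterion scores in [0, 1] (higher = worse): toxicity,
# environmental persistence, production variability.
materials = {
    "nano-A": np.array([0.9, 0.6, 0.7]),
    "nano-B": np.array([0.4, 0.5, 0.3]),
    "nano-C": np.array([0.2, 0.1, 0.4]),
}
thresholds = [0.35, 0.65]                            # low/medium and medium/high cuts
categories = ["low", "medium", "high"]

n_draws = 10_000
weights = rng.dirichlet(np.ones(3), size=n_draws)    # uniform over the weight simplex

for name, scores in materials.items():
    overall = weights @ scores                       # weighted score per draw
    cats = np.digitize(overall, thresholds)          # 0=low, 1=medium, 2=high
    shares = np.bincount(cats, minlength=3) / n_draws
    print(f"{name}: " + ", ".join(f"{c}={s:.0%}" for c, s in zip(categories, shares)))
```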

  6. GIS-Based Population Model Applied to Nevada Transportation Routes

    International Nuclear Information System (INIS)

    Mills, G.S.; Neuhauser, K.S.

    1999-01-01

    Recently, a model based on geographic information system (GIS) processing of US Census Block data has made high-resolution population analysis for transportation risk analysis technically and economically feasible. Population density bordering each kilometer of a route may be tabulated, with specific route sections falling into each of three categories (Rural, Suburban or Urban) identified for separate risk analysis. In addition to the improvement in resolution of Urban areas along a route, the model provides a statistically based correction to population densities in Rural and Suburban areas where Census Block dimensions may greatly exceed the 800-meter scale of interest. A semi-automated application of the GIS model to a subset of routes in Nevada (related to the Yucca Mountain project) is presented, and the results are compared with previous models, including a model based on published Census and other data. These comparisons demonstrate that meaningful improvement in the accuracy and specificity of transportation risk analyses is dependent on correspondingly accurate and geographically specific population density data

  7. RiskREP: Risk-Based Security Requirements Elicitation and Prioritization

    OpenAIRE

    Herrmann, Andrea; Morali, A.; Etalle, Sandro; Wieringa, Roelf J.; Niedrite, Laila; Strazdina, Renate; Wangler, Benkt

    2011-01-01

    Companies are under pressure to be in control of their assets but at the same time they must operate as efficiently as possible. This means that they aim to implement “good-enough security” but need to be able to justify their security investment plans. In this paper, we present a Risk-Based Requirements Prioritization method (RiskREP) that extends misuse case-based methods with IT architecture based risk assessment and countermeasure definition and prioritization. Countermeasure prioritizati...

  8. A Bayesian approach to the evaluation of risk-based microbiological criteria for Campylobacter in broiler meat

    DEFF Research Database (Denmark)

    Ranta, Jukka; Lindqvist, Roland; Hansson, Ingrid

    2015-01-01

    Shifting from traditional hazard-based food safety management toward risk-based management requires statistical methods for evaluating intermediate targets in food production, such as microbiological criteria (MC), in terms of their effects on human risk of illness. A fully risk-based evaluation...... of MC involves several uncertainties that are related to both the underlying Quantitative Microbiological Risk Assessment (QMRA) model and the production-specific sample data on the prevalence and concentrations of microbes in production batches. We used Bayesian modeling for statistical inference...

  9. Risk matrix model applied to the outsourcing of logistics' activities

    Directory of Open Access Journals (Sweden)

    Fouad Jawab

    2015-09-01

    Full Text Available Purpose: This paper proposes the application of the risk matrix model in the field of logistics outsourcing. Such an application can serve as the basis for decision making regarding risk management in the logistics outsourcing process and allow its prevention. Design/methodology/approach: This study is based on the risk management of logistics outsourcing in the retail sector in Morocco. The authors identify all possible risks and then classify and prioritize them using the risk matrix model. Finally, we arrive at four possible decisions for the identified risks. The analysis was made possible through interviews and discussions with the heads of departments and agents who are directly involved in each outsourced activity. Findings and Originality/value: It is possible to improve the risk matrix model by proposing more personalized prevention measures according to each company operating in mass-market retailing. Originality/value: This study is the only one conducted on the logistics outsourcing process in the Moroccan retail sector, using Label’vie as a case study. First, we identified, as thoroughly as we could, all possible risks; then we applied the risk matrix model to sort them in ascending order of importance and criticality. As a result, we could provide decision-makers with a risk map for effective risk control and better guidance of the risk management process.
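
    A hedged sketch of the sorting step described above: each outsourcing risk is placed on a probability-impact matrix and its cell mapped to one of four decisions; the scales, cut-offs, decision labels and example risks are assumptions for illustration, not the study's values:

```python
# Illustrative probability-impact risk matrix (hypothetical scales and cut-offs):
# rank outsourcing risks and map each to one of four possible decisions.
risks = {
    "late deliveries":        (4, 3),   # (probability 1-5, impact 1-5)
    "loss of know-how":       (2, 5),
    "provider bankruptcy":    (1, 5),
    "inventory inaccuracies": (3, 2),
}

def decision(probability, impact):
    score = probability * impact
    if score >= 15:
        return "avoid / redesign the outsourcing contract"
    if score >= 9:
        return "mitigate with dedicated controls"
    if score >= 4:
        return "transfer (insurance, penalties) or monitor"
    return "accept"

# Sort risks in descending order of criticality before reporting.
for name, (p, i) in sorted(risks.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True):
    print(f"{name:<24} score={p*i:>2}  ->  {decision(p, i)}")
```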

  10. Environmental modeling and health risk analysis (ACTS/RISK)

    National Research Council Canada - National Science Library

    Aral, M. M

    2010-01-01

    ... presents a review of the topics of exposure and health risk analysis. The Analytical Contaminant Transport Analysis System (ACTS) and Health RISK Analysis (RISK) software tools are an integral part of the book and provide computational platforms for all the models discussed herein. The most recent versions of these two softwa...

  11. PACE and the Medicare+Choice risk-adjusted payment model.

    Science.gov (United States)

    Temkin-Greener, H; Meiners, M R; Gruenberg, L

    2001-01-01

    This paper investigates the impact of the Medicare principal inpatient diagnostic cost group (PIP-DCG) payment model on the Program of All-Inclusive Care for the Elderly (PACE). Currently, more than 6,000 Medicare beneficiaries who are nursing home certifiable receive care from PACE, a program poised for expansion under the Balanced Budget Act of 1997. Overall, our analysis suggests that the application of the PIP-DCG model to the PACE program would reduce Medicare payments to PACE, on average, by 38%. The PIP-DCG payment model bases its risk adjustment on inpatient diagnoses and does not capture adequately the risk of caring for a population with functional impairments.

  12. 12 CFR 567.6 - Risk-based capital credit risk-weight categories.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Risk-based capital credit risk-weight... CAPITAL Regulatory Capital Requirements § 567.6 Risk-based capital credit risk-weight categories. (a) Risk...)(2) of this section), plus risk-weighted recourse obligations, direct credit substitutes, and certain...

  13. Evaluating the Impact of Prescription Fill Rates on Risk Stratification Model Performance.

    Science.gov (United States)

    Chang, Hsien-Yen; Richards, Thomas M; Shermock, Kenneth M; Elder Dalpoas, Stacy; J Kan, Hong; Alexander, G Caleb; Weiner, Jonathan P; Kharrazi, Hadi

    2017-12-01

    Risk adjustment models are traditionally derived from administrative claims. Prescription fill rates - extracted by comparing electronic health record prescriptions and pharmacy claims fills - represent a novel measure of medication adherence and may improve the performance of risk adjustment models. We evaluated the impact of prescription fill rates on claims-based risk adjustment models in predicting both concurrent and prospective costs and utilization. We conducted a retrospective cohort study of 43,097 primary care patients from the HealthPartners network between 2011 and 2012. Diagnosis and/or pharmacy claims from 2011 were used to build 3 base models using the Johns Hopkins ACG system, in addition to demographics. Model performances were compared before and after adding 3 types of prescription fill rates: primary 0-7 days, primary 0-30 days, and overall. Overall fill rates utilized all ordered prescriptions from the electronic health record, while primary fill rates excluded refill orders. The overall, primary 0-7, and 0-30 days fill rates were 72.30%, 59.82%, and 67.33%, respectively. The fill rates were similar between sexes but varied across different medication classifications, with the youngest patients having the highest rate. Adding fill rates modestly improved the performance of all models in explaining medical costs (improving concurrent R² by 1.15% to 2.07%), followed by total costs (0.58% to 1.43%) and pharmacy costs (0.07% to 0.65%). The impact was greater for concurrent costs than for prospective costs. Base models without diagnosis information showed the highest improvement using prescription fill rates. Prescription fill rates can modestly enhance claims-based risk prediction models; however, population-level improvements in predicting utilization are limited.
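
    A minimal pandas sketch of how a fill rate of this kind could be derived by matching EHR prescription orders to pharmacy claims within a fixed window; the column names, window logic and toy records are assumptions, not the study's data pipeline:

```python
# Hedged sketch (hypothetical columns and toy data, not the study's code):
# derive a prescription fill rate by matching EHR orders to pharmacy claims
# filled within a fixed window after the order date.
import pandas as pd

orders = pd.DataFrame({
    "patient_id": [1, 1, 2, 3],
    "drug":       ["statin", "ace_inhibitor", "statin", "metformin"],
    "order_date": pd.to_datetime(["2011-02-01", "2011-05-10", "2011-03-15", "2011-07-01"]),
})
fills = pd.DataFrame({
    "patient_id": [1, 2],
    "drug":       ["statin", "statin"],
    "fill_date":  pd.to_datetime(["2011-02-05", "2011-05-01"]),
})

def fill_rate(orders, fills, window_days):
    merged = orders.merge(fills, on=["patient_id", "drug"], how="left")
    delta = (merged["fill_date"] - merged["order_date"]).dt.days
    merged["filled"] = delta.between(0, window_days)
    # An order counts as filled if at least one claim falls inside the window.
    per_order = merged.groupby(["patient_id", "drug", "order_date"])["filled"].any()
    return per_order.mean()

print(f"0-30 day fill rate: {fill_rate(orders, fills, 30):.0%}")
print(f"0-7  day fill rate: {fill_rate(orders, fills, 7):.0%}")
```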

  14. Limits of Risk Predictability in a Cascading Alternating Renewal Process Model.

    Science.gov (United States)

    Lin, Xin; Moussawi, Alaa; Korniss, Gyorgy; Bakdash, Jonathan Z; Szymanski, Boleslaw K

    2017-07-27

    Most risk analysis models systematically underestimate the probability and impact of catastrophic events (e.g., economic crises, natural disasters, and terrorism) by not taking into account interconnectivity and interdependence of risks. To address this weakness, we propose the Cascading Alternating Renewal Process (CARP) to forecast interconnected global risks. However, assessments of the model's prediction precision are limited by lack of sufficient ground truth data. Here, we establish prediction precision as a function of input data size by using alternative long ground truth data generated by simulations of the CARP model with known parameters. We illustrate the approach on a model of fires in artificial cities assembled from basic city blocks with diverse housing. The results confirm that parameter recovery variance exhibits power law decay as a function of the length of available ground truth data. Using CARP, we also demonstrate estimation using a disparate dataset that also has dependencies: real-world prediction precision for the global risk model based on the World Economic Forum Global Risk Report. We conclude that the CARP model is an efficient method for predicting catastrophic cascading events with potential applications to emerging local and global interconnected risks.

  15. A framework for widespread replication of a highly spatially resolved childhood lead exposure risk model.

    Science.gov (United States)

    Kim, Dohyeong; Galeano, M Alicia Overstreet; Hull, Andrew; Miranda, Marie Lynn

    2008-12-01

    Preventive approaches to childhood lead poisoning are critical for addressing this longstanding environmental health concern. Moreover, increasing evidence of cognitive effects at low blood lead levels strengthens the case for prevention. System-based childhood lead exposure risk models, especially if executed at highly resolved spatial scales, can help identify children most at risk of lead exposure, as well as prioritize and direct housing and health-protective intervention programs. However, developing highly resolved spatial data requires labor- and time-intensive geocoding and analytical processes. In this study we evaluated the benefit of increased effort spent geocoding in terms of improved performance of lead exposure risk models. We constructed three childhood lead exposure risk models based on established methods but using different levels of geocoded data from blood lead surveillance, county tax assessors, and the 2000 U.S. Census for 18 counties in North Carolina. We used the results to predict lead exposure risk levels mapped at the individual tax parcel unit. The models performed well enough to identify high-risk areas for targeted intervention, even with a relatively low level of effort on geocoding. This study demonstrates the feasibility of widespread replication of highly spatially resolved childhood lead exposure risk models. The models guide resource-constrained local health and housing departments and community-based organizations on how best to expend their efforts in preventing and mitigating lead exposure risk in their communities.

  16. Model of Risk Forewarn and Investment Decision in Stock Markets and Its Realization

    Institute of Scientific and Technical Information of China (English)

    ZOU Hui-wen; TANG Bing-yong; WANG Li-ping; XU Guang-wei

    2004-01-01

    Based on a discussion of the characteristics and mechanisms of stock price volatility in Chinese emerging stock markets, this research designs an index system for risk forewarning and builds an investment decision model based on forewarning of market risk signals. Then, after examining the structure and functions needed to realize the model, the paper presents the data interface method.

  17. Artificial Systems and Models for Risk Covering Operations

    Directory of Open Access Journals (Sweden)

    Laurenţiu Mihai Treapăt

    2017-04-01

    Full Text Available This paper focuses mainly on the roles of artificial intelligence based systems, and especially on risk-covering operations. In this context, the paper offers theoretical explanations of examples and applications drawn from real life. From a general perspective, the paper enriches its value with a wide discussion of the related subject. The paper aims to review volatility estimation models and the correlations between the various time series, and also presents the RiskMetrics methodology, as explained in a case study. The advantages that VaR estimation offers consist of its ability to express quantitatively and numerically the risk level of a portfolio at a certain moment in time, as well as the risk of an open position (in securities, FX, commodities or granted loans) belonging to an economic agent or even an individual; hence its role in more efficient capital allocation, in delimiting the assumed risk, and also as a performance measurement instrument. In this paper and the case study that completes our work, we aim to show how considerable losses, and even bankruptcies, can be prevented if VaR is known and applied accordingly. For this reason, universities in Romania should include or expand the study of the VaR model as an artificial intelligence tool in their curricula. The simplicity of the presented case study is most probably the strongest argument of the current work, because it can be understood even by readers who are not very experienced in the risk management field.
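
    A minimal sketch of a RiskMetrics-style parametric VaR calculation on a synthetic daily return series, assuming the conventional EWMA decay factor of 0.94 and a normal quantile; the position size and returns are illustrative, not values from the case study:

```python
# Minimal RiskMetrics-style sketch (illustrative data): EWMA volatility with
# lambda = 0.94 and a one-day 99% parametric VaR for a given position size.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
returns = rng.normal(0.0, 0.012, size=500)        # synthetic daily returns

lam = 0.94
var_t = returns[0] ** 2
for r in returns[1:]:                             # EWMA variance recursion
    var_t = lam * var_t + (1 - lam) * r ** 2
sigma = np.sqrt(var_t)                            # current daily volatility estimate

position = 1_000_000                              # portfolio value in currency units
confidence = 0.99
var_1d = position * norm.ppf(confidence) * sigma  # one-day parametric VaR

print(f"EWMA daily volatility: {sigma:.4%}")
print(f"1-day 99% VaR: {var_1d:,.0f}")
```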

  18. Risk-Based Two-Stage Stochastic Optimization Problem of Micro-Grid Operation with Renewables and Incentive-Based Demand Response Programs

    Directory of Open Access Journals (Sweden)

    Pouria Sheikhahmadi

    2018-03-01

    Full Text Available The operation problem of a micro-grid (MG) in grid-connected mode is an optimization problem in which the main objective of the MG operator (MGO) is to minimize the operation cost with optimal scheduling of resources and optimal energy trading with the main grid. The MGO can use incentive-based demand response programs (DRPs) to pay an incentive to consumers to change their demands in the peak hours. Moreover, the MGO forecasts the output power of renewable energy resources (RERs) and models their uncertainties in its problem. In this paper, the operation problem of an MGO is modeled as a risk-based two-stage stochastic optimization problem. To model the uncertainties of RERs, two-stage stochastic programming is considered and the conditional value at risk (CVaR) index is used to manage the MGO’s risk level. Moreover, the non-linear economic models of incentive-based DRPs are used by the MGO to change the peak load. Numerical studies are carried out to investigate the effect of incentive-based DRPs on the operation problem of the MGO. Moreover, to show the effect of the risk-averse parameter on MGO decisions, a sensitivity analysis is carried out.
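
    For reference, the CVaR term used in such risk-based two-stage formulations is commonly written as follows; this is the general textbook form, not the paper's specific notation:

```latex
% Rockafellar-Uryasev representation of CVaR for a cost/loss L at level alpha:
\[
  \mathrm{CVaR}_{\alpha}(L) = \min_{\eta \in \mathbb{R}}
  \left\{ \eta + \frac{1}{1-\alpha}\,\mathbb{E}\big[(L-\eta)^{+}\big] \right\}
\]
% A risk-averse operator can then trade off expected cost and tail risk, e.g.
\[
  \min_{x}\;(1-\beta)\,\mathbb{E}\big[C(x,\xi)\big]
          + \beta\,\mathrm{CVaR}_{\alpha}\big(C(x,\xi)\big),
\]
% where x denotes first-stage scheduling decisions, \xi the uncertain RER output,
% C(x,\xi) the total operation cost, and \beta in [0,1] the risk-aversion weight.
```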

  19. Risk-based maintenance-Techniques and applications

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2007-01-01

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to make use of knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more rigorously, in order to achieve tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce the probability of equipment failure and the consequences of failure. In this paper, the risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. The use of suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions

  20. Assessing Breast Cancer Risk Estimates Based on the Gail Model and Its Predictors in Qatari Women.

    Science.gov (United States)

    Bener, Abdulbari; Çatan, Funda; El Ayoubi, Hanadi R; Acar, Ahmet; Ibrahim, Wanis H

    2017-07-01

    The Gail model is the most widely used breast cancer risk assessment tool. An accurate assessment of an individual's breast cancer risk is very important for prevention of the disease and for health care providers making decisions on chemoprevention for high-risk women in clinical practice in Qatar. The aim was to assess breast cancer risk among the Arab women population in Qatar using the Gail model and to provide a global comparison of risk assessment. In this cross-sectional study of 1488 women (aged 35 years and older), we used the Gail Risk Assessment Tool to assess the risk of developing breast cancer. Sociodemographic features such as age, lifestyle habits, body mass index, breast-feeding duration, consanguinity among parents, and family history of breast cancer were considered as possible risk factors. The mean age of the study population was 47.8 ± 10.8 years. Qatari women and other Arab women constituted 64.7% and 35.3% of the study population, respectively. The mean 5-year and lifetime breast cancer risks were 1.12 ± 0.52 and 10.57 ± 3.1, respectively. Consanguineous marriage among parents was seen in 30.6% of participants. We found a relationship between the 5-year and lifetime risks of breast cancer and variables such as age, age at menarche, gravidity, parity, body mass index, family history of cancer, age at menopause, occupation, and level of education. The linear regression analysis identified age, age at menarche, age at first birth, family history and age at menopause as the strongest predictors and significant contributing risk factors for breast cancer after adjusting for ethnicity, parity and other variables. The current study is the first to evaluate the performance of the Gail model for an Arab women population in the Gulf Cooperation Council. The Gail model is an appropriate breast cancer risk assessment tool for the female population in Qatar.

  1. A Risk Prediction Model Based on Lymph-Node Metastasis in Poorly Differentiated-Type Intramucosal Gastric Cancer.

    Directory of Open Access Journals (Sweden)

    Jeung Hui Pyo

    Full Text Available Endoscopic submucosal dissection (ESD) for undifferentiated-type early gastric cancer is regarded as an investigational treatment. Few studies have tried to identify the risk factors that predict lymph-node metastasis (LNM) in intramucosal poorly differentiated adenocarcinomas (PDC). This study was designed to develop a risk scoring system (RSS) for predicting LNM in intramucosal PDC. From January 2002 to July 2015, patients diagnosed with mucosa-confined PDC among those who underwent curative gastrectomy with lymph node dissection were reviewed. A risk model based on independent predictive factors for LNM was developed, and its performance was internally validated using a split-sample approach. Overall, LNM was observed in 5.2% (61 of 1169) of patients. Four risk factors [female sex, tumor size ≥ 3.2 cm, muscularis mucosae (M3) invasion, and lymphatic-vascular involvement] were significantly associated with LNM and were incorporated into the RSS. The area under the receiver operating characteristic curve for predicting LNM after internal validation was 0.69 [95% confidence interval (CI), 0.59-0.79]. A total score of 2 points corresponded to the optimal RSS threshold, with a discrimination of 0.75 (95% CI 0.69-0.81). The LNM rates were 1.6% for low-risk (<2 points) and 8.9% for high-risk (≥2 points) patients, with a negative predictive value of 98.6% (95% CI 0.98-1.00). An RSS could be useful in clinical practice to determine which patients with intramucosal PDC have a low risk of LNM.
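
    An illustrative sketch of how such a scoring system could be applied, assuming one point per risk factor and the 2-point threshold reported above; the exact point allocation is an assumption here and should be taken from the source:

```python
# Illustrative risk scoring sketch (assumed one point per factor; the paper's
# exact weights and calibration should be consulted before any clinical use).
def lnm_risk_points(female, tumor_size_cm, m3_invasion, lymphovascular_involvement):
    points = 0
    points += 1 if female else 0
    points += 1 if tumor_size_cm >= 3.2 else 0
    points += 1 if m3_invasion else 0
    points += 1 if lymphovascular_involvement else 0
    return points

def lnm_risk_group(points, threshold=2):
    return "high risk (>=2 points)" if points >= threshold else "low risk (<2 points)"

patient = dict(female=True, tumor_size_cm=3.5, m3_invasion=False,
               lymphovascular_involvement=False)
pts = lnm_risk_points(**patient)
print(f"score={pts} -> {lnm_risk_group(pts)}")
```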

  2. Predictive Accuracy of the PanCan Lung Cancer Risk Prediction Model -External Validation based on CT from the Danish Lung Cancer Screening Trial

    NARCIS (Netherlands)

    Wille, M.M.W.; Riel, S.J. van; Saghir, Z.; Dirksen, A.; Pedersen, J.H.; Jacobs, C.; Thomsen, L.H.u.; Scholten, E.T.; Skovgaard, L.T.; Ginneken, B. van

    2015-01-01

    Lung cancer risk models should be externally validated to test generalizability and clinical usefulness. The Danish Lung Cancer Screening Trial (DLCST) is a population-based prospective cohort study, used to assess the discriminative performances of the PanCan models. From the DLCST database, 1,152

  3. Ecological models for regulatory risk assessments of pesticides: Developing a strategy for the future.

    NARCIS (Netherlands)

    Thorbek, P.; Forbes, V.; Heimbach, F.; Hommen, U.; Thulke, H.H.; Brink, van den P.J.

    2010-01-01

    Ecological Models for Regulatory Risk Assessments of Pesticides: Developing a Strategy for the Future provides a coherent, science-based view on ecological modeling for regulatory risk assessments. It discusses the benefits of modeling in the context of registrations, identifies the obstacles that

  4. Knowledge-based software design for Defense-in-Depth risk monitor system and application for AP1000

    International Nuclear Information System (INIS)

    Ma Zhanguo; Yoshikawa, Hidekazu; Yang Ming; Nakagawa, Takashi

    2017-01-01

    As part of the new risk monitor system, the software for the plant Defense-in-Depth (DiD) risk monitor system was designed based on state-transition and finite-state machine concepts, and the knowledge-based software was then developed with an object-oriented method utilizing the Unified Modeling Language (UML). Currently, the developed plant DiD risk monitor software provides two main functions: a knowledge-base editor, which is used to model the system in a hierarchical manner, and an interaction simulator, which simulates the interactions between the different actors in the model. In this paper, a model that plays out its own behavior is called an Actor and is modeled at the top level. The passive-safety AP1000 power plant was studied, and the small-break loss-of-coolant accident (SBLOCA) design basis accident transient was modeled using the plant DiD risk monitor software. Furthermore, the simulation result is shown for the interactions between the actors defined in the plant DiD risk monitor system as the PLANT actor, OPERATOR actor, and SUPERVISOR actor. This paper shows that it is feasible to model the nuclear power plant knowledge base using this software modeling technique. The software can build a large knowledge base for a nuclear power plant with little effort. (author)

  5. A spatially-based modeling framework for assessing the risks of soil-associated metals to bats

    International Nuclear Information System (INIS)

    Hernout, Béatrice V.; Somerwill, Kate E.; Arnold, Kathryn E.; McClean, Colin J.; Boxall, Alistair B.A.

    2013-01-01

    Populations of some species of bats are declining in some regions of Europe. These declines are probably due to a range of pressures, including climate change, urbanization and exposure to toxins such as metals. This paper describes the development, parameterisation and application of a spatially explicit modeling framework to predict the risks of soil-associated metals (lead, copper, zinc and cadmium) to bat health. Around 5.9% of areas where bats reside were predicted to have lead levels that pose a risk to bat health. For copper, this value was 2.8%, for cadmium it was 0.6% and for zinc 0.5%. Further work is therefore warranted to explore the impacts of soil-associated metals on bat populations in the UK. - Highlights: ► A modeling framework is presented to estimate risks of contaminants to wildlife. ► The model has been applied to soil metal contamination and bat species. ► Results indicate that lead and copper pose the greatest risk to bat health. ► A risk is predicted for up to 6% of areas where bats reside in England and Wales. - Application of a novel, spatially explicit risk assessment framework indicates that the health of insectivorous bat species in some regions of the UK may be at threat from exposure to soil-associated metals.

  6. Data base of accident and agricultural statistics for transportation risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs.

  7. Data base of accident and agricultural statistics for transportation risk assessment

    International Nuclear Information System (INIS)

    Saricks, C.L.; Williams, R.G.; Hopf, M.R.

    1989-11-01

    A state-level data base of accident and agricultural statistics has been developed to support risk assessment for transportation of spent nuclear fuels and high-level radioactive wastes. This data base will enhance the modeling capabilities for more route-specific analyses of potential risks associated with transportation of these wastes to a disposal site. The data base and methodology used to develop state-specific accident and agricultural data bases are described, and summaries of accident and agricultural statistics are provided. 27 refs., 9 tabs

  8. Modeling a Theory-Based Approach to Examine the Influence of Neurocognitive Impairment on HIV Risk Reduction Behaviors Among Drug Users in Treatment.

    Science.gov (United States)

    Huedo-Medina, Tania B; Shrestha, Roman; Copenhaver, Michael

    2016-08-01

    Although it is well established that people who use drugs (PWUDs) are characterized by significant neurocognitive impairment (NCI), there has been no examination of how NCI may impede one's ability to accrue the expected HIV prevention benefits stemming from an otherwise efficacious intervention. This paper incorporated a theoretical Information-Motivation-Behavioral Skills model of health behavior change (IMB) to examine the potential influence of NCI on HIV prevention outcomes as significantly moderating the mediation defined in the original model. The analysis included 304 HIV-negative opioid-dependent individuals enrolled in community-based methadone maintenance treatment who reported drug- and/or sex-related HIV risk behaviors in the past 6 months. Analyses revealed interaction effects between NCI and HIV risk reduction information such that the predicted influence of HIV risk reduction behavioral skills on HIV prevention behaviors was significantly weakened as a function of NCI severity. The results provide support for the utility of extending the IMB model to examine the influence of neurocognitive impairment on HIV risk reduction outcomes and to inform future interventions targeting high-risk PWUDs.

  9. Model-free and model-based reward prediction errors in EEG.

    Science.gov (United States)

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Construction of an Early Risk Warning Model of Organizational Resilience: An Empirical Study Based on Samples of R&D Teams

    Directory of Open Access Journals (Sweden)

    Si-hua Chen

    2016-01-01

    Full Text Available Facing fierce competition, it is critical for organizations to keep advantages either actively or passively. Organizational resilience is the ability of an organization to anticipate, prepare for, respond to, and adapt to incremental change and sudden disruptions in order to survive and prosper. It is of particular importance for enterprises to apprehend the intensity of organizational resilience and thereby judge their abilities to withstand pressure. By conducting an exploratory factor analysis and a confirmatory factor analysis, this paper clarifies a five-factor model for organizational resilience of R&D teams. Moreover, based on it, this paper applies fuzzy integrated evaluation method to build an early risk warning model for organizational resilience of R&D teams. The application of the model to a company shows that the model can adequately evaluate the intensity of organizational resilience of R&D teams. The results are also supposed to contribute to applied early risk warning theory.

  11. High-risk regions and outbreak modelling of tularemia in humans.

    Science.gov (United States)

    Desvars-Larrive, A; Liu, X; Hjertqvist, M; Sjöstedt, A; Johansson, A; Rydén, P

    2017-02-01

    Sweden reports large and variable numbers of human tularemia cases, but the high-risk regions are anecdotally defined and factors explaining annual variations are poorly understood. Here, high-risk regions were identified by spatial cluster analysis on disease surveillance data for 1984-2012. Negative binomial regression with five previously validated predictors (including predicted mosquito abundance and predictors based on local weather data) was used to model the annual number of tularemia cases within the high-risk regions. Seven high-risk regions were identified with annual incidences of 3·8-44 cases/100 000 inhabitants, accounting for 56·4% of the tularemia cases but only 9·3% of Sweden's population. For all high-risk regions, most cases occurred between July and September. The regression models explained the annual variation of tularemia cases within most high-risk regions and discriminated between years with and without outbreaks. In conclusion, tularemia in Sweden is concentrated in a few high-risk regions and shows high annual and seasonal variations. We present reproducible methods for identifying tularemia high-risk regions and modelling tularemia cases within these regions. The results may help health authorities to target populations at risk and lay the foundation for developing an early warning system for outbreaks.
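
    A minimal sketch of the modelling step described above, fitted with statsmodels: annual case counts in one high-risk region are regressed on weather-type predictors using a negative binomial GLM. The data are synthetic, and the predictor names (mosquito_index, summer_temp) are invented stand-ins for the five validated predictors.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Synthetic stand-in for 29 years of annual surveillance data within one high-risk region.
      rng = np.random.default_rng(0)
      years = 29
      df = pd.DataFrame({
          "mosquito_index": rng.normal(0, 1, years),   # hypothetical predicted mosquito abundance
          "summer_temp": rng.normal(0, 1, years),      # hypothetical local weather predictor
      })
      rate = np.exp(1.0 + 0.8 * df["mosquito_index"] + 0.3 * df["summer_temp"])
      df["cases"] = rng.negative_binomial(n=5, p=5 / (5 + rate))

      X = sm.add_constant(df[["mosquito_index", "summer_temp"]])
      model = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial(alpha=0.2))
      result = model.fit()
      print(result.summary())
      # Predicted annual counts can then be compared against an outbreak threshold.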

  12. Violent reinjury risk assessment instrument (VRRAI) for hospital-based violence intervention programs.

    Science.gov (United States)

    Kramer, Erik J; Dodington, James; Hunt, Ava; Henderson, Terrell; Nwabuo, Adaobi; Dicker, Rochelle; Juillard, Catherine

    2017-09-01

    Violent injury is the second most common cause of death among 15- to 24-year-olds in the US. Up to 58% of violently injured youth return to the hospital with a second violent injury. Hospital-based violence intervention programs (HVIPs) have been shown to reduce injury recidivism through intensive case management. However, no validated guidelines for risk assessment strategies in the HVIP setting have been reported. We aimed to use qualitative methods to investigate the key components of risk assessments employed by HVIP case managers and to propose a risk assessment model based on this qualitative analysis. An established academic hospital-affiliated HVIP served as the nexus for this research. Thematic saturation was reached with 11 semi-structured interviews and two focus groups conducted with HVIP case managers and key informants identified through snowball sampling. Interactions were analyzed by a four-member team using Nvivo 10, employing the constant comparison method. The risk factors identified were used to create a set of models, which were presented in two follow-up focus groups with HVIP case managers and leadership. Eighteen key themes within seven domains (environment, identity, mental health, behavior, conflict, indicators of lower risk, and case management) and 141 potential risk factors for use in the risk assessment framework were identified. The most salient factors were incorporated into eight models that were presented to the HVIP case managers. A 29-item algorithmic structured professional judgment model was chosen. We identified four tiers of risk factors for violent reinjury that were incorporated into a proposed risk assessment instrument, VRRAI. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Risk-Based Operation and Maintenance Using Bayesian Networks

    DEFF Research Database (Denmark)

    Nielsen, Jannie Jessen; Sørensen, John Dalsgaard

    2011-01-01

    This paper describes how risk-based decision making can be used for maintenance planning of components exposed to degradation such as fatigue in offshore wind turbines. In fatigue models, large epistemic uncertainties are usually present. These can be reduced if monitoring results are used to upd...

  14. Risk-based optimization of land reclamation

    International Nuclear Information System (INIS)

    Lendering, K.T.; Jonkman, S.N.; Gelder, P.H.A.J.M. van; Peters, D.J.

    2015-01-01

    Large-scale land reclamations are generally constructed by means of a landfill well above mean sea level. This can be costly in areas where good quality fill material is scarce. An alternative to save materials and costs is a ‘polder terminal’. The quay wall acts as a flood defense and the terminal level is well below the level of the quay wall. Compared with a conventional terminal, the costs are lower, but an additional flood risk is introduced. In this paper, a risk-based optimization is developed for a conventional and a polder terminal. It considers the investment and residual flood risk. The method takes into account both the quay wall and terminal level, which determine the probability and damage of flooding. The optimal quay wall level is found by solving a Lambert function numerically. The terminal level is bounded by engineering boundary conditions, i.e. piping and uplift of the cover layer of the terminal yard. It is found that, for a representative case study, the saving of reclamation costs for a polder terminal is larger than the increase of flood risk. The model is applicable to other cases of land reclamation and to similar optimization problems in flood risk management. - Highlights: • A polder terminal can be an attractive alternative for a conventional terminal. • A polder terminal is feasible at locations with high reclamation cost. • A risk-based approach is required to determine the optimal protection levels. • The depth of the polder terminal yard is bounded by uplifting of the cover layer. • This paper can support decisions regarding alternatives for port expansions.
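
    For intuition on the risk-based optimization described above, the sketch below balances investment cost against capitalised expected flood damage when choosing a quay wall level, using an exponential exceedance-probability model. All cost and probability figures are placeholder assumptions, and the single-variable numerical optimisation stands in for the paper's analytical (Lambert-function) solution.

      import numpy as np
      from scipy.optimize import minimize_scalar

      # Placeholder economics for a risk-based choice of quay wall level h (m above datum).
      I0, I_per_m = 5.0e6, 2.0e6        # fixed and per-metre investment cost (assumed)
      P0, B = 0.05, 0.4                  # exceedance probability at datum and decay scale (assumed)
      damage, rate = 4.0e8, 0.04         # flood damage and discount rate (assumed)


      def total_cost(h):
          """Investment plus capitalised expected annual flood damage."""
          annual_exceedance = P0 * np.exp(-h / B)
          return I0 + I_per_m * h + annual_exceedance * damage / rate


      opt = minimize_scalar(total_cost, bounds=(0.0, 10.0), method="bounded")
      print(f"optimal quay level ~ {opt.x:.2f} m, total cost ~ {total_cost(opt.x):.3e}")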

  15. A threat-vulnerability based risk analysis model for cyber physical system security

    CSIR Research Space (South Africa)

    Ledwaba, Lehlogonolo

    2017-01-01

    Full Text Available model. An analysis of the Natanz system shows that, with an actual case security-risk score at Mitigation level 5, the infested facilities barely avoided a situation worse than the one which occurred. The paper concludes with a discussion on the need...

  16. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    Science.gov (United States)

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

    Toxicity of heavy metals from industrialization poses a critical concern, and analysis of the sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently for the whole region and its sub-regions can provide more instructive information to protect specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. The larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source, and the sensitive population at high risk. The smaller-scale grids and their spatial codes are used to identify the contribution of various sources of pollution to each sub-region (larger grid) and to assess the health risks posed by each source for each sub-region. The results of the case study show that, for children (a sensitive population, with school and residential areas as their major regions of activity), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emissions, and agricultural activity. The new models and results of this research provide effective spatial information and a useful model for quantifying the hazards that source categories pose to human health at complex industrial sites in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. A Hierarchal Risk Assessment Model Using the Evidential Reasoning Rule

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Ji

    2017-02-01

    Full Text Available This paper aims to develop a hierarchical risk assessment model using the newly-developed evidential reasoning (ER) rule, which constitutes a generic conjunctive probabilistic reasoning process. In this paper, we first provide a brief introduction to the basics of the ER rule and emphasize its strengths in representing and aggregating uncertain information from multiple experts and sources. Further, we discuss the key steps of developing the hierarchical risk assessment framework systematically, including (1) formulation of the risk assessment hierarchy; (2) representation of both qualitative and quantitative information; (3) elicitation of attribute weights and information reliabilities; (4) aggregation of assessment information using the ER rule; and (5) quantification and ranking of risks using utility-based transformation. The proposed hierarchical risk assessment framework can potentially be implemented in various complex and uncertain systems. A case study on the fire/explosion risk assessment of marine vessels demonstrates the applicability of the proposed risk assessment model.

  18. People's Risk Recognition Preceding Evacuation and Its Role in Demand Modeling and Planning.

    Science.gov (United States)

    Urata, Junji; Pel, Adam J

    2018-05-01

    Evacuation planning and management involves estimating the travel demand in the event that such action is required. This is usually done as a function of people's decision to evacuate, which we show is strongly linked to their risk awareness. We use an empirical data set, which shows tsunami evacuation behavior, to demonstrate that risk recognition is not synonymous with objective risk, but is instead determined by a combination of factors including risk education, information, and sociodemographics, and that it changes dynamically over time. Based on these findings, we formulate an ordered logit model to describe risk recognition combined with a latent class model to describe evacuation choices. Our proposed evacuation choice model along with a risk recognition class can evaluate quantitatively the influence of disaster mitigation measures, risk education, and risk information. The results obtained from the risk recognition model show that risk information has a greater impact in the sense that people recognize their high risk. The results of the evacuation choice model show that people who are unaware of their risk take a longer time to evacuate. © 2017 Society for Risk Analysis.

  19. Screening for gestational diabetes mellitus by a model based on risk indicators

    DEFF Research Database (Denmark)

    Jensen, Dorte Møller; Mølsted-Pedersen, Lars; Beck-Nielsen, Henning

    2003-01-01

    OBJECTIVE: This study was performed to prospectively evaluate a screening model for gestational diabetes mellitus on the basis of clinical risk indicators. STUDY DESIGN: In a prospective multicenter study with 5235 consecutive pregnant women, diagnostic testing with a 2-hour 75-g oral glucose...... of the results from tested women to the whole group in question, a 2.4% prevalence of gestational diabetes mellitus was calculated. Sensitivity and specificity of the model was 80.6 (73.7-87.6) and 64.8 (63.5-66.1), respectively (95% CIs). CONCLUSION: Under ideal conditions, sensitivity of the model...

  20. A Knowledge-Base for a Personalized Infectious Disease Risk Prediction System.

    Science.gov (United States)

    Vinarti, Retno; Hederman, Lucy

    2018-01-01

    We present a knowledge-base to represent collated infectious disease risk (IDR) knowledge. The knowledge is about personal and contextual risk of contracting an infectious disease obtained from declarative sources (e.g. Atlas of Human Infectious Diseases). Automated prediction requires encoding this knowledge in a form that can produce risk probabilities (e.g. Bayesian Network - BN). The knowledge-base presented in this paper feeds an algorithm that can auto-generate the BN. The knowledge from 234 infectious diseases was compiled. From this compilation, we designed an ontology and five rule types for modelling IDR knowledge in general. The evaluation aims to assess whether the knowledge-base structure, and its application to three disease-country contexts, meets the needs of personalized IDR prediction system. From the evaluation results, the knowledge-base conforms to the system's purpose: personalization of infectious disease risk.

  1. Methodological issues in cardiovascular epidemiology: the risk of determining absolute risk through statistical models

    Directory of Open Access Journals (Sweden)

    Demosthenes B Panagiotakos

    2006-09-01

    Full Text Available Demosthenes B Panagiotakos, Vassilis Stavrinos, Office of Biostatistics, Epidemiology, Department of Dietetics, Nutrition, Harokopio University, Athens, Greece. Abstract: In recent years there has been increasing interest in the development of cardiovascular disease functions that predict future events at the individual level. However, this effort has so far not been very successful, since several investigators have reported large differences in the estimation of absolute risk among different populations. For example, it seems that predictive models derived from US or north European populations overestimate the incidence of cardiovascular events in south European and Japanese populations. A potential explanation could be attributed to several factors, such as geographical, cultural, social, behavioral, and genetic variations between the investigated populations, in addition to various methodological and statistical issues relating to the estimation of these predictive models. Based on the current literature it can be concluded that, while risk prediction of future cardiovascular events is a useful tool and might be valuable in controlling the burden of the disease in a population, further work is required to improve the accuracy of the present predictive models. Keywords: cardiovascular disease, risk, models

  2. Models of Credit Risk Measurement

    OpenAIRE

    Hagiu Alina

    2011-01-01

    Credit risk is defined as the risk of financial loss caused by the failure of a counterparty. According to statistics, credit risk is far more important than market risk for financial institutions, and poor diversification of credit risk is the main cause of bank failures. Only recently did the banking industry begin to measure credit risk in a portfolio context, alongside the development of risk management that started with value-at-risk (VaR) models. Once measured, credit risk can be diversif...

  3. Models for assessing and managing credit risk

    Directory of Open Access Journals (Sweden)

    Neogradi Slađana

    2014-01-01

    Full Text Available This essay deals with the definition of a model for assessing and managing credit risk. Risk is an inseparable component of any ordinary credit transaction. The essay looks at different aspects of the identification and classification of risk in the banking industry, as well as the key components of modern risk management. The first part analyzes the impact of credit risk on banks and presents empirical models for detecting the financial difficulties a company may face; on the basis of these models, a bank can reduce the number of approved risky assets. The second part considers models for managing credit risk, with emphasis on Basel I, II and III, and the third part concludes with the model that is most appropriate and gives the best results for measuring credit risk in domestic banks.

  4. A probabilistic topic model for clinical risk stratification from electronic health records.

    Science.gov (United States)

    Huang, Zhengxing; Dong, Wei; Duan, Huilong

    2015-12-01

    Risk stratification aims to provide physicians with the accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner, and, at the cohort level, often offer little insight beyond a flat score-based segmentation from the labeled clinical dataset. To this end, in this paper, we propose a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, this paper proposes a novel probabilistic topic modeling framework called probabilistic risk stratification model (PRSM) based on Latent Dirichlet Allocation (LDA). The proposed PRSM recognizes a patient clinical state as a probabilistic combination of latent sub-profiles, and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The achieved stratification results can be easily recognized as high-, medium- and low-risk, respectively. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM) by incorporating minimum prior information into the model, in order to improve the risk stratification accuracy, and to make our models highly portable to risk stratification tasks of various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, i.e., logistic regression and support vector machine, and showed the effectiveness of our models in risk stratification of CHD in terms of the Area Under the receiver operating characteristic Curve (AUC) analysis. As well, in comparison with PRSM, WS-PRSM has over 2% performance gain, on the experimental dataset, demonstrating that
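
    The unsupervised core of such an approach can be sketched as below: fit LDA to a patient-by-clinical-feature count matrix and read each patient's state as a mixture of latent sub-profiles. The data are random, and the simple quantile-based tiering rule is a stand-in rather than the PRSM formulation itself.

      import numpy as np
      from sklearn.decomposition import LatentDirichletAllocation

      rng = np.random.default_rng(42)
      # Rows = patients, columns = counts of clinical events/codes extracted from EHRs (synthetic).
      counts = rng.poisson(lam=1.0, size=(200, 50))

      lda = LatentDirichletAllocation(n_components=3, random_state=0)
      sub_profiles = lda.fit_transform(counts)       # each patient as a mixture of 3 latent sub-profiles

      # Stand-in tiering: rank patients by their weight on one sub-profile and cut into thirds.
      weights = sub_profiles[:, 0]
      tiers = np.digitize(weights, np.quantile(weights, [1 / 3, 2 / 3]))
      labels = np.array(["low", "medium", "high"])[tiers]
      print(labels[:10])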

  5. Risk prediction model for knee pain in the Nottingham community: a Bayesian modelling approach.

    Science.gov (United States)

    Fernandes, G S; Bhattacharya, A; McWilliams, D F; Ingham, S L; Doherty, M; Zhang, W

    2017-03-20

    Twenty-five percent of the British population over the age of 50 years experiences knee pain. Knee pain can limit physical ability, cause distress, and bears significant socioeconomic costs. The objectives of this study were to develop the first risk prediction model for incident knee pain in the Nottingham community and to validate it internally within the Nottingham cohort and externally within the Osteoarthritis Initiative (OAI) cohort. A total of 1822 participants from the Nottingham community who were at risk for knee pain were followed for 12 years. Of this cohort, two-thirds (n = 1203) were used to develop the risk prediction model, and one-third (n = 619) were used to validate the model. Incident knee pain was defined as pain on most days for at least 1 month in the past 12 months. Predictors were age, sex, body mass index, pain elsewhere, prior knee injury and knee alignment. A Bayesian logistic regression model was used to determine the probability of an OR >1. The Hosmer-Lemeshow χ2 statistic (HLS) was used for calibration, and ROC curve analysis was used for discrimination. The OAI cohort from the United States was also used to examine the performance of the model. A risk prediction model for knee pain incidence was developed using a Bayesian approach. The model had good calibration, with an HLS of 7.17 (p = 0.52) and moderate discriminative ability (ROC 0.70) in the community. Individual scenarios are given using the model. However, the model had poor calibration in the external OAI cohort (HLS 5866.28). This is the first risk prediction model for knee pain, regardless of underlying structural changes of knee osteoarthritis, developed in the community using a Bayesian modelling approach. The model appears to work well in a community-based population but not in individuals with a higher risk for knee osteoarthritis, and it may provide a convenient tool for use in primary care to predict the risk of knee pain in the general population.
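
    Since calibration here is judged with the Hosmer-Lemeshow statistic, a small sketch of that check may be helpful: predicted probabilities are split into decile groups and observed versus expected event counts are compared with a chi-square statistic. The predictions and outcomes below are synthetic.

      import numpy as np
      from scipy.stats import chi2

      def hosmer_lemeshow(y_true, y_prob, groups=10):
          """Hosmer-Lemeshow chi-square statistic over decile groups of predicted risk."""
          order = np.argsort(y_prob)
          y_true, y_prob = np.asarray(y_true)[order], np.asarray(y_prob)[order]
          chunks = np.array_split(np.arange(len(y_prob)), groups)
          stat = 0.0
          for idx in chunks:
              obs, exp, n = y_true[idx].sum(), y_prob[idx].sum(), len(idx)
              stat += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
          p_value = chi2.sf(stat, df=groups - 2)
          return stat, p_value

      rng = np.random.default_rng(1)
      p = rng.uniform(0.05, 0.6, 1000)   # synthetic predicted risks
      y = rng.binomial(1, p)             # outcomes drawn from those risks, i.e. well calibrated
      print(hosmer_lemeshow(y, p))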

  6. A three-gene expression signature model for risk stratification of patients with neuroblastoma.

    Science.gov (United States)

    Garcia, Idoia; Mayol, Gemma; Ríos, José; Domenech, Gema; Cheung, Nai-Kong V; Oberthuer, André; Fischer, Matthias; Maris, John M; Brodeur, Garrett M; Hero, Barbara; Rodríguez, Eva; Suñol, Mariona; Galvan, Patricia; de Torres, Carmen; Mora, Jaume; Lavarino, Cinzia

    2012-04-01

    Neuroblastoma is an embryonal tumor with contrasting clinical courses. Despite elaborate stratification strategies, precise clinical risk assessment still remains a challenge. The purpose of this study was to develop a PCR-based predictor model to improve clinical risk assessment of patients with neuroblastoma. The model was developed using real-time PCR gene expression data from 96 samples and tested on separate expression data sets obtained from real-time PCR and microarray studies comprising 362 patients. On the basis of our prior study of differentially expressed genes in favorable and unfavorable neuroblastoma subgroups, we identified three genes, CHD5, PAFAH1B1, and NME1, strongly associated with patient outcome. The expression pattern of these genes was used to develop a PCR-based single-score predictor model. The model discriminated patients into two groups with significantly different clinical outcomes [set 1: 5-year overall survival (OS): 0.93 ± 0.03 vs. 0.53 ± 0.06; 5-year event-free survival (EFS): 0.85 ± 0.04 vs. 0.042 ± 0.06]. The model was an independent marker for survival and robustly classified patients in the total cohort and in different clinically relevant risk subgroups. We propose, for the first time in neuroblastoma, a technically simple PCR-based predictor model that could help refine current risk stratification systems. ©2012 AACR.

  7. Risk-based classification system of nanomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Tervonen, Tommi, E-mail: t.p.tervonen@rug.n [University of Groningen, Faculty of Economics and Business (Netherlands); Linkov, Igor, E-mail: igor.linkov@usace.army.mi [US Army Research and Development Center (United States); Figueira, Jose Rui, E-mail: figueira@ist.utl.p [Technical University of Lisbon, CEG-IST, Centre for Management Studies, Instituto Superior Tecnico (Portugal); Steevens, Jeffery, E-mail: jeffery.a.steevens@usace.army.mil; Chappell, Mark, E-mail: mark.a.chappell@usace.army.mi [US Army Research and Development Center (United States); Merad, Myriam, E-mail: myriam.merad@ineris.f [INERIS BP 2, Societal Management of Risks Unit/Accidental Risks Division (France)

    2009-05-15

    Various stakeholders are increasingly interested in the potential toxicity and other risks associated with nanomaterials throughout the different stages of a product's life cycle (e.g., development, production, use, disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material due to variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as to promote the safe handling and use of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different ecological risk categories based on our current knowledge of nanomaterial physico-chemical characteristics, variation in produced material, and best professional judgments. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.

  8. Managing business model innovation risks - lessons for theory and practice

    DEFF Research Database (Denmark)

    Taran, Yariv; Chester Goduscheit, René; Boer, Harry

    2015-01-01

    approach, arguing from a “no risk no reward” aphorism, a sloppy implementation approach towards business model innovation may result in catastrophic, sometimes even fatal, consequences to a firm’s core business. Based on four unsuccessful business model innovation experiences, which took place in three...

  9. STakeholder-Objective Risk Model (STORM): Determining the aggregated risk of multiple contaminant hazards in groundwater well catchments

    Science.gov (United States)

    Enzenhoefer, R.; Binning, P. J.; Nowak, W.

    2015-09-01

    Risk is often defined as the product of probability, vulnerability and value. Drinking water supply from groundwater abstraction is often at risk due to multiple hazardous land use activities in the well catchment. Each hazard might or might not introduce contaminants into the subsurface at any point in time, which then affects the pumped quality upon transport through the aquifer. In such situations, estimating the overall risk is not trivial, and three key questions emerge: (1) How to aggregate the impacts from different contaminants and spill locations to an overall, cumulative impact on the value at risk? (2) How to properly account for the stochastic nature of spill events when converting the aggregated impact to a risk estimate? (3) How will the overall risk and subsequent decision making depend on stakeholder objectives, where stakeholder objectives refer to the values at risk, risk attitudes and risk metrics that can vary between stakeholders? In this study, we provide a STakeholder-Objective Risk Model (STORM) for assessing the total aggregated risk. Our concept is a quantitative, probabilistic and modular framework for simulation-based risk estimation. It rests on the source-pathway-receptor concept and mass-discharge-based aggregation of stochastically occurring spill events, accounts for uncertainties in the involved flow and transport models through Monte Carlo simulation, and can address different stakeholder objectives. We illustrate the application of STORM in a numerical test case inspired by a German drinking water catchment. As one may expect, the results depend strongly on the chosen stakeholder objectives, but they are equally sensitive to different approaches for risk aggregation across different hazards, contaminant types, and over time.
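
    A toy sketch of the kind of simulation-based aggregation described above: stochastic spill events from several hazards are sampled, their mass discharges summed at the well, and the probability of exceeding a quality standard estimated by Monte Carlo. The hazard list, pumping rate, full-mixing assumption, and standard are placeholder assumptions, not values from the study.

      import numpy as np

      rng = np.random.default_rng(7)
      N = 20_000                                  # Monte Carlo realisations

      # Three hypothetical hazards: (annual spill probability, mean mass discharge in kg/yr).
      hazards = [(0.10, 2.0), (0.05, 5.0), (0.20, 0.5)]
      pumping_rate = 1.0e6                        # m3/yr, placeholder abstraction rate
      standard = 3.0e-6                           # kg/m3, placeholder quality standard

      total_mass = np.zeros(N)
      for p_spill, mean_mass in hazards:
          occurs = rng.random(N) < p_spill
          mass = rng.lognormal(mean=np.log(mean_mass), sigma=0.5, size=N)
          total_mass += occurs * mass             # mass-discharge-based aggregation across hazards

      concentration = total_mass / pumping_rate   # fully mixed at the well (simplifying assumption)
      print("P(exceed standard) ~", (concentration > standard).mean())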

  10. Stochastic Model Predictive Fault Tolerant Control Based on Conditional Value at Risk for Wind Energy Conversion System

    Directory of Open Access Journals (Sweden)

    Yun-Tao Shi

    2018-01-01

    Full Text Available Wind energy has been drawing considerable attention in recent years. However, due to the random nature of wind and high failure rate of wind energy conversion systems (WECSs, how to implement fault-tolerant WECS control is becoming a significant issue. This paper addresses the fault-tolerant control problem of a WECS with a probable actuator fault. A new stochastic model predictive control (SMPC fault-tolerant controller with the Conditional Value at Risk (CVaR objective function is proposed in this paper. First, the Markov jump linear model is used to describe the WECS dynamics, which are affected by many stochastic factors, like the wind. The Markov jump linear model can precisely model the random WECS properties. Second, the scenario-based SMPC is used as the controller to address the control problem of the WECS. With this controller, all the possible realizations of the disturbance in prediction horizon are enumerated by scenario trees so that an uncertain SMPC problem can be transformed into a deterministic model predictive control (MPC problem. Finally, the CVaR object function is adopted to improve the fault-tolerant control performance of the SMPC controller. CVaR can provide a balance between the performance and random failure risks of the system. The Min-Max performance index is introduced to compare the fault-tolerant control performance with the proposed controller. The comparison results show that the proposed method has better fault-tolerant control performance.
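
    The CVaR objective used above can be illustrated on scenario losses: CVaR at level alpha is the expected loss in the worst (1 - alpha) tail, beyond the VaR quantile. The sketch below computes both from randomly generated placeholder scenario costs, not from a WECS simulation.

      import numpy as np

      def cvar(losses, alpha=0.95):
          """Conditional Value at Risk: expected loss in the worst (1 - alpha) tail of scenarios."""
          losses = np.sort(np.asarray(losses))
          var = np.quantile(losses, alpha)
          tail = losses[losses >= var]
          return var, tail.mean()

      rng = np.random.default_rng(3)
      scenario_losses = rng.gamma(shape=2.0, scale=1.0, size=5000)   # placeholder tracking-error costs
      var95, cvar95 = cvar(scenario_losses, alpha=0.95)
      print(f"VaR95={var95:.2f}, CVaR95={cvar95:.2f}")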

  11. Validation of a new mortality risk prediction model for people 65 years and older in northwest Russia: The Crystal risk score.

    Science.gov (United States)

    Turusheva, Anna; Frolova, Elena; Bert, Vaes; Hegendoerfer, Eralda; Degryse, Jean-Marie

    2017-07-01

    Prediction models help to make decisions about further management in clinical practice. This study aims to develop a mortality risk score based on previously identified risk predictors and to perform internal and external validations. In a population-based prospective cohort study of 611 community-dwelling individuals aged 65+ in St. Petersburg (Russia), all-cause mortality risks over 2.5 years follow-up were determined based on the results obtained from anthropometry, medical history, physical performance tests, spirometry and laboratory tests. C-statistic, risk reclassification analysis, integrated discrimination improvement analysis, decision curves analysis, internal validation and external validation were performed. Older adults were at higher risk for mortality [HR (95%CI)=4.54 (3.73-5.52)] when two or more of the following components were present: poor physical performance, low muscle mass, poor lung function, and anemia. If anemia was combined with high C-reactive protein (CRP) and high B-type natriuretic peptide (BNP) was added the HR (95%CI) was slightly higher (5.81 (4.73-7.14)) even after adjusting for age, sex and comorbidities. Our models were validated in an external population of adults 80+. The extended model had a better predictive capacity for cardiovascular mortality [HR (95%CI)=5.05 (2.23-11.44)] compared to the baseline model [HR (95%CI)=2.17 (1.18-4.00)] in the external population. We developed and validated a new risk prediction score that may be used to identify older adults at higher risk for mortality in Russia. Additional studies need to determine which targeted interventions improve the outcomes of these at-risk individuals. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Case studies: Risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1987-01-01

    The SOCRATES computer program uses the results of a Probabilistic Risk Assessment (PRA) or a system level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at a plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns with no adverse impacts on risk. Three summaries of case study applications are included to demonstrate the types of results that can be achieved through risk-based evaluation of technical specifications. (orig.)

  13. A model for the optimal risk management of (farm) firms

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    Current methods of risk management focus on efficiency and do not provide operational answers to the basic question of how to optimise and balance the two objectives, maximisation of expected income and minimisation of risk. This paper uses the Capital Asset Pricing Model (CAPM) to derive...... an operational criterion for the optimal risk management of firms. The criterion assumes that the objective of the firm manager is to maximise the market value of the firm and is based on the condition that the application of risk management tools has a symmetric effect on the variability of income around...... the mean. The criterion is based on the expected consequences of risk management on relative changes in the variance of return on equity and expected income. The paper demonstrates how the criterion may be used to evaluate and compare the effect of different risk management tools, and it illustrates how...

  14. Development and Validation of a Prediction Model to Estimate Individual Risk of Pancreatic Cancer.

    Science.gov (United States)

    Yu, Ami; Woo, Sang Myung; Joo, Jungnam; Yang, Hye-Ryung; Lee, Woo Jin; Park, Sang-Jae; Nam, Byung-Ho

    2016-01-01

    There is no reliable screening tool to identify people with high risk of developing pancreatic cancer even though pancreatic cancer represents the fifth-leading cause of cancer-related death in Korea. The goal of this study was to develop an individualized risk prediction model that can be used to screen for asymptomatic pancreatic cancer in Korean men and women. Gender-specific risk prediction models for pancreatic cancer were developed using the Cox proportional hazards model based on an 8-year follow-up of a cohort study of 1,289,933 men and 557,701 women in Korea who had biennial examinations in 1996-1997. The performance of the models was evaluated with respect to their discrimination and calibration ability based on the C-statistic and Hosmer-Lemeshow type χ2 statistic. A total of 1,634 (0.13%) men and 561 (0.10%) women were newly diagnosed with pancreatic cancer. Age, height, BMI, fasting glucose, urine glucose, smoking, and age at smoking initiation were included in the risk prediction model for men. Height, BMI, fasting glucose, urine glucose, smoking, and drinking habit were included in the risk prediction model for women. Smoking was the most significant risk factor for developing pancreatic cancer in both men and women. The risk prediction model exhibited good discrimination and calibration ability, and in external validation it had excellent prediction ability. Gender-specific risk prediction models for pancreatic cancer were developed and validated for the first time. The prediction models will be a useful tool for detecting high-risk individuals who may benefit from increased surveillance for pancreatic cancer.
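
    A hedged sketch of the modelling step, using the lifelines package to fit a Cox proportional hazards model on synthetic data; the column names loosely echo the predictors listed in the abstract, but the data, follow-up times, and any resulting coefficients are invented.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(0)
      n = 2000
      df = pd.DataFrame({
          "age": rng.normal(50, 10, n),
          "bmi": rng.normal(24, 3, n),
          "fasting_glucose": rng.normal(95, 15, n),
          "smoker": rng.integers(0, 2, n),
          "followup_years": rng.uniform(0.5, 8.0, n),        # synthetic follow-up time
          "pancreatic_cancer": rng.binomial(1, 0.01, n),     # synthetic event indicator
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="followup_years", event_col="pancreatic_cancer")
      cph.print_summary()
      # Individual risk over t years can then be read from cph.predict_survival_function(new_data).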

  15. A New Perspective on Modeling Groundwater-Driven Health Risk With Subjective Information

    Science.gov (United States)

    Ozbek, M. M.

    2003-12-01

    Fuzzy rule-based systems provide an efficient environment for the modeling of expert information in the context of risk management for groundwater contamination problems. In general, their use in the form of conditional pieces of knowledge has been either as a tool for synthesizing control laws from data (i.e., conjunction-based models), or in a knowledge representation and reasoning perspective in Artificial Intelligence (i.e., implication-based models), where only the latter may lead to coherence problems (e.g., input data that leads to logical inconsistency when added to the knowledge base). We implement a two-fold extension to an implication-based groundwater risk model (Ozbek and Pinder, 2002) including: 1) the implementation of sufficient conditions for a coherent knowledge base, and 2) the interpolation of expert statements to supplement gaps in knowledge. The original model assumes statements of public health professionals for the characterization of the exposed individual and the relation of dose and pattern of exposure to its carcinogenic effects. We demonstrate the utility of the extended model in that it: 1) identifies inconsistent statements and establishes coherence in the knowledge base, and 2) minimizes the burden of knowledge elicitation from the experts by utilizing existing knowledge in an optimal fashion.

  16. Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions

    Science.gov (United States)

    Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei

    2015-01-01

    Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019

  17. Risk-based and deterministic regulation

    International Nuclear Information System (INIS)

    Fischer, L.E.; Brown, N.W.

    1995-07-01

    Both risk-based and deterministic methods are used for regulating the nuclear industry to protect the public safety and health from undue risk. The deterministic method is one where performance standards are specified for each kind of nuclear system or facility. The deterministic performance standards address normal operations and design basis events which include transient and accident conditions. The risk-based method uses probabilistic risk assessment methods to supplement the deterministic one by (1) addressing all possible events (including those beyond the design basis events), (2) using a systematic, logical process for identifying and evaluating accidents, and (3) considering alternative means to reduce accident frequency and/or consequences. Although both deterministic and risk-based methods have been successfully applied, there is need for a better understanding of their applications and supportive roles. This paper describes the relationship between the two methods and how they are used to develop and assess regulations in the nuclear industry. Preliminary guidance is suggested for determining the need for using risk based methods to supplement deterministic ones. However, it is recommended that more detailed guidance and criteria be developed for this purpose

  18. Melanoma Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing melanoma cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  19. On pseudo-values for regression analysis in competing risks models

    DEFF Research Database (Denmark)

    Graw, F; Gerds, Thomas Alexander; Schumacher, M

    2009-01-01

    For regression on state and transition probabilities in multi-state models Andersen et al. (Biometrika 90:15-27, 2003) propose a technique based on jackknife pseudo-values. In this article we analyze the pseudo-values suggested for competing risks models and prove some conjectures regarding their...

  20. Wearable-Sensor-Based Classification Models of Faller Status in Older Adults.

    Directory of Open Access Journals (Sweden)

    Jennifer Howcroft

    Full Text Available Wearable sensors have potential for quantitative, gait-based, point-of-care fall risk assessment that can be easily and quickly implemented in clinical-care and older-adult living environments. This investigation generated models for wearable-sensor-based fall-risk classification in older adults and identified the optimal sensor type, location, combination, and modelling method, for walking with and without a cognitive load task. A convenience sample of 100 older individuals (75.5 ± 6.7 years; 76 non-fallers, 24 fallers based on 6-month retrospective fall occurrence) walked 7.62 m under single-task and dual-task conditions while wearing pressure-sensing insoles and tri-axial accelerometers at the head, pelvis, and left and right shanks. Participants also completed the Activities-specific Balance Confidence scale, the Community Health Activities Model Program for Seniors questionnaire, the six-minute walk test, and ranked their fear of falling. Fall risk classification models were assessed for all sensor combinations and three model types: multi-layer perceptron neural network, naïve Bayesian, and support vector machine. The best performing model was a multi-layer perceptron neural network with input parameters from pressure-sensing insoles and head, pelvis, and left shank accelerometers (accuracy = 84%, F1 score = 0.600, MCC score = 0.521). Head sensor-based models had the best performance of the single-sensor models for single-task gait assessment. Single-task gait assessment models outperformed models based on dual-task walking or clinical assessment data. Support vector machines and neural networks were the best modelling techniques for fall risk classification. Fall risk classification models developed for point-of-care environments should be developed using support vector machines and neural networks, with a multi-sensor single-task gait assessment.
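
    A minimal sketch of the best-performing model type reported (a multi-layer perceptron classifier), implemented with scikit-learn on stand-in gait features; the feature matrix is random, so the printed scores are illustrative only.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.metrics import f1_score, matthews_corrcoef

      rng = np.random.default_rng(5)
      X = rng.normal(size=(100, 40))          # stand-in gait features from insoles/accelerometers
      y = rng.binomial(1, 0.24, size=100)     # 1 = faller (roughly 24% prevalence, as in the study)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
      clf = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
      clf.fit(X_tr, y_tr)
      pred = clf.predict(X_te)
      print("F1:", f1_score(y_te, pred), "MCC:", matthews_corrcoef(y_te, pred))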

  1. Risk-Based Sampling: I Don't Want to Weight in Vain.

    Science.gov (United States)

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
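
    To make the portfolio analogy concrete, the sketch below compares classic minimum-variance weights with the equal-allocation heuristic on synthetic estimates of producer violation rates; with noisy estimates, the "optimal" weights can be unstable, which is the estimation-error issue the abstract highlights.

      import numpy as np

      rng = np.random.default_rng(11)
      producers = 5
      # Estimated mean violation rates and a noisy covariance matrix (synthetic estimates).
      mu = rng.uniform(0.01, 0.10, producers)
      A = rng.normal(size=(producers, producers)) * 0.01
      cov = A @ A.T + np.eye(producers) * 1e-4

      # Minimum-variance weights subject to sum(w) = 1 (classic closed form, no sign constraint).
      ones = np.ones(producers)
      inv = np.linalg.inv(cov)
      w_minvar = inv @ ones / (ones @ inv @ ones)
      w_equal = ones / producers

      for name, w in [("min-variance", w_minvar), ("equal", w_equal)]:
          print(f"{name:12s} weights={np.round(w, 3)} "
                f"expected rate={w @ mu:.4f} variance={w @ cov @ w:.6f}")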

  2. A model-based approach to preplanting risk assessment for gray leaf spot of maize.

    Science.gov (United States)

    Paul, P A; Munkvold, G P

    2004-12-01

    ABSTRACT Risk assessment models for gray leaf spot of maize, caused by Cercospora zeae-maydis, were developed using preplanting site and maize genotype data as predictors. Disease severity at the dough/dent plant growth stage was categorized into classes and used as the response variable. Logistic regression and classification and regression tree (CART) modeling approaches were used to predict severity classes as a function of planting date (PD), amount of maize soil surface residue (SR), cropping sequence, genotype maturity and gray leaf spot resistance (GLSR) ratings, and longitude (LON). Models were developed using 332 cases collected between 1998 and 2001. Thirty cases collected in 2002 were used to validate the models. Preplanting data showed a strong relationship with late-season gray leaf spot severity classes. The most important predictors were SR, PD, GLSR, and LON. Logistic regression models correctly classified 60 to 70% of the validation cases, whereas the CART models correctly classified 57 to 77% of these cases. Cases misclassified by the CART models were mostly due to overestimation, whereas the logistic regression models tended to misclassify cases by underestimation. Both the CART and logistic regression models have potential as management decision-making tools. Early quantitative assessment of gray leaf spot risk would allow more sound management decisions to be made when warranted.
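
    A sketch of the CART side of such an analysis with scikit-learn: a shallow decision tree predicts a three-level severity class from preplanting predictors. The data, category coding, and variable effects are synthetic assumptions, not the study's dataset.

      import numpy as np
      import pandas as pd
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(4)
      n = 332
      df = pd.DataFrame({
          "planting_doy": rng.integers(100, 160, n),        # planting date (day of year)
          "surface_residue_pct": rng.uniform(0, 90, n),     # maize soil surface residue
          "gls_resistance": rng.integers(1, 10, n),         # genotype gray leaf spot resistance rating
          "longitude": rng.uniform(-96, -90, n),
      })
      # Synthetic severity class (0 = low, 1 = moderate, 2 = high) loosely tied to residue and resistance.
      score = 0.03 * df["surface_residue_pct"] - 0.3 * df["gls_resistance"] + rng.normal(0, 1, n)
      df["severity_class"] = pd.cut(score, bins=[-np.inf, -1, 1, np.inf], labels=[0, 1, 2]).astype(int)

      X, y = df.drop(columns="severity_class"), df["severity_class"]
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
      print("hold-out accuracy:", tree.score(X_te, y_te))
      print(export_text(tree, feature_names=list(X.columns)))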

  3. Polytomous diagnosis of ovarian tumors as benign, borderline, primary invasive or metastatic: development and validation of standard and kernel-based risk prediction models

    Directory of Open Access Journals (Sweden)

    Testa Antonia C

    2010-10-01

    Full Text Available Abstract Background Hitherto, risk prediction models for preoperative ultrasound-based diagnosis of ovarian tumors were dichotomous (benign versus malignant). We develop and validate polytomous models (models that predict more than two events) to diagnose ovarian tumors as benign, borderline, primary invasive or metastatic invasive. The main focus is on how different types of models perform and compare. Methods A multi-center dataset containing 1066 women was used for model development and internal validation, whilst another multi-center dataset of 1938 women was used for temporal and external validation. Models were based on standard logistic regression and on penalized kernel-based algorithms (least squares support vector machines and kernel logistic regression). We used true polytomous models as well as combinations of dichotomous models based on the 'pairwise coupling' technique to produce polytomous risk estimates. Careful variable selection was performed, based largely on cross-validated c-index estimates. Model performance was assessed with the dichotomous c-index (i.e. the area under the ROC curve) and a polytomous extension, and with calibration graphs. Results For all models, between 9 and 11 predictors were selected. Internal validation was successful with polytomous c-indexes between 0.64 and 0.69. For the best model, dichotomous c-indexes were between 0.73 (primary invasive vs metastatic) and 0.96 (borderline vs metastatic). On temporal and external validation, overall discrimination performance was good with polytomous c-indexes between 0.57 and 0.64. However, discrimination between primary and metastatic invasive tumors decreased to near random levels. Standard logistic regression performed well in comparison with advanced algorithms, and combining dichotomous models performed well in comparison with true polytomous models. The best model was a combination of dichotomous logistic regression models. This model is available online.

  4. Escherichia coli pollution in a Baltic Sea lagoon: a model-based source and spatial risk assessment.

    Science.gov (United States)

    Schippmann, Bianca; Schernewski, Gerald; Gräwe, Ulf

    2013-07-01

    Tourism around the Oder (Szczecin) Lagoon, at the southern Baltic coast, has a long tradition, is an important source of income and shall be further developed. Insufficient bathing water quality and frequent beach closings, especially in the Oder river mouth, hamper tourism development. Monitoring data give only an incomplete picture of Escherichia coli (E. coli) sources, spatial transport patterns and risks, and support neither efficient bathing water quality management nor decision making. We apply a 3D ocean model and a Lagrangian particle tracking model to analyse pollution events and to obtain spatial E. coli pollution maps based on scenario simulations. Model results suggest that insufficient sewage treatment in the city of Szczecin is the major source of faecal pollution, even for beaches 20 km downstream. E. coli mortality rate and emission intensity are key parameters for concentration levels downstream. Wind and river discharge play a modifying role. Prevailing southwestern wind conditions cause E. coli transport along the eastern coast and favour high concentration levels at the beaches. Our simulations indicate that beach closings in 2006 would not have been necessary according to the new EU Bathing Water Quality Directive (2006/7/EC). The implementation of the new directive will, very likely, reduce the number of beach closings, but not the risk for summer tourists. Model results suggest that full sewage treatment in Szczecin would allow the establishment of new beaches closer to the city (north of Dabie lake). Copyright © 2013 Elsevier GmbH. All rights reserved.
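
    A toy illustration of why mortality rate and emission intensity dominate downstream concentrations in such transport models: first-order die-off applied over the travel time from the source to a beach. All numbers are hypothetical; this is not the study's 3D ocean or particle-tracking model.

    ```python
    import math

    c0 = 1.0e5                  # E. coli concentration at the source (CFU/100 ml), hypothetical
    k = 1.0                     # first-order mortality rate (1/day), hypothetical
    distance_km = 20.0          # distance from the source to a beach
    velocity_km_per_day = 5.0   # effective transport speed, hypothetical

    travel_time = distance_km / velocity_km_per_day      # days in transit
    c_beach = c0 * math.exp(-k * travel_time)             # surviving concentration at the beach
    print(f"travel time: {travel_time:.1f} d, concentration at beach: {c_beach:.0f} CFU/100 ml")
    ```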

  5. Calculating excess lifetime risk in relative risk models

    International Nuclear Information System (INIS)

    Vaeth, M.; Pierce, D.A.

    1990-01-01

    When assessing the impact of radiation exposure it is common practice to present the final conclusions in terms of excess lifetime cancer risk in a population exposed to a given dose. The present investigation is mainly a methodological study focusing on some of the major issues and uncertainties involved in calculating such excess lifetime risks and related risk projection methods. The age-constant relative risk model used in the recent analyses of the cancer mortality that was observed in the follow-up of the cohort of A-bomb survivors in Hiroshima and Nagasaki is used to describe the effect of the exposure on the cancer mortality. In this type of model the excess relative risk is constant in age-at-risk, but depends on the age-at-exposure. Calculation of excess lifetime risks usually requires rather complicated life-table computations. In this paper we propose a simple approximation to the excess lifetime risk; the validity of the approximation for low levels of exposure is justified empirically as well as theoretically. This approximation provides important guidance in understanding the influence of the various factors involved in risk projections. Among the further topics considered are the influence of a latent period, the additional problems involved in calculations of site-specific excess lifetime cancer risks, the consequences of a leveling off or a plateau in the excess relative risk, and the uncertainties involved in transferring results from one population to another. The main part of this study relates to the situation with a single, instantaneous exposure, but a brief discussion is also given of the problem with a continuous exposure at a low-dose rate
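
    A hedged sketch of the life-table computation that the proposed approximation simplifies: lifetime cancer risk is accumulated from the cancer hazard weighted by survival, with the excess relative risk applied from the age at exposure onward. The hazards and ERR value are illustrative only, not the A-bomb survivor estimates, and a full treatment would also condition on survival to the exposure age.

    ```python
    import numpy as np

    ages = np.arange(0, 101)
    h_all = 0.0001 * np.exp(0.085 * ages)      # hypothetical all-cause mortality hazard
    h_cancer = 0.2 * h_all                     # hypothetical baseline cancer mortality hazard

    age_at_exposure = 30
    err = 0.5                                  # excess relative risk at the given dose (hypothetical)

    def lifetime_cancer_risk(err_after):
        """Lifetime cancer mortality risk when the cancer hazard is multiplied by
        (1 + err_after) from the age at exposure onward."""
        mult = np.where(ages >= age_at_exposure, 1.0 + err_after, 1.0)
        hc = h_cancer * mult                            # cancer hazard including the excess
        h_tot = h_all + h_cancer * (mult - 1.0)         # total hazard including the excess
        surv = np.concatenate(([1.0], np.exp(-np.cumsum(h_tot))))[:-1]
        return np.sum(hc * surv)

    excess = lifetime_cancer_risk(err) - lifetime_cancer_risk(0.0)
    print(f"excess lifetime risk: {excess:.4f}")
    ```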

  6. The MCRA model for probabilistic single-compound and cumulative risk assessment of pesticides.

    Science.gov (United States)

    van der Voet, Hilko; de Boer, Waldo J; Kruisselbrink, Johannes W; Goedhart, Paul W; van der Heijden, Gerie W A M; Kennedy, Marc C; Boon, Polly E; van Klaveren, Jacob D

    2015-05-01

    Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. On the other hand, cumulative health effects of similar pesticides are often not taken into account. This paper describes models and a web-based software system developed in the European research project ACROPOLIS. The models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at mcra.rivm.nl. We describe the MCRA implementation of the methods as advised in the 2012 EFSA Guidance on probabilistic modelling, as well as more refined methods developed in the ACROPOLIS project. The emphasis is on cumulative assessments. Two approaches, sample-based and compound-based, are contrasted. It is shown that additional data on agricultural use of pesticides may give more realistic risk assessments. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons against the previous release of MCRA and against the standard software DEEM-FCID used by the Environmental Protection Agency in the USA. It is shown that the EFSA Guidance pessimistic model may not always give an appropriate modelling of exposure. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
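
    A rough, self-contained sketch of the kind of probabilistic acute exposure calculation that MCRA automates: Monte Carlo sampling of body weight, consumption and residue concentration, then comparison of an upper percentile with a reference dose. The distributions and the acute reference dose are hypothetical, not MCRA defaults.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    body_weight = rng.normal(70, 12, n).clip(min=30)             # kg
    consumption = rng.lognormal(mean=4.0, sigma=0.8, size=n)     # g of commodity per day
    residue = rng.lognormal(mean=-2.0, sigma=1.0, size=n)        # mg pesticide per kg commodity

    exposure = consumption / 1000.0 * residue / body_weight       # mg/kg bw per day
    arfd = 0.01                                                    # hypothetical acute reference dose

    p99_9 = np.percentile(exposure, 99.9)
    print(f"P99.9 exposure: {p99_9:.4f} mg/kg bw/day, exceeds ARfD: {p99_9 > arfd}")
    ```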

  7. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  8. Modelling and mapping spread in pest risk analysis: a generic approach

    NARCIS (Netherlands)

    Kehlenbeck, H.; Robinet, C.; Werf, van der W.; Kriticos, D.; Reynaud, P.; Baker, R.

    2012-01-01

    Assessing the likelihood and magnitude of spread is one of the cornerstones of pest risk analysis (PRA), and is usually based on qualitative expert judgment. This paper proposes a suite of simple ecological models to support risk assessors who also wish to estimate the rate and extent of spread,

  9. Risk-based methodology for USNRC inspections

    International Nuclear Information System (INIS)

    Wong, S.M.; Holahan, G.M.; Chung, J.W.; Johnson, M.R.

    1995-01-01

    This paper describes the development and trial applications of a risk-based methodology to enhance the inspection processes for US nuclear power plants. Objectives of risk-based methods to complement prescriptive engineering approaches in US Nuclear Regulatory Commission (USNRC) inspection programs are presented. Insights from time-dependent risk profiles of plant configurations from Individual Plant Evaluation (IPE) studies were integrated to develop a framework for optimizing inspection efforts in NRC regulatory initiatives. Lessons learned from NRC pilot applications of the risk-based methodology for evaluation of the effectiveness of operational risk management programs at US nuclear power plant sites are also discussed.

  10. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to the advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories, initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose.

  11. Risk-based SMA for Cubesats

    Science.gov (United States)

    Leitner, Jesse

    2016-01-01

    This presentation conveys an approach for risk-based safety and mission assurance applied to cubesats. This presentation accompanies a NASA Goddard standard in development that provides guidance for building a mission success plan for cubesats based on the risk tolerance and resources available.

  12. [The application of two occupation health risk assessment models in a wooden furniture manufacturing industry].

    Science.gov (United States)

    Wang, A H; Leng, P B; Bian, G L; Li, X H; Mao, G C; Zhang, M B

    2016-10-20

    Objective: To explore the applicability of two different occupational health risk assessment models in the wooden furniture manufacturing industry. Methods: The American EPA inhalation risk model and the ICMM occupational health risk assessment model were each used to assess occupational health risk in a small wooden furniture enterprise. Results: Protective measures and equipment against occupational disease in the plant were poor. The concentration of wood dust in the air of two workshops exceeded the occupational exposure limit (OEL), with C-TWA values of 8.9 mg/m³ and 3.6 mg/m³, respectively. According to the EPA model, workers exposed to benzene in this plant had a high risk (9.7×10⁻⁶ to 34.3×10⁻⁶) of leukemia, and workers exposed to formaldehyde had a high risk (11.4×10⁻⁶) of squamous cell carcinoma. The two ICMM tools, the standard-based matrix and the calculated risk rating, gave inconsistent results: for workers exposed to wood dust, the risk of rhinocarcinoma was very high according to the calculated risk rating but high according to the standard-based matrix, and for workers exposed to noise the risk of noise-induced deafness was unacceptable and medium, respectively. Conclusion: Both the EPA model and the ICMM model can appropriately predict and assess occupational health risk in wooden furniture manufacturing; the ICMM model, with its relatively simple operation, easily obtained evaluation parameters and comprehensive assessment of occupational-disease-inducing factors, is more suitable for wooden furniture production enterprises.
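
    For readers unfamiliar with the EPA inhalation model referred to above, the core arithmetic is a lifetime-averaged exposure concentration multiplied by an inhalation unit risk. The sketch below uses illustrative exposure parameters and an illustrative unit risk value, not the values from this study.

    ```python
    # Hypothetical occupational exposure inputs.
    ca = 0.05        # workplace benzene concentration, mg/m3 (illustrative)
    et = 8.0         # exposure time, hours/day
    ef = 250.0       # exposure frequency, days/year
    ed = 25.0        # exposure duration, years
    at = 70.0 * 365.0 * 24.0          # averaging time for carcinogens, hours

    ec = ca * et * ef * ed / at       # lifetime-averaged exposure concentration, mg/m3
    iur = 7.8e-6 * 1000.0             # illustrative unit risk per (ug/m3), converted to per (mg/m3)
    risk = ec * iur                   # excess lifetime cancer risk
    print(f"EC = {ec:.5f} mg/m3, excess lifetime cancer risk = {risk:.2e}")
    ```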

  13. Neural Network-Based Coronary Heart Disease Risk Prediction Using Feature Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jae Kwon Kim

    2017-01-01

    Full Text Available Background. Of the machine learning techniques used in predicting coronary heart disease (CHD), neural network (NN) is popularly used to improve performance accuracy. Objective. Even though NN-based systems provide meaningful results based on clinical experiments, medical experts are not satisfied with their predictive performances because NN is trained in a “black-box” style. Method. We sought to devise an NN-based prediction of CHD risk using feature correlation analysis (NN-FCA) with two stages. First, in the feature selection stage, features are ranked according to their importance in predicting CHD risk; second, in the feature correlation analysis stage, the correlations between the selected features and the output of each NN predictor are examined. Result. Of the 4146 individuals in the Korean dataset evaluated, 3031 had low CHD risk and 1115 had high CHD risk. The area under the receiver operating characteristic (ROC) curve of the proposed model (0.749 ± 0.010) was larger than that of the Framingham risk score (FRS) (0.393 ± 0.010). Conclusions. The proposed NN-FCA, which utilizes feature correlation analysis, was found to be better than FRS in terms of CHD risk prediction. Furthermore, the proposed model resulted in a larger area under the ROC curve and more accurate predictions of CHD risk in the Korean population than the FRS.
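
    A minimal sketch, not the paper's NN-FCA pipeline: train a small neural network on synthetic CHD-style data and compare its ROC AUC with a simpler logistic baseline standing in for a conventional risk score.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Synthetic data with roughly the low/high risk split reported in the abstract.
    X, y = make_classification(n_samples=4146, n_features=12, n_informative=6,
                               weights=[0.73, 0.27], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    nn = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0).fit(X_tr, y_tr)
    baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)   # stand-in for a score like FRS

    for name, model in [("neural network", nn), ("baseline", baseline)]:
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"{name}: AUC = {auc:.3f}")
    ```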

  14. Combining engineering and data-driven approaches: Development of a generic fire risk model facilitating calibration

    DEFF Research Database (Denmark)

    De Sanctis, G.; Fischer, K.; Kohler, J.

    2014-01-01

    Fire risk models support decision making for engineering problems under the consistent consideration of the associated uncertainties. Empirical approaches can be used for cost-benefit studies when enough data about the decision problem are available. But often the empirical approaches...... a generic risk model that is calibrated to observed fire loss data. Generic risk models assess the risk of buildings based on specific risk indicators and support risk assessment at a portfolio level. After an introduction to the principles of generic risk assessment, the focus of the present paper...... are not detailed enough. Engineering risk models, on the other hand, may be detailed but typically involve assumptions that may result in a biased risk assessment and make a cost-benefit study problematic. In two related papers it is shown how engineering and data-driven modeling can be combined by developing...

  15. Methods and Models of Market Risk Stress-Testing of the Portfolio of Financial Instruments

    Directory of Open Access Journals (Sweden)

    Alexander M. Karminsky

    2015-01-01

    Full Text Available Amid instability of financial markets and the macroeconomic situation, the need to improve banks' risk-management instruments arises. The new economic reality calls for more advanced approaches to estimating banks' vulnerability to exceptional but plausible events. Stress-testing belongs to such instruments. The paper reviews and compares models for market risk stress-testing of portfolios of different financial instruments. The topic is highly relevant because stress-testing is becoming an integral part of anti-crisis risk management amid macroeconomic instability and the appearance of new risks, together with close interest in the problem of risk aggregation. The paper outlines the notion of stress-testing, covers the goals and functions of stress-tests and the main criteria for classifying market risk stress-tests, and discusses particular aspects of scenario analysis. The novelty of the research lies in elaborating a programme of aggregated, complex, multifactor stress-testing of portfolio risk based on scenario analysis. The paper surveys modern Russian and foreign stress-testing models, both on a solo basis and on a complex (aggregated) basis, and lays emphasis on the results of stress-testing and the revaluation of positions for all three complex models: the Central Bank's methodology for stress-testing portfolio risk, a model relying on correlation analysis, and a copula model. The solo-basis stress-testing models differ for each financial instrument: a parametric StressVaR model is applicable to stress-testing shares and options, a model based on the "Greek" indicators is used for options, and a regional factor model is used for Eurobonds. Finally, some theoretical recommendations on managing the market risk of the portfolio are given.
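
    The building block behind most of the surveyed models is the calculation of VaR and CVaR from a set of (possibly stressed) scenario returns. The sketch below applies a crude stress (doubled volatilities) to a hypothetical three-asset portfolio; weights, return parameters and the stress itself are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_scenarios = 100_000

    # Hypothetical three-asset portfolio with correlated normal returns, stressed volatilities.
    mu = np.array([0.0005, 0.0003, 0.0002])
    vol = np.array([0.02, 0.015, 0.01]) * 2.0          # stress scenario: volatilities doubled
    corr = np.array([[1.0, 0.6, 0.3],
                     [0.6, 1.0, 0.4],
                     [0.3, 0.4, 1.0]])
    cov = np.outer(vol, vol) * corr
    weights = np.array([0.5, 0.3, 0.2])

    returns = rng.multivariate_normal(mu, cov, n_scenarios) @ weights
    alpha = 0.99
    var = -np.quantile(returns, 1 - alpha)             # 99% Value-at-Risk (loss)
    cvar = -returns[returns <= -var].mean()            # expected loss beyond VaR
    print(f"99% VaR = {var:.4f}, 99% CVaR = {cvar:.4f}")
    ```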

  16. A comparison of models for risk assessment

    International Nuclear Information System (INIS)

    Kellerer, A.M.; Jing Chen

    1993-01-01

    Various mathematical models have been used to represent the dependence of excess cancer risk on dose, age and time since exposure. For solid cancers, i.e. all cancers except leukaemia, the so-called relative risk model is usually employed. However, there can be quite different relative risk models. The most usual model for the quantification of excess tumour rate among the atomic bomb survivors has been a dependence of the relative risk on age at exposure, but it has been shown recently that an age-attained model can be applied equally well to represent the observations among the atomic bomb survivors. The differences between the models and their implications are explained. It is also shown that the age-attained model is similar to the approaches that have been used in the analysis of lung cancer incidence among radon-exposed miners. A more unified approach to modelling of radiation risks can thus be achieved. (3 figs.)

  17. Risk Measurement and Risk Modelling Using Applications of Vine Copulas

    Directory of Open Access Journals (Sweden)

    David E. Allen

    2017-09-01

    Full Text Available This paper features an application of Regular Vine copulas, a novel and recently developed statistical and mathematical tool which can be applied in the assessment of composite financial risk. Copula-based dependence modelling is a popular tool in financial applications, but is usually applied to pairs of securities. By contrast, Vine copulas provide greater flexibility and permit the modelling of complex dependency patterns using the rich variety of bivariate copulas which may be arranged and analysed in a tree structure to explore multiple dependencies. The paper features the use of Regular Vine copulas in an analysis of the co-dependencies of 10 major European Stock Markets, as represented by individual market indices and the composite STOXX 50 index. The sample runs from 2005 to the end of 2013 to permit an exploration of how correlations change in different economic circumstances using three different sample periods: pre-GFC (January 2005–July 2007), GFC (July 2007–September 2009), and post-GFC (September 2009–December 2013). The empirical results suggest that the dependencies change in a complex manner, and are subject to change in different economic circumstances. One of the attractions of this approach to risk modelling is the flexibility in the choice of distributions used to model co-dependencies. The practical application of Regular Vine metrics is demonstrated via an example of the calculation of the VaR of a portfolio made up of the indices.

  18. Development of a cyber security risk model using Bayesian networks

    International Nuclear Information System (INIS)

    Shin, Jinsoo; Son, Hanseong; Khalil ur, Rahman; Heo, Gyunyoung

    2015-01-01

    Cyber security is an emerging safety issue in the nuclear industry, especially in the instrumentation and control (I and C) field. To address the cyber security issue systematically, a model that can be used for cyber security evaluation is required. In this work, a cyber security risk model based on a Bayesian network is suggested for evaluating cyber security for nuclear facilities in an integrated manner. The suggested model enables the evaluation of both the procedural and technical aspects of cyber security, which are related to compliance with regulatory guides and system architectures, respectively. The activity-quality analysis model was developed to evaluate how well people and/or organizations comply with the regulatory guidance associated with cyber security. The architecture analysis model was created to evaluate vulnerabilities and mitigation measures with respect to their effect on cyber security. The two models are integrated into a single model, which is called the cyber security risk model, so that cyber security can be evaluated from procedural and technical viewpoints at the same time. The model was applied to evaluate the cyber security risk of the reactor protection system (RPS) of a research reactor and to demonstrate its usefulness and feasibility. - Highlights: • We developed a cyber security risk model that can find the weak points of cyber security by integrating two cyber analysis models using a Bayesian network. • One is the activity-quality model, which signifies how well people and/or organizations comply with the cyber security regulatory guide. • The other is the architecture model, which represents the probability of a cyber-attack on the RPS architecture. • The cyber security risk model can provide evidence to determine the key elements of cyber security for the RPS of a research reactor.
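
    A toy illustration of the idea of combining a procedural (activity-quality) factor and a technical (architecture) factor in one probabilistic model; it is a hand-computed two-parent Bayesian network with invented probabilities, not the paper's model of the RPS.

    ```python
    from itertools import product

    p_good_activity = 0.7            # P(procedures comply with the regulatory guide), hypothetical
    p_vulnerable_arch = 0.3          # P(architecture has an exploitable weakness), hypothetical

    # P(attack succeeds | activity quality good?, architecture vulnerable?), hypothetical
    p_attack = {
        (True, True): 0.05,
        (True, False): 0.01,
        (False, True): 0.40,
        (False, False): 0.10,
    }

    # Marginalise over the two parent nodes to get the overall probability of a successful attack.
    p_success = 0.0
    for good, vuln in product([True, False], repeat=2):
        p_joint = (p_good_activity if good else 1 - p_good_activity) * \
                  (p_vulnerable_arch if vuln else 1 - p_vulnerable_arch)
        p_success += p_joint * p_attack[(good, vuln)]

    print(f"marginal probability of a successful cyber attack: {p_success:.3f}")
    ```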

  19. Scientific reporting is suboptimal for aspects that characterize genetic risk prediction studies: a review of published articles based on the Genetic RIsk Prediction Studies statement.

    Science.gov (United States)

    Iglesias, Adriana I; Mihaescu, Raluca; Ioannidis, John P A; Khoury, Muin J; Little, Julian; van Duijn, Cornelia M; Janssens, A Cecile J W

    2014-05-01

    Our main objective was to raise awareness of the areas that need improvements in the reporting of genetic risk prediction articles for future publications, based on the Genetic RIsk Prediction Studies (GRIPS) statement. We evaluated studies that developed or validated a prediction model based on multiple DNA variants, using empirical data, and were published in 2010. A data extraction form based on the 25 items of the GRIPS statement was created and piloted. Forty-two studies met our inclusion criteria. Overall, more than half of the evaluated items (34 of 62) were reported in at least 85% of included articles. Seventy-seven percent of the articles were identified as genetic risk prediction studies through title assessment, but only 31% used the keywords recommended by GRIPS in the title or abstract. Seventy-four percent mentioned which allele was the risk variant. Overall, only 10% of the articles reported all essential items needed to perform external validation of the risk model. Completeness of reporting in genetic risk prediction studies is adequate for general elements of study design but is suboptimal for several aspects that characterize genetic risk prediction studies such as description of the model construction. Improvements in the transparency of reporting of these aspects would facilitate the identification, replication, and application of genetic risk prediction models. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Measuring Risk Structure Using the Capital Asset Pricing Model

    Directory of Open Access Journals (Sweden)

    Zdeněk Konečný

    2015-01-01

    Full Text Available This article is aimed at proposing an innovative method for calculating the shares of operational and financial risk. This methodological tool will support managers in monitoring the risk structure. The method is based on the capital asset pricing model (CAPM) for calculating the cost of equity, namely on determination of the beta coefficient, which is the only variable that depends on entrepreneurial risk. Both alternative approaches to calculating beta are combined: accounting data are used, and unlevered beta is distinguished from levered beta. The novelty of the proposed method lies in including quantities for measuring operational and financial risk in the beta calculation. The volatility of cash flow, as a quantity for measuring operational risk, is included in the unlevered beta. Return on equity based on cash flow and indebtedness are the variables used in calculating the levered beta. This modification makes it possible to calculate the share of operational risk as the ratio of the unlevered to the levered beta, and the share of financial risk as the remainder of the levered beta. The modified method is applied to companies from two sectors of the Czech economy. The data set contains companies from one cyclical sector and one neutral sector, to find out potential differences in risk structure. The findings show that in both sectors the share of operational risk is over 50%; however, it is more dominant in the neutral sector.
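
    A numerical sketch of the proposed decomposition: the unlevered beta is taken to carry operational risk, and the remainder of the levered beta financial risk. The Hamada-style unlevering relation and all input figures below are a common textbook stand-in, not the article's modified cash-flow-based calculation.

    ```python
    # Hypothetical company figures.
    beta_levered = 1.4       # levered (equity) beta, reflecting both risk types
    tax_rate = 0.19
    debt_to_equity = 0.8

    # Hamada-style relation between levered and unlevered beta (one common choice).
    beta_unlevered = beta_levered / (1.0 + (1.0 - tax_rate) * debt_to_equity)

    share_operational = beta_unlevered / beta_levered
    share_financial = 1.0 - share_operational
    print(f"unlevered beta = {beta_unlevered:.3f}")
    print(f"operational risk share = {share_operational:.1%}, financial risk share = {share_financial:.1%}")
    ```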

  1. A mathematical model for environmental risk assessment in manufacturing industry

    Institute of Scientific and Technical Information of China (English)

    何莉萍; 徐盛明; 陈大川; 党创寅

    2002-01-01

    Environmentally conscious manufacturing has become an important issue in industry because of market pressure and environmental regulations. An environmental risk assessment model was developed based on the network analytic method and fuzzy set theory. The "interval analysis method" was applied to deal with the on-site monitoring data as basic information for assessment. In addition, fuzzy set theory was employed to allow uncertain, interactive and dynamic information to be effectively incorporated into the environmental risk assessment. This model is a simple, practical and effective tool for evaluating the environmental risk of manufacturing industry and for analyzing the relative impacts of emission wastes, which are hazardous to both human and ecosystem health. Furthermore, the model is considered useful for design engineers and decision-makers when designing and selecting processes, as it allows the costs, environmental impacts and performance of a product to be taken into consideration.

  2. MODELING CREDIT RISK THROUGH CREDIT SCORING

    OpenAIRE

    Adrian Cantemir CALIN; Oana Cristina POPOVICI

    2014-01-01

    Credit risk governs all financial transactions and it is defined as the risk of suffering a loss due to certain shifts in the credit quality of a counterpart. Credit risk literature gravitates around two main modeling approaches: the structural approach and the reduced form approach. In addition to these perspectives, credit risk assessment has been conducted through a series of techniques such as credit scoring models, which form the traditional approach. This paper examines the evolution of...

  3. Fuzzy hierarchical model for risk assessment principles, concepts, and practical applications

    CERN Document Server

    Chan, Hing Kai

    2013-01-01

    Risk management is often complicated by situational uncertainties and the subjective preferences of decision makers. Fuzzy Hierarchical Model for Risk Assessment introduces a fuzzy-based hierarchical approach to solving risk management problems that considers both qualitative and quantitative criteria to tackle imprecise information. This approach is illustrated through a number of case studies using examples from the food, fashion and electronics sectors to cover a range of applications including supply chain management, green product design and green initiatives. These practical examples explore how this method can be adapted and fine-tuned to fit other industries as well. Supported by an extensive literature review, Fuzzy Hierarchical Model for Risk Assessment comprehensively introduces a new method for project managers across all industries as well as researchers in risk management.

  4. Assessing the risk of Legionnaires' disease: the inhalation exposure model and the estimated risk in residential bathrooms.

    Science.gov (United States)

    Azuma, Kenichi; Uchiyama, Iwao; Okumura, Jiro

    2013-02-01

    Legionella are widely found in the built environment. Patients with Legionnaires' disease have been increasing in Japan; however, health risks from Legionella bacteria in the environment are not appropriately assessed. We performed a quantitative health risk assessment modeled on residential bathrooms in the Adachi outbreak area and estimated risk levels. The estimated risks in the Adachi outbreak approximately corresponded to the risk levels exponentially extrapolated into lower levels on the basis of infection and mortality rates calculated from actual outbreaks, suggesting that the model of Legionnaires' disease in residential bathrooms was adequate to predict disease risk for the evaluated outbreaks. Based on this model, the infection and mortality risk levels per year at the Japanese water quality guideline value of 10 CFU/100 ml (100 CFU/L) were approximately 10⁻² and 10⁻⁵, respectively. However, acceptable risk levels of infection and mortality from Legionnaires' disease should be adjusted to approximately 10⁻⁴ and 10⁻⁷, respectively, per year. Therefore, a reference value of 0.1 CFU/100 ml (1 CFU/L) as a water quality guideline for Legionella bacteria is recommended. This value is occasionally less than the actual detection limit. Legionella levels in water systems should be maintained as low as reasonably achievable (<1 CFU/L). Copyright © 2012 Elsevier Inc. All rights reserved.
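
    The arithmetic behind this kind of recommendation can be sketched as an exponential dose-response model annualised over repeated exposures. All parameter values below (aerosolised volume inhaled, dose-response parameter, exposure frequency) are hypothetical, chosen only so that the order of magnitude resembles the figures quoted above.

    ```python
    import math

    conc = 100.0                 # Legionella concentration, CFU per litre (= 10 CFU/100 ml)
    aerosol_water_inhaled = 1e-5 # litres of bath/shower water inhaled as aerosol per event (hypothetical)
    r = 0.06                     # exponential dose-response parameter (hypothetical)
    exposures_per_year = 365

    dose = conc * aerosol_water_inhaled                    # CFU retained per exposure
    p_event = 1.0 - math.exp(-r * dose)                    # infection risk per exposure
    p_year = 1.0 - (1.0 - p_event) ** exposures_per_year   # annualised infection risk
    print(f"per-event risk = {p_event:.2e}, annual risk = {p_year:.2e}")
    ```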

  5. Safety from physical viewpoint: ''two-risk model in multiple risk problem''

    International Nuclear Information System (INIS)

    Kuz'Min, I.I.; Akimov, V.A.

    1998-01-01

    Full text of publication follows: The problem of providing safety for people and the environment within a certain socio-economic system (SES) is discussed as a problem of managing a large number of interacting risks characterizing the numerous hazards (natural, man-made, social, economic ones, etc.) inherent in that SES. From the physical point of view, it can be considered a problem of the interaction of many bodies, which has no exact mathematical solution even when the laws of interaction of these bodies are known. In physics, this problem is approached by reducing it to the problem of two-body interaction, which can be solved exactly. The report presents a similar approach to the problem of risk management in the SES. The approach subdivides the numerous hazards inherent in the SES into two classes, so that each class can be considered an integrated whole characterized by an appropriate risk. Consequently, the 'multiple-risk' management problem (i.e. the many-body problem, as represented in physics) can be reduced to the 'two-risk' management problem (that is, to the two-body problem). Within the framework of the two-risk model, the optimization of costs to reduce the two kinds of risk is described: the risk inherent in the SES as a whole, and the risk potentially provoked by the many activities to be introduced in the SES economy. The model has made it possible to formulate and prove a theorem of equilibrium in risk management. Using the theorem, a relatively simple and practically applicable procedure for optimizing the threshold costs of reducing diverse kinds of risk has been elaborated. The procedure provides an estimate of the minimum cost that can be achieved given the socio-economic factors typical of the SES under discussion. The aimed

  6. Cabin Environment Physics Risk Model

    Science.gov (United States)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of the crew critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.

  7. Development of a prototype Typhoon Risk Model over the Korean Peninsula

    Science.gov (United States)

    Kim, K. Y.; Cocke, S.; Shin, D. W.; CHOI, M.; Kwon, J.

    2016-12-01

    Risk can be defined as the probability of a given hazard of a given level causing a particular level of loss or damage (Alexander, 2000). Risk management is important for mitigation and developing plans for emergencies. More effective risk management strategies can help reduce potential losses from natural disasters like typhoons, floods, earthquakes, and so on. We are developing a prototype typhoon risk model to assess the current and potentially future hazard due to typhoons in the Western Pacific. To develop the typhoon risk model, a variety of sources of data over Korea are used such as population, damage to buildings, agriculture, ships, etc. The model is based on proven concepts used in catastrophe models that have been used in the U.S. and other regions of the world. Recently, the sea surface temperatures where typhoons have occurred have tended to increase. According to recent studies of global warming, the intensity of typhoons could increase, and the frequency of typhoons may decrease in the future climate. The prototype risk model can help us determine the change in risk as a consequence of the change in typhoon activity. We focus on Korea and other regions of interest to Korean insurers, re-insurers, and related industries. The model can potentially be coupled to various damage models or emergency management systems for planning and mitigation. In addition, the assessment would be useful for emergency planners, coastal community planners, and private and governmental insurance programs. This work was funded by the Korea Meteorological Administration Research and Development Program under Grant KMIPA2016-8030.

  8. Risk modelling study for carotid endarterectomy.

    Science.gov (United States)

    Kuhan, G; Gardiner, E D; Abidia, A F; Chetter, I C; Renwick, P M; Johnson, B F; Wilkinson, A R; McCollum, P T

    2001-12-01

    The aims of this study were to identify factors that influence the risk of stroke or death following carotid endarterectomy (CEA) and to develop a model to aid in comparative audit of vascular surgeons and units. A series of 839 CEAs performed by four vascular surgeons between 1992 and 1999 was analysed. Multiple logistic regression analysis was used to model the effect of 15 possible risk factors on the 30-day risk of stroke or death. Outcome was compared for four surgeons and two units after adjustment for the significant risk factors. The overall 30-day stroke or death rate was 3.9 per cent (29 of 741). Heart disease, diabetes and stroke were significant risk factors. The 30-day predicted stroke or death rates increased with increasing risk scores. The observed 30-day stroke or death rate was 3.9 per cent for both vascular units and varied from 3.0 to 4.2 per cent for the four vascular surgeons. Differences in the outcomes between the surgeons and vascular units did not reach statistical significance after risk adjustment. Diabetes, heart disease and stroke are significant risk factors for stroke or death following CEA. The risk score model identified patients at higher risk and aided in comparative audit.

  9. Risk-based emergency decision support

    International Nuclear Information System (INIS)

    Koerte, Jens

    2003-01-01

    In the present paper we discuss how to assist critical decisions taken under complex, contingent circumstances, with a high degree of uncertainty and short time frames. In such sharp-end decision regimes, standard rule-based decision support systems do not capture the complexity of the situation. At the same time, traditional risk analysis is of little use due to variability in the specific circumstances. How then can an organisation provide assistance to, e.g., pilots in dealing with such emergencies? A method called 'contingent risk and decision analysis' is presented, to provide decision support for decisions under variable circumstances and short available time scales. The method consists of nine steps of definition, modelling, analysis and criteria definition to be performed 'off-line' by analysts, and procedure generation to transform the analysis result into an operational decision aid. Examples of pilots' decisions in response to sudden vibration in offshore helicopter transport are used to illustrate the approach.

  10. Modeling of Flood Risk for the Continental United States

    Science.gov (United States)

    Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.

    2011-12-01

    The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we explain the techniques and underlying assumptions of building the RMS US flood risk model. We especially pay attention to correlation (spatial and temporal), simulation and uncertainty in each of the various components in the development process. Recent extreme floods (e.g. the US Midwest flood of 2008 and the US Northeast flood of 2010) have increased concern about flood risk. Consequently, there is a growing need to adequately assess flood risk. The RMS flood hazard model comprises three major components. (1) A stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess the antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, it allows us to correlate the streamflow, and hence flooding, from different rivers, as well as low and high return periods across the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison to observations and published data. Output from

  11. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
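
    A conceptual sketch of the queue-and-scheduler idea described above: events are pushed onto a single time line and processed in order, so that the consequence of each event can depend on the state left by earlier ones. Event names, rates and the health bookkeeping are invented, not IMM content.

    ```python
    import heapq
    import random

    random.seed(0)
    MISSION_DAYS = 180.0

    queue = []
    # Sample initiating events from exponential inter-arrival times and place them on the time line.
    for name, rate_per_day in [("minor illness", 1 / 30), ("equipment fault", 1 / 90)]:
        t = random.expovariate(rate_per_day)
        while t < MISSION_DAYS:
            heapq.heappush(queue, (t, name))
            t += random.expovariate(rate_per_day)

    state = {"crew_health": 1.0}
    # Scheduler rule: later events hit harder if earlier events have already degraded the state.
    while queue:
        t, name = heapq.heappop(queue)
        impact = 0.05 if state["crew_health"] > 0.8 else 0.10
        state["crew_health"] -= impact
        print(f"day {t:6.1f}: {name}, crew health now {state['crew_health']:.2f}")
        if state["crew_health"] <= 0.5:
            print("threshold crossed: mission-level risk event")
            break
    ```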

  12. Application of adversarial risk analysis model in pricing strategies with remanufacturing

    Directory of Open Access Journals (Sweden)

    Liurui Deng

    2015-01-01

    Full Text Available Purpose: This paper mainly focuses on the application of adversarial risk analysis (ARA) to pricing strategy with remanufacturing. We hope to obtain more realistic results than the classical model, and to advance the use of ARA in the pricing strategies of manufacturing and remanufacturing. Approach: To make the research more realistic, we combine adversarial risk analysis with the Stackelberg model to explore the pricing strategy with remanufacturing. In particular, we build the OEM's first-order ARA model and further study manufacturers' and remanufacturers' pricing strategies. Findings: We obtain the OEM's first-order ARA model for the OEM's product cost C, together with the corresponding pricing strategies of manufacturers and remanufacturers. The pricing strategies based on the first-order ARA model have an advantage over the classical model for both OEMs and remanufacturers. Research implications: The research on the application of ARA implies that more realistic results can be obtained with this kind of modern risk analysis method, and that ARA can be applied extensively in the pricing strategies of supply chains. Value: Our research advances the application of ARA in the remanufacturing industry. Meanwhile, inspired by this analysis, different ARA models can be created for different parameters, and some of the results and analysis methods can be applied to other supply chain pricing strategies.

  13. Modelling Web-Based Instructional Systems

    NARCIS (Netherlands)

    Retalis, Symeon; Avgeriou, Paris

    2002-01-01

    The size and complexity of modern instructional systems, which are based on the World Wide Web, bring about great intricacy in their crafting, as there is not enough knowledge or experience in this field. This imposes the use of new instructional design models in order to achieve risk-mitigation,

  14. Risk assessment model for development of advanced age-related macular degeneration.

    Science.gov (United States)

    Klein, Michael L; Francis, Peter J; Ferris, Frederick L; Hamon, Sara C; Clemons, Traci E

    2011-12-01

    To design a risk assessment model for development of advanced age-related macular degeneration (AMD) incorporating phenotypic, demographic, environmental, and genetic risk factors. We evaluated longitudinal data from 2846 participants in the Age-Related Eye Disease Study. At baseline, these individuals had all levels of AMD, ranging from none to unilateral advanced AMD (neovascular or geographic atrophy). Follow-up averaged 9.3 years. We performed a Cox proportional hazards analysis with demographic, environmental, phenotypic, and genetic covariates and constructed a risk assessment model for development of advanced AMD. Performance of the model was evaluated using the C statistic and the Brier score and externally validated in participants in the Complications of Age-Related Macular Degeneration Prevention Trial. The final model included the following independent variables: age, smoking history, family history of AMD (first-degree member), phenotype based on a modified Age-Related Eye Disease Study simple scale score, and genetic variants CFH Y402H and ARMS2 A69S. The model did well on performance measures, with very good discrimination (C statistic = 0.872) and excellent calibration and overall performance (Brier score at 5 years = 0.08). Successful external validation was performed, and a risk assessment tool was designed for use with or without the genetic component. We constructed a risk assessment model for development of advanced AMD. The model performed well on measures of discrimination, calibration, and overall performance and was successfully externally validated. This risk assessment tool is available for online use.
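
    A hedged sketch of how a Cox proportional-hazards model of this kind is turned into an individual risk estimate, 1 − S₀(t)^exp(linear predictor). The coefficients, baseline survival and covariate coding below are invented for illustration and are not the AREDS estimates.

    ```python
    import math

    # Hypothetical log hazard ratios for predictors of the type named in the abstract.
    coefs = {
        "age_per_decade": 0.45,
        "current_smoker": 0.55,
        "family_history": 0.40,
        "simple_scale_score": 0.80,   # per step of a modified AREDS simple scale (hypothetical)
        "cfh_risk_alleles": 0.50,     # per CFH Y402H risk allele (hypothetical)
        "arms2_risk_alleles": 0.55,   # per ARMS2 A69S risk allele (hypothetical)
    }
    baseline_survival_5y = 0.97       # S0(5 years) for the reference profile (hypothetical)

    person = {"age_per_decade": 7.2, "current_smoker": 1, "family_history": 1,
              "simple_scale_score": 1, "cfh_risk_alleles": 1, "arms2_risk_alleles": 1}
    reference = {"age_per_decade": 6.5, "current_smoker": 0, "family_history": 0,
                 "simple_scale_score": 0, "cfh_risk_alleles": 0, "arms2_risk_alleles": 0}

    lp = sum(coefs[k] * (person[k] - reference[k]) for k in coefs)   # linear predictor vs reference
    risk_5y = 1.0 - baseline_survival_5y ** math.exp(lp)             # 5-year absolute risk
    print(f"estimated 5-year risk of advanced AMD: {risk_5y:.1%}")
    ```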

  15. Integrated source-risk model for radon: A definition study

    International Nuclear Information System (INIS)

    Laheij, G.M.H.; Aldenkamp, F.J.; Stoop, P.

    1993-10-01

    The purpose of a source-risk model is to support policy making on radon mitigation by comparing the effects of various policy options and to enable optimization of countermeasures applied to different parts of the source-risk chain. There are several advantages to developing and using a source-risk model: risk calculations are standardized; the effects of measures applied to different parts of the source-risk chain can be better compared because interactions are included; and sensitivity analyses can be used to determine the most important parameters within the total source-risk chain. After an inventory of processes and sources to be included in the source-risk chain, the models presently available in the Netherlands are investigated. The models were screened for completeness, validation and operational status. The investigation made clear that, by choosing for each part of the source-risk chain the most convenient model, a source-risk chain model for radon may be realized. However, the calculation of dose from the radon concentrations and the status of validation of most models should be improved. Calculations with the proposed source-risk model will, at the moment, give estimates with a large uncertainty. For further development of the source-risk model, an interaction between the source-risk model and experimental research is recommended. Organisational forms of the source-risk model are discussed. A source-risk model in which only simple models are included is also recommended. The other models are operated and administered by the model owners. The model owners execute their models for a combination of input parameters. The output of the models is stored in a database which will be used for calculations with the source-risk model. 5 figs., 15 tabs., 7 appendices, 14 refs

  16. The application of cure models in the presence of competing risks: a tool for improved risk communication in population-based cancer patient survival.

    Science.gov (United States)

    Eloranta, Sandra; Lambert, Paul C; Andersson, Therese M-L; Björkholm, Magnus; Dickman, Paul W

    2014-09-01

    Quantifying cancer patient survival from the perspective of cure is clinically relevant. However, most cure models estimate cure assuming no competing causes of death. We use a relative survival framework to demonstrate how flexible parametric cure models can be used in combination with competing-risks theory to incorporate noncancer deaths. Under a model that incorporates statistical cure, we present the probabilities that cancer patients (1) have died from their cancer, (2) have died from other causes, (3) will eventually die from their cancer, or (4) will eventually die from other causes, all as a function of time since diagnosis. We further demonstrate how conditional probabilities can be used to update the prognosis among survivors (eg, at 1 or 5 years after diagnosis) by summarizing the proportion of patients who will not die from their cancer. The proposed method is applied to Swedish population-based data for persons diagnosed with melanoma, colon cancer, or acute myeloid leukemia between 1973 and 2007.

  17. Risk prediction models for selection of lung cancer screening candidates: A retrospective validation study

    NARCIS (Netherlands)

    K. ten Haaf (Kevin); J. Jeon (Jihyoun); M.C. Tammemagi (Martin); S.S. Han (Summer); C.Y. Kong (Chung Yin); S.K. Plevritis (Sylvia); E. Feuer (Eric); H.J. de Koning (Harry); E.W. Steyerberg (Ewout W.); R. Meza (Rafael)

    2017-01-01

    textabstractBackground: Selection of candidates for lung cancer screening based on individual risk has been proposed as an alternative to criteria based on age and cumulative smoking exposure (pack-years). Nine previously established risk models were assessed for their ability to identify those most

  18. Anthropogenic factors and the risk of highly pathogenic avian influenza H5N1: prospects from a spatial-based model.

    Science.gov (United States)

    Paul, Mathilde; Tavornpanich, Saraya; Abrial, David; Gasqui, Patrick; Charras-Garrido, Myriam; Thanapongtharm, Weerapong; Xiao, Xiangming; Gilbert, Marius; Roger, Francois; Ducrot, Christian

    2010-01-01

    Beginning in 2003, highly pathogenic avian influenza (HPAI) H5N1 virus spread across Southeast Asia, causing unprecedented epidemics. Thailand was massively infected in 2004 and 2005 and continues today to experience sporadic outbreaks. While research findings suggest that the spread of HPAI H5N1 is influenced primarily by trade patterns, identifying the anthropogenic risk factors involved remains a challenge. In this study, we investigated which anthropogenic factors played a role in the risk of HPAI in Thailand using outbreak data from the "second wave" of the epidemic (3 July 2004 to 5 May 2005) in the country. We first performed a spatial analysis of the relative risk of HPAI H5N1 at the subdistrict level based on a hierarchical Bayesian model. We observed a strong spatial heterogeneity of the relative risk. We then tested a set of potential risk factors in a multivariable linear model. The results confirmed the role of free-grazing ducks and rice-cropping intensity but showed a weak association with fighting cock density. The results also revealed a set of anthropogenic factors significantly linked with the risk of HPAI. High risk was associated strongly with densely populated areas, short distances to a highway junction, and short distances to large cities. These findings highlight a new explanatory pattern for the risk of HPAI and indicate that, in addition to agro-environmental factors, anthropogenic factors play an important role in the spread of H5N1. To limit the spread of future outbreaks, efforts to control the movement of poultry products must be sustained. INRA, EDP Sciences, 2010.

  19. Stackelberg game of buyback policy in supply chain with a risk-averse retailer and a risk-averse supplier based on CVaR.

    Directory of Open Access Journals (Sweden)

    Yanju Zhou

    Full Text Available This paper considers a decentralized supply chain in which a single supplier sells a perishable product to a single retailer facing uncertain demand. We assume that the supplier and the retailer are both risk averse and utilize Conditional Value at Risk (CVaR), a risk measure popularized in financial risk management, to estimate their risk attitude. We establish a buyback policy model based on Stackelberg game theory that takes supply chain members' risk preferences into account, and obtain expressions for the supplier's optimal repurchase price and the retailer's optimal order quantity, which are compared with those in the risk-neutral case. Finally, a numerical example is applied to simulate the model and verify the related conclusions.
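
    The retailer's side of such a model can be illustrated numerically: choose the order quantity that maximises the CVaR of profit under a buyback contract. Prices, the demand distribution and the CVaR confidence level below are hypothetical, and the grid search stands in for the paper's closed-form expressions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    demand = rng.normal(100, 30, 50_000).clip(min=0)   # uncertain demand scenarios (hypothetical)

    p, w, b = 10.0, 6.0, 3.0     # retail price, wholesale price, buyback price (hypothetical)
    eta = 0.9                    # CVaR confidence level, i.e. the retailer's risk aversion

    def cvar_of_profit(q):
        sales = np.minimum(demand, q)
        profit = p * sales + b * (q - sales) - w * q     # revenue + buyback of leftovers - purchase cost
        threshold = np.quantile(profit, 1 - eta)         # lower tail of the profit distribution
        return profit[profit <= threshold].mean()        # CVaR: mean of the worst (1 - eta) outcomes

    order_grid = np.arange(40, 161)
    best_q = max(order_grid, key=cvar_of_profit)
    print(f"CVaR-optimal order quantity: {best_q}")
    ```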

  20. Stackelberg game of buyback policy in supply chain with a risk-averse retailer and a risk-averse supplier based on CVaR.

    Science.gov (United States)

    Zhou, Yanju; Chen, Qian; Chen, Xiaohong; Wang, Zongrun

    2014-01-01

    This paper considers a decentralized supply chain in which a single supplier sells a perishable product to a single retailer facing uncertain demand. We assume that the supplier and the retailer are both risk averse and utilize Conditional Value at Risk (CVaR), a risk measure popularized in financial risk management, to estimate their risk attitude. We establish a buyback policy model based on Stackelberg game theory that takes supply chain members' risk preferences into account, and obtain expressions for the supplier's optimal repurchase price and the retailer's optimal order quantity, which are compared with those in the risk-neutral case. Finally, a numerical example is applied to simulate the model and verify the related conclusions.

  1. HOW INTERNAL RISK - BASED AUDIT APPRAISES THE EVALUATION OF RISKS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    N. Dorosh

    2017-09-01

    Full Text Available The article deals with the nature and function of the internal risk-based audit and with a process approach to creating risk models and evaluation methods. It addresses the relationship between a company's level of risk maturity and the method of risk-based internal audit. It is emphasized that internal auditing provides an independent and objective opinion to an organization's management as to whether its risks are being managed to acceptable levels.

  2. A risk-based auditing process for pharmaceutical manufacturers.

    Science.gov (United States)

    Vargo, Susan; Dana, Bob; Rangavajhula, Vijaya; Rönninger, Stephan

    2014-01-01

    The purpose of this article is to share ideas on developing a risk-based model for the scheduling of audits (both internal and external). Audits are a key element of a manufacturer's quality system and provide an independent means of evaluating the manufacturer's or the supplier/vendor's compliance status. Suggestions for risk-based scheduling approaches are discussed in the article. Pharmaceutical manufacturers are required to establish and implement a quality system. The quality system is an organizational structure defining responsibilities, procedures, processes, and resources that the manufacturer has established to ensure quality throughout the manufacturing process. Audits are a component of the manufacturer's quality system and provide a systematic and an independent means of evaluating the manufacturer's overall quality system and compliance status. Audits are performed at defined intervals for a specified duration. The intention of the audit process is to focus on key areas within the quality system and may not cover all relevant areas during each audit. In this article, the authors provide suggestions for risk-based scheduling approaches to aid pharmaceutical manufacturers in identifying the key focus areas for an audit.

  3. Survey of credit risk models in relation to capital adequacy framework for financial institutions

    Directory of Open Access Journals (Sweden)

    Poomjai Nacaskul

    2016-12-01

    Full Text Available This article (i) iterates what is meant by credit risk and the mathematical-statistical modelling thereof, (ii) elaborates the conceptual and technical links between credit risk modelling and the capital adequacy framework for financial institutions, particularly as per the New Capital Accord (Basel II) Internal Ratings-Based (IRB) approach, (iii) proffers a simple and intuitive taxonomy of contemporary credit risk modelling methodologies, and (iv) discusses in some detail a number of key models pertinent, in various stages of development, to various application areas in the banking and financial sector.
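
    To make the link between credit risk model outputs and capital adequacy concrete, the sketch below evaluates the Basel II IRB risk-weight function for corporate exposures from a probability of default (PD), loss given default (LGD) and exposure at default (EAD); the formula follows the published Accord, while the input values are illustrative.

    ```python
    from math import exp, log, sqrt
    from scipy.stats import norm

    def irb_capital(pd_, lgd, ead, maturity=2.5):
        # Asset correlation decreases with PD, per the Basel II corporate formula.
        r = (0.12 * (1 - exp(-50 * pd_)) / (1 - exp(-50))
             + 0.24 * (1 - (1 - exp(-50 * pd_)) / (1 - exp(-50))))
        b = (0.11852 - 0.05478 * log(pd_)) ** 2          # maturity adjustment coefficient
        # Capital requirement K: conditional expected loss at the 99.9% percentile minus expected loss,
        # scaled by the maturity adjustment.
        k = (lgd * norm.cdf((norm.ppf(pd_) + sqrt(r) * norm.ppf(0.999)) / sqrt(1 - r))
             - pd_ * lgd) * (1 + (maturity - 2.5) * b) / (1 - 1.5 * b)
        return k * ead                                    # capital requirement in currency units

    # Illustrative exposure: PD 2%, LGD 45%, EAD 1,000,000.
    print(f"capital requirement: {irb_capital(pd_=0.02, lgd=0.45, ead=1_000_000):.0f}")
    ```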

  4. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]

  5. Risk-Based Predictive Maintenance for Safety-Critical Systems by Using Probabilistic Inference

    Directory of Open Access Journals (Sweden)

    Tianhua Xu

    2013-01-01

    Full Text Available Risk-based maintenance (RBM) aims to improve maintenance planning and decision making by reducing the probability and consequences of equipment failure. A new predictive maintenance strategy that integrates a dynamic evolution model and risk assessment is proposed, which can be used to calculate the optimal maintenance time with minimal cost under safety constraints. The dynamic evolution model provides quantified risks by using probabilistic inference with bucket elimination and gives the prospective degradation trend of a complex system. Based on the degradation trend, an optimal maintenance time can be determined by minimizing the expected maintenance cost per time unit. The effectiveness of the proposed method is validated and demonstrated by a case involving collisions of high-speed trains with obstacles, in the presence of safety and cost constraints.
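
    The "minimise expected maintenance cost per time unit" step can be illustrated with a simple age-based policy under Weibull degradation, trading off preventive maintenance at time T against the risk of failure before T. The costs and Weibull parameters below are hypothetical, and this is not the paper's bucket-elimination model.

    ```python
    import numpy as np

    beta, eta = 2.5, 1000.0          # Weibull shape and scale (hours), hypothetical
    c_prev, c_fail = 1.0, 10.0       # preventive vs corrective maintenance cost, hypothetical

    dt = 1.0
    t = np.arange(dt, 2000.0 + dt, dt)
    reliability = np.exp(-(t / eta) ** beta)      # probability of surviving to each candidate time T
    cum_uptime = np.cumsum(reliability) * dt      # approx. expected cycle length for maintenance at T

    # Expected cost per cycle divided by expected cycle length, for every candidate T.
    cost_rate = (c_prev * reliability + c_fail * (1.0 - reliability)) / cum_uptime
    best = int(np.argmin(cost_rate))
    print(f"optimal preventive maintenance time: {t[best]:.0f} h, "
          f"expected cost per hour: {cost_rate[best]:.4f}")
    ```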

  6. Driving Strategic Risk Planning With Predictive Modelling For Managerial Accounting

    DEFF Research Database (Denmark)

    Nielsen, Steen; Pontoppidan, Iens Christian

    Currently, risk management in management/managerial accounting is treated as deterministic. Although it is well known that risk estimates are necessarily uncertain or stochastic, until recently the methodology required to handle stochastic risk-based elements appeared to be impractical and too mathematical. The ultimate purpose of this paper is to “make the risk concept procedural and analytical” and to argue that accountants should now include stochastic risk management as a standard tool. Drawing on mathematical modelling and statistics, the paper methodically develops a risk analysis approach for managerial accounting and shows how it can be used to determine the impact of different types of risk assessment input parameters on the variability of important outcome measures. The purpose is to: (i) point out the theoretical necessity of a stochastic risk framework; (ii) present a stochastic framework...

  7. Engineering models for catastrophe risk and their application to insurance

    Science.gov (United States)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like all other insurance (fire, auto), adopted an actuarial approach in the past, that is, insurance rates were determined from historical loss experience. Because earthquakes are rare events with severe consequences, irrational premium rates and a poor understanding of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  8. Contrasting safety assessments of a runway incursion scenario: Event sequence analysis versus multi-agent dynamic risk modelling

    International Nuclear Information System (INIS)

    Stroeve, Sybert H.; Blom, Henk A.P.; Bakker, G.J.

    2013-01-01

    In the safety literature it has been argued that in a complex socio-technical system safety cannot be analysed well by event-sequence-based approaches, but requires capturing the complex interactions and performance variability of the socio-technical system. In order to evaluate the quantitative and practical consequences of these arguments, this study compares two approaches to assess the accident risk of an example safety-critical socio-technical system. It contrasts an event sequence based assessment with a multi-agent dynamic risk model (MA-DRM) based assessment, both of which are performed for a particular runway incursion scenario. The event sequence analysis uses the well-known event tree modelling formalism, and the MA-DRM based approach combines agent based modelling, hybrid Petri nets and rare event Monte Carlo simulation. The comparison addresses qualitative and quantitative differences in the methods, the attained risk levels, and the prime factors influencing the safety of the operation. The assessments show considerable differences in the accident risk implications of the performance of human operators and technical systems in the runway incursion scenario. In contrast with the event sequence based results, the MA-DRM based results show that the accident risk is not manifest from the performance of, and relations between, individual human operators and technical systems. Instead, the safety risk emerges from the totality of the performance and interactions in the agent based model of the safety-critical operation considered, which coincides very well with the argumentation in the safety literature.

  9. Data Analyses and Modelling for Risk Based Monitoring of Mycotoxins in Animal Feed

    Directory of Open Access Journals (Sweden)

    H.J. (Ine) van der Fels-Klerx

    2018-01-01

    Full Text Available Following legislation, European Member States should have multi-annual control programs for contaminants, such as mycotoxins, in feed and food. These programs need to be risk based, implying that the checks are regular and proportional to the estimated risk for animal and human health. This study aimed to prioritize feed products in the Netherlands for deoxynivalenol and aflatoxin B1 monitoring. Historical mycotoxin monitoring results from the period 2007–2016 were combined with data from other sources. Based on occurrence, groundnuts had high priority for aflatoxin B1 monitoring; some feed materials (maize and maize products and several oil seed products) and complete/complementary feed excluding dairy cattle and young animals had medium priority; and all other animal feeds and feed materials had low priority. For deoxynivalenol, maize by-products had a high priority, complete and complementary feed for pigs had a medium priority, and all other feed and feed materials a low priority. When health consequence estimations were also included, the feed materials that ranked highest for aflatoxin B1 were sunflower seed, palmkernel expeller/extracts and maize. For deoxynivalenol, maize products were ranked highest, followed by various small grain cereals (products); all other feed materials were of lower concern. Results of this study have proven to be useful in setting up the annual risk based control program for mycotoxins in animal feed and feed materials.

  10. Data Analyses and Modelling for Risk Based Monitoring of Mycotoxins in Animal Feed

    Science.gov (United States)

    van der Fels-Klerx, H.J. (Ine); Adamse, Paulien; Punt, Ans; van Asselt, Esther D.

    2018-01-01

    Following legislation, European Member States should have multi-annual control programs for contaminants, such as mycotoxins, in feed and food. These programs need to be risk based, implying that the checks are regular and proportional to the estimated risk for animal and human health. This study aimed to prioritize feed products in the Netherlands for deoxynivalenol and aflatoxin B1 monitoring. Historical mycotoxin monitoring results from the period 2007–2016 were combined with data from other sources. Based on occurrence, groundnuts had high priority for aflatoxin B1 monitoring; some feed materials (maize and maize products and several oil seed products) and complete/complementary feed excluding dairy cattle and young animals had medium priority; and all other animal feeds and feed materials had low priority. For deoxynivalenol, maize by-products had a high priority, complete and complementary feed for pigs had a medium priority, and all other feed and feed materials a low priority. When health consequence estimations were also included, the feed materials that ranked highest for aflatoxin B1 were sunflower seed, palmkernel expeller/extracts and maize. For deoxynivalenol, maize products were ranked highest, followed by various small grain cereals (products); all other feed materials were of lower concern. Results of this study have proven to be useful in setting up the annual risk based control program for mycotoxins in animal feed and feed materials. PMID:29373559

  11. Towards a resource-based habitat approach for spatial modelling of vector-borne disease risks.

    Science.gov (United States)

    Hartemink, Nienke; Vanwambeke, Sophie O; Purse, Bethan V; Gilbert, Marius; Van Dyck, Hans

    2015-11-01

    Given the veterinary and public health impact of vector-borne diseases, there is a clear need to assess the suitability of landscapes for the emergence and spread of these diseases. Current approaches for predicting disease risks neglect key features of the landscape as components of the functional habitat of vectors or hosts, and hence of the pathogen. Empirical-statistical methods do not explicitly incorporate biological mechanisms, whereas current mechanistic models are rarely spatially explicit; both methods ignore the way animals use the landscape (i.e. movement ecology). We argue that applying a functional concept for habitat, i.e. the resource-based habitat concept (RBHC), can solve these issues. The RBHC offers a framework to identify systematically the different ecological resources that are necessary for the completion of the transmission cycle and to relate these resources to (combinations of) landscape features and other environmental factors. The potential of the RBHC as a framework for identifying suitable habitats for vector-borne pathogens is explored and illustrated with the case of bluetongue virus, a midge-transmitted virus affecting ruminants. The concept facilitates the study of functional habitats of the interacting species (vectors as well as hosts) and provides new insight into spatial and temporal variation in transmission opportunities and exposure that ultimately determine disease risks. It may help to identify knowledge gaps and control options arising from changes in the spatial configuration of key resources across the landscape. The RBHC framework may act as a bridge between existing mechanistic and statistical modelling approaches. © 2014 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.

  12. [Joint application of mathematic models in assessing the residual risk of hepatitis C virus transmitted through blood transfusion].

    Science.gov (United States)

    Wang, Xun; Jia, Yao; Xie, Yun-zheng; Li, Xiu-mei; Liu, Xiao-ying; Wu, Xiao-fei

    2011-09-01

    A practicable and effective method for residual risk assessment of transfusion-transmitted disease is to establish mathematical models. Based on the characteristics of repeat donors, who donate their blood on a regular basis, a model of sero-conversion during the interval between donations was established to assess the incidence among repeat donors. Based on the characteristics of prevalence in the population, a model of 'prevalence increasing with the age of the donor' was established to assess the incidence among first-time donors. And based on the impact of the window period on the blood screening program, a model of residual risk associated with the incidence and the length of the window period was established to assess the residual risk of blood transfusion. In this paper, the above three mathematical models were jointly applied to assess the residual risk of hepatitis C virus (HCV) transmitted through blood transfusion in Shanghai, based on data from the routine blood collection and screening program. All anti-HCV unqualified blood donations were confirmed before assessment. Results showed that the residual risk of HCV transmitted through blood transfusion during Jan. 1st, 2007 to Dec. 31st, 2008 in Shanghai was 1:101 000. The results indicate that residual risk assessment with mathematical models is valuable. The residual risk of transfusion-transmitted HCV in Shanghai was at a safe level, according to the results in this paper.
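
    The window-period model referred to above is essentially residual risk ≈ incidence rate × window period. A minimal Python sketch with illustrative numbers (not the Shanghai data):

      # Residual risk ~ incidence rate x window period (illustrative numbers only).
      incidence_per_100k_py = 5.0      # assumed HCV incidence among donors, per 100,000 person-years
      window_days = 66.0               # assumed anti-HCV serological window period

      incidence_per_py = incidence_per_100k_py / 100_000
      residual_risk = incidence_per_py * (window_days / 365.25)
      print("residual risk per donation: 1 in", round(1 / residual_risk))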

  13. A quantile-based Time at Risk: A new approach for assessing risk in financial markets

    Science.gov (United States)

    Bolgorian, Meysam; Raei, Reza

    2013-11-01

    In this paper, we provide a new measure for evaluating risk in financial markets. This measure is based on the return interval of critical events in financial markets or other investment situations. Our main goal was to devise a model analogous to Value at Risk (VaR). Just as VaR, for a given financial asset, probability level and time horizon, gives a critical value such that the likelihood that the loss on the asset over the time horizon exceeds this value equals the given probability level, our concept of Time at Risk (TaR), using a probability distribution function of return intervals, provides a critical time such that the probability that the return interval of a critical event exceeds this time equals the given probability level. As an empirical application, we applied our model to data from the Tehran Stock Exchange Price Index (TEPIX) as a financial asset (market portfolio) and report the results.
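
    A minimal sketch of the TaR idea on simulated data: critical events are defined as the worst 5% of daily returns, and TaR at level q is the (1 - q) quantile of the return-interval distribution, so that the probability of an interval exceeding TaR equals q.

      # Time at Risk (TaR): quantile of the return-interval distribution (simulated data).
      import numpy as np

      rng = np.random.default_rng(0)
      returns = rng.standard_t(df=4, size=5_000) * 0.01   # simulated daily returns
      critical = returns < np.quantile(returns, 0.05)     # "critical events": worst 5% of days
      intervals = np.diff(np.flatnonzero(critical))       # return intervals (days)

      q = 0.05                                            # probability level
      tar = np.quantile(intervals, 1 - q)                 # P(interval > TaR) = q
      print("TaR at the 5% level:", tar, "days")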

  14. A stable systemic risk ranking in China's banking sector: Based on principal component analysis

    Science.gov (United States)

    Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing

    2018-02-01

    In this paper, we compare five popular systemic risk rankings and apply a principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that the five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to the factor loadings of the first component, the PCA combined ranking is mainly based on fundamentals instead of market price data. We clearly find that price-based rankings are not as practical as fundamentals-based ones. The PCA combined ranking directly shows the systemic risk contribution of each bank for banking supervision purposes and helps banks prepare for and cope with financial crises in advance.
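
    A minimal sketch of the combination step, assuming a hypothetical matrix of five standardized risk measures for a set of banks; the first principal component is used as the combined systemic risk score and converted into a ranking.

      # Combine several systemic risk measures into one ranking via the first principal component.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      scores = rng.normal(size=(16, 5))       # rows = banks, columns = five hypothetical risk measures
      scores[:, 1:] += scores[:, :1]          # induce a common systemic component

      z = StandardScaler().fit_transform(scores)
      pc1 = PCA(n_components=1).fit_transform(z).ravel()
      pc1 *= np.sign(np.corrcoef(pc1, scores.mean(axis=1))[0, 1])   # orient PC1 so higher = riskier
      combined_rank = np.argsort(np.argsort(-pc1)) + 1              # 1 = most systemically risky
      print(combined_rank)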

  15. Risk predictive modelling for diabetes and cardiovascular disease.

    Science.gov (United States)

    Kengne, Andre Pascal; Masconi, Katya; Mbanya, Vivian Nchanchou; Lekoubou, Alain; Echouffo-Tcheugui, Justin Basile; Matsha, Tandi E

    2014-02-01

    Absolute risk models or clinical prediction models have been incorporated in guidelines, and are increasingly advocated as tools to assist risk stratification and guide prevention and treatment decisions relating to common health conditions such as cardiovascular disease (CVD) and diabetes mellitus. We have reviewed the historical development and principles of prediction research, including their statistical underpinning, as well as implications for routine practice, with a focus on predictive modelling for CVD and diabetes. Predictive modelling for CVD risk, which has developed over the last five decades, has been largely influenced by the Framingham Heart Study investigators, while it is only ∼20 years ago that similar efforts were started in the field of diabetes. Identification of predictive factors is an important preliminary step which provides the knowledge base on potential predictors to be tested for inclusion during the statistical derivation of the final model. The derived models must then be tested both on the development sample (internal validation) and on other populations in different settings (external validation). Updating procedures (e.g. recalibration) should be used to improve the performance of models that fail the tests of external validation. Ultimately, the effect of introducing validated models in routine practice on the process and outcomes of care as well as its cost-effectiveness should be tested in impact studies before wide dissemination of models beyond the research context. Several prediction models have been developed for CVD or diabetes, but very few have been externally validated or tested in impact studies, and their comparative performance has yet to be fully assessed. A shift of focus from developing new CVD or diabetes prediction models to validating the existing ones will improve their adoption in routine practice.

  16. An artificial neural network prediction model of congenital heart disease based on risk factors: A hospital-based case-control study.

    Science.gov (United States)

    Li, Huixia; Luo, Miyang; Zheng, Jianfei; Luo, Jiayou; Zeng, Rong; Feng, Na; Du, Qiyun; Fang, Junqun

    2017-02-01

    An artificial neural network (ANN) model was developed to predict the risks of congenital heart disease (CHD) in pregnant women. This hospital-based case-control study involved 119 CHD cases and 239 controls, all recruited from birth defect surveillance hospitals in Hunan Province between July 2013 and June 2014. All subjects were interviewed face-to-face to fill in a questionnaire that covered 36 CHD-related variables. The 358 subjects were randomly divided into a training set and a testing set at the ratio of 85:15. The training set was used to identify the significant predictors of CHD by univariate logistic regression analyses and to develop a standard feed-forward back-propagation neural network (BPNN) model for the prediction of CHD. The testing set was used to test and evaluate the performance of the ANN model. Univariate logistic regression analyses were performed on SPSS 18.0. The ANN models were developed in Matlab 7.1. The univariate logistic regression identified 15 predictors that were significantly associated with CHD, including education level (odds ratio = 0.55), gravidity (1.95), parity (2.01), history of abnormal reproduction (2.49), family history of CHD (5.23), maternal chronic disease (4.19), maternal upper respiratory tract infection (2.08), environmental pollution around maternal dwelling place (3.63), maternal exposure to occupational hazards (3.53), maternal mental stress (2.48), paternal chronic disease (4.87), paternal exposure to occupational hazards (2.51), intake of vegetable/fruit (0.45), intake of fish/shrimp/meat/egg (0.59), and intake of milk/soymilk (0.55). After many trials, we selected a 3-layer BPNN model with 15, 12, and 1 neuron in the input, hidden, and output layers, respectively, as the best prediction model. The prediction model has accuracies of 0.91 and 0.86 on the training and testing sets, respectively. The sensitivity, specificity, and Youden Index on the testing set (training set) are 0.78 (0.83), 0.90 (0.95), and 0
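
    A minimal sketch of a 15-12-1 feed-forward back-propagation network for a binary outcome, using scikit-learn's MLPClassifier as a stand-in for the Matlab BPNN described above; the data are simulated, not the study data.

      # 15-12-1 feed-forward network for a binary outcome (simulated data).
      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import accuracy_score

      rng = np.random.default_rng(2)
      X = rng.normal(size=(358, 15))                                 # 15 selected predictors
      y = (X @ rng.normal(size=15) + rng.normal(size=358) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.15, random_state=0)
      net = MLPClassifier(hidden_layer_sizes=(12,), activation="logistic",
                          max_iter=2000, random_state=0)
      net.fit(X_tr, y_tr)
      print("test accuracy:", accuracy_score(y_te, net.predict(X_te)))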

  17. Comparison of robustness to outliers between robust poisson models and log-binomial models when estimating relative risks for common binary outcomes: a simulation study.

    Science.gov (United States)

    Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P

    2014-06-26

    To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination is low. The robust Poisson models are more robust (or less sensitive) to outliers compared to the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
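
    A minimal sketch of the two estimators being compared, using statsmodels on simulated binary data: a Poisson GLM with robust (HC0) standard errors and a binomial GLM with a log link; the exponentiated exposure coefficient is the estimated relative risk.

      # Relative risk for a common binary outcome: robust Poisson vs. log-binomial (simulated data).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      x = rng.binomial(1, 0.5, size=2_000)            # exposure indicator
      p = 0.10 * np.where(x == 1, 2.0, 1.0)           # true relative risk = 2
      y = rng.binomial(1, p)
      X = sm.add_constant(x)

      robust_poisson = sm.GLM(y, X, family=sm.families.Poisson()).fit(cov_type="HC0")
      log_binomial = sm.GLM(y, X, family=sm.families.Binomial(link=sm.families.links.Log())).fit()

      print("RR, robust Poisson:", np.exp(robust_poisson.params[1]))
      print("RR, log-binomial:  ", np.exp(log_binomial.params[1]))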

  18. Risk Information Seeking among U.S. and Dutch Residents. An Application of the model of Risk Information Seeking and Processing

    NARCIS (Netherlands)

    ter Huurne, E.F.J.; Griffin, Robert J.; Gutteling, Jan M.

    2009-01-01

    The model of risk information seeking and processing (RISP) proposes characteristics of individuals that might predispose them to seek risk information. The intent of this study is to test the model’s robustness across two independent samples in different nations. Based on data from the United

  19. Earthquake insurance pricing: a risk-based approach.

    Science.gov (United States)

    Lin, Jeng-Hsiang

    2018-04-01

    Flat earthquake premiums are 'uniformly' set for a variety of buildings in many countries, neglecting the fact that the risk of earthquake damage to buildings depends on a wide range of factors. How these factors influence insurance premiums is worth studying further. Proposed herein is a risk-based approach to estimate the earthquake insurance rates of buildings. Examples of application of the approach to buildings located in Taipei city of Taiwan were examined. Then, the earthquake insurance rates for the buildings investigated were calculated and tabulated. For insurance rating, the buildings were classified into 15 model building types according to their construction materials and building height. Seismic design levels were also considered in insurance rating in response to the effect of seismic zone and construction years of buildings. This paper may be of interest to insurers, actuaries, and private and public sectors of insurance. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

  20. Surrogate modeling of joint flood risk across coastal watersheds

    Science.gov (United States)

    Bass, Benjamin; Bedient, Philip

    2018-03-01

    This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and their peak joint flood response typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC-associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.

  1. Risk-Based Inspection and Maintenance Planning Optimization of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Ramírez, José G. Rangel; Sørensen, John Dalsgaard

    2009-01-01

    A risk-based inspection planning (RBI) approach applied to offshore wind turbines (OWT) is presented, based on the RBI methodology developed over the last decades in the oil and gas industry. Both in-wind-farm (IWF) and stand-alone locations are considered, using code-established turbulence models including...

  2. Estimating the Value-at-Risk for some stocks at the capital market in Indonesia based on ARMA-FIGARCH models

    Science.gov (United States)

    Sukono; Lesmana, E.; Susanti, D.; Napitupulu, H.; Hidayat, Y.

    2017-11-01

    Value-at-Risk has become a standard measurement that must be carried out by financial institutions, both for internal purposes and for regulators. In this paper, the Value-at-Risk of several stocks is estimated with an econometric modelling approach. We assume that the stock returns follow a time series model. The mean is estimated using ARMA models, while the variance is estimated using FIGARCH models. The mean and variance estimators are then used to estimate the Value-at-Risk. The analysis shows that for the five stocks PRUF, BBRI, MPPA, BMRI, and INDF, the Value-at-Risk estimates are 0.01791, 0.06037, 0.02550, 0.06030, and 0.02585, respectively. Since Value-at-Risk represents the maximum risk of each stock at the 95% level, it can be taken into consideration in determining investment policy on these stocks.
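
    A minimal sketch of a VaR calculation along these lines with the arch package, using an AR(1) mean as a simple stand-in for the ARMA mean and a FIGARCH(1,1) variance; the returns are simulated, not the five stocks analyzed in the paper.

      # One-day 95% VaR from an AR mean + FIGARCH variance model (simulated returns, in percent).
      import numpy as np
      from scipy.stats import norm
      from arch import arch_model

      rng = np.random.default_rng(4)
      returns = rng.standard_t(df=6, size=1_500)      # simulated daily returns (%)

      model = arch_model(returns, mean="AR", lags=1, vol="FIGARCH", p=1, q=1, dist="normal")
      res = model.fit(disp="off")
      fc = res.forecast(horizon=1)

      mu = fc.mean.values[-1, 0]
      sigma = np.sqrt(fc.variance.values[-1, 0])
      var_95 = -(mu + norm.ppf(0.05) * sigma)         # loss level exceeded with 5% probability
      print("one-day 95%% VaR: %.2f%%" % var_95)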

  3. A Risk-based Assessment And Management Framework For Multipollutant Air Quality

    Science.gov (United States)

    Frey, H. Christopher; Hubbell, Bryan

    2010-01-01

    The National Research Council recommended both a risk- and performance-based multipollutant approach to air quality management. Specifically, management decisions should be based on minimizing the exposure to, and risk of adverse effects from, multiple sources of air pollution and that the success of these decisions should be measured by how well they achieved this objective. We briefly describe risk analysis and its application within the current approach to air quality management. Recommendations are made as to how current practice could evolve to support a fully risk- and performance-based multipollutant air quality management system. The ability to implement a risk assessment framework in a credible and policy-relevant manner depends on the availability of component models and data which are scientifically sound and developed with an understanding of their application in integrated assessments. The same can be said about accountability assessments used to evaluate the outcomes of decisions made using such frameworks. The existing risk analysis framework, although typically applied to individual pollutants, is conceptually well suited for analyzing multipollutant management actions. Many elements of this framework, such as emissions and air quality modeling, already exist with multipollutant characteristics. However, the framework needs to be supported with information on exposure and concentration response relationships that result from multipollutant health studies. Because the causal chain that links management actions to emission reductions, air quality improvements, exposure reductions and health outcomes is parallel between prospective risk analyses and retrospective accountability assessments, both types of assessment should be placed within a single framework with common metrics and indicators where possible. Improvements in risk reductions can be obtained by adopting a multipollutant risk analysis framework within the current air quality management

  4. Evaluation of portfolio credit risk based on survival analysis for progressive censored data

    Science.gov (United States)

    Jaber, Jamil J.; Ismail, Noriszura; Ramli, Siti Norafidah Mohd

    2017-04-01

    In credit risk management, the Basel committee provides a choice of three approaches to financial institutions for calculating the required capital: the standardized approach, the Internal Ratings-Based (IRB) approach, and the Advanced IRB approach. The IRB approach is usually preferred over the standardized approach due to its higher accuracy and lower capital charges. This paper uses several parametric models (Exponential, log-normal, Gamma, Weibull, Log-logistic, Gompertz) to evaluate the credit risk of the corporate portfolio in Jordanian banks, based on a monthly sample collected from January 2010 to December 2015. The best model is selected using several goodness-of-fit criteria (MSE, AIC, BIC). The results indicate that the Gompertz distribution is the best parametric model for the data.
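
    A minimal sketch of the model comparison step with the lifelines package on simulated, right-censored time-to-default data; the Gompertz and Gamma cases would need custom fitters and are omitted here, and the best distribution is simply the one with the lowest AIC.

      # Compare parametric time-to-default models by AIC (simulated, right-censored data).
      import numpy as np
      from lifelines import ExponentialFitter, WeibullFitter, LogNormalFitter, LogLogisticFitter

      rng = np.random.default_rng(5)
      true_t = rng.weibull(1.5, size=500) * 24.0       # months to default
      censor_t = rng.uniform(6, 36, size=500)          # end of observation window
      T = np.minimum(true_t, censor_t)
      E = (true_t <= censor_t).astype(int)             # 1 = default observed, 0 = censored

      for fitter in (ExponentialFitter(), WeibullFitter(), LogNormalFitter(), LogLogisticFitter()):
          fitter.fit(T, event_observed=E)
          print(type(fitter).__name__, "AIC =", round(fitter.AIC_, 1))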

  5. VTE Risk assessment - a prognostic Model: BATER Cohort Study of young women.

    Science.gov (United States)

    Heinemann, Lothar Aj; Dominh, Thai; Assmann, Anita; Schramm, Wolfgang; Schürmann, Rolf; Hilpert, Jan; Spannagl, Michael

    2005-04-18

    BACKGROUND: Community-based cohort studies are not available that evaluate the predictive power of both clinical and genetic risk factors for venous thromboembolism (VTE). There is, however, a clinical need to forecast the likelihood of future occurrence of VTE, at least qualitatively, to support decisions about the intensity of diagnostic or preventive measures. MATERIALS AND METHODS: A 10-year observation period of the Bavarian Thromboembolic Risk (BATER) study, a cohort study of 4337 women (18-55 years), was used to develop a predictive model of VTE based on clinical and genetic variables at baseline (1993). The objective was to prepare a probabilistic scheme that discriminates women with virtually no VTE risk from those at higher levels of absolute VTE risk in the foreseeable future. A multivariate analysis determined which variables at baseline were the best predictors of a future VTE event, provided a ranking according to predictive power, and permitted the design of a simple graphic scheme to assess individual VTE risk using five predictor variables. RESULTS: Thirty-four new confirmed VTEs occurred during the observation period of over 32,000 women-years (WYs). A model was developed based mainly on clinical information (personal history of previous VTE and family history of VTE, age, BMI) and one composite genetic risk marker (combining Factor V Leiden and the Prothrombin G20210A mutation). Four levels of increasing VTE risk were arbitrarily defined to map the prevalence in the study population: no/low risk of VTE (61.3%), moderate risk (21.1%), high risk (6.0%), very high risk of future VTE (0.9%). In 10.6% of the population the risk assessment was not possible due to a lack of VTE cases. The average incidence rates for VTE in these four levels were: 4.1, 12.3, 47.2, and 170.5 per 10,000 WYs for no, moderate, high, and very high risk, respectively. CONCLUSION: Our prognostic tool - containing clinical information (and if available also genetic data) - seems to be

  6. Risk-based inspection of nuclear power plants

    International Nuclear Information System (INIS)

    Masopust, R.

    1995-01-01

    A multidiscipline research programme was developed in the USA to establish risk-based inspections for NPP structures and equipment components. Based on this US research effort, the risk-based procedure for developing inspection guidelines for NPPs is described. The procedure includes the definition of systems, qualitative risk assessment, qualitative risk analysis and development of the inspection programme. The method, when adopted and modified, is recommended also for risk-based inspections of structures and equipment of WWER-type NPPs. A pilot application of the method to unit 1 of the Surry NPP is summarized. (Z.S.) 1 tab., 1 fig., 5 refs

  7. A neural network model for credit risk evaluation.

    Science.gov (United States)

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation, which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the back-propagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real-world credit applications from the Australian credit approval datasets. A comparison of the system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, following the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in automatic processing of credit applications.

  8. Risk-based remediation: Approach and application

    International Nuclear Information System (INIS)

    Frishmuth, R.A.; Benson, L.A.

    1995-01-01

    The principal objective of remedial actions is to protect human health and the environment. Risk assessments are the only defensible tools available to demonstrate to the regulatory community and the public that this objective can be achieved. Understanding the actual risks posed by site-related contamination is crucial to designing cost-effective remedial strategies. All too often remedial actions are overdesigned, resulting in little to no increase in risk reduction while increasing project cost. Risk-based remedial actions have recently been embraced by federal and state regulators, industry, government, the scientific community, and the public as a mechanism to implement rapid and cost-effective remedial actions. Emphasizing risk reduction, rather than adherence to ambiguous and generic standards, ensures that only the remedial actions required to protect human health and the environment at a particular site are implemented. Two sites are presented as case studies on how risk-based approaches are being used to remediate petroleum hydrocarbon contaminated sites. The sites are located at two US Air Force Bases, Wurtsmith Air Force Base (AFB) in Oscoda, Michigan, and Malmstrom AFB in Great Falls, Montana

  9. Issues and approaches in risk-based aging analyses of passive components

    International Nuclear Information System (INIS)

    Uryasev, S.P.; Samanta, P.K.; Vesely, W.E.

    1994-01-01

    In previous NRC-sponsored work, a general methodology was developed to quantify the risk contributions from aging components at nuclear plants. The methodology allowed Probabilistic Risk Analyses (PRAs) to be modified to incorporate age-dependent component failure rates, and also aging maintenance models, in order to evaluate and prioritize the aging contributions from active components using the linear aging failure rate model and empirical component aging rates. In the present paper, this methodology is extended to passive components (for example, the pipes, heat exchangers, and the vessel). The analyses of passive components bring in issues different from those of active components. Here, we specifically focus on three aspects that need to be addressed in risk-based aging prioritization of passive components
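
    A minimal sketch of the linear aging failure rate model mentioned above, lambda(t) = lambda0 + a*t, integrated into a time-dependent failure probability; the parameter values are illustrative only.

      # Linear aging failure rate: lambda(t) = lambda0 + a*t (illustrative values).
      import numpy as np

      lambda0 = 1.0e-6     # baseline failure rate per hour
      a = 1.0e-11          # aging rate per hour^2
      t = np.linspace(0, 20 * 8760, 200)          # component ages up to 20 years, in hours

      hazard = lambda0 + a * t
      cum_hazard = lambda0 * t + 0.5 * a * t**2   # integral of the hazard
      fail_prob = 1 - np.exp(-cum_hazard)
      print("failure probability at 20 years:", round(fail_prob[-1], 3))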

  10. Developing safety performance functions incorporating reliability-based risk measures.

    Science.gov (United States)

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
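
    A minimal sketch of a negative binomial SPF that includes a reliability-based covariate such as P(nc), fitted with statsmodels on simulated segment data; the variable names, coefficients and dispersion value are illustrative only.

      # Negative binomial safety performance function with a P(nc) covariate (simulated data).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(6)
      n = 400
      aadt = rng.uniform(2_000, 15_000, n)           # traffic volume
      length = rng.uniform(0.5, 3.0, n)              # segment length (km)
      p_nc = rng.beta(2, 8, n)                       # probability of non-compliance from reliability analysis

      mu = np.exp(-6.0 + 0.8 * np.log(aadt) + 1.0 * np.log(length) + 1.5 * p_nc)
      y = rng.negative_binomial(2.0, 2.0 / (2.0 + mu))   # simulated collision counts

      X = sm.add_constant(np.column_stack([np.log(aadt), np.log(length), p_nc]))
      spf = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
      print(spf.params)   # last coefficient captures the effect of P(nc)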

  11. Nuclear insurance risk assessment using risk-based methodology

    International Nuclear Information System (INIS)

    Wendland, W.G.

    1992-01-01

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME) in which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance

  12. Update on a Pharmacokinetic-Centric Alternative Tier II Program for MMT—Part II: Physiologically Based Pharmacokinetic Modeling and Manganese Risk Assessment

    Directory of Open Access Journals (Sweden)

    Michael D. Taylor

    2012-01-01

    Full Text Available Recently, a variety of physiologically based pharmacokinetic (PBPK) models have been developed for the essential element manganese. This paper reviews the development of PBPK models (e.g., for adult, pregnant, lactating, and neonatal rats, nonhuman primates, and adult, pregnant, lactating, and neonatal humans) and relevant risk assessment applications. Each PBPK model incorporates critical features including dose-dependent saturable tissue capacities and asymmetrical diffusional flux of manganese into brain and other tissues. Varied influx and efflux diffusion rate and binding constants for different brain regions account for the differential increases in regional brain manganese concentrations observed experimentally. We also present novel PBPK simulations to predict manganese tissue concentrations in fetal, neonatal, pregnant, or aged individuals, as well as individuals with liver disease or chronic manganese inhalation. The results of these simulations could help guide risk assessors in the application of uncertainty factors as they establish exposure guidelines for the general public or workers.

  13. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Science.gov (United States)

    2010-01-01

    13 CFR, Business Credit and Assistance, Section 120.1000: SMALL BUSINESS ADMINISTRATION, BUSINESS LOANS, Risk-Based Lender Oversight Supervision, § 120.1000 Risk-Based Lender Oversight. (a) Risk-Based Lender...

  14. Developing genetic epidemiological models to predict risk for nasopharyngeal carcinoma in high-risk population of China.

    Directory of Open Access Journals (Sweden)

    Hong-Lian Ruan

    Full Text Available To date, the only established model for assessing risk for nasopharyngeal carcinoma (NPC) relies on the sero-status of the Epstein-Barr virus (EBV). By contrast, the risk assessment models proposed here include environmental risk factors, family history of NPC, and information on genetic variants. The models were developed using epidemiological and genetic data from a large case-control study, which included 1,387 subjects with NPC and 1,459 controls of Cantonese origin. The predictive accuracy of the models was then assessed by calculating the area under the receiver-operating characteristic curve (AUC). To compare the discriminatory improvement of models with and without genetic information, we estimated the net reclassification improvement (NRI) and integrated discrimination index (IDI). Well-established environmental risk factors for NPC include consumption of salted fish and preserved vegetables and cigarette smoking (in pack years). The environmental model alone shows modest discriminatory ability (AUC = 0.68; 95% CI: 0.66, 0.70), which is only slightly increased by the addition of data on family history of NPC (AUC = 0.70; 95% CI: 0.68, 0.72). With the addition of data on genetic variants, however, our model's discriminatory ability rises to 0.74 (95% CI: 0.72, 0.76). The improvements in NRI and IDI also suggest the potential usefulness of considering genetic variants when screening for NPC in endemic areas. If these findings are confirmed in larger cohort and population-based case-control studies, use of the new models to analyse data from NPC-endemic areas could well lead to earlier detection of NPC.

  15. Overview of Risk Mitigation for Safety-Critical Computer-Based Systems

    Science.gov (United States)

    Torres-Pomales, Wilfredo

    2015-01-01

    This report presents a high-level overview of a general strategy to mitigate the risks from threats to safety-critical computer-based systems. In this context, a safety threat is a process or phenomenon that can cause operational safety hazards in the form of computational system failures. This report is intended to provide insight into the safety-risk mitigation problem and the characteristics of potential solutions. The limitations of the general risk mitigation strategy are discussed and some options to overcome these limitations are provided. This work is part of an ongoing effort to enable well-founded assurance of safety-related properties of complex safety-critical computer-based aircraft systems by developing an effective capability to model and reason about the safety implications of system requirements and design.

  16. A model-based analysis of decision making under risk in obsessive-compulsive and hoarding disorders.

    Science.gov (United States)

    Aranovich, Gabriel J; Cavagnaro, Daniel R; Pitt, Mark A; Myung, Jay I; Mathews, Carol A

    2017-07-01

    Attitudes towards risk are highly consequential in clinical disorders thought to be prone to "risky behavior", such as substance dependence, as well as those commonly associated with excessive risk aversion, such as obsessive-compulsive disorder (OCD) and hoarding disorder (HD). Moreover, it has recently been suggested that attitudes towards risk may serve as a behavioral biomarker for OCD. We investigated the risk preferences of participants with OCD and HD using a novel adaptive task and a quantitative model from behavioral economics that decomposes risk preferences into outcome sensitivity and probability sensitivity. Contrary to expectation, compared to healthy controls, participants with OCD and HD exhibited less outcome sensitivity, implying less risk aversion in the standard economic framework. In addition, risk attitudes were strongly correlated with depression, hoarding, and compulsion scores, while compulsion (hoarding) scores were associated with more (less) "rational" risk preferences. These results demonstrate how fundamental attitudes towards risk relate to specific psychopathology and thereby contribute to our understanding of the cognitive manifestations of mental disorders. In addition, our findings indicate that the conclusion made in recent work that decision making under risk is unaltered in OCD is premature. Copyright © 2017 Elsevier Ltd. All rights reserved.
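
    A minimal sketch of the decomposition referred to above: a power value function for outcome sensitivity and a one-parameter weighting function for probability sensitivity, combined into the subjective value of a simple gamble (the functional forms and parameter values are illustrative, not those estimated in the study).

      # Outcome sensitivity (alpha) and probability sensitivity (gamma) for a two-outcome gamble.
      def value(x, alpha=0.7):
          # Power value function: diminishing sensitivity to outcomes.
          return x ** alpha

      def weight(p, gamma=0.6):
          # Inverse-S probability weighting (Tversky-Kahneman form).
          return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

      def subjective_value(p, win, lose=0.0, alpha=0.7, gamma=0.6):
          w = weight(p, gamma)
          return w * value(win, alpha) + (1 - w) * value(lose, alpha)

      # A risky option (50% chance of 100, else 0) versus a sure 40:
      print(subjective_value(0.5, 100.0), value(40.0))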

  17. Spatial scale effects in environmental risk-factor modelling for diseases

    Directory of Open Access Journals (Sweden)

    Ram K. Raghavan

    2013-05-01

    Full Text Available Studies attempting to identify environmental risk factors for diseases often extract candidate variables from remotely sensed datasets, using a single buffer zone surrounding the locations where disease status is recorded. A retrospective case-control study using canine leptospirosis data was conducted to verify the effects of changing buffer zones (spatial extents) on the risk factors derived. The case-control study included 94 case dogs, predominantly selected based on a positive polymerase chain reaction (PCR) test for leptospires in urine, and 185 control dogs based on negative PCR. Land cover features from the National Land Cover Dataset (NLCD) and the Kansas Gap Analysis Program (KS GAP) around geocoded addresses of cases/controls were extracted using multiple buffers at every 500 m up to 5,000 m, and multivariable logistic models were used to estimate the risk of different land cover variables to dogs. The types and statistical significance of the risk factors identified changed with an increase in spatial extent in both datasets. Leptospirosis status in dogs was significantly associated with developed high-intensity areas in models that used variables extracted from spatial extents of 500-2,000 m, developed medium-intensity areas beyond 2,000 m and up to 3,000 m, and evergreen forests beyond 3,500 m and up to 5,000 m in individual models in the NLCD. Significant associations were seen with urban areas in models that used variables extracted from spatial extents of 500-2,500 m and forest/woodland areas beyond 2,500 m and up to 5,000 m in individual models in the KS GAP datasets. The use of ad hoc spatial extents can be misleading or wrong, and the determination of an appropriate spatial extent is critical when extracting environmental variables for such studies. Potential work-arounds for this problem are discussed.

  18. Estimated cancer risk of dioxins to humans using a bioassay and physiologically based pharmacokinetic model

    International Nuclear Information System (INIS)

    Maruyama, Wakae; Aoki, Yasunobu

    2006-01-01

    The health risk of dioxins and dioxin-like compounds to humans was analyzed quantitatively using experimental data and mathematical models. To quantify the toxicity of a mixture of three dioxin congeners, we calculated the new relative potencies (REPs) for 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), 1,2,3,7,8-pentachlorodibenzo-p-dioxin (PeCDD), and 2,3,4,7,8-pentachlorodibenzofuran (PeCDF), focusing on their tumor promotion activity. We applied a liver foci formation assay to female SD rats after repeated oral administration of dioxins. The REP of dioxin for a rat was determined using dioxin concentration and the number of the foci in rat liver. A physiologically based pharmacokinetic model (PBPK model) was used for interspecies extrapolation targeting on dioxin concentration in liver. Toxic dose for human was determined by back-estimation with a human PBPK model, assuming that the same concentration in the target tissue may cause the same level of effect in rats and humans, and the REP for human was determined by the toxic dose obtained. The calculated REPs for TCDD, PeCDD, and PeCDF were 1.0, 0.34, and 0.05 for rats, respectively, and the REPs for humans were almost the same as those for rats. These values were different from the toxic equivalency factors (TEFs) presented previously (Van den Berg, M., Birnbaum, L., Bosveld, A.T.C., Brunstrom, B., Cook, P., Feeley, M., Giesy, J.P., Hanberg, A., Hasegawa, R., Kennedy, S.W., Kubiak, T., Larsen, J.C., Rolaf van Leeuwen, F.X., Liem, A.K.D., Nolt, C., Peterson, R.E., Poellinger, L., Safe, S., Schrenk, D., Tillitt, D., Tysklind, M., Younes, M., Waern, F., Zacharewski, T., 1998. Toxic equivalency factors (TEFs) for PCBs, PCDDs, PCDFs for humans and wildlife. Environ. Health Perspect. 106, 775-792). The relative risk of excess liver cancer for Japanese people in general was 1.7-6.5 x 10⁻⁷ by TCDD only, and 2.9-11 x 10⁻⁷ by the three dioxins at the present level of contamination

  19. Rent pricing decision support mathematical model for finance leases under effective risks

    Directory of Open Access Journals (Sweden)

    Rabbani Masoud

    2015-01-01

    Full Text Available Nowadays, leasing has become an increasingly important and popular method for equipment acquisition. But because of rent pricing difficulties and risks that affect the lessor's and lessee's decision making, many people still tend to buy equipment instead of leasing it. In this paper we explore how risk can affect a rent pricing decision support mathematical model for leasing. For this purpose, we consider three types of risk: credit risk, transaction risk and risk-based pricing. In particular, our focus is on how to make decisions about rent pricing in a leasing problem with different customers, various quality levels and different pricing methods. Finally, the mathematical model is solved by a genetic algorithm, a search heuristic for optimizing the problem. The algorithm was coded in MATLAB® R2012a to provide the best set of results.

  20. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

    Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But, as evident from the literature, a number of risks, such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk and organizational structural risk, adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interaction among the various risks which affect the performance of offshore outsourcing. To this end, the authors have identified various risks through an extensive review of the literature. From this information, an integrated model of the risks affecting offshore outsourcing is developed using interpretive structural modelling (ISM), and the structural relationships between these risks are modelled. Further, MICMAC analysis is carried out to analyze the driving power and dependency of the risks, which should help managers identify and classify important criteria and reveal the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.

  1. Risk Route Choice Analysis and the Equilibrium Model under Anticipated Regret Theory

    Directory of Open Access Journals (Sweden)

    Pengcheng Yuan

    2014-02-01

    Full Text Available The assumption about travellers' route choice behaviour has a major influence on traffic flow equilibrium analysis. Previous studies of travellers' route choice were mainly based on expected utility maximization theory. However, with gradually increasing knowledge about the uncertainty of the transportation system, researchers have realized that expected utility maximization theory is quite constrained, because it requires travellers to be 'absolutely rational'; in fact, travellers are not truly 'absolutely rational'. The anticipated regret theory proposes an alternative framework to traditional risk-taking in route choice behaviour which might be more scientific and reasonable. We have applied anticipated regret theory to the analysis of the risky route choice process and constructed an anticipated regret utility function. Using a simple case with two parallel routes, we analyze how route choice is influenced by the degree of risk aversion, the degree of regret and the degree of environmental risk. Moreover, a user equilibrium model based on anticipated regret theory has been established. The equivalence and the uniqueness of the model are proved, and an efficacious algorithm is proposed to solve the model. Both the model and the algorithm are demonstrated on a real network. In an experiment, the model results were compared with real data. It was found that the model results can be similar to the real data if a proper regret degree parameter is selected. This illustrates that the model can better explain risky route choice behaviour. Moreover, it was also found that the traveller's regret degree increases when the environment becomes more and more risky.
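
    A minimal sketch of an anticipated-regret route utility of the kind described above: under each travel-time scenario the chosen route's utility is penalized by its regret relative to the best alternative, and the route with the highest expected regret-adjusted utility is chosen (the numbers and functional form are illustrative, not the paper's specification).

      # Anticipated-regret choice between two parallel routes (illustrative numbers).
      import numpy as np

      scenarios = np.array([[20.0, 30.0],     # travel times (min): scenario 1, [route A, route B]
                            [40.0, 32.0]])    # scenario 2
      probs = np.array([0.6, 0.4])            # scenario probabilities
      regret_weight = 1.5                     # regret degree parameter

      utility = -scenarios                    # shorter travel time = higher utility
      best = utility.max(axis=1, keepdims=True)
      regret_adjusted = utility - regret_weight * (best - utility)
      expected = probs @ regret_adjusted
      print(expected, "-> choose route", "A" if expected[0] > expected[1] else "B")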

  2. Risk Assessment of Nautical Navigational Environment Based on Grey Fixed Weight Cluster

    Directory of Open Access Journals (Sweden)

    Yanfei Tian

    2017-06-01

    Full Text Available In order to set up a mathematical model suitable for nautical navigational environment risk evaluation and to systematically and quantitatively characterize the navigational environment risk of the Qiongzhou Strait, a risk assessment model, with its approach steps, is set up based on grey fixed weight clustering (GFWC). The evaluation index system is structured scientifically through both literature review and expert investigation. The relative weight of each index is obtained via the fuzzy analytic hierarchy process (FAHP); the index membership degree of every grey class is obtained by fuzzy statistics (FS) to avoid the difficulty of building whitenization weight functions. Using the model, the nautical navigational environment risk of the Qiongzhou Strait is determined to be at a 'moderate' level according to the principle of maximum membership degree. The comprehensive risk evaluation of the Qiongzhou Strait nautical navigational environment can provide a theoretical reference for implementing targeted risk control measures. It shows that the constructed GFWC risk assessment model, as well as the presented steps, is workable in the case of incomplete information. The proposed strategy can mathematically exploit the collected experts' knowledge, quantify the weight of each index and risk level, and finally lead to a comprehensive risk evaluation result. Besides, the adoption of probability and statistics theory and fuzzy theory, aimed at resolving bottlenecks under uncertainty, gives the model better adaptability and executability.

  3. Risk factors and a prediction model for lower limb lymphedema following lymphadenectomy in gynecologic cancer: a hospital-based retrospective cohort study.

    Science.gov (United States)

    Kuroda, Kenji; Yamamoto, Yasuhiro; Yanagisawa, Manami; Kawata, Akira; Akiba, Naoya; Suzuki, Kensuke; Naritaka, Kazutoshi

    2017-07-25

    Lower limb lymphedema (LLL) is a chronic and incapacitating condition afflicting patients who undergo lymphadenectomy for gynecologic cancer. This study aimed to identify risk factors for LLL and to develop a prediction model for its occurrence. Pelvic lymphadenectomy (PLA) with or without para-aortic lymphadenectomy (PALA) was performed on 366 patients with gynecologic malignancies at Yaizu City Hospital between April 2002 and July 2014; we retrospectively analyzed 264 eligible patients. The intervals between surgery and diagnosis of LLL were calculated; the prevalence and risk factors were evaluated using the Kaplan-Meier and Cox proportional hazards methods. We developed a prediction model with which patients were scored and classified as low-risk or high-risk. The cumulative incidence of LLL was 23.1% at 1 year, 32.8% at 3 years, and 47.7% at 10 years post-surgery. LLL developed after a median 13.5 months. Using regression analysis, body mass index (BMI) ≥25 kg/m² (hazard ratio [HR], 1.616; 95% confidence interval [CI], 1.030-2.535), PLA + PALA (HR, 2.323; 95% CI, 1.126-4.794), postoperative radiation therapy (HR, 2.469; 95% CI, 1.148-5.310), and lymphocyst formation (HR, 1.718; 95% CI, 1.120-2.635) were found to be independently associated with LLL; age, type of cancer, number of lymph nodes, retroperitoneal suture, chemotherapy, lymph node metastasis, herbal medicine, self-management education, or infection were not associated with LLL. The predictive score was based on the 4 associated variables; patients were classified as high-risk (scores 3-6) and low-risk (scores 0-2). LLL incidence was significantly greater in the high-risk group than in the low-risk group (HR, 2.19; 95% CI, 1.440-3.324). The cumulative incidence at 5 years was 52.1% [95% CI, 42.9-62.1%] for the high-risk group and 28.9% [95% CI, 21.1-38.7%] for the low-risk group. The area under the receiver operator characteristics curve for the prediction model was 0.631 at 1 year, 0

  4. Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions

    Science.gov (United States)

    Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.

    2016-02-01

    The technological evolution in terms of computational capacity, data acquisition systems, numerical modelling and operational oceanography is supplying opportunities for designing and building holistic approaches and complex tools for newer and more efficient management (planning, prevention and response) of coastal water pollution risk events. A combined methodology to dynamically estimate time and space variable individual vessel accident risk levels and shoreline contamination risk from ships has been developed, integrating numerical metocean forecasts and oil spill simulations with vessel tracking automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill occurring from a vessel navigating in a study area - the Portuguese continental shelf - with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching shoreline and its environmental and socio-economic vulnerabilities. The oil reaching shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels along time, or as an alternative, a correction factor based on vessel distance from coast. Shoreline risks can be computed in real time or from previously obtained data. Results show the ability of the proposed methodology to estimate the risk properly sensitive to dynamic metocean conditions and to oil transport behaviour. The integration of meteo-oceanic + oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances the maritime situational awareness and the decision support model, providing a more realistic approach in the assessment of shoreline impacts. The risk assessment from historical data can help finding typical risk patterns ("hot spots") or developing sensitivity analysis to specific conditions, whereas real

  5. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    Science.gov (United States)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
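
    The central RBI quantities named in the record are probability of failure (PoF), consequence of failure (CoF), their product as risk, and the resulting inspection interval. The sketch below shows that arithmetic with a toy thinning-based PoF curve; it is not the DNV/API methodology itself, and all parameters are illustrative assumptions.

```python
import math

def probability_of_failure(years_since_inspection: float,
                           corrosion_rate_mm_per_yr: float,
                           remaining_thickness_mm: float,
                           weibull_shape: float = 2.0) -> float:
    """Toy PoF model: failure becomes likely as projected wall loss approaches
    the remaining thickness. Real RBI damage models are far more detailed."""
    characteristic_life_yr = remaining_thickness_mm / corrosion_rate_mm_per_yr
    return 1.0 - math.exp(-((years_since_inspection / characteristic_life_yr)
                            ** weibull_shape))

def next_inspection_interval(cof_usd: float,
                             risk_target_usd: float,
                             corrosion_rate_mm_per_yr: float,
                             remaining_thickness_mm: float) -> float:
    """Return the largest interval (years) for which PoF x CoF stays below
    the cumulative risk target, searched in half-year steps up to 30 years."""
    t = 0.5
    while t < 30.0:
        risk = probability_of_failure(t, corrosion_rate_mm_per_yr,
                                      remaining_thickness_mm) * cof_usd
        if risk > risk_target_usd:
            return max(t - 0.5, 0.5)
        t += 0.5
    return 30.0

if __name__ == "__main__":
    interval = next_inspection_interval(cof_usd=2_000_000,
                                        risk_target_usd=10_000,
                                        corrosion_rate_mm_per_yr=0.2,
                                        remaining_thickness_mm=6.0)
    print(f"suggested inspection interval: {interval:.1f} years")
```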

  6. A new method to quantify the health risks from sources of perfluoroalkyl substances, combined with positive matrix factorization and risk assessment models.

    Science.gov (United States)

    Xu, Jiao; Shi, Guo-Liang; Guo, Chang-Sheng; Wang, Hai-Ting; Tian, Ying-Ze; Huangfu, Yan-Qi; Zhang, Yuan; Feng, Yin-Chang; Xu, Jian

    2018-01-01

    A hybrid model based on the positive matrix factorization (PMF) model and the health risk assessment model for assessing risks associated with sources of perfluoroalkyl substances (PFASs) in water was established and applied at Dianchi Lake to test its applicability. The new method contains 2 stages: 1) the sources of PFASs were apportioned by the PMF model and 2) the contribution of health risks from each source was calculated by the new hybrid model. Two factors were extracted by PMF, with factor 1 identified as an aqueous fire-fighting foam source and factor 2 as a fluoropolymer manufacturing and processing and perfluorooctanoic acid production source. The health risk of PFASs in the water assessed by the health risk assessment model was 9.54 × 10⁻⁷ a⁻¹ on average, showing no obvious adverse effects to human health. The 2 sources' risks estimated by the new hybrid model ranged from 2.95 × 10⁻¹⁰ to 6.60 × 10⁻⁶ a⁻¹ and from 1.64 × 10⁻⁷ to 1.62 × 10⁻⁶ a⁻¹, respectively. The new hybrid model can provide useful information on the health risks of PFAS sources, which is helpful for pollution control and environmental management. Environ Toxicol Chem 2018;37:107-115. © 2017 SETAC.
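
    The two-stage structure (PMF source apportionment followed by per-source risk calculation) can be illustrated with a short sketch. All numbers below (source contribution matrix, intake parameters, slope factors, 70-year exposure duration) are placeholders standing in for the study's PMF outputs and risk-model inputs.

```python
import numpy as np

# Sketch of the two-stage idea: (1) PMF-style source contributions to measured
# PFAS concentrations, (2) health risk apportioned to each source.
# All values are illustrative placeholders, not results from the cited study.

species = ["PFOA", "PFOS", "PFHxS"]   # columns of the contribution matrix

# Stage 1 output (normally estimated by a PMF model): concentration (ng/L)
# contributed by each source to each species.
source_contributions = np.array([
    [3.0, 1.0, 0.5],   # source 1, e.g. fire-fighting foams
    [5.0, 0.5, 0.2],   # source 2, e.g. fluoropolymer manufacturing
])

def annual_risk(concentration_ng_per_l: np.ndarray,
                intake_l_per_day: float = 2.0,
                body_weight_kg: float = 60.0,
                slope_factor_per_mg_kg_day: np.ndarray = np.array([1e-3, 5e-4, 2e-4])
                ) -> float:
    """Toy linear risk model: chronic daily intake times a slope factor, summed
    over species, divided by an assumed 70-year duration to express per year."""
    cdi_mg_kg_day = concentration_ng_per_l * 1e-6 * intake_l_per_day / body_weight_kg
    return float(np.sum(cdi_mg_kg_day * slope_factor_per_mg_kg_day)) / 70.0

if __name__ == "__main__":
    for i, contrib in enumerate(source_contributions, start=1):
        print(f"source {i}: risk ~ {annual_risk(contrib):.2e} per year")
    print(f"total:    risk ~ {annual_risk(source_contributions.sum(axis=0)):.2e} per year")
```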

  7. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    Science.gov (United States)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  8. AN EXTENDED REINFORCEMENT LEARNING MODEL OF BASAL GANGLIA TO UNDERSTAND THE CONTRIBUTIONS OF SEROTONIN AND DOPAMINE IN RISK-BASED DECISION MAKING, REWARD PREDICTION, AND PUNISHMENT LEARNING

    Directory of Open Access Journals (Sweden)

    Pragathi Priyadharsini Balasubramani

    2014-04-01

    Full Text Available Although empirical and neural studies show that serotonin (5HT) plays many functional roles in the brain, prior computational models mostly focus on its role in behavioral inhibition. In this study, we present a model of risk-based decision making in a modified Reinforcement Learning (RL) framework. The model depicts the roles of dopamine (DA) and serotonin (5HT) in the Basal Ganglia (BG). In this model, the DA signal is represented by the temporal difference error (δ), while the 5HT signal is represented by a parameter (α) that controls risk prediction error. This formulation, which accommodates both 5HT and DA, reconciles some of the diverse roles of 5HT, particularly in connection with the BG system. We apply the model to different experimental paradigms used to study the role of 5HT: (1) risk-sensitive decision making, where 5HT controls risk assessment; (2) temporal reward prediction, where 5HT controls the time-scale of reward prediction; and (3) reward/punishment sensitivity, in which the punishment prediction error depends on 5HT levels. Thus the proposed integrated RL model reconciles several existing theories of 5HT and DA in the BG.
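
    A generic risk-sensitive temporal-difference learner conveys the division of labor the record describes: a value prediction error (the "DA" signal, δ) and a risk-sensitivity weight (the "5HT" signal, α). The sketch below uses a simple two-option bandit and a common mean-minus-α-times-standard-deviation utility; the published model's exact update equations and network architecture may differ.

```python
import random

# Generic risk-sensitive TD-style learning sketch: the value error (delta) plays
# the role of the DA signal, and alpha weights a learned risk (variance) term,
# playing the role of the 5HT signal. Illustrative only.

def run(alpha_risk: float, episodes: int = 5000, lr: float = 0.1, seed: int = 0):
    rng = random.Random(seed)
    # Two one-armed bandits: "safe" always pays 0.5, "risky" pays 0 or 1.
    value = {"safe": 0.0, "risky": 0.0}
    variance = {"safe": 0.0, "risky": 0.0}

    def utility(action: str) -> float:
        # Risk-sensitive utility: expected value minus alpha times risk (std).
        return value[action] - alpha_risk * variance[action] ** 0.5

    choices = {"safe": 0, "risky": 0}
    for _ in range(episodes):
        # Epsilon-greedy choice on the risk-sensitive utility.
        a = max(value, key=utility) if rng.random() > 0.1 else rng.choice(list(value))
        r = 0.5 if a == "safe" else float(rng.random() < 0.5)
        delta = r - value[a]                              # value prediction error
        value[a] += lr * delta
        variance[a] += lr * (delta ** 2 - variance[a])    # risk prediction error
        choices[a] += 1
    return choices

if __name__ == "__main__":
    print("risk-averse  (alpha=1.0):", run(1.0))
    print("risk-neutral (alpha=0.0):", run(0.0))
```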

  9. Use of Physiologically Based Pharmacokinetic (PBPK) Models ...

    Science.gov (United States)

    EPA announced the availability of the final report, Use of Physiologically Based Pharmacokinetic (PBPK) Models to Quantify the Impact of Human Age and Interindividual Differences in Physiology and Biochemistry Pertinent to Risk (Final Report for Cooperative Agreement). This report describes and demonstrates techniques necessary to extrapolate and incorporate in vitro derived metabolic rate constants in PBPK models. It also includes two case study examples designed to demonstrate the applicability of such data for health risk assessment, and addresses the quantification, extrapolation and interpretation of advanced biochemical information on human interindividual variability of chemical metabolism for risk assessment application. It comprises five chapters; topics and results covered in the first four chapters have been published in the peer-reviewed scientific literature. Topics covered include: data quality objectives; the experimental framework; required data; and two example case studies that develop and incorporate in vitro metabolic rate constants in PBPK models designed to quantify human interindividual variability, to better direct the choice of uncertainty factors for health risk assessment. This report is intended to serve as a reference document for risk assessors to use when quantifying, extrapolating, and interpreting advanced biochemical information about human interindividual variability of chemical metabolism.

  10. Risk Modelling for Passages in Approach Channel

    Directory of Open Access Journals (Sweden)

    Leszek Smolarek

    2013-01-01

    Full Text Available Methods of multivariate statistics, stochastic processes, and simulation are used to identify and assess risk measures. This paper presents the use of generalized linear models and Markov models to study risks to ships along the approach channel. These models, combined with simulation testing, are used to determine the time required for continuous monitoring of endangered objects or the period at which the level of risk should be verified.

  11. Model-based approach for quantitative estimates of skin, heart, and lung toxicity risk for left-side photon and proton irradiation after breast-conserving surgery.

    Science.gov (United States)

    Tommasino, Francesco; Durante, Marco; D'Avino, Vittoria; Liuzzi, Raffaele; Conson, Manuel; Farace, Paolo; Palma, Giuseppe; Schwarz, Marco; Cella, Laura; Pacelli, Roberto

    2017-05-01

    Proton beam therapy represents a promising modality for left-side breast cancer (BC) treatment, but concerns have been raised about skin toxicity and poor cosmesis. The aim of this study is to apply a skin normal tissue complication probability (NTCP) model to intensity-modulated proton therapy (IMPT) optimization in left-side BC. Ten left-side BC patients undergoing photon irradiation after breast-conserving surgery were randomly selected from our clinical database. Intensity-modulated photon (IMRT) and IMPT plans were calculated with iso-tumor-coverage criteria and according to RTOG 1005 guidelines. Proton plans were computed with and without skin optimization. Published NTCP models were employed to estimate the risk of different toxicity endpoints for the skin, lung, heart and its substructures. Acute skin NTCP evaluation suggests a lower toxicity level with IMPT compared to IMRT when the skin is included in the proton optimization strategy (0.1% versus 1.7%, p < 0.001). Dosimetric results show that, with the same level of tumor coverage, IMPT attains significant heart and lung dose sparing compared with IMRT. By NTCP model-based analysis, an overall reduction in the cardiopulmonary toxicity risk prediction can be observed for all IMPT plans compared to IMRT plans: the relative risk reduction from protons varies between 0.1 and 0.7 depending on the considered toxicity endpoint. Our analysis suggests that IMPT might be safely applied without increasing the risk of severe acute radiation-induced skin toxicity. The quantitative risk estimates also support the potential clinical benefits of IMPT for left-side BC irradiation due to the lower risk of cardiac and pulmonary morbidity. The applied approach might be relevant in the long term for the setup of cost-effectiveness evaluation strategies based on NTCP predictions.
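
    The record does not state which published NTCP models were applied, so the sketch below uses one widely cited formulation, the Lyman-Kutcher-Burman (LKB) model with a generalized equivalent uniform dose, purely to illustrate how a dose-volume histogram is turned into a complication-probability estimate. The DVH and the LKB parameters in the example are illustrative, not values from the study.

```python
import math

def geud(doses_gy, volumes, n: float) -> float:
    """Generalized equivalent uniform dose for a differential DVH given as
    parallel lists of dose bins (Gy) and relative volumes."""
    total = sum(volumes)
    return sum(v / total * d ** (1.0 / n) for d, v in zip(doses_gy, volumes)) ** n

def lkb_ntcp(doses_gy, volumes, td50_gy: float, m: float, n: float) -> float:
    """Lyman-Kutcher-Burman NTCP: probit of (gEUD - TD50) / (m * TD50)."""
    t = (geud(doses_gy, volumes, n) - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

if __name__ == "__main__":
    # Illustrative differential DVH and illustrative LKB parameters; real
    # parameter sets are organ- and endpoint-specific.
    dvh_doses = [5, 15, 25, 35, 45]          # Gy
    dvh_vols = [0.40, 0.25, 0.15, 0.12, 0.08]  # relative volumes
    print(f"NTCP = {lkb_ntcp(dvh_doses, dvh_vols, td50_gy=30.8, m=0.35, n=0.69):.3f}")
```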

  12. Advanced uncertainty modelling for container port risk analysis.

    Science.gov (United States)

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of a CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those used in traditional port risk analysis, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Why we need new approaches to low-dose risk modeling

    International Nuclear Information System (INIS)

    Alvarez, J.L.; Seiler, F.A.

    1996-01-01

    The linear no-threshold model for radiation effects was introduced as a conservative model for the design of radiation protection programs. The model has persisted not only as the basis for such programs, but has come to be treated as a dogma and is often confused with scientific fact. In this examination, a number of serious problems with the linear no-threshold model of radiation carcinogenesis were demonstrated, many of them invalidating the hypothesis. It was shown that the relative risk formalism did not approach 1 as the dose approached zero. When mortality ratios were used instead, the data in the region below 0.3 Sv were systematically below the predictions of the linear model. It was also shown that the data above 0.3 Sv were of little use in formulating a model at low doses. In addition, these data are valid only for doses accumulated at high dose rates, and there is no scientific justification for using the model in low-dose, low-dose-rate extrapolations for purposes of radiation protection. Further examination of model fits to the Japanese survivor data was attempted. Several such models were fit to the data, including an unconstrained linear, a linear-square-root, and a Weibull model, all of which fit the data better than the relative risk, linear no-threshold model. These fits were used to demonstrate that the linear model systematically overestimates the risk at low doses in the Japanese survivor data set. It is recommended here that an unbiased re-analysis of the data be undertaken and the results used to construct a new model, based on all pertinent data. This model could then form the basis for managing radiation risks in the appropriate regions of dose and dose rate.
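
    The comparison described above amounts to fitting alternative excess-risk shapes to the same dose-response data and comparing their fit at low doses. The sketch below shows that workflow for a linear no-threshold, a linear-square-root, and a Weibull form; the dose and response arrays are placeholders for illustration and are not the Japanese survivor data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Three candidate excess-risk dose-response shapes of the kind compared in the
# record. The dose/response arrays are illustrative placeholders only.

def linear_nt(d, b):              # linear no-threshold
    return b * d

def linear_sqrt(d, b1, b2):       # linear plus square-root term
    return b1 * d + b2 * np.sqrt(d)

def weibull(d, a, k):             # Weibull-shaped response
    return 1.0 - np.exp(-a * d ** k)

if __name__ == "__main__":
    dose_sv = np.array([0.05, 0.1, 0.3, 0.5, 1.0, 1.5, 2.0])
    excess_risk = np.array([0.002, 0.004, 0.02, 0.04, 0.09, 0.14, 0.19])  # placeholder

    for name, model, p0 in [("linear no-threshold", linear_nt, [0.1]),
                            ("linear-square-root", linear_sqrt, [0.1, 0.01]),
                            ("Weibull", weibull, [0.1, 1.0])]:
        params, _ = curve_fit(model, dose_sv, excess_risk, p0=p0, maxfev=10000)
        rss = float(np.sum((model(dose_sv, *params) - excess_risk) ** 2))
        print(f"{name:22s} params={np.round(params, 4)} RSS={rss:.2e}")
```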

  14. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
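
    The AOT part of such an evaluation rests on simple incremental-risk arithmetic: the core damage frequency with the component out of service, minus the baseline, integrated over the outage duration and frequency. The sketch below shows that calculation with assumed illustrative frequencies; it is not the SOCRATES code or its detailed component unavailability model.

```python
def aot_risk_increase(cdf_baseline_per_yr: float,
                      cdf_component_down_per_yr: float,
                      outage_duration_hr: float,
                      outage_frequency_per_yr: float) -> dict:
    """Standard incremental-risk arithmetic behind allowed-outage-time (AOT)
    evaluations: the single-outage increment and the yearly average increase.
    All input frequencies are assumed, illustrative values."""
    duration_yr = outage_duration_hr / 8760.0
    delta_cdf = cdf_component_down_per_yr - cdf_baseline_per_yr
    single_outage_increment = delta_cdf * duration_yr
    yearly_average_increase = single_outage_increment * outage_frequency_per_yr
    return {"single_outage_increment": single_outage_increment,
            "yearly_average_increase": yearly_average_increase}

if __name__ == "__main__":
    result = aot_risk_increase(cdf_baseline_per_yr=5e-5,
                               cdf_component_down_per_yr=2e-4,
                               outage_duration_hr=72.0,
                               outage_frequency_per_yr=1.5)
    for name, value in result.items():
        print(f"{name}: {value:.2e}")
```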

  15. Dynamic building risk assessment theoretic model for rainstorm-flood utilization ABM and ABS

    Science.gov (United States)

    Lai, Wenze; Li, Wenbo; Wang, Hailei; Huang, Yingliang; Wu, Xuelian; Sun, Bingyun

    2015-12-01

    Floods are among the natural disasters causing the greatest losses worldwide, so flood disaster risk must be assessed in order to reduce those losses. Practical disaster management requires dynamic risk results at the building level. The rainstorm-flood disaster system is a typical complex system: from the viewpoint of complex system theory, flood disaster risk results from the interaction of hazard-affected objects, rainstorm-flood hazard factors, and hazard environments. Agent-based modeling (ABM) is an important tool for complex system modeling. This paper proposes a rainstorm-flood building risk dynamic assessment method (RFBRDAM) using ABM. The interior structures and procedures of the different agents in the proposed method were designed. The proposed method was implemented on the NetLogo platform using agent-based simulation (ABS) to assess changes in building risk during a rainstorm-flood disaster in the Huaihe River Basin. The results indicated that the proposed method can dynamically assess building risk over the whole course of a rainstorm-flood disaster, providing a new approach for dynamic building risk assessment and flood disaster management.
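
    The agent-based idea can be illustrated with a minimal loop in which building agents update their risk as a simulated water level evolves, as in the sketch below. The water-level dynamics, depth-damage curve, and agent attributes are illustrative placeholders, not the authors' NetLogo implementation.

```python
import random

# Minimal agent-based sketch of dynamic building risk under a rainstorm flood:
# each building agent updates its risk as the simulated water level evolves.

class BuildingAgent:
    def __init__(self, name: str, elevation_m: float, vulnerability: float):
        self.name = name
        self.elevation_m = elevation_m
        self.vulnerability = vulnerability   # 0..1, structural susceptibility
        self.risk = 0.0

    def step(self, water_level_m: float) -> None:
        depth = max(water_level_m - self.elevation_m, 0.0)
        damage_fraction = min(depth / 3.0, 1.0)   # toy depth-damage curve
        self.risk = max(self.risk, self.vulnerability * damage_fraction)

def simulate(hours: int = 48, seed: int = 0):
    rng = random.Random(seed)
    buildings = [BuildingAgent(f"bldg-{i}",
                               elevation_m=rng.uniform(0.0, 2.0),
                               vulnerability=rng.uniform(0.3, 0.9))
                 for i in range(5)]
    water_level = 0.0
    for _ in range(hours):
        rainfall = max(rng.gauss(0.05, 0.05), 0.0)             # toy inflow (m/h)
        water_level = max(water_level + rainfall - 0.02, 0.0)  # drainage 0.02 m/h
        for b in buildings:
            b.step(water_level)
    return buildings

if __name__ == "__main__":
    for b in simulate():
        print(f"{b.name}: peak risk {b.risk:.2f}")
```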

  16. Conceptual Framework for Trait-Based Ecological Risk Assessment for Wildlife Populations Exposed to Pesticides

    Science.gov (United States)

    Between screening level risk assessments and complex ecological models, a need exists for practical identification of risk based on general information about species, chemicals, and exposure scenarios. Several studies have identified demographic, biological, and toxicological fa...

  17. Calibration plots for risk prediction models in the presence of competing risks

    DEFF Research Database (Denmark)

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-01-01

    A predicted risk of 17% can be called reliable if it can be expected that the event will occur in about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks. … Several obstacles make it difficult to judge whether such a risk prediction model is well calibrated: the first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with these problems, we propose to estimate calibration curves…
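
    What a calibration curve compares is the mean predicted risk within a group of subjects against the observed event frequency in that group. The sketch below shows only the naive, uncensored version of that comparison on synthetic data; it deliberately ignores right censoring and competing risks, which are exactly the problems the proposed method is designed to handle.

```python
import numpy as np

def naive_calibration(predicted_risk: np.ndarray,
                      event_observed: np.ndarray,
                      n_groups: int = 10):
    """Group subjects by predicted risk and compare the mean predicted risk with
    the observed event proportion per group. This naive version ignores right
    censoring and competing risks, the very problems the cited paper addresses."""
    order = np.argsort(predicted_risk)
    groups = np.array_split(order, n_groups)
    return [(float(predicted_risk[g].mean()), float(event_observed[g].mean()))
            for g in groups]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pred = rng.uniform(0.05, 0.60, size=1000)   # illustrative predicted risks
    events = rng.binomial(1, pred)              # synthetic outcomes
    for predicted, observed in naive_calibration(pred, events):
        print(f"predicted {predicted:.2f}  observed {observed:.2f}")
```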

  18. Cyclical patterns in risk indicators based on financial market infrastructure transaction data

    NARCIS (Netherlands)

    Timmermans, M.; Heijmans, R.; Daniels, Hennie

    2017-01-01

    This paper studies cyclical patterns in risk indicators based on TARGET2 transaction data. These indicators provide information on network properties, operational aspects and links to ancillary systems. We compare the performance of two different ARIMA dummy models to the TBATS state space model.
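
    One common way to build an ARIMA model with calendar-dummy regressors, as compared against TBATS in the record, is shown below using statsmodels. The indicator series and dummy definitions are placeholders (real TARGET2 transaction data are confidential), and the TBATS comparison model would come from a separate package.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Placeholder daily risk-indicator series with a weekly cycle, standing in for a
# TARGET2-style operational or network indicator.
rng = np.random.default_rng(1)
idx = pd.date_range("2015-01-01", periods=400, freq="D")
weekday_effect = np.where(idx.dayofweek < 5, 1.0, 0.3)
y = pd.Series(weekday_effect + 0.1 * rng.standard_normal(len(idx)), index=idx)

# Calendar dummies as exogenous regressors (day-of-week; month-end dummies
# could be added the same way).
dummies = pd.get_dummies(idx.dayofweek, prefix="dow", drop_first=True).set_index(idx)

model = ARIMA(y, exog=dummies.astype(float), order=(1, 0, 1))
fit = model.fit()
print(fit.summary().tables[1])
```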

  19. Multi-attribute risk assessment for risk ranking of natural gas pipelines

    International Nuclear Information System (INIS)

    Brito, A.J.; Almeida, A.T. de

    2009-01-01

    The paper presents a decision model for risk assessment and for risk ranking of sections of natural gas pipelines based on multi-attribute utility theory. Pipeline hazard scenarios are surveyed and the reasons for a risk assessment model based on a multi-attribute approach are presented. Three dimensions of impact and the need to translate decision-makers' preferences into risk management decisions are highlighted. The model approaches these factors by using a multi-attribute utility function, in order to produce multi-dimensional risk measurements. By using decision analysis concepts, this model quantitatively incorporates the decision-maker's preferences and behavior regarding risk within clear and consistent risk measurements. In order to support the prioritizing of critical sections of pipeline in natural gas companies, this multi-attribute model also allows sections of pipeline to be ranked into a risk hierarchy. A numerical application based on a real case study was undertaken so that the effectiveness of the decision model could be verified
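
    The additive multi-attribute utility idea can be sketched briefly: each impact dimension gets a single-attribute utility, the dimensions are combined with scale constants reflecting the decision-maker's preferences, and sections are ranked by probability-weighted (dis)utility. The utility shapes, scale constants, and section data below are illustrative assumptions, not the elicited values from the case study.

```python
import math

# Illustrative scale constants (decision-maker weights) for three impact dimensions.
K = {"human": 0.5, "environmental": 0.3, "financial": 0.2}

def single_attribute_utility(consequence: float, worst: float) -> float:
    """Exponential (risk-averse) single-attribute utility: consequence 0 maps to
    utility 1 and the worst consequence maps to utility 0."""
    rho = worst / 3.0
    return ((math.exp(-consequence / rho) - math.exp(-worst / rho))
            / (1.0 - math.exp(-worst / rho)))

def section_risk(p_scenario: float, consequences: dict, worst: dict) -> float:
    """Multi-attribute risk: scenario probability times (1 - additive utility)."""
    u = sum(K[dim] * single_attribute_utility(consequences[dim], worst[dim])
            for dim in K)
    return p_scenario * (1.0 - u)

if __name__ == "__main__":
    worst = {"human": 10.0, "environmental": 100.0, "financial": 5e6}
    sections = {
        "section A": (1e-3, {"human": 2.0, "environmental": 40.0, "financial": 1e6}),
        "section B": (5e-4, {"human": 6.0, "environmental": 10.0, "financial": 3e6}),
    }
    ranked = sorted(sections.items(),
                    key=lambda kv: section_risk(kv[1][0], kv[1][1], worst),
                    reverse=True)
    for name, (p, cons) in ranked:
        print(f"{name}: risk score {section_risk(p, cons, worst):.2e}")
```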

  20. Prostate Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.