WorldWideScience

Sample records for event-based risk model

  1. Development of a GCR Event-based Risk Model

    Science.gov (United States)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress, or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte Carlo model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparison to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions, often of high multiplicity. Conventional space radiation risk assessment employs average quantities and assumes linearity and additivity of responses over the complete range of GCR charges and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  2. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    Science.gov (United States)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients undergoing cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. The prior art of transport codes calculates the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as bystander signaling. These effects are ignored in, or impossible to treat with, the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type (kinetic energy, mass, and charge number) and a given absorbed dose or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of hits for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic
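
    One biophysical quantity named above, the Poisson distribution of hits for a specified cellular area, is simple enough to illustrate directly. The sketch below is a generic calculation under an assumed fluence and an assumed nucleus area; it is not GERM code output.

        # Illustration: Poisson-distributed ion hits per cell nucleus for a
        # uniform beam. Fluence and area values are assumptions, not NSRL data.
        from scipy.stats import poisson

        fluence = 1.0e7                 # particles per cm^2 (assumed)
        area = 100e-8                   # nucleus area: 100 um^2 in cm^2 (assumed)
        mean_hits = fluence * area      # expected hits per nucleus (= 10 here)

        for k in range(5):
            print(f"P({k} hits) = {poisson.pmf(k, mean_hits):.4f}")
        print(f"P(>=1 hit) = {1 - poisson.pmf(0, mean_hits):.4f}")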

  3. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related activities inside and outside an IT-system. We use event-activity diagrams to model activity. Such diagrams support the modeling of activity flow, object flow, shared events, triggering events, and interrupting events.

  4. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A), and kinetic energy (E) of an ion. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from the primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their

  5. Overview of the Graphical User Interface for the GERMcode (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERMcode calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear energy transfer (LET), range (R), and absorption in tissue-equivalent material for a given charge (Z), mass number (A), and kinetic energy (E) of an ion. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERMcode also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from the primary ion and nuclear secondaries are evaluated. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERMcode for application to thick target experiments. The GERMcode provides scientists participating in NSRL experiments with the data needed for the interpretation of their

  6. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  7. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...
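
    To make the notion of events modeled "in terms of information structures" concrete, here is a small sketch of an event record with participants, consequences, and properties; the field names and example values are our own illustrative choices, not the paper's notation.

        # Illustrative information structure for the general event concept: a
        # short-duration process with participants, consequences, and
        # properties. Field names are illustrative, not the paper's.
        from dataclasses import dataclass, field
        from datetime import datetime

        @dataclass
        class Event:
            name: str
            occurred_at: datetime
            participants: list[str] = field(default_factory=list)
            properties: dict[str, object] = field(default_factory=dict)
            consequences: list[str] = field(default_factory=list)

        order_placed = Event(
            name="OrderPlaced",
            occurred_at=datetime(2009, 1, 5, 14, 30),
            participants=["customer-42", "web-shop"],
            properties={"order_total": 99.5},
            consequences=["order record created", "inventory reserved"],
        )
        print(order_placed.name, order_placed.participants)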

  8. Characterising Seismic Hazard Input for Analysis of Risk to Multi-System Infrastructures: Application to Scenario Event-Based Models and Extension to Probabilistic Risk

    Science.gov (United States)

    Weatherill, G. A.; Silva, V.

    2011-12-01

    The potential human and economic cost of earthquakes to complex urban infrastructures has been demonstrated in the most emphatic manner by recent large earthquakes such as those in Haiti (January 2010), Christchurch (September 2010 and February 2011) and Tohoku (March 2011). Consideration of seismic risk for a homogeneous portfolio, such as a single building typology or infrastructure, or independent analyses of separate typologies or infrastructures, is insufficient to fully characterise the potential impacts that arise from inter-connected system failure. Individual elements of each infrastructure may be adversely affected by different facets of the ground motion (e.g. short-period acceleration, long-period displacement, cumulative energy input, etc.). The accuracy and efficiency of the risk analysis depend on the ability to characterise these multiple features of the ground motion over a spatially distributed portfolio of elements. The modelling challenges raised by this extension to multi-system analysis of risk have been a key focus of the European project "Systemic Seismic Vulnerability and Risk Analysis for Buildings, Lifeline Networks and Infrastructures Safety Gain (SYNER-G)", and are expected to be developed further within the Global Earthquake Model (GEM). Seismic performance of a spatially distributed infrastructure during an earthquake may be assessed by means of Monte Carlo simulation, in order to incorporate the aleatory variability of the ground motion into the network analysis. Methodologies for co-simulating large numbers of spatially cross-correlated ground motion fields are appraised, and their potential impacts on a spatially distributed portfolio of mixed building typologies are assessed using idealised case study scenarios from California and Europe. Potential developments to incorporate correlation and uncertainty in site amplification and geotechnical hazard are also explored. Whilst the initial application of the seismic risk analysis is
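
    A common building block for co-simulating spatially cross-correlated ground-motion fields of the kind appraised here is to draw Gaussian intra-event residuals whose correlation decays exponentially with inter-site distance. The sketch below is a minimal version of that idea with an assumed correlation range; it is not the SYNER-G implementation.

        # Minimal Monte Carlo co-simulation of spatially correlated ground-motion
        # residuals over a site portfolio (assumed exponential correlation model).
        import numpy as np

        rng = np.random.default_rng(42)
        sites = rng.uniform(0.0, 50.0, size=(20, 2))   # site coordinates, km (toy)
        corr_range = 10.0                              # correlation range, km (assumed)

        d = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)
        corr = np.exp(-3.0 * d / corr_range)           # exponential distance decay
        L = np.linalg.cholesky(corr + 1e-10 * np.eye(len(sites)))

        fields = rng.standard_normal((1000, len(sites))) @ L.T
        print(fields.shape)   # 1000 correlated residual fields over 20 sites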

  9. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G.; Khrennikov, A.; Schlosshauer, M.; Weihs, G.

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified
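
    As a concrete (if deliberately simple) illustration of generating detection events one-by-one, the sketch below simulates single particles passing a polarizer with probability cos^2(theta); averaging the individual events recovers Malus's law. This is a toy stand-in for the authors' far more detailed model.

        # Toy event-by-event simulation: each "photon" independently passes a
        # polarizer or not; the ensemble average reproduces Malus's law.
        import numpy as np

        rng = np.random.default_rng(0)
        n_events = 100_000

        for theta in np.linspace(0.0, np.pi / 2, 7):
            passed = rng.random(n_events) < np.cos(theta) ** 2
            print(f"theta={np.degrees(theta):5.1f} deg  "
                  f"simulated={passed.mean():.4f}  cos^2={np.cos(theta) ** 2:.4f}")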

  10. Towards an event-based corpuscular model for optical phenomena

    NARCIS (Netherlands)

    De Raedt, H.; Jin, F.; Michielsen, K.; Roychoudhuri, C.; Khrennikov, A. Y.; Kracklauer, A. F.

    2011-01-01

    We discuss an event-based corpuscular model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory through a series of cause-and-effect processes, starting with the emission and ending with the

  11. Mars Science Laboratory: A Model for Event-Based EPO

    Science.gov (United States)

    Mayo, Louis; Lewis, E.; Cline, T.; Stephenson, B.; Erickson, K.; Ng, C.

    2012-10-01

    The NASA Mars Science Laboratory (MSL) and its Curiosity Rover, a part of NASA's Mars Exploration Program, represent the most ambitious undertaking to date to explore the red planet. MSL/Curiosity was designed primarily to determine whether Mars ever had an environment capable of supporting microbial life. NASA's MSL education program was designed to take advantage of existing, highly successful event-based education programs to communicate Mars science and education themes to worldwide audiences through live webcasts, video interviews with scientists, TV broadcasts, professional development for teachers, and the latest social media frameworks. We report here on the success of the MSL education program and discuss how this methodological framework can be used to enhance other event-based education programs.

  12. Event-based total suspended sediment particle size distribution model

    Science.gov (United States)

    Thompson, Jennifer; Sattar, Ahmed M. A.; Gharabaghi, Bahram; Warner, Richard C.

    2016-05-01

    One of the most challenging modelling tasks in hydrology is prediction of the total suspended sediment particle size distribution (TSS-PSD) in stormwater runoff generated from exposed soil surfaces at active construction sites and surface mining operations. The main objective of this study is to employ gene expression programming (GEP) and artificial neural networks (ANN) to develop a new model with the ability to more accurately predict the TSS-PSD by taking advantage of both event-specific and site-specific factors in the model. To compile the data for this study, laboratory-scale experiments using rainfall simulators were conducted on fourteen different soils to obtain TSS-PSD. These data are supplemented with field data from three construction sites in Ontario, collected over a period of two years, to capture the effect of transport and deposition within the sites. The combined data sets provide a wide range of key, often overlooked, site-specific and storm-event-specific factors. Both parent soil and TSS-PSD in runoff are quantified by fitting each to a lognormal distribution. Compared to existing regression models, the developed model more accurately predicted the TSS-PSD using a more comprehensive list of key model input parameters. Employment of the new model will increase the efficiency of deployment of required best management practices, designed based on TSS-PSD, to minimize potential adverse effects of construction site runoff on aquatic life in the receiving watercourses.
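
    Both the parent soil and the runoff TSS-PSD are quantified in the study by fitting lognormal distributions; one conventional way to do such a fit is sketched below, with synthetic particle diameters standing in for measured data.

        # Sketch: fit a lognormal distribution to particle-size data, as done
        # for the parent soil and runoff TSS-PSD. Diameters here are synthetic.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        diameters_um = rng.lognormal(mean=np.log(20.0), sigma=0.8, size=500)

        sigma, _, scale = stats.lognorm.fit(diameters_um, floc=0)
        print(f"geometric mean diameter ~ {scale:.1f} um")           # exp(mu)
        print(f"geometric standard deviation ~ {np.exp(sigma):.2f}") # exp(sigma)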

  13. Personalized Event-Based Surveillance and Alerting Support for the Assessment of Risk

    OpenAIRE

    Stewart, Avaré; Lage, Ricardo; Diaz-Aviles, Ernesto; Dolog, Peter

    2011-01-01

    In a typical Event-Based Surveillance setting, a stream of web documents is continuously monitored for disease reporting. A structured representation of the disease reporting events is extracted from the raw text, and the events are then aggregated to produce signals, which are intended to represent early warnings against potential public health threats. To public health officials, these warnings represent an overwhelming list of "one-size-fits-all" information for risk assessment. To reduce ...

  14. Hydrologic Modeling in the Kenai River Watershed using Event Based Calibration

    Science.gov (United States)

    Wells, B.; Toniolo, H. A.; Stuefer, S. L.

    2015-12-01

    Understanding hydrologic changes is key to preparing for possible future scenarios. On the Kenai Peninsula in Alaska, the yearly salmon runs provide a valuable stimulus to the economy; they are the focus of a large commercial fishing fleet as well as a prime tourist attraction. Modeling of anadromous waters provides a tool that assists in the prediction of future salmon run size. Beaver Creek, in Kenai, Alaska, is a lowlands stream that has been modeled using the Army Corps of Engineers' event-based modeling package HEC-HMS. With the use of historic precipitation and discharge data, the model was calibrated to observed discharge values. The hydrologic parameters were measured in the field or calculated, while soil parameters were estimated and adjusted during the calibration. With the calibrated parameters for HEC-HMS, discharge estimates can be used by other researchers studying the area and can help guide communities and officials toward better-educated decisions regarding the changing hydrology in the area and the economic drivers tied to it.
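
    Event-based calibration of this kind is usually scored with an objective function such as the Nash-Sutcliffe efficiency (NSE); a generic implementation is sketched below with made-up hydrographs, independent of HEC-HMS itself.

        # Sketch: Nash-Sutcliffe efficiency, a standard goodness-of-fit measure
        # for comparing simulated and observed discharge during calibration.
        import numpy as np

        def nse(observed, simulated):
            observed = np.asarray(observed, dtype=float)
            simulated = np.asarray(simulated, dtype=float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
                (observed - observed.mean()) ** 2)

        obs = [1.0, 3.5, 9.2, 6.1, 2.8, 1.4]   # toy observed flows, m^3/s
        sim = [1.1, 3.0, 8.5, 6.6, 3.1, 1.2]   # toy simulated flows, m^3/s
        print(f"NSE = {nse(obs, sim):.3f}")    # 1.0 would be a perfect fit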

  15. Coupling urban event-based and catchment continuous modelling for combined sewer overflow river impact assessment

    Directory of Open Access Journals (Sweden)

    I. Andrés-Doménech

    2010-10-01

    Since the Water Framework Directive (WFD) was passed in the year 2000, the conservation of water bodies in the EU must be understood in a completely different way. Regarding combined sewer overflows (CSOs) from urban drainage networks, the WFD implies that CSOs cannot be accepted or rejected on the basis of their intrinsic features alone; they must be assessed for their impact on the receiving water bodies in agreement with specific environmental aims. Consequently, the urban system and the receiving water body must be jointly analysed to evaluate the environmental impact generated on the latter. In this context, a coupled scheme is presented in this paper to assess the CSO impact on a river system in Torrelavega (Spain). First, an urban model is developed to statistically characterise CSO frequency, volume and duration. The main feature of this first model is that it is event-based: the system is modelled with synthetic design storms which adequately cover the probability range of the main rainfall descriptors, i.e., rainfall event volume and peak intensity. Thus, CSOs are characterised in terms of their occurrence probability. Secondly, a continuous and distributed basin model is built to assess the river response at different points in the river network. This model was calibrated initially on a daily scale and downscaled later to an hourly scale. The main objective of this second element of the scheme is to provide the most likely state of the receiving river when a CSO occurs. By combining the results of both models, CSO and river flows are homogeneously characterised from a statistical point of view. Finally, results from both models were coupled to estimate the final concentration of the analysed pollutants (biochemical oxygen demand, BOD, and total ammonium, NH4+) within the river just after the spills.

  16. The Effect of Task Duration on Event-Based Prospective Memory: A Multinomial Modeling Approach.

    Science.gov (United States)

    Zhang, Hongxia; Tang, Weihai; Liu, Xiping

    2017-01-01

    Remembering to perform an action when a specific event occurs is referred to as Event-Based Prospective Memory (EBPM). This study investigated how EBPM performance is affected by task duration by having university students (n = 223) perform an EBPM task that was embedded within an ongoing computer-based color-matching task. For this experiment, we separated the overall task's duration into the filler task duration and the ongoing task duration. The filler task duration is the length of time between the intention and the beginning of the ongoing task, and the ongoing task duration is the length of time between the beginning of the ongoing task and the appearance of the first Prospective Memory (PM) cue. The filler task duration and ongoing task duration were further divided into three levels: 3, 6, and 9 min. Two factors were then orthogonally manipulated between-subjects using a multinomial processing tree model to separate the effects of different task durations on the two EBPM components. A mediation model was then created to verify whether task duration influences EBPM via self-reminding or discrimination. The results reveal three points. (1) Lengthening the duration of ongoing tasks had a negative effect on EBPM performance, while lengthening the duration of the filler task had no significant effect on it. (2) As the filler task was lengthened, both the prospective and retrospective components showed a decreasing and then increasing trend. Also, when the ongoing task duration was lengthened, the prospective component decreased while the retrospective component significantly increased. (3) The mediating effect of discrimination between the task duration and EBPM performance was significant. We concluded that different task durations influence EBPM performance through different components, with discrimination being the mediator between task duration and EBPM performance.
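
    The multinomial-processing-tree idea can be sketched with a deliberately simplified two-parameter tree: a hit on a PM-cue trial requires both a prospective component p and a retrospective component r, and a separate recognition test pins down r. The tree structure, data, and parameter names below are illustrative assumptions, not the model used in the study.

        # Toy processing-tree estimation for EBPM: P(hit on cue trial) = p * r,
        # plus a recognition post-test that measures r directly. Illustrative
        # structure and data only; the published MPT model has more branches.
        import numpy as np
        from scipy.optimize import minimize

        pm_counts = np.array([68, 32])    # [hits, misses] on PM-cue trials (toy)
        rec_counts = np.array([85, 15])   # [correct, wrong] on recognition (toy)

        def neg_log_lik(params):
            p, r = params
            return -(pm_counts[0] * np.log(p * r)
                     + pm_counts[1] * np.log(1.0 - p * r)
                     + rec_counts[0] * np.log(r)
                     + rec_counts[1] * np.log(1.0 - r))

        res = minimize(neg_log_lik, x0=[0.7, 0.9], bounds=[(1e-3, 1 - 1e-3)] * 2)
        print(dict(zip(["p_prospective", "r_retrospective"], res.x.round(3))))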

  17. The Effect of Task Duration on Event-Based Prospective Memory: A Multinomial Modeling Approach

    Directory of Open Access Journals (Sweden)

    Hongxia Zhang

    2017-11-01

    Remembering to perform an action when a specific event occurs is referred to as Event-Based Prospective Memory (EBPM). This study investigated how EBPM performance is affected by task duration by having university students (n = 223) perform an EBPM task that was embedded within an ongoing computer-based color-matching task. For this experiment, we separated the overall task’s duration into the filler task duration and the ongoing task duration. The filler task duration is the length of time between the intention and the beginning of the ongoing task, and the ongoing task duration is the length of time between the beginning of the ongoing task and the appearance of the first Prospective Memory (PM) cue. The filler task duration and ongoing task duration were further divided into three levels: 3, 6, and 9 min. Two factors were then orthogonally manipulated between-subjects using a multinomial processing tree model to separate the effects of different task durations on the two EBPM components. A mediation model was then created to verify whether task duration influences EBPM via self-reminding or discrimination. The results reveal three points. (1) Lengthening the duration of ongoing tasks had a negative effect on EBPM performance, while lengthening the duration of the filler task had no significant effect on it. (2) As the filler task was lengthened, both the prospective and retrospective components showed a decreasing and then increasing trend. Also, when the ongoing task duration was lengthened, the prospective component decreased while the retrospective component significantly increased. (3) The mediating effect of discrimination between the task duration and EBPM performance was significant. We concluded that different task durations influence EBPM performance through different components, with discrimination being the mediator between task duration and EBPM performance.

  18. Some implications of an event-based definition of exposure to the risk of road accident

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes a new definition of exposure to the risk of road accident as any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space, or by requiring a road user to take action to avoid leaving the road. Advantages and disadvantages of defining exposure as specific events are discussed. It is argued that developments in vehicle technology are likely to make events both observable and countable, thus ensuring that exposure is an operational concept. © 2014 Elsevier Ltd. All rights reserved.

  19. Modeling Psychological Contract Violation using Dual Regime Models: An Event-based Approach.

    Science.gov (United States)

    Hofmans, Joeri

    2017-01-01

    A good understanding of the dynamics of psychological contract violation requires theories, research methods and statistical models that explicitly recognize that violation feelings follow from an event that violates one's acceptance limits, after which interpretative processes are set into motion, determining the intensity of these violation feelings. Whereas theories (in the form of the dynamic model of the psychological contract) and research methods (in the form of daily diary research and experience sampling research) are available by now, the statistical tools to model such a two-stage process are still lacking. The aim of the present paper is to fill this gap in the literature by introducing two statistical models, the Zero-Inflated model and the Hurdle model, that closely mimic the theoretical process underlying the elicitation of violation feelings via two model components: a binary distribution that models whether violation has occurred or not, and a count distribution that models how severe the negative impact is. Moreover, covariates can be included for both model components separately, which yields insight into their unique and shared antecedents. By doing this, the present paper offers a methodological-substantive synergy, showing how sophisticated methodology can be used to examine an important substantive issue.
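
    A minimal version of the hurdle structure described above can be written down directly: a logistic component for whether any violation occurred, and a zero-truncated Poisson for the intensity once it has. The data generation and parameterization below are toy assumptions, not the paper's specification.

        # Minimal hurdle-model sketch: logistic hurdle for P(violation occurs),
        # zero-truncated Poisson for intensity given occurrence. Toy data.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit, gammaln

        rng = np.random.default_rng(3)
        n = 500
        x = rng.standard_normal(n)                        # one covariate (toy)
        occur = rng.random(n) < expit(-0.5 + 1.0 * x)     # hurdle crossed?
        y = np.where(occur, 1 + rng.poisson(np.exp(0.3 + 0.6 * x)), 0)

        def neg_log_lik(theta):
            a0, a1, b0, b1 = theta
            p = expit(a0 + a1 * x)                        # P(y > 0)
            lam = np.exp(b0 + b1 * x)                     # intensity if y > 0
            zero = y == 0
            ll = np.sum(np.log1p(-p[zero]))               # zero observations
            yp, lp = y[~zero], lam[~zero]                 # truncated Poisson part
            ll += np.sum(np.log(p[~zero]) + yp * np.log(lp) - lp
                         - gammaln(yp + 1) - np.log(-np.expm1(-lp)))
            return -ll

        res = minimize(neg_log_lik, x0=np.zeros(4), method="Nelder-Mead")
        print(res.x.round(2))   # estimates of [a0, a1, b0, b1]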

  20. Influence of lag time on event-based rainfall-runoff modeling using the data driven approach

    Science.gov (United States)

    Talei, Amin; Chua, Lloyd H. C.

    2012-05-01

    This study investigated the effect of lag time on the performance of data-driven models, specifically the adaptive network-based fuzzy inference system (ANFIS), in event-based rainfall-runoff modeling. Rainfall and runoff data for a catchment in Singapore were chosen for this study. For the purpose of this study, lag time was determined from cross-correlation analysis of the rainfall and runoff time series. Rainfall antecedents were the only inputs of the models and direct runoff was the desired output. An ANFIS model with three sub-models defined based on three different ranges of lag times was developed. The performance of the sub-models was compared with previously developed ANFIS models and the physically-based Storm Water Management Model (SWMM). The ANFIS sub-models gave significantly superior results in terms of the RMSE, r2, CE and the prediction of the peak discharge, compared to other ANFIS models where the lag time was not considered. In addition, the ANFIS sub-models provided results that were comparable with results from SWMM. It is thus concluded that the lag time plays an important role in the selection of events for training and testing of data-driven models in event-based rainfall-runoff modeling.
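
    The cross-correlation step used here to determine lag time is generic enough to sketch: pick the lag at which the rainfall-runoff cross-correlation peaks. The series below are synthetic 10-min data with a known built-in lag, not the Singapore catchment data.

        # Sketch: estimate catchment lag time as the lag maximizing the
        # rainfall-runoff cross-correlation. Synthetic series, known lag.
        import numpy as np

        rng = np.random.default_rng(7)
        n, true_lag = 500, 9
        rain = np.clip(rng.normal(0.0, 1.0, n), 0.0, None)
        runoff = 0.6 * np.roll(rain, true_lag) + rng.normal(0.0, 0.05, n)

        rain_c, run_c = rain - rain.mean(), runoff - runoff.mean()
        xcorr = [np.sum(rain_c[: n - k] * run_c[k:]) for k in range(50)]
        print("estimated lag (time steps):", int(np.argmax(xcorr)))  # ~ 9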

  1. Canonical event based Bayesian model averaging for post-processing of multi-model ensemble precipitation forecasts

    Science.gov (United States)

    Li, Wentao; Duan, Qingyun

    2017-04-01

    Precipitation forecasts from numerical weather models usually contain biases in terms of mean and spread, and need to be post-processed before being applied as input to hydrological models. The Bayesian Model Averaging (BMA) method is widely used for post-processing forecasts from multiple models. Traditionally, BMA is applied directly to time series of forecasts for a specific lead time. In this work, we propose to apply BMA based on "canonical events", which are precipitation events with specific lead times and durations, to fully extract information from raw forecasts. For example, canonical events can be designed as the daily precipitation for day 1 to day 5, and the aggregation or average of total precipitation from day 6 to day 10, because forecasts beyond 5 days still have some skill but are not as reliable as the first five days. Moreover, BMA parameters are traditionally calibrated using a moving window containing the forecast-observation pairs before a given forecast date, which cannot ensure similar meteorological conditions when a long training period is applied. In this work, the training dataset is chosen from the historical hindcast archive of forecast-observation pairs in a pre-specified time window surrounding a given forecast date. After all canonical events of different lead times and durations are calibrated for the BMA models, ensemble members are generated from the calibrated probability forecasts using the Schaake shuffle to preserve the temporal dependency of forecasts for different lead times. This canonical-event-based BMA makes use of forecasts at different lead times more adequately and can generate continuous calibrated forecast time series for further application in hydrological modeling.
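
    The Schaake shuffle mentioned at the end is a rank-reordering step: each lead time's calibrated ensemble values are reassigned so that member ranks match those of historical trajectories, restoring realistic temporal dependence. A compact sketch with toy arrays:

        # Sketch of the Schaake shuffle: reorder ensemble values at each lead
        # time so member ranks follow historical trajectories. Toy arrays.
        import numpy as np

        rng = np.random.default_rng(11)
        n_members, n_lead = 5, 4
        ensemble = rng.gamma(2.0, 3.0, (n_members, n_lead))  # calibrated draws
        history = rng.gamma(2.0, 3.0, (n_members, n_lead))   # historical analogs

        shuffled = np.empty_like(ensemble)
        for t in range(n_lead):
            ranks = np.argsort(np.argsort(history[:, t]))    # rank per member
            shuffled[:, t] = np.sort(ensemble[:, t])[ranks]
        print(shuffled)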

  2. Sexual frequency and planning among at-risk men who have sex with men in the United States: implications for event-based intermittent pre-exposure prophylaxis.

    Science.gov (United States)

    Volk, Jonathan E; Liu, Albert; Vittinghoff, Eric; Irvin, Risha; Kroboth, Elizabeth; Krakower, Douglas; Mimiaga, Matthew J; Mayer, Kenneth H; Sullivan, Patrick S; Buchbinder, Susan P

    2012-09-01

    Intermittent dosing of pre-exposure prophylaxis (iPrEP) has the potential to decrease costs, improve adherence, and minimize toxicity. Practical event-based dosing of iPrEP requires men who have sex with men (MSM) to be sexually active on fewer than 3 days each week and to plan for sexual activity. MSM who may be most suitable for event-based dosing were older, more educated, used sexual networking websites more frequently, and more often reported that their last sexual encounter was not with a committed partner. A substantial proportion of these MSM endorse high-risk sexual activity, and event-based iPrEP may best target this population.

  3. Event-Based Modeling of Driver Yielding Behavior to Pedestrians at Two-Lane Roundabout Approaches.

    Science.gov (United States)

    Salamati, Katayoun; Schroeder, Bastian J; Geruschat, Duane R; Rouphail, Nagui M

    2014-01-01

    Unlike at other types of controlled intersections, drivers do not always comply with the "yield to pedestrian" sign at roundabouts. This paper aims to identify the contributing factors affecting the likelihood of drivers yielding to pedestrians at two-lane roundabouts. It further models the likelihood of driver yielding based on these factors using logistic regression. The models have been applied to 1150 controlled pedestrian crossings at the entry and exit legs of two-lane approaches of six roundabouts across the country. The logistic regression models developed support prior research finding that the likelihood of driver yielding at the entry leg of roundabouts is higher than at the exit. Drivers tend to yield to pedestrians carrying a white cane more often than to sighted pedestrians. Drivers traveling in the far lane, relative to the pedestrian's location, have a lower probability of yielding. As speed increases, the probability of driver yielding decreases. At the exit leg of the roundabout, drivers turning right from the adjacent lane have a lower propensity to yield than drivers coming from other directions. The findings of this paper further suggest that although there has been much debate on pedestrian right-of-way laws and the distinction between pedestrian waiting positions (in the street versus at the curb), this factor does not have a significant impact on the driver yielding rate. The logistic regression models also quantify the effect of each of these factors on the propensity of driver yielding. The models include variables which are specific to each study location and explain the impact of each study location on the probability of yielding. The models generated in this research will be useful to transportation professionals and researchers interested in understanding the factors that impact driver yielding at modern roundabouts. The results of the research can be used to isolate factors that may increase yielding (such as lower roundabout approach speeds
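
    A minimal version of the kind of logistic model described, with yielding probability driven by approach speed, lane position, and white-cane presence, is sketched below on synthetic observations; the covariates are stand-ins for the study's variables, not its data.

        # Sketch: logistic regression for driver yielding. Synthetic data; the
        # covariates stand in for speed, far/near lane, and white-cane presence.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(5)
        n = 1150
        speed = rng.uniform(15.0, 40.0, n)       # approach speed, km/h (toy)
        far_lane = rng.integers(0, 2, n)         # 1 = far lane from pedestrian
        cane = rng.integers(0, 2, n)             # 1 = pedestrian has white cane
        logit_p = 2.0 - 0.08 * speed - 0.6 * far_lane + 0.5 * cane
        yielded = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

        X = sm.add_constant(np.column_stack([speed, far_lane, cane]))
        fit = sm.Logit(yielded, X).fit(disp=0)
        print(fit.params)   # expect negative speed/far-lane, positive cane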

  4. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.

  5. Estimation of channel characteristics of narrow bipolar events based on the transmission-line model

    Science.gov (United States)

    Zhu, Baoyou; Zhou, Helin; Ma, Ming; Lv, Fanchao; Tao, Shanchang

    2010-10-01

    The narrow bipolar event (NBE) is a distinct class of intracloud lightning discharge, which is associated with the strongest radio frequency emissions and produces characteristic narrow bipolar radiation field waveforms. On the basis of the transmission-line model, we introduce a direct technique to measure, from distant radiation field pulses, the time taken by the current front to propagate along the channel; the channel length of the NBE can then be estimated by multiplying this time by an assumed propagation speed. Our method involves integrating over the initial half cycle of the narrow bipolar waveform of the NBE. The ratio of the integral to the initial peak amplitude is a good approximation to the time taken by the current front to travel along the channel, even when the current amplitude suffers heavy attenuation along the propagating channel. This method can be applied to all NBEs which produce narrow bipolar radiation field waveforms. Moreover, if both far radiation field and near-electrostatic field measurements are available, one can combine the method presented here with that of Eack (2004) to obtain the channel length of the NBE.
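
    The estimator described above reduces to a few lines: integrate the initial half cycle of the waveform, divide by the initial peak amplitude to get the current-front travel time, and multiply by an assumed propagation speed. The waveform and speed below are synthetic assumptions, not measured NBE data.

        # Sketch of the channel-length estimate: travel time ~ (integral of the
        # initial half cycle) / (initial peak); length ~ v * time. Toy waveform.
        import numpy as np

        dt = 1e-8                                   # 10 ns sampling (assumed)
        t = np.arange(0.0, 20e-6, dt)
        E = np.sin(2 * np.pi * t / 10e-6) * np.exp(-t / 5e-6)  # synthetic pulse

        end = np.argmax((E[:-1] > 0) & (E[1:] <= 0)) + 1  # end of half cycle
        travel_time = E[:end].sum() * dt / E.max()
        v = 1.5e8                                   # assumed front speed, m/s
        print(f"estimated channel length ~ {v * travel_time:.0f} m")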

  6. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    OpenAIRE

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female), performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory perfor...

  7. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    Directory of Open Access Journals (Sweden)

    Wooyeon Sunwoo

    2017-01-01

    Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall event. Soil moisture is one of the most important variables in rainfall–runoff modeling, and remotely sensed soil moisture is recognized as an effective way to improve the accuracy of runoff prediction. In this study, the IWC was evaluated based on remotely sensed soil moisture by using the Soil Conservation Service-Curve Number (SCS-CN) method, which is one of the representative event-based models used for reducing the uncertainty of runoff prediction. Four proxy variables for the IWC were determined from the measurements of total rainfall depth (API5), ground-based soil moisture (SSMinsitu), remotely sensed surface soil moisture (SSM), and soil water index (SWI) provided by the advanced scatterometer (ASCAT). To obtain a robust IWC framework, this study consists of two main parts: the validation of remotely sensed soil moisture, and the evaluation of runoff prediction using the four proxy variables with a set of rainfall–runoff events in the East Asian monsoon region. The results showed an acceptable agreement between remotely sensed soil moisture (SSM and SWI) and ground-based soil moisture data (SSMinsitu). In the proxy variable analysis, the SWI indicated the optimal value among the proposed proxy variables. In the runoff prediction analysis considering various infiltration conditions, the SSM and SWI proxy variables significantly reduced the runoff prediction error as compared with API5, by 60% and 66%, respectively. Moreover, the proposed IWC framework with
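
    For reference, the SCS-CN relation at the core of the study's event model fits in a few lines; the curve number in the usage example is an assumed value, and a wetter IWC would correspond to a higher effective curve number.

        # Sketch of the SCS Curve Number method: Q = (P - Ia)^2 / (P - Ia + S)
        # for P > Ia, with S = 25400/CN - 254 (mm) and Ia = 0.2 * S.
        def scs_cn_runoff(p_mm: float, cn: float) -> float:
            s = 25400.0 / cn - 254.0        # potential maximum retention, mm
            ia = 0.2 * s                    # initial abstraction, mm
            if p_mm <= ia:
                return 0.0
            return (p_mm - ia) ** 2 / (p_mm - ia + s)

        # Toy usage: a 60 mm storm with an assumed CN of 75.
        print(f"runoff depth = {scs_cn_runoff(60.0, 75.0):.1f} mm")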

  8. A spiking neural network model of 3D perception for event-based neuromorphic stereo vision systems

    Science.gov (United States)

    Osswald, Marc; Ieng, Sio-Hoi; Benosman, Ryad; Indiveri, Giacomo

    2017-01-01

    Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems.

  9. A spiking neural network model of 3D perception for event-based neuromorphic stereo vision systems.

    Science.gov (United States)

    Osswald, Marc; Ieng, Sio-Hoi; Benosman, Ryad; Indiveri, Giacomo

    2017-01-12

    Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems.

  10. Benefits and limitations of data assimilation for discharge forecasting using an event-based rainfall–runoff model

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2013-03-01

    Mediterranean catchments in southern France are threatened by potentially devastating fast floods which are difficult to anticipate. In order to improve the skill of rainfall-runoff models in predicting such flash floods, hydrologists use data assimilation techniques to provide real-time updates of the model using observational data. This approach seeks to reduce the uncertainties present in different components of the hydrological model (forcing, parameters or state variables) in order to minimize the error in simulated discharges. This article presents a data assimilation procedure, the best linear unbiased estimator (BLUE), used with the goal of improving the peak discharge predictions generated by the event-based hydrological model Soil Conservation Service lag and route (SCS-LR). For a given prediction date, selected model inputs are corrected by assimilating discharge data observed at the basin outlet. This study is conducted on the Lez Mediterranean basin in southern France. The key objectives of this article are (i) to select the parameter(s) which allow for the most efficient and reliable correction of the simulated discharges, (ii) to demonstrate the impact of the correction of the initial condition upon simulated discharges, and (iii) to identify and understand conditions in which this technique fails to improve the forecast skill. The correction of the initial moisture deficit of the soil reservoir proves to be the most efficient control parameter for adjusting the peak discharge. Using data assimilation, this correction leads to an average 12% improvement in the flood peak magnitude forecast in 75% of cases. The investigation of the other 25% of cases points out a number of precautions for the appropriate use of this data assimilation procedure.
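
    The BLUE analysis step itself is compact: the background state is corrected by the innovation weighted with the gain K = B H^T (H B H^T + R)^(-1). The sketch below uses toy matrices and an invented two-variable state, not the SCS-LR configuration.

        # Generic BLUE update: x_a = x_b + K (y - H x_b), with
        # K = B H^T (H B H^T + R)^-1. Toy state and covariances.
        import numpy as np

        x_b = np.array([50.0, 0.8])   # background [moisture deficit, lag factor] (toy)
        B = np.diag([100.0, 0.01])    # background error covariance (assumed)
        H = np.array([[0.5, 20.0]])   # linearized observation operator (toy)
        R = np.array([[4.0]])         # observation error covariance (assumed)
        y = np.array([45.0])          # observed outlet discharge (toy)

        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
        x_a = x_b + K @ (y - H @ x_b)
        print("analysis state:", x_a)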

  11. Breaking The Millisecond Barrier On SpiNNaker: Implementing Asynchronous Event-Based Plastic Models With Microsecond Resolution

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-06-01

    Spike-based neuromorphic sensors, such as retinas and cochleas, change the way in which the world is sampled. Instead of producing data sampled at a constant rate, these sensors output spikes that are asynchronous and event driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms towards those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution on the order of microseconds. This high temporal resolution is a crucial factor in learning tasks. It is also widely used in the field of biological neural networks. Sound localization, for instance, relies on detecting time lags between the two ears which, in the barn owl, reaches a temporal resolution of 5 microseconds. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution on the order of milliseconds, which is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous and event-based scheme running with a microsecond time resolution. Results on two example networks using this new implementation are presented.

  12. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    Science.gov (United States)

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female), performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory performance. The formal modeling results demonstrated that adults differed significantly from the 7-year-olds and 10-year-olds on both the prospective component and the retrospective component of the task. The 7-year-olds and 10-year-olds differed only in the ability to recognize prospective memory target events. The prospective memory task imposed a cost to ongoing activities in all three age groups. PMID:20053020

  13. The Cognitive Processes Underlying Event-Based Prospective Memory in School-Age Children and Young Adults: A Formal Model-Based Study

    Science.gov (United States)

    Smith, Rebekah E.; Bayen, Ute J.; Martin, Claudia

    2010-01-01

    Fifty children 7 years of age (29 girls, 21 boys), 53 children 10 years of age (29 girls, 24 boys), and 36 young adults (19 women, 17 men) performed a computerized event-based prospective memory task. All 3 groups differed significantly in prospective memory performance, with adults showing the best performance and with 7-year-olds showing the…

  14. Reconstruction of flood events based on documentary data and transnational flood risk analysis of the Upper Rhine and its French and German tributaries since AD 1480

    Science.gov (United States)

    Himmelsbach, I.; Glaser, R.; Schoenbein, J.; Riemann, D.; Martin, B.

    2015-10-01

    This paper presents a long-term analysis of flood occurrence along the southern part of the Upper Rhine River system and 14 of its tributaries in France and Germany, covering the period starting from AD 1480. Special focus is given to the temporal and spatial variations of flood events and their underlying meteorological causes over time. Examples are presented of how long-term information about flood events and knowledge about the historical aspect of flood protection in a given area can help to improve the understanding of risk analysis and therefore of transnational risk management. Within this context, special focus is given to flood vulnerability by comparing selected historical and modern extreme events and establishing a common evaluation scheme. The transnational aspect becomes especially evident when analyzing the tributaries: on this scale, flood protection developed strikingly differently on the French and German sides. We argue that the high technological standards of flood protection initiated by the dukes of Baden on the German side starting in the early 19th century misled people into the common belief that mechanical means of flood protection, such as dams and barrages, can guarantee security from floods and their impacts. This led to widespread settlement and the establishment of infrastructure as well as modern industries in potentially unsafe areas that continues to this day. The legal situation in Alsace on the French side of the Rhine did not allow for comparable continuous flood protection measures, leading to a constant (and probably annoying) reminder that the floodplains are a potentially unsafe place to be. From a modern perspective of flood risk management, this leads to a significantly lower aggregation of value in the floodplains of the small rivers in Alsace compared to those on the Baden side, an interesting fact, especially if the modern European Floods Directive is taken into account.

  15. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event-based network monitoring tool and to assess its effects on host performance. Current host-based network monitoring tools work by polling, which can miss activity that occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered. The first is the new Windows Event Logging API, and the second is the Application Layer Enforcement (ALE) API within the Windows Filtering Platform (WFP). Any future work should focus on these methods.
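
    As a cross-platform illustration of the event-notification idea (as opposed to polling), the sketch below uses Python's runtime audit hooks, which deliver an asynchronous callback for each socket operation in the process. It is a conceptual stand-in only, not the INL tool and not the Windows APIs named above.

        # Conceptual sketch of event-based (non-polling) network monitoring:
        # CPython (3.8+) raises audit events such as "socket.connect" for every
        # socket operation, so activity between polls cannot be missed.
        import socket
        import sys

        def on_audit_event(event, args):
            if event == "socket.connect":
                _sock, address = args
                print(f"[net-event] connect -> {address}")

        sys.addaudithook(on_audit_event)

        # Any connection made by this process now triggers the hook:
        with socket.create_connection(("example.com", 80), timeout=5):
            pass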

  16. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.

  17. Review of the number of accidents in Tehran over a two-year period and prediction of the number of events based on a time-series model

    Science.gov (United States)

    Teymuri, Ghulam Heidar; Sadeghian, Marzieh; Kangavari, Mehdi; Asghari, Mehdi; Madrese, Elham; Abbasinia, Marzieh; Ahmadnezhad, Iman; Gholizadeh, Yavar

    2013-01-01

    Background: One of the significant dangers that threaten people’s lives is the increased risk of accidents. Annually, more than 1.3 million people die around the world as a result of accidents, and it has been estimated that approximately 300 deaths occur daily due to traffic accidents in the world, with more than 50% of that number being people who were not even passengers in the cars. The aim of this study was to examine traffic accidents in Tehran and forecast the number of future accidents using a time-series model. Methods: The study was a cross-sectional study that was conducted in 2011. The sample population was all traffic accidents that caused death and physical injuries in Tehran in 2010 and 2011, as registered in the Tehran Emergency ward. The present study used Minitab 15 software to provide a description of accidents in Tehran for the specified time period as well as a prediction for April 2012. Results: The results indicated that the average number of daily traffic accidents in Tehran in 2010 was 187, with a standard deviation of 83.6. In 2011, there was an average of 180 daily traffic accidents, with a standard deviation of 39.5. One-way analysis of variance indicated that the average number of accidents in the city differed between months of the year (P < 0.05). Most of the accidents occurred in March, July, August, and September; thus, more accidents occurred in the summer than in the other seasons. The number of accidents for April 2012 was predicted based on an autoregressive moving average (ARMA) model. The number of accidents displayed a seasonal trend. The prediction indicated that a total of 4,459 accidents would occur in this period, with a mean of 149 accidents per day. Conclusion: The number of accidents in Tehran displayed a seasonal trend, and the number of accidents was different for different seasons of the year. PMID:26120405
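
    A forecast of the kind reported can be produced in a few lines with statsmodels; the daily series below is synthetic with a built-in seasonal cycle, and the ARMA order is an assumption rather than the order fitted in the study.

        # Sketch: fit an ARMA-type model to a daily accident-count series and
        # forecast 30 days ahead. Synthetic data; assumed order (2, 0, 1).
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(2)
        days = pd.date_range("2010-01-01", periods=730, freq="D")
        seasonal = 30.0 * np.sin(2 * np.pi * days.dayofyear / 365.25)
        series = pd.Series(180.0 + seasonal + rng.normal(0, 20, len(days)),
                           index=days)

        result = ARIMA(series, order=(2, 0, 1)).fit()
        print(result.forecast(steps=30).round(0).head())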

  18. Risk of Coronary Heart Events Based on Rose Angina Questionnaire and ECG Besides Diabetes and Other Metabolic Risk Factors: Results of a 10-Year Follow-up in Tehran Lipid and Glucose Study

    Science.gov (United States)

    Mansournia, Mohammad Ali; Holakouie-Naieni, Kourosh; Fahimfar, Noushin; Almasi-Hashiani, Amir; Cheraghi, Zahra; Ayubi, Erfan; Hadaegh, Farzad; Eskandari, Fatemeh; Azizi, Fereidoun; Khalili, Davood

    2017-01-01

    Background: Individuals at high risk for CHD can be identified by non-invasive and low-cost techniques such as Minnesota ECG coding and the Rose angina questionnaire (RQ). Objectives: The present study aimed at determining the risk of incident CHD according to ECG and RQ, besides diabetes and other metabolic risk factors, in our population. Methods: Participants comprised 5431 individuals aged ≥ 30 years within the framework of the Tehran lipid and glucose study. Based on their status on history of CHD, ECG, and RQ at baseline, all participants were classified into the following 5 groups: (1) History-Rose-ECG- (the reference group); (2) History-Rose+ECG-; (3) History-Rose-ECG+; (4) History-Rose+ECG+; and (5) History+. We used a Cox regression model to determine the role of ECG and RQ in CHD, independent of other risk factors. Results: Overall, 562 CHD events were detected during a median of 10.3 years of follow-up. CHD incidence rates were 55.9 and 9.09 cases per 1000 person-years for participants with and without a history of CHD, respectively. Hazard ratios (HRs) (95% CIs) were 4.11 (3.27 - 5.11) for History+ and 2.18 (1.63 - 2.90), 1.92 (1.47 - 2.51), and 2.48 (1.46 - 4.20) for History-Rose+ECG-, History-Rose-ECG+, and History-Rose+ECG+, respectively. RQ and ECG had HRs as high as those for hypertension and hypercholesterolemia; however, diabetes showed statistically and clinically greater effects on CVD than RQ and ECG. Conclusions: RQ in general, and ECG especially in asymptomatic patients, were good predictors of CHD events in both Iranian males and females; however, their predictive powers were lower than that of diabetes. PMID:28848610
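
    A Cox model of the type used here can be fitted with the lifelines package; the cohort below is synthetic, with binary covariates standing in for the Rose-questionnaire and ECG group indicators and censoring at the study's 10.3-year median follow-up.

        # Sketch: Cox proportional-hazards regression via lifelines. Synthetic
        # cohort; "rose" and "ecg" stand in for the study's group indicators.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(9)
        n = 2000
        rose = rng.integers(0, 2, n)
        ecg = rng.integers(0, 2, n)
        hazard = 0.01 * np.exp(0.8 * rose + 0.65 * ecg)   # toy effect sizes
        time = rng.exponential(1.0 / hazard)
        df = pd.DataFrame({
            "duration": np.minimum(time, 10.3),           # censor at 10.3 years
            "event": (time <= 10.3).astype(int),
            "rose": rose,
            "ecg": ecg,
        })

        cph = CoxPHFitter().fit(df, duration_col="duration", event_col="event")
        print(np.exp(cph.params_))                        # hazard ratios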

  20. Bayesian operational risk models

    OpenAIRE

    Silvia Figini; Lijun Gao; Paolo Giudici

    2013-01-01

    Operational risk is hard to quantify owing to the presence of heavy-tailed loss distributions. Extreme value distributions, used in this context, are very sensitive to the data, and this is a problem in the presence of rare loss data. Self risk assessment questionnaires, if properly modelled, may provide the missing piece of information that is necessary to adequately estimate operational risks. In this paper we propose to embody self risk assessment data into suitable prior distributions, and ...

  1. Choice of rainfall inputs for event-based rainfall-runoff modeling in a catchment with multiple rainfall stations using data-driven techniques

    Science.gov (United States)

    Chang, Tak Kwin; Talei, Amin; Alaghmand, Sina; Ooi, Melanie Po-Leen

    2017-02-01

    Input selection for data-driven rainfall-runoff models is an important task, as these models find the relationship between rainfall and runoff by directly mapping inputs to output. In this study, two different input selection methods were used: cross-correlation analysis (CCA), and a combination of mutual information and cross-correlation analyses (MICCA). The selected inputs were used to develop adaptive network-based fuzzy inference system (ANFIS) models for the Sungai Kayu Ara basin, Selangor, Malaysia. The study catchment has 10 rainfall stations and one discharge station located at the outlet of the catchment. A total of 24 rainfall-runoff events (10-min interval) from 1996 to 2004 were selected, of which 18 events were used for training and the remaining 6 were reserved for validating (testing) the models. The results of the ANFIS models were then compared against those obtained with the conceptual model HEC-HMS. The CCA and MICCA methods selected rainfall inputs from only 2 (stations 1 and 5) and 3 (stations 1, 3, and 5) rainfall stations, respectively. The ANFIS model developed from the MICCA inputs (ANFIS-MICCA) performed slightly better than the one developed from the CCA inputs (ANFIS-CCA). ANFIS-CCA and ANFIS-MICCA performed comparably to the HEC-HMS model in which rainfall data from all 10 stations had been used; however, in peak estimation, ANFIS-MICCA was the best model. A sensitivity analysis of HEC-HMS was conducted by recalibrating the model using the same rainfall stations selected for ANFIS. It was concluded that HEC-HMS model performance deteriorates when the number of rainfall stations is reduced. In general, ANFIS was found to be a reliable alternative to HEC-HMS in cases where not all rainfall stations are functioning. This study showed that the selected stations (stations 3 and 5) received the highest total rainfall and rainfall intensity. Moreover, the contributing rainfall stations selected by CCA and MICCA were found to be located near the outlet of
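
    A minimal sketch of the two input-selection ideas on synthetic data, assuming simple lagged cross-correlation for the CCA step and scikit-learn's mutual_info_regression for the mutual-information step (station count, lags, and the toy rainfall-runoff relation are all illustrative):

```python
# A sketch only: synthetic rainfall, toy runoff relation, illustrative lags.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(2)
n, n_stations, max_lag = 2000, 10, 6
raw = rng.gamma(0.3, 2.0, size=(n + 10, n_stations))
rain = np.stack([np.convolve(raw[:, j], np.ones(8) / 8, mode="valid")[:n]
                 for j in range(n_stations)], axis=1)   # storms persist in time
runoff = 0.6 * np.roll(rain[:, 0], 3) + 0.4 * np.roll(rain[:, 4], 2) \
         + rng.normal(0, 0.05, n)                       # driven by stations 0 and 4

def best_lag_corr(x, y, max_lag):
    """Max |correlation| over lags 0..max_lag (wrap-around ignored for brevity)."""
    return max(abs(np.corrcoef(np.roll(x, lag), y)[0, 1])
               for lag in range(max_lag + 1))

cca_scores = [best_lag_corr(rain[:, j], runoff, max_lag) for j in range(n_stations)]
mi_scores = mutual_info_regression(rain, runoff, random_state=0)

print("CCA ranking:", np.argsort(cca_scores)[::-1][:3])
print("MI  ranking:", np.argsort(mi_scores)[::-1][:3])
```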

  2. Reconstructing high-magnitude/low-frequency landslide events based on soil redistribution modelling and a Late-Holocene sediment record from New Zealand

    NARCIS (Netherlands)

    Claessens, L.F.G.; Lowe, D.J.; Hayward, B.W.; Schaap, B.F.; Schoorl, J.M.; Veldkamp, A.

    2006-01-01

    A sediment record is used, in combination with shallow landslide soil redistribution and sediment-yield modelling, to reconstruct the incidence of high-magnitude/low-frequency landslide events in the upper part of a catchment and the history of a wetland in the lower part. Eleven sediment cores were

  3. A model of TLR4 signaling and tolerance using a qualitative, particle-event-based method: introduction of spatially configured stochastic reaction chambers (SCSRC).

    Science.gov (United States)

    An, Gary

    2009-01-01

    There have been great advances in the examination and characterization of intracellular signaling and synthetic pathways. However, these pathways are generally represented using static diagrams when in reality they exist with considerable dynamic complexity. In addition to the expansion of existing mathematical pathway representation tools (many utilizing systems biology markup language format), there is a growing recognition that spatially explicit modeling methods may be necessary to capture essential aspects of intracellular dynamics. This paper introduces spatially configured stochastic reaction chambers (SCSRC), an agent-based modeling (ABM) framework that incorporates an abstracted molecular 'event' rule system with a spatially explicit representation of the relationship between signaling and synthetic compounds. Presented herein is an example of the SCSRC as applied to Toll-like receptor (TLR) 4 signaling and the inflammatory response. The underlying rationale for the architecture of the SCSRC is described. A SCSRC model of TLR-4 signaling was developed after a review of the literature regarding TLR-4 signaling and downstream synthetic events. The TLR-4 SCSRC was implemented in the freeware software platform NetLogo. A series of in silico experiments were performed to evaluate the response of the TLR-4 SCSRC to simulated administration of lipopolysaccharide (LPS). The pro-inflammatory response was represented by simulated secretion of tumor necrosis factor (TNF). Subsequent in silico experiments examined the response of the TLR-4 SCSRC in terms of a simulated preconditioning effect, represented as tolerance of pro-inflammatory signaling to a second dose of LPS. The SCSRC produces simulated dynamics of TLR-4 signaling in response to LPS stimulation that are qualitatively similar to those reported in the literature. The expression of various components of the signaling cascade demonstrated stochastic noise, consistent with molecular

  4. Melanoma Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing melanoma cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Credit Risk Modeling

    DEFF Research Database (Denmark)

    Lando, David

    Credit risk is today one of the most intensely studied topics in quantitative finance. This book provides an introduction and overview for readers who seek an up-to-date reference to the central problems of the field and to the tools currently used to analyze them. The book is aimed at researchers and students in finance, at quantitative analysts in banks and other financial institutions, and at regulators interested in the modeling aspects of credit risk. David Lando considers the two broad approaches to credit risk analysis: that based on classical option pricing models on the one hand ...

  6. Breast cancer risks and risk prediction models.

    Science.gov (United States)

    Engel, Christoph; Fischer, Christine

    2015-02-01

    BRCA1/2 mutation carriers have a considerably increased risk of developing breast and ovarian cancer. The personalized clinical management of carriers and other at-risk individuals depends on precise knowledge of the cancer risks. In this report, we give an overview of the present literature on empirical cancer risks, and we describe risk prediction models that are currently used for individual risk assessment in clinical practice. Cancer risks show large variability between studies. Breast cancer risks are at 40-87% for BRCA1 mutation carriers and 18-88% for BRCA2 mutation carriers. For ovarian cancer, the risk estimates are in the range of 22-65% for BRCA1 and 10-35% for BRCA2. The contralateral breast cancer risk is high (10-year risk after first cancer 27% for BRCA1 and 19% for BRCA2). Risk prediction models have been proposed to provide more individualized risk prediction, using additional knowledge on family history, mode of inheritance of major genes, and other genetic and non-genetic risk factors. User-friendly software tools have been developed that serve as a basis for decision-making in family counseling units. In conclusion, further assessment of cancer risks and model validation is needed, ideally based on prospective cohort studies.

  7. Melanoma risk prediction models

    Directory of Open Access Journals (Sweden)

    Nikolić Jelena

    2014-01-01

    Full Text Available Background/Aim. The lack of effective therapy for advanced stages of melanoma emphasizes the importance of preventive measures and screening of populations at risk. Identifying individuals at high risk should allow targeted screening and follow-up involving those who would benefit most. The aim of this study was to identify the most significant factors for melanoma prediction in our population and to create prognostic models for the identification and differentiation of individuals at risk. Methods. This case-control study included 697 participants (341 patients and 356 controls) who underwent an extensive interview and skin examination in order to check risk factors for melanoma. Pairwise univariate statistical comparison was used for the coarse selection of the most significant risk factors. These factors were fed into logistic regression (LR) and alternating decision tree (ADT) prognostic models that were assessed for their usefulness in identifying patients at risk of developing melanoma. Validation of the LR model was done by the Hosmer and Lemeshow test, whereas the ADT was validated by 10-fold cross-validation. The achieved sensitivity, specificity, accuracy and AUC for both models were calculated. A melanoma risk score (MRS) based on the outcome of the LR model is presented. Results. The LR model showed that the following risk factors were associated with melanoma: sunbeds (OR = 4.018; 95% CI 1.724-9.366 for those that sometimes used sunbeds), solar damage of the skin (OR = 8.274; 95% CI 2.661-25.730 for those with severe solar damage), hair color (OR = 3.222; 95% CI 1.984-5.231 for light brown/blond hair), the number of common naevi (over 100 naevi had OR = 3.57; 95% CI 1.427-8.931), the number of dysplastic naevi (for 1 to 10 dysplastic naevi the OR was 2.672; 95% CI 1.572-4.540; for more than 10 naevi the OR was 6.487; 95% CI 1.993-21.119), Fitzpatrick's phototype and the presence of congenital naevi. Red hair, phototype I and large congenital naevi were
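
    A hedged sketch of how such a logistic-regression risk score can be evaluated, using the odds ratios quoted above; the intercept and the handling of the remaining covariates are placeholder assumptions, so the probabilities are purely illustrative:

```python
# A sketch only: the intercept below is a hypothetical baseline log-odds.
import math

log_or = {                      # ln(OR) for selected factors from the abstract
    "sunbeds_sometimes": math.log(4.018),
    "severe_solar_damage": math.log(8.274),
    "light_brown_blond_hair": math.log(3.222),
    "over_100_common_naevi": math.log(3.57),
    "over_10_dysplastic_naevi": math.log(6.487),
}
intercept = -3.0                # assumed, not reported in the abstract

def melanoma_risk(factors):
    """Logistic-model probability given a set of present risk factors."""
    z = intercept + sum(log_or[f] for f in factors)
    return 1.0 / (1.0 + math.exp(-z))

print(f"{melanoma_risk(['sunbeds_sometimes', 'light_brown_blond_hair']):.2f}")
```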

  8. Models of Credit Risk Measurement

    OpenAIRE

    Hagiu Alina

    2011-01-01

    Credit risk is defined as the risk of financial loss caused by the failure of a counterparty. According to statistics, for financial institutions credit risk is much more important than market risk; reduced diversification of credit risk is the main cause of bank failures. Only recently did the banking industry begin to measure credit risk in the context of a portfolio, along with the development of risk management that started with value-at-risk (VaR) models. Once measured, credit risk can be diversif...

  9. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically a four cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine ... problems on accurate air/fuel ratio control of a spark ignition (SI) engine.

  10. Mental Models of Security Risks

    Science.gov (United States)

    Asgharpour, Farzaneh; Liu, Debin; Camp, L. Jean

    In computer security, risk communication refers to informing computer users about the likelihood and magnitude of a threat. Efficacy of risk communication depends not only on the nature of the risk, but also on the alignment between the conceptual model embedded in the risk communication and the user's mental model of the risk. The gap between the mental models of security experts and non-experts could lead to ineffective risk communication. Our research shows that for a variety of security risks, self-identified security experts and non-experts have different mental models. We propose that the design of risk communication methods should be based on the non-experts' mental models.

  11. Model Risk in Portfolio Optimization

    Directory of Open Access Journals (Sweden)

    David Stefanovits

    2014-08-01

    Full Text Available We consider a one-period portfolio optimization problem under model uncertainty. For this purpose, we introduce a measure of model risk. We derive analytical results for this measure of model risk in the mean-variance problem assuming we have observations drawn from a normal variance mixture model. This model allows for heavy tails, tail dependence and leptokurtosis of marginals. The results show that mean-variance optimization is seriously compromised by model uncertainty, in particular, for non-Gaussian data and small sample sizes. To mitigate these shortcomings, we propose a method to adjust the sample covariance matrix in order to reduce model risk.
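
    A small sketch of the paper's mitigation idea on synthetic heavy-tailed returns: adjust the sample covariance matrix before computing mean-variance weights. Ledoit-Wolf shrinkage (via scikit-learn) stands in here for the authors' specific adjustment, which the abstract does not spell out:

```python
# A sketch only: Ledoit-Wolf shrinkage as a stand-in covariance adjustment.
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(3)
returns = 0.005 + rng.standard_t(df=4, size=(60, 8)) * 0.02  # heavy tails, small n
mu = returns.mean(axis=0)

def mv_weights(cov, mu):
    """Unconstrained mean-variance weights, normalized to sum to one."""
    w = np.linalg.solve(cov, mu)
    return w / w.sum()

w_sample = mv_weights(np.cov(returns, rowvar=False), mu)
w_shrunk = mv_weights(LedoitWolf().fit(returns).covariance_, mu)
print("sample-cov weights:", np.round(w_sample, 2))
print("shrunk-cov weights:", np.round(w_shrunk, 2))
```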

  12. Enterprise Risk Management Models

    CERN Document Server

    Olson, David L

    2010-01-01

    Enterprise risk management has always been important. However, the events of the 21st Century have made it even more critical. The top level of business management became suspect after scandals at ENRON, WorldCom, and other business entities. Financially, many firms experienced difficulties from bubbles. The problems of interacting cultures demonstrated risk from terrorism as well, with numerous terrorist attacks, to include 9/11 in the U.S. Risks can arise in many facets of business. Businesses in fact exist to cope with risk in their area of specialization. Financial risk management has focu

  13. Qualitative Event-based Diagnosis with Possible Conflicts Applied to Spacecraft Power Distribution Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based diagnosis enables efficient and safe operation of engineered systems. In this paper, we describe two algorithms based on a qualitative event-based fault...

  14. Wildfire Risk Main Model

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The model combines three modeled fire behavior parameters (rate of spread, flame length, crown fire potential) and one modeled ecological health measure (fire regime...

  15. Modelling allergenic risk

    DEFF Research Database (Denmark)

    Birot, Sophie

    (... Allergen and Allergy Management) aims at developing evidence-based strategies for food allergies. In particular, food allergen risk assessment helps food producers and authorities to make decisions on withdrawing a food product from the market or adding more information on the label when allergen presence is unintended. The risk assessment method has three different kinds of input. The exposure is calculated from the product consumption and the allergen contamination in the food product. The exposure is then compared to the thresholds at which allergic individuals react in order to calculate the chance ... for all the methods using uncertainty analysis [11]. The recommended approach for allergen risk assessment was implemented in a Shiny application with the R software. Thus, allergen risk assessment can be performed easily by non-statisticians with the interactive application.

  16. Software Development Risk Management Model

    OpenAIRE

    Islam, Shareeful

    2011-01-01

    Risk management is an effective tool to control risks in software projects and increases the likelihood of project success. Risk management needs to be integrated as early as possible in the project. This dissertation proposes a Goal-driven Software Development Risk Management Model (GSRM) and explicitly integrates it into requirements engineering phase. This integration provides an early warning of potential problems so that both preventive and corrective actions can be undertaken to avoid t...

  17. Operational Risk Modeling

    OpenAIRE

    Gabriela ANGHELACHE; Ana Cornelia OLTEANU

    2011-01-01

    Losses resulting from operational risk events stem from a complex interaction between organizational factors, personnel and market participants that does not fit a simple classification scheme. Taking into account past losses (e.g. Barings, Daiwa, etc.), we can say that operational risk is a source of major financial losses in the banking sector, even though until recently such losses were underestimated and considered generally minor, not threatening the survival of a bank.

  18. Operational Risk Modeling

    Directory of Open Access Journals (Sweden)

    Gabriela ANGHELACHE

    2011-06-01

    Full Text Available Losses resulting from operational risk events stem from a complex interaction between organizational factors, personnel and market participants that does not fit a simple classification scheme. Taking into account past losses (e.g. Barings, Daiwa, etc.), we can say that operational risk is a source of major financial losses in the banking sector, even though until recently such losses were underestimated and considered generally minor, not threatening the survival of a bank.

  19. Risk Management Technologies With Logic and Probabilistic Models

    CERN Document Server

    Solozhentsev, E D

    2012-01-01

    This book presents intellectual, innovative, information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology.  The volume describes the following components of risk management technologies: LP-calculus; classes of LP-models of risk and efficiency; procedures for different classes; special software for different classes; examples of applications; methods for the estimation of probabilities of events based on expert information. Also described are a variety of training courses in these topics. The classes of risk models treated here are: LP-modeling, LP-classification, LP-efficiency, and LP-forecasting. Particular attention is paid to LP-models of risk of failure to resolve difficult economic and technical problems. Amongst the  discussed  procedures of I3-technologies  are the construction of  LP-models,...

  20. Multilevel joint competing risk models

    Science.gov (United States)

    Karunarathna, G. H. S.; Sooriyarachchi, M. R.

    2017-09-01

    Joint modeling approaches are often encountered for different outcomes of competing-risk time-to-event and count data in many biomedical and epidemiology studies in the presence of a cluster effect. Hospital length of stay (LOS) has been a widely used outcome measure of hospital utilization, serving as a benchmark for measuring multiple terminations such as discharge, transfer, death, and patients who have not completed the event of interest by the end of the follow-up period (censored) during hospitalization. Competing risk models provide a method of addressing such multiple destinations, since classical time-to-event models yield biased results when there are multiple events. In this study, the concept of joint modeling has been applied to dengue epidemiology in Sri Lanka, 2006-2008, to assess the relationship between different outcomes of LOS and the platelet count of dengue patients with a district cluster effect. Two key approaches have been applied to build the joint scenario. The first approach models each competing risk separately using a binary logistic model, treating all other events as censored under a multilevel discrete time-to-event model, while the platelet counts are assumed to follow a lognormal regression model. The second approach is based on the endogeneity effect in the multilevel competing risks and count model. Model parameters were estimated using maximum likelihood based on the Laplace approximation. Moreover, the study reveals that the joint modeling approach yields more precise results compared to fitting two separate univariate models, in terms of AIC (Akaike Information Criterion).

  1. Colorectal Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  2. Cervical Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  3. Liver Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  4. Ovarian Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  5. Bladder Cancer Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.

  6. Models for Pesticide Risk Assessment

    Science.gov (United States)

    EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environments may be exposed in risk assessment. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.

  7. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  8. Risk Modelling of Agricultural Products

    Science.gov (United States)

    Nugrahani, E. H.

    2017-03-01

    In the real-world market, agricultural commodities are subject to fluctuating prices. Prices of agricultural products are thus relatively volatile, which makes agriculture a rather risky business for farmers. This paper presents some mathematical models of such risks in the form of price volatility, based on certain assumptions. The proposed models are a time-varying volatility model, as well as time-varying volatility models with mean reversion and with a seasonal mean equation. Implementation on empirical data shows that agricultural products are indeed risky.

  9. Cabin Environment Physics Risk Model

    Science.gov (United States)

    Mattenberger, Christopher J.; Mathias, Donovan Leigh

    2014-01-01

    This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
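
    A toy Monte Carlo sketch of the paper's central comparison: instead of assuming loss of crew the instant ECLSS functionality fails, sample a time for the cabin to become hazardous and compare it with the time needed to abort. Both distributions below are invented for illustration; the CEPR model's actual physics is far more detailed:

```python
# A sketch only: all distributions and parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
time_to_hazard = rng.lognormal(mean=np.log(6.0), sigma=0.5, size=n)  # hours
abort_duration = rng.uniform(2.0, 8.0, size=n)                       # hours

# LOC occurs only if the cabin becomes hazardous before the abort completes.
p_loc_buffered = np.mean(time_to_hazard < abort_duration)
print("instant-LOC assumption:  P(LOC | ECLSS failure) = 1.00")
print(f"with cabin buffer model: P(LOC | ECLSS failure) = {p_loc_buffered:.2f}")
```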

  10. Information risk and security modeling

    Science.gov (United States)

    Zivic, Predrag

    2005-03-01

    This research paper presentation will feature current frameworks for addressing risk and security modeling and metrics. The paper will analyze technical-level risk and security metrics of Common Criteria/ISO15408, Centre for Internet Security guidelines, NSA configuration guidelines, and the metrics used at this level. The view of IT operational standards on security metrics, such as GMITS/ISO13335 and ITIL/ITMS, and architectural guidelines such as ISO7498-2 will be explained. Business process level standards such as ISO17799, COSO and CobiT will be presented with their control approach to security metrics. At the top level, maturity standards such as SSE-CMM/ISO21827, NSA Infosec Assessment and CobiT will be explored and reviewed. For each defined level of security metrics, the research presentation will explore the appropriate usage of these standards. The paper will discuss the approaches of these standards to conducting risk and security metrics. The research findings will demonstrate the need for a common baseline for both risk and security metrics. This paper will show the relation between the attribute-based common baseline and corporate assets and controls for risk and security metrics. It will be shown that such an approach spans all the mentioned standards. The proposed 3D visual presentation and the development of the Information Security Model will be analyzed and postulated. The presentation will clearly demonstrate the benefits of the proposed attribute-based approach and a defined risk and security space for modeling and measuring.

  11. Lunar Landing Operational Risk Model

    Science.gov (United States)

    Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian

    2010-01-01

    Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the Lunar Landing Event and calculates estimates of the risk of Loss of Mission (LOM) - Abort Required and is Successful, Loss of Crew (LOC) - Vehicle Crashes or Cannot Reach Orbit, and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
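
    A compact sketch of a Monte Carlo outcome loop in the LLORM spirit: each trial samples whether an abort is needed and whether it succeeds, then tallies Success, LOM, and LOC. All probabilities are placeholders, not values from the model:

```python
# A sketch only: the three probabilities below are invented placeholders.
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
p_abort_needed, p_abort_succeeds, p_crash_no_abort = 0.02, 0.95, 0.001

abort = rng.random(n) < p_abort_needed
abort_ok = rng.random(n) < p_abort_succeeds
crash = rng.random(n) < p_crash_no_abort

loc = (abort & ~abort_ok) | (~abort & crash)  # crew lost
lom = abort & abort_ok                        # mission aborted, crew safe
success = ~abort & ~crash

for name, mask in [("Success", success), ("LOM", lom), ("LOC", loc)]:
    print(f"{name:8s} {mask.mean():.4%}")
```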

  12. Fuzzy audit risk modeling algorithm

    Directory of Open Access Journals (Sweden)

    Zohreh Hajihaa

    2011-07-01

    Full Text Available Fuzzy logic has created suitable mathematics for making decisions in uncertain environments, including professional judgments. One such situation is the assessment of auditee risks. During recent years, risk-based audit (RBA) has been regarded as one of the main tools to fight fraud. The main issue in RBA is to determine the overall audit risk an auditor accepts, which impacts the efficiency of an audit. The primary objective of this research is to redesign the audit risk model (ARM) proposed by auditing standards. The proposed model uses fuzzy inference systems (FIS) based on the judgments of audit experts. The implementation of the proposed fuzzy technique uses triangular fuzzy numbers to express the inputs, and the Mamdani method along with center of gravity are incorporated for defuzzification. The proposed model uses three FISs for audit, inherent and control risks, and there are five levels of linguistic variables for the outputs. The FISs include 25, 25 and 81 if-then rules, respectively, and Iranian audit experts confirmed all the rules.
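
    A minimal numpy sketch of one Mamdani inference step of the kind described: triangular memberships, min for rule firing, max aggregation, and center-of-gravity defuzzification. The two rules and all membership parameters are illustrative, not the paper's 25- and 81-rule systems:

```python
# A sketch only: two illustrative rules, not the paper's rule base.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

u = np.linspace(0.0, 1.0, 201)          # audit-risk output universe
inherent, control = 0.7, 0.4            # crisp expert inputs in [0, 1]
low = lambda x: tri(x, -1.0, 0.0, 1.0)  # broadly overlapping memberships
high = lambda x: tri(x, 0.0, 1.0, 2.0)

# Rule 1: IF inherent is HIGH AND control risk is HIGH THEN audit risk is HIGH
# Rule 2: IF inherent is LOW  OR  control risk is LOW  THEN audit risk is LOW
fire_high = min(high(inherent), high(control))
fire_low = max(low(inherent), low(control))

agg = np.maximum(np.minimum(fire_high, high(u)), np.minimum(fire_low, low(u)))
centroid = (agg * u).sum() / agg.sum()  # center-of-gravity defuzzification
print(f"defuzzified audit risk: {centroid:.2f}")
```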

  13. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  14. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...

  15. Model risk analysis for risk management and option pricing

    NARCIS (Netherlands)

    Kerkhof, F.L.J.

    2003-01-01

    Due to the growing complexity of products in financial markets, market participants rely more and more on quantitative models for trading and risk management decisions. This introduces a fairly new type of risk, namely, model risk. In the first part of this thesis we investigate the quantitative

  16. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes the document in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
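
    A short illustration of the SAX-style streaming model the analysis targets, using Python's xml.sax rather than the Java framework: the document is consumed as a sequence of start/end events, and the depth counter hints at the kind of well-formedness property a static analysis would track:

```python
# A sketch of event-based XML processing with the standard library.
import xml.sax

class CountingHandler(xml.sax.ContentHandler):
    """Consumes the document as a stream of events in O(1) memory."""
    def __init__(self):
        super().__init__()
        self.depth = 0      # current element nesting
        self.elements = 0   # total elements seen

    def startElement(self, name, attrs):
        self.depth += 1
        self.elements += 1

    def endElement(self, name):
        self.depth -= 1     # a static analysis would prove this never underflows

handler = CountingHandler()
xml.sax.parseString(b"<root><item>a</item><item>b</item></root>", handler)
print(handler.elements, "elements,", "balanced" if handler.depth == 0 else "unbalanced")
```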

  17. The Link Between Alcohol Use and Aggression Toward Sexual Minorities: An Event-Based Analysis

    OpenAIRE

    Parrott, Dominic J.; Gallagher, Kathryn E.; Vincent, Wilson; Bakeman, Roger

    2010-01-01

    The current study used an event-based assessment approach to examine the day-to-day relationship between heterosexual men’s alcohol consumption and perpetration of aggression toward sexual minorities. Participants were 199 heterosexual drinking men between the ages of 18–30 who completed (1) separate timeline followback interviews to assess alcohol use and aggression toward sexual minorities during the past year, and (2) written self-report measures of risk factors for aggression toward sexua...

  18. Event-based prospective memory performance in autism spectrum disorder

    NARCIS (Netherlands)

    Altgassen, A.M.; Schmitz-Hübsch, M.; Kliegel, M.

    2010-01-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with

  19. Training Team Problem Solving Skills: An Event-Based Approach.

    Science.gov (United States)

    Oser, R. L.; Gualtieri, J. W.; Cannon-Bowers, J. A.; Salas, E.

    1999-01-01

    Discusses how to train teams in problem-solving skills. Topics include team training, the use of technology, instructional strategies, simulations and training, theoretical framework, and an event-based approach for training teams to perform in naturalistic environments. Contains 68 references. (Author/LRW)

  20. Deterministic event-based simulation of universal quantum computation

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, H. De; Raedt, K. De; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

    We demonstrate that locally connected networks of classical processing units that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of universal quantum computation. The new simulation method is applied to implement Shor's factoring algorithm.

  1. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K.; De Raedt, K.; De Raedt, H.

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  3. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced based on a set of predictive recurrent reservoir networks, competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  4. Risk matrix model for rotating equipment

    Directory of Open Access Journals (Sweden)

    Wassan Rano Khan

    2014-07-01

    Full Text Available Different industries have different residual risk levels for their rotating equipment; accordingly, failure occurrence rates and the associated failure consequence categories differ. Thus, a generalized risk matrix model is developed in this study which can fit the various available risk matrix standards. This generalized risk matrix will be helpful for developing new risk matrices that fit the required risk assessment scenario for rotating equipment. A power generation system was taken as a case study. It was observed that eight subsystems were at risk. Only the vibration monitoring system was in the high-risk category, while the remaining seven subsystems were in the serious and medium risk categories.
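
    A tiny sketch of the kind of generalized lookup such a matrix encodes, with 1-5 likelihood and consequence scores mapped to categories; the band boundaries are invented, not the paper's calibration:

```python
# A sketch only: illustrative band boundaries for a 5x5 risk matrix.
RISK_BANDS = {range(1, 5): "low", range(5, 10): "medium",
              range(10, 17): "serious", range(17, 26): "high"}

def risk_category(likelihood: int, consequence: int) -> str:
    """Classify a failure mode from 1-5 likelihood and consequence scores."""
    score = likelihood * consequence
    for band, label in RISK_BANDS.items():
        if score in band:
            return label
    raise ValueError("scores must be in 1..5")

print(risk_category(5, 4))  # e.g. a vibration-monitor failure mode -> 'high'
```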

  5. A joint renewal process used to model event based data

    National Research Council Canada - National Science Library

    Mergenthaler, Wolfgang; Jaroszewski, Daniel; Feller, Sebastian; Laumann, Larissa

    2016-01-01

    .... Event data, herein defined as a collection of triples containing a time stamp, a failure code and optionally a descriptive text, can best be evaluated by using the paradigm of joint renewal processes...

  6. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting...... of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  7. Credit Risk Modelling and Implementation of Credit Risk Models in China

    OpenAIRE

    Yu, Mengxiao

    2007-01-01

    Credit risk, or the risk of counterparty default, is an important factor in the valuation and risk management of financial assets. It has become increasingly important to financial institutions. A variety of credit risk models have been developed to measure credit risk. They are J.P. Morgan's CreditMetrics; KMV's PortfolioManager based on Merton (1974) option pricing model; macroeconomic model CreditPortfolio View developed by McKinsey; CSFB's Credit Risk+ Model based on actuarial science fra...

  8. Event-based prospective memory performance in autism spectrum disorder.

    Science.gov (United States)

    Altgassen, Mareike; Schmitz-Hübsch, Maren; Kliegel, Matthias

    2010-03-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated. The laboratory-based prospective memory test was embedded in a visuo-spatial working memory test and required participants to remember to respond to a cue-event. Everyday planning performance was assessed with proxy ratings. Although parents of the autism group rated their children's everyday performance as significantly poorer than controls' parents, no group differences were found in event-based prospective memory. Nevertheless, individual differences in laboratory-based and everyday performances were related. Clinical implications of these findings are discussed.

  9. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    Full Text Available In order to achieve commercial banks' liquidity, safety and profitability objectives, optimization decisions based on loan portfolio risk analysis enable a rational allocation of assets. Risk analysis and asset allocation are key technologies of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. An optimization decision model for the loan portfolio rate of return that uses Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) constraints reflects the bank's risk tolerance and gives the bank direct control of the potential loss. This paper analyzes a general risk management model applied to portfolio problems with VaR and CVaR risk measures using the Lagrangian algorithm, and solves this highly difficult problem by matrix operations. The resulting solution shows that the portfolio problem with the VaR and CVaR risk model traces a hyperbola in mean-standard deviation space, which makes the proposed method easy to understand and to compute.
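
    A small numeric sketch of the two risk measures the model constrains, computed on a synthetic loan-loss distribution; the full Lagrangian mean-CVaR optimization is beyond this illustration:

```python
# A sketch only: synthetic losses stand in for a real loan portfolio.
import numpy as np

rng = np.random.default_rng(6)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)  # heavy-tailed losses
alpha = 0.99

var = np.quantile(losses, alpha)        # Value-at-Risk: the alpha loss quantile
cvar = losses[losses >= var].mean()     # CVaR: mean loss beyond the VaR level
print(f"VaR(99%) = {var:.2f}, CVaR(99%) = {cvar:.2f}")
```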

  10. Event-based prospective memory performance in autism spectrum disorder

    OpenAIRE

    Altgassen, Mareike; Schmitz-Hübsch, Maren; Kliegel, Matthias

    2009-01-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated. The laboratory-based prospective memory test was embedded in a visuo-spatial working memory test and required participants to ...

  11. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia, using propofol administration and the bispectral index as the controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms on a given data set of patients. The effectiveness and robustness of the method are verified in simulation by implementing a Monte Carlo method to address intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection, and the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable.
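
    A toy sketch of the event-based control idea: a send-on-delta event generator gates PI updates so the manipulated variable changes only when the error has moved enough. The plant, gains, and threshold are invented; the study's PIDPlus tuning and BIS patient models are not reproduced:

```python
# A sketch only: toy plant and illustrative gains, not the paper's design.
import numpy as np

rng = np.random.default_rng(7)
kp, ki, delta = 0.6, 0.1, 0.5        # illustrative gains and event threshold
setpoint, y, u, integral = 50.0, 90.0, 0.0, 0.0
last_event_error = None
updates = 0

for t in range(200):
    y += 0.05 * (10 * u - (y - 40)) + rng.normal(0, 0.3)  # toy first-order plant
    error = setpoint - y
    if last_event_error is None or abs(error - last_event_error) > delta:
        integral += error                                 # update only on events
        u = kp * error + ki * integral
        last_event_error = error
        updates += 1

print(f"final output {y:.1f}, controller updated on {updates}/200 samples")
```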

  12. Mitigating risk during strategic supply network modeling

    OpenAIRE

    Müssigmann, Nikolaus

    2006-01-01

    Mitigating risk during strategic supply network modeling. - In: Managing risks in supply chains / ed. by Wolfgang Kersten ... - Berlin : Schmidt, 2006. - S. 213-226. - (Operations and technology management ; 1)

  13. Competing Risks and Multistate Models with R

    CERN Document Server

    Beyersmann, Jan; Schumacher, Martin

    2012-01-01

    This book covers competing risks and multistate models, sometimes summarized as event history analysis. These models generalize the analysis of time to a single event (survival analysis) to analysing the timing of distinct terminal events (competing risks) and possible intermediate events (multistate models). Both R and multistate methods are promoted with a focus on nonparametric methods.

  14. Modeling Research Project Risks with Fuzzy Maps

    Science.gov (United States)

    Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana

    2009-01-01

    The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was especially developed for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…

  15. Nottingham knee osteoarthritis risk prediction models.

    Science.gov (United States)

    Zhang, Weiya; McWilliams, Daniel F; Ingham, Sarah L; Doherty, Sally A; Muthuri, Stella; Muir, Kenneth R; Doherty, Michael

    2011-09-01

    (1) To develop risk prediction models for knee osteoarthritis (OA) and (2) to estimate the risk reduction that results from modification of potential risk factors. This was a 12-year retrospective cohort study undertaken in the general population in Nottingham, UK. Baseline risk factors were collected by questionnaire. Incident radiographic knee OA was defined by a Kellgren and Lawrence (KL) score ≥2. Incident symptomatic knee OA was defined by KL ≥2 plus knee pain. Progression of knee OA was defined by a ≥1 grade increase in KL from baseline. A logistic regression model was used for prediction. Calibration and discrimination of the models were tested in the Osteoarthritis Initiative (OAI) population and the Genetics of Osteoarthritis and Lifestyle (GOAL) population. ORs of the models were compared with those obtained from a meta-analysis of the existing literature. From a community sample of 424 people aged over 40, 3 risk prediction models were developed: incidence of radiographic knee OA, incidence of symptomatic knee OA, and progression of knee OA. All models had good calibration and moderate discrimination power in OAI and GOAL. The ORs lay within the 95% CIs of the published studies. The risk reduction due to modifying obesity at the individual and population levels was demonstrated. Risk prediction of knee OA based on well-established, common modifiable risk factors is thus feasible. The models may be used to predict the risk of knee OA, and the risk reduction due to preventing a specific risk factor.

  16. Risk Modelling for Passages in Approach Channel

    Directory of Open Access Journals (Sweden)

    Leszek Smolarek

    2013-01-01

    Full Text Available Methods of multivariate statistics, stochastic processes, and simulation are used to identify and assess risk measures. This paper presents the use of generalized linear models and Markov models to study risks to ships along an approach channel. These models, combined with simulation testing, are used to determine the time required for continuous monitoring of endangered objects or the period at which the level of risk should be verified.

  17. ISM Approach to Model Offshore Outsourcing Risks

    Directory of Open Access Journals (Sweden)

    Sunand Kumar

    2014-07-01

    Full Text Available In an effort to achieve a competitive advantage via cost reductions and improved market responsiveness, organizations are increasingly employing offshore outsourcing as a major component of their supply chain strategies. But as is evident from the literature, a number of risks, such as political risk, risk due to cultural differences, compliance and regulatory risk, opportunistic risk and organizational structural risk, adversely affect the performance of offshore outsourcing in a supply chain network. This also leads to dissatisfaction among different stakeholders. The main objective of this paper is to identify and understand the mutual interaction among the various risks which affect the performance of offshore outsourcing. To this end, the authors have identified various risks through an extensive review of the literature. From this information, an integrated model of the risks affecting offshore outsourcing is developed using interpretive structural modelling (ISM), and the structural relationships between these risks are modeled. Further, MICMAC analysis is done to analyze the driving power and dependence of the risks, which shall help managers to identify and classify important criteria and to reveal the direct and indirect effects of each criterion on offshore outsourcing. Results show that political risk and risk due to cultural differences act as strong drivers.
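
    A compact sketch of the ISM mechanics the abstract describes: from a binary influence (adjacency) matrix over the named risks, compute the transitive-closure reachability matrix and read off each risk's driving power and dependence for MICMAC. The 5x5 matrix entries are invented for illustration:

```python
# A sketch only: the adjacency entries below are illustrative, not the paper's.
import numpy as np

risks = ["political", "cultural", "compliance", "opportunistic", "structural"]
A = np.array([[1, 1, 1, 0, 1],    # row i -> column j: risk i influences risk j
              [0, 1, 0, 1, 1],
              [0, 0, 1, 1, 0],
              [0, 0, 0, 1, 0],
              [0, 0, 0, 1, 1]], dtype=bool)

R = A.copy()                      # Warshall-style boolean transitive closure
for k in range(len(risks)):
    R |= np.outer(R[:, k], R[k, :])

for i, name in enumerate(risks):  # MICMAC: row sum drives, column sum depends
    print(f"{name:13s} driving power = {R[i].sum()}, dependence = {R[:, i].sum()}")
```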

  18. Concordance for prognostic models with competing risks

    DEFF Research Database (Denmark)

    Wolbers, Marcel; Blanche, Paul; Koller, Michael T

    2014-01-01

    The concordance probability is a widely used measure to assess discrimination of prognostic models with binary and survival endpoints. We formally define the concordance probability for a prognostic model of the absolute risk of an event of interest in the presence of competing risks and relate it ... of the working model. We further illustrate the methods by computing the concordance probability for a prognostic model of coronary heart disease (CHD) events in the presence of the competing risk of non-CHD death.

  19. Caries risk assessment models in caries prediction

    Directory of Open Access Journals (Sweden)

    Amila Zukanović

    2013-11-01

    Full Text Available Objective. The aim of this research was to assess the efficiency of different multifactor models in caries prediction. Material and methods. Data from a questionnaire and the objective examination of 109 examinees were entered into the Cariogram, Previser and Caries-Risk Assessment Tool (CAT) multifactor risk assessment models. Caries risk was assessed with all three models for each patient, classifying them as low-, medium- or high-risk patients. The development of new caries lesions over a period of three years [Decayed Missing Filled Teeth (DMFT) increment = difference between the Decayed Missing Filled Tooth Surfaces (DMFTS) index at baseline and follow-up] allowed examination of the predictive capacity of the different multifactor models. Results. The data gathered showed that the different multifactor risk assessment models give significantly different results (Friedman test: Chi square = 100.073, p = 0.000). The Cariogram was the model which identified the majority of examinees as medium-risk patients (70%). The other two models were more radical in risk assessment, giving more unfavorable risk profiles for patients. In only 12% of the patients did the three multifactor models assess the risk in the same way. Previser and CAT gave the same results in 63% of cases - the Wilcoxon test showed that there is no statistically significant difference in caries risk assessment between these two models (Z = -1.805, p = 0.071). Conclusions. Evaluation of three different multifactor caries risk assessment models (Cariogram, PreViser and CAT) showed that only the Cariogram can successfully predict new caries development in 12-year-old Bosnian children.

  20. Why operational risk modelling creates inverse incentives

    NARCIS (Netherlands)

    Doff, R.

    2015-01-01

    Operational risk modelling has become commonplace in large international banks and is gaining popularity in the insurance industry as well. This is partly due to financial regulation (Basel II, Solvency II). This article argues that operational risk modelling is fundamentally flawed, despite efforts

  1. GERMcode: A Stochastic Model for Space Radiation Risk Assessment

    Science.gov (United States)

    Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2012-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particles or delta-ray hits for a given cellular area and particle dose, the radial dose on tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, chromosomal
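
    A back-of-envelope sketch of one of the biophysical outputs listed above: the Poisson distribution of particle hits per cell for a given fluence and cellular cross-sectional area. The fluence and the ~100 um^2 nucleus area are assumptions for illustration, not values from the GERMcode:

```python
# A sketch only: fluence and nucleus area are illustrative assumptions.
import math

fluence = 1.0e6          # particles per cm^2 (illustrative beam fluence)
area_um2 = 100.0         # assumed nucleus cross-section in um^2
mean_hits = fluence * area_um2 * 1.0e-8   # 1 um^2 = 1e-8 cm^2

for k in range(4):       # Poisson probability of k primary-particle hits
    p = math.exp(-mean_hits) * mean_hits**k / math.factorial(k)
    print(f"P({k} hits) = {p:.3f}")
```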

  2. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Full Text Available Rapid progress in intelligent sensing technology creates new interest in the development of analysis and design methods for non-conventional sampling schemes. This paper investigates event-based sampling according to the integral criterion. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined, and related work in adaptive sampling is summarized. Analytical closed-form formulas for the evaluation of the mean rate of event-based traffic and the asymptotic effectiveness of integral sampling are derived, and simulation results verifying the analytical formulas are reported. The effectiveness of integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness is exemplified for common signals which model the state evolution of dynamic systems in time.
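
    Operationally, the two schemes compared here are simple: send-on-delta reports a sample once the signal has moved by more than a fixed threshold since the last report, while the integral criterion fires once the accumulated error area crosses a threshold. A minimal Python sketch of both triggers, with illustrative thresholds and a toy test signal (not the paper's formulas for mean event rate or effectiveness):

      import numpy as np

      def send_on_delta(signal, delta):
          """Emit an event whenever the signal deviates from the last
          reported value by more than +/- delta (level-crossing)."""
          events, last = [0], signal[0]          # always report the first sample
          for i, x in enumerate(signal[1:], start=1):
              if abs(x - last) >= delta:
                  events.append(i)
                  last = x
          return events

      def send_on_integral(signal, threshold, dt=1.0):
          """Emit an event when the integrated absolute error since the
          last event exceeds the threshold (integral criterion)."""
          events, last, area = [0], signal[0], 0.0
          for i, x in enumerate(signal[1:], start=1):
              area += abs(x - last) * dt         # rectangle-rule accumulation
              if area >= threshold:
                  events.append(i)
                  last, area = x, 0.0
          return events

      t = np.linspace(0, 10, 1000)
      sig = np.sin(t)
      print(len(send_on_delta(sig, 0.1)), len(send_on_integral(sig, 0.05, dt=t[1] - t[0])))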

  3. An event-based neurobiological recognition system with orientation detector for objects in multiple orientations

    Directory of Open Access Journals (Sweden)

    Hanyu Wang

    2016-11-01

    Full Text Available A new multiple-orientation event-based neurobiological recognition system, which integrates recognition and tracking functions, is proposed in this paper for asynchronous address-event representation (AER) image sensors. The system is able to recognize objects in multiple orientations using training samples moving in a single orientation only. It extracts multi-scale and multi-orientation line features inspired by models of the primate visual cortex. An orientation detector based on a modified Gaussian blob tracking algorithm is introduced for object tracking and orientation detection. The orientation detector and the feature extraction block work simultaneously, without any increase in categorization time. An address lookup table (address LUT) is also presented to adjust the feature maps by address mapping and reordering, and the maps are categorized in the trained spiking neural network. The recognition system is evaluated on the MNIST dataset, which has played an important role in the development of computer vision, and accuracy is increased owing to the use of both ON and OFF events. AER data acquired by a DVS, such as moving digits, poker cards, and vehicles, are also tested on the system. The experimental results show that the proposed system can realize event-based multi-orientation recognition. The work presented in this paper makes a number of contributions to event-based vision processing for multi-orientation object recognition: it adds a tracking-recognition architecture to a feedforward categorization system and an address-reordering approach to classify multi-orientation objects using event-based data, and it provides a new way to recognize objects in multiple orientations with training samples in a single orientation only.

  4. The link between alcohol use and aggression toward sexual minorities: an event-based analysis.

    Science.gov (United States)

    Parrott, Dominic J; Gallagher, Kathryn E; Vincent, Wilson; Bakeman, Roger

    2010-09-01

    The current study used an event-based assessment approach to examine the day-to-day relationship between heterosexual men's alcohol consumption and perpetration of aggression toward sexual minorities. Participants were 199 heterosexual drinking men aged 18 to 30 who completed (1) separate timeline followback interviews to assess alcohol use and aggression toward sexual minorities during the past year, and (2) written self-report measures of risk factors for aggression toward sexual minorities. Results indicated that aggression toward sexual minorities was twice as likely on days when drinking was reported than on nondrinking days, with over 80% of alcohol-related aggressive acts perpetrated within a group context. Patterns of alcohol use (i.e., number of drinking days, mean drinks per drinking day, number of heavy drinking days) were not associated with perpetration after controlling for demographic variables and pertinent risk factors. Results suggest that it is the acute effects of alcohol, and not men's patterns of alcohol consumption, that facilitate aggression toward sexual minorities. More importantly, these data are the first to support an event-based link between alcohol use and aggression toward sexual minorities (or any minority group), and provide the impetus for future research to examine risk factors and mechanisms for intoxicated aggression toward sexual minorities and other stigmatized groups.

  5. Energy risk management and value at risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sadeghi, Mehdi [Economics department, Imam Sadiq University, P.B. 14655-159, Tehran (Iran, Islamic Republic of)]. E-mail: sadeghi@isu.ac.ir; Shavvalpour, Saeed [Economics department, Imam Sadiq University, P.B. 14655-159, Tehran (Iran, Islamic Republic of)]. E-mail: shavalpoor@isu.ac.ir

    2006-12-15

    The value of energy trades can change over time with market conditions and underlying price variables. The rise of competition and deregulation in energy markets has led to relatively free energy markets that are characterized by high price shifts. Within oil markets, the volatile oil price environment after the OPEC agreements of the 1970s requires risk quantification. 'Value-at-risk' has become an essential tool for quantifying market risk. There are various methods for calculating value-at-risk. The methods introduced in this paper are Historical Simulation with ARMA Forecasting (HSAF) and Variance-Covariance based on GARCH modeling. The results show that, among the various approaches, the HSAF methodology presents more efficient results: at a confidence level of 99%, the value-at-risk calculated through the HSAF methodology is greater than the actual price change in almost 97.6 percent of the forecasting period.
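
    As a rough illustration of the historical-simulation building block behind HSAF, the sketch below computes a one-day 99% VaR as an empirical quantile of past returns; the ARMA forecasting layer and the GARCH-based variance-covariance alternative are not reproduced, and the synthetic series is a stand-in for oil-price returns:

      import numpy as np

      def historical_var(returns, confidence=0.99):
          """One-period historical-simulation VaR: the loss level that
          empirical returns exceed with probability (1 - confidence)."""
          return -np.percentile(returns, 100 * (1 - confidence))

      rng = np.random.default_rng(0)
      oil_returns = rng.normal(0.0, 0.02, 1000)   # stand-in for daily oil-price returns
      print(f"99% one-day VaR: {historical_var(oil_returns):.4f}")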

  7. A Network Model of Credit Risk Contagion

    Directory of Open Access Journals (Sweden)

    Ting-Qiang Chen

    2012-01-01

    Full Text Available A network model of credit risk contagion is presented, in which the effects of the behaviors of credit risk holders, the financial market regulators, and the network structure are considered. By introducing stochastic dominance theory, we discuss the mechanisms by which the degree of individual relationships, individual attitudes to credit risk contagion, individual ability to resist credit risk contagion, the monitoring strength of the financial market regulators, and the network structure affect credit risk contagion. The derived and proved propositions are then verified through numerical simulations.

  8. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing initiator event frequencies and conditional probabilities in the risk model. Such data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real-case expert judgement data, are discussed. The role of a degree-of-belief type probability in risk decision making is also discussed.

  9. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.
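
    The state-independent part of the threshold described above, an exponentially decaying bound on the deviation from the last broadcast state, can be illustrated for a single node; the constants and trajectory below are invented, and the paper's full triggering condition additionally involves neighbors' broadcast states:

      import numpy as np

      def broadcast_instants(x, t, c=0.5, alpha=0.3):
          """Broadcast the state only when its deviation from the last
          broadcast value exceeds the state-independent exponentially
          decaying threshold c*exp(-alpha*t)."""
          events, last = [0], x[0]
          for i in range(1, len(t)):
              if abs(x[i] - last) > c * np.exp(-alpha * t[i]):
                  events.append(i)
                  last = x[i]
          return events

      t = np.linspace(0, 20, 2000)
      x = np.exp(-0.2 * t) * np.cos(t)            # a decaying trajectory
      print(f"{len(broadcast_instants(x, t))} broadcasts instead of {len(t)} samples")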

  10. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  11. Competing Risks Copula Models for Unemployment Duration

    DEFF Research Database (Denmark)

    Lo, Simon M. S.; Stephan, Gesine; Wilke, Ralf

    2017-01-01

    The copula graphic estimator (CGE) for competing risks models has received little attention in empirical research, despite having been developed into a comprehensive research method. In this paper, we bridge the gap between theoretical developments and applied research by considering a general class of competing risks copula models, which nests popular models such as the Cox proportional hazards model, the semiparametric multivariate mixed proportional hazards model (MMPHM), and the CGE as special cases. Analyzing the effects of a German Hartz reform on unemployment duration, we illustrate

  12. Modeling foreign exchange risk premium in Armenia

    NARCIS (Netherlands)

    Poghosyan, Tigran; Kocenda, Evnen; Zemcik, Petr

    2008-01-01

    This paper applies stochastic discount factor methodology to modeling the foreign exchange risk premium in Armenia. We use weekly data on foreign and domestic currency deposits, which coexist in the Armenian banking system. This coexistence implies elimination of the cross-country risks and

  13. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector width in order to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
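
    The qualitative bank-size result can be reproduced with a toy drain model: each event-iteration fills up to the vector width with surviving particles, and lanes idle once the bank shrinks below that width. This sketch is only consistent with the constant-event-time case and is not the authors' formulation:

      import numpy as np

      def vector_efficiency(bank_size, vector_width, p_end=0.1, seed=0):
          """Toy drain model: each iteration executes one event for up to
          `vector_width` particles; a particle finishes its history with
          probability `p_end` after each event. Efficiency is the fraction
          of vector lanes doing useful work."""
          rng = np.random.default_rng(seed)
          alive, useful, lanes = bank_size, 0, 0
          while alive > 0:
              batch = min(alive, vector_width)
              useful += batch
              lanes += vector_width
              alive -= rng.binomial(batch, p_end)   # histories that terminated
          return useful / lanes

      for bank in (160, 320, 3200):                 # 10x, 20x, 200x a width of 16
          print(bank, round(vector_efficiency(bank, 16), 3))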

  14. Modeling Risk Convergence for European Financial Markets

    Directory of Open Access Journals (Sweden)

    Radu LUPU

    2014-09-01

    Full Text Available This article studies the convergence of risk on a sample of 13 European indexes. We use a set of 31 specifications of models belonging to the GARCH class and, from their estimates, build an aggregate index in a Value-at-Risk approach. We use this index as the basis for our convergence analysis. The results indicate a positive and significant tendency of growing risk convergence in the European financial market.

  15. Risk Measurement and Risk Modelling using Applications of Vine Copulas

    NARCIS (Netherlands)

    D.E. Allen (David); M.J. McAleer (Michael); A.K. Singh (Abhay)

    2014-01-01

    This paper features an application of Regular Vine copulas, a novel and recently developed statistical and mathematical tool which can be applied in the assessment of composite financial risk. Copula-based dependence modelling is a popular tool in financial

  16. Risk measurement and risk modelling using applications of Vine copulas

    NARCIS (Netherlands)

    D.E. Allen (David); M.J. McAleer (Michael); A.K. Singh (Abhay)

    2017-01-01

    This paper features an application of Regular Vine copulas, a novel and recently developed statistical and mathematical tool which can be applied in the assessment of composite financial risk. Copula-based dependence modelling is a popular tool in financial applications, but is

  17. Building caries risk assessment models for children.

    Science.gov (United States)

    Gao, X-L; Hsu, C-Y S; Xu, Y; Hwarng, H B; Loh, T; Koh, D

    2010-06-01

    Despite the well-recognized importance of caries risk assessment, practical models remain to be established. This study was designed to develop biopsychosocial models for caries risk assessment in various settings. With a questionnaire, an oral examination, and biological (salivary, microbiological, and plaque pH) tests, a prospective study was conducted among 1782 children aged 3-6 years, with 1576 (88.4%) participants followed at 12 months. Multiple risk factors, indicators, and protective factors were identified. Various risk assessment models were constructed by random selection of 50% of the cases and further validated on the remaining cases. For the prediction of a one-year caries increment, screening models without biological tests achieved a sensitivity/specificity of 82%/73%; with biological tests, full-blown models achieved a sensitivity/specificity of 90%/90%. For identification of the quarter of children with a high caries burden (baseline dmft > 2), a community-screening model requiring only a questionnaire reached a sensitivity/specificity of 82%/81%. These models are promising tools for cost-effective caries control and evidence-based treatment planning. Abbreviations: decayed, missing, filled teeth in primary dentition (dmft); receiver operating characteristic (ROC); relative risk (RR); confidence interval (CI); National Institutes of Health (NIH); World Health Organization (WHO); US Department of Health and Human Services (US/DHHS); American Academy of Pediatric Dentistry (AAPD).

  18. Event-Based User Classification in Weibo Media

    Science.gov (United States)

    Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo, known for its real-time microblogging services, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and has significantly changed the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different contents, or exhibit different behaviors or attitudes, may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organization/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately. PMID:25133235

  19. Event-based internet biosurveillance: relation to epidemiological observation

    Directory of Open Access Journals (Sweden)

    Nelson Noele P

    2012-06-01

    Full Text Available Abstract Background The World Health Organization (WHO) collects and publishes surveillance data and statistics for select diseases, but traditional methods of gathering such data are time and labor intensive. Event-based biosurveillance, which utilizes a variety of Internet sources, complements traditional surveillance. In this study we assess the reliability of Internet biosurveillance and evaluate disease-specific alert criteria against epidemiological data. Methods We reviewed and compared WHO epidemiological data and Argus biosurveillance system data for pandemic (H1N1) 2009 (April 2009 – January 2010) from 8 regions and 122 countries to: identify reliable alert criteria among 15 Argus-defined categories; determine the degree of data correlation for disease progression; and assess timeliness of Internet information. Results Argus generated a total of 1,580 unique alerts; 5 alert categories generated statistically significant results. Conclusion Confirmed pandemic (H1N1) 2009 cases collected by Argus and WHO methods returned consistent results and confirmed the reliability and timeliness of Internet information. Disease-specific alert criteria provide situational awareness and may serve as proxy indicators of event progression and escalation in lieu of traditional surveillance data; alerts may identify early-warning indicators of another pandemic, preparing the public health community for disease events.

  20. Techniques and Simulation Models in Risk Management

    Directory of Open Access Journals (Sweden)

    Mirela GHEORGHE

    2012-12-01

    Full Text Available In the present paper, the scientific approach of the research starts from the theoretical framework of the simulation concept and then continues into practice, providing simulation models for a broad range of inherent risks specific to any organization and simulating those models using the @Risk (Palisade) software. The reason behind this research lies in the need for simulation models that allow the person in charge of decision making in the field of risk management to adopt new corporate strategies answering their current needs. The results of the research are two simulation models specific to risk management. The first model simulates net profit as well as the impact that could be generated by a series of inherent risk factors, such as the loss of some important colleagues, a drop in selling prices, a drop in sales volume, retrofitting, and so on. The second simulation model is associated with the IT field, analysing 10 informatics threats in order to evaluate the potential financial loss.

  1. Models to Assess the Bankruptcy Risk

    Directory of Open Access Journals (Sweden)

    Simona Valeria TOMA

    2013-08-01

    Full Text Available Closely related to financial risk assessment, one of the main concerns of organizations in this period of slow economic growth should be the evaluation of bankruptcy risk. Organization bankruptcies have increased in recent years worldwide. The aim of this paper is to demonstrate that methods and models for forecasting organizational bankruptcy serve as a check on the financial health of an entity in financial accounting diagnosis, and that organizations require an assessment of the risks accompanying their activity, among which some signal fragility (vulnerable health) and others project bankruptcy (insolvency) threatening their survival (continuity). Bankruptcy risk assessment is important for profit-seeking investors, because knowing how to value a company in or near bankruptcy is an important skill; but to detect any signs of looming bankruptcy it is necessary to calculate and analyse all kinds of financial ratios: working capital, profitability, debt levels and liquidity.

  2. Risk Measurement and Risk Modelling Using Applications of Vine Copulas

    Directory of Open Access Journals (Sweden)

    David E. Allen

    2017-09-01

    Full Text Available This paper features an application of Regular Vine copulas, a novel and recently developed statistical and mathematical tool which can be applied in the assessment of composite financial risk. Copula-based dependence modelling is a popular tool in financial applications, but is usually applied to pairs of securities. By contrast, Vine copulas provide greater flexibility and permit the modelling of complex dependency patterns using the rich variety of bivariate copulas which may be arranged and analysed in a tree structure to explore multiple dependencies. The paper features the use of Regular Vine copulas in an analysis of the co-dependencies of 10 major European Stock Markets, as represented by individual market indices and the composite STOXX 50 index. The sample runs from 2005 to the end of 2013 to permit an exploration of how correlations change in different economic circumstances, using three different sample periods: pre-GFC (January 2005–July 2007), GFC (July 2007–September 2009), and post-GFC (September 2009–December 2013). The empirical results suggest that the dependencies change in a complex manner and are subject to change in different economic circumstances. One of the attractions of this approach to risk modelling is the flexibility in the choice of distributions used to model co-dependencies. The practical application of Regular Vine metrics is demonstrated via an example of the calculation of the VaR of a portfolio made up of the indices.

  3. Modelling of Systemic Risk of Banking Sector

    Directory of Open Access Journals (Sweden)

    Laura Gudelytė

    2014-03-01

    Full Text Available Purpose – to evaluate general networking and simulation approaches to modelling systemic risk and financial contagion, and their ability to assess the resilience of the banking sector in the case of external economic shocks and the collapse of idiosyncratic financial institutions. Design/methodology/approach – a general overview of research papers presenting concepts and methodologies for assessing systemic risk in the banking sector. Findings – limitations of the networking approach and possible ways to improve the modelling of systemic risk. The network approach cannot explain the causes of a bank's initial default, and the assumptions made on LGD and interbank exposures are very strong; these features are important limitations of network and simulation approaches. Research limitations/implications – the application of the reviewed methods to the Lithuanian banking sector fails, however, due to the lack of exhaustive data. The methods applied to systemic risk until now have been limited for the same reason, which also makes it difficult to create adequate dynamic models of systemic risk. Therefore, in assessing the systemic risk of the banking sector, the same problem remains: whether it is possible to parameterize the financial crisis, its spread, speed and other characteristics using quantitative methods. Knowing the liquidity, credit risk and other standards set in the Basel Accords is also not enough to properly manage the systemic risk of the whole banking sector, because the proper functioning of the banking sector is influenced not only by characteristics related to capital requirements but also by external (mostly macroeconomic and political) factors. Practical implications – determining an explicit connection, based on quantitative methods, for the systemic risk of the banking sector would provide an exact and objective assessment, useful not only for the

  4. Risk management model of winter navigation operations.

    Science.gov (United States)

    Valdez Banda, Osiris A; Goerlandt, Floris; Kuzmin, Vladimir; Kujala, Pentti; Montewka, Jakub

    2016-07-15

    The wintertime maritime traffic operations in the Gulf of Finland are managed through the Finnish-Swedish Winter Navigation System, which establishes the requirements and limitations for vessels navigating when ice covers this area. During winter navigation in the Gulf of Finland, the largest risk stems from accidental ship collisions, which may also trigger oil spills. In this article, a model for managing the risk of winter navigation operations is presented. The model analyses the probability of oil spills derived from collisions involving oil tanker vessels and other vessel types. The model structure is based on the steps of the Formal Safety Assessment (FSA) of the International Maritime Organization (IMO), adapted into a Bayesian Network model. The results indicate that independent ship navigation and convoys are the operations with the highest probability of oil spills. Minor spills are most probable, while major oil spills were found to be very unlikely but possible. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Improvement of hydrological flood forecasting through an event based output correction method

    Science.gov (United States)

    Klotz, Daniel; Nachtnebel, Hans Peter

    2014-05-01

    This contribution presents an output correction method for hydrological models. A conceptualisation of the method is presented and tested in an alpine basin in Salzburg, Austria. The aim is to develop a method which is not prone to the drawbacks of autoregressive models. Output correction methods are an attractive option for improving hydrological predictions: they are complementary to the main modelling process and do not interfere with the modelling process itself. In general, output correction models estimate the future error of a prediction and use the estimate to improve the given prediction. Different estimation techniques are available depending on the utilized information and the estimation procedure itself. Autoregressive error models are widely used for such corrections. Autoregressive models with exogenous inputs (ARX) allow the use of additional information for the error modelling, e.g. measurements from upstream basins or predicted input signals. Autoregressive models do, however, exhibit deficiencies, since the errors of hydrological models generally do not behave in an autoregressive manner: the decay of the error usually differs from an autoregressive function, and the residuals exhibit different patterns under different circumstances. As an example, one might consider the different error-propagation behaviours under high-flow, low-flow and snowmelt-driven conditions. This contribution presents a conceptualisation of an event-based correction model and focuses on flood events only. The correction model uses information about the history of the residuals and exogenous variables to produce an error estimate. The structure and parameters of the correction models can be adapted to given event classes. An event class is a set of flood events that exhibit a similar pattern in the residuals or the hydrological conditions. In total, four different event classes have been identified in this study. Each of them represents a different

  6. A Probabilistic Asteroid Impact Risk Model

    Science.gov (United States)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
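
    The Monte Carlo structure described here (sample uncertain impactor properties, compute per-scenario consequences, aggregate into an outcome distribution) can be sketched in a few lines; the distributions and the ground-damage proxy below are placeholders rather than the published PAIR parameterization:

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Sample uncertain impactor properties (illustrative distributions only).
      diameter_m = rng.lognormal(mean=np.log(50), sigma=0.5, size=n)
      velocity_ms = rng.uniform(11e3, 30e3, size=n)
      density = rng.uniform(1500, 3500, size=n)              # kg/m^3

      # Impact energy in megatons TNT (1 Mt = 4.184e15 J).
      mass = density * (np.pi / 6) * diameter_m**3           # spherical impactor
      energy_mt = 0.5 * mass * velocity_ms**2 / 4.184e15

      # Crude damage proxy: affected area scaling like E^(2/3), times a
      # randomly sampled local population density (people per km^2).
      affected_km2 = 10.0 * energy_mt ** (2 / 3)
      casualties = affected_km2 * rng.exponential(50, size=n)

      print("median outcome:", int(np.median(casualties)),
            "| 99th percentile:", int(np.percentile(casualties, 99)))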

  7. Risk modelling and management: An overview

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); D.E. Allen (David); M.J. McAleer (Michael); T. Pérez-Amaral (Teodosio)

    2013-01-01

    The papers in this special issue of Mathematics and Computers in Simulation are substantially revised versions of the papers that were presented at the 2011 Madrid International Conference on "Risk Modelling and Management" (RMM2011). The papers cover the following topics: currency

  9. Malignancy risk models for oral lesions.

    Science.gov (United States)

    Zarate, Ana-María; Brezzo, María-Magdalena; Secchi, Dante-Gustavo; Barra, José-Luis; Brunotto, Mabel

    2013-09-01

    The aim of this work was to assess risk habits, clinical and cellular phenotypes and TP53 DNA changes in oral mucosa samples from patients with Oral Potentially Malignant Disorders (OPMD), in order to create models that yield genotypic and phenotypic patterns determining the risk of lesions becoming malignant. Clinical phenotypes, family history of cancer and risk habits were collected in clinical histories. TP53 gene mutations and morphometric-morphological features were studied, and multivariate models were applied. Three groups were established: a) an oral cancer (OC) group (n=10), b) an oral potentially malignant disorders group (n=10), and c) a control group (n=8). An average of 50% of patients with malignancy were found to have smoking and drinking habits. A high percentage of TP53 mutations were observed in OC (30%) and OPMD (average 20%) lesions (p=0.000). The majority of these mutations were GC→TA transversions (60%). However, patients with OC presented mutations in all the exons and introns studied. The highest diagnostic accuracy (p=0.0001) was observed when incorporating the alcohol and tobacco habit variables with TP53 mutations. Our results prove to be statistically reliable, with parameter estimates that are nearly unbiased even for small sample sizes. Models 2 and 3 were the most accurate for assessing the risk of an OPMD becoming cancerous. However, in a public health context, model 3 is the most recommended because the characteristics considered are easier and less costly to evaluate.

  10. Issues in Value-at-Risk Modeling and Evaluation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); C.G. de Vries (Casper); B.N. Jorgensen (Bjørn); P.F. Christoffersen (Peter); F.X. Diebold (Francis); T. Schuermann (Til); J.A. Lopez (Jose); B. Hirtle (Beverly)

    1998-01-01

    Discusses the issues in value-at-risk modeling and evaluation: the value of value-at-risk; horizon problems and extreme events in financial risk management; and methods of evaluating value-at-risk estimates.

  11. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcomes through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general, and the CORAS application of MBRA in particular, have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the possibilities for further improving the CORAS methodology towards utilization in more complex architectures, and in other application domains such as the nuclear field, can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  12. Landslide risk models for decision making.

    Science.gov (United States)

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
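
    The risk definition used in this procedure, expected loss as the product of hazard (occurrence probability), exposure (value of the elements at risk) and vulnerability (fraction of value lost per event), reduces to a one-line calculation per map zone, as in this sketch with hypothetical zones:

      # Hypothetical map zones: (annual landslide probability, exposed value
      # in EUR, vulnerability = fraction of value lost per event).
      zones = {
          "A": (0.02, 5_000_000, 0.30),
          "B": (0.005, 20_000_000, 0.10),
          "C": (0.10, 800_000, 0.60),
      }

      for name, (hazard, exposure, vulnerability) in zones.items():
          expected_annual_loss = hazard * exposure * vulnerability
          print(f"zone {name}: expected annual loss = {expected_annual_loss:,.0f} EUR")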

  13. Mathematical modelling of risk reduction in reinsurance

    Science.gov (United States)

    Balashov, R. B.; Kryanev, A. V.; Sliva, D. E.

    2017-01-01

    The paper presents a mathematical model of efficient portfolio formation in the reinsurance markets. The presented approach provides the optimal ratio between the expected value of return and the risk of yield values below a certain level. The uncertainty in the return values is conditioned by use of expert evaluations and preliminary calculations, which result in expected return values and the corresponding risk levels. The proposed method allows for implementation of computationally simple schemes and algorithms for numerical calculation of the numerical structure of the efficient portfolios of reinsurance contracts of a given insurance company.
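
    The portfolio criterion stated in the abstract, maximize expected return subject to a bound on the probability that the yield falls below a given level, can be illustrated by a brute-force search over portfolio weights; the two contracts and the 10% shortfall tolerance are invented for illustration:

      import numpy as np

      rng = np.random.default_rng(7)
      # Simulated annual yields of two hypothetical reinsurance contracts.
      a = rng.normal(0.08, 0.15, 10_000)        # volatile, higher expected return
      b = rng.normal(0.05, 0.03, 10_000)        # stable, lower expected return

      best = None
      for w in np.linspace(0, 1, 101):
          port = w * a + (1 - w) * b
          shortfall = (port < 0.0).mean()       # probability of a negative yield
          if shortfall <= 0.10 and (best is None or port.mean() > best[1]):
              best = (w, port.mean(), shortfall)

      w, mu, sf = best
      print(f"weight in contract A: {w:.2f}, expected return: {mu:.3f}, "
            f"shortfall probability: {sf:.3f}")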

  14. Public sector risk management: a specific model.

    Science.gov (United States)

    Lawlor, Ted

    2002-07-01

    Risk management programs for state mental health authorities are generally limited in scope and reactive in nature. Recent changes in how mental health care is provided render it necessary to redirect the risk management focus from its present institutional basis to a statewide, network-based paradigm that is integrated across public and private inpatient and community programs alike. These changes include treating an increasing number of individuals in less-secure settings and contracting for an increasing number of public mental health services with private providers. The model proposed here is closely linked to the Quality Management Process.

  15. Value at Risk models for Energy Risk Management

    OpenAIRE

    Novák, Martin

    2010-01-01

    The main focus of this thesis lies on the description of Risk Management in the context of Energy Trading. The paper predominantly discusses Value at Risk and its modifications as the main overall indicator of Energy Risk.

  16. Crop insurance: Risks and models of insurance

    Directory of Open Access Journals (Sweden)

    Čolović Vladimir

    2014-01-01

    Full Text Available The issue of crop protection is very important because of a variety of risks that could cause severe consequences. One type of risk protection is insurance. The author presents various models of insurance in some EU countries and the systems of subsidizing insurance premiums by the state. The author also gives a picture of crop insurance in the U.S., noting that this country pays great attention to this matter. Crop insurance in Serbia, by contrast, is not at a high level. The main problem with crop insurance is not only the risks but also the way of protection through insurance. The basic question that arises, not only in the EU, is who will insure and protect crops. There are three possibilities: insurance companies under state control, insurance companies that are public-private partnerships, or private insurance companies on a purely commercial basis.

  17. Contributions of cerebellar event-based temporal processing and preparatory function to speech perception.

    Science.gov (United States)

    Schwartze, Michael; Kotz, Sonja A

    2016-10-01

    The role of the cerebellum in the anatomical and functional architecture of the brain is a matter of ongoing debate. We propose that cerebellar temporal processing contributes to speech perception on a number of accounts: temporally precise cerebellar encoding and rapid transmission of an event-based representation of the temporal structure of the speech signal serves to prepare areas in the cerebral cortex for the subsequent perceptual integration of sensory information. As speech dynamically evolves in time this fundamental preparatory function may extend its scope to the predictive allocation of attention in time and supports the fine-tuning of temporally specific models of the environment. In this framework, an oscillatory account considering a range of frequencies may best serve the linking of the temporal and speech processing systems. Lastly, the concerted action of these processes may not only advance predictive adaptation to basic auditory dynamics but optimize the perceptual integration of speech. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Construction Safety Risk Modeling and Simulation.

    Science.gov (United States)

    Tixier, Antoine J-P; Hallowell, Matthew R; Rajagopalan, Balaji

    2017-10-01

    By building on a genetic-inspired attribute-based conceptual framework for safety risk analysis, we propose a novel approach to define, model, and simulate univariate and bivariate construction safety risk at the situational level. Our fully data-driven techniques provide construction practitioners and academicians with an easy and automated way of getting valuable empirical insights from attribute-based data extracted from unstructured textual injury reports. By applying our methodology on a data set of 814 injury reports, we first show the frequency-magnitude distribution of construction safety risk to be very similar to that of many natural phenomena such as precipitation or earthquakes. Motivated by this observation, and drawing on state-of-the-art techniques in hydroclimatology and insurance, we then introduce univariate and bivariate nonparametric stochastic safety risk generators based on kernel density estimators and copulas. These generators enable the user to produce large numbers of synthetic safety risk values faithful to the original data, allowing safety-related decision making under uncertainty to be grounded on extensive empirical evidence. One of the implications of our study is that like natural phenomena, construction safety may benefit from being studied quantitatively by leveraging empirical data rather than strictly being approached through a managerial perspective using subjective data, which is the current industry standard. Finally, a side but interesting finding is that in our data set, attributes related to high energy levels (e.g., machinery, hazardous substance) and to human error (e.g., improper security of tools) emerge as strong risk shapers. © 2017 Society for Risk Analysis.
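
    The univariate generator described above, a kernel density estimator fitted to observed risk values and then resampled, can be sketched with standard tools; the gamma-distributed sample below merely mimics a heavy-tailed set of 814 report-level risk values:

      import numpy as np
      from scipy.stats import gaussian_kde

      np.random.seed(1)                          # resampling draws from the global state
      rng = np.random.default_rng(1)
      observed_risk = rng.gamma(shape=1.2, scale=2.0, size=814)  # stand-in sample

      kde = gaussian_kde(observed_risk)          # nonparametric fit to the empirical data
      synthetic_risk = kde.resample(10_000).ravel()

      print("observed mean:", round(float(observed_risk.mean()), 2),
            "| synthetic mean:", round(float(synthetic_risk.mean()), 2))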

  19. Human Plague Risk: Spatial-Temporal Models

    Science.gov (United States)

    Pinzon, Jorge E.

    2010-01-01

    This chapter reviews the use of spatial-temporal models in identifying potential risks of plague outbreaks in the human population. Using satellite remote-sensing earth observations, there has been systematic analysis and mapping of the close coupling between the vectors of the disease and climate variability. The overall result is that the incidence of plague is correlated with positive phases of the El Niño/Southern Oscillation (ENSO).

  20. Operational risk modeled analytically II: the consequences of classification invariance

    OpenAIRE

    Vivien Brunel

    2015-01-01

    Most of the banks' operational risk internal models are based on loss pooling in risk and business line categories. The parameters and outputs of operational risk models are sensitive to the pooling of the data and the choice of the risk classification. In a simple model, we establish the link between the number of risk cells and the model parameters by requiring invariance of the bank's loss distribution upon a change in classification. We provide details on the impact of this requirement on...

  1. Asteroid! An Event-Based Science Module. Teacher's Guide. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event- based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  2. Asteroid! An Event-Based Science Module. Student Edition. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  3. Oil Spill! An Event-Based Science Module. Student Edition. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  4. Oil Spill!: An Event-Based Science Module. Teacher's Guide. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event- based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  5. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  6. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  7. Hurricane! An Event-Based Science Module. Student Edition. Meteorology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  8. Hurricane!: An Event-Based Science Module. Teacher's Guide. Meteorology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn about problems with hurricanes and scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning,…

  9. Fraud! An Event-Based Science Module. Student Edition. Chemistry Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  10. Fraud! An Event-Based Science Module. Teacher's Guide. Chemistry Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  11. Groundwater Risk Assessment Model (GRAM: Groundwater Risk Assessment Model for Wellfield Protection

    Directory of Open Access Journals (Sweden)

    Nara Somaratne

    2013-09-01

    Full Text Available A groundwater risk assessment was carried out for 30 potable water supply systems under a framework of protecting drinking water quality across South Australia. A semi-quantitative Groundwater Risk Assessment Model (GRAM) was developed based on a "multi-barrier" approach, using an equation combining likelihood of release, contaminant pathway and consequence. Groundwater vulnerability and well integrity were incorporated into the pathway component of the risk equation. The land use of the study basins varies from protected water reserves to heavily stocked grazing lands. Based on the risk assessment, 15 systems were considered low risk, four medium risk and 11 high risk. The GRAM risk levels were comparable with detection of the indicator bacterium, total coliform. Most high-risk systems were the result of poor well construction and casing corrosion rather than land use. We carried out risk management actions, including changes to well designs and well operational practices, designs to increase residence time, and setting the production zone below identified low-permeability zones to provide additional barriers to contaminants. The highlight of the risk management element is the well integrity testing using downhole geophysical methods and camera views of the casing condition.
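
    A semi-quantitative scheme of this shape typically multiplies ratings for the three terms of the risk equation and bands the product into low/medium/high. The sketch below is a hypothetical scoring of that form; the actual GRAM ratings and cut-offs are not given in this abstract:

      def gram_score(likelihood, pathway, consequence):
          """Hypothetical semi-quantitative score: product of 1-5 ratings
          for likelihood of release, contaminant pathway (vulnerability and
          well integrity) and consequence. Ratings and cut-offs are
          illustrative, not the published GRAM calibration."""
          score = likelihood * pathway * consequence
          band = "low" if score <= 15 else "medium" if score <= 40 else "high"
          return score, band

      # e.g. a corroded casing in a heavily stocked grazing catchment:
      # high pathway rating dominates the score.
      print(gram_score(likelihood=3, pathway=5, consequence=3))   # (45, 'high')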

  12. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow specification, distributed execution and verification of pervasive event-based ... defined and executed in an ordinary web browser.

  13. A Risk Management Model for Merger and Acquisition

    OpenAIRE

    B. S. Chui

    2011-01-01

    In this paper, a merger and acquisition risk management model is proposed for considering risk factors in the merger and acquisition activities. The proposed model aims to maximize the probability of success in merger and acquisition activities by managing and reducing the associated risks. The modeling of the proposed merger and acquisition risk management model is described and illustrated in this paper. The illustration result shows that the proposed model can help to screen the best targe...

  14. Regional scale ecological risk assessment: using the relative risk model

    National Research Council Canada - National Science Library

    Landis, Wayne G

    2005-01-01

    ...) in the performance of regional-scale ecological risk assessments. The initial chapters present the methodology and the critical nature of the interaction between risk assessors and decision makers...

  15. Modelling and Simulating of Risk Behaviours in Virtual Environments Based on Multi-Agent and Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Linqin Cai

    2013-11-01

    Full Text Available Due to safety and ethical issues, traditional experimental approaches to modelling underground risk behaviours can be costly, dangerous and even impossible to realize. Based on multi-agent technology, a virtual coalmine platform for risk behaviour simulation is presented to model and simulate the human-machine-environment risk factors in underground coalmines. To reveal mine workers' risk behaviours, a fuzzy emotional behaviour model is proposed to simulate underground miners' responses to potentially hazardous events, based on cognitive appraisal theories and fuzzy logic techniques. The proposed emotion model can generate more believable behaviours for virtual miners according to personalized emotion states, internal motivation needs and behaviour selection thresholds. Finally, typical accident cases of underground hazard spotting and locomotive transport were implemented. The behaviour believability of the virtual miners was evaluated with a user assessment method. Experimental results show that the proposed models can create more realistic and reasonable behaviours in virtual coalmine environments, which can improve miners' risk awareness and further train miners' emergency decision-making ability when facing unexpected underground situations.

  16. A novel probabilistic framework for event-based speech recognition

    Science.gov (United States)

    Juneja, Amit; Espy-Wilson, Carol

    2003-10-01

    One of the reasons for unsatisfactory performance of the state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but such a system for continuous speech recognition (CSR) is not known to exist. A probabilistic and statistical framework for CSR based on the idea of the representation of speech sounds by bundles of binary valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features-syllabic, sonorant and continuant-and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.

  17. Modeling Financial Risk in Telecommunication Field

    Directory of Open Access Journals (Sweden)

    Natalia V. Kuznietsova

    2017-10-01

    Full Text Available Background. The telecommunication field in Ukraine is developing dynamically, continuously renewing its offerings in response to market and consumer requirements. That is why the timely estimation of financial risks and the optimization of financial expenses, regarding the development of new components and the possible loss of clients, is an especially urgent problem today. Objective. The aim of the paper is to suggest an approach for estimating financial risks and forecasting client loss and optimal service time utilization, based on intellectual data analysis and behavior models. Methods. To determine the probability of customer loss, neural networks, gradient boosting, random forests and logistic regression are used. Survival analysis models for the possible time of client transition to another company are developed. Results. The best model for forecasting which clients intend to move to another telecommunication company turned out to be the one based on gradient boosting. Conclusions. It was shown that the timely estimation of financial losses provoked by the possible loss of clients is an urgent task for intellectual data analysis. A promising approach for optimizing the company's financial resources is determining the time period related to the possible loss of clients.
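
    As an illustration of the winning model family, the sketch below fits a gradient-boosting churn classifier on synthetic data. The feature names and data-generating process are assumptions for illustration, not the authors' dataset or pipeline.

```python
# Illustrative churn (client-loss) classifier using gradient boosting.
# Data and features are synthetic; this is a sketch, not the paper's model.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Hypothetical features: monthly spend, contract months, support calls.
X = np.column_stack([
    rng.gamma(2.0, 20.0, n),        # monthly spend
    rng.integers(1, 48, n),         # months on contract
    rng.poisson(1.5, n),            # support calls last quarter
])
# Synthetic churn signal: short tenure + many complaints raise churn odds.
logit = -1.5 - 0.04 * X[:, 1] + 0.6 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05,
                                   max_depth=3, random_state=0)
model.fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```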

  18. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    Science.gov (United States)

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

    Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: The industries involving DMF exposure in Jiangsu province were chosen as the evaluation objects in 2013 and three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR×ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ>1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The results of the Singapore semi-quantitative risk assessment model indicated that the workshop risk levels of the dry method, wet method and printing were 3.5 (high), 3.5 (high) and 2.8 (general), and the position risk levels of pasting, burdening, unreeling, rolling and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general) and 2.8 (general). The results of the occupational hazards risk assessment index method demonstrated that the position risk indices of pasting, burdening, unreeling, rolling and assisting were 42 (high), 33 (high), 23 (middle), 21 (middle) and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions and
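
    The three screening formulas quoted above can be written out directly. The sketch below implements them one-to-one with toy inputs; banding the outputs into low/general/high follows each method's own tables, which are not reproduced here.

```python
# The three risk formulas from the abstract, one function each.
from math import sqrt

def epa_hazard_quotient(ec_mg_m3: float, rfc_mg_m3: float) -> float:
    """EPA inhalation model: HQ = EC / RfC; HQ > 1 flags high risk."""
    return ec_mg_m3 / rfc_mg_m3

def singapore_risk(hazard_rating: int, exposure_rating: int) -> float:
    """Singapore semi-quantitative model: Risk = sqrt(HR x ER)."""
    return sqrt(hazard_rating * exposure_rating)

def occupational_risk_index(health_effect_level: float, exposure_ratio: float,
                            operation_condition_level: float) -> float:
    """Index = 2^(health effect level) x 2^(exposure ratio) x condition level."""
    return (2 ** health_effect_level) * (2 ** exposure_ratio) \
        * operation_condition_level

# Toy inputs; the low/general/high banding follows each method's own tables.
print(epa_hazard_quotient(ec_mg_m3=6.0, rfc_mg_m3=0.03))   # 200.0 -> HQ > 1
print(singapore_risk(5, 4))                                 # ~4.5
print(occupational_risk_index(3, 2, 1))                     # 32.0
```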

  19. SParSE++: improved event-based stochastic parameter search.

    Science.gov (United States)

    Roh, Min K; Daigle, Bernie J

    2016-11-25

    Despite the increasing availability of high performance computing capabilities, analysis and characterization of stochastic biochemical systems remain a computational challenge. To address this challenge, the Stochastic Parameter Search for Events (SParSE) was developed to automatically identify reaction rates that yield a probabilistic user-specified event. SParSE consists of three main components: the multi-level cross-entropy method, which identifies biasing parameters to push the system toward the event of interest, the related inverse biasing method, and an optional interpolation of identified parameters. While effective for many examples, SParSE depends on the existence of a sufficient amount of intrinsic stochasticity in the system of interest. In the absence of this stochasticity, SParSE can either converge slowly or not at all. We have developed SParSE++, a substantially improved algorithm for characterizing target events in terms of system parameters. SParSE++ makes use of a series of novel parameter leaping methods that accelerate the convergence rate to the target event, particularly in low stochasticity cases. In addition, the interpolation stage is modified to compute multiple interpolants and to choose the optimal one in a statistically rigorous manner. We demonstrate the performance of SParSE++ on four example systems: a birth-death process, a reversible isomerization model, SIRS disease dynamics, and a yeast polarization model. In all four cases, SParSE++ shows significantly improved computational efficiency over SParSE, with the largest improvements resulting from analyses with the strictest error tolerances. As researchers continue to model realistic biochemical systems, the need for efficient methods to characterize target events will grow. The algorithmic advancements provided by SParSE++ fulfill this need, enabling characterization of computationally intensive biochemical events that are currently resistant to analysis.
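
    A minimal sketch of the cross-entropy idea underlying SParSE follows, assuming a toy birth process as the stochastic system: candidate rates are sampled, scored by how close their Monte Carlo event probability is to the target, and the sampling distribution is refit to the elites. This illustrates the principle only; it is not the SParSE++ algorithm with its leaping and interpolation stages.

```python
# Toy cross-entropy search for a reaction rate k such that a stochastic
# simulation reaches a target event probability. The "simulation" is a
# trivial birth process invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
TARGET = 0.60          # desired probability of the event
N_SIM = 400            # Monte Carlo replicates per candidate rate

def event_probability(k: float) -> float:
    """P(population exceeds 30 within 20 steps) for a toy birth process."""
    pops = np.ones(N_SIM)
    for _ in range(20):
        pops += rng.poisson(k * pops)
    return float(np.mean(pops > 30))

mu, sigma = 0.5, 0.5                      # initial sampling distribution for k
for it in range(15):
    ks = np.abs(rng.normal(mu, sigma, 50))            # candidate rates
    errs = np.array([abs(event_probability(k) - TARGET) for k in ks])
    elites = ks[np.argsort(errs)[:10]]                # closest to target
    mu, sigma = elites.mean(), elites.std() + 1e-3    # refit and iterate

print(f"estimated rate k = {mu:.3f}, P(event) = {event_probability(mu):.2f}")
```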

  20. NGNP Risk Management Database: A Model for Managing Risk

    Energy Technology Data Exchange (ETDEWEB)

    John Collins

    2009-09-01

    To facilitate the implementation of the Risk Management Plan, the Next Generation Nuclear Plant (NGNP) Project has developed and employed an analytical software tool called the NGNP Risk Management System (RMS). A relational database developed in Microsoft® Access, the RMS provides conventional database utility including data maintenance, archiving, configuration control, and query ability. Additionally, the tool’s design provides a number of unique capabilities specifically designed to facilitate the development and execution of activities outlined in the Risk Management Plan. Specifically, the RMS provides the capability to establish the risk baseline, document and analyze the risk reduction plan, track the current risk reduction status, organize risks by reference configuration system, subsystem, and component (SSC) and Area, and increase the level of NGNP decision making.

  1. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
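
    A minimal sketch of what an identification tree might look like as a data structure follows; the node contents are invented examples, and AutSEC's real trees encode far more detail (mitigation trees would attach countermeasures and costs to each identified threat).

```python
# Sketch of an identification tree: nodes test properties of a design
# element (e.g. a data flow from a data flow diagram) and accumulate
# candidate threats along every satisfied branch. Contents are invented.
from dataclasses import dataclass, field

@dataclass
class IdNode:
    test: str = ""                             # property the element must have
    threats: list = field(default_factory=list)
    children: list = field(default_factory=list)

    def identify(self, element: dict) -> list:
        """Return threats from every branch whose tests the element satisfies."""
        if self.test and not element.get(self.test):
            return []
        found = list(self.threats)
        for child in self.children:
            found += child.identify(element)
        return found

# Example: a data flow crossing a trust boundary without encryption.
tree = IdNode(children=[
    IdNode(test="crosses_trust_boundary", threats=["tampering"], children=[
        IdNode(test="unencrypted", threats=["information_disclosure"]),
    ]),
])
flow = {"crosses_trust_boundary": True, "unencrypted": True}
print(tree.identify(flow))   # ['tampering', 'information_disclosure']
```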

  2. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  3. Risk-driven security testing using risk analysis with threat modeling approach.

    Science.gov (United States)

    Palanivel, Maragathavalli; Selvadurai, Kanmani

    2014-01-01

    Security testing is a process of determining risks present in system states and protecting them from vulnerabilities. However, security testing usually does not give due importance to threat modeling and risk analysis simultaneously, which affects the confidentiality and integrity of the system. Risk analysis includes the identification, evaluation and assessment of risks. Threat modeling is the identification of the threats associated with a system. Risk-driven security testing uses risk analysis results in test case identification, selection and assessment to prioritize and optimize the testing process. The threat modeling approach STRIDE is generally used to identify both technical and non-technical threats present in a system. Thus, a security testing mechanism based on risk analysis results using the STRIDE approach has been proposed for identifying high-risk states. The risk metrics considered for testing include risk impact, risk possibility and risk threshold. The risk threshold value is directly proportional to risk impact and risk possibility. Risk-driven security testing results in a reduced test suite, which in turn reduces test case selection time. Risk analysis optimizes the test case selection and execution process. For experimentation, the system models LMS, ATM, OBS, OSS and MTRS are considered. The performance of the proposed system is analyzed using the Test Suite Reduction Rate (TSRR) and FSM coverage. The TSRR varies from 13.16 to 21.43%, whereas FSM coverage of up to 91.49% is achieved. The results show that the proposed method, combining risk analysis with threat modeling, identifies states with high risks and improves testing efficiency.
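
    The selection rule reduces to scoring each state by a threshold proportional to impact times possibility and keeping those above a cut-off. A sketch with invented states and an assumed cut-off value:

```python
# Sketch of risk-driven test case selection: each system state gets a risk
# threshold proportional to impact x possibility; only states above a
# cut-off are kept, reducing the test suite. Numbers are illustrative.
states = [
    # (state, risk impact 1-10, risk possibility 0-1)
    ("transfer_funds",  9, 0.6),
    ("view_balance",    2, 0.8),
    ("update_profile",  4, 0.3),
    ("admin_override", 10, 0.2),
]

CUTOFF = 2.0   # hypothetical selection threshold

def threshold(impact: float, possibility: float) -> float:
    return impact * possibility   # proportional to both, as stated above

selected = [(s, threshold(i, p)) for s, i, p in states
            if threshold(i, p) >= CUTOFF]
for state, t in sorted(selected, key=lambda x: -x[1]):
    print(f"test {state} (risk threshold {t:.1f})")
# Reduced suite: only transfer_funds (5.4) and admin_override (2.0) remain.
```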

  4. Incorporating Enterprise Risk Management in the Business Model Innovation Process

    OpenAIRE

    Yariv Taran; Harry Boer; Peter Lindgren

    2013-01-01

    Purpose: Relative to other types of innovations, little is known about business model innovation, let alone the process of managing the risks involved in that process. Using the emerging (enterprise) risk management literature, an approach is proposed through which risk management can be embedded in the business model innovation process. Design: The integrated business model innovation risk management model developed in this paper has been tested through an action research study in a Dani...

  5. Risk Assessment Model and Supply Chain Risk Catalog

    OpenAIRE

    Borut Jereb; Tina Cvahte; Bojan Rosi

    2011-01-01

    By managing risk on the level of the supply chain we gain insight of all potential threats to all organizations involved in the chain as well as to the supply chain itself, especially to the logistics resources: flow of goods, services and information; logistics infrastructure and suprastructure; and people. Supply chain risk management should represent a crucial activity in every organization. As there is currently no standard specifically aimed at holistic supply chain risk management, we p...

  6. IDA’s Integrated Risk Assessment and Management Model

    Science.gov (United States)

    2009-06-01

    IDA Paper P-4470: IDA's Integrated Risk Assessment and Management Model. James S. Thomason, Project Leader. Approved for public release; distribution is unlimited. This work was conducted under IDA's independent... “There has been plenty of risk management talk,

  7. Assessing the Continuum of Event-Based Biosurveillance Through an Operational Lens

    Energy Technology Data Exchange (ETDEWEB)

    Corley, Courtney D.; Lancaster, Mary J.; Brigantic, Robert T.; Chung, James S.; Walters, Ronald A.; Arthur, Ray; Bruckner-Lea, Cindy J.; Calapristi, Augustin J.; Dowling, Glenn; Hartley, David M.; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K.; McKenzie, Taylor K.; Nelson, Noele P.; Olsen, Jennifer; Pancerella, Carmen M.; Quitugua, Teresa N.; Reed, Jeremy T.; Thomas, Carla S.

    2012-03-28

    This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have significant impact in the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance (EBB) technologies is unclear. This manuscript frames the continuum of EBB methods, models, and constructs through an operational lens (i.e., aspects and attributes associated with operational considerations in the development, testing, and validation of the EBB methods and models and their use in an operational environment). A 2-day subject matter expert workshop was held to scientifically identify, develop, and vet a set of attributes for the broad range of such operational considerations. Workshop participants identified and described comprehensive attributes for the characterization of EBB. The identified attributes are: (1) event, (2) readiness, (3) operational aspects, (4) geographic coverage, (5) population coverage, (6) input data, (7) output, and (8) cost. Ultimately, the analyses herein discuss the broad scope, complexity, and relevant issues germane to EBB useful in an operational environment.

  8. Model-Based Mitigation of Availability Risks

    NARCIS (Netherlands)

    Zambon, Emmanuele; Bolzoni, D.; Etalle, Sandro; Salvato, Marco

    2007-01-01

    The assessment and mitigation of risks related to the availability of the IT infrastructure is becoming increasingly important in modern organizations. Unfortunately, present standards for Risk Assessment and Mitigation show limitations when evaluating and mitigating availability risks. This is due

  9. Pymote: High Level Python Library for Event-Based Simulation and Evaluation of Distributed Algorithms

    National Research Council Canada - National Science Library

    Arbula, Damir; Lenac, Kristijan

    2013-01-01

    .... Simulation is a fundamental part of distributed algorithm design and evaluation process. In this paper, we present a library for event-based simulation and evaluation of distributed algorithms...

  10. A Risk Management Model for Merger and Acquisition

    Directory of Open Access Journals (Sweden)

    B. S. Chui

    2011-05-01

    Full Text Available In this paper, a merger and acquisition risk management model is proposed for considering risk factors in the merger and acquisition activities. The proposed model aims to maximize the probability of success in merger and acquisition activities by managing and reducing the associated risks. The modeling of the proposed merger and acquisition risk management model is described and illustrated in this paper. The illustration result shows that the proposed model can help to screen the best target company with minimum associated risks in the merger and acquisition activity.

  11. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    Full Text Available The article deals with the problem of assessing information security risks in ERP systems. ERP system functions and architecture are studied. A model of malicious impacts on the levels of the ERP system architecture is composed. A model-based risk assessment, combining quantitative and qualitative approaches, is developed, built on a partial unification of three methods for studying information security risks: security models with full overlapping, the CRAMM technique, and the FRAP technique.

  12. Multi-locus models of genetic risk of disease

    Science.gov (United States)

    2010-01-01

    Background Evidence for genetic contribution to complex diseases is described by recurrence risks to relatives of diseased individuals. Genome-wide association studies allow a description of the genetics of the same diseases in terms of risk loci, their effects and allele frequencies. To reconcile the two descriptions requires a model of how risks from individual loci combine to determine an individual's overall risk. Methods We derive predictions of risk to relatives from risks at individual loci under a number of models and compare them with published data on disease risk. Results The model in which risks are multiplicative on the risk scale implies equality between the recurrence risk to monozygotic twins and the square of the recurrence risk to sibs, a relationship often not observed, especially for low prevalence diseases. We show that this theoretical equality is achieved by allowing impossible probabilities of disease. Other models, in which probabilities of disease are constrained to a maximum of one, generate results more consistent with empirical estimates for a range of diseases. Conclusions The unconstrained multiplicative model, often used in theoretical studies because of its mathematical tractability, is not a realistic model. We find three models, the constrained multiplicative, Odds (or Logit) and Probit (or liability threshold) models, all fit the data on risk to relatives. Currently, in practice it would be difficult to differentiate between these models, but this may become possible if genetic variants that explain the majority of the genetic variance are identified. PMID:20181060
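
    One compact way to see the stated identity, assuming the multiplicative model corresponds to a log-normally distributed individual risk (the paper derives it more generally): write the individual risk as

    \[ g = K \exp\!\left(x - \tfrac{\sigma^2}{2}\right), \quad x \sim N(0,\sigma^2), \qquad \lambda_R = \frac{\mathbb{E}[g\,g']}{K^2} = \exp(r\,\sigma^2), \]

    where \(K\) is the population prevalence and \(r\) is the correlation between relatives' log-risks. With \(r=1\) for monozygotic twins and \(r=\tfrac{1}{2}\) for sibs, \(\lambda_{MZ} = e^{\sigma^2} = \lambda_S^2\). Since \(g\) is unbounded, it exceeds 1 for large \(x\), which is exactly the "impossible probabilities of disease" that the unconstrained model allows.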

  13. Modelling of Malaria Risk Areas in Ghana by using Environmental ...

    African Journals Online (AJOL)

    This research sought to use GIS and multi-criteria decision analysis to produce a predictive model of malaria using eight risk factors ranging from environmental to anthropogenic. Each of the risk factors was classified into three classes of malaria risk according to how it impacts malaria prevalence. The classified risk factors ...

  14. Model-based assessment of erosion risks on man-made slopes in recultivation areas

    Science.gov (United States)

    Kunth, F.; Schmidt, J.

    2012-04-01

    The present study deals with non-vegetated slopes of post-mining areas, which are heavily endangered by water-driven soil erosion. Preventing massive on-site damage, as well as off-site effects from the emission of acidic dump materials, is one of the major challenges in the recultivation of closed-down open-cast mining areas. Hence, the aim of this study is to develop a reproducible methodology to determine erosion risks on slopes in recultivation areas. Moreover, a standardised technique is developed to plan, dimension and test erosion protection measures in recultivation landscapes. The analyses of the study are based on the event-based physical erosion model EROSION 3D. The widely used model is able to predict runoff as well as detachment, transport and deposition of sediments. Its use and validation range from erosion prediction on agricultural land to sediment input into water bodies. The required input parameters of EROSION 2D/3D (hydraulic roughness, infiltration rates, etc.) were determined under field conditions by simulated rainfall experiments. These field experiments took place on selected non-vegetated plots of the Lusatian mining district in eastern Germany. Because of their strong influence on infiltration and erosion processes, special characteristics of coal-containing dump soils (hydrophobicity, air-trapping effect) have to be considered and implemented in the model within this survey.

  15. A semi-quantitative model for risk appreciation and risk weighing

    NARCIS (Netherlands)

    Bos, P.M.J.; Boon, P.E.; Voet, van der H.; Janer, G.; Piersma, A.H.; Bruschweiler, B.; Nielsen, E.; Slob, W.

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model

  16. Proliferation Risk Characterization Model Prototype Model - User and Programmer Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Dukelow, J.S.; Whitford, D.

    1998-12-01

    A model for the estimation of the risk of diversion of weapons-capable materials was developed. It represents both the threat of diversion and site vulnerability as a product of a small number of variables (two to eight), each of which can take on a small number (two to four) of qualitatively defined (but quantitatively implemented) values. The values of the overall threat and vulnerability variables are then converted to threat and vulnerability categories. The threat and vulnerability categories are used to define the likelihood of diversion, also defined categorically. The evaluator supplies an estimate of the consequences of a diversion, defined categorically, but with the categories based on the IAEA Attractiveness levels. Likelihood and Consequences categories are used to define the Risk, also defined categorically. The threat, vulnerability, and consequences input provided by the evaluator contains a representation of his/her uncertainty in each variable assignment which is propagated all the way through to the calculation of the Risk categories. [Appendix G available on diskette only.]
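
    The categorical combination scheme can be sketched as two lookup tables, as below. The mappings are invented placeholders (the report defines its own), and the real model additionally propagates the evaluator's uncertainty in each input through the same tables.

```python
# Sketch of the categorical scheme described above: threat and vulnerability
# categories determine a likelihood category, which combines with a
# consequence category to give a risk category. Tables are placeholders.
LIKELIHOOD = {   # (threat, vulnerability) -> likelihood of diversion
    ("high", "high"): "high",  ("high", "low"): "medium",
    ("low", "high"): "medium", ("low", "low"): "low",
}
RISK = {         # (likelihood, consequence) -> risk
    ("high", "severe"): "high",     ("high", "minor"): "medium",
    ("medium", "severe"): "medium", ("medium", "minor"): "low",
    ("low", "severe"): "low",       ("low", "minor"): "low",
}

def risk_category(threat: str, vulnerability: str, consequence: str) -> str:
    return RISK[(LIKELIHOOD[(threat, vulnerability)], consequence)]

print(risk_category("high", "low", "severe"))   # medium
```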

  17. Ant colony optimization and event-based dynamic task scheduling and staffing for software projects

    Science.gov (United States)

    Ellappan, Vijayan; Ashwini, J.

    2017-11-01

    In software development organizations, for projects from medium to very large scale, the problem of project planning is an extremely complex and challenging task, even when treated as a manual process. Software project planning needs to deal with the problem of task scheduling as well as the problem of human resource allocation (also called staffing), because most of the resources in software projects are people. We propose a machine learning approach that finds solutions for scheduling by learning from existing planning solutions, together with an event-based scheduler that updates the project schedule produced by the learning algorithm in response to events such as the start of the project, the moments when resources become free after finishing tasks, and the times when employees join or leave the project during software development. Updating the schedule through the event-based scheduler makes the planning method dynamic. The approach uses system components to model the interrelated flows of tasks, errors and personnel throughout different development phases and is calibrated to industrial data. It extends past software project management research by examining a review-based process with a unique model, integrating it with a knowledge-based approach to risk assessment and cost estimation, and using a decision-modeling platform.

  18. Too exhausted to remember: ego depletion undermines subsequent event-based prospective memory.

    Science.gov (United States)

    Li, Jian-Bin; Nie, Yan-Gang; Zeng, Min-Xia; Huntoon, Meghan; Smith, Jessi L

    2013-01-01

    Past research has consistently found that people are likely to do worse on high-level cognitive tasks after exerting self-control on previous actions. However, little has been unraveled about to what extent ego depletion affects subsequent prospective memory. Drawing upon the self-control strength model and the relationship between self-control resources and executive control, this study proposes that the initial actions of self-control may undermine subsequent event-based prospective memory (EBPM). Ego depletion was manipulated through watching a video requiring visual attention (Experiment 1) or completing an incongruent Stroop task (Experiment 2). Participants were then tested on EBPM embedded in an ongoing task. As predicted, the results showed that after ruling out possible intervening variables (e.g. mood, focal and nonfocal cues, and characteristics of ongoing task and ego depletion task), participants in the high-depletion condition performed significantly worse on EBPM than those in the low-depletion condition. The results suggested that the effect of ego depletion on EBPM was mainly due to an impaired prospective component rather than to a retrospective component.

  19. Overpaying morbidity adjusters in risk equalization models

    NARCIS (Netherlands)

    R.C. van Kleef (Richard); R.C.J.A. van Vliet (René); W.P.M.M. van de Ven (Wynand)

    2016-01-01

    Most competitive social health insurance markets include risk equalization to compensate insurers for predictable variation in healthcare expenses. Empirical literature shows that even the most sophisticated risk equalization models—with advanced morbidity adjusters—substantially

  20. Modeling HIV Risk in Highly Vulnerable Youth

    Science.gov (United States)

    Huba, G. J.; Panter, A. T.; Melchior, Lisa A.; Trevithick, Lee; Woods, Elizabeth R.; Wright, Eric; Feudo, Rudy; Tierney, Steven; Schneir, Arlene; Tenner, Adam; Remafedi, Gary; Greenberg, Brian; Sturdevant, Marsha; Goodman, Elizabeth; Hodgins, Antigone; Wallace, Michael; Brady, Russell E.; Singer, Barney; Marconi, Katherine

    2003-01-01

    This article examines the structure of several HIV risk behaviors in an ethnically and geographically diverse sample of 8,251 clients from 10 innovative demonstration projects intended for adolescents living with, or at risk for, HIV. Exploratory and confirmatory factor analyses identified 2 risk factors for men (sexual intercourse with men and a…

  1. Modeling HIV risk in highly vulnerable youth

    NARCIS (Netherlands)

    Huba, GJ; Panter, AT; Melchior, LA; Trevithick, L; Woods, ER; Wright, E; Feudo, R; Tierney, S; Schneir, A; Tenner, A; Remafedi, G; Greenberg, B; Sturdevant, M; Goodman, E; Hodgins, A; Wallace, M; Brady, RE; Singer, B; Marconi, K

    2003-01-01

    This article examines the structure of several HIV risk behaviors in an ethnically and geographically diverse sample of 8,251 clients from 10 innovative demonstration projects intended for adolescents living with, or at risk for, HIV. Exploratory and confirmatory factor analyses identified 2 risk

  2. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate......Long-term energy market models can be used to examine investments in production technologies, however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection...

  3. Incorporating Enterprise Risk Management in the Business Model Innovation Process

    Directory of Open Access Journals (Sweden)

    Yariv Taran

    2013-12-01

    Full Text Available Purpose: Relative to other types of innovations, little is known about business model innovation, let alone the process of managing the risks involved in that process. Using the emerging (enterprise risk management literature, an approach is proposed through which risk management can be embedded in the business model innovation process. Design: The integrated business model innovation risk management model developed in this paper has been tested through an action research study in a Danish company. Findings: The study supports our proposition that the implementation of risk management throughout the innovation process reduces the risks related to the uncertainty and complexity of developing and implementing a new business model. Originality: The study supports the proposition that the implementation of risk management throughout the innovation process reduces the risks related to the uncertainty and complexity of developing and implementing a new business model. The business model risk management model makes managers much more focused on identifying problematic issues and putting explicit plans and timetables into place for resolving/reducing risks, and assists companies in aligning the risk treatment choices made during the

  4. Modeling Extinction Risk of Endemic Birds of Mainland China

    OpenAIRE

    Youhua Chen

    2013-01-01

    The extinction risk of endemic birds of mainland China was modeled over evolutionary time. Results showed that extinction risk of endemic birds in mainland China always tended to be similar within subclades over the evolutionary time of species divergence, and the overall evolution of extinction risk of species presented a conservatism pattern, as evidenced by the disparity-through-time plot. A constant-rate evolutionary model was the best one to quantify the evolution of extinction risk of e...

  5. Managing multiple international risks simultaneously with an optimal hedging model

    OpenAIRE

    Gboroton F. Sarassoro; Raymond M. Leuthold

    1991-01-01

    A risk management model based on portfolio theory which accounts jointly for price, quantity, interest rate and exchange rate risks is developed and applied to cocoa and coffee production and exports in the Ivory Coast. Utilizing commodity and financial futures markets jointly, the results show that a government export agency can reduce risks from 27% to 89% by following a multicommodity hedging program which manages several risks simultaneously. The model and technique developed are applicab...

  6. The globalization of risk and risk perception: why we need a new model of risk communication for vaccines.

    Science.gov (United States)

    Larson, Heidi; Brocard Paterson, Pauline; Erondu, Ngozi

    2012-11-01

    Risk communication and vaccines is complex and the nature of risk perception is changing, with perceptions converging, evolving and having impacts well beyond specific geographic localities and points in time, especially when amplified through the Internet and other modes of global communication. This article examines the globalization of risk perceptions and their impacts, including the example of measles and the globalization of measles, mumps and rubella (MMR) vaccine risk perceptions, and calls for a new, more holistic model of risk assessment, risk communication and risk mitigation, embedded in an ongoing process of risk management for vaccines and immunization programmes. It envisions risk communication as an ongoing process that includes trust-building strategies hand-in-hand with operational and policy strategies needed to mitigate and manage vaccine-related risks, as well as perceptions of risk.

  7. A comparative review of radiation-induced cancer risk models

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Hee; Kim, Ju Youl [FNC Technology Co., Ltd., Yongin (Korea, Republic of); Han, Seok Jung [Risk and Environmental Safety Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-06-15

    With the need for a domestic level 3 probabilistic safety assessment (PSA), it is essential to develop a Korea-specific code. Health effect assessments study radiation-induced impacts; in particular, long-term health effects are evaluated in terms of cancer risk. The objective of this study was to analyze the latest cancer risk models developed by foreign organizations and to compare the methodology of how they were developed. This paper also provides suggestions regarding the development of Korean cancer risk models. A review of cancer risk models was carried out targeting the latest models: the NUREG model (1993), the BEIR VII model (2006), the UNSCEAR model (2006), the ICRP 103 model (2007), and the U.S. EPA model (2011). The methodology of how each model was developed is explained, and the cancer sites, dose and dose rate effectiveness factor (DDREF) and mathematical models are also described in the sections presenting differences among the models. The NUREG model was developed by assuming that the risk was proportional to the risk coefficient and dose, while the BEIR VII, UNSCEAR, ICRP, and U.S. EPA models were derived from epidemiological data, principally from Japanese atomic bomb survivors. The risk coefficient does not consider individual characteristics, as the values were calculated in terms of population-averaged cancer risk per unit dose. However, the models derived by epidemiological data are a function of sex, exposure age, and attained age of the exposed individual. Moreover, the methodologies can be used to apply the latest epidemiological data. Therefore, methodologies using epidemiological data should be considered first for developing a Korean cancer risk model, and the cancer sites and DDREF should also be determined based on Korea-specific studies. This review can be used as a basis for developing a Korean cancer risk model in the future.

  8. MANAGING BRAND EQUITY RISK: ADDING EXOGENOUS RISKS TO AN EVALUATION MODEL

    OpenAIRE

    Catalin Mihail Barbu; Sorin Tudor; Dorian Laurentiu Florea

    2014-01-01

    Risk can no longer be ignored when talking about brand management, and risk management can no longer disregard brands, for manifold reasons. Building on the risk-based brand equity model, this paper contributes to the development of an evaluation model by suggesting formulas for three exogenous risk sources related to the market and competitive structure: new-brand marketing effort, consumer behavior change, and the adaptation of extant brands.

  9. Prospective memory while driving: comparison of time- and event-based intentions.

    Science.gov (United States)

    Trawley, Steven L; Stephens, Amanda N; Rendell, Peter G; Groeger, John A

    2017-06-01

    Prospective memories can divert attentional resources from ongoing activities. However, it is unclear whether these effects and the theoretical accounts that seek to explain them will generalise to a complex real-world task such as driving. Twenty-four participants drove two simulated routes while maintaining a fixed headway with a lead vehicle. Drivers were given either event-based (e.g. arriving at a filling station) or time-based errands (e.g. on-board clock shows 3:30). In contrast to the predominant view in the literature which suggests time-based tasks are more demanding, drivers given event-based errands showed greater difficulty in mirroring lead vehicle speed changes compared to the time-based group. Results suggest that common everyday secondary tasks, such as scouting the roadside for a bank, may have a detrimental impact on driving performance. The additional finding that this cost was only evident with the event-based task highlights a potential area of both theoretical and practical interest. Practitioner Summary: Drivers were given either time- or event-based errands whilst engaged in a simulated drive. We examined the effect of errands on an ongoing vehicle follow task. In contrast to previous non-driving studies, event-based errands are more disruptive. Common everyday errands may have a detrimental impact on driving performance.

  10. Efficiency of Event-Based Sampling According to Error Energy Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2010-03-01

    Full Text Available The paper belongs to the studies that deal with the effectiveness of the particular event-based sampling scheme compared to the conventional periodic sampling as a reference. In the present study, the event-based sampling according to a constant energy of sampling error is analyzed. This criterion is suitable for applications where the energy of sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends a range of event-based sampling schemes and makes the choice of particular sampling criterion more flexible to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than the periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the signal time-derivative square in the analyzed time interval. Furthermore, it is shown that the sampling according to energy criterion is less effective than the send-on-delta scheme but more effective than the sampling according to integral criterion. On the other hand, it is indicated that higher effectiveness in sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error defined as the sum of errors for all the samples taken.
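
    Written out for a signal \(x(t)\) observed over \([0,T]\), the effectiveness factor stated in words above is

    \[ G \;=\; \frac{\displaystyle\max_{t\in[0,T]}\left(\dot{x}(t)^{2}\right)^{1/3}}{\displaystyle\frac{1}{T}\int_{0}^{T}\left(\dot{x}(t)^{2}\right)^{1/3}\,dt}, \]

    i.e., the ratio of the maximum to the mean of the cubic root of the squared time derivative over the analyzed interval (the notation is assumed here; the paper states the factor in words as quoted).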

  11. Efficiency of event-based sampling according to error energy criterion.

    Science.gov (United States)

    Miskowicz, Marek

    2010-01-01

    The paper belongs to the studies that deal with the effectiveness of the particular event-based sampling scheme compared to the conventional periodic sampling as a reference. In the present study, the event-based sampling according to a constant energy of sampling error is analyzed. This criterion is suitable for applications where the energy of sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends a range of event-based sampling schemes and makes the choice of particular sampling criterion more flexible to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than the periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the signal time-derivative square in the analyzed time interval. Furthermore, it is shown that the sampling according to energy criterion is less effective than the send-on-delta scheme but more effective than the sampling according to integral criterion. On the other hand, it is indicated that higher effectiveness in sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error defined as the sum of errors for all the samples taken.

  12. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    ) innovation is a risky enterprise, many companies are still choosing not to apply any risk management in the BM innovation process. The objective of this paper is to develop a better understanding of how risks are handled in the practice of BM innovation. An analysis of the BM innovation experiences of two...... industrial companies shows that both companies are experiencing high levels of uncertainty and complexity during their innovation processes and are, consequently, struggling to find new processes for handling the risks involved. Based on the two companies’ experiences, various testable propositions are put...... forward, which link success and failure to the way companies appreciate and handle the risks involved in BM innovation....

  13. Digital disease detection: A systematic review of event-based internet biosurveillance systems.

    Science.gov (United States)

    O'Shea, Jesse

    2017-05-01

    Internet access and usage have changed how people seek and report health information. Meanwhile, infectious diseases continue to threaten humanity. The analysis of Big Data, or vast digital data, presents an opportunity to improve disease surveillance and epidemic intelligence. Epidemic intelligence contains two components: indicator-based and event-based. A relatively new surveillance type has emerged, called event-based Internet biosurveillance systems. These systems use information on events impacting health from Internet sources, such as social media or news aggregates. They circumvent the limitations of traditional reporting systems by being inexpensive, transparent, and flexible. Yet innovations and the functionality of these systems can change rapidly. This review aims to update the current state of knowledge on event-based Internet biosurveillance systems by identifying all such systems, including their current functionality, with the hope of aiding decision makers in deciding whether to incorporate new methods into comprehensive programmes of surveillance. A systematic review was performed through the PubMed, Scopus, and Google Scholar databases, also including grey literature and other publication types. 50 event-based Internet systems were identified, with 15 attributes extracted for each system, described in 99 articles. Each system uses different innovative technology and data sources to gather, process, and disseminate data to detect infectious disease outbreaks. The review emphasises the importance of using both formal and informal sources for timely and accurate infectious disease outbreak surveillance, cataloguing all event-based Internet biosurveillance systems. By doing so, future researchers will be able to use this review as a library for referencing systems, with the hope of learning from, building, and expanding Internet-based surveillance systems. Event-based Internet biosurveillance should act as an extension of traditional systems, to be utilised as an

  14. Model Averaging Software for Dichotomous Dose Response Risk Estimation

    Directory of Open Access Journals (Sweden)

    Matthew W. Wheeler

    2008-02-01

    Full Text Available Model averaging has been shown to be a useful method for incorporating model uncertainty in quantitative risk estimation. In certain circumstances this technique is computationally complex, requiring sophisticated software to carry out the computation. We introduce software that implements model averaging for risk assessment based upon dichotomous dose-response data. This software, which we call Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD), fits the quantal response models which are also used in the US Environmental Protection Agency benchmark dose software suite, and generates a model-averaged dose-response model to generate benchmark dose and benchmark dose lower bound estimates. The software fulfills a need for risk assessors, allowing them to go beyond one single model in their risk assessments based on quantal data by focusing on a set of models that describes the experimental data.
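
    A minimal sketch of the averaging step follows: fit two dichotomous dose-response models, convert their AIC values to Akaike weights, and average a quantity of interest (here, the fitted response curve). The two-model set and data are toy inputs chosen for illustration, not MADr-BMD itself, which uses the full quantal model suite and averages benchmark dose estimates.

```python
# Toy AIC-weight model averaging over logit and probit dose-response fits.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

dose = np.array([0.0, 10.0, 50.0, 150.0, 400.0])
n    = np.array([50, 50, 50, 50, 50])
k    = np.array([1, 3, 7, 20, 40])           # affected animals (toy data)

def nll(theta, cdf):                          # binomial negative log-likelihood
    a, b = theta
    p = np.clip(cdf(a + b * dose), 1e-9, 1 - 1e-9)
    return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

logistic = lambda z: 1 / (1 + np.exp(-z))
fits = [minimize(nll, x0=[-2.0, 0.01], args=(cdf,))
        for cdf in (logistic, norm.cdf)]

aic = np.array([2 * f.fun + 2 * len(f.x) for f in fits])
w = np.exp(-(aic - aic.min()) / 2)
w /= w.sum()                                  # Akaike weights
print("model weights (logit, probit):", np.round(w, 3))

avg_p = sum(wi * cdf(f.x[0] + f.x[1] * dose)
            for wi, f, cdf in zip(w, fits, (logistic, norm.cdf)))
print("model-averaged response:", np.round(avg_p, 3))
```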

  15. Event-Based Proof of the Mutual Exclusion Property of Peterson’s Algorithm

    Directory of Open Access Journals (Sweden)

    Ivanov Ievgen

    2015-12-01

    Full Text Available Proving properties of distributed algorithms is still a highly challenging problem and various approaches that have been proposed to tackle it [1] can be roughly divided into state-based and event-based proofs. Informally speaking, state-based approaches define the behavior of a distributed algorithm as a set of sequences of memory states during its executions, while event-based approaches treat the behaviors by means of events which are produced by the executions of an algorithm. Of course, combined approaches are also possible.
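
    For reference, the algorithm whose mutual-exclusion property is proved above is small enough to sketch directly. The busy-waiting Python version below is illustrative; the shared-memory model assumed by the formal proof differs from real CPython memory semantics.

```python
# Peterson's mutual-exclusion algorithm for two threads.
import threading

flag = [False, False]   # flag[i]: thread i wants to enter
turn = 0                # whose turn it is to wait
counter = 0             # shared state protected by the protocol

def worker(i: int, iterations: int = 50_000) -> None:
    global turn, counter
    other = 1 - i
    for _ in range(iterations):
        flag[i] = True
        turn = other
        while flag[other] and turn == other:
            pass                      # busy-wait until safe to enter
        counter += 1                  # critical section
        flag[i] = False               # exit protocol

threads = [threading.Thread(target=worker, args=(i,)) for i in (0, 1)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)   # 100000 if mutual exclusion held
```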

  16. Data Scaling for Operational Risk Modelling

    NARCIS (Netherlands)

    H.S. Na; L. Couto Miranda; J.H. van den Berg (Jan); M. Leipoldt

    2006-01-01

    In 2004, the Basel Committee on Banking Supervision defined Operational Risk (OR) as the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events. After publication of the new capital accord containing this definition, statistical

  17. Risk communication: a mental models approach

    National Research Council Canada - National Science Library

    Morgan, M. Granger (Millett Granger)

    2002-01-01

    ... information about risks. The procedure uses approaches from risk and decision analysis to identify the most relevant information; it also uses approaches from psychology and communication theory to ensure that its message is understood. This book is written in nontechnical terms, designed to make the approach feasible for anyone willing to try it. It is illustrat...

  18. Prediction of event-based stormwater runoff quantity and quality by ANNs developed using PMI-based input selection

    Science.gov (United States)

    He, Jianxun; Valeo, Caterina; Chu, Angus; Neumann, Norman F.

    2011-03-01

    Event-based stormwater runoff quantity and quality modeling remains a challenge since the processes of rainfall induced pollutant discharge are not completely understood. The complexity of physically-based models often limits the practical use of quality models in practice. Artificial neural networks (ANNs) are a data driven modeling approach that can avoid the necessity of fully understanding complex physical processes. In this paper, feed-forward multi-layer perceptron (MLP) network, a popular type of ANN, was applied to predict stormwater runoff quantity and quality including turbidity, specific conductance, water temperature, pH, and dissolved oxygen (DO) in storm events. A recently proposed input selection algorithm based on partial mutual information (PMI), which identifies input variables in a stepwise manner, was employed to select input variable sets for the development of ANNs. The ANNs developed via this approach could produce satisfactory prediction of event-based stormwater runoff quantity and quality. In particular, this approach demonstrated a superior performance over the approach involving ANNs fed by inputs selected using partial correlation and all potential inputs in flow modeling. This result suggests the applicability of PMI in developing ANN models. In addition, the ANN for flow outperformed conventional multiple linear regression (MLR) and multiple nonlinear regression (MNLR) models. For an ANN development of turbidity (multiplied by flow rate) and specific conductance, significant improvement was achieved by including a previous 3-week total rainfall amount into their input variable sets. This antecedent rainfall variable is considered a factor in the availability of land surface pollutants for wash-off. A sensitivity analysis demonstrated the potential role of this rainfall variable on modeling particulate solids and dissolved matters in stormwater runoff.
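
    A sketch of the overall modelling approach follows: rank candidate inputs by mutual information with the target, keep the strongest, and fit an MLP. Here scikit-learn's mutual_info_regression is used as a simple stand-in for the paper's stepwise PMI algorithm, and the data are synthetic.

```python
# Mutual-information input selection followed by an MLP fit, as a stand-in
# for the PMI-based procedure described above. Data are synthetic.
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000
# Hypothetical candidates: rainfall depth, intensity, antecedent 3-week
# rainfall, plus an irrelevant noise column.
X = rng.random((n, 4))
y = (2.0 * X[:, 0] + 1.5 * X[:, 1] ** 2 + 0.8 * X[:, 2]
     + 0.05 * rng.standard_normal(n))

mi = mutual_info_regression(X, y, random_state=0)
keep = np.argsort(mi)[::-1][:3]            # retain the three most informative
print("selected inputs:", keep, "MI:", np.round(mi, 3))

X_tr, X_te, y_tr, y_te = train_test_split(X[:, keep], y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)
print("R^2 on held-out storms:", round(ann.score(X_te, y_te), 3))
```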

  19. Stochastic models in risk theory and management accounting

    NARCIS (Netherlands)

    Brekelmans, R.C.M.

    2000-01-01

    This thesis deals with stochastic models in two fields: risk theory and management accounting. Firstly, two extensions of the classical risk process are analyzed. A method is developed that computes bounds on the probability of ruin for the classical risk process extended with a constant interest

  20. Comparison with Sugeno Model and Measurement Of Cancer Risk ...

    African Journals Online (AJOL)

    In this study, a new type of fuzzy logic model is introduced for three cancer types selected as a pilot; it offers the possibility of revealing a person's risk of developing these cancer types and of providing a preliminary diagnosis so that this risk can be removed. After the calculation of the risk outcome, the ...

  1. Risk Modeling Approaches in Terms of Volatility Banking Transactions

    Directory of Open Access Journals (Sweden)

    Angelica Cucşa (Stratulat)

    2016-01-01

    Full Text Available The inseparability of risk and banking activity has been demonstrated ever since banking systems arose, and the importance of the topic is present in current life and will remain so in the future development of the banking sector. Banking sector development takes place within the constraints of the nature and number of existing risks and of those that may arise, which act to limit banking activity. We intend to develop approaches for analysing risk through mathematical models, and we also develop a model for 10 actively traded picks on the Romanian capital market that will test investor reaction under controlled and uncontrolled risk conditions aggregated with harmonised factors.

  2. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    Science.gov (United States)

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, in our case study an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
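
    The contrast between the first two decision rules can be sketched for a single household agent, as below. The parameter values (probability weighting exponent, loss aversion) are illustrative assumptions; the article calibrates its own, and its third rule adds Bayesian updating of the perceived flood probability.

```python
# A household agent's decision to invest in a loss-reducing measure under
# expected utility vs. a simple prospect-theory rule. Parameters are toy.
def expected_utility_invests(p_flood: float, loss: float, cost: float,
                             damage_reduction: float) -> bool:
    """Rule 1: invest if the expected loss avoided exceeds the measure's cost."""
    return p_flood * loss * damage_reduction > cost

def prospect_theory_invests(p_flood: float, loss: float, cost: float,
                            damage_reduction: float,
                            gamma: float = 0.69, lam: float = 2.25) -> bool:
    """Rule 2: bounded rationality via probability weighting and loss aversion."""
    w = p_flood ** gamma / (p_flood ** gamma + (1 - p_flood) ** gamma) ** (1 / gamma)
    v_invest = -lam * (cost + w * loss * (1 - damage_reduction))
    v_not = -lam * w * loss
    return v_invest > v_not

# Low-probability / high-impact flood: the weighting function inflates small
# probabilities, so the prospect-theory household is more inclined to invest.
args = dict(p_flood=0.001, loss=200_000, cost=300, damage_reduction=0.4)
print(expected_utility_invests(**args))   # False: 0.001*200000*0.4 = 80 < 300
print(prospect_theory_invests(**args))    # True under these toy parameters
```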

  3. Tests of control in the Audit Risk Model : Effective? Efficient?

    NARCIS (Netherlands)

    Blokdijk, J.H. (Hans)

    2004-01-01

    Lately, the Audit Risk Model has been subject to criticism. To gauge its validity, this paper confronts the Audit Risk Model as incorporated in International Standard on Auditing No. 400, with the real life situations faced by auditors in auditing financial statements. This confrontation exposes

  4. Radiation risk estimation based on measurement error models

    CERN Document Server

    Masiuk, Sergii; Shklyar, Sergiy; Chepurny, Mykola; Likhtarov, Illya

    2017-01-01

    This monograph discusses statistics and risk estimates applied to radiation damage under the presence of measurement errors. The first part covers nonlinear measurement error models, with a particular emphasis on efficiency of regression parameter estimators. In the second part, risk estimation in models with measurement errors is considered. Efficiency of the methods presented is verified using data from radio-epidemiological studies.

  5. Terrorism Risk Modeling for Intelligence Analysis and Infrastructure Protection

    Science.gov (United States)

    2007-01-01

    Terrorism Risk Modeling for Intelligence Analysis and Infrastructure Protection, Henry H... This research meets high standards for research quality and objectivity. It can be used to compare terrorism risks across different urban areas, to assess terrorism risks within a metropolitan area, and to target intelligence analysis and collection efforts. The

  6. Risk matrix model applied to the outsourcing of logistics' activities

    Directory of Open Access Journals (Sweden)

    Fouad Jawab

    2015-09-01

    Full Text Available Purpose: This paper proposes the application of the risk matrix model in the field of logistics outsourcing. Such an application can serve as the basis for decision making regarding risk management in the logistics outsourcing process and allow risk prevention. Design/methodology/approach: This study is based on the risk management of logistics outsourcing in the retail sector in Morocco. The authors identify all possible risks and then classify and prioritize them using the risk matrix model. Finally, we arrive at four possible decisions for the identified risks. The analysis was made possible through interviews and discussions with the heads of departments and the agents directly involved in each outsourced activity. Findings and originality/value: It is possible to improve the risk matrix model by proposing more personalized prevention measures according to each company operating in mass-market retailing. Originality/value: This study is the only one to address the process of logistics outsourcing in the retail sector in Morocco, using Label’vie as a case study. First, we identified all possible risks as thoroughly as we could; then we applied the risk matrix model to sort them in ascending order of importance and criticality. As a result, we could hand the decision-makers a mapping for effective risk control and better guidance of the risk management process.
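
    A sketch of a risk matrix of this kind follows: each identified risk is scored by probability and impact, and the resulting cell is mapped to one of four possible decisions. The banding, the decision labels, and the example risks are illustrative assumptions, not the paper's published matrix.

```python
# Toy risk matrix: score = probability x impact, mapped to four decisions.
def decide(probability: int, impact: int) -> str:
    """probability, impact: ordinal ratings 1 (low) .. 5 (high)."""
    score = probability * impact
    if score >= 16:
        return "avoid (re-insource or change provider)"
    if score >= 10:
        return "mitigate (contractual controls, monitoring)"
    if score >= 5:
        return "transfer (insurance, penalty clauses)"
    return "accept (monitor periodically)"

# Hypothetical outsourcing risks with (probability, impact) ratings.
outsourcing_risks = {
    "delivery delays":      (4, 4),
    "loss of know-how":     (2, 5),
    "data confidentiality": (3, 5),
    "cost overruns":        (3, 2),
}
for risk, (p, i) in outsourcing_risks.items():
    print(f"{risk:22s} -> {decide(p, i)}")
```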

  7. Disentangling the effect of event-based cues on children's time-based prospective memory performance.

    Science.gov (United States)

    Redshaw, Jonathan; Henry, Julie D; Suddendorf, Thomas

    2016-10-01

    Previous time-based prospective memory research, both with children and with other groups, has measured the ability to perform an action with the arrival of a time-dependent yet still event-based cue (e.g., the occurrence of a specific clock pattern) while also engaged in an ongoing activity. Here we introduce a novel means of operationalizing time-based prospective memory and assess children's growing capacities when the availability of an event-based cue is varied. Preschoolers aged 3, 4, and 5 years (N = 72) were required to ring a bell when a familiar 1-min sand timer had completed a cycle under four conditions. In a 2 × 2 within-participants design, the timer was either visible or hidden and was either presented in the context of a single task or embedded within a dual picture-naming task. Children were more likely to ring the bell before 2 min had elapsed in the visible-timer and single-task conditions, with performance improving with age across all conditions. These results suggest a divergence in the development of time-based prospective memory in the presence versus absence of event-based cues, and they also suggest that performance on typical time-based tasks may be partly driven by event-based prospective memory. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, was used to detect cues. Analysis of the event-related potentials (ERPs) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  9. Event-based prospective memory in depression: The impact of cue focality

    NARCIS (Netherlands)

    Altgassen, A.M.; Kliegel, M.; Martin, M.

    2009-01-01

    This study is the first to compare event-based prospective memory performance in individuals with depression and healthy controls. The degree to which self-initiated processing is required to perform the prospective memory task was varied. Twenty-eight individuals with depression and 32 healthy …

  10. Risk management modeling and its application in maritime safety

    Science.gov (United States)

    Qin, Ting-Rong; Chen, Wei-Jiong; Zeng, Xiang-Kun

    2008-12-01

    Quantified risk assessment (QRA) requires a mathematical formalization of risk theory. However, attention has been paid almost exclusively to applications of assessment methods, which has led to neglect of research into fundamental theories, such as the relationships among risk, safety, danger, and so on. In order to solve this problem, as a first step, fundamental theoretical relationships about risk and risk management were analyzed for this paper in the light of mathematics, and then illustrated with some charts. Second, man-machine-environment-management (MMEM) theory was introduced into risk theory to analyze some properties of risk. On this basis, a three-dimensional model of risk management was established that includes a goal dimension, a management dimension, and an operation dimension. This goal-management-operation (GMO) model was explained, and emphasis was then laid on the discussion of the risk flowchart (operation dimension), which lays the groundwork for further study of risk management and qualitative and quantitative assessment. Next, the relationship between Formal Safety Assessment (FSA) and risk management was researched. This revealed that the FSA method, which the International Maritime Organization (IMO) is actively promoting, derives from risk management theory. Finally, conclusions were drawn about how to apply this risk management method to concrete fields efficiently and conveniently, as well as areas where further research is required.

  11. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM) or at a specific time (i.e., time-based PM) while performing an ongoing activity. Strategic monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of maintaining the intention active in memory; and target checking, engaged for verifying the presence of the PM cue in the environment. The present study is aimed at providing the first evidence of event-related potentials (ERPs) associated with time-based PM, and at examining differences and commonalities in the ERPs related to strategic monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the "readiness mode".

  12. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence eBaer

    2013-05-01

    Musical performance is thought to rely predominantly on event-based timing, involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (paced) and then responded at the same rate without the metronome (unpaced). Analyses of the unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  13. The complex model of risk and progression of AMD estimation

    Directory of Open Access Journals (Sweden)

    V. S. Akopyan

    2014-07-01

    Purpose: To develop a method and a statistical model to estimate individual risk of AMD and the risk of progression to advanced AMD using clinical and genetic risk factors. Methods: A statistical risk assessment model was developed using stepwise binary logistic regression analysis. To estimate population differences in the prevalence of allelic variants of genes and to develop models adapted to the population of the Moscow region, genotyping and assessment of the influence of other risk factors were performed in two groups: patients with different stages of AMD (n = 74) and a control group (n = 116). Genetic risk factors included in the study: polymorphisms in the complement system genes (C3 and CFH), genes at the 10q26 locus (ARMS2 and HTRA1), and a polymorphism in the mitochondrial gene MT-ND2. Clinical risk factors included in the study: age, gender, high body mass index, and smoking history. Results: A comprehensive analysis of genetic and clinical risk factors for AMD in the study group was performed. A statistical model assessing individual AMD risk was compiled (sensitivity 66.7%, specificity 78.5%, AUC = 0.76), along with a statistical model describing the probability of late AMD (sensitivity 66.7%, specificity 78.3%, AUC = 0.73). The developed system also determines the most likely form of late AMD: dry or wet. Conclusion: The developed test system and the mathematical algorithm for determining the risk of AMD and the risk of progression to advanced AMD have fair diagnostic informativeness and are promising for use in clinical practice.
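
    A minimal sketch, on synthetic data, of the modelling step described above: a binary logistic regression over genetic and clinical predictors, evaluated by AUC. The predictors mirror those listed in the abstract, but the data, fitted coefficients, and resulting AUC are entirely illustrative.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 190  # mirrors 74 cases + 116 controls
        X = np.column_stack([
            rng.integers(0, 3, n),   # CFH risk alleles (0/1/2)
            rng.integers(0, 3, n),   # ARMS2 risk alleles
            rng.normal(70, 8, n),    # age, years
            rng.integers(0, 2, n),   # smoking history (0/1)
        ])
        y = rng.integers(0, 2, n)    # AMD status (synthetic labels)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        risk = model.predict_proba(X)[:, 1]
        print("AUC on synthetic data:", round(roc_auc_score(y, risk), 2))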

  15. Adoption of Building Information Modelling in project planning risk management

    Science.gov (United States)

    Mering, M. M.; Aminudin, E.; Chai, C. S.; Zakaria, R.; Tan, C. S.; Lee, Y. Y.; Redzuan, A. A.

    2017-11-01

    Efficient and effective risk management requires a systematic and proper methodology, besides knowledge and experience. However, if risk management is not discussed from the start of the project, this duty becomes notably complicated and no longer efficient. This paper presents the adoption of Building Information Modelling (BIM) in project planning risk management. The objectives are to identify traditional risk management practices and their functions, to determine the best function of BIM in risk management, and to investigate the efficiency of adopting BIM-based risk management during the project planning phase. In order to obtain data, a quantitative approach is adopted in this research. Based on the data analysis, the lack of compliance with project requirements and the failure to recognise risk and develop responses to opportunity are the risks that occur when traditional risk management is implemented. When BIM is used in project planning, its tracking of cost control and cash flow helps the project cycle be completed on time. 5D cost estimation and cash flow modeling benefit risk management in planning, controlling, and managing budget and cost reasonably. Two factors benefited most from BIM-based technology: a formwork plan with an integrated fall plan, and design-for-safety model checking. By adopting risk management, potential risks linked with a project can be identified and responded to, reducing them to an acceptable extent. This means recognizing potential risks and avoiding threats by reducing their negative effects. BIM-based risk management can enhance the planning process of construction projects. It benefits construction players in various aspects, and knowing its application can be a lesson learnt for others implementing BIM and increasing project quality.

  16. A Hierarchal Risk Assessment Model Using the Evidential Reasoning Rule

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Ji

    2017-02-01

    This paper aims to develop a hierarchical risk assessment model using the newly-developed evidential reasoning (ER) rule, which constitutes a generic conjunctive probabilistic reasoning process. We first provide a brief introduction to the basics of the ER rule and emphasize its strengths for representing and aggregating uncertain information from multiple experts and sources. Further, we discuss the key steps of developing the hierarchical risk assessment framework systematically, including (1) formulation of the risk assessment hierarchy; (2) representation of both qualitative and quantitative information; (3) elicitation of attribute weights and information reliabilities; (4) aggregation of assessment information using the ER rule; and (5) quantification and ranking of risks using utility-based transformation. The proposed hierarchical risk assessment framework can potentially be applied to various complex and uncertain systems. A case study on the fire/explosion risk assessment of marine vessels demonstrates the applicability of the proposed risk assessment model.
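
    The sketch below illustrates the flavour of step (4) with a deliberately simplified weighted combination of two experts' belief distributions over risk grades; the full ER rule additionally handles source reliabilities and unassigned belief through a recursive combination, which is omitted here. The grades, weights, and belief degrees are illustrative assumptions.

        grades = ["low", "medium", "high"]

        def aggregate(beliefs, weights):
            """Weight and renormalise belief distributions over risk grades."""
            combined = {g: 0.0 for g in grades}
            for belief, w in zip(beliefs, weights):
                for g in grades:
                    combined[g] += w * belief.get(g, 0.0)
            total = sum(combined.values())
            return {g: round(v / total, 3) for g, v in combined.items()}

        expert1 = {"low": 0.1, "medium": 0.6, "high": 0.3}
        expert2 = {"low": 0.3, "medium": 0.5, "high": 0.2}
        print(aggregate([expert1, expert2], weights=[0.7, 0.3]))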

  17. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    Science.gov (United States)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

    The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR), and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, where we fit our real data. Second, we present its application in risk analysis, where we use the model to evaluate VaR and CVaR, with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating VaR and CVaR, capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
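
    A minimal sketch of the same pipeline on simulated returns: fit a two-component Gaussian mixture, then read VaR and CVaR at the 95% level off the fitted distribution by Monte Carlo. The simulated series and all parameter values are assumptions; the paper's FBMKLCI data are not reproduced.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        # Simulated leptokurtic returns: calm regime plus volatile regime
        returns = np.concatenate([rng.normal(0.005, 0.02, 900),
                                  rng.normal(-0.010, 0.06, 100)])

        gm = GaussianMixture(n_components=2, random_state=0)
        gm.fit(returns.reshape(-1, 1))

        sample = gm.sample(100_000)[0].ravel()       # Monte Carlo from fitted mixture
        var_95 = -np.quantile(sample, 0.05)          # 95% value at risk (a loss)
        cvar_95 = -sample[sample <= -var_95].mean()  # expected shortfall beyond VaR
        print(f"VaR(95%) = {var_95:.3f}, CVaR(95%) = {cvar_95:.3f}")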

  18. A comprehensive Network Security Risk Model for process control networks.

    Science.gov (United States)

    Henry, Matthew H; Haimes, Yacov Y

    2009-02-01

    The risk of cyber attacks on process control networks (PCN) is receiving significant attention due to the potentially catastrophic extent to which PCN failures can damage the infrastructures and commodity flows that they support. Risk management addresses the coupled problems of (1) reducing the likelihood that cyber attacks would succeed in disrupting PCN operation and (2) reducing the severity of consequences in the event of PCN failure or manipulation. The Network Security Risk Model (NSRM) developed in this article provides a means of evaluating the efficacy of candidate risk management policies by modeling the baseline risk and assessing expectations of risk after the implementation of candidate measures. Where existing risk models fall short of providing adequate insight into the efficacy of candidate risk management policies due to shortcomings in their structure or formulation, the NSRM provides model structure and an associated modeling methodology that captures the relevant dynamics of cyber attacks on PCN for risk analysis. This article develops the NSRM in detail in the context of an illustrative example.

  19. Flexible competing risks regression modeling and goodness-of-fit

    DEFF Research Database (Denmark)

    Scheike, Thomas; Zhang, Mei-Jie

    2008-01-01

    In this paper we consider different approaches for estimation and assessment of covariate effects for the cumulative incidence curve in the competing risks model. The classic approach is to model all cause-specific hazards and then estimate the cumulative incidence curve based on these cause-specific hazards. … of the flexible regression models to analyze competing risks data when non-proportionality is present in the data.

  20. Managing business model innovation risks - lessons for theory and practice

    DEFF Research Database (Denmark)

    Taran, Yariv; Chester Goduscheit, René; Boer, Harry

    2015-01-01

    This paper focuses on the challenges related to, and the risk management needed in, the process of business model innovation. Business model innovation may involve hefty investments, high levels of uncertainty, complexity and, inevitably, risk. Although many firms follow a first-mover strategy … industrial firms, we discuss the reasons that led to these failures, and outline various possible solutions for practitioners to manage business model innovation adequately.

  1. A software quality model and metrics for risk assessment

    Science.gov (United States)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric, and their usability and applicability, are discussed.

  2. Predicting the cumulative risk of death during hospitalization by modeling weekend, weekday and diurnal mortality risks.

    Science.gov (United States)

    Coiera, Enrico; Wang, Ying; Magrabi, Farah; Concha, Oscar Perez; Gallego, Blanca; Runciman, William

    2014-05-21

    Current prognostic models factor in patient- and disease-specific variables but do not consider cumulative risks of hospitalization over time. We developed risk models of the likelihood of death associated with cumulative exposure to hospitalization, based on time-varying risks of hospitalization over any given day, as well as day of the week. Model performance was evaluated alone, and in combination with simple disease-specific models. Patients admitted between 2000 and 2006 from 501 public and private hospitals in NSW, Australia were used for training, and 2007 data for evaluation. The impact of hospital care delivered over different days of the week and/or times of the day was modeled by separating hospitalization risk into 21 separate time periods (morning, day, night across the days of the week). Three models were developed to predict death up to 7 days post-discharge: (1) a simple background risk model using age and gender; (2) a time-varying risk model for exposure to hospitalization (admission time, days in hospital); (3) disease-specific models (Charlson comorbidity index, DRG). Combining these three generated a full model. Models were evaluated by accuracy, AUC, and Akaike and Bayesian information criteria. There was a clear diurnal rhythm to hospital mortality in the data set, peaking in the evening, as well as the well-known 'weekend effect' whereby mortality peaks with weekend admissions. Individual models had modest performance on the test data set (AUC 0.71, 0.79 and 0.79, respectively). The combined model, which included time-varying risk, however yielded an average AUC of 0.92. This model performed best for stays up to 7 days (93% of admissions), peaking at days 3 to 5 (AUC 0.94). Risks of hospitalization vary not just with the day of the week but also the time of the day, and can be used to make predictions about the cumulative risk of death associated with an individual's hospitalization. Combining disease-specific models with such time-varying estimates appears to …

  3. Recursive inter-generational utility in global climate risk modeling

    Energy Technology Data Exchange (ETDEWEB)

    Minh, Ha-Duong [Centre International de Recherche sur l' Environnement et le Developpement (CIRED-CNRS), 75 - Paris (France); Treich, N. [Institut National de Recherches Agronomiques (INRA-LEERNA), 31 - Toulouse (France)

    2003-07-01

    This paper distinguishes relative risk aversion and resistance to inter-temporal substitution in climate risk modeling. Stochastic recursive preferences are introduced in a stylized numeric climate-economy model using preliminary IPCC 1998 scenarios. It shows that higher risk aversion increases the optimal carbon tax. Higher resistance to inter-temporal substitution alone has the same effect as increasing the discount rate, provided that the risk is not too large. We discuss implications of these findings for the debate upon discounting and sustainability under uncertainty. (author)
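
    One standard way to separate these two attitudes is a recursive (Epstein-Zin type) utility specification, sketched below in LaTeX; this is offered as a plausible form under stated assumptions, not necessarily the paper's exact formulation.

        % Recursive (Epstein-Zin type) utility: \gamma = relative risk aversion,
        % \rho = resistance to intertemporal substitution, \beta = discount factor.
        U_t = \Big[ (1-\beta)\, c_t^{\,1-\rho}
              + \beta \,\big( \mathbb{E}_t\!\big[ U_{t+1}^{\,1-\gamma} \big] \big)^{\frac{1-\rho}{1-\gamma}}
              \Big]^{\frac{1}{1-\rho}}

    Setting rho = gamma collapses this to standard discounted expected utility, which is why the two attitudes cannot be separated in the classical model.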

  4. Risk prediction models for melanoma: a systematic review.

    Science.gov (United States)

    Usher-Smith, Juliet A; Emery, Jon; Kassianos, Angelos P; Walter, Fiona M

    2014-08-01

    Melanoma incidence is increasing rapidly worldwide among white-skinned populations. Earlier diagnosis is the principal factor that can improve prognosis. Defining high-risk populations using risk prediction models may help targeted screening and early detection approaches. In this systematic review, we searched Medline, EMBASE, and the Cochrane Library for primary research studies reporting or validating models to predict risk of developing cutaneous melanoma. A total of 4,141 articles were identified from the literature search and six through citation searching. Twenty-five risk models were included. Between them, the models considered 144 possible risk factors, including 18 measures of number of nevi and 26 of sun/UV exposure. Those most frequently included in final risk models were number of nevi, presence of freckles, history of sunburn, hair color, and skin color. Despite the different factors included and different cutoff values for sensitivity and specificity, almost all models yielded sensitivities and specificities that fit along a summary ROC with area under the ROC (AUROC) of 0.755, suggesting that most models had similar discrimination. Only two models have been validated in separate populations and both also showed good discrimination with AUROC values of 0.79 (0.70-0.86) and 0.70 (0.64-0.77). Further research should focus on validating existing models rather than developing new ones. ©2014 American Association for Cancer Research.

  5. Mechanistic modeling of insecticide risks to breeding birds in ...

    Science.gov (United States)

    Insecticide usage in the United States is ubiquitous in urban, suburban, and rural environments. In evaluating data for an insecticide registration application and for registration review, scientists at the United States Environmental Protection Agency (USEPA) assess the fate of the insecticide and the risk the insecticide poses to the environment and non-target wildlife. At the present time, current USEPA risk assessments do not include population-level endpoints. In this paper, we present a new mechanistic model, which allows risk assessors to estimate the effects of insecticide exposure on the survival and seasonal productivity of birds known to use agricultural fields during their breeding season. The new model was created from two existing USEPA avian risk assessment models, the Terrestrial Investigation Model (TIM v.3.0) and the Markov Chain Nest Productivity model (MCnest). The integrated TIM/MCnest model has been applied to assess the relative risk of 12 insecticides used to control corn pests on a suite of 31 avian species known to use cornfields in midwestern agroecosystems. The 12 insecticides that were assessed in this study are all used to treat major pests of corn (corn root worm borer, cutworm, and armyworm). After running the integrated TIM/MCnest model, we found extensive differences in risk to birds among insecticides, with chlorpyrifos and malathion (organophosphates) generally posing the greatest risk, and bifenthrin and λ-cyhalothrin (…

  6. Dementia risk prediction in the population: are screening models accurate?

    Science.gov (United States)

    Stephan, Blossom C M; Kurth, Tobias; Matthews, Fiona E; Brayne, Carol; Dufouil, Carole

    2010-06-01

    Early identification of individuals at risk of dementia will become crucial when effective preventative strategies for this condition are developed. Various dementia prediction models have been proposed, including clinic-based criteria for mild cognitive impairment and more broadly constructed algorithms, which synthesize information from known dementia risk factors, such as poor cognition and health. Knowledge of the predictive accuracy of such models will be important if they are to be used in daily clinical practice or to screen the entire older population (individuals aged ≥65 years). This article presents an overview of recent progress in the development of dementia prediction models for use in population screening. In total, 25 articles relating to dementia risk screening met our inclusion criteria for review. Our evaluation of the predictive accuracy of each model shows that most are poor at discriminating at-risk individuals from not-at-risk cases. The best models incorporate diverse sources of information across multiple risk factors. Typically, poor accuracy is associated with single-factor models, long follow-up intervals, and the outcome measure of all-cause dementia. A parsimonious and cost-effective consensus model needs to be developed that accurately identifies individuals with a high risk of future dementia.

  7. Risk Based Milk Pricing Model at Dairy Farmers Level

    Directory of Open Access Journals (Sweden)

    W. Septiani

    2017-12-01

    The milk price paid by a cooperative institution to farmers does not fully cover the production cost, even though dairy farmers encounter various risks and uncertainties in conducting their business. The highest risk in milk supply lies in the activities at the farm. This study was designed to formulate a model for calculating the milk price at farmer level based on risk. Risks that occur on farms include the risks of cow breeding, sanitation, health care, cattle feed management, milking, and milk sales. The research used farm locations in the West Java region. There were five main stages in the preparation of this model: (1) identification and analysis of influential factors; (2) development of a conceptual model; (3) structural analysis and the amount of production costs; (4) calculation of production cost with risk factors; and (5) the risk-based milk pricing model. This research built a relationship between risks on smallholder dairy farms and the production costs to be incurred by the farmers. A formulation for calculating a risk adjustment factor for the variable costs of production on dairy farms was also obtained. The difference between production costs with risk and the total production cost without risk was about 8% to 10%. It could be concluded that the basic milk price proposed based on the research was around IDR 4,250-IDR 4,350/L for ownership of three to four cows. Farmer income is expected to increase by entering this risk value into the calculation of production costs.
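
    A minimal sketch of how such a risk adjustment factor might be folded into the variable cost to obtain a risk-based price. The cost figures below are assumptions chosen only so that an 8-10% uplift lands near the quoted IDR 4,250-4,350/L range; they are not the paper's data.

        base_variable_cost = 3_700.0   # IDR per litre without risk (assumed)
        fixed_cost = 250.0             # IDR per litre (assumed)

        def milk_price(risk_factor):
            """Risk-based price: variable cost scaled by (1 + risk factor), plus fixed cost."""
            return base_variable_cost * (1.0 + risk_factor) + fixed_cost

        for rf in (0.08, 0.10):
            print(f"risk factor {rf:.0%}: price ~ IDR {milk_price(rf):,.0f}/L")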

  8. Risk models to predict hypertension: a systematic review.

    Directory of Open Access Journals (Sweden)

    Justin B Echouffo-Tcheugui

    BACKGROUND: As well as being a risk factor for cardiovascular disease, hypertension is also a health condition in its own right. Risk prediction models may be of value in identifying those individuals at risk of developing hypertension who are likely to benefit most from interventions. METHODS AND FINDINGS: To synthesize existing evidence on the performance of these models, we searched MEDLINE and EMBASE, examined bibliographies of retrieved articles, contacted experts in the field, and searched our own files. Dual review of identified studies was conducted. Included studies had to report on the development, validation, or impact analysis of a hypertension risk prediction model. For each publication, information was extracted on study design and characteristics, predictors, model discrimination, calibration and reclassification ability, validation, and impact analysis. Eleven studies reporting on 15 different hypertension risk prediction models were identified. Age, sex, body mass index, diabetes status, and blood pressure variables were the most common predictor variables included in models. Most risk models had acceptable-to-good discriminatory ability (C-statistic > 0.70) in the derivation sample. Calibration was less commonly assessed, but overall acceptable. Two hypertension risk models, the Framingham and Hopkins, have been externally validated, displaying acceptable-to-good discrimination, with C-statistics ranging from 0.71 to 0.81. Lack of individual-level data precluded analyses of the risk models in subgroups. CONCLUSIONS: The discrimination ability of existing hypertension risk prediction tools is acceptable, but the impact of using these tools on prescriptions and outcomes of hypertension prevention is unclear.

  9. The Application of Asymmetric Liquidity Risk Measure in Modelling the Risk of Investment

    Directory of Open Access Journals (Sweden)

    Garsztka Przemysław

    2015-06-01

    The article analyses the relationship between investment risk (as measured by the variance or standard deviation of returns) and liquidity risk. The paper presents a method for calculating a new measure of liquidity risk, based on the characteristic line. In addition, the impact of liquidity risk on the volatility of daily returns is examined. To describe this relationship, dynamic econometric models were used. It was found that there was an econometric relationship between the proposed liquidity risk measure and the variance of returns.

  10. Risk models and scores for type 2 diabetes: systematic review.

    Science.gov (United States)

    Noble, Douglas; Mathur, Rohini; Dent, Tom; Meads, Catherine; Greenhalgh, Trisha

    2011-11-28

    To evaluate current risk models and scores for type 2 diabetes and inform selection and implementation of these in practice. Systematic review using standard (quantitative) and realist (mainly qualitative) methodology. Inclusion criteria: papers in any language describing the development or external validation, or both, of models and scores to predict the risk of an adult developing type 2 diabetes. Medline, PreMedline, Embase, and Cochrane databases were searched. Included studies were citation tracked in Google Scholar to identify follow-on studies of usability or impact. Data were extracted on statistical properties of models, details of internal or external validation, and use of risk scores beyond the studies that developed them. Quantitative data were tabulated to compare model components and statistical properties. Qualitative data were analysed thematically to identify mechanisms by which use of the risk model or score might improve patient outcomes. 8864 titles were scanned, 115 full text papers considered, and 43 papers included in the final sample. These described the prospective development or validation, or both, of 145 risk prediction models and scores, 94 of which were studied in detail here. They had been tested on 6.88 million participants followed for up to 28 years. Heterogeneity of primary studies precluded meta-analysis. Some but not all risk models or scores had robust statistical properties (for example, good discrimination and calibration) and had been externally validated on a different population. Genetic markers added nothing to models over clinical and sociodemographic factors. Most authors described their score as "simple" or "easily implemented," although few were specific about the intended users and under what circumstances. Ten mechanisms were identified by which measuring diabetes risk might improve outcomes. Follow-on studies that applied a risk score as part of an intervention aimed at reducing actual risk in people were sparse.

  11. A review of unmanned aircraft system ground risk models

    Science.gov (United States)

    Washington, Achim; Clothier, Reece A.; Silva, Jose

    2017-11-01

    There is much effort being directed towards the development of safety regulations for unmanned aircraft systems (UAS). National airworthiness authorities have advocated the adoption of a risk-based approach, whereby regulations are driven by the outcomes of a systematic process to assess and manage identified safety risks. Subsequently, models characterising the primary hazards associated with UAS operations have now become critical to the development of regulations and in turn, to the future of the industry. Key to the development of airworthiness regulations for UAS is a comprehensive understanding of the risks UAS operations pose to people and property on the ground. A comprehensive review of the literature identified 33 different models (and component sub models) used to estimate ground risk posed by UAS. These models comprise failure, impact location, recovery, stress, exposure, incident stress and harm sub-models. The underlying assumptions and treatment of uncertainties in each of these sub-models differ significantly between models, which can have a significant impact on the development of regulations. This paper reviews the state-of-the-art in research into UAS ground risk modelling, discusses how the various sub-models relate to the different components of the regulation, and explores how model-uncertainties potentially impact the development of regulations for UAS.
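
    A minimal sketch of the multiplicative event-chain structure that many of the reviewed ground risk models share, with one illustrative probability per sub-model (failure, recovery, impact location, exposure, harm). All numbers are assumptions; real models replace each factor with a detailed sub-model.

        def ground_fatality_rate(failure_rate_per_hr, p_no_recovery,
                                 p_impact_populated, exposure, p_harm_given_hit):
            """Expected ground fatalities per flight hour (toy event chain)."""
            return (failure_rate_per_hr * p_no_recovery *
                    p_impact_populated * exposure * p_harm_given_hit)

        rate = ground_fatality_rate(1e-4,  # failures per flight hour
                                    0.5,   # recovery (e.g., parachute) fails
                                    0.3,   # debris lands in a populated area
                                    0.02,  # expected people within impact area
                                    0.8)   # impact is lethal
        print(f"~{rate:.1e} expected ground fatalities per flight hour")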

  12. The Tripartite Model of Risk Perception (TRIRISK): Distinguishing Deliberative, Affective, and Experiential Components of Perceived Risk.

    Science.gov (United States)

    Ferrer, Rebecca A; Klein, William M P; Persoskie, Alexander; Avishai-Yitshak, Aya; Sheeran, Paschal

    2016-10-01

    Although risk perception is a key predictor in health behavior theories, current conceptions of risk comprise only one (deliberative) or two (deliberative vs. affective/experiential) dimensions. This research tested a tripartite model that distinguishes among deliberative, affective, and experiential components of risk perception. In two studies, and in relation to three common diseases (cancer, heart disease, diabetes), we used confirmatory factor analyses to examine the factor structure of the tripartite risk perception (TRIRISK) model and compared the fit of the TRIRISK model to dual-factor and single-factor models. In a third study, we assessed concurrent validity by examining the impact of cancer diagnosis on (a) levels of deliberative, affective, and experiential risk perception, and (b) the strength of relations among risk components, and tested predictive validity by assessing relations with behavioral intentions to prevent cancer. The tripartite factor structure was supported, producing better model fit across diseases (studies 1 and 2). Inter-correlations among the components were significantly smaller among participants who had been diagnosed with cancer, suggesting that affected populations make finer-grained distinctions among risk perceptions (study 3). Moreover, all three risk perception components predicted unique variance in intentions to engage in preventive behavior (study 3). The TRIRISK model offers both a novel conceptualization of health-related risk perceptions, and new measures that enhance predictive validity beyond that engendered by unidimensional and bidimensional models. The present findings have implications for the ways in which risk perceptions are targeted in health behavior change interventions, health communications, and decision aids.

  13. A Duality Result for the Generalized Erlang Risk Model

    Directory of Open Access Journals (Sweden)

    Lanpeng Ji

    2014-11-01

    In this article, we consider the generalized Erlang risk model and its dual model. By using a conditional measure-preserving correspondence between the two models, we derive an identity for two interesting conditional probabilities. Applications to the discounted joint density of the surplus prior to ruin and the deficit at ruin are also discussed.

  14. The proportional odds cumulative incidence model for competing risks

    DEFF Research Database (Denmark)

    Eriksson, Frank; Li, Jianing; Scheike, Thomas

    2015-01-01

    We suggest an estimator for the proportional odds cumulative incidence model for competing risks data. The key advantage of this model is that the regression parameters have the simple and useful odds ratio interpretation. The model has been considered by many authors, but it is rarely used in practice …

  15. Evaluating the Security Risks of System Using Hidden Markov Models

    African Journals Online (AJOL)

    Evaluating the Security Risks of System Using Hidden Markov Models. ... tool to an existing multifactor authentication model. The results of the analysis and the empirical study provide insights into the authentication model design problem and establish a foundation for future research in system authentication application.

  16. Decentralized Event-Based Communication Strategy on Leader-Follower Consensus Control

    Directory of Open Access Journals (Sweden)

    Duosi Xie

    2016-01-01

    This paper addresses the leader-follower consensus problem of networked systems by using a decentralized event-based control strategy. The event-based control strategy makes the controllers of agents update at aperiodic event instants. Two decentralized event functions are designed to generate these event instants. In particular, the second event function only uses its own information and the neighbors' states at their latest event instants. By using this event function, no continuous communication among followers is required. As the followers only communicate at these discrete event instants, this strategy is able to save communication resources and to reduce channel occupation. It is analytically shown that the leader-follower networked system is able to reach consensus by utilizing the proposed control strategy. Simulation examples illustrate the effectiveness of the proposed control strategy.
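
    A minimal sketch of the event-triggered idea for single-integrator followers pinned to a static leader: each follower rebroadcasts its state only when its measurement error exceeds a threshold, so controllers change only at event instants. The graph, gains, threshold, and trigger rule are illustrative assumptions, not the paper's specific event functions.

        import numpy as np

        x = np.array([0.0, 2.0, -1.0])      # follower states
        x_hat = x.copy()                    # states broadcast at latest events
        leader = 1.0
        A = np.array([[0, 1, 0],            # follower communication graph
                      [1, 0, 1],
                      [0, 1, 0]])
        pinned = np.array([1.0, 0.0, 0.0])  # follower 0 observes the leader
        dt, threshold = 0.01, 0.05

        for _ in range(2000):
            # Event function: trigger when the measurement error exceeds the threshold
            trigger = np.abs(x - x_hat) > threshold
            x_hat[trigger] = x[trigger]     # broadcast only at event instants
            u = -(A.sum(axis=1) * x_hat - A @ x_hat) - pinned * (x_hat - leader)
            x = x + dt * u

        print(np.round(x, 2))  # all followers end up near the leader's state 1.0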

  17. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to exchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked on classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy, with the same accuracy.

  18. Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance

    Science.gov (United States)

    Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra

    2017-06-01

    In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in methodological focus from reliability and probabilities (expected values) to reliability, uncertainty, and risk. In this paper, the authors explain a novel methodology for risk quantification and for ranking critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient is called the hazardous risk coefficient, which covers anticipated hazards that may occur in the future; here the risk is deduced from criteria of consequences on safety, environment, maintenance, and economic risks, with corresponding costs for consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient; hence 'random number simulation' is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of critical items are then estimated. The prioritization in ranking of critical items using the developed mathematical model for risk assessment shall be useful in optimizing financial losses and the timing of maintenance actions.

  19. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events, and transform-fault earthquakes, for events for which the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and is therefore not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May …
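
    A minimal sketch of the underlying matched-filter principle: correlate a synthetic template against continuous traces, stack the normalised correlations over stations, and declare a detection where the stack exceeds a noise-relative threshold. The template, data, station count, and threshold are all synthetic assumptions, standing in for the strain Green's tensor and real seismograms.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 12_000                                # samples of continuous data
        template = (np.exp(-np.linspace(0, 5, 100))
                    * np.sin(np.linspace(0, 20, 100)))  # stand-in Green's function

        stack = np.zeros(n)
        for _ in range(5):                        # five stations
            trace = rng.normal(0, 1.0, n)         # background noise
            trace[4000:4100] += 3.0 * template    # weak buried event
            xcorr = np.correlate(trace, template, mode="same")
            stack += xcorr / np.std(xcorr)        # normalise, then stack

        threshold = 8 * np.median(np.abs(stack))  # noise-relative detection level
        hits = np.where(stack > threshold)[0]
        print("detection near sample:", hits.min() if hits.size else "none")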

  20. Event-based media processing and analysis: A survey of the literature

    OpenAIRE

    Tzelepis, Christos; Ma, Zhigang; MEZARIS, Vasileios; Ionescu, Bogdan; Kompatsiaris, Ioannis; Boato, Giulia; Sebe, Nicu; Yan, Shuicheng

    2016-01-01

    Research on event-based processing and analysis of media is receiving an increasing attention from the scientific community due to its relevance for an abundance of applications, from consumer video management and video surveillance to lifelogging and social media. Events have the ability to semantically encode relationships of different informational modalities, such as visual-audio-text, time, involved agents and objects, with the spatio-temporal component of events being a key feature for ...

  1. Valenced Cues and Contexts Have Different Effects on Event-Based Prospective Memory

    OpenAIRE

    Peter Graf; Martin Yu

    2015-01-01

    This study examined the separate influence and joint influences on event-based prospective memory task performance due to the valence of cues and the valence of contexts. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral compared to valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when ...

  2. Detection of vulnerable relays and sensitive controllers under cascading events based on performance indices

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Hu, Yanting

    2014-01-01

    An impedance margin sensitivity based detection strategy is proposed to identify the vulnerable relays and sensitive controllers under the overloading situation during cascading events. Based on the impedance margin sensitivity, diverse performance indices are proposed to help improve this detection. A study case of voltage instability induced cascading blackout, built in a real-time digital simulator (RTDS), is used to demonstrate the proposed strategy. The simulation results indicate that this strategy can effectively detect the vulnerable relays and sensitive controllers under overloading situations.

  3. Risk prediction model: Statistical and artificial neural network approach

    Science.gov (United States)

    Paiman, Nuur Azreen; Hariri, Azian; Masood, Ibrahim

    2017-04-01

    Prediction models are increasingly gaining popularity and have been used in numerous areas of study to complement and support clinical reasoning and decision making. The adoption of such models assists physicians' decision making and individuals' behavior, and consequently improves individual outcomes and the cost-effectiveness of care. The objective of this paper is to review articles related to risk prediction models in order to understand the suitable approaches to, and the development and validation processes of, risk prediction models. A qualitative review of the aims, methods, and significant main outcomes of nineteen published articles that developed risk prediction models in numerous fields was done. This paper also reviews how researchers develop and validate risk prediction models based on statistical and artificial neural network approaches. From the review, some methodological recommendations for developing and validating prediction models are highlighted. According to the studies reviewed, artificial neural network approaches to developing prediction models were more accurate than statistical approaches. However, only limited published literature currently discusses which approach is more accurate for risk prediction model development.

  4. A flexible model for actuarial risks under dependence

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.; Lukocius, V.

    Methods for computing risk measures, such as stop-loss premiums, tacitly assume independence of the underlying individual risks. This can lead to huge errors even when only small dependencies occur. In the present paper, a general model is developed which covers what happens in practice in a …

  5. Tests of risk premia in linear factor models

    NARCIS (Netherlands)

    Kleibergen, F.R.

    2005-01-01

    We show that inference on risk premia in linear factor models that is based on the Fama-MacBeth and GLS risk premia estimators is misleading when the β's are small and/or the number of assets is large. We propose some novel statistics that remain trustworthy in these cases. The inadequacy of …

  6. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    Science.gov (United States)

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

    The involvement or non-involvement of a clock-like neural process, an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a, doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of the rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence in different ways the timing mechanisms for repetitive IWFEs. Sets of IWFEs were analyzed by the windowed (lag one) autocorrelation, wγ(1), a statistical method recently introduced for the distinction between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, not counteracting the natural shift of wγ(1) toward positive values as the frequency of movements increases.
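
    A minimal sketch of a windowed lag-one autocorrelation of inter-response intervals, the kind of statistic named above (negative values point to event-based, clock-like timing; positive values to emergent timing). The window length and the simulated Wing-Kristofferson-style interval series are illustrative assumptions, not the study's data or exact estimator.

        import numpy as np

        def windowed_lag1_autocorr(intervals, window=16):
            """Mean lag-1 autocorrelation over non-overlapping windows."""
            vals = []
            for start in range(0, len(intervals) - window + 1, window):
                w = np.asarray(intervals[start:start + window], float)
                w = w - w.mean()
                denom = np.sum(w * w)
                if denom > 0:
                    vals.append(np.sum(w[:-1] * w[1:]) / denom)
            return float(np.mean(vals))

        rng = np.random.default_rng(3)
        C = rng.normal(500, 20, 200)   # central clock intervals, ms
        M = rng.normal(0, 10, 201)     # motor implementation delays, ms
        intervals = C + np.diff(M)     # event-based (clock-like) interval series
        print(round(windowed_lag1_autocorr(intervals), 2))  # negative: event-based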

  7. Hierarchical Modelling of Flood Risk for Engineering Decision Analysis

    DEFF Research Database (Denmark)

    Custer, Rocco

    … to changing flood risk. In the presence of flood protection structures, flood development depends on the state of all protection structures in the system. As such, hazard is a function not only of rainfall and river discharge, but also of protection structures' fragility. A methodology for flood risk analysis and decision analysis for hierarchical flood protection systems is proposed, which allows for joint consideration of hazard models and fragility models of protection structures. In the implementation of the flood risk analysis methodology, several challenges are identified, two of which are addressed … systems, as well as the implementation of the flood risk analysis methodology and the vulnerability modelling approach, are illustrated with an example application. In summary, the present thesis provides a characterisation of hierarchical flood protection systems as well as several methodologies to model …

  8. Tactical Medical Logistics Planning Tool: Modeling Operational Risk Assessment

    National Research Council Canada - National Science Library

    Konoske, Paula

    2004-01-01

    ...) models the patient flow from the point of injury through more definitive care, and (2) supports operations research and systems analysis studies, operational risk assessment, and field medical services planning. TML+...

  9. Modeling Exposure to Persistent Chemicals in Hazard and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cowan-Ellsberry, Christina E.; McLachlan, Michael S.; Arnot, Jon A.; MacLeod, Matthew; McKone, Thomas E.; Wania, Frank

    2008-11-01

    Fate and exposure modeling has not thus far been explicitly used in the risk profile documents prepared to evaluate the significant adverse effects of candidate chemicals for either the Stockholm Convention or the Convention on Long-Range Transboundary Air Pollution. However, we believe models have considerable potential to improve the risk profiles. Fate and exposure models are already used routinely in other similar regulatory applications to inform decisions, and they have been instrumental in building our current understanding of the fate of POP and PBT chemicals in the environment. The goal of this paper is to motivate the use of fate and exposure models in preparing risk profiles in the POP assessment procedure by providing strategies for incorporating and using models. The ways that fate and exposure models can be used to improve and inform the development of risk profiles include: (1) benchmarking the ratio of exposure and emissions of candidate chemicals to the same ratio for known POPs, thereby opening the possibility of combining this ratio with the relative emissions and relative toxicity to arrive at a measure of relative risk; (2) directly estimating the exposure of the environment, biota, and humans to provide information to complement measurements, or where measurements are not available or are limited; (3) identifying the key processes and chemical and/or environmental parameters that determine exposure, thereby allowing the effective prioritization of research or measurements to improve the risk profile; and (4) predicting future time trends, including how quickly exposure levels in remote areas would respond to reductions in emissions. Currently there is no standardized consensus model for use in the risk profile context. Therefore, to choose the appropriate model, the risk profile developer must evaluate how appropriate an existing model is for a specific setting and whether the assumptions and input data are relevant in the context of the application.

  10. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    Science.gov (United States)

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  11. Mental models of the benefits and risks of novel foods

    DEFF Research Database (Denmark)

    Hagemann, Kit; Scholderer, Joachim

    responsibilities in food safety risk assessment and environmental risk assessment. In Study 2, mental models were elicited in in-depth interviews with altogether 25 Danish consumers with wide variations in sociodemographic characteristics and education. As expected, the mental models elicited from the expert...... abstract, suggesting that most participants did not actually know anything about these processes beyond heuristic categorisations. In line with this, issues of uncertainty played a prominent role for consumers....

  12. A Corrosion Risk Assessment Model for Underground Piping

    Science.gov (United States)

    Datta, Koushik; Fraser, Douglas R.

    2009-01-01

    The Pressure Systems Manager at NASA Ames Research Center (ARC) has embarked on a project to collect data and develop risk assessment models to support risk-informed decision making regarding future inspections of underground pipes at ARC. This paper shows progress in one area of this project - a corrosion risk assessment model for the underground high-pressure air distribution piping system at ARC. It consists of a Corrosion Model for pipe segments, a Pipe Wrap Protection Model, and a Pipe Stress Model for a pipe segment. A Monte Carlo simulation of the combined models provides a distribution of the failure probabilities. Sensitivity study results show that the model uncertainty, or lack of knowledge, is the dominant contributor to the calculated unreliability of the underground piping system. As a result, the Pressure Systems Manager may consider investing resources specifically focused on reducing these uncertainties. Future work includes completing the data collection effort for the existing ground-based pressure systems and applying the risk models to risk-based inspection strategies of the underground pipes at ARC.
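
    To make the combined-model idea concrete, here is a minimal Monte Carlo sketch in the spirit of the corrosion/wrap/stress chain described above. All distributions, parameter values and variable names are illustrative assumptions, not values from the ARC study.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000  # Monte Carlo trials

        # Hypothetical distributions standing in for the three sub-models.
        wall0 = rng.normal(6.0, 0.3, N)                       # initial wall thickness, mm
        corrosion_rate = rng.lognormal(np.log(0.08), 0.5, N)  # mm/year
        wrap_intact = rng.random(N) > 0.2                     # wrap protection holds in 80% of trials
        years = 30.0

        # Wrap protection slows corrosion; epistemic (lack-of-knowledge)
        # uncertainty enters as a multiplicative model factor.
        model_factor = rng.lognormal(0.0, 0.4, N)
        effective_rate = corrosion_rate * np.where(wrap_intact, 0.3, 1.0) * model_factor
        wall_end = wall0 - effective_rate * years

        # Stress model: failure when the remaining wall is below the
        # minimum thickness required to carry the hoop stress.
        min_wall = rng.normal(2.5, 0.2, N)
        p_fail = np.mean(wall_end < min_wall)
        print(f"Estimated failure probability over {years:.0f} years: {p_fail:.4f}")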

  13. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    Science.gov (United States)

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, tunnel characteristics such as tunnel configurations, geometries, provisions of tunnel electrical and mechanical systems, traffic volumes, etc. may vary from one section to another. Urban road tunnels characterized by such nonuniform parameters are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels because the existing QRA models for road tunnels are inapplicable to assess the risks in these road tunnels. This model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into various homogenous sections. Individual risk for road tunnel sections as well as integrated risk indices for the entire road tunnel are defined. The article then proceeds to develop a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. This article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out. © 2010 Society for Risk Analysis.
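
    As a concrete illustration of the accident-frequency step, here is a minimal Poisson regression sketch with an exposure offset. The covariates (traffic volume, gradient, section length) and all numbers are hypothetical stand-ins, not data from the Singapore study.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical per-section data: traffic volume, gradient, length (km).
        X = np.array([
            [30_000, 1.5, 0.8],
            [45_000, 0.5, 1.2],
            [52_000, 2.0, 0.6],
            [28_000, 1.0, 1.0],
            [61_000, 2.5, 0.9],
        ], dtype=float)
        accidents = np.array([3, 5, 7, 2, 9])   # observed counts per section-year
        exposure = X[:, 2]                      # section length as exposure

        X_design = sm.add_constant(X[:, :2])    # intercept + volume + gradient
        model = sm.GLM(accidents, X_design, family=sm.families.Poisson(),
                       offset=np.log(exposure))
        result = model.fit()
        print(result.params)                                       # log-rate coefficients
        print(result.predict(X_design, offset=np.log(exposure)))   # expected frequencies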

  14. Usefulness and limitations of global flood risk models

    Science.gov (United States)

    Ward, Philip; Jongman, Brenden; Salamon, Peter; Simpson, Alanna; Bates, Paul; De Groeve, Tom; Muis, Sanne; Coughlan de Perez, Erin; Rudari, Roberto; Trigg, Mark; Winsemius, Hessel

    2016-04-01

    Global flood risk models are now a reality. Initially, their development was driven by a demand from users for first-order global assessments to identify risk hotspots. Relentless upward trends in flood damage over the last decade have enhanced interest in such assessments. The adoption of the Sendai Framework for Disaster Risk Reduction and the Warsaw International Mechanism for Loss and Damage Associated with Climate Change Impacts have made these efforts even more essential. As a result, global flood risk models are being used more and more in practice, by an increasingly large number of practitioners and decision-makers. However, they clearly have their limits compared to local models. To address these issues, a team of scientists and practitioners recently came together at the Global Flood Partnership meeting to critically assess the question 'What can('t) we do with global flood risk models?'. The results of this dialogue (Ward et al., 2015) will be presented, opening a discussion on similar broader initiatives at the science-policy interface in other natural hazards. In this contribution, examples are provided of successful applications of global flood risk models in practice (for example together with the World Bank, Red Cross, and UNISDR), and limitations and gaps between user 'wish-lists' and model capabilities are discussed. Finally, a research agenda is presented for addressing these limitations and reducing the gaps. Ward et al., 2015. Nature Climate Change, doi:10.1038/nclimate2742

  15. Adding Value to Ecological Risk Assessment with Population Modeling

    DEFF Research Database (Denmark)

    Forbes, Valery E.; Calow, Peter; Grimm, Volker

    2011-01-01

    Current measures used to estimate the risks of toxic chemicals are not relevant to the goals of the environmental protection process, and thus ecological risk assessment (ERA) is not used as extensively as it should be as a basis for cost-effective management of environmental resources. Appropriate...... population models can provide a powerful basis for expressing ecological risks that better inform the environmental management process and thus that are more likely to be used by managers. Here we provide at least five reasons why population modeling should play an important role in bridging the gap between...

  16. An integrated risk management model for source water protection areas.

    Science.gov (United States)

    Chiueh, Pei-Te; Shang, Wei-Ting; Lo, Shang-Lien

    2012-10-17

    Watersheds are recognized as the most effective management unit for the protection of water resources. For surface water supplies that use water from upstream watersheds, evaluating threats to water quality and implementing a watershed management plan are crucial for maintaining safe drinking water. The aim of this article is to establish a risk assessment model that provides basic information for identifying critical pollutants and areas at high risk for degraded water quality. In this study, a quantitative risk model that uses hazard quotients for each water quality parameter was combined with a qualitative risk model that uses the relative risk level of potential pollution events in order to characterize the current condition and potential risk of watersheds providing drinking water. In a case study of the Taipei Source Water Area in northern Taiwan, total coliforms and total phosphorus were the top two pollutants of concern. Intensive tea-growing and recreational activities around the riparian zone may contribute the greatest pollution to the watershed. Our risk assessment tool may be enhanced by developing, recording, and updating information on pollution sources in the water supply watersheds. Moreover, management authorities could use the resultant information to create watershed risk management plans.
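
    A hazard quotient is simply a measured concentration divided by its benchmark value. The short sketch below illustrates the calculation with made-up concentrations and assumed standards, not values from the Taipei case study.

        # Hazard quotient: measured concentration relative to a benchmark.
        # All values are purely illustrative, not from the Taipei study.
        benchmarks = {"total_coliforms": 100.0,   # CFU/100 mL, assumed standard
                      "total_phosphorus": 0.02}   # mg/L, assumed standard

        measured = {"total_coliforms": 240.0,
                    "total_phosphorus": 0.05}

        for param, conc in measured.items():
            hq = conc / benchmarks[param]
            flag = "exceeds benchmark" if hq > 1 else "within benchmark"
            print(f"{param}: HQ = {hq:.1f} ({flag})")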

  17. Systematic Review of Validity Assessments of Framingham Risk Score Results in Health Economic Modelling of Lipid-Modifying Therapies in Europe.

    Science.gov (United States)

    Hermansson, Jonas; Kahan, Thomas

    2017-10-27

    The Framingham Risk Score is used both in the clinical setting and in health economic analyses to predict the risk of future coronary heart disease events. Because it is based on an American population, the Framingham Risk Score has been criticised for potentially overestimating risk in European populations. We investigated whether the use of the Framingham Risk Score was actually validated in health economic studies that modelled the effects of lipid-lowering treatment with statins on coronary heart disease events in European populations. In this systematic literature review of all relevant published studies in English (literature searched September 2016 in PubMed, EMBASE and SCOPUS), 99 studies were identified and 22 were screened in full text, 18 of which were included. Key data were extracted and synthesised narratively. The only type of validation identified was a comparison against coronary heart disease risk data from one primary preventive and one secondary preventive clinical investigation, and from observational population data in one study. Taken together, those three studies reported an overall satisfactory accuracy of Framingham Risk Score predictions, but the Framingham Risk Score tended to underestimate non-fatal myocardial infarctions. In five studies, potential issues in applying the Framingham Risk Score to a European population were not addressed. Further studies are needed to ascertain that the Framingham Risk Score can accurately predict cardiovascular outcomes in health economic modelling studies on lipid-lowering therapy in European populations. Future modelling studies using the Framingham Risk Score would benefit from validating the results against other data.

  18. The Effects of Age and Cue-Action Reminders on Event-Based Prospective Memory Performance in Preschoolers

    Science.gov (United States)

    Kliegel, Matthias; Jager, Theodor

    2007-01-01

    The present study investigated event-based prospective memory in five age groups of preschoolers (i.e., 2-, 3-, 4-, 5-, and 6-year-olds). Applying a laboratory-controlled prospective memory procedure, the data showed that event-based prospective memory performance improves across the preschool years, at least between 3 and 6 years of age. However,…

  19. A model for the optimal risk management of (farm) firms

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    Current methods of risk management focus on efficiency and do not provide operational answers to the basic question of how to optimise and balance the two objectives, maximisation of expected income and minimisation of risk. This paper uses the Capital Asset Pricing Model (CAPM) to derive...... an operational criterion for the optimal risk management of firms. The criterion assumes that the objective of the firm manager is to maximise the market value of the firm and is based on the condition that the application of risk management tools has a symmetric effect on the variability of income around...... the mean. The criterion is based on the expected consequences of risk management on relative changes in the variance of return on equity and expected income. The paper demonstrates how the criterion may be used to evaluate and compare the effect of different risk management tools, and it illustrates how...

  20. A triple risk model for unexplained late stillbirth.

    Science.gov (United States)

    Warland, Jane; Mitchell, Edwin A

    2014-04-14

    The triple risk model for sudden infant death syndrome (SIDS) has been useful in understanding its pathogenesis. Risk factors for late stillbirth are well established, especially those relating to maternal and fetal wellbeing. We propose a similar triple risk model for unexplained late stillbirth. The proposed model results from the interplay of three groups of factors: (1) maternal factors (such as maternal age, obesity, smoking), (2) fetal and placental factors (such as intrauterine growth retardation, placental insufficiency), and (3) a stressor (such as venocaval compression from maternal supine sleep position, or sleep-disordered breathing). We argue that the risk factors within each group may in themselves be insufficient to cause the death, but when they interact they may produce a lethal combination. Unexplained late stillbirth occurs when a fetus who is somehow vulnerable dies as a result of encountering a stressor and/or maternal condition in a combination that is lethal for that fetus.

  1. Lung cancer in never smokers Epidemiology and risk prediction models

    Science.gov (United States)

    McCarthy, William J.; Meza, Rafael; Jeon, Jihyoun; Moolgavkar, Suresh

    2012-01-01

    In this chapter we review the epidemiology of lung cancer incidence and mortality among never smokers/nonsmokers and describe the never smoker lung cancer risk models used by CISNET modelers. Our review focuses on those influences likely to have measurable population impact on never smoker risk, such as secondhand smoke, even though the individual-level impact may be small. Occupational exposures may also contribute importantly to the population attributable risk of lung cancer. We examine the following risk factors in this chapter: age, environmental tobacco smoke, cooking fumes, ionizing radiation including radon gas, inherited genetic susceptibility, selected occupational exposures, preexisting lung disease, and oncogenic viruses. We also compare the prevalence of never smokers between the three CISNET smoking scenarios and present the corresponding lung cancer mortality estimates among never smokers as predicted by a typical CISNET model. PMID:22882894

  2. Calibration plots for risk prediction models in the presence of competing risks.

    Science.gov (United States)

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-08-15

    A predicted risk of 17% can be called reliable if the event can be expected to occur in about 17 of 100 patients who all received a predicted risk of 17%. Statistical models can predict the absolute risk of an event such as cardiovascular death in the presence of competing risks such as death due to other causes. For personalized medicine and patient counseling, it is necessary to check that the model is calibrated in the sense that it provides reliable predictions for all subjects. There are three often encountered practical problems when the aim is to display or test whether a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with all three problems, we propose to estimate calibration curves for competing risks models based on jackknife pseudo-values that are combined with a nearest-neighbor smoother and a cross-validation approach. Copyright © 2014 John Wiley & Sons, Ltd.
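
    The pseudo-value idea can be sketched directly: compute the Aalen-Johansen cumulative incidence with and without each subject, form jackknife pseudo-values, and compare their local averages with the predicted risks. The sketch below uses synthetic data and simple binning in place of the paper's nearest-neighbor smoothing and cross-validation.

        import numpy as np

        def cif_cause1(times, events, t):
            """Aalen-Johansen cumulative incidence of cause 1 at time t.
            events: 0 = censored, 1 = event of interest, 2 = competing event."""
            order = np.argsort(times)
            times, events = times[order], events[order]
            n_at_risk = len(times)
            surv = 1.0      # all-cause survival just before the current time
            cif = 0.0
            for ti, ei in zip(times, events):
                if ti > t:
                    break
                if ei == 1:
                    cif += surv / n_at_risk
                if ei != 0:
                    surv *= 1.0 - 1.0 / n_at_risk
                n_at_risk -= 1
            return cif

        def jackknife_pseudo_values(times, events, t):
            """Pseudo-value for subject i: n*theta_hat - (n-1)*theta_hat_(-i)."""
            n = len(times)
            theta = cif_cause1(times, events, t)
            idx = np.arange(n)
            return np.array([n * theta
                             - (n - 1) * cif_cause1(times[idx != i], events[idx != i], t)
                             for i in range(n)])

        # Synthetic data: predicted 5-year risks plus censored outcomes.
        rng = np.random.default_rng(0)
        n = 300
        pred = rng.uniform(0.05, 0.60, n)                      # model-predicted risks
        times = rng.exponential(8.0, n)
        events = rng.choice([0, 1, 2], n, p=[0.3, 0.4, 0.3])   # censoring + 2 causes

        pv = jackknife_pseudo_values(times, events, t=5.0)

        # Crude calibration curve: average pseudo-values within bins of
        # predicted risk (a stand-in for nearest-neighbor smoothing).
        bins = np.quantile(pred, np.linspace(0, 1, 6))
        for lo, hi in zip(bins[:-1], bins[1:]):
            sel = (pred >= lo) & (pred <= hi)
            print(f"predicted {lo:.2f}-{hi:.2f}: observed (pseudo) {pv[sel].mean():.2f}")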

  3. Lymphatic filariasis transmission risk map of India, based on a geo-environmental risk model.

    Science.gov (United States)

    Sabesan, Shanmugavelu; Raju, Konuganti Hari Kishan; Subramanian, Swaminathan; Srivastava, Pradeep Kumar; Jambulingam, Purushothaman

    2013-09-01

    The strategy adopted by a global program to interrupt transmission of lymphatic filariasis (LF) is mass drug administration (MDA) using chemotherapy. India also followed this strategy by introducing MDA in the historically known endemic areas. All other areas, which remained unsurveyed, were presumed to be nonendemic and left without any intervention. Therefore, identification of LF transmission risk areas in the entire country has become essential so that they can be targeted for intervention. A geo-environmental risk model (GERM) developed earlier was used to create a filariasis transmission risk map for India. In this model, a Standardized Filariasis Transmission Risk Index (SFTRI, based on geo-environmental risk variables) was used as a predictor of transmission risk. The relationship between SFTRI and the endemicity (historically known) of an area was quantified by logistic regression analysis. The quantified relationship was validated by assessing the filarial antigenemia status of children living in the unsurveyed areas through a ground truth study. A significant positive relationship was observed between SFTRI and the endemicity of an area. Overall, the model prediction of filarial endemic status of districts was found to be correct in 92.8% of the total observations. Thus, among the 190 districts hitherto unsurveyed, as many as 113 districts were predicted to be at risk, and the remaining at no risk. The GERM, developed on a geographic information system (GIS) platform, is useful for LF spatial delimitation on a macrogeographic/regional scale. Furthermore, the risk map developed will be useful for the national LF elimination program by identifying areas at risk for intervention and for undertaking surveillance in no-risk areas.
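
    The core quantification step is an ordinary logistic regression of known endemicity on the risk index. A minimal sketch follows, with entirely hypothetical SFTRI values and endemicity labels.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical training data: SFTRI for districts of known endemicity.
        sftri = np.array([0.12, 0.25, 0.31, 0.44, 0.52, 0.58, 0.66, 0.71, 0.83, 0.90])
        endemic = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])  # historically known status

        X = sm.add_constant(sftri)
        fit = sm.Logit(endemic, X).fit(disp=0)
        print(fit.params)      # intercept and SFTRI coefficient

        # Predict risk for unsurveyed districts; classify at a 0.5 threshold.
        new_sftri = np.array([0.40, 0.75])
        p = fit.predict(sm.add_constant(new_sftri))
        print(p, p > 0.5)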

  4. A semi-quantitative model for risk appreciation and risk weighing

    DEFF Research Database (Denmark)

    Bos, Peter M.J.; Boon, Polly E.; van der Voet, Hilko

    2009-01-01

    Risk managers need detailed information on (1) the type of effect, (2) the size (severity) of the expected effect(s) and (3) the fraction of the population at risk to decide on well-balanced risk reduction measures. A previously developed integrated probabilistic risk assessment (IPRA) model...... provides quantitative information on these three parameters. A semi-quantitative tool is presented that combines information on these parameters into easy-readable charts that will facilitate risk evaluations of exposure situations and decisions on risk reduction measures. This tool is based on a concept...... of health impact categorization that has been successfully in force for several years within several emergency planning programs. Four health impact categories are distinguished: No-Health Impact, Low-Health Impact, Moderate-Health Impact and Severe-Health Impact. Two different charts are presented...

  5. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    Science.gov (United States)

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  6. The Global Earthquake Model and Disaster Risk Reduction

    Science.gov (United States)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito, Ecuador. In agreement with GEM's collaborative approach, all

  7. Chain Risk Model for quantifying cost effectiveness of phytosanitary measures

    NARCIS (Netherlands)

    Benninga, J.; Hennen, W.H.G.J.; Schans, van de J.

    2010-01-01

    A Chain Risk Model (CRM) was developed for a cost effective assessment of phytosanitary measures. The CRM model can be applied to phytosanitary assessments of all agricultural product chains. In CRM, stages are connected by product volume flows with which pest infections can be spread from one stage

  8. Land Use Scenario Modeling for Flood Risk Mitigation

    Directory of Open Access Journals (Sweden)

    José I. Barredo

    2010-05-01

    It is generally accepted that flood risk has been increasing in Europe in the last decades. Accordingly, it becomes a priority to better understand its drivers and mechanisms. Flood risk is evaluated on the basis of three factors: hazard, exposure and vulnerability. If one of these factors increases, then so does the risk. Land use change models used for ex-ante assessment of spatial trends provide planners with powerful tools for territorial decision making. However, until recently this type of model has been largely neglected in strategic planning for flood risk mitigation. Thus, ex-ante assessment of flood risk is an innovative application of land use change models. The aim of this paper is to propose a flood risk mitigation approach using exposure scenarios. The methodology is applied in the Pordenone province in northern Italy. In the past 50 years Pordenone has suffered several heavy floods, the disastrous consequences of which demonstrated the vulnerability of the area. Results of this study confirm that the main driving force of increased flood risk is found in new urban developments in flood-prone areas.

  9. Crisis and emergency risk communication as an integrative model.

    Science.gov (United States)

    Reynolds, Barbara; W Seeger, Matthew

    2005-01-01

    This article describes a model of communication known as crisis and emergency risk communication (CERC). The model is outlined as a merger of many traditional notions of health and risk communication with work in crisis and disaster communication. The specific kinds of communication activities that should be called for at various stages of disaster or crisis development are outlined. Although crises are by definition uncertain, equivocal, and often chaotic situations, the CERC model is presented as a tool health communicators can use to help manage these complex events.

  10. A probabilistic model for seismic risk assessment of buildings

    OpenAIRE

    Aguilar, Armando; Pujades Beneit, Lluís; Barbat Barbat, Horia Alejandro; Lantada Zarzosa, Maria de Las Nieves

    2009-01-01

    A probabilistic model to estimate seismic risk is presented. The main elements of this model are the seismic hazard, the vulnerability, and the structural response. These elements are evaluated from a probabilistic point of view in order to compute the seismic risk. To illustrate the application of the proposed model, a new method, mLM1, is used to estimate the expected physical damage of the buildings located in a block of Barcelona. Most of the buildings in this block are unr...

  11. Modeling perceptions of climatic risk in crop production.

    Science.gov (United States)

    Reinmuth, Evelyn; Parker, Phillip; Aurbacher, Joachim; Högy, Petra; Dabbert, Stephan

    2017-01-01

    In agricultural production, land-use decisions are components of economic planning that result in the strategic allocation of fields. Climate variability represents an uncertainty factor in crop production. Considering yield impact, climatic influence is perceived during, and evaluated at the end of, crop production cycles. In practice, this information is then incorporated into planning for the upcoming season. This process contributes to attitudes toward climate-induced risk in crop production. In the literature, however, the subjective valuation of risk is modeled as a risk attitude toward variations in (monetary) outcomes. Consequently, climatic influence may be obscured by political and market influences, so that risk perceptions during the production process are neglected. We present a utility concept that allows the inclusion of annual risk scores based on mid-season risk perceptions that are incorporated into field-planning decisions. This approach is exemplified and implemented for winter wheat production in the Kraichgau, a region in Southwest Germany, using the integrated bio-economic simulation model FarmActor and empirical data from the region. Survey results indicate that a profitability threshold for this crop, the level of "still-good yield" (sgy), is 69 dt ha-1 (regional mean, Kraichgau sample) for a given season. This threshold governs the monitoring process and risk estimators. We tested the modeled estimators against simulation results using ten projected future weather time series for winter wheat production. The mid-season estimators generally proved to be effective. This approach can be used to improve the modeling of planning decisions by providing a more comprehensive evaluation of field-crop response to climatic changes from an economic risk point of view. The methodology further provides economic insight in an agrometeorological context where prices for crops or inputs are lacking, but farmer attitudes toward risk should still be included in

  12. Distributionally Robust Return-Risk Optimization Models and Their Applications

    Directory of Open Access Journals (Sweden)

    Li Yang

    2014-01-01

    Based on the risk control of conditional value-at-risk, distributionally robust return-risk optimization models with box constraints on the random vector are proposed. They describe uncertainty in both the distribution form and the moments (mean and covariance matrix) of the random vector. Such models are difficult to solve directly. Using conic duality theory and the minimax theorem, the models are reformulated as semidefinite programming problems, which can be solved by interior point algorithms in polynomial time. This provides an important theoretical basis for applications of the models. Moreover, an application of the models to a practical example of portfolio selection is considered, and the example is evaluated using a historical data set of four stocks. Numerical results show that the proposed methods are robust and the investment strategy is safe.
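
    For intuition, a plain sample-based CVaR portfolio problem in the Rockafellar-Uryasev form, with box constraints on the weights, can be written directly. This is not the paper's distributionally robust semidefinite reformulation; the return data are synthetic and the cvxpy modeling package is assumed available.

        import numpy as np
        import cvxpy as cp

        rng = np.random.default_rng(1)
        S, n = 500, 4                        # scenarios, assets (e.g., four stocks)
        R = rng.normal(0.001, 0.02, (S, n))  # simulated daily returns

        beta = 0.95                          # CVaR confidence level
        w = cp.Variable(n)                   # portfolio weights (box-constrained)
        a = cp.Variable()                    # VaR auxiliary variable
        losses = -R @ w
        cvar = a + cp.sum(cp.pos(losses - a)) / ((1 - beta) * S)

        prob = cp.Problem(cp.Minimize(cvar),
                          [cp.sum(w) == 1, w >= 0, w <= 0.5,
                           cp.sum(R @ w) / S >= 0.0])   # nonnegative mean return
        prob.solve()
        print("weights:", np.round(w.value, 3))
        print("optimal CVaR(95%):", round(float(cvar.value), 5))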

  13. A differential deficit in time- versus event-based prospective memory in Parkinson's disease.

    Science.gov (United States)

    Raskin, Sarah A; Woods, Steven Paul; Poquette, Amelia J; McTaggart, April B; Sethna, Jim; Williams, Rebecca C; Tröster, Alexander I

    2011-03-01

    The aim of the current study was to clarify the nature and extent of impairment in time- versus event-based prospective memory in Parkinson's disease (PD). Prospective memory is thought to involve cognitive processes that are mediated by prefrontal systems and are executive in nature. Given that individuals with PD frequently show executive dysfunction, it is important to determine whether these individuals may have deficits in prospective memory that could impact daily functions, such as taking medications. Although it has been reported that individuals with PD evidence impairment in prospective memory, it is still unclear whether they show a greater deficit for time- versus event-based cues. Fifty-four individuals with PD and 34 demographically similar healthy adults were administered a standardized measure of prospective memory that allows for a direct comparison of time-based and event-based cues. In addition, participants were administered a series of standardized measures of retrospective memory and executive functions. Individuals with PD demonstrated impaired prospective memory performance compared to the healthy adults, with a greater impairment demonstrated for the time-based tasks. Time-based prospective memory performance was moderately correlated with measures of executive functioning, but only the Stroop Neuropsychological Screening Test emerged as a unique predictor in a linear regression. Findings are interpreted within the context of McDaniel and Einstein's (2000) multiprocess theory to suggest that individuals with PD experience particular difficulty executing a future intention when the cue to execute the prescribed intention requires higher levels of executive control. (c) 2011 APA, all rights reserved

  14. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  15. Avian collision risk models for wind energy impact assessments

    Energy Technology Data Exchange (ETDEWEB)

    Masden, E.A., E-mail: elizabeth.masden@uhi.ac.uk [Environmental Research Institute, North Highland College-UHI, University of the Highlands and Islands, Ormlie Road, Thurso, Caithness KW14 7EE (United Kingdom); Cook, A.S.C.P. [British Trust for Ornithology, The Nunnery, Thetford IP24 2PU (United Kingdom)

    2016-01-15

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components (e.g. the tower), angle of approach, and uncertainty. Ten models were cited in the literature and of these, all included the probability of a single bird colliding with a wind turbine during passage through the rotor swept area, and the majority included a measure of the number of birds at risk. Seven of the 10 models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculating the probability of collision, and these were reused by other models. Six of the 10 models were deterministic and included the most frequently used models in the UK, with only four including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.
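
    The bookkeeping shared by most of these models reduces to: expected collisions are the number of rotor-swept-area passages, times the single-passage collision probability, times one minus the avoidance rate. A toy sketch follows, with every number an illustrative assumption, plus a crude uncertainty band over the avoidance rate of the kind only a minority of the reviewed models include.

        import numpy as np

        n_transits = 12_000     # bird passages through the rotor-swept area per year
        p_collision = 0.08      # probability of collision for a single passage
        avoidance_rate = 0.98   # proportion of birds taking avoiding action

        expected = n_transits * p_collision * (1 - avoidance_rate)
        print(f"Point estimate: {expected:.1f} collisions per year")

        # Crude uncertainty band: treat the avoidance rate as uncertain.
        rng = np.random.default_rng(7)
        avoidance_draws = rng.beta(98, 2, 10_000)   # centred near 0.98
        draws = n_transits * p_collision * (1 - avoidance_draws)
        lo, hi = np.percentile(draws, [2.5, 97.5])
        print(f"95% interval: {lo:.1f} to {hi:.1f} collisions per year")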

  16. Event-Based Control for Average Consensus of Wireless Sensor Networks with Stochastic Communication Noises

    Directory of Open Access Journals (Sweden)

    Chuan Ji

    2013-01-01

    This paper focuses on the average consensus problem for wireless sensor networks (WSNs) with fixed and Markovian switching, undirected and connected network topologies in a noisy environment. An event-based protocol is applied to each sensor node to reach consensus. An event-triggering strategy is designed based on a Lyapunov function. Under the event-trigger condition, some sufficient conditions for average consensus in mean square are obtained. Finally, some numerical simulations are given to illustrate the effectiveness of the results derived in this paper.
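
    A toy simulation conveys the mechanism: each node broadcasts its state only when an event is triggered, and neighbours update using the last broadcast values corrupted by noise. The decaying threshold below is an illustrative trigger rule, not the paper's Lyapunov-based condition, and all parameters are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)

        A = np.array([[0, 1, 0, 1],      # undirected, connected 4-node topology
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [1, 0, 1, 0]], dtype=float)
        L = np.diag(A.sum(1)) - A        # graph Laplacian

        x = rng.uniform(0, 10, 4)        # initial sensor readings
        x_hat = x.copy()                 # last broadcast states
        eps, noise_sd = 0.1, 0.05
        events = 0

        for k in range(200):
            trigger = np.abs(x - x_hat) > 0.5 * 0.97 ** k   # decaying threshold
            x_hat = np.where(trigger, x, x_hat)             # broadcast only on trigger
            events += trigger.sum()
            noisy = x_hat + rng.normal(0, noise_sd, 4)      # noisy neighbour states
            x = x - eps * (L @ noisy)                       # consensus update

        print("states:", x.round(3), "| broadcasts:", events, "of", 4 * 200)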

  17. Measuring Risk Structure Using the Capital Asset Pricing Model

    Directory of Open Access Journals (Sweden)

    Zdeněk Konečný

    2015-01-01

    This article aims to propose an innovative method for calculating the shares of operational and financial risk. This methodological tool will support managers in monitoring the risk structure. The method is based on the capital asset pricing model (CAPM) for calculating the cost of equity, namely on determination of the beta coefficient, which is the only variable that depends on entrepreneurial risk. Both alternative approaches to calculating beta are combined: accounting data are used, and unlevered and levered betas are distinguished. The novelty of the proposed method lies in including quantities that measure operational and financial risk in the beta calculation. The volatility of cash flow, as a measure of operational risk, is included in the unlevered beta. Return on equity based on cash flow and indebtedness are the variables used in calculating the levered beta. This modification makes it possible to calculate the share of operational risk as the ratio of the unlevered to the levered beta, with the share of financial risk as the remainder. The modified method is applied to companies from two sectors of the Czech economy, one cyclical and one neutral, to identify potential differences in risk structure. The findings show that in both sectors the share of operational risk is over 50%; however, it is more dominant in the neutral sector.
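
    The split described above is easy to state in code. The relevering formula below is the standard Hamada relation, used here as a stand-in for the article's accounting-based betas; all inputs are illustrative.

        # Share-of-risk split: operational share as the ratio of unlevered to
        # levered beta, financial share as the remainder (per the abstract).
        # The Hamada-style relevering and the numbers are assumptions.

        def levered_beta(beta_u: float, debt: float, equity: float, tax: float) -> float:
            """Relever an unlevered (asset) beta for capital structure."""
            return beta_u * (1.0 + (1.0 - tax) * debt / equity)

        beta_u = 0.85   # unlevered beta (operational risk only)
        beta_l = levered_beta(beta_u, debt=40.0, equity=60.0, tax=0.19)

        operational_share = beta_u / beta_l
        financial_share = 1.0 - operational_share
        print(f"levered beta: {beta_l:.3f}")
        print(f"operational risk share: {operational_share:.1%}")
        print(f"financial risk share:   {financial_share:.1%}")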

  18. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    Science.gov (United States)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.

  19. Declarative event based models of concurrency and refinement in psi-calculi

    DEFF Research Database (Denmark)

    Normann, Håkon; Johansen, Christian; Hildebrandt, Thomas

    2015-01-01

    -calculi representation of Dynamic Condition Response Graphs, which conservatively extends prime event structures to allow finite representations of (omega) regular finite (and infinite) behaviours and have been shown to support run-time adaptation and refinement. We end by outlining the final aim of this research, which...... is to explore nominal calculi for declarative, run-time adaptable mobile processes with shared resources....

  20. An Event-based Distributed Diagnosis Framework using Structural Model Decomposition

    Data.gov (United States)

    National Aeronautics and Space Administration — Complex engineering systems require efficient on-line fault diagnosis methodologies to improve safety and reduce maintenance costs. Traditionally, diagnosis...

  1. Evaluation of Foreign Exchange Risk Capital Requirement Models

    Directory of Open Access Journals (Sweden)

    Ricardo S. Maia Clemente

    2005-12-01

    This paper examines capital requirements for financial institutions in order to cover market risk stemming from exposure to foreign currencies. The models examined belong to two groups according to the approach involved: standardized and internal models. In the first group, we study the Basel model and the model adopted by Brazilian legislation. In the second group, we consider models based on the concept of value at risk (VaR): the single- and double-window historical models, the exponential smoothing model (EWMA), and a hybrid approach that combines features of both. The results suggest that the Basel model is inadequate for the Brazilian market, exhibiting a large number of exceptions. The model of the Brazilian legislation has no exceptions, though it generates higher capital requirements than the internal models based on VaR. In general, VaR-based models perform better and result in less capital allocation than the standardized approach applied in Brazil.
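
    For illustration, two of the internal approaches can be sketched in a few lines on synthetic return data: a one-day 99% historical VaR over a 250-day window, and an EWMA (RiskMetrics-style, lambda = 0.94) VaR. The window length, decay factor and data are assumptions, not the paper's calibration.

        import numpy as np

        rng = np.random.default_rng(5)
        returns = rng.normal(0, 0.012, 500)   # daily FX returns (illustrative)

        # Historical VaR: empirical quantile of the return window.
        var_hist = -np.percentile(returns[-250:], 1)       # 99%, one-day

        # EWMA VaR: recursive variance with decay 0.94, normal quantile.
        lam, var_t = 0.94, returns[0] ** 2
        for r in returns[1:]:
            var_t = lam * var_t + (1 - lam) * r ** 2
        var_ewma = 2.326 * np.sqrt(var_t)                   # z(0.99) = 2.326

        print(f"99% 1-day historical VaR: {var_hist:.4%}")
        print(f"99% 1-day EWMA VaR:       {var_ewma:.4%}")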

  2. Developing a risk management maturity model: a comprehensive risk maturity model for Dutch municipalities

    NARCIS (Netherlands)

    Cienfuegos Spikin, I.J.

    2013-01-01

    As in the private sector, risk management has also gained increasing popularity among public entities. Nonetheless, the correct implementation of risk management by public entities can be a difficult task to accomplish. The Dutch case is an interesting example, since municipalities in the Netherlands

  3. Beliefs and stochastic modelling of interest rate scenario risk

    Science.gov (United States)

    Galic, E.; Molgedey, L.

    2001-04-01

    We present a framework that allows for a systematic assessment of risk given a specific model and belief on the market. Within this framework the time evolution of risk is modeled in a twofold way. On the one hand, risk is modeled by the time-discrete and nonlinear GARCH(1,1) process, which allows for a (time-)local understanding of its level, together with a short-term forecast. On the other hand, via a diffusion approximation, the time evolution of the probability density of risk is modeled by a Fokker-Planck equation. Then, as a final step, using Bayes' theorem, beliefs are conditioned on the stationary probability density function obtained from the Fokker-Planck equation. We believe this to be a highly rigorous framework to integrate subjective judgments of future market behavior and underlying models. In order to demonstrate the approach, we apply it to risk assessment of empirical interest rate scenario methodologies, i.e. the application of Principal Component Analysis to the dynamics of bonds.
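
    For reference, the GARCH(1,1) recursion mentioned above can be simulated in a few lines. Parameter values here are illustrative; in practice omega, alpha and beta are fitted by maximum likelihood to the return series.

        import numpy as np

        # sigma2[t] = omega + alpha * r[t-1]^2 + beta * sigma2[t-1]
        omega, alpha, beta = 1e-6, 0.08, 0.90   # alpha + beta < 1 => stationary

        rng = np.random.default_rng(11)
        T = 1000
        r = np.empty(T)
        sigma2 = np.empty(T)
        sigma2[0] = omega / (1 - alpha - beta)  # unconditional variance
        r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()

        for t in range(1, T):
            sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
            r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

        print("unconditional sd:", round(float(np.sqrt(sigma2[0])), 5),
              "| sample sd:", round(float(r.std()), 5))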

  4. Risk Classification with an Adaptive Naive Bayes Kernel Machine Model.

    Science.gov (United States)

    Minnier, Jessica; Yuan, Ming; Liu, Jun S; Cai, Tianxi

    2015-04-22

    Genetic studies of complex traits have uncovered only a small number of risk markers explaining a small fraction of heritability and adding little improvement to disease risk prediction. Standard single marker methods may lack power in selecting informative markers or estimating effects. Most existing methods also typically do not account for non-linearity. Identifying markers with weak signals and estimating their joint effects among many non-informative markers remains challenging. One potential approach is to group markers based on biological knowledge such as gene structure. If markers in a group tend to have similar effects, proper usage of the group structure could improve power and efficiency in estimation. We propose a two-stage method relating markers to disease risk by taking advantage of known gene-set structures. Imposing a naive Bayes kernel machine (KM) model, we estimate gene-set specific risk models that relate each gene-set to the outcome in stage I. The KM framework efficiently models potentially non-linear effects of predictors without requiring explicit specification of functional forms. In stage II, we aggregate information across gene-sets via a regularization procedure. Estimation and computational efficiency are further improved with kernel principal component analysis. Asymptotic results for model estimation and gene set selection are derived and numerical studies suggest that the proposed procedure could outperform existing procedures for constructing genetic risk models.
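
    A loose two-stage analogue of the gene-set idea can be sketched with off-the-shelf tools: fit a nonlinear kernel machine per gene-set in stage I, then aggregate the per-set scores with a regularized model in stage II. This is only a structural illustration (scikit-learn assumed), not the authors' estimator, and a real analysis would use cross-validation rather than in-sample scores.

        import numpy as np
        from sklearn.kernel_ridge import KernelRidge
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(2)
        n, sets, p = 200, 5, 10
        X = [rng.normal(size=(n, p)) for _ in range(sets)]   # markers per gene-set
        y = (X[0][:, 0] ** 2 + X[1][:, 1] + rng.normal(0, 1, n) > 1).astype(int)

        # Stage I: nonlinear per-set risk scores via an RBF kernel machine.
        scores = np.column_stack([
            KernelRidge(kernel="rbf", alpha=1.0).fit(Xg, y).predict(Xg)
            for Xg in X
        ])

        # Stage II: regularized aggregation across gene-sets.
        clf = LogisticRegression(penalty="l2", C=0.5).fit(scores, y)
        print("gene-set weights:", clf.coef_.round(3))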

  5. Model based climate information on drought risk in Africa

    Science.gov (United States)

    Calmanti, S.; Syroka, J.; Jones, C.; Carfagna, F.; Dell'Aquila, A.; Hoefsloot, P.; Kaffaf, S.; Nikulin, G.

    2012-04-01

    The United Nations World Food Programme (WFP) has embarked upon the endeavor of creating a sustainable Africa-wide natural disaster risk management system. A fundamental building block of this initiative is the setup of a drought impact modeling platform called Africa RiskView that aims to quantify and monitor weather-related food security risk in Africa. The modeling approach is based on the Water Requirement Satisfaction Index (WRSI) as the fundamental indicator of the performance of agriculture, and uses historical records of food assistance operations to project future potential needs for livelihood protection. By using climate change scenarios as an input to Africa RiskView it is possible, in principle, to evaluate the future impact of climate variability on critical issues such as food security and the overall performance of the envisaged risk management system. A necessary preliminary step to this challenging task is the exploration of the sources of uncertainty affecting the assessment based on modeled climate change scenarios. For this purpose, a limited set of climate models was selected in order to verify the relevance of using climate model output data with Africa RiskView and to explore a minimal range of possible sources of uncertainty. This first evaluation exercise started before the setup of the CORDEX framework and relied on model output available at the time. In particular, only one regional downscaling was available for the entire African continent from the ENSEMBLES project. The analysis shows that current coarse-resolution global climate models cannot directly feed into the Africa RiskView risk-analysis tool. However, regional downscaling may help correct the inherent biases observed in the datasets. Further analysis is performed by using the first data available under the CORDEX framework. In particular, we consider a set of simulations driven with boundary conditions from the reanalysis ERA-Interim to evaluate the skill drought
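
    The WRSI underlying Africa RiskView is, at heart, a seasonal water balance: the index is the ratio of water supplied (actual evapotranspiration) to the crop water requirement. Below is a toy single-bucket sketch; the bucket size, crop coefficients and weather series are invented for illustration and do not reflect WFP's operational implementation.

        import numpy as np

        # WRSI = 100 * (actual evapotranspiration) / (crop water requirement)
        rng = np.random.default_rng(9)
        dekads = 9                            # 10-day periods in the season
        kc = np.linspace(0.4, 1.1, dekads)    # crop coefficient by growth stage
        pet = rng.uniform(35, 55, dekads)     # potential ET per dekad, mm
        rain = rng.gamma(2.0, 12.0, dekads)   # rainfall per dekad, mm

        soil, soil_max = 50.0, 100.0          # soil moisture bucket, mm
        aet_total = wr_total = 0.0
        for k in range(dekads):
            wr = kc[k] * pet[k]                    # water requirement this dekad
            soil = min(soil + rain[k], soil_max)   # recharge, excess runs off
            aet = min(wr, soil)                    # supply-limited actual ET
            soil -= aet
            aet_total += aet
            wr_total += wr

        wrsi = 100.0 * aet_total / wr_total
        print(f"WRSI = {wrsi:.0f} (100 = no water deficit)")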

  6. An Integrated Risk Management Model for Source Water Protection Areas

    OpenAIRE

    Chiueh, Pei-Te; Shang, Wei-Ting; Lo, Shang-Lien

    2012-01-01

    Watersheds are recognized as the most effective management unit for the protection of water resources. For surface water supplies that use water from upstream watersheds, evaluating threats to water quality and implementing a watershed management plan are crucial for the maintenance of drinking water safe for humans. The aim of this article is to establish a risk assessment model that provides basic information for identifying critical pollutants and areas at high risk for degraded water qual...

  7. A risk computation model for environmental restoration activities

    Energy Technology Data Exchange (ETDEWEB)

    Droppo, J.B. Jr.; Strenge, D.L.; Buck, J.W.

    1991-01-01

    A risk computation model useful in environmental restoration activities was developed for the US Department of Energy (DOE). This model, the Multimedia Environmental Pollutant Assessment System (MEPAS), can be used to evaluate effects of potential exposures over a broad range of regulatory issues including radioactive carcinogenic, nonradioactive carcinogenic, and noncarcinogenic effects. MEPAS integrates risk computation components. Release, transport, dispersion, deposition, exposure, and uptake computations are linked in a single system for evaluation of air, surface water, ground water, and overland flow transport. MEPAS uses standard computation approaches. Whenever available and appropriate, US Environmental Protection Agency guidance and models were used to facilitate compatibility and acceptance. MEPAS is a computational tool that can be used at several phases of an environmental restoration effort. At a preliminary stage in problem characterization, potential problems can be prioritized. As more data become available, MEPAS can provide an estimate of baseline risks or evaluate environmental monitoring data. In the feasibility stage, MEPAS can compute risk from alternative remedies. However, MEPAS is not designed to replace a detailed risk assessment of the selected remedy. For major problems, it will be appropriate to use a more detailed, risk computation tool for a detailed, site-specific evaluation of the selected remedy. 15 refs., 2 figs.

  8. Physics-based Entry, Descent and Landing Risk Model

    Science.gov (United States)

    Gee, Ken; Huynh, Loc C.; Manning, Ted

    2014-01-01

    A physics-based risk model was developed to assess the risk associated with thermal protection system (TPS) failures during the entry, descent and landing phase of a manned spacecraft mission. In the model, entry trajectories were computed using a three-degree-of-freedom trajectory tool, the aerothermodynamic heating environment was computed using an engineering-level computational tool and the thermal response of the TPS material was modeled using a one-dimensional thermal response tool. The model was capable of modeling the effect of micrometeoroid and orbital debris (MMOD) impact damage on the TPS thermal response. A Monte Carlo analysis was used to determine the effects of uncertainties in the vehicle state at Entry Interface, aerothermodynamic heating and material properties on the performance of the TPS design. The failure criterion was set as a temperature limit at the bondline between the TPS and the underlying structure. Both direct computation and response surface approaches were used to compute the risk. The model was applied to a generic manned space capsule design. The effects of material property uncertainty and MMOD damage on the risk of failure were analyzed. A comparison of the direct computation and response surface approaches was undertaken.

  9. Validation of a multifactorial risk factor model used for predicting future caries risk with nevada adolescents

    Directory of Open Access Journals (Sweden)

    Mobley Connie

    2011-05-01

    Background: The objective of this study was to measure the validity and reliability of a multifactorial Risk Factor Model developed for use in predicting future caries risk in Nevada adolescents in a public health setting. Methods: This study examined retrospective data from an oral health surveillance initiative that screened over 51,000 students 13-18 years of age, attending public/private schools in Nevada across six academic years (2002/2003-2007/2008). The Risk Factor Model included ten variables: exposure to fluoridation in the municipal water supply, environmental smoke exposure, race, age, locale (metropolitan vs. rural), tobacco use, Body Mass Index, insurance status, sex, and sealant application. Multiple regression was used in a previous study to establish which variables significantly contributed to caries risk. Follow-up logistic regression ascertained the weight of contribution and odds ratios of the ten variables. Researchers in this study computed sensitivity, specificity, positive predictive value (PVP), negative predictive value (PVN), and prevalence across all six years of screening to assess the validity of the Risk Factor Model. Results: Subjects' overall mean caries prevalence across all six years was 66%. Average sensitivity across all six years was 79%; average specificity was 81%; average PVP was 89%; and average PVN was 67%. Conclusions: Overall, the Risk Factor Model provided a relatively constant, valid measure of caries that could be used in conjunction with a comprehensive risk assessment in population-based screenings by school nurses/nurse practitioners, health educators, and physicians to guide them in assessing potential future caries risk for use in prevention and referral practices.
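
    The validity measures reported above follow directly from a 2x2 screening table; the sketch below shows the arithmetic with illustrative counts, not the Nevada data.

        # Screening-table counts: true/false positives, false/true negatives.
        tp, fp, fn, tn = 890, 110, 240, 460   # illustrative values

        sensitivity = tp / (tp + fn)   # proportion of caries cases flagged
        specificity = tn / (tn + fp)   # proportion of caries-free correctly cleared
        pvp = tp / (tp + fp)           # positive predictive value
        pvn = tn / (tn + fn)           # negative predictive value
        prevalence = (tp + fn) / (tp + fp + fn + tn)

        for name, v in [("sensitivity", sensitivity), ("specificity", specificity),
                        ("PVP", pvp), ("PVN", pvn), ("prevalence", prevalence)]:
            print(f"{name}: {v:.0%}")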

  10. Event-based prospective memory in mildly and severely autistic children.

    Science.gov (United States)

    Sheppard, Daniel P; Kvavilashvili, Lia; Ryder, Nuala

    2016-01-01

    There is a growing body of research into the development of prospective memory (PM) in typically developing children, but research is limited in autistic children and rarely includes children with more severe symptoms. This study is the first to specifically compare event-based PM in severely autistic children to mildly autistic and typically developing children. Fourteen mildly autistic children and 14 severely autistic children, aged 5-13 years, were matched for educational attainment with 26 typically developing children aged 5-6 years. Three PM tasks and a retrospective memory task were administered. Results showed that severely autistic children performed less well than typically developing children on two PM tasks, but mildly autistic children did not differ from either group. No group differences were found on the most motivating (a toy reward) task. The findings suggest naturalistic tasks and motivation are important factors in PM success in severely autistic children and highlight the need to consider the heterogeneity of autism and symptom severity in relation to performance on event-based PM tasks. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Valenced cues and contexts have different effects on event-based prospective memory.

    Science.gov (United States)

    Graf, Peter; Yu, Martin

    2015-01-01

    This study examined the separate and joint influences of cue valence and context valence on event-based prospective memory task performance. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral rather than valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influences on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  12. Valenced cues and contexts have different effects on event-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Peter Graf

    This study examined the separate and joint influences of cue valence and context valence on event-based prospective memory task performance. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral rather than valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influences on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  13. Social importance enhances prospective memory: evidence from an event-based task.

    Science.gov (United States)

    Walter, Stefan; Meier, Beat

    2017-07-01

    Prospective memory performance can be enhanced by task importance, for example by promising a reward. Typically, this comes at costs in the ongoing task. However, previous research has suggested that social importance (e.g., providing a social motive) can enhance prospective memory performance without additional monitoring costs in activity-based and time-based tasks. The aim of the present study was to investigate the influence of social importance in an event-based task. We compared four conditions: social importance, promising a reward, both social importance and promising a reward, and standard prospective memory instructions (control condition). The results showed enhanced prospective memory performance for all importance conditions compared to the control condition. Although ongoing task performance was slowed in all conditions with a prospective memory task when compared to a baseline condition with no prospective memory task, additional costs occurred only when both the social importance and reward were present simultaneously. Alone, neither social importance nor promising a reward produced an additional slowing when compared to the cost in the standard (control) condition. Thus, social importance and reward can enhance event-based prospective memory at no additional cost.

  14. Event-based Plausibility Immediately Influences On-line Language Comprehension

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically-relevant lexical knowledge such as selectional restrictions is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients. Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns such as hair when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge, rather than lexical-grammatical knowledge. PMID:21517222

  15. Persistent hemifacial spasm after microvascular decompression: a risk assessment model.

    Science.gov (United States)

    Shah, Aalap; Horowitz, Michael

    2017-06-01

    Microvascular decompression (MVD) for hemifacial spasm (HFS) provides resolution of disabling symptoms such as eyelid twitching and muscle contractions of the entire hemiface. The primary aim of this study was to evaluate the predictive value of patient demographics and spasm characteristics on long-term outcomes, with or without intraoperative lateral spread response (LSR) as an additional variable in a risk assessment model. A retrospective study was undertaken to evaluate the associations of pre-operative patient characteristics, as well as intraoperative LSR and need for a staged procedure, with the presence of persistent or recurrent HFS at the time of hospital discharge and at follow-up. A risk assessment model was constructed with the inclusion of six clinically or statistically significant variables from the univariate analyses. A receiver operating characteristic curve was generated, and area under the curve was calculated to determine the strength of the predictive model. A risk assessment model was first created consisting of significant pre-operative variables (Model 1) (age >50, female gender, history of botulinum toxin use, platysma muscle involvement). This model demonstrated borderline predictive value for persistent spasm at discharge (AUC .60; p=.045) and fair predictive value at follow-up (AUC .75; p=.001). Intraoperative variables (e.g. LSR persistence) demonstrated little additive value (Model 2) (AUC .67). Patients with a higher risk score (three or greater) demonstrated greater odds of persistent HFS at the time of discharge (OR 1.5 [95%CI 1.16-1.97]; p=.035), as well as greater odds of persistent or recurrent spasm at the time of follow-up (OR 3.0 [95%CI 1.52-5.95]; p=.002). Conclusions: A risk assessment model consisting of pre-operative clinical characteristics is useful in prognosticating HFS persistence at follow-up.
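
    As an aside for readers, the mechanics of such a score-based model are easy to make concrete. The sketch below builds an unweighted point score from four binary pre-operative factors and checks its discrimination with an AUC; the data and score generation are invented for illustration and are not the study's records.

        # Sketch: point score from binary pre-operative factors, checked by AUC.
        # Factor names follow the abstract; data here are hypothetical.
        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 200
        # One row per patient: age>50, female, botulinum toxin use, platysma involvement.
        X = rng.integers(0, 2, size=(n, 4))
        score = X.sum(axis=1)                       # unweighted risk score, 0-4

        # Hypothetical outcome generated so that higher scores mean higher risk.
        p = 1.0 / (1.0 + np.exp(-(score - 2.0)))
        y = rng.binomial(1, p)

        print("AUC:", round(roc_auc_score(y, score), 2))
        high = score >= 3                           # the abstract's "three or greater" cut
        print("risk at score>=3:", y[high].mean(), "vs below:", y[~high].mean())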

  16. A quality risk management model approach for cell therapy manufacturing.

    Science.gov (United States)

    Lopez, Fabio; Di Bartolo, Chiara; Piazza, Tommaso; Passannanti, Antonino; Gerlach, Jörg C; Gridelli, Bruno; Triolo, Fabio

    2010-12-01

    International regulatory authorities view risk management as an essential production need for the development of innovative, somatic cell-based therapies in regenerative medicine. The available risk management guidelines, however, provide little guidance on specific risk analysis approaches and procedures applicable in clinical cell therapy manufacturing. This raises a number of problems. Cell manufacturing is a poorly automated process, prone to operator-introduced variations, and affected by heterogeneity of the processed organs/tissues and lot-dependent variability of reagent (e.g., collagenase) efficiency. In this study, the principal challenges faced in a cell-based product manufacturing context (i.e., high dependence on human intervention and absence of reference standards for acceptable risk levels) are identified and addressed, and a risk management model approach applicable to manufacturing of cells for clinical use is described for the first time. The use of the heuristic and pseudo-quantitative failure mode and effect analysis/failure mode and critical effect analysis risk analysis technique associated with direct estimation of severity, occurrence, and detection is, in this specific context, as effective as, but more efficient than, the analytic hierarchy process. Moreover, a severity/occurrence matrix and Pareto analysis can be successfully adopted to identify priority failure modes on which to act to mitigate risks. The application of this approach to clinical cell therapy manufacturing in regenerative medicine is also discussed. © 2010 Society for Risk Analysis.

  17. Reassessing risk models for atypical hyperplasia: age may not matter.

    Science.gov (United States)

    Mazzola, Emanuele; Coopey, Suzanne B; Griffin, Molly; Polubriaginof, Fernanda; Buckley, Julliette M; Parmigiani, Giovanni; Garber, Judy E; Smith, Barbara L; Gadd, Michele A; Specht, Michelle C; Guidi, Anthony; Hughes, Kevin S

    2017-09-01

    The aim of this study was to investigate the influence of age at diagnosis of atypical hyperplasia ("atypia", ductal [ADH], lobular [ALH], or severe ADH) on the risk of developing subsequent invasive breast cancer or ductal carcinoma in situ (DCIS). Using standard survival analysis methods, we retrospectively analyzed 1353 women not treated with chemoprevention among a cohort of 2370 women diagnosed with atypical hyperplasia to determine the risk relationship between age at diagnosis and subsequent breast cancer. For all atypia diagnoses combined, our cohort showed a 5-, 10-, and 15-year risk of invasive breast cancer or DCIS of 0.56, 1.25, and 1.30, respectively, with no significant difference in the (65,75] year age group. For women aged (35,75] years, we observed no significant difference in the 15-year risk of invasive breast cancer or DCIS after atypical hyperplasia, although the baseline risk for a 40-year-old woman is approximately 1/8 the risk of a 70-year-old woman. The risks associated with invasive breast cancer or DCIS for women in our cohort diagnosed with ADH, severe ADH, or ALH, regardless of age, were 7.6% (95% CI 5.9-9.3%) at 5 years, 25.1% (20.7-29.2%) at 10 years, and 40.1% (32.8-46.6%) at 15 years. In contrast to current risk prediction models (e.g., Gail, Tyrer-Cuzick) which assume that the risk of developing breast cancer increases in relation to age at diagnosis of atypia, we found the 15-year cancer risk in our cohort was not significantly different for women between the ages of 35 (excluded) and 75. This implies that the "hits" received by the breast tissue along the "high-risk pathway" to cancer might possibly supersede other factors such as age.

  18. Risk Assessment in Fractured Clayey Tills - Which Modeling Tools?

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Bjerg, Poul Løgstrup; Binning, Philip John

    2012-01-01

    The article presents different tools available for risk assessment in fractured clayey tills and discusses their advantages and limitations. Because of the complex processes occurring during contaminant transport through fractured media, the development of simple practical tools for risk assessment is challenging and the inclusion of the relevant processes is difficult. Furthermore, the lack of long-term monitoring data prevents verification of the accuracy of the different conceptual models. Further investigations based on long-term data and numerical modeling are needed to accurately...

  19. Applying the welfare model to at-own-risk discharges.

    Science.gov (United States)

    Krishna, Lalit Kumar Radha; Menon, Sumytra; Kanesvaran, Ravindran

    2017-08-01

    "At-own-risk discharges" or "self-discharges" evidences an irretrievable breakdown in the patient-clinician relationship when patients leave care facilities before completion of medical treatment and against medical advice. Dissolution of the therapeutic relationship terminates the physician's duty of care and professional liability with respect to care of the patient. Acquiescence of an at-own-risk discharge by the clinician is seen as respecting patient autonomy. The validity of such requests pivot on the assumptions that the patient is fully informed and competent to invoke an at-own-risk discharge and that care up to the point of the at-own-risk discharge meets prevailing clinical standards. Palliative care's use of a multidisciplinary team approach challenges both these assumptions. First by establishing multiple independent therapeutic relations between professionals in the multidisciplinary team and the patient who persists despite an at-own-risk discharge. These enduring therapeutic relationships negate the suggestion that no duty of care is owed the patient. Second, the continued employ of collusion, familial determinations, and the circumnavigation of direct patient involvement in family-centric societies compromises the patient's decision-making capacity and raises questions as to the patient's decision-making capacity and their ability to assume responsibility for the repercussions of invoking an at-own-risk discharge. With the validity of at-own-risk discharge request in question and the welfare and patient interest at stake, an alternative approach to assessing at-own-risk discharge requests are called for. The welfare model circumnavigates these concerns and preserves the patient's welfare through the employ of a multidisciplinary team guided holistic appraisal of the patient's specific situation that is informed by clinical and institutional standards and evidenced-based practice. The welfare model provides a robust decision-making framework for

  20. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas.

    Science.gov (United States)

    Bedford, Tim; Daneshkhah, Alireza; Wilson, Kevin J

    2016-04-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing work of Cooke, Bedford, Kurowicka, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets. © 2015 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
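
    For readers new to vines, the object being constructed can be written down explicitly in the three-variable case. The D-vine (pair-copula) decomposition below is standard in this literature and is shown only for orientation; it is not the article's specific minimum-information construction:

        f(x_1, x_2, x_3) = \prod_{i=1}^{3} f_i(x_i)\,
            c_{12}\big(F_1(x_1), F_2(x_2)\big)\,
            c_{23}\big(F_2(x_2), F_3(x_3)\big)\,
            c_{13|2}\big(F_{1|2}(x_1 \mid x_2), F_{3|2}(x_3 \mid x_2)\big)

    Each c is a bivariate pair-copula density; the nonconstant conditional dependence the authors study enters through the conditional pair c_{13|2}.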

  1. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    Science.gov (United States)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of these environments. Debris fields due to destruction of the launch vehicle are one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and velocity distributions on the strike probability and risk.
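
    The strike-probability logic lends itself to a compact Monte Carlo illustration. The sketch below propagates sampled fragments ballistically and counts hits on a keep-out disc; all distributions and geometry are invented placeholders, not the model's formulae.

        # Sketch: Monte Carlo strike probability for a debris field (illustrative only).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        speed = rng.rayleigh(scale=60.0, size=n)       # imparted speed, m/s (hypothetical)
        angle = rng.uniform(0.0, 2.0*np.pi, size=n)    # direction in the horizontal plane

        # Drift each fragment for t seconds; test against a capsule keep-out disc
        # of radius r_cap centred d metres downrange.
        t, d, r_cap = 5.0, 250.0, 10.0
        x = speed*np.cos(angle)*t
        y = speed*np.sin(angle)*t
        hit = (x - d)**2 + y**2 <= r_cap**2
        print(f"estimated strike probability: {hit.mean():.4f}")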

  2. Calibration plots for risk prediction models in the presence of competing risks

    DEFF Research Database (Denmark)

    Gerds, Thomas A; Andersen, Per K; Kattan, Michael W

    2014-01-01

    ... such as death due to other causes. For personalized medicine and patient counseling, it is necessary to check that the model is calibrated in the sense that it provides reliable predictions for all subjects. There are three often encountered practical problems when the aim is to display or test if a risk prediction model is well calibrated. The first is lack of independent validation data, the second is right censoring, and the third is that when the risk scale is continuous, the estimation problem is as difficult as density estimation. To deal with these problems, we propose to estimate calibration curves...
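
    The grouped-calibration idea can be sketched quickly; note that the toy version below assumes fully observed outcomes, whereas the article's point is precisely that right censoring and competing risks require purpose-built estimators (e.g. of the Aalen-Johansen type) in place of raw event fractions. Data are simulated.

        # Sketch: naive calibration check -- mean predicted vs. observed risk by decile.
        import numpy as np

        rng = np.random.default_rng(2)
        pred = rng.uniform(0.0, 1.0, 1000)     # model-predicted absolute risks
        obs = rng.binomial(1, pred)            # outcomes drawn from those risks

        order = np.argsort(pred)
        for g in np.array_split(order, 10):    # ten groups of increasing predicted risk
            print(f"predicted {pred[g].mean():.2f}   observed {obs[g].mean():.2f}")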

  3. Differences and similarities in breast cancer risk assessment models in clinical practice : which model to choose?

    NARCIS (Netherlands)

    Jacobi, Catharina E.; de Bock, Geertruida H.; Siegerink, Bob; van Asperen, Christi J.

    To show differences and similarities between risk estimation models for breast cancer in healthy women from BRCA1/2-negative or untested families. After a systematic literature search, seven models were selected: Gail-2, Claus Model, Claus Tables, BOADICEA, Jonker Model, Claus-Extended Formula, and

  4. Sensitivity of Coastal Flood Risk Assessments to Digital Elevation Models

    Directory of Open Access Journals (Sweden)

    Bas van de Sande

    2012-07-01

    Most coastal flood risk studies make use of a Digital Elevation Model (DEM) in addition to a projected flood water level in order to estimate the flood inundation and associated damages to property and livelihoods. The resolution and accuracy of a DEM are critical in a flood risk assessment, as land elevation largely determines whether a location will be flooded or will remain dry during a flood event. Especially in low-lying deltaic areas, the land elevation variation is usually in the order of only a few decimeters, and an offset of various decimeters in the elevation data has a significant impact on the accuracy of the risk assessment. Publicly available DEMs are often used in studies for coastal flood risk assessments. The accuracy of these datasets is relatively low, in the order of meters, and is especially low in comparison to the level of accuracy required for a flood risk assessment in a deltaic area. For a coastal zone area in Nigeria (Lagos State) an accurate LiDAR DEM dataset was adopted as ground truth concerning terrain elevation. In the case study, the LiDAR DEM was compared to various publicly available DEMs. The coastal flood risk assessment using various publicly available DEMs was compared to a flood risk assessment using LiDAR DEMs. It can be concluded that the publicly available DEMs do not meet the accuracy requirement of coastal flood risk assessments, especially in coastal and deltaic areas. For this particular case study, the publicly available DEMs highly overestimated the land elevation Z-values and thereby underestimated the coastal flood risk for the Lagos State area. The findings are of interest when selecting data sets for coastal flood risk assessments in low-lying deltaic areas.
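
    The core sensitivity result is easy to reproduce in miniature: add a positive bias and extra noise to a "ground truth" elevation grid and watch a simple bathtub inundation map shrink. The arrays below are synthetic stand-ins for the LiDAR and public DEM rasters, with an invented bias.

        # Sketch: effect of DEM error on a simple "bathtub" flood extent (synthetic).
        import numpy as np

        rng = np.random.default_rng(3)
        lidar = rng.uniform(0.0, 3.0, size=(500, 500))            # m above datum
        public = lidar + rng.normal(1.0, 0.8, size=lidar.shape)   # biased, noisier DEM

        flood_level = 1.5                                         # projected water level, m
        print("flooded fraction, LiDAR DEM: ", (lidar < flood_level).mean())
        print("flooded fraction, public DEM:", (public < flood_level).mean())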

  5. Flood risk assessment model using the fuzzy analytic hierarchy process

    Directory of Open Access Journals (Sweden)

    Marija Kerkez

    2017-07-01

    Sustainable development and natural disasters are closely interlinked. The impact of catastrophic events on the environment is still very difficult to determine, and such losses are generally underestimated. Development is never neutral in relation to catastrophes: it creates, enhances or reduces the risk of their occurrence. Selection of appropriate methods and mathematical models for risk assessment, in relation to the specific features and characteristics of the considered system and the available information and resources, is a key parameter of reliability assessment. Numerous authors have applied AHP methods to flood risk assessment, but very limited literature is available on the use of fuzzy multiobjective analysis in flood studies. In recent years, the fuzzy approach to flood risk assessment has gained greater importance. In this paper, we present a fuzzy analytic hierarchy process (FAHP) model for flood risk assessment. Two flood hazard indexes were defined, one based on natural factors and one based on anthropogenic factors. FAHP is applied to data sets to illustrate the model.
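
    For readers unfamiliar with FAHP mechanics, one widely used variant (Buckley's fuzzy geometric mean with centroid defuzzification) is sketched below; the paper may well use a different scheme, and the comparison matrix here is invented.

        # Sketch: fuzzy AHP weights via Buckley's method. Entries are triangular
        # fuzzy numbers (l, m, u) from a hypothetical 3-factor comparison.
        import numpy as np

        M = np.array([
            [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
            [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
            [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
        ])                                    # shape (3, 3, 3): row, column, (l, m, u)

        g = np.prod(M, axis=1) ** (1.0/3.0)   # fuzzy geometric mean of each row
        total = g.sum(axis=0)                 # componentwise (sum_l, sum_m, sum_u)
        w = g * (1.0 / total[::-1])           # inverting a triangular number flips l and u
        crisp = w.mean(axis=1)                # centroid defuzzification
        print("factor weights:", crisp / crisp.sum())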

  6. Modeling risk and uncertainty in designing reverse logistics problem

    Directory of Open Access Journals (Sweden)

    Aida Nazari Gooran

    2018-01-01

    Increasing attention to environmental problems and social responsibility has brought reverse logistics (RL) issues into supply chain design, a topic that has recently received considerable attention from both academicians and practitioners. In this paper, a multi-product reverse logistics network design model is developed; then a hybrid method including chance-constrained programming, a genetic algorithm and Monte Carlo simulation is proposed to solve the developed model. The proposed model is solved for risk-averse and risk-seeking decision makers using conditional value at risk and the sum of the expected value and standard deviation, respectively. Comparisons of the results show that minimizing costs had no direct relation to the type of decision maker; however, in most cases, risk-seeking decision makers gained more returned products than risk-averse ones. It is clear that by increasing returned products to the chain, production costs of new products and materials will be reduced, and environmental benefits will be created as well.
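
    The risk measure named for the risk-averse case, conditional value at risk (CVaR), reduces to a one-liner under Monte Carlo sampling. The cost distribution below is a placeholder rather than the paper's network-design objective.

        # Sketch: Monte Carlo VaR and CVaR of a hypothetical network cost.
        import numpy as np

        rng = np.random.default_rng(4)
        returns = rng.poisson(lam=800, size=50_000)               # returned products
        cost = 50_000 + 12.0*returns + rng.normal(0, 4_000, size=50_000)

        alpha = 0.95
        var = np.quantile(cost, alpha)        # value at risk at the 95% level
        cvar = cost[cost >= var].mean()       # mean cost over the worst 5% of scenarios
        print(f"VaR(95%)  = {var:,.0f}")
        print(f"CVaR(95%) = {cvar:,.0f}")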

  7. Toxicological risk assessment of complex mixtures through the Wtox model

    Directory of Open Access Journals (Sweden)

    William Gerson Matias

    2015-01-01

    Mathematical models are important tools for environmental management and risk assessment. Predictions about the toxicity of chemical mixtures must be enhanced due to the complexity of effects that can be caused to living species. In this work, the environmental risk was assessed, addressing the need to study the relationship between the organism and xenobiotics. Five toxicological endpoints were therefore applied through the WTox Model, and with this methodology we obtained the risk classification of potentially toxic substances. Acute and chronic toxicity, cytotoxicity and genotoxicity were observed in the organisms Daphnia magna, Vibrio fischeri and Oreochromis niloticus. A case study was conducted with solid wastes from the textile, metal-mechanic, and pulp and paper industries. The results have shown that several industrial wastes induced mortality, reproductive effects, micronucleus formation and increases in the rate of lipid peroxidation and DNA methylation of the organisms tested. These results, analyzed together through the WTox Model, allowed the classification of the environmental risk of industrial wastes. The evaluation showed that the toxicological environmental risk of the samples analyzed can be classified as significant or critical.

  8. Validation of Three Scoring Risk Stratification Models for Thyroid Nodules.

    Science.gov (United States)

    Ha, Su Min; Ahn, Hye Shin; Baek, Jung Hwan; Ahn, Hwa Young; Chung, Yun Jae; Cho, Bo Youn; Park, Sung Bin

    2017-11-06

    To minimize potential harm from overuse of fine-needle aspiration, Thyroid Imaging Reporting and Data Systems (TIRADSs) were developed for thyroid nodule risk stratification. The purpose of this study was to perform validation of three scoring risk stratification models for thyroid nodules using ultrasonography features: a web-based malignancy risk stratification system (http://www.gap.pe.kr/thyroidnodule.php) and those developed by the Korean Society of Thyroid Radiology (KSThR) and the American College of Radiology (ACR). Using ultrasonography images, radiologists assessed thyroid nodules according to the following criteria: internal content, echogenicity of the solid portion, shape, margin, and calcifications. 954 patients (mean age, 50.8 years; range, 13-86 years) with 1112 nodules were evaluated in our institute from January 2013 to December 2014. The discrimination ability of the three models was assessed by estimating the area under the receiver operating characteristic (ROC) curve. Additionally, Hosmer-Lemeshow goodness-of-fit statistics (calibration ability) were used to evaluate the agreement between the observed and expected number of nodules that were benign or malignant. Thyroid malignancy was present in 37.2% of nodules (414/1112). According to the 14-point web-based scoring risk stratification system, malignancy risk ranged from 4.5% to 100.0% and was positively associated with an increase in risk scores. The areas under the ROC curve of the validation set were 0.884 in the web-based, 0.891 in the KSThR, and 0.875 in the ACR scoring risk stratification models. The Hosmer-Lemeshow goodness-of-fit test indicated that the web-based scoring system showed the best-calibrated result with a p value of 0.078. The three scoring risk stratification models using the ultrasonography features of thyroid nodules to stratify malignancy risk showed acceptable predictive accuracy and similar areas under the curve. The web-based scoring system demonstrated
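
    Of the two checks reported, the Hosmer-Lemeshow statistic is the less familiar and worth spelling out; the implementation below follows the usual decile-of-risk form on simulated data, not the study's nodules.

        # Sketch: Hosmer-Lemeshow goodness-of-fit over risk deciles.
        import numpy as np
        from scipy.stats import chi2

        def hosmer_lemeshow(p, y, groups=10):
            """Return the HL chi-square statistic and its p-value."""
            order = np.argsort(p)
            stat = 0.0
            for g in np.array_split(order, groups):
                obs, exp, n = y[g].sum(), p[g].sum(), len(g)
                stat += (obs - exp)**2 / (exp * (1.0 - exp/n))
            return stat, chi2.sf(stat, groups - 2)

        rng = np.random.default_rng(5)
        p = rng.uniform(0.05, 0.95, 1112)     # predicted malignancy risks
        y = rng.binomial(1, p)                # simulated benign/malignant outcomes
        print("HL statistic, p-value:", hosmer_lemeshow(p, y))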

  9. Lifetime lung cancer risks associated with indoor radon exposure based on various radon risk models for the Canadian population.

    Science.gov (United States)

    Chen, Jing

    2017-04-01

    This study calculates and compares the lifetime lung cancer risks associated with indoor radon exposure based on well-known risk models in the literature; two risk models are from joint studies among miners and the other three models were developed from pooling studies on residential radon exposure from China, Europe and North America respectively. The aim of this article is to make clear that the various models are mathematical descriptions of epidemiologically observed real risks in different environmental settings. The risk from exposure to indoor radon is real and it is normal that variations could exist among different risk models even when they were applied to the same dataset. The results show that lifetime risk estimates vary significantly between the various risk models considered here: the model based on the European residential data provides the lowest risk estimates, while models based on the European miners and Chinese residential pooling with complete dosimetry give the highest values. The lifetime risk estimates based on the EPA/BEIR-VI model lie within this range and agree reasonably well with the averages of risk estimates from the five risk models considered in this study. © Crown copyright 2016.

  10. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    Science.gov (United States)

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases.

  11. Covariate selection for the semiparametric additive risk model

    DEFF Research Database (Denmark)

    Martinussen, Torben; Scheike, Thomas

    2009-01-01

    This paper considers covariate selection for the additive hazards model. This model is particularly simple to study theoretically and its practical implementation has several major advantages over the similar methodology for the proportional hazards model. One complication compared with the proportional hazards model ... of observations. We do this by studying the properties of the so-called Dantzig selector in the setting of the additive risk model. Specifically, we establish a bound on how close the solution is to a true sparse signal in the case where the number of covariates is large. In a simulation study, we also compare...
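
    The Dantzig selector itself is just a linear program, which makes the large-p setting easy to demonstrate. The reformulation and toy data below are a generic regression sketch, not the authors' additive-hazards estimator, which is formulated via the model's estimating equations rather than a plain linear model.

        # Sketch: Dantzig selector  min ||b||_1  s.t.  ||X'(y - Xb)||_inf <= lam,
        # linearised with b = u - v, u, v >= 0, and solved as an LP.
        import numpy as np
        from scipy.optimize import linprog

        def dantzig_selector(X, y, lam):
            n, p = X.shape
            G, r = X.T @ X, X.T @ y
            c = np.ones(2*p)                              # objective: sum(u) + sum(v)
            A = np.block([[-G, G], [G, -G]])              # encodes the +/- constraints
            ub = np.concatenate([lam - r, lam + r])
            res = linprog(c, A_ub=A, b_ub=ub, bounds=[(0, None)]*(2*p), method="highs")
            return res.x[:p] - res.x[p:]

        rng = np.random.default_rng(6)
        X = rng.normal(size=(50, 100))                    # more covariates than observations
        beta = np.zeros(100)
        beta[:3] = [3.0, -2.0, 1.5]                       # sparse truth
        y = X @ beta + 0.1*rng.normal(size=50)
        print(np.round(dantzig_selector(X, y, lam=1.0)[:5], 2))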

  12. Affine Monotonic and Risk-Sensitive Models in Dynamic Programming

    OpenAIRE

    Bertsekas, Dimitri

    2016-01-01

    In this paper we consider a broad class of infinite horizon discrete-time optimal control models that involve a nonnegative cost function and an affine mapping in their dynamic programming equation. They include as special cases classical models such as stochastic undiscounted nonnegative cost problems, stochastic multiplicative cost problems, and risk-sensitive problems with exponential cost. We focus on the case where the state space is finite and the control space has some compactness prop...

  13. The Impact of Consumer Phase Models in Microbial Risk Analysis

    DEFF Research Database (Denmark)

    Nauta, Maarten; Christensen, Bjarke Bak

    2011-01-01

    In quantitative microbiological risk assessment (QMRA), the consumer phase model (CPM) describes the part of the food chain between purchase of the food product at retail and exposure. Construction of a CPM is complicated by the large variation in consumer food handling practices and a limited ... where all the CPMs were analyzed using one single input distribution of concentrations at retail, and the same dose-response relationship. It was found that, between CPMs, there may be a considerable difference in the estimated probability of illness per serving. However, the estimated relative risk reductions are less different for scenarios modeling the implementation of control measures. For control measures affecting the Campylobacter prevalence, the relative risk is proportional irrespective of the CPM used. However, for control measures affecting the concentration, the CPMs show some difference...
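
    The final links of such a chain can be shown in a few lines: draw retail concentrations, push them through a crude consumer phase reduction, and apply a dose-response model. All parameter values here are illustrative; real CPMs treat cross-contamination and undercooking in far more detail.

        # Sketch: retail concentration -> consumer phase -> probability of illness.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000
        log_retail = rng.normal(2.0, 1.0, n)              # log10 CFU per portion at retail
        log_dose = log_retail - rng.uniform(1.0, 4.0, n)  # crude CPM: handling/cooking losses
        dose = 10.0**log_dose

        r = 1e-3                                          # exponential dose-response parameter
        p_ill = 1.0 - np.exp(-r*dose)
        print("mean P(illness per serving):", p_ill.mean())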

  14. A Contextual Risk Model for the Ellsberg Paradox

    CERN Document Server

    Aerts, Diederik

    2011-01-01

    The Allais and Ellsberg paradoxes show that the expected utility hypothesis and Savage's Sure-Thing Principle are violated in real life decisions. The popular explanation in terms of 'ambiguity aversion' is not completely accepted. On the other hand, we have recently introduced a notion of 'contextual risk' to mathematically capture what is known as 'ambiguity' in the economics literature. Situations in which contextual risk occurs cannot be modeled by Kolmogorovian classical probabilistic structures, but a non-Kolmogorovian framework with a quantum-like structure is needed. We prove in this paper that the contextual risk approach can be applied to the Ellsberg paradox, and elaborate a 'sphere model' within our 'hidden measurement formalism' which reveals that it is the overall conceptual landscape that is responsible for the disagreement between actual human decisions and the predictions of expected utility theory, which generates the paradox. This result points to the presence of a 'quantum conceptual layer'...

  15. Individualized Risk Model for Venous Thromboembolism After Total Joint Arthroplasty.

    Science.gov (United States)

    Parvizi, Javad; Huang, Ronald; Rezapoor, Maryam; Bagheri, Behrad; Maltenfort, Mitchell G

    2016-09-01

    Venous thromboembolism (VTE) after total joint arthroplasty (TJA) is a potentially fatal complication. Currently, a standard protocol for postoperative VTE prophylaxis is used that makes little distinction between patients at varying risks of VTE. We sought to develop a simple scoring system identifying patients at higher risk for VTE in whom more potent anticoagulation may need to be administered. Utilizing the National Inpatient Sample data, 1,721,806 patients undergoing TJA were identified, among whom 15,775 (0.9%) developed VTE after index arthroplasty. Among the cohort, all known potential risk factors for VTE were assessed. An initial logistic regression model using potential predictors for VTE was performed. Predictors with little contribution or poor predictive power were pruned from the data, and the model was refit. After pruning of variables that had little to no contribution to VTE risk, using the logistic regression, all independent predictors of VTE after TJA were identified in the data. Relative weights for each factor were determined. Hypercoagulability, metastatic cancer, stroke, sepsis, and chronic obstructive pulmonary disease had some of the highest points. Patients with any of these conditions had risk for postoperative VTE that exceeded the 3% rate. Based on the model, an iOS (iPhone operating system) application was developed (VTEstimator) that could be used to assign patients into low or high risk for VTE after TJA. We believe individualization of VTE prophylaxis after TJA can improve the efficacy of preventing VTE while minimizing untoward risks associated with the administration of anticoagulation. Copyright © 2016 Elsevier Inc. All rights reserved.
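
    The published tool is an iOS application, but the underlying idea, a weighted point score with a decision threshold, fits in a few lines. The factor weights and threshold below are invented for illustration and are not VTEstimator's.

        # Sketch: points-based VTE risk screen (weights hypothetical).
        RISK_POINTS = {
            "hypercoagulability": 3,
            "metastatic_cancer": 3,
            "stroke": 2,
            "sepsis": 2,
            "copd": 2,
            "age_over_70": 1,
        }

        def vte_risk(patient_factors, threshold=3):
            score = sum(RISK_POINTS.get(f, 0) for f in patient_factors)
            return score, ("high" if score >= threshold else "low")

        print(vte_risk({"sepsis", "age_over_70"}))    # -> (3, 'high')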

  16. A Soft Intelligent Risk Evaluation Model for Credit Scoring Classification

    Directory of Open Access Journals (Sweden)

    Mehdi Khashei

    2015-09-01

    Risk management is one of the most important branches of business and finance. Classification models are the most popular and widely used analytical group of data mining approaches that can greatly help financial decision makers and managers to tackle credit risk problems. However, the literature clearly indicates that, despite proposing numerous classification models, credit scoring is often a difficult task. On the other hand, there is no universal credit-scoring model in the literature that can be accurately and explanatorily used in all circumstances. Therefore, the research for improving the efficiency of credit-scoring models has never stopped. In this paper, a hybrid soft intelligent classification model is proposed for credit-scoring problems. In the proposed model, the unique advantages of the soft computing techniques are used in order to modify the performance of the traditional artificial neural networks in credit scoring. Empirical results of Australian credit card data classifications indicate that the proposed hybrid model outperforms its components, and also other classification models presented for credit scoring. Therefore, the proposed model can be considered as an appropriate alternative tool for binary decision making in business and finance, especially in high uncertainty conditions.

  17. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.

  18. Risk Prediction Models for Colorectal Cancer: A Systematic Review.

    Science.gov (United States)

    Usher-Smith, Juliet A; Walter, Fiona M; Emery, Jon D; Win, Aung K; Griffin, Simon J

    2016-01-01

    Colorectal cancer is the second leading cause of cancer-related death in Europe and the United States. Survival is strongly related to stage at diagnosis and population-based screening reduces colorectal cancer incidence and mortality. Stratifying the population by risk offers the potential to improve the efficiency of screening. In this systematic review we searched Medline, EMBASE, and the Cochrane Library for primary research studies reporting or validating models to predict future risk of primary colorectal cancer for asymptomatic individuals. A total of 12,808 papers were identified from the literature search and nine through citation searching. Fifty-two risk models were included. Where reported (n = 37), half the models had acceptable-to-good discrimination (the area under the receiver operating characteristic curve, AUROC >0.7) in the derivation sample. Calibration was less commonly assessed (n = 21), but overall acceptable. In external validation studies, 10 models showed acceptable discrimination (AUROC 0.71-0.78). These include two with only three variables (age, gender, and BMI; age, gender, and family history of colorectal cancer). A small number of prediction models developed from case-control studies of genetic biomarkers also show some promise but require further external validation using population-based samples. Further research should focus on the feasibility and impact of incorporating such models into stratified screening programmes. ©2015 American Association for Cancer Research.

  19. Risk factors and prognostic models for perinatal asphyxia at term

    NARCIS (Netherlands)

    Ensing, S.

    2015-01-01

    This thesis will focus on the risk factors and prognostic models for adverse perinatal outcome at term, with a special focus on perinatal asphyxia and obstetric interventions during labor to reduce adverse pregnancy outcomes. For the majority of the studies in this thesis we were allowed to use data

  20. SEMI-COMPETING RISKS ON A TRIVARIATE WEIBULL SURVIVAL MODEL

    Directory of Open Access Journals (Sweden)

    Jenq-Daw Lee

    2008-07-01

    A setting of a trivariate survival function using the semi-competing risks concept is proposed, in which a terminal event can only occur after other events. The Stanford Heart Transplant data is reanalyzed using a trivariate Weibull distribution model with the proposed survival function.

  1. Network Interdependency Modeling for Risk Assessment on Built Infrastructure Systems

    Science.gov (United States)

    2013-10-01

    ... inoperability model (IIM) in the analysis of built infrastructure systems. Previous applications of the IIM characterized infrastructure at the national ... infrastructure risk, as a result of interdependency effects and component decay, changes over time. Such an analysis provides insight to

  2. Driving Strategic Risk Planning With Predictive Modelling For Managerial Accounting

    DEFF Research Database (Denmark)

    Nielsen, Steen; Pontoppidan, Iens Christian

    ... for modelling and computing stochastic input variables; and (iii) illustrate how currently available technology has made this stochastic framework easier. The Global Financial Crisis of the last couple of years has re-accentuated the relevance of a concept of risk, and the need for coherence and interrelations...

  3. Review of methods for modelling forest fire risk and hazard

    African Journals Online (AJOL)

    user

    ... need to identify a method or combination of methods to help model forest fire risk and hazard to enable the sustainability of the natural resources. ... fire behaviour through variations in the amount of solar radiation and wind that different aspects ... drying both the soil and the vegetation. Slope is an extremely important ...

  4. Application of wildfire simulation models for risk analysis

    Science.gov (United States)

    Alan A. Ager; Mark A. Finney

    2009-01-01

    Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of...

  5. Task-based dermal exposure models for regulatory risk assessment

    NARCIS (Netherlands)

    Warren, N.D.; Marquart, H.; Christopher, Y.; Laitinen, J.; Hemmen, J.J. van

    2006-01-01

    The regulatory risk assessment of chemicals requires the estimation of occupational dermal exposure. Until recently, the models used were either based on limited data or were specific to a particular class of chemical or application. The EU project RISKOFDERM has gathered a considerable number of

  6. Doses and models in risk assessment analysis for bronchial hyperresponsiveness

    NARCIS (Netherlands)

    de Marco, R; Bugiani, M; Zanolin, E; Verlato, G; Rijcken, B

    The aims of this study are: (1) to evaluate whether the estimates of the association of risk factors with bronchial hyperresponsiveness (BHR) depend on the accumulated dose administered in challenge tests; and (2) to verify whether a model developed for survival studies (Weibull regression) is

  7. Event-based prospective memory in children with sickle cell disease: effect of cue distinctiveness.

    Science.gov (United States)

    McCauley, Stephen R; Pedroza, Claudia

    2010-01-01

    Event-based prospective memory (EB-PM) is the formation of an intention and remembering to perform it in response to a specific event. Currently, EB-PM performance in children with sickle cell disease (SCD) is unknown. In this study, we designed a computer-based task of EB-PM; No-Stroke, Silent-Infarct, and Overt-Stroke groups performed significantly below the demographically similar control group without SCD. Cue distinctiveness was varied to determine if EB-PM could be improved. All groups, with the exception of the Overt-Stroke group, performed significantly better with a perceptually distinctive cue. Overall, these results suggest that EB-PM can be improved significantly in many children with SCD.

  8. Pinning cluster synchronization in an array of coupled neural networks under event-based mechanism.

    Science.gov (United States)

    Li, Lulu; Ho, Daniel W C; Cao, Jinde; Lu, Jianquan

    2016-04-01

    Cluster synchronization is a typical collective behavior in coupled dynamical systems, where the synchronization occurs within one group, while there is no synchronization among different groups. In this paper, under event-based mechanism, pinning cluster synchronization in an array of coupled neural networks is studied. A new event-triggered sampled-data transmission strategy, where only local and event-triggering states are utilized to update the broadcasting state of each agent, is proposed to realize cluster synchronization of the coupled neural networks. Furthermore, a self-triggered pinning cluster synchronization algorithm is proposed, and a set of iterative procedures is given to compute the event-triggered time instants. Hence, this will reduce the computational load significantly. Finally, an example is given to demonstrate the effectiveness of the theoretical results. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
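
    The event-triggered principle is simplest to see in one dimension: transmit a new sample only when the error between the true state and the last broadcast state crosses a threshold. The scalar simulation below is a generic illustration with invented gains, not the paper's networked pinning scheme.

        # Sketch: event-triggered sampling for a scalar unstable plant x' = a*x + u.
        a, k, dt, threshold = 0.5, 2.0, 0.001, 0.05
        x, x_broadcast, events = 1.0, 1.0, 0
        steps = int(5.0/dt)                       # simulate 5 seconds

        for _ in range(steps):
            if abs(x - x_broadcast) > threshold:  # event condition violated
                x_broadcast = x                   # sample and transmit
                events += 1
            u = -k*x_broadcast                    # control uses the last broadcast state
            x += dt*(a*x + u)                     # forward-Euler step

        print(f"final state {x:.4f} after {events} events out of {steps} steps")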

  9. Stabilization of Networked Distributed Systems with Partial and Event-Based Couplings

    Directory of Open Access Journals (Sweden)

    Sufang Zhang

    2015-01-01

    The stabilization problem of networked distributed systems with partial and event-based couplings is investigated. The channels, which are used to transmit different levels of information about agents, are considered. The channel matrix is introduced to indicate the working state of the channels. An event condition is designed for each channel to govern the sampling instants of the channel. Since the event conditions are given separately for different channels, the sampling instants of the channels are mutually independent. To stabilize the system, state feedback controllers are implemented in the system. The control signals also suffer from the two communication constraints. Sufficient conditions in terms of linear matrix inequalities are proposed to ensure the stabilization of the controlled system. Finally, a numerical example is given to demonstrate the advantage of our results.

  10. Application of Catastrophe Risk Modelling to Evacuation Public Policy

    Science.gov (United States)

    Woo, G.

    2009-04-01

    The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally, evacuation has been viewed as a hazard forecasting issue, and civil authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane appeared to be heading for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast and a chance of a false alarm. A systematic coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a
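
    The cost-benefit core of the argument can be made explicit in a few lines; the decision rule and every number below are purely illustrative.

        # Sketch: risk-based evacuation call -- evacuate when expected avoided loss
        # exceeds the cost of evacuating (all figures hypothetical).
        def should_evacuate(p_event, population, loss_per_person, mitigation, evac_cost):
            expected_avoided_loss = p_event * population * loss_per_person * mitigation
            return expected_avoided_loss > evac_cost

        # Even a 5% strike probability can justify a costly evacuation.
        print(should_evacuate(p_event=0.05, population=100_000,
                              loss_per_person=7_000_000, mitigation=0.8,
                              evac_cost=1_000_000_000))   # True: 2.8e10 > 1e9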

  11. Risk management in organic coffee supply chains : testing the usefulness of critical risk models

    NARCIS (Netherlands)

    Brusselaers, J.F.; Benninga, J.; Hennen, W.H.G.J.

    2011-01-01

    This report documents the findings of the analysis of the supply chain of organic coffee from Uganda to the Netherlands using a Chain Risk Model (CRM). The CRM considers contamination of organic coffee with chemicals as a threat for the supply chain, and analyses the consequences of contamination in

  12. A Hydrological Modeling Framework for Flood Risk Assessment for Japan

    Science.gov (United States)

    Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.

    2016-12-01

    Flooding has been the most frequent natural disaster, claiming lives and imposing significant economic losses on human societies worldwide. Japan, with an annual rainfall of up to approximately 4000 mm, is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure for simulating streamflow. In addition, a Temperature-Index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational timing and convergence of the parameters, a set of a priori parameters is obtained based on the relationships between the model parameters and the physical properties of watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model which use high-resolution Digital Terrain Models to estimate different time-related parameters of the model, such as time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate an a priori estimate of maximum soil moisture capacity, an important parameter of the PDM model. Once the model is calibrated, its performance is examined during Typhoon Nabi, which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated for the extreme precipitation event in 2012 that affected Kyushu. In both cases, quantitative measures show that simulated streamflow shows good agreement with gauge-based observations. The model is employed to simulate thousands of possible flood events for the whole of Japan, which forms the basis for a comprehensive flood risk assessment and loss estimation for the flood insurance industry.
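
    Two of the named ingredients are compact enough to state directly. Below are a degree-day (temperature-index) melt term and the PDM saturated-area relation implied by its Pareto distribution of storage capacities; parameter values are illustrative, not the calibrated ones for Japan.

        # Sketch: temperature-index snowmelt and PDM contributing-area fraction.
        def degree_day_melt(temp_c, ddf=3.0, t_base=0.0):
            """Daily melt in mm, with ddf in mm per degree C per day."""
            return max(0.0, ddf * (temp_c - t_base))

        def pdm_saturated_fraction(critical_capacity, cmax=300.0, b=0.4):
            """Fraction of the basin producing fast runoff: F(c) = 1 - (1 - c/cmax)**b."""
            c = min(max(critical_capacity, 0.0), cmax)
            return 1.0 - (1.0 - c/cmax)**b

        print(degree_day_melt(4.5))                      # 13.5 mm of melt on a +4.5 C day
        print(round(pdm_saturated_fraction(150.0), 3))   # ~0.242 of the basin saturated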

  13. Challenges of Modeling Flood Risk at Large Scales

    Science.gov (United States)

    Guin, J.; Simic, M.; Rowe, J.

    2009-04-01

    Flood risk management is a major concern for many nations and for the insurance sector in places where this peril is insured. A prerequisite for risk management, whether in the public sector or in the private sector, is an accurate estimation of the risk. Mitigation measures and traditional flood management techniques are most successful when the problem is viewed at a large regional scale such that all inter-dependencies in a river network are well understood. From an insurance perspective, the jury is still out on whether flood is an insurable peril. However, with advances in modeling techniques and computer power it is possible to develop models that allow proper risk quantification at the scale suitable for a viable insurance market for flood peril. In order to serve the insurance market, a model has to be event-simulation based and has to provide financial risk estimation that forms the basis for risk pricing, risk transfer and risk management at all levels of the insurance industry at large. In short, for a collection of properties, henceforth referred to as a portfolio, the critical output of the model is an annual probability distribution of economic losses from a single flood occurrence (flood event) or from an aggregation of all events in any given year. In this paper, the challenges of developing such a model are discussed in the context of Great Britain, for which a model has been developed. The model comprises several physically motivated components so that the primary attributes of the phenomenon are accounted for. The first component, the rainfall generator, simulates a continuous series of rainfall events in space and time over thousands of years, which are physically realistic while maintaining the statistical properties of rainfall at all locations over the model domain. A physically based runoff generation module feeds all the rivers in Great Britain, whose total length of stream links amounts to about 60,000 km. A dynamical flow routing

  14. Small scale water recycling systems--risk assessment and modelling.

    Science.gov (United States)

    Diaper, C; Dixon, A; Bulier, D; Fewkes, A; Parsons, S A; Strathern, M; Stephenson, T; Strutt, J

    2001-01-01

    This paper aims to use quantitative risk analysis, risk modelling and simulation modelling tools to assess the performance of a proprietary single-house grey water recycling system. A preliminary Hazard and Operability study (HAZOP) identified the main hazards, both health related and economic, associated with installing the recycling system in a domestic environment. The health related consequences of system failure were associated with the presence of increased concentrations of micro-organisms at the point of use, due to failure of the disinfection system and/or the pump. The risk model was used to assess the increase in the probability of infection for a particular genus of micro-organism, Salmonella spp., during disinfection failure. The increase in the number of cases of infection above the base rate rose from 0.001% during normal operation to 4% for a recycling system with no disinfection. The simulation model was used to examine the possible effects of pump failure. The model indicated that the anaerobic COD release rate in the system storage tank increases over time and dissolved oxygen decreases during this failure mode. These conditions are likely to result in odour problems.

  15. Injury prevention risk communication: A mental models approach

    DEFF Research Database (Denmark)

    Austin, Laurel Cecelia; Fischhoff, Baruch

    2012-01-01

    Individuals' decisions and behaviour can play a critical role in determining both the probability and severity of injury. Behavioural decision research studies people's decision-making processes in terms comparable to scientific models of optimal choices, providing a basis for focusing interventions on the most critical opportunities to reduce risks. That research often seeks to identify the 'mental models' that underlie individuals' interpretations of their circumstances and the outcomes of possible actions. In the context of injury prevention, a mental models approach would ask why people ... and uses examples to discuss how the approach can be used to develop scientifically validated context-sensitive injury risk communications...

  16. Risk-trading in flood management: An economic model.

    Science.gov (United States)

    Chang, Chiung Ting

    2017-09-15

    Although flood management is no longer exclusively a topic of engineering, flood mitigation continues to be associated with hard engineering options. Flood adaptation or the capacity to adapt to flood risk, as well as a demand for internalizing externalities caused by flood risk between regions, complicate flood management activities. Even though integrated river basin management has long been recommended to resolve the above issues, it has proven difficult to apply widely, and sometimes even to bring into existence. This article explores how internalization of externalities as well as the realization of integrated river basin management can be encouraged via the use of a market-based approach, namely a flood risk trading program. In addition to maintaining efficiency of optimal resource allocation, a flood risk trading program may also provide a more equitable distribution of benefits by facilitating decentralization. This article employs a graphical analysis to show how flood risk trading can be implemented to encourage mitigation measures that increase infiltration and storage capacity. A theoretical model is presented to demonstrate the economic conditions necessary for flood risk trading. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Modeling risk of pneumonia epizootics in bighorn sheep

    Science.gov (United States)

    Sells, Sarah N.; Mitchell, Michael S.; Nowak, J. Joshua; Lukacs, Paul M.; Anderson, Neil J.; Ramsey, Jennifer M.; Gude, Justin A.; Krausman, Paul R.

    2015-01-01

    Pneumonia epizootics are a major challenge for management of bighorn sheep (Ovis canadensis), affecting persistence of herds, satisfaction of stakeholders, and allocations of resources by management agencies. Risk factors associated with the disease are poorly understood, making pneumonia epizootics hard to predict; such epizootics are thus managed reactively rather than proactively. We developed a model for herds in Montana that identifies risk factors and addresses biological questions about risk. Using Bayesian logistic regression with repeated measures, we found that private land, weed control using domestic sheep or goats, pneumonia history, and herd density were positively associated with risk of pneumonia epizootics in 43 herds that experienced 22 epizootics out of 637 herd-years from 1979–2013. We defined an area of high risk for pathogen exposure as the area of each herd distribution plus a 14.5-km buffer from that boundary. Within this area, the odds of a pneumonia epizootic increased by >1.5 times per additional unit of private land (unit is the standardized % of private land, where the global mean = 25.58% and SD = 14.53%). Odds were >3.3 times greater if domestic sheep or goats were used for weed control in a herd's area of high risk. If a herd or its neighbors within the area of high risk had a history of a pneumonia epizootic, odds of a subsequent pneumonia epizootic were >10 times greater. Risk greatly increased when herds were at high density, with nearly 15 times greater odds of a pneumonia epizootic compared to when herds were at low density. Odds of a pneumonia epizootic also appeared to decrease following increased spring precipitation (odds = 0.41 per unit increase, where the global mean = 100.18% and SD = 26.97%). Risk was not associated with number of federal sheep and goat allotments, proximity to nearest herds of bighorn sheep, ratio of rams to ewes, percentage of average winter precipitation, or whether herds were of native versus mixed

  18. Modeling risk for SOD nationwide: what are the effects of model choice on risk prediction?

    Science.gov (United States)

    M. Kelly; D. Shaari; Q. Guo; D. Liu

    2006-01-01

    Phytophthora ramorum has the potential to infect many forest types found throughout the United States. Efforts to model the potential habitat for P. ramorum and sudden oak death (SOD) are important for disease regulation and management. Yet, spatial models using identical data can have differing results. In this paper we examine...

  19. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    Energy Technology Data Exchange (ETDEWEB)

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one-year research grant was to extend the two-stage clonal expansion (TSCE) model of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute-exposure modeling but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous, and multiple exposure cases within the framework of the piecewise-constant-parameter two-stage clonal expansion model of carcinogenesis. For acute exposure and multiple exposures in an acute series, either only the first mutation rate is allowed to vary with the dose, or all the parameters are allowed to be dose dependent; for multiple continuous exposures, all the parameters are allowed to vary with the dose. With these analytical functions, it becomes easy to evaluate the risks of cancer, and one can deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures to the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple-exposure versions of the TSCE model were to be used to estimate the beagle dog lung cancer risks. The mathematical equations of the multiple-exposure versions of the TSCE model were developed. A draft manuscript, which is attached, provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure has not been completed due to the fact
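
    Since the report's closed-form survival and hazard expressions are not reproduced in this record, a minimal Monte Carlo sketch of the constant-parameter TSCE process (initiation, clonal expansion, death, and malignant conversion; all rate values below are hypothetical) shows the event structure that such formulas summarize:

        # Minimal Monte Carlo sketch of the two-stage clonal expansion (TSCE)
        # model with constant parameters: normal cells acquire a first mutation
        # at rate nu*X, intermediate cells divide (alpha), die (beta), or
        # acquire the second mutation (mu). Analytical survival/hazard
        # functions can be checked against such simulations.
        import numpy as np

        def first_cancer_time(nu_X=1e-2, alpha=0.1, beta=0.09, mu=1e-3,
                              t_max=80.0, rng=None):
            rng = rng or np.random.default_rng()
            t, intermediates = 0.0, 0
            while t < t_max:
                total = nu_X + intermediates * (alpha + beta + mu)
                t += rng.exponential(1.0 / total)
                if t >= t_max:
                    break
                u = rng.random() * total
                if u < nu_X:
                    intermediates += 1                 # initiation event
                elif u < nu_X + intermediates * alpha:
                    intermediates += 1                 # clonal expansion
                elif u < nu_X + intermediates * (alpha + beta):
                    intermediates -= 1                 # cell death
                else:
                    return t                           # first malignant cell
            return np.inf

        times = [first_cancer_time(rng=np.random.default_rng(i)) for i in range(2000)]
        print("P(cancer by age 80) ~", np.mean(np.isfinite(times)))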

  20. SCORING ASSESSMENT AND FORECASTING MODELS BANKRUPTCY RISK OF COMPANIES

    Directory of Open Access Journals (Sweden)

    SUSU Stefanita

    2014-07-01

    Full Text Available Bankruptcy risk has been the subject of many research studies that aim at identifying the moment of bankruptcy, the factors contributing to this state, and the indicators that best express this orientation. The threats faced by enterprises require managers to maintain continuous knowledge of the economic and financial situation, including vulnerable areas and those with development potential. Managers need to identify and properly manage the threats that would prevent achieving the targets. The methods known in the literature for assessing and evaluating bankruptcy risk are static, functional, strategic, scoring, and non-financial models. This article addresses the internationally known Altman and Conan-Holder models, as well as a model developed at national level by two professors from prestigious universities in our country: the Robu-Mironiuc model. These models are applied to data from the profit and loss account and balance sheet of the Turism Covasna company, on which the bankruptcy risk analysis is performed. The results of the analysis are interpreted while attempting to formulate solutions for the economic and financial viability of the entity.
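
    Of the models mentioned, the Altman Z-score has a standard published form for listed manufacturing firms, sketched below (the financial inputs are illustrative, not Turism Covasna figures):

        # The classic Altman Z-score for publicly traded manufacturing firms
        # (standard coefficients; the inputs below are illustrative).
        def altman_z(working_capital, retained_earnings, ebit,
                     market_value_equity, sales, total_assets, total_liabilities):
            x1 = working_capital / total_assets
            x2 = retained_earnings / total_assets
            x3 = ebit / total_assets
            x4 = market_value_equity / total_liabilities
            x5 = sales / total_assets
            return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

        z = altman_z(200, 350, 180, 900, 1500, 2000, 800)
        zone = "distress" if z < 1.81 else "grey" if z < 2.99 else "safe"
        print(f"Z = {z:.2f} -> {zone} zone")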

  1. Systemic Thinking and Requisite Holism in Mastering Logistics Risks: the Model for Identifying Risks in Organisations and Supply Chain

    National Research Council Canada - National Science Library

    Bojan Rosi; Teodora Ivanuša; Borut Jereb

    2013-01-01

    .... In the scope of supply chain risk research, we identified some key issues in the field, the major issue being the lack of standardization and models, which can make risk management in an organization...

  2. SYSTEMIC THINKING AND REQUISITE HOLISM IN MASTERING LOGISTICS RISKS: THE MODEL FOR IDENTIFYING RISKS IN ORGANISATIONS AND SUPPLY CHAIN

    National Research Council Canada - National Science Library

    Borut Jereb; Teodora Ivanusa; Bojan Rosi

    2013-01-01

    .... In the scope of supply chain risk research, we identified some key issues in the field, the major issue being the lack of standardization and models, which can make risk management in an organization...

  3. Agents, Bayes, and Climatic Risks - a modular modelling approach

    Directory of Open Access Journals (Sweden)

    A. Haas

    2005-01-01

    Full Text Available When insurance firms, energy companies, governments, NGOs, and other agents strive to manage climatic risks, it is by no means clear what the aggregate outcome should and will be. As a framework for investigating this subject, we present the LAGOM model family. It is based on modules depicting learning social agents. For managing climate risks, our agents use second-order probabilities and update them by means of a Bayesian mechanism while differing in priors and risk aversion. The interactions between these modules and the aggregate outcomes of their actions are implemented using further modules. The software system is implemented as a series of parallel processes using the CIAMn approach. It is possible to couple modules irrespective of the language they are written in, the operating system under which they are run, and the physical location of the machine.
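
    A minimal sketch of the "second-order probability" idea (a generic Beta-Bernoulli stand-in, not the LAGOM implementation; the agent parameters are hypothetical): each agent holds a distribution over the probability of a damaging event, updates it by Bayes on a shared record, and differs in prior and risk aversion:

        # Generic Beta-Bernoulli stand-in for agents holding second-order
        # probabilities (a distribution over an event probability) and
        # updating by Bayes; priors and risk aversion differ across agents.
        from dataclasses import dataclass

        @dataclass
        class Agent:
            a: float          # Beta prior pseudo-counts: event years
            b: float          # non-event years
            risk_aversion: float

            def update(self, event_occurred: bool):
                if event_occurred:
                    self.a += 1
                else:
                    self.b += 1

            def certainty_equivalent_rate(self):
                # mean belief, inflated by risk aversion times belief spread
                mean = self.a / (self.a + self.b)
                var = self.a * self.b / ((self.a + self.b) ** 2 * (self.a + self.b + 1))
                return mean + self.risk_aversion * var ** 0.5

        agents = [Agent(1, 9, 0.5), Agent(2, 8, 2.0)]   # differing priors
        for observed in [True, False, False, True]:      # a shared event record
            for ag in agents:
                ag.update(observed)
        print([round(ag.certainty_equivalent_rate(), 3) for ag in agents])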

  4. Agents, Bayes, and Climatic Risks - a modular modelling approach

    Science.gov (United States)

    Haas, A.; Jaeger, C.

    2005-08-01

    When insurance firms, energy companies, governments, NGOs, and other agents strive to manage climatic risks, it is by no means clear what the aggregate outcome should and will be. As a framework for investigating this subject, we present the LAGOM model family. It is based on modules depicting learning social agents. For managing climate risks, our agents use second-order probabilities and update them by means of a Bayesian mechanism while differing in priors and risk aversion. The interactions between these modules and the aggregate outcomes of their actions are implemented using further modules. The software system is implemented as a series of parallel processes using the CIAMn approach. It is possible to couple modules irrespective of the language they are written in, the operating system under which they are run, and the physical location of the machine.

  5. Probabilistic Modeling and Risk Assessment of Cable Icing

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee

    both in relation to ice induced vibrations to assess the fatigue life and in relation to decision making in risk management of bridges exposed to icing. First a basic and preliminary framework for the assessment of cumulative bridge cable fatigue damage due to wind-induced vibrations is presented...... are influencing the two icing mechanisms and their duration. The model is found to be more sensitive to changes in the discretization levels of the input variables. Thirdly the developed operational probabilistic framework for the assessment of the expected number of occurrences of ice/snow accretion on bridge...... of icing events, which is based on the monitoring of environmental conditions and short term forecasting. Decision problems in risk management can be supported by quantifying the value of structural health monitoring (SHM). The approach for the assessment of the value of SHM takes basis in structural risk...

  6. Risk management in industrial projects using structural equation modeling

    Directory of Open Access Journals (Sweden)

    Mohammad Zaripour

    2016-09-01

    Full Text Available This paper presents an empirical investigation to study the effects of different factors influencing the accomplishment of projects in the Iranian oil industry. The proposed study designs a questionnaire consisting of 50 questions on a Likert scale covering seven factors: sanctions, economy, scheduling, contractor management weaknesses, cultural/social, force majeure, and contractee. The study considers the effects of these factors in three categories, namely risk in project scheduling, risk in project cost, and risk from management weakness. Using structural equation modeling, the study confirms that all three factors influence the success of oil projects. The results indicate that budgeting as well as cost accounting is the most important factor in the accomplishment of oil projects, followed by weakness in management and having an appropriate scheduling.

  7. Drinking and condom use: results from an event-based daily diary.

    Science.gov (United States)

    Leigh, Barbara C; Vanslyke, Jan Gaylord; Hoppe, Marilyn J; Rainey, Damian T; Morrison, Diane M; Gillmore, Mary Rogers

    2008-01-01

    Although it is often assumed that drinking alcohol interferes with condom use, most studies on this topic do not meet the conditions required for causal interpretation. We examined the association of drinking with condom use using data from diaries of alcohol use and sexual encounters, collected over 8 weeks from college students and clients of a sexually transmitted disease clinic. This method establishes the temporal relationships between drinking and condom use and controls for individual differences by using a within-subjects analysis. Multilevel models that predicted condom use from alcohol use before the sexual encounter, partner type, and the use of other contraception showed that drinking before sex was unrelated to condom use. These results do not support the persistent notion that alcohol causes people to engage in sexual risk that they would avoid when sober; instead, people tend to follow their usual pattern of condom use, regardless of alcohol use.

  8. Regime switching model for financial data: Empirical risk analysis

    Science.gov (United States)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, the HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to remove the delay between regime switches and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns, for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable distributions, power laws, and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power-law model while remaining practical to implement for VaR measurement.
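
    A compact sketch of the two-stage pipeline follows (on synthetic returns; hmmlearn and scipy are used here as stand-ins for the paper's estimators, and the threshold choice is illustrative):

        # Two-stage sketch: an HMM labels crisis vs. steady regimes, then a
        # generalized Pareto tail fitted to regime-specific losses gives the
        # VaR via the standard peaks-over-threshold formula.
        import numpy as np
        from hmmlearn.hmm import GaussianHMM
        from scipy.stats import genpareto

        rng = np.random.default_rng(1)
        returns = np.concatenate([rng.normal(0, 0.01, 1500),   # steady period
                                  rng.normal(0, 0.04, 500)])   # crisis period
        hmm = GaussianHMM(n_components=2, n_iter=100, random_state=1)
        hmm.fit(returns.reshape(-1, 1))
        states = hmm.predict(returns.reshape(-1, 1))
        crisis = np.argmax(hmm.covars_.ravel())    # higher-variance state

        losses = -returns[states == crisis]
        u = np.quantile(losses, 0.9)               # POT threshold (illustrative)
        excesses = losses[losses > u] - u
        xi, _, sigma = genpareto.fit(excesses, floc=0)

        q, n, n_u = 0.99, losses.size, excesses.size
        var_q = u + sigma / xi * ((n / n_u * (1 - q)) ** (-xi) - 1)
        print(f"crisis-regime 99% VaR ~ {var_q:.3f}")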

  9. FIRESTORM: Modelling the water quality risk of wildfire.

    Science.gov (United States)

    Mason, C. I.; Sheridan, G. J.; Smith, H. G.; Jones, O.; Chong, D.; Tolhurst, K.

    2012-04-01

    Following wildfire, loss of vegetation and changes to soil properties may result in decreases in infiltration rates, less rainfall interception, and higher overland flow velocities. Rainfall events affecting burn areas before vegetation recovers can cause high magnitude erosion events that impact on downstream water quality. For cities and towns that rely upon fire-prone forest catchments for water supply, wildfire impacts on water quality represent a credible risk to water supply security. Quantifying the risk associated with the occurrence of wildfires and the magnitude of water quality impacts has important implications for managing water supplies. At present, no suitable integrative model exists that considers the probabilistic nature of system inputs as well as the range of processes and scales involved in this problem. We present FIRESTORM, a new model currently in development that aims to determine the range of sediment and associated contaminant loads that may be delivered to water supply reservoirs from the combination of wildfire and subsequent rainfall events. This Monte Carlo model incorporates the probabilistic nature of fire ignition, fire weather and rainfall, and includes deterministic models for fire behaviour and locally dominant erosion processes. FIRESTORM calculates the magnitude and associated annual risk of catchment-scale sediment loads associated with the occurrence of wildfire and rainfall generated by two rain event types. The two event types are localised, high intensity, short-duration convective storms, and widespread, longer duration synoptic-scale rainfall events. Initial application and testing of the model will focus on the two main reservoirs supplying water to Melbourne, Australia, both of which are situated in forest catchments vulnerable to wildfire. Probabilistic fire ignition and weather scenarios have been combined using 40 years of fire records and weather observations. These are used to select from a dataset of over 80
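
    The Monte Carlo structure described above can be sketched as follows (every distribution and parameter value below is a hypothetical placeholder, not a FIRESTORM input):

        # Structural sketch of a FIRESTORM-style Monte Carlo year: sample
        # wildfire occurrence, then a post-fire rain event of one of the two
        # types, and accumulate a sediment load to build an annual exceedance
        # distribution. All parameter values are hypothetical.
        import numpy as np

        rng = np.random.default_rng(42)

        def simulate_year():
            if rng.random() > 0.05:                # P(significant fire) per year
                return 0.0
            burned_ha = rng.lognormal(mean=7.0, sigma=1.0)
            if rng.random() < 0.6:                 # convective storm vs synoptic
                intensity = rng.gamma(shape=2.0, scale=15.0)  # mm/h, short
                erosion_t_per_ha = 0.02 * intensity
            else:
                depth = rng.gamma(shape=3.0, scale=20.0)      # mm, long
                erosion_t_per_ha = 0.005 * depth
            return burned_ha * erosion_t_per_ha    # tonnes delivered

        loads = np.array([simulate_year() for _ in range(100_000)])
        for q in (0.9, 0.99):
            print(f"{q:.0%} annual sediment load: {np.quantile(loads, q):,.0f} t")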

  10. A predictive risk model for medical intractability in epilepsy.

    Science.gov (United States)

    Huang, Lisu; Li, Shi; He, Dake; Bao, Weiqun; Li, Ling

    2014-08-01

    This study aimed to investigate early predictors (6 months after diagnosis) of medical intractability in epilepsy. All children …; models were performed to determine the risk factors for developing medical intractability. A receiver operating characteristic (ROC) curve was applied to fit the best compound predictive model. A total of 649 patients were identified, of which 119 (18%) met the study definition of intractable epilepsy at 2 years after diagnosis; the rate of intractable epilepsy in patients with idiopathic syndromes was 12%. Multivariate logistic regression analysis revealed that neurodevelopmental delay, symptomatic etiology, partial seizures, and more than 10 seizures before diagnosis were significant and independent risk factors for intractable epilepsy. The best model to predict medical intractability in epilepsy comprised neurological physical abnormality, age at onset of epilepsy under 1 year, more than 10 seizures before diagnosis, and partial epilepsy; the area under the ROC curve was 0.7797. This model also fitted best in patients with idiopathic syndromes. A predictive model of medically intractable epilepsy composed of only four characteristics is established. This model is comparatively accurate and simple to apply clinically. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Comparison of the Framingham Risk Score, SCORE and WHO/ISH cardiovascular risk prediction models in an Asian population.

    Science.gov (United States)

    Selvarajah, Sharmini; Kaur, Gurpreet; Haniff, Jamaiyah; Cheong, Kee Chee; Hiong, Tee Guat; van der Graaf, Yolanda; Bots, Michiel L

    2014-09-01

    Cardiovascular risk-prediction models are used in clinical practice to identify and treat high-risk populations, and to communicate risk effectively. We assessed the validity and utility of four cardiovascular risk-prediction models in an Asian population of a middle-income country. Data from a national population-based survey of 14,863 participants aged 40 to 65 years, with a follow-up duration of 73,277 person-years was used. The Framingham Risk Score (FRS), SCORE (Systematic COronary Risk Evaluation)-high and -low cardiovascular-risk regions and the World Health Organization/International Society of Hypertension (WHO/ISH) models were assessed. The outcome of interest was 5-year cardiovascular mortality. Discrimination was assessed for all models and calibration for the SCORE models. Cardiovascular risk factors were highly prevalent; smoking 20%, obesity 32%, hypertension 55%, diabetes mellitus 18% and hypercholesterolemia 34%. The FRS and SCORE models showed good agreement in risk stratification. The FRS, SCORE-high and -low models showed good discrimination for cardiovascular mortality, areas under the ROC curve (AUC) were 0.768, 0.774 and 0.775 respectively. The WHO/ISH model showed poor discrimination, AUC=0.613. Calibration of the SCORE-high model was graphically and statistically acceptable for men (χ² goodness-of-fit, p=0.097). The SCORE-low model was statistically acceptable for men (χ² goodness-of-fit, p=0.067). Both SCORE-models underestimated risk in women (p<0.001). The FRS and SCORE-high models, but not the WHO/ISH model can be used to identify high cardiovascular risk in the Malaysian population. The SCORE-high model predicts risk accurately in men but underestimated it in women. Copyright © 2014. Published by Elsevier Ireland Ltd.

  12. Artificial Systems and Models for Risk Covering Operations

    Directory of Open Access Journals (Sweden)

    Laurenţiu Mihai Treapăt

    2017-04-01

    Full Text Available This paper focuses mainly on the roles of artificial intelligence based systems, and especially on risk-covering operations. In this context, the paper offers theoretical explanations built on real-life examples and applications. From a general perspective, the paper enriches its value with a wide discussion of the related subject. The paper aims to revisit models for estimating volatilities and the correlations between the various time series, and to present the RiskMetrics methodology, as explained in a case study. The advantages that VaR estimation offers consist of its ability to quantitatively and numerically express the risk level of a portfolio at a certain moment in time, as well as the risk of an open position (in securities, FX, commodities, or granted loans) belonging to an economic agent or even an individual; hence its role in more efficient capital allocation, in delimiting assumed risk, and as a performance measurement instrument. In this paper and the case study that completes our work, we aim to prove how considerable losses and even bankruptcies can be prevented if VaR is known and applied accordingly. For this reason, universities in Romania should include or expand in their curricula the study of the VaR model as an artificial intelligence tool. The simplicity of the presented case study is most probably the strongest argument of the current work, because it can be understood even by readers who are not very experienced in the risk management field.
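
    The RiskMetrics volatility recursion mentioned above has a standard published form: an exponentially weighted moving average of squared returns with decay factor 0.94 for daily data, from which a one-day parametric VaR follows. A minimal sketch on synthetic returns:

        # The core RiskMetrics recursion: EWMA of squared returns with
        # lambda = 0.94 for daily data, then a normal-quantile VaR.
        import numpy as np

        def ewma_var(returns, lam=0.94, confidence=0.99):
            z = {0.95: 1.645, 0.99: 2.326}[confidence]  # standard normal quantile
            sigma2 = returns[:30].var()                 # seed the recursion
            for r in returns[30:]:
                sigma2 = lam * sigma2 + (1 - lam) * r ** 2
            return z * np.sqrt(sigma2)                  # VaR as a return fraction

        rng = np.random.default_rng(7)
        returns = rng.normal(0, 0.012, 500)             # synthetic daily returns
        print(f"1-day 99% VaR: {ewma_var(returns):.2%} of position value")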

  13. Cost Model for Risk Assessment of Company Operation in Audit

    Directory of Open Access Journals (Sweden)

    S. V.

    2017-12-01

    Full Text Available This article explores an approach to assessing the risk of termination of a company's operations by building a cost model. This model gives auditors information on managers' understanding of the factors influencing changes in the value of assets and liabilities, and on methods to identify those changes more effectively and reliably. Based on this information, the auditor can assess the adequacy of management's use of the assumption of continuity of company operations when preparing financial statements. Financial uncertainty entails real manifestations of factors creating risks of costs and revenue losses which, in the long run, can be a reason for termination of company operations, and which therefore need to be foreseen in the auditor's assessment of the adequacy of the continuity assumption in the financial statements prepared by company management. The purpose of the study is to explore and develop a methodology for using cost models to assess the risk of termination of company operations in an audit. The methodology for assessing audit risk through the analysis of company valuation methods has not previously been dealt with. The review of methodologies for assessing the risks of termination of company operations in the course of an audit supports the conclusion that the use of cost models can be an effective methodology for identifying and assessing such risks. The analysis of the above methods gives an understanding of the existing system for company valuation, integrated into the management system, and of the consequences of its use, i.e., comparison of asset price data with the accounting data and the market value of the assets. Overvalued or undervalued company assets may be a sign of future sale or liquidation of a company, which may signal a high probability of termination of company operations. A wrong choice or application of valuation methods can be indicative of the risk of non

  14. An animal model of differential genetic risk for methamphetamine intake

    Directory of Open Access Journals (Sweden)

    Tamara Phillips

    2015-09-01

    Full Text Available The question of whether genetic factors contribute to risk for methamphetamine (MA) use and dependence has not been intensively investigated. Compared to human populations, genetic animal models offer the advantages of control over genetic family history and drug exposure. Using selective breeding, we created lines of mice that differ in genetic risk for voluntary MA intake and identified the chromosomal addresses of contributory genes. A quantitative trait locus was identified on chromosome 10 that accounts for more than 50% of the genetic variance in MA intake in the selected mouse lines. In addition, behavioral and physiological screening identified differences corresponding with risk for MA intake that have generated hypotheses that are testable in humans. Heightened sensitivity to aversive and certain physiological effects of MA, such as MA-induced reduction in body temperature, are hallmarks of mice bred for low MA intake. Furthermore, unlike MA-avoiding mice, MA-preferring mice are sensitive to rewarding and reinforcing MA effects, and to MA-induced increases in brain extracellular dopamine levels. Gene expression analyses implicate the importance of a network enriched in transcription factor genes, some of which regulate the mu opioid receptor gene, Oprm1, in risk for MA use. Neuroimmune factors appear to play a role in differential response to MA between the mice bred for high and low intake. In addition, chromosome 10 candidate gene studies provide strong support for a trace amine-associated receptor 1 gene, Taar1, polymorphism in risk for MA intake. MA is a trace amine-associated receptor 1 (TAAR1) agonist, and a non-functional Taar1 allele segregates with high MA consumption. Thus, reduced TAAR1 function has the potential to increase risk for MA use. Overall, existing findings support the MA drinking lines as a powerful model for identifying genetic factors involved in determining risk for harmful MA use. Future directions include the

  15. Future bloom and blossom frost risk for Malus domestica considering climate model and impact model uncertainties.

    Science.gov (United States)

    Hoffmann, Holger; Rath, Thomas

    2013-01-01

    The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, to 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners was determined by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021-2050 compared to 1971-2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend. As a consequence, it is however unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078-2087. The projected phenophases advanced by 5.5 d K⁻¹, showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day.

  16. Future bloom and blossom frost risk for Malus domestica considering climate model and impact model uncertainties.

    Directory of Open Access Journals (Sweden)

    Holger Hoffmann

    Full Text Available The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, to 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners was determined by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021-2050 compared to 1971-2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend. As a consequence, it is however unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078-2087. The projected phenophases advanced by 5.5 d K⁻¹, showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day.

  17. Modeling risk-related knowledge in tunneling projects.

    Science.gov (United States)

    Cárdenas, Ibsen Chivatá; Al-Jibouri, Saad S H; Halman, Johannes I M; van Tol, Frits A

    2014-02-01

    Knowledge on failure events and their associated factors, gained from past construction projects, is regarded as potentially extremely useful in risk management. However, a number of circumstances are constraining its wider use. Such knowledge is usually scarce, seldom documented, and even unavailable when it is required. Further, there exists a lack of proven methods to integrate and analyze it in a cost-effective way. This article addresses possible options to overcome these difficulties. Focusing on limited but critical potential failure events, the article demonstrates how knowledge on a number of important potential failure events in tunnel works can be integrated. The problem of unavailable or incomplete information was addressed by gathering judgments from a group of experts. The elicited expert knowledge consisted of failure scenarios and associated probabilistic information. This information was integrated using Bayesian belief-networks-based models that were first customized in order to deal with the expected divergence in judgments caused by epistemic uncertainty of risks. The work described in the article shows that the developed models that integrate risk-related knowledge provide guidance as to the use of specific remedial measures. © 2013 Society for Risk Analysis.
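
    A minimal sketch of a Bayesian belief network of this kind follows (using pgmpy; the structure, node names, and probabilities are illustrative stand-ins for the expert-elicited values in the article):

        # Illustrative Bayesian belief network for a tunnelling failure
        # scenario; node names and probabilities are hypothetical, not the
        # article's elicited values.
        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        model = BayesianNetwork([("SoilAnomaly", "FaceInstability"),
                                 ("Grouting", "FaceInstability")])
        model.add_cpds(
            TabularCPD("SoilAnomaly", 2, [[0.8], [0.2]]),
            TabularCPD("Grouting", 2, [[0.3], [0.7]]),
            TabularCPD("FaceInstability", 2,
                       # P(instability | anomaly, grouting), columns over
                       # parent state combinations (0,0), (0,1), (1,0), (1,1)
                       [[0.99, 0.95, 0.80, 0.40],     # no instability
                        [0.01, 0.05, 0.20, 0.60]],    # instability
                       evidence=["SoilAnomaly", "Grouting"], evidence_card=[2, 2]),
        )
        assert model.check_model()

        posterior = VariableElimination(model).query(
            variables=["FaceInstability"],
            evidence={"SoilAnomaly": 1, "Grouting": 0})
        print(posterior)  # guidance on the value of the remedial measure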

  18. Development of a statistical oil spill model for risk assessment.

    Science.gov (United States)

    Guo, Weijun

    2017-11-01

    To gain a better understanding of the impacts from potential risk sources, we developed an oil spill model using a probabilistic method, which simulates numerous oil spill trajectories under varying environmental conditions. The statistical results were quantified from hypothetical oil spills under multiple scenarios, including the probability of an area being affected, mean oil slick thickness, and the duration of water surface exposure to floating oil. The three sub-indices, together with marine area vulnerability, are merged to compute a composite index characterizing the spatial distribution of risk degree. The integral of the index can be used to identify the overall risk from an emission source. The developed model has been successfully applied in the comparison and selection of an appropriate oil port construction location adjacent to a marine protected area for Phoca largha in China. The results highlight the importance of comparing candidate sites before project construction, since risk estimates from two adjacent potential sources may turn out to be significantly different depending on hydrodynamic conditions and eco-environmental sensitivity. Copyright © 2017. Published by Elsevier Ltd.

  19. Development of a risk-analysis model. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    This report consists of a main body, which provides a presentation of risk analysis and its general and specific application to the needs of the Office of Buildings and Community Systems of the Department of Energy, and several case studies employing the risk-analysis model developed. The highlights include a discussion of how risk analysis is currently used in the private, regulated, and public sectors and how this methodology can be employed to meet the policy-analysis needs of the Office of Buildings and Community Systems of the Department of Energy (BCS/DOE). After a review of the primary methodologies available for risk analysis, it was determined that Monte Carlo simulation techniques provide the greatest degree of visibility into uncertainty in the decision-making process. Although the data-collection requirements can be demanding, the benefits, when compared to other methods, are substantial. The data-collection problem can be significantly reduced, without sacrificing proprietary-information rights, if prior arrangements are made with RD&D contractors to provide responses to reasonable requests for base-case data. A total of three case studies were performed on BCS technologies: a gas-fired heat pump; a 1000 ton/day anaerobic digestion plant; and a district heating and cooling system. The three case studies plus the risk-analysis methodology were issued as separate reports. It is concluded, based on the overall research of risk analysis and the case-study experience, that the risk-analysis methodology has significant potential as a policy-evaluation tool within BCS.

  20. Making Risk Models Operational for Situational Awareness and Decision Support

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R.; Coles, Garill A.; Shoemaker, Steven V.

    2012-06-12

    Modernization of nuclear power operations control systems, in particular the move to digital control systems, creates an opportunity to modernize existing legacy infrastructure and extend plant life. We describe here decision support tools that allow the assessment of different facets of risk and support the optimization of available resources to reduce risk as plants are upgraded and maintained. This methodology could become an integrated part of the design review process and a part of the operations management systems. The methodology can be applied to the design of new reactors such as small modular reactors (SMRs) and be helpful in assessing the risks of different reactor configurations. Our tool provides a low-cost evaluation of alternative configurations and an expanded safety analysis by considering scenarios early in the implementation cycle, where cost impacts can be minimized. The effects of failures can be modeled and thoroughly vetted to understand their potential impact on risk. The process and tools presented here allow for an integrated assessment of risk by supporting traditional defense-in-depth approaches while taking into consideration the insertion of new digital instrument and control systems.

  1. Social models of HIV risk among young adults in Lesotho.

    Science.gov (United States)

    Bulled, Nicola L

    2015-01-01

    Extensive research over the past 30 years has revealed that individual and social determinants impact HIV risk. Even so, prevention efforts focus primarily on individual behaviour change, with little recognition of the dynamic interplay of individual and social environment factors that further exacerbate risk engagement. Drawing on long-term research with young adults in Lesotho, I examine how social environment factors contribute to HIV risk. During preliminary ethnographic analysis, I developed novel scales to measure social control, adoption of modernity, and HIV knowledge. In survey research, I examined the effects of individual characteristics (i.e., socioeconomic status, HIV knowledge, adoption of modernity) and social environment (i.e., social control) on HIV risk behaviours. In addition, I measured the impact of altered environments by taking advantage of an existing situation whereby young adults attending a national college are assigned to either a main campus in a metropolitan setting or a satellite campus in a remote setting, irrespective of the environment in which they were socialised as youth. This arbitrary assignment process generates four distinct groups of young adults with altered or constant environments. Regression models show that lower levels of perceived social control and greater adoption of modernity are associated with HIV risk, controlling for other factors. The impact of social control and modernity varies with environment dynamics.

  2. Development and application of urban stormwater risk analysis modeling system

    Directory of Open Access Journals (Sweden)

    Yuwen ZHOU

    2018-02-01

    Full Text Available In order to evaluate pipeline drainage capacity and urban flood risk objectively, the urban stormwater risk analysis modeling system (USRAMS) was developed based on EPA SWMM and ArcEngine, combined with the principles of the equal flow time-line method and water balance. The principles and functions of USRAMS and the method of assessing pipeline drainage capacity and urban flood risk with USRAMS are described, and the suitability of USRAMS is verified. Taking Cangzhou as an example, USRAMS is used to assess pipeline drainage capacity and urban flood risk, and the positions of bottleneck pipes and the waterlogging risk distribution are expressed accurately and directly in thematic maps. The results show that USRAMS needs only two parameters, namely the runoff coefficient and the catchment time, and adapts well to current conditions in China; the two-dimensional surface waterlogging simulation adopts water-budget theory, so its calculation is stable and fast. The results of USRAMS may provide a reference for urban waterlogging prevention and control and for the planning and upgrading of stormwater pipe network systems.
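
    The two parameters named above (runoff coefficient and catchment time) are the same inputs that drive a classical rational-method capacity check, which can serve as a rough illustration (this is a generic textbook computation, not USRAMS code; the IDF relation and all values are hypothetical):

        # Generic rational-method check (not USRAMS code): design rainfall
        # intensity evaluated at the catchment (concentration) time gives a
        # peak flow to compare against pipe capacity. Values are hypothetical.
        def rational_peak_flow(c_runoff, intensity_mm_per_h, area_ha):
            """Peak flow in m^3/s: Q = C*i*A / 360 (SI unit conversion)."""
            return c_runoff * intensity_mm_per_h * area_ha / 360.0

        def design_intensity(catchment_time_min, a=1200.0, b=10.0):
            """Hypothetical IDF relation i = a / (t + b), in mm/h."""
            return a / (catchment_time_min + b)

        i = design_intensity(catchment_time_min=20)            # ~40 mm/h
        q_peak = rational_peak_flow(c_runoff=0.7, intensity_mm_per_h=i, area_ha=12)
        pipe_capacity = 0.8                                    # m^3/s, assumed
        status = "bottleneck" if q_peak > pipe_capacity else "ok"
        print(f"peak {q_peak:.2f} m3/s -> {status}")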

  3. Estimation of Employee Turnover with Competing Risks Models

    Directory of Open Access Journals (Sweden)

    Grzenda Wioletta

    2015-12-01

    Full Text Available Employee turnover accompanies every business organization, regardless of industry and size. Nowadays, many companies struggle with problems related to the lack of sufficient information about the nature of employee turnover processes. Therefore, comprehensive analysis of these processes is necessary. This article aims to examine the turnover of employees from a big manufacturing company using competing risks models with and without covariates. This technique allows the incorporation of information about the type of employment contract termination. Moreover, the Cox proportional hazards model enables the researcher to analyse simultaneously multiple factors that affect employment duration. One of the major observations is that the level of employee remuneration most strongly differentiates the risk of job resignation.
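
    A minimal cause-specific sketch of such an analysis follows (synthetic data; lifelines is a stand-in for the authors' estimation software): the competing event is treated as censoring when fitting the hazard of resignation:

        # Cause-specific competing risks sketch with lifelines (synthetic
        # data): to model resignation, the competing event "dismissal" and
        # still-active employees are treated as censored.
        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter

        rng = np.random.default_rng(3)
        n = 1000
        df = pd.DataFrame({
            "tenure_months": rng.exponential(36, n).round(1),
            "remuneration": rng.normal(0, 1, n),        # standardized salary
            "event_type": rng.choice(["resigned", "dismissed", "active"], n,
                                     p=[0.35, 0.15, 0.50]),
        })

        # Cause-specific hazard of resignation: other outcomes are censored.
        df["resigned"] = (df["event_type"] == "resigned").astype(int)
        cph = CoxPHFitter()
        cph.fit(df[["tenure_months", "remuneration", "resigned"]],
                duration_col="tenure_months", event_col="resigned")
        cph.print_summary()   # exp(coef) = hazard ratio of resigning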

  4. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

    Full Text Available We describe results of a multi-year effort to strengthen consideration of the human dimension into endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists have worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing diverse bodies of expertise and information that is necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and focusing on biodiversity risk assessment in particular.

  5. A Highly Predictive Risk Model for Pacemaker Implantation After TAVR.

    Science.gov (United States)

    Maeno, Yoshio; Abramowitz, Yigal; Kawamori, Hiroyuki; Kazuno, Yoshio; Kubo, Shunsuke; Takahashi, Nobuyuki; Mangat, Geeteshwar; Okuyama, Kazuaki; Kashif, Mohammad; Chakravarty, Tarun; Nakamura, Mamoo; Cheng, Wen; Friedman, John; Berman, Daniel; Makkar, Raj R; Jilaihawi, Hasan

    2017-10-01

    This study sought to develop a robust and definitive risk model for new permanent pacemaker implantation (PPMI) after SAPIEN 3 (third-generation balloon-expandable valve) (Edwards Lifesciences, Irvine, California) transcatheter aortic valve replacement (TAVR), including calcification in the aortic-valvular complex (AVC). The association between calcium in the AVC and the need for PPMI is poorly delineated after third-generation balloon-expandable valve TAVR. At Cedars-Sinai Heart Institute in Los Angeles, California, a total of 240 patients with severe aortic stenosis underwent third-generation balloon-expandable valve TAVR and had contrast computed tomography. AVC was characterized precisely by leaflet sector and region. The total new PPMI rate was 14.6%. On multivariate analysis, predictors of PPMI included pre-procedural right bundle branch block (RBBB), shorter membranous septum (MS) length, and noncoronary cusp device-landing zone calcium volume (NCC-DLZ CA). Predictive probabilities were generated using this logistic regression model. With the 3 pre-procedural risk factors present, the c-statistic of the model for PPMI was an area under the curve of 0.88, with sensitivity of 77.1% and specificity of 87.1%; this risk model had a high negative predictive value (95.7%). The addition of the procedural factor of device depth to the model, with the parameter of the difference between implantation depth and MS length, combined with RBBB and NCC-DLZ CA, increased the c-statistic to 0.92, sensitivity to 94.3%, specificity to 83.8%, and negative predictive value to 98.8%. CONCLUSIONS: Using a precise characterization of the distribution of calcification in the AVC in a single-center, retrospective study, NCC-DLZ CA was found to be an independent predictor of new PPMI post third-generation balloon-expandable valve TAVR. The findings also reinforce the importance of short MS length, pre

  6. Asymptotic solutions of diffusion models for risk reserves

    Directory of Open Access Journals (Sweden)

    S. Shao

    2003-01-01

    Full Text Available We study a family of diffusion models for risk reserves which account for the investment income earned and for the inflation experienced on claim amounts. After defining the process of the conditional probability of ruin over finite time and imposing the appropriate boundary conditions, classical results from the theory of diffusion processes turn the stochastic differential equation into a special class of initial and boundary value problems defined by a linear diffusion equation. Armed with asymptotic analysis and perturbation theory, we obtain asymptotic solutions of the (possibly degenerate) diffusion models governing the conditional probability of ruin over a finite time in terms of the interest rate.
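
    In LaTeX notation, a generic statement of the kind of initial-boundary value problem such models lead to is sketched below (notation mine, chosen to match the abstract's description, not the paper's exact equations):

        % Generic form: psi(x,t) is the probability of ruin by time t given
        % initial reserve x, with drift mu(x) (premium income plus investment
        % return net of inflation) and diffusion coefficient sigma^2(x).
        \begin{align*}
          \frac{\partial \psi}{\partial t}
            &= \mu(x)\,\frac{\partial \psi}{\partial x}
             + \tfrac{1}{2}\,\sigma^{2}(x)\,\frac{\partial^{2} \psi}{\partial x^{2}},
            && x > 0,\; t > 0,\\
          \psi(0,t) &= 1 \quad (t > 0), \qquad
          \psi(x,0) = 0 \quad (x > 0), \qquad
          \lim_{x \to \infty} \psi(x,t) = 0.
        \end{align*}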

  7. Quantitative Risk Modeling of Fire on the International Space Station

    Science.gov (United States)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  8. Modeling the Risk of Secondary Malignancies after Radiotherapy

    Directory of Open Access Journals (Sweden)

    Uwe Schneider

    2011-11-01

    Full Text Available In developed countries, more than half of all cancer patients receive radiotherapy at some stage in the management of their disease. However, a radiation-induced secondary malignancy can be the price of success if the primary cancer is cured or at least controlled. Therefore, there is increasing concern regarding radiation-related second cancer risks in long-term radiotherapy survivors and a corresponding need to be able to predict cancer risks at high radiation doses. Of particular interest are second cancer risk estimates for new radiation treatment modalities such as intensity modulated radiotherapy, intensity modulated arc-therapy, proton and heavy ion radiotherapy. The long term risks from such modern radiotherapy treatment techniques have not yet been determined and are unlikely to become apparent for many years, due to the long latency time for solid tumor induction. Most information on the dose-response of radiation-induced cancer is derived from data on the A-bomb survivors who were exposed to γ-rays and neutrons. Since, for radiation protection purposes, the dose span of main interest is between zero and one Gy, the analysis of the A-bomb survivors is usually focused on this range. With increasing cure rates, estimates of cancer risk for doses larger than one Gy are becoming more important for radiotherapy patients. Therefore in this review, emphasis was placed on doses relevant for radiotherapy with respect to radiation induced solid cancer. Simple radiation protection models should be used only with extreme care for risk estimates in radiotherapy, since they are developed exclusively for low dose. When applied to scatter radiation, such models can predict only a fraction of observed second malignancies. Better semi-empirical models include the effect of dose fractionation and represent the dose-response relationships more accurately. The involved uncertainties are still huge for most of the organs and tissues. A major reason for

  9. Clostridium Difficile Infection Due to Pneumonia Treatment: Mortality Risk Models.

    Science.gov (United States)

    Chmielewska, M; Zycinska, K; Lenartowicz, B; Hadzik-Błaszczyk, M; Cieplak, M; Kur, Z; Wardyn, K A

    2017-01-01

    One of the most common gastrointestinal infections after antibiotic treatment of community-acquired or nosocomial pneumonia is caused by the anaerobic spore-forming bacterium Clostridium difficile (C. difficile). The aim of this study was to retrospectively assess mortality due to C. difficile infection (CDI) in patients treated for pneumonia. We identified 94 cases of post-pneumonia CDI out of 217 patients with CDI. The mortality issue was addressed by creating mortality risk models using logistic regression and multivariate fractional polynomial analysis. The patients' demographics, clinical features, and laboratory results were taken into consideration. To estimate the influence of the preceding respiratory infection, a pneumonia severity scale was included in the analysis. The analysis yielded two statistically significant and clinically relevant mortality models. The model with the highest prognostic strength entailed age, leukocyte count, serum creatinine and urea concentrations, hematocrit, and coexisting neoplasia or chronic obstructive pulmonary disease. In conclusion, we report two prognostic models, based on clinically relevant factors, which can help predict mortality risk in C. difficile infection secondary to antibiotic treatment of pneumonia. These models could be useful in the preventive tailoring of individual therapy.

  10. Guide for developing conceptual models for ecological risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W., II

    1996-05-01

    Ecological conceptual models are the result of the problem formulation phase of an ecological risk assessment, which is an important component of the Remedial Investigation process. They present hypotheses of how the site contaminants might affect the site ecology. The contaminant sources, release mechanisms, transport media, exposure routes, and endpoint receptors are presented in the form of a flow chart. This guide is for preparing the conceptual models; use of this guide will standardize the models so that they will be of high quality, useful to the assessment process, and sufficiently consistent that connections between sources of exposure and receptors can be extended across operable units (OUs). Generic conceptual models are presented for source, aquatic integrator, groundwater integrator, and terrestrial OUs.

  11. The Risk GP Model: the standard model of prediction in medicine.

    Science.gov (United States)

    Fuller, Jonathan; Flores, Luis J

    2015-12-01

    With the ascent of modern epidemiology in the Twentieth Century came a new standard model of prediction in public health and clinical medicine. In this article, we describe the structure of the model. The standard model uses epidemiological measures-most commonly, risk measures-to predict outcomes (prognosis) and effect sizes (treatment) in a patient population that can then be transformed into probabilities for individual patients. In the first step, a risk measure in a study population is generalized or extrapolated to a target population. In the second step, the risk measure is particularized or transformed to yield probabilistic information relevant to a patient from the target population. Hence, we call the approach the Risk Generalization-Particularization (Risk GP) Model. There are serious problems at both stages, especially with the extent to which the required assumptions will hold and the extent to which we have evidence for the assumptions. Given that there are other models of prediction that use different assumptions, we should not inflexibly commit ourselves to one standard model. Instead, model pluralism should be standard in medical prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Interpreting incremental value of markers added to risk prediction models.

    Science.gov (United States)

    Pencina, Michael J; D'Agostino, Ralph B; Pencina, Karol M; Janssens, A Cecile J W; Greenland, Philip

    2012-09-15

    The discrimination of a risk prediction model measures that model's ability to distinguish between subjects with and without events. The area under the receiver operating characteristic curve (AUC) is a popular measure of discrimination. However, the AUC has recently been criticized for its insensitivity in model comparisons in which the baseline model has performed well. Thus, 2 other measures have been proposed to capture improvement in discrimination for nested models: the integrated discrimination improvement and the continuous net reclassification improvement. In the present study, the authors use mathematical relations and numerical simulations to quantify the improvement in discrimination offered by candidate markers of different strengths as measured by their effect sizes. They demonstrate that the increase in the AUC depends on the strength of the baseline model, which is true to a lesser degree for the integrated discrimination improvement. On the other hand, the continuous net reclassification improvement depends only on the effect size of the candidate variable and its correlation with other predictors. These measures are illustrated using the Framingham model for incident atrial fibrillation. The authors conclude that the increase in the AUC, integrated discrimination improvement, and net reclassification improvement offer complementary information and thus recommend reporting all 3 alongside measures characterizing the performance of the final model.
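
    The two proposed measures have direct empirical definitions, sketched below (the predicted probabilities here are simulated; in practice they would come from the fitted baseline and extended models):

        # Integrated discrimination improvement (IDI) and continuous net
        # reclassification improvement (cNRI) computed from predicted
        # probabilities of a baseline and an extended model.
        import numpy as np

        def idi(p_old, p_new, y):
            """Integrated discrimination improvement."""
            ev, ne = (y == 1), (y == 0)
            return ((p_new[ev].mean() - p_old[ev].mean())
                    - (p_new[ne].mean() - p_old[ne].mean()))

        def continuous_nri(p_old, p_new, y):
            """Continuous NRI (ties counted as downward moves)."""
            up = p_new > p_old
            ev, ne = (y == 1), (y == 0)
            return ((up[ev].mean() - (~up)[ev].mean())
                    + ((~up)[ne].mean() - up[ne].mean()))

        rng = np.random.default_rng(5)
        y = rng.integers(0, 2, 1000)                      # simulated outcomes
        p_old = np.clip(0.3 + 0.2 * y + rng.normal(0, 0.15, 1000), 0.01, 0.99)
        p_new = np.clip(p_old + 0.05 * (2 * y - 1)
                        + rng.normal(0, 0.05, 1000), 0.01, 0.99)
        print(f"IDI = {idi(p_old, p_new, y):.3f}, "
              f"cNRI = {continuous_nri(p_old, p_new, y):.3f}")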

  13. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2009-01-01

    Full Text Available Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and for others such as the unavailability of distributed actuators, individual sensors are usually placed only at a fixed point selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, and it is desirable to reduce the number of commutations of the control signals from security and economic points of view. Therefore, and in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results.
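
    The event-based idea can be reduced to a send-on-delta sketch (illustrative only, not the paper's greenhouse controller): transmit and recompute the control action only when the measurement leaves a deadband:

        # Send-on-delta sketch: a wireless sensor transmits only when the
        # measurement changes by more than a deadband, and the controller
        # recomputes the actuator command only on such events, reducing
        # transmissions and actuator commutations. Values are illustrative.
        def send_on_delta(samples, delta=0.5):
            """Yield (index, value) only when the value moves more than delta."""
            last = None
            for i, v in enumerate(samples):
                if last is None or abs(v - last) > delta:
                    last = v
                    yield i, v

        setpoint, commands = 22.0, 0
        temperature = [21.0, 21.1, 21.2, 23.4, 23.5, 20.1, 20.2, 22.0]
        for i, t in send_on_delta(temperature, delta=1.0):
            u = 1 if t < setpoint else 0      # crude on/off heating command
            commands += 1
            print(f"t={i}: temp={t:.1f} -> heater {'on' if u else 'off'}")
        print(f"{commands} control updates for {len(temperature)} samples")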

  14. Utilization of time varying event-based customer interruption cost load shedding schemes

    Energy Technology Data Exchange (ETDEWEB)

    Wangdee, Wijarn; Billinton, Roy [University of Saskatchewan, Saskatoon (Canada). Power System Research Group, Department of Electrical Engineering

    2005-12-01

    Load curtailments occurring under emergency conditions can have significant monetary impacts on the system customers. Customer satisfaction is becoming increasingly important in the new deregulated electric utility environment, and the customers in some jurisdictions are beginning to receive monetary compensation for power supply failures. Minimizing the customer interruption costs associated with a load curtailment event is an important factor in maintaining customer satisfaction. Customer interruption costs depend on many factors, such as the customer types interrupted, the actual load demand at the time of the outage, the duration of the outage, the time of day, and the day on which the outage occurs. This paper focuses on incorporating these interruption cost factors in a load shedding strategy. The load shedding algorithm was developed using an approximate event-based customer interruption cost evaluation technique to identify and prioritize the distribution feeders on a given bus during an emergency. The developed algorithm incorporates a time-dependent feeder cost priority (FCP) index. The optimum load shedding set determined using the FCP is a feeder or group of feeders that meets a capacity deficiency and results in the lowest customer interruption cost for the specified emergency situation. This paper illustrates the algorithm development for a load shedding scheme and demonstrates the utilization of the technique on a sample load bus. (author)
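
    A greedy stand-in for the FCP-based selection (feeders, loads, and cost rates below are hypothetical) illustrates the optimization: rank feeders by interruption cost per megawatt at the current hour and shed cheapest-first until the deficiency is covered:

        # Greedy sketch of minimum-interruption-cost load shedding: each
        # feeder carries a time-dependent cost priority index ($/MW shed);
        # feeders are shed cheapest-first until the capacity deficiency is met.
        def select_shedding_set(feeders, deficiency_mw):
            # feeders: list of (name, load_mw, cost_per_mw) at the current hour
            ranked = sorted(feeders, key=lambda f: f[2])   # lowest $/MW first
            shed, covered = [], 0.0
            for name, load, cost in ranked:
                if covered >= deficiency_mw:
                    break
                shed.append(name)
                covered += load
            return shed, covered

        feeders = [("residential_A", 12.0, 1.5),   # $/MW low at night
                   ("commercial_B", 8.0, 9.0),
                   ("industrial_C", 15.0, 6.0),
                   ("residential_D", 10.0, 2.0)]
        shed, mw = select_shedding_set(feeders, deficiency_mw=20.0)
        print(f"shed {shed} covering {mw:.1f} MW")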

  15. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Full Text Available Social media is valuable in propagating information during disasters for its timely and available characteristics nowadays, and assists in making decisions when tagged with locations. Considering the ambiguity and inaccuracy in some social data, additional authoritative data are needed for important verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis after the fact. Moreover, current works organize the data from the perspective of the spatial location, but not from the perspective of the disaster, making it difficult to dynamically analyze the disaster. All of the disaster-related data around the affected locations need to be retrieved. To solve these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework and proceeded as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity matching of the geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed here within the GEGIS and shows that the system would be effective when typhoons occur.

  16. Modelling the genetic risk in age-related macular degeneration.

    Directory of Open Access Journals (Sweden)

    Felix Grassmann

    Full Text Available Late-stage age-related macular degeneration (AMD) is a common sight-threatening disease of the central retina affecting approximately 1 in 30 Caucasians. Besides age and smoking, genetic variants from several gene loci have reproducibly been associated with this condition and likely explain a large proportion of disease. Here, we developed a genetic risk score (GRS) for AMD based on 13 risk variants from eight gene loci. The model exhibited good discriminative accuracy, with an area under the curve (AUC) of the receiver-operating characteristic of 0.820, which was confirmed in a cross-validation approach. Noteworthy, younger AMD patients aged below 75 had a significantly higher mean GRS (1.87, 95% CI: 1.69-2.05) than patients aged 75 and above (1.45, 95% CI: 1.36-1.54). Based on five equally sized GRS intervals, we present a risk classification with a relative AMD risk of 64.0 (95% CI: 14.11-1131.96) for individuals in the highest category (GRS 3.44-5.18, 0.5% of the general population) compared to subjects with the most common genetic background (GRS -0.05-1.70, 40.2% of the general population). The highest GRS category identifies AMD patients with a sensitivity of 7.9% and a specificity of 99.9% when compared to the four lower categories. Modeling a general population around 85 years of age, 87.4% of individuals in the highest GRS category would be expected to develop AMD by that age. In contrast, only 2.2% of individuals in the two lowest GRS categories, which represent almost 50% of the general population, are expected to manifest AMD. Our findings underscore the large proportion of AMD cases explained by genetics, particularly for younger AMD patients. The five-category risk classification could be useful for therapeutic stratification or for diagnostic testing purposes once preventive treatment is available.
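
    The underlying GRS computation is straightforward, as sketched below (the variant effect sizes and genotypes are placeholders, not the 13 published AMD variants; only the category cut-points quoted in the abstract are used, and the middle label is mine):

        # Generic genetic risk score: sum of risk-allele counts weighted by
        # log odds ratios, then binned into risk categories. Effect sizes and
        # genotypes below are placeholders; cut-points are from the abstract.
        import numpy as np

        log_or = np.log(np.array([2.5, 1.8, 1.4, 1.2]))  # per-allele ORs
        genotypes = np.array([2, 1, 0, 1])               # risk-allele counts
        grs = float(genotypes @ log_or)

        bins = [(-0.05, 1.70, "most common background"),
                (1.70, 3.44, "intermediate (label mine)"),
                (3.44, 5.18, "highest category")]
        label = next((name for lo, hi, name in bins if lo <= grs < hi),
                     "outside quoted range")
        print(f"GRS = {grs:.2f} -> {label}")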

  17. The temporal version of the pediatric sepsis biomarker risk model.

    Science.gov (United States)

    Wong, Hector R; Weiss, Scott L; Giuliano, John S; Wainwright, Mark S; Cvijanovich, Natalie Z; Thomas, Neal J; Allen, Geoffrey L; Anas, Nick; Bigham, Michael T; Hall, Mark; Freishtat, Robert J; Sen, Anita; Meyer, Keith; Checchia, Paul A; Shanley, Thomas P; Nowak, Jeffrey; Quasney, Michael; Chopra, Arun; Fitzgerald, Julie C; Gedeit, Rainer; Banschbach, Sharon; Beckman, Eileen; Harmon, Kelli; Lahni, Patrick; Lindsell, Christopher J

    2014-01-01

    PERSEVERE is a risk model for estimating mortality probability in pediatric septic shock, using five biomarkers measured within 24 hours of clinical presentation. Here, we derive and test a temporal version of PERSEVERE (tPERSEVERE) that considers biomarker values at the first and third day following presentation to estimate the probability of a "complicated course", defined as persistence of ≥2 organ failures at seven days after meeting criteria for septic shock, or death within 28 days. Biomarkers were measured in the derivation cohort (n = 225) using serum samples obtained during days 1 and 3 of septic shock. Classification and Regression Tree (CART) analysis was used to derive a model to estimate the risk of a complicated course. The derived model was validated in the test cohort (n = 74), and subsequently updated using the combined derivation and test cohorts. A complicated course occurred in 23% of the derivation cohort subjects. The derived model had a sensitivity for a complicated course of 90% (95% CI 78-96), a specificity of 70% (62-77), a positive predictive value of 47% (37-58), and a negative predictive value of 96% (91-99). The area under the receiver operating characteristic curve was 0.85 (0.79-0.90). Similar test characteristics were observed in the test cohort. The updated model had a sensitivity of 91% (81-96), a specificity of 70% (64-76), a positive predictive value of 47% (39-56), and a negative predictive value of 96% (92-99). tPERSEVERE reasonably estimates the probability of a complicated course in children with septic shock. tPERSEVERE could potentially serve as an adjunct to physiological assessments for monitoring how risk for poor outcomes changes during early interventions in pediatric septic shock.
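
    CART derives such a model by recursively splitting subjects on biomarker thresholds, and the event rate in each terminal node serves as the risk estimate. A minimal sketch with scikit-learn on synthetic stand-in data; the actual PERSEVERE biomarkers and tree are not reproduced here.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        # Synthetic stand-in for 5 biomarkers measured on days 1 and 3 (10 columns)
        X = rng.lognormal(size=(225, 10))
        y = rng.random(225) < 0.23          # ~23% event rate, as in the derivation cohort

        cart = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
        risk = cart.predict_proba(X)[:, 1]  # terminal-node event rates as risk estimates
        print(risk[:5])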

  18. Time-based collision risk modeling for air traffic management

    Science.gov (United States)

    Bell, Alan E.

    Since the emergence of commercial aviation in the early part of last century, economic forces have driven a steadily increasing demand for air transportation. Increasing density of aircraft operating in a finite volume of airspace is accompanied by a corresponding increase in the risk of collision, and in response to a growing number of incidents and accidents involving collisions between aircraft, governments worldwide have developed air traffic control systems and procedures to mitigate this risk. The objective of any collision risk management system is to project conflicts and provide operators with sufficient opportunity to recognize potential collisions and take necessary actions to avoid them. It is therefore the assertion of this research that the currency of collision risk management is time. Future Air Traffic Management Systems are being designed around the foundational principle of four dimensional trajectory based operations, a method that replaces legacy first-come, first-served sequencing priorities with time-based reservations throughout the airspace system. This research will demonstrate that if aircraft are to be sequenced in four dimensions, they must also be separated in four dimensions. In order to separate aircraft in four dimensions, time must emerge as the primary tool by which air traffic is managed. A functional relationship exists between the time-based performance of aircraft, the interval between aircraft scheduled to cross some three dimensional point in space, and the risk of collision. This research models that relationship and presents two key findings. First, a method is developed by which the ability of an aircraft to meet a required time of arrival may be expressed as a robust standard for both industry and operations. Second, a method by which airspace system capacity may be increased while maintaining an acceptable level of collision risk is presented and demonstrated for the purpose of formulating recommendations for procedures
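
    The functional relationship asserted here can be sketched probabilistically: if two aircraft are scheduled across a fix at a given interval and each meets its required time of arrival with Gaussian error, the probability that they arrive within a time-based separation minimum follows directly. All numbers below are illustrative assumptions, not values from this research.

        from math import sqrt
        from scipy.stats import norm

        def conflict_probability(interval_s, sigma1_s, sigma2_s, separation_s):
            """P(the two crossing times differ by less than separation_s), given the
            scheduled interval and independent Gaussian time-of-arrival errors."""
            sd = sqrt(sigma1_s**2 + sigma2_s**2)   # sd of the arrival-time difference
            return (norm.cdf(separation_s, interval_s, sd)
                    - norm.cdf(-separation_s, interval_s, sd))

        # Tighter time-of-arrival performance (smaller sigma) permits a shorter
        # interval at the same risk level, i.e. more capacity.
        print(conflict_probability(interval_s=90, sigma1_s=10, sigma2_s=10, separation_s=60))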

  19. Sensitivity Analysis of the Bone Fracture Risk Model

    Science.gov (United States)

    Lewandowski, Beth; Myers, Jerry; Sibonga, Jean Diane

    2017-01-01

    Introduction: The probability of bone fracture during and after spaceflight is quantified to aid in mission planning, to determine required astronaut fitness standards and training requirements, and to inform countermeasure research and design. Probability is quantified with a probabilistic modeling approach where distributions of model parameter values, instead of single deterministic values, capture the parameter variability within the astronaut population, and fracture predictions are probability distributions with a mean value and an associated uncertainty. Because of this uncertainty, the model in its current state cannot discern an effect of countermeasures on fracture probability, for example between use and non-use of bisphosphonates or between spaceflight exercise performed with the Advanced Resistive Exercise Device (ARED) or on devices prior to installation of ARED on the International Space Station. This is thought to be due to the inability to measure key contributors to bone strength, for example, geometry and volumetric distributions of bone mass, with areal bone mineral density (BMD) measurement techniques. To further the applicability of the model, we performed a parameter sensitivity study aimed at identifying the parameter uncertainties that most affect the model forecasts, in order to determine which areas of the model needed enhancements for reducing uncertainty. Methods: The bone fracture risk model (BFxRM), originally published in (Nelson et al.), is a probabilistic model that can assess the risk of astronaut bone fracture. This is accomplished by utilizing biomechanical models to assess the applied loads; utilizing models of spaceflight BMD loss in at-risk skeletal locations; quantifying bone strength through a relationship between areal BMD and bone failure load; and relating fracture risk index (FRI), the ratio of applied load to bone strength, to fracture probability. There are many factors associated with these calculations including
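
    The FRI calculation lends itself to a Monte Carlo sketch: sample applied load and bone strength from parameter distributions, form their ratio, and take the fraction of samples with FRI >= 1 as the fracture probability. The distributions below are invented for illustration and are not BFxRM values.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000
        applied_load  = rng.normal(3000, 600, n)   # fall load in N (placeholder)
        bone_strength = rng.normal(4500, 900, n)   # failure load from aBMD relation (placeholder)

        fri = applied_load / bone_strength         # fracture risk index
        print(f"P(fracture) ~ {np.mean(fri >= 1.0):.4f}, "
              f"FRI = {fri.mean():.2f} +/- {fri.std():.2f}")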

  20. Event-Based Impulsive Control of Continuous-Time Dynamic Systems and Its Application to Synchronization of Memristive Neural Networks.

    Science.gov (United States)

    Zhu, Wei; Wang, Dandan; Liu, Lu; Feng, Gang

    2017-08-18

    This paper investigates exponential stabilization of continuous-time dynamic systems (CDSs) via event-based impulsive control (EIC) approaches, where the impulsive instants are determined by a state-dependent triggering condition. Global exponential stability criteria via EIC are derived for nonlinear and linear CDSs, respectively. It is also shown that there is no Zeno behavior in the resulting closed-loop control system. In addition, the developed event-based impulsive scheme is applied to the synchronization problem of master and slave memristive neural networks. Furthermore, a self-triggered impulsive control scheme is developed to avoid continuous communication between the master system and slave system. Finally, two numerical simulation examples are presented to illustrate the effectiveness of the proposed event-based impulsive controllers.
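
    The mechanism can be seen in a toy scalar example: the state flows under unstable dynamics, and an impulse fires whenever a state-dependent condition is met. The gains and trigger below are arbitrary choices that happen to stabilize, not the paper's criteria; because the state must grow by a fixed factor between impulses, inter-event times are bounded below and the sketch is Zeno-free.

        # Toy event-based impulsive stabilization of dx/dt = a*x: the impulse
        # x <- d*x fires when |x| has grown by factor c since the last impulse.
        a, d, c, dt = 0.5, 0.4, 1.5, 1e-3          # arbitrary illustrative values

        x, x_last, t, n_impulses = 1.0, 1.0, 0.0, 0
        while t < 10.0:
            x += a * x * dt                        # continuous flow (Euler step)
            if abs(x) >= c * abs(x_last):          # state-dependent trigger
                x *= d                             # impulsive update (net factor c*d < 1)
                x_last = x
                n_impulses += 1
            t += dt

        print(f"{n_impulses} impulses, final |x| = {abs(x):.3e}")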

  1. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
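
    The planner/scheduler arrangement described here behaves like a discrete-event simulation: probabilistic events are queued on a time line and processed in order, and rules may enqueue follow-on events that depend on what has already happened. A minimal sketch; the event names, rates, and the complication rule are invented for illustration.

        import heapq, random

        random.seed(0)

        def run_mission(duration_days=180.0, event_rate_per_day=0.01):
            queue, history = [], []
            heapq.heappush(queue, (random.expovariate(event_rate_per_day), "medical_event"))
            while queue:
                t, event = heapq.heappop(queue)     # scheduler: next event in time order
                if t > duration_days:
                    break
                history.append((t, event))
                if event == "medical_event":
                    if random.random() < 0.2:       # rule: event may spawn a complication
                        heapq.heappush(queue, (t + random.expovariate(0.5), "complication"))
                    heapq.heappush(queue, (t + random.expovariate(event_rate_per_day),
                                           "medical_event"))
            return history

        runs = [len(run_mission()) for _ in range(1000)]
        print(sum(runs) / len(runs), "events per simulated mission on average")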

  2. How to Decide on Modeling Details: Risk and Benefit Assessment.

    Science.gov (United States)

    Özilgen, Mustafa

    Mathematical models based on thermodynamic, kinetic, heat, and mass transfer analysis are central to this chapter. Microbial growth, death, and enzyme inactivation models, and the modeling of material properties, including those pertinent to conduction and convection heating; mass transfer, such as diffusion and convective mass transfer; and thermodynamic properties, such as specific heat, enthalpy, Gibbs free energy of formation, and specific chemical exergy, are also needed in this task. The origins, simplifying assumptions, and uses of model equations are discussed in this chapter, together with their benefits. The simplified forms of these models are sometimes referred to as "laws," such as "the first law of thermodynamics" or "Fick's second law." Starting a modeling study with such "laws" without considering the conditions under which they are valid runs the risk of ending up with erroneous conclusions. On the other hand, models that start from fundamental concepts and are simplified with appropriate considerations may offer explanations for phenomena that cannot be obtained just from measurements or unprocessed experimental data. The discussion presented here is strengthened with case studies and references to the literature.

  3. Climate-based risk models for Fasciola hepatica in Colombia.

    Science.gov (United States)

    Valencia-López, Natalia; Malone, John B; Carmona, Catalina Gómez; Velásquez, Luz E

    2012-09-01

    A predictive Fasciola hepatica model, based on the growing degree day-water budget (GDD-WB) concept and the known biological requirements of the parasite, was developed within a geographical information system (GIS) in Colombia. Climate-based forecast index (CFI) values were calculated and represented in a national-scale, climate grid (18 x 18 km) using ArcGIS 9.3. A mask overlay was used to exclude unsuitable areas where mean annual temperature exceeded 25 °C, the upper threshold for development and propagation of the F. hepatica life cycle. The model was then validated and further developed by studies limited to one department in northwest Colombia. F. hepatica prevalence data was obtained from a 2008-2010 survey in 10 municipalities of 6,016 dairy cattle at 673 herd study sites, for which global positioning system coordinates were recorded. The CFI map results were compared to F. hepatica environmental risk models for the survey data points that had over 5% prevalence (231 of the 673 sites) at the 1 km2 scale using two independent approaches: (i) a GIS map query based on satellite data parameters including elevation, enhanced vegetation index and land surface temperature day-night difference; and (ii) an ecological niche model (MaxEnt), for which geographic point coordinates of F. hepatica survey farms were used with BioClim data as environmental variables to develop a probability map. The predicted risk pattern of both approaches was similar to that seen in the forecast index grid. The temporal risk, evaluated by the monthly CFIs and a daily GDD-WB forecast software for 2007 and 2008, revealed a major July-August to January transmission period with considerable inter-annual differences.
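
    The forecast index combines two ingredients: growing degree days above the parasite's development threshold and a water-budget term gating transmission by moisture. A schematic monthly index in that spirit is sketched below; the thresholds and scaling are placeholder assumptions, not the published CFI formula.

        # Schematic GDD-WB style monthly index: degree days above a base
        # temperature, gated by a 0..1 moisture factor; all constants invented.
        T_BASE, T_CAP = 10.0, 25.0        # development threshold / exclusion cap (deg C)

        def monthly_index(tmean_c, precip_mm, pet_mm, days=30):
            if tmean_c > T_CAP:           # masked out: life cycle cannot propagate
                return 0.0
            gdd = max(tmean_c - T_BASE, 0.0) * days
            moisture = min(max((precip_mm - pet_mm) / 100.0, 0.0), 1.0)
            return gdd * moisture

        months = [(18, 160, 90), (22, 60, 120), (16, 200, 80)]   # (T, P, PET) placeholders
        print(sum(monthly_index(*m) for m in months))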

  4. Climate-based risk models for Fasciola hepatica in Colombia

    Directory of Open Access Journals (Sweden)

    Natalia Valencia-López

    2012-09-01

    Full Text Available A predictive Fasciola hepatica model, based on the growing degree day-water budget (GDD-WB) concept and the known biological requirements of the parasite, was developed within a geographical information system (GIS) in Colombia. Climate-based forecast index (CFI) values were calculated and represented in a national-scale climate grid (18 x 18 km) using ArcGIS 9.3. A mask overlay was used to exclude unsuitable areas where mean annual temperature exceeded 25 °C, the upper threshold for development and propagation of the F. hepatica life cycle. The model was then validated and further developed by studies limited to one department in northwest Colombia. F. hepatica prevalence data was obtained from a 2008-2010 survey in 10 municipalities of 6,016 dairy cattle at 673 herd study sites, for which global positioning system coordinates were recorded. The CFI map results were compared to F. hepatica environmental risk models for the survey data points that had over 5% prevalence (231 of the 673 sites) at the 1 km2 scale using two independent approaches: (i) a GIS map query based on satellite data parameters including elevation, enhanced vegetation index and land surface temperature day-night difference; and (ii) an ecological niche model (MaxEnt), for which geographic point coordinates of F. hepatica survey farms were used with BioClim data as environmental variables to develop a probability map. The predicted risk pattern of both approaches was similar to that seen in the forecast index grid. The temporal risk, evaluated by the monthly CFIs and a daily GDD-WB forecast software for 2007 and 2008, revealed a major July-August to January transmission period with considerable inter-annual differences.

  5. A study of preservice elementary teachers enrolled in a discrepant-event-based physical science class

    Science.gov (United States)

    Lilly, James Edward

    This research evaluated the POWERFUL IDEAS IN PHYSICAL SCIENCE (PIiPS) curriculum model used to develop a physical science course taken by preservice elementary teachers. The focus was on the evaluation of discrepant events used to induce conceptual change in relation to students' ideas concerning heat, temperature, and specific heat. Both quantitative and qualitative methodologies were used for the analysis. Data was collected during the 1998 Fall semester using two classes of physical science for elementary school teachers. The traditionally taught class served as the control group and the class using the PIiPS curriculum model was the experimental group. The PIiPS curriculum model was evaluated quantitatively for its influence on students' attitude toward science, anxiety towards teaching science, self efficacy toward teaching science, and content knowledge. An analysis of covariance was performed on the quantitative data to test for significant differences between the means of the posttests for the control and experimental groups while controlling for pretest. It was found that there were no significant differences between the means of the control and experimental groups with respect to changes in their attitude toward science, anxiety toward teaching science and self efficacy toward teaching science. A significant difference between the means of the content examination was found (F(1,28) = 14.202, p = 0.001); however, the result is questionable. The heat and energy module was the target for qualitative scrutiny. Coding for discrepant events was adapted from Appleton's 1996 work on students' responses to discrepant event science lessons. The following qualitative questions were posed for the investigation: (1) what were the ideas of the preservice elementary students prior to entering the classroom regarding heat and energy, (2) how effective were the discrepant events as presented in the PIiPS heat and energy module, and (3) how much does the "risk taking

  6. Improving the prediction model used in risk equalization: cost and diagnostic information from multiple prior years

    NARCIS (Netherlands)

    S.H.C.M. van Veen (Suzanne); R.C. van Kleef (Richard); W.P.M.M. van de Ven (Wynand); R.C.J.A. van Vliet (René)

    2015-01-01

    Currently-used risk-equalization models do not adequately compensate insurers for predictable differences in individuals' health care expenses. Consequently, insurers face incentives for risk rating and risk selection, both of which jeopardize affordability of

  7. Model for assessing cardiovascular risk in a Korean population.

    Science.gov (United States)

    Park, Gyung-Min; Han, Seungbong; Kim, Seon Ha; Jo, Min-Woo; Her, Sung Ho; Lee, Jung Bok; Lee, Moo Song; Kim, Hyeon Chang; Ahn, Jung-Min; Lee, Seung-Whan; Kim, Young-Hak; Kim, Beom-Jun; Koh, Jung-Min; Kim, Hong-Kyu; Choe, Jaewon; Park, Seong-Wook; Park, Seung-Jung

    2014-11-01

    A model for predicting cardiovascular disease in Asian populations is limited. In total, 57 393 consecutive asymptomatic Korean individuals aged 30 to 80 years without a prior history of cardiovascular disease who underwent a general health examination were enrolled. Subjects were randomly classified into the train (n=45 914) and validation (n=11 479) cohorts. Thirty-one possible risk factors were assessed. The cardiovascular event was a composite of cardiovascular death, myocardial infarction, and stroke. In the train cohort, the C-index (95% confidence interval) and Akaike Information Criterion were used to develop the best-fitting prediction model. In the validation cohort, the predicted versus the observed cardiovascular event rates were compared by the C-index and Nam and D'Agostino χ2 statistics. During a median follow-up period of 3.1 (interquartile range, 1.9-4.3) years, 458 subjects had 474 cardiovascular events. In the train cohort, the best-fitting model consisted of age, diabetes mellitus, hypertension, current smoking, family history of coronary heart disease, white blood cell count, creatinine, glycohemoglobin, atrial fibrillation, blood pressure, and cholesterol (C-index=0.757 [0.726-0.788] and Akaike Information Criterion=7207). When this model was tested in the validation cohort, it performed well in terms of discrimination and calibration abilities (C-index=0.760 [0.693-0.828] and Nam and D'Agostino χ2 statistic=0.001 for 3 years; C-index=0.782 [0.719-0.846] and Nam and D'Agostino χ2 statistic=1.037 for 5 years). A risk model based on traditional clinical factors and biomarkers has feasible model performance in predicting cardiovascular events in an asymptomatic Korean population. © 2014 American Heart Association, Inc.
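
    The C-index reported here is the probability that, of two comparable subjects, the one who experiences the event earlier also carries the higher predicted risk. A self-contained sketch on synthetic data is shown below; it ignores tied event times for brevity and is not the study's survival model.

        import numpy as np

        def c_index(time, event, risk):
            """Harrell's C: among usable pairs (earlier time is an observed event),
            the fraction where the earlier failure has the higher predicted risk."""
            conc = ties = usable = 0
            n = len(time)
            for i in range(n):
                for j in range(i + 1, n):
                    if time[i] == time[j]:
                        continue               # tied times skipped for brevity
                    first, other = (i, j) if time[i] < time[j] else (j, i)
                    if not event[first]:
                        continue               # censored before the other: unusable pair
                    usable += 1
                    conc += risk[first] > risk[other]
                    ties += risk[first] == risk[other]
            return (conc + 0.5 * ties) / usable

        rng = np.random.default_rng(0)
        risk = rng.random(200)
        time = rng.exponential(1.0 / (0.1 + risk))    # higher risk -> earlier events
        event = rng.random(200) < 0.8                 # ~20% censoring
        print(f"C-index = {c_index(time, event, risk):.3f}")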

  8. A weight restricted DEA model for FMEA risk prioritization

    Directory of Open Access Journals (Sweden)

    Pauli Adriano de Almada Garcia

    2012-01-01

    Full Text Available In this paper we present a linear programming (LP) approach to risk prioritization in failure mode and effects analysis (FMEA). The LP is a data envelopment analysis (DEA)-based model considering weight restrictions. In an FMEA, three criteria are commonly considered to prioritize the failure modes: occurrence, severity, and detectability. These criteria are on an ordinal scale, commonly varying from 1 to 10, where higher figures indicate worse results. Considering the values established for each criterion, traditional FMEA adopts a Risk Priority Number, calculated as the product of the criteria, which has been widely criticized for its shortcomings. Through the proposed approach, a frontier is established considering the less critical failure modes. Against this frontier, one can establish how much each failure mode must be improved to become relatively acceptable. A simplified case concerning the AFWS of a two-loop PWR power plant is presented to show the applicability of the proposed approach.
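
    For contrast with the criticized RPN product, a weight-restricted DEA scores each failure mode under its most favourable, but bounded, criteria weights; modes on the frontier (efficiency near 1) are the priority candidates. A minimal CCR-style sketch with a single unit input, solved with scipy; the scores and the weight floor eps are illustrative assumptions.

        import numpy as np
        from scipy.optimize import linprog

        # Each failure mode is a DMU with outputs (occurrence, severity,
        # detectability); maximize its weighted score subject to no DMU
        # exceeding 1 and weights floored at eps (the weight restriction).
        scores = np.array([[7, 8, 3],
                           [4, 9, 6],
                           [2, 3, 2],
                           [9, 5, 8]], dtype=float)   # placeholder O, S, D ratings
        eps = 0.01

        def efficiency(k):
            res = linprog(c=-scores[k],               # maximize u . y_k
                          A_ub=scores, b_ub=np.ones(len(scores)),
                          bounds=[(eps, None)] * scores.shape[1])
            return -res.fun

        for k in range(len(scores)):
            print(f"mode {k}: efficiency {efficiency(k):.3f}, RPN {int(np.prod(scores[k]))}")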

  9. Trust and risk: a model for medical education.

    Science.gov (United States)

    Damodaran, Arvin; Shulruf, Boaz; Jones, Philip

    2017-09-01

    Health care delivery, and therefore medical education, is an inherently risky business. Although control mechanisms, such as external audit and accreditation, are designed to manage risk in clinical settings, another approach is 'trust'. The use of entrustable professional activities (EPAs) represents a deliberate way in which this is operationalised as a workplace-based assessment. Once engaged with the concept, clinical teachers and medical educators may have further questions about trust. This narrative overview of the trust literature explores how risk, trust and control intersect with current thinking in medical education, and makes suggestions for potential directions of enquiry. Beyond EPAs, the importance of trust in health care and medical education is reviewed, followed by a brief history of trust research in the wider literature. Interpersonal and organisational levels of trust and a model of trust from the management literature are used to provide the framework with which to decipher trust decisions in health care and medical education, in which risk and vulnerability are inherent. In workplace learning and assessment, the language of 'trust' may offer a more authentic and practical vocabulary than that of 'competency' because clinical and professional risks are explicitly considered. There are many other trust relationships in health care and medical education. At the most basic level, it is helpful to clearly delineate who is the trustor, the trustee, and for what task. Each relationship has interpersonal and organisational elements. Understanding and considered utilisation of trust and control mechanisms in health care and medical education may lead to systems that maturely manage risk while actively encouraging trust and empowerment. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  10. Agreement between event-based and trend-based glaucoma progression analyses.

    Science.gov (United States)

    Rao, H L; Kumbar, T; Kumar, A U; Babu, J G; Senthil, S; Garudadri, C S

    2013-07-01

    To evaluate the agreement between event- and trend-based analyses to determine visual field (VF) progression in glaucoma. VFs of 175 glaucoma eyes with ≥5 VFs were analyzed by the proprietary software of the VF analyzer to determine progression. Agreement (κ) between trend-based analysis of the VF index (VFI) and event-based analysis (glaucoma progression analysis, GPA) was evaluated. For eyes progressing by both event- and trend-based methods, time to progression by the two methods was calculated. Median number of VFs per eye was 7 and median follow-up 7.5 years. GPA classified 101 eyes (57.7%) as stable, 30 eyes (17.1%) as possible and 44 eyes (25.2%) as likely progression. Trend-based analysis classified 122 eyes (69.7%) as stable (slope >-1% per year, or any slope magnitude with P>0.05) and 53 eyes (30.3%) as progressing. Agreement (κ) between the sensitive criteria of GPA (possible and likely progression clubbed together) and trend-based analysis was 0.48, and between the specific criteria of GPA (possible clubbed with no progression) and trend-based analysis was 0.50. In eyes progressing by the sensitive criteria of both methods (42 eyes), median time to progression by GPA (4.9 years) was similar (P=0.30) to that by the trend-based method (5.0 years). This was also similar in eyes progressing by the specific criteria of both methods (25 eyes; 5.6 years versus 5.9 years, P=0.23). Agreement between event- and trend-based progression analysis was moderate. GPA seemed to detect progression earlier than trend-based analysis, but this was not statistically significant.
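
    Cohen's κ, the agreement statistic used here, corrects raw agreement for the agreement expected by chance. A minimal sketch with scikit-learn on made-up per-eye progression calls (1 = progressing):

        from sklearn.metrics import cohen_kappa_score

        # Illustrative per-eye calls from the two methods, not the study data.
        gpa_call   = [0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
        trend_call = [0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

        print(f"kappa = {cohen_kappa_score(gpa_call, trend_call):.2f}")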

  11. Modeling the operational risk in Iranian commercial banks: case study of a private bank

    Science.gov (United States)

    Momen, Omid; Kimiagari, Alimohammad; Noorbakhsh, Eaman

    2012-08-01

    The Basel Committee on Banking Supervision from the Bank for International Settlement classifies banking risks into three main categories including credit risk, market risk, and operational risk. The focus of this study is on the operational risk measurement in Iranian banks. Therefore, issues arising when trying to implement operational risk models in Iran are discussed, and then, some solutions are recommended. Moreover, all steps of operational risk measurement based on Loss Distribution Approach with Iran's specific modifications are presented. We employed the approach of this study to model the operational risk of an Iranian private bank. The results are quite reasonable, comparing the scale of bank and other risk categories.
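
    The Loss Distribution Approach convolves a loss-frequency distribution with a loss-severity distribution, usually by Monte Carlo, and reads the capital charge off a high quantile of the aggregate annual loss. A generic sketch with placeholder Poisson/lognormal parameters, not the bank's fitted values:

        import numpy as np

        rng = np.random.default_rng(42)
        lam, mu, sigma, n_years = 25.0, 10.0, 1.8, 100_000   # placeholder parameters

        counts = rng.poisson(lam, n_years)                   # losses per simulated year
        annual_loss = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])

        print(f"mean annual loss {annual_loss.mean():.3e}, "
              f"99.9% quantile (capital proxy) {np.quantile(annual_loss, 0.999):.3e}")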

  12. Modelling Counterparty Credit Risk in Czech Interest Rate Swaps

    Directory of Open Access Journals (Sweden)

    Lenka Křivánková

    2017-01-01

    Full Text Available According to the Basel Committee's estimate, three quarters of counterparty credit risk losses during the financial crisis in 2008 originated from credit valuation adjustment losses and not from actual defaults. Therefore, from 2015, the Third Basel Accord (EU, 2013a) and (EU, 2013b) instructed banks to calculate the capital requirement for the risk of credit valuation adjustment (CVA). Banks try to model CVA so as to meet the prescribed standards while keeping the impact on their profit as low as possible. In this paper, we model CVA using methods that comply with the prescribed standards and also achieve the smallest possible impact on the bank's earnings. To do so, a data set of interest rate swaps from 2015 is used. The interest rate term structure is simulated using the Hull-White one-factor model and Monte Carlo methods. Then, the probability of default for each counterparty is constructed. A safe level of CVA is reached even though the calculated CVA is lower than the CVA previously used by the bank. This allows a reduction of capital requirements for banks.
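
    The computational pattern is: simulate short-rate paths, revalue the trade along each path to get the expected positive exposure EE(t), and weight by marginal default probabilities, CVA ~ (1 - R) * sum_i EE(t_i) * PD(t_{i-1}, t_i). The sketch below follows that pattern with a constant-parameter Hull-White diffusion and a crude duration proxy for swap value; every number is an illustrative assumption, not a calibrated input.

        import numpy as np

        rng = np.random.default_rng(7)
        # dr = (theta - a*r) dt + sigma dW  (constant theta for simplicity)
        a, sigma, theta, r0 = 0.1, 0.01, 0.003, 0.02
        K, notional, T, steps, paths = 0.02, 1e6, 5.0, 60, 20_000
        hazard, recovery = 0.02, 0.4

        dt = T / steps
        t = np.linspace(dt, T, steps)
        r = np.full(paths, r0)
        ee = np.empty(steps)
        for i in range(steps):
            r += (theta - a * r) * dt + sigma * np.sqrt(dt) * rng.standard_normal(paths)
            value = notional * (r - K) * (T - t[i])   # crude duration proxy for a payer swap
            ee[i] = np.maximum(value, 0.0).mean()     # expected exposure

        pd_slice = np.exp(-hazard * (t - dt)) - np.exp(-hazard * t)   # P(default in slice)
        print(f"CVA ~ {(1 - recovery) * np.sum(ee * pd_slice):,.0f}")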

  13. Permafrost degradation risk zone assessment using simulation models

    Directory of Open Access Journals (Sweden)

    R. P. Daanen

    2011-11-01

    Full Text Available In this proof-of-concept study we focus on linking large scale climate and permafrost simulations to small scale engineering projects, bridging the gap between climate and permafrost sciences on the one hand and technical recommendations for adapting planned infrastructure to climate change on the other, in a region generally underlain by permafrost. We present the current and future state of permafrost in Greenland as modelled numerically with the GIPL model driven by HIRHAM climate projections up to 2080. We develop a concept called Permafrost Thaw Potential (PTP), defined as the potential active layer increase due to climate warming and surface alterations. PTP is then used in a simple risk assessment procedure useful for engineering applications. The modelling shows that climate warming will result in continuing wide-spread permafrost warming and degradation in Greenland, in agreement with present observations. We provide examples of application of the risk zone assessment approach for the two towns of Sisimiut and Ilulissat, both classified with high PTP.

  14. Systemic Thinking and Requisite Holism in Mastering Logistics Risks: the Model for Identifying Risks in Organisations and Supply Chain

    OpenAIRE

    Bojan Rosi; Teodora Ivanuša; Borut Jereb

    2013-01-01

    Risks in logistic processes represent one of the major issues in supply chain management nowadays. Every organization strives for success, and uninterrupted operations are the key factors in achieving this goal, which cannot be achieved without efficient risk management. In the scope of supply chain risk research, we identified some key issues in the field, the major issue being the lack of standardization and models, which can make risk management in an organization easier and more efficient...

  15. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    Energy Technology Data Exchange (ETDEWEB)

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN77 ecological risk computer model--ECORSK.5--has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model is dependent on the use of the geographic information system and associated software--ARC/INFO--and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
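
    The modified EPA Quotient Method referenced here screens each contaminant as a hazard quotient (estimated intake divided by a toxicity reference value) and sums the quotients into a hazard index that is compared against risk evaluation criteria. A schematic sketch; the contaminants, intakes, and reference values are placeholders, not FIMAD data.

        # Quotient-method screen: HQ = intake / toxicity reference value, HI = sum(HQ).
        daily_intake = {"Cs-137": 0.8, "Pb": 0.05, "Hg": 0.002}   # placeholder intakes
        trv          = {"Cs-137": 2.0, "Pb": 0.10, "Hg": 0.010}   # placeholder reference values

        hq = {c: daily_intake[c] / trv[c] for c in daily_intake}
        hi = sum(hq.values())
        print(hq)
        print(f"hazard index = {hi:.2f}", "(potential impact)" if hi >= 1 else "(below screen)")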

  16. Permafrost Degradation Risk Zone Assessment using Simulation Models

    DEFF Research Database (Denmark)

    Daanen, R.P.; Ingeman-Nielsen, Thomas; Marchenko, S.

    2011-01-01

    In this proof-of-concept study we focus on linking large scale climate and permafrost simulations to small scale engineering projects, bridging the gap between climate and permafrost sciences on the one hand and technical recommendations for adaptation of planned infrastructures on the other. We develop a concept called Permafrost Thaw Potential (PTP), defined as the potential active layer increase due to climate warming and surface alterations. PTP is then used in a simple risk assessment procedure useful for engineering applications. The modelling shows that climate warming will result in continuing wide-spread permafrost warming and degradation in Greenland.

  17. Model-based risk analysis of coupled process steps.

    Science.gov (United States)

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regard to process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and the design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification. Copyright © 2013 Wiley Periodicals, Inc.

  18. Modelling surface water flood risk using coupled numerical and physical modelling techniques

    Science.gov (United States)

    Green, D. L.; Pattison, I.; Yu, D.

    2015-12-01

    Surface water (pluvial) flooding occurs due to intense precipitation events where rainfall cannot infiltrate into the sub-surface or drain via storm water systems. The perceived risk appears to have increased in recent years, with pluvial flood events seeming more severe and frequent within the UK. Surface water flood risk currently accounts for one third of all UK flood risk, with approximately two million people living in urban areas being at risk of a 1 in 200 year flood event. Surface water flooding research often focuses upon using 1D, 2D or 1D-2D coupled numerical modelling techniques to understand the extent, depth and severity of actual or hypothetical flood scenarios. Although much research has been conducted using numerical modelling, the field data available for model calibration and validation are limited due to the complexities associated with data collection in surface water flood conditions. Ultimately, the data upon which numerical models are based are often erroneous and inconclusive. Physical models offer an alternative and innovative environment to collect data within. A controlled, closed system allows independent variables to be altered individually to investigate cause and effect relationships. Despite this, physical modelling approaches are seldom used in surface water flooding research. Scaled laboratory experiments using a 9 m2, two-tiered physical model consisting of: (i) a mist nozzle type rainfall simulator able to simulate a range of rainfall intensities similar to those observed within the United Kingdom, and; (ii) a fully interchangeable, scaled plot surface have been conducted to investigate and quantify the influence of factors such as slope, impermeability, building density/configuration and storm dynamics on overland flow and rainfall-runoff patterns within a range of terrestrial surface conditions. Results obtained within the physical modelling environment will be compared with numerical modelling results using FloodMap (Yu & Lane, 2006

  19. Modeling urban flood risk territories for Riga city

    Science.gov (United States)

    Piliksere, A.; Sennikovs, J.; Virbulis, J.; Bethers, U.; Bethers, P.; Valainis, A.

    2012-04-01

    Riga, the capital of Latvia, is located on the River Daugava at the Gulf of Riga. The main flooding risks for Riga city are: (1) storm-induced water set-up in the southern part of the Gulf of Riga (storm event), (2) water level increases caused by Daugava River discharge maxima (spring snow-melt event), and (3) strong rainfall or rapid snow melting in densely populated urban areas. The first two flooding factors were discussed previously (Piliksere et al, 2011). The aims of the study were (1) the identification of flood risk situations in densely populated areas, (2) the quantification of the flooding scenarios caused by rain and snow-melt events of different return periods nowadays, in the near future (2021-2050) and far future (2071-2100), taking into account the projections of climate change, (3) the estimation of the groundwater level for Riga city, (4) the building and calibration of a hydrological mathematical model based on SWMM (EPA, 2004) for the domain potentially vulnerable to rain and snow-melt flooding events, (5) the calculation of rain and snow-melt flood events with different return periods, and (6) mapping the potentially flooded areas on a fine grid. The time series of short-term precipitation events during the warm period of the year (i.e., rain events) were analyzed for a 35-year period. Annual maxima of precipitation intensity for events of different durations (5 min; 15 min; 1 h; 3 h; 6 h; 12 h; 1 day; 2 days; 4 days; 10 days) were calculated. The time series of long-term simultaneous precipitation data and observations of the reduction in thickness of the snow cover were analyzed for a 27-year period. Snow-thawing periods were detected and maxima of snow-melting intensity for events of different durations (1 day; 2 days; 4 days; 7 days; 10 days) were calculated. According to the occurrence probability, six scenarios for each event for nowadays, the near and the far future, with return periods of once in 5, 10, 20, 50, 100 and 200 years, were constructed based on
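
    Return levels of this kind are commonly obtained by fitting an extreme-value distribution to the annual maxima. A sketch using a Gumbel fit with scipy on synthetic maxima; the study's data and chosen distribution may differ.

        from scipy.stats import gumbel_r

        # Synthetic 35-year record of annual maximum event precipitation (mm)
        annual_max_mm = gumbel_r.rvs(loc=40, scale=12, size=35, random_state=3)

        loc, scale = gumbel_r.fit(annual_max_mm)
        for T in (5, 10, 20, 50, 100, 200):          # return periods used in the scenarios
            print(f"1-in-{T:>3} yr event: {gumbel_r.ppf(1 - 1/T, loc, scale):.1f} mm")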

  20. Pediatric Sepsis Biomarker Risk Model-II: Redefining the Pediatric Sepsis Biomarker Risk Model With Septic Shock Phenotype.

    Science.gov (United States)

    Wong, Hector R; Cvijanovich, Natalie Z; Anas, Nick; Allen, Geoffrey L; Thomas, Neal J; Bigham, Michael T; Weiss, Scott L; Fitzgerald, Julie; Checchia, Paul A; Meyer, Keith; Quasney, Michael; Hall, Mark; Gedeit, Rainer; Freishtat, Robert J; Nowak, Jeffrey; Raj, Shekhar S; Gertz, Shira; Howard, Kelli; Harmon, Kelli; Lahni, Patrick; Frank, Erin; Hart, Kimberly W; Nguyen, Trung C; Lindsell, Christopher J

    2016-11-01

    The Pediatric Sepsis Biomarker Risk Model (PERSEVERE), a pediatric sepsis risk model, uses biomarkers to estimate baseline mortality risk for pediatric septic shock. It is unknown how PERSEVERE performs within distinct septic shock phenotypes. We tested PERSEVERE in children with septic shock and thrombocytopenia-associated multiple organ failure (TAMOF), and in those without new onset thrombocytopenia but with multiple organ failure (MOF). PERSEVERE-based mortality risk was generated for each study subject (n = 660). A priori, we determined that if PERSEVERE did not perform well in both the TAMOF and the MOF cohorts, we would revise PERSEVERE to incorporate admission platelet counts. The setting was multiple PICUs in the United States, and patients received standard care. PERSEVERE performed well in the TAMOF cohort (areas under the receiver operating characteristic curves [AUC], 0.84 [95% CI, 0.77-0.90]), but less well in the MOF cohort (AUC, 0.71 [0.61-0.80]). PERSEVERE was revised using 424 subjects previously reported in the derivation phase. PERSEVERE-II had an AUC of 0.89 (0.85-0.93) and performed equally well across TAMOF and MOF cohorts. PERSEVERE-II performed well when tested in 236 newly enrolled subjects. Sample size calculations for a clinical trial testing the efficacy of plasma exchange for children with septic shock and TAMOF indicated PERSEVERE-II-based stratification could substantially reduce the number of patients necessary, when compared with no stratification. Testing PERSEVERE in the context of septic shock phenotypes prompted a revision incorporating platelet count. PERSEVERE-II performs well upon testing, independent of TAMOF or MOF status. PERSEVERE-II could potentially serve as a prognostic enrichment tool.

  1. Study of Event-based Sampling Techniques and Their Influence on Greenhouse Climate Control with Wireless Sensors Network

    OpenAIRE

    Pawlowski, Andrzej; Guzman, Jose L.; Rodriguez, Francisco; Berenguel, Manuel; Sanchez, Jose; Dormido, Sebastian

    2010-01-01

    This paper presents a study of event-based sampling techniques and their application to the greenhouse climate control problem. It was possible to obtain important information about data transmission and control performance for all techniques. In conclusion, it was deduced
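
    A common event-based sampling rule in such wireless sensor studies is send-on-delta: a node transmits only when the measured signal has moved by more than a threshold since the last transmission. A generic sketch (the paper's specific techniques and thresholds are not reproduced):

        import numpy as np

        def send_on_delta(signal, delta):
            sent, last = [0], signal[0]
            for i in range(1, len(signal)):
                if abs(signal[i] - last) > delta:   # event: change exceeds threshold
                    sent.append(i)
                    last = signal[i]
            return sent

        t = np.linspace(0, 24, 24 * 60)             # one day of 1-min periodic samples
        temp = (22 + 4 * np.sin(2 * np.pi * t / 24)
                + 0.05 * np.random.default_rng(0).standard_normal(t.size))
        events = send_on_delta(temp, delta=0.5)
        print(f"{len(events)} transmissions instead of {t.size} periodic samples")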

  2. Performance Models and Risk Management in Communications Systems

    CERN Document Server

    Harrison, Peter; Rüstem, Berç

    2011-01-01

    This volume covers recent developments in the design, operation, and management of telecommunication and computer network systems in performance engineering and addresses issues of uncertainty, robustness, and risk. Uncertainty regarding loading and system parameters leads to challenging optimization and robustness issues. Stochastic modeling combined with optimization theory ensures the optimum end-to-end performance of telecommunication or computer network systems. In view of the diverse design options possible, supporting models have many adjustable parameters and choosing the best set for a particular performance objective is delicate and time-consuming. An optimization based approach determines the optimal possible allocation for these parameters. Researchers and graduate students working at the interface of telecommunications and operations research will benefit from this book. Due to the practical approach, this book will also serve as a reference tool for scientists and engineers in telecommunication ...

  3. Applications of the International Space Station Probabilistic Risk Assessment Model

    Science.gov (United States)

    Grant, Warren; Lutomski, Michael G.

    2011-01-01

    Recently the International Space Station (ISS) has incorporated more Probabilistic Risk Assessments (PRAs) in the decision making process for significant issues. Future PRAs will have major impact to ISS and future spacecraft development and operations. These PRAs will have their foundation in the current complete ISS PRA model and the current PRA trade studies that are being analyzed as requested by ISS Program stakeholders. ISS PRAs have recently helped in the decision making process for determining reliability requirements for future NASA spacecraft and commercial spacecraft, making crew rescue decisions, as well as making operational requirements for ISS orbital orientation, planning Extravehicular activities (EVAs) and robotic operations. This paper will describe some applications of the ISS PRA model and how they impacted the final decision. This paper will discuss future analysis topics such as life extension, requirements of new commercial vehicles visiting ISS.

  4. Multifocality and recurrence risk: a quantitative model of field cancerization.

    Science.gov (United States)

    Foo, Jasmine; Leder, Kevin; Ryser, Marc D

    2014-08-21

    Primary tumors often emerge within genetically altered fields of premalignant cells that appear histologically normal but have a high chance of progression to malignancy. Clinical observations have suggested that these premalignant fields pose high risks for emergence of recurrent tumors if left behind after surgical removal of the primary tumor. In this work, we develop a spatio-temporal stochastic model of epithelial carcinogenesis, combining cellular dynamics with a general framework for multi-stage genetic progression to cancer. Using the model, we investigate how various properties of the premalignant fields depend on microscopic cellular properties of the tissue. In particular, we provide analytic results for the size-distribution of the histologically undetectable premalignant fields at the time of diagnosis, and investigate how the extent and the geometry of these fields depend upon key groups of parameters associated with the tissue and genetic pathways. We also derive analytical results for the relative risks of local vs. distant secondary tumors for different parameter regimes, a critical aspect for the optimal choice of post-operative therapy in carcinoma patients. This study contributes to a growing literature seeking to obtain a quantitative understanding of the spatial dynamics in cancer initiation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Modeling human risk: Cell & molecular biology in context

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    It is anticipated that early in the next century manned missions into outer space will occur, with a mission to Mars scheduled between 2015 and 2020. However, before such missions can be undertaken, a realistic estimation of the potential risks to the flight crews is required. One of the uncertainties remaining in this risk estimation is that posed by the effects of exposure to the radiation environment of outer space. Although the composition of this environment is fairly well understood, the biological effects arising from exposure to it are not. The reasons for this are three-fold: (1) A small but highly significant component of the radiation spectrum in outer space consists of highly charged, high energy (HZE) particles which are not routinely experienced on earth, and for which there are insufficient data on biological effects; (2) Most studies on the biological effects of radiation to date have been high-dose, high dose-rate, whereas in space, with the exception of solar particle events, radiation exposures will be low-dose, low dose-rate; (3) Although it has been established that the virtual absence of gravity in space has a profound effect on human physiology, it is not clear whether these effects will act synergistically with those of radiation exposure. A select panel will evaluate the use of experiments and models to accurately predict the risks associated with exposure to HZE particles. Topics of research include cellular and tissue response, health effects associated with radiation damage, model animal systems, and critical markers of radiation response.

  6. Assessing distractors and teamwork during surgery: developing an event-based method for direct observation.

    Science.gov (United States)

    Seelandt, Julia C; Tschan, Franziska; Keller, Sandra; Beldi, Guido; Jenni, Nadja; Kurmann, Anita; Candinas, Daniel; Semmer, Norbert K

    2014-11-01

    To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, the intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes.

  7. An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.

    Science.gov (United States)

    Nguyen, Ngan; Watson, William D; Dominguez, Edward

    2016-01-01

    Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training and discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., competencies being trained and learning objectives) and performance assessment. The EBAT process thus links critical scenario events to the targeted competencies and to performance measures. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs were defined based on review of published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on existing literature and gathering input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately. It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4

  8. Lifetime and 5 years risk of breast cancer and attributable risk factor according to Gail model in Iranian women.

    Science.gov (United States)

    Mohammadbeigi, Abolfazl; Mohammadsalehi, Narges; Valizadeh, Razieh; Momtaheni, Zeinab; Mokhtari, Mohsen; Ansari, Hossein

    2015-01-01

    Breast cancer is the most commonly diagnosed cancer in women worldwide and in Iran. It is expected to account for 29% of all new cancers in women in 2015. This study aimed to assess the 5-year and lifetime risk of breast cancer according to the Gail model, and to evaluate the effect of other additional risk factors on the Gail risk. A cross-sectional study was conducted on 296 women aged more than 34 years in Qom, in the center of Iran. The Breast Cancer Risk Assessment Tool calculated the Gail risk for each subject. Data were analyzed by paired t-test, independent t-test, and analysis of variance in a bivariate approach to evaluate the effect of each factor on the Gail risk. Multiple linear regression models with a stepwise method were used to predict the effect of each variable on the Gail risk. The mean age of the participants was 47.8 ± 8.8 years and 47% were of Fars ethnicity. The 5-year and lifetime risks were 0.37 ± 0.18 and 4.48 ± 0.925%, respectively; both were lower than the average risk in women of the same race and age. Moreover, by comparison with national epidemiologic indicators of breast cancer morbidity and mortality, it seems that the Gail model overestimates the risk of breast cancer in Iranian women.

  9. Assessing the Safety of Children at Risk of Maltreatment: Decision-Making Models.

    Science.gov (United States)

    DePanfilis, Diane; Scannapieco, Maria

    1994-01-01

    Noting that risk assessment models are controversial and vary in definition, purposes, and risk criteria, this article reviews and contrasts 10 models used to guide decision making concerning the placement of maltreated children in out-of-home care. It found that the models varied on pertinent criteria for assessing child risk, and identified six areas for further…

  10. Time-to-Compromise Model for Cyber Risk Reduction Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2005-09-01

    We propose a new model for estimating the time to compromise a system component that is visible to an attacker. The model provides an estimate of the expected value of the time-to-compromise as a function of known and visible vulnerabilities, and attacker skill level. The time-to-compromise random process model is a composite of three subprocesses associated with attacker actions aimed at the exploitation of vulnerabilities. In a case study, the model was used to aid in a risk reduction estimate between a baseline Supervisory Control and Data Acquisition (SCADA) system and the baseline system enhanced through a specific set of control system security remedial actions. For our case study, the total number of system vulnerabilities was reduced by 86% but the dominant attack path was through a component where the number of vulnerabilities was reduced by only 42% and the time-to-compromise of that component was increased by only 13% to 30% depending on attacker skill level.
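
    For intuition only, expected time-to-compromise can be sketched as a function that falls with the number of exploitable visible vulnerabilities and with attacker skill, and that is dominated by vulnerability discovery when nothing exploitable is exposed. The toy function below is a placeholder, not the published three-subprocess model; all constants are arbitrary.

        # Toy stand-in for an expected time-to-compromise curve (days).
        def expected_ttc_days(n_vulns, skill):      # skill in (0, 1]
            if n_vulns == 0:
                return 250.0 / skill                # attacker must find a new vulnerability
            return 30.0 / (n_vulns * skill)         # exploitation of known/visible vulns

        # e.g. remediation reduces the visible vulnerabilities on the dominant path
        for n in (50, 7):
            print(f"{n:>2} visible vulns -> ~{expected_ttc_days(n, skill=0.5):.1f} days")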

  11. Spatial Modeling of Risk in Natural Resource Management

    Directory of Open Access Journals (Sweden)

    Peter Jones

    2002-01-01

    Full Text Available Making decisions in natural resource management involves an understanding of the risk and uncertainty of the outcomes, such as crop failure or cattle starvation, and of the normal spread of the expected production. Hedging against poor outcomes often means lack of investment and slow adoption of new methods. At the household level, production instability can have serious effects on income and food security. At the national level, it can have social and economic impacts that may affect all sectors of society. Crop models such as CERES-Maize are excellent tools for assessing weather-related production variability. WATBAL is a water balance model that can provide robust estimates of the potential growing days for a pasture. These models require large quantities of daily weather data that are rarely available. MarkSim is an application for generating synthetic daily weather files by estimating the third-order Markov model parameters from interpolated climate surfaces. The models can then be run for each distinct point on the map. This paper examines the growth of maize and pasture in dryland agriculture in southern Africa. Weather simulators produce independent estimates for each point on the map; however, we know that a spatial coherence of weather exists. We investigated a method of incorporating spatial coherence into MarkSim and show that it increases the variance of production. This means that all of the farmers in a coherent area share poor yields, with important consequences for food security, markets, transport, and shared grazing lands. The long-term aspects of risk are associated with global climate change. We used the results of a Global Circulation Model to extrapolate to the year 2055. We found that low maize yields would become more likely in the marginal areas, whereas they may actually increase in some areas. The same trend was found with pasture growth. We outline areas where further work is required before these tools and methods

  12. Hierarchical Modelling of Flood Risk for Engineering Decision Analysis

    OpenAIRE

    Custer, Rocco

    2015-01-01

    Societies around the world are faced with flood risk, prompting authorities and decision makers to manage risk to protect population and assets. With climate change, urbanisation and population growth, flood risk changes constantly, requiring flood risk management strategies that are flexible and robust. Traditional risk management solutions, e.g. dike construction, are not particularly flexible, as they are difficult to adapt to changing risk. Conversely, the recent concept of integrated flo...

  13. Combining engineering and data-driven approaches: Development of a generic fire risk model facilitating calibration

    DEFF Research Database (Denmark)

    De Sanctis, G.; Fischer, K.; Kohler, J.

    2014-01-01

    Fire risk models support decision making for engineering problems under the consistent consideration of the associated uncertainties. Empirical approaches can be used for cost-benefit studies when enough data about the decision problem are available. But often the empirical approaches are not detailed enough. Engineering risk models, on the other hand, may be detailed but typically involve assumptions that may result in a biased risk assessment and make a cost-benefit study problematic. In two related papers it is shown how engineering and data-driven modeling can be combined by developing a generic risk model that is calibrated to observed fire loss data. Generic risk models assess the risk of buildings based on specific risk indicators and support risk assessment at a portfolio level. After an introduction to the principles of generic risk assessment, the focus of the present paper

  14. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  15. Systemic Thinking and Requisite Holism in Mastering Logistics Risks: the Model for Identifying Risks in Organisations and Supply Chain

    Directory of Open Access Journals (Sweden)

    Bojan Rosi

    2013-02-01

    Full Text Available Risks in logistic processes represent one of the major issues in supply chain management nowadays. Every organization strives for success, and uninterrupted operations are the key factors in achieving this goal, which cannot be achieved without efficient risk management. In the scope of supply chain risk research, we identified some key issues in the field, the major issue being the lack of standardization and models, which can make risk management in an organization easier and more efficient. Consequently, we developed a model, which captures and identifies risks in an organization and its supply chain. It is in accordance with the general risk management standard – ISO 31000, and incorporates some relevant recent findings from general and supply chain risk management, especially from the viewpoint of public segmentation. This experimental catalogue (which is also published online can serve as a checklist and a starting point of supply chain risk management in organizations. Its main idea is cooperation between experts from the area in order to compile an ever-growing list of possible risks and to provide an insight in the model and its value in practice, for which reason input and opinions of anyone who uses our model are greatly appreciated and included in the catalogue.

  16. Application of wildfire simulation models for risk analysis

    Science.gov (United States)

    Ager, A.; Finney, M.

    2009-04-01

    Wildfire simulation models are being widely used by fire and fuels specialists in the U.S. to support tactical and strategic decisions related to the mitigation of wildfire risk. Much of this application has resulted from the development of a minimum travel time (MTT) fire spread algorithm (M. Finney) that makes it computationally feasible to simulate thousands of fires and generate burn probability and intensity maps over large areas (10,000 - 2,000,000 ha). The MTT algorithm is parallelized for multi-threaded processing and is embedded in a number of research and applied fire modeling applications. High performance computers (e.g., 32-way 64 bit SMP) are typically used for MTT simulations, although the algorithm is also implemented in the 32 bit desktop FlamMap3 program (www.fire.org). Extensive testing has shown that this algorithm can replicate large fire boundaries in the heterogeneous landscapes that typify much of the wildlands in the western U.S. In this paper, we describe the application of the MTT algorithm to understand spatial patterns of burn probability (BP), and to analyze wildfire risk to key human and ecological values. The work is focused on a federally-managed 2,000,000 ha landscape in the central interior region of Oregon State, USA. The fire-prone study area encompasses a wide array of topography and fuel types and a number of highly valued resources that are susceptible to fire. We quantitatively defined risk as the product of the probability of a fire and the resulting consequence. Burn probabilities at specific intensity classes were estimated for each 100 x 100 m pixel by simulating 100,000 wildfires under burn conditions that replicated recent severe wildfire events that occurred under conditions where fire suppression was generally ineffective (97th percentile, August weather). We repeated the simulation under milder weather (70th percentile, August weather) to replicate a "wildland fire use scenario" where suppression is minimized to
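
    The abstract's working definition of risk lends itself to a direct calculation. The sketch below, with purely illustrative probabilities, consequence weights, and class boundaries (none are from the paper), shows the pixel-level form: risk is the burn probability at each intensity class times the consequence at that class, summed over classes.

```python
import numpy as np

# Illustrative sketch, not the paper's data: expected loss (risk) per
# pixel = sum over intensity classes of burn probability x consequence.

# Simulated burn probabilities for 4 pixels x 3 intensity classes
# (low, moderate, high); rows need not sum to 1, since a pixel may
# not burn at all in a given simulated fire season.
burn_prob = np.array([
    [0.020, 0.010, 0.005],
    [0.050, 0.030, 0.010],
    [0.001, 0.001, 0.000],
    [0.080, 0.060, 0.040],
])

# Assumed relative consequence (loss to a valued resource) per class.
consequence = np.array([0.1, 0.5, 1.0])

# Expected loss per pixel.
expected_loss = burn_prob @ consequence
print(expected_loss)
```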

  17. Linking GIS and storm water modeling for emergency risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Newkirk, R.T. [Univ. of Waterloo, Ontario (Canada)

    1995-12-31

    Many emergencies involve the deposition of chemical contaminants on land either as a direct event or as a secondary byproduct. GIS can be useful in estimating the initial deposition area. Chemical product attribute data bases can be accessed to determine the degree that the contaminants might be transportable in a water medium. An important issue is to estimate the potential impact of the deposition on surface and subsurface water flows. This is particularly important since millions of people rely on subsurface ground water as their main source of potable water. Thus, a modeling system is needed by planners and emergency managers to assess the potential for short- and long-term risks to communities due to storm water transport of deposited contaminants. GIS itself cannot provide the complete analysis. A prototype system to assist in estimating the flows of contaminants related to an emergency has been developed by linking an Arc/Info database, a Digital Terrain Model, and SWMM, the storm water management model. This system also has important planning applications in assessing alternative land development plans for their impact on ground water recharge and management of storm water.

  18. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model includes a tunable cut-off parameter for selecting the desired trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8 % and 80.4 %, and balanced accuracy: 80.6 % and 80.8 %) and highest inter-rater agreement [kappa (κ): 0
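
    As an illustration of the consensus idea, the following sketch combines binary predictions from several hypothetical QSAR tools with a naive Bayes rule and a tunable cut-off. The per-tool sensitivities and specificities, the prior, and the conditional-independence assumption are all invented for the example; they are not the paper's fitted values or its exact classifier.

```python
import numpy as np

# Naive Bayes consensus over K tools (illustrative numbers throughout).
sens = np.array([0.75, 0.70, 0.80, 0.65])   # P(tool flags | chemical is carcinogenic)
spec = np.array([0.60, 0.72, 0.55, 0.70])   # P(tool clears | chemical is safe)
prior_pos = 0.3                              # assumed base rate of positives

def consensus_p_positive(votes):
    """votes[i] = 1 if tool i predicts 'carcinogenic', else 0."""
    like_pos = np.prod(np.where(votes == 1, sens, 1 - sens))
    like_neg = np.prod(np.where(votes == 1, 1 - spec, spec))
    return like_pos * prior_pos / (like_pos * prior_pos + like_neg * (1 - prior_pos))

votes = np.array([1, 1, 0, 1])
cutoff = 0.4   # lowering the cut-off trades specificity for sensitivity
print(consensus_p_positive(votes), consensus_p_positive(votes) >= cutoff)
```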

  19. Recent Enhancements to the Genetic Risk Prediction Model BRCAPRO.

    Science.gov (United States)

    Mazzola, Emanuele; Blackford, Amanda; Parmigiani, Giovanni; Biswas, Swati

    2015-01-01

    BRCAPRO is a widely used model for genetic risk prediction of breast cancer. It is a function within the R package BayesMendel and is used to calculate the probabilities of being a carrier of a deleterious mutation in one or both of the BRCA genes, as well as the probability of being affected with breast and ovarian cancer within a defined time window. Both predictions are based on information contained in the counselee's family history of cancer. During the last decade, BRCAPRO has undergone several rounds of successive refinements: the current version is part of release 2.1 of BayesMendel. In this review, we showcase some of the most notable features of the software resulting from these recent changes. We provide examples highlighting each feature, using artificial pedigrees motivated by complex clinical examples. We illustrate how BRCAPRO is a comprehensive software for genetic risk prediction with many useful features that allow users the flexibility to incorporate varying amounts of available information.

  20. Shifting the risk in pricing and reimbursement schemes. A new model of risk-sharing agreements for innovative drugs

    OpenAIRE

    Stefano Capri; Rossella Levaggi

    2010-01-01

    Risk sharing is becoming an increasingly popular instrument to regulate the price of new drugs. In the recent past, forms of risk-sharing agreements between the public regulator and the industry have been proposed and implemented, but their effects on price and profits are still controversial. Methods: We develop a model aimed at studying the effects on price and expected profit of several risk-sharing agreements between a regulator and the industry, based on the ex post effectiven...

  1. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  2. Modified social ecological model: a tool to guide the assessment of the risks and risk contexts of HIV epidemics.

    Science.gov (United States)

    Baral, Stefan; Logie, Carmen H; Grosso, Ashley; Wirtz, Andrea L; Beyrer, Chris

    2013-05-17

    Social and structural factors are now well accepted as determinants of HIV vulnerabilities. These factors are representative of social, economic, organizational and political inequities. Associated with an improved understanding of multiple levels of HIV risk has been the recognition of the need to implement multi-level HIV prevention strategies. Prevention sciences research and programming aiming to decrease HIV incidence requires epidemiologic studies to collect data on multiple levels of risk to inform combination HIV prevention packages. Proximal individual-level risks, such as sharing injection devices and unprotected penile-vaginal or penile-anal sex, are necessary in mediating HIV acquisition and transmission. However, higher order social and structural-level risks can facilitate or reduce HIV transmission on population levels. Data characterizing these risks are often far more actionable than data characterizing individual-level risks. We propose a modified social ecological model (MSEM) to help visualize multi-level domains of HIV infection risks and guide the development of epidemiologic HIV studies. Such a model may inform research in epidemiology and prevention sciences, particularly for key populations including men who have sex with men (MSM), people who inject drugs (PID), and sex workers. The MSEM builds on existing frameworks by examining multi-level risk contexts for HIV infection and situating individual HIV infection risks within wider network, community, and public policy contexts as well as epidemic stage. The utility of the MSEM is demonstrated with case studies of HIV risk among PID and MSM. The MSEM is a flexible model for guiding epidemiologic studies among key populations at risk for HIV in diverse sociocultural contexts. Successful HIV prevention strategies for key populations require effective integration of evidence-based biomedical, behavioral, and structural interventions. While the focus of epidemiologic studies has traditionally been on

  3. Cumulative Risk and Early Cognitive Development: A Comparison of Statistical Risk Models.

    Science.gov (United States)

    Burchinal, Margaret R.; Roberts, Joanne E.; Hooper, Stephen; Zeisel, Susan A.

    2000-01-01

    Examined analytic methods for describing children's social risk. Found that the individual-risk-variables approach provided better overall prediction of developmental outcomes at a particular age. The risk-factor approach provided good prediction of developmental trajectories with moderate to large sample sizes. The risk-index was useful for…

  4. Event-based sampling for reducing communication load in realtime human motion analysis by wireless inertial sensor networks

    Directory of Open Access Journals (Sweden)

    Laidig Daniel

    2016-09-01

    Full Text Available We examine the usefulness of event-based sampling approaches for reducing communication in inertial-sensor-based analysis of human motion. To this end we consider realtime measurement of the knee joint angle during walking, employing a recently developed sensor fusion algorithm. We simulate the effects of different event-based sampling methods on a large set of experimental data with ground truth obtained from an external motion capture system. This results in a reduced wireless communication load at the cost of a slightly increased error in the calculated angles. The proposed methods are compared in terms of best balance of these two aspects. We show that the transmitted data can be reduced by 66% while maintaining the same level of accuracy.
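
    A send-on-delta rule is one simple event-based sampling scheme of the kind the paper compares: a sample is transmitted only when it deviates from the last transmitted value by more than a threshold, and the receiver holds the last value otherwise. The sketch below, with an assumed synthetic knee-angle signal and threshold, shows how the communication reduction and the accuracy cost can both be measured; it is an illustration of the general approach, not the paper's specific methods.

```python
import numpy as np

def send_on_delta(signal, threshold):
    """Transmit only on changes larger than `threshold`; receiver holds last value."""
    received = np.empty_like(signal)
    sent = 0
    last = signal[0]
    for i, x in enumerate(signal):
        if i == 0 or abs(x - last) > threshold:
            last = x
            sent += 1
        received[i] = last
    return received, sent

t = np.linspace(0, 10, 1000)
angle = 30 * np.sin(2 * np.pi * 1.0 * t)        # synthetic knee angle [deg]
rec, sent = send_on_delta(angle, threshold=2.0)
reduction = 1 - sent / len(angle)                # fraction of samples saved
rmse = np.sqrt(np.mean((rec - angle) ** 2))      # accuracy cost
print(f"communication reduced by {reduction:.0%}, RMSE {rmse:.2f} deg")
```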

  5. Event-based prospective memory among veterans: The role of posttraumatic stress disorder symptom severity in executing intentions.

    Science.gov (United States)

    McFarland, Craig P; Clark, Justin B; Lee, Lewina O; Grande, Laura J; Marx, Brian P; Vasterling, Jennifer J

    2016-01-01

    Posttraumatic stress disorder (PTSD) has been linked with neuropsychological deficits in several areas, including attention, learning and memory, and cognitive inhibition. Although memory dysfunction is among the most commonly documented deficits associated with PTSD, our existing knowledge pertains only to retrospective memory. The current study investigated the relationship between PTSD symptom severity and event-based prospective memory (PM). Forty veterans completed a computerized event-based PM task, a self-report measure of PTSD, and measures of retrospective memory. Hierarchical regression analysis results revealed that PTSD symptom severity accounted for 16% of the variance in PM performance, F(3, 36) = 3.47, p < .05, after accounting for retrospective memory. Additionally, each of the three PTSD symptom clusters was related, to varying degrees, with PM performance. Results suggest that elevated PTSD symptoms may be associated with more difficulties completing tasks requiring PM. Further examination of PM in PTSD is warranted, especially in regard to its impact on everyday functioning.

  6. Identifying and assessing critical uncertainty thresholds in a forest pest risk model

    Science.gov (United States)

    Frank H. Koch; Denys Yemshanov

    2015-01-01

    Pest risk maps can provide helpful decision support for invasive alien species management, but often fail to address adequately the uncertainty associated with their predicted risk values. This chapter explores how increased uncertainty in a risk model's numeric assumptions (i.e. its principal parameters) might affect the resulting risk map. We used a spatial...

  7. An absolute risk model to identify individuals at elevated risk for pancreatic cancer in the general population.

    Science.gov (United States)

    Klein, Alison P; Lindström, Sara; Mendelsohn, Julie B; Steplowski, Emily; Arslan, Alan A; Bueno-de-Mesquita, H Bas; Fuchs, Charles S; Gallinger, Steven; Gross, Myron; Helzlsouer, Kathy; Holly, Elizabeth A; Jacobs, Eric J; Lacroix, Andrea; Li, Donghui; Mandelson, Margaret T; Olson, Sara H; Petersen, Gloria M; Risch, Harvey A; Stolzenberg-Solomon, Rachael Z; Zheng, Wei; Amundadottir, Laufey; Albanes, Demetrius; Allen, Naomi E; Bamlet, William R; Boutron-Ruault, Marie-Christine; Buring, Julie E; Bracci, Paige M; Canzian, Federico; Clipp, Sandra; Cotterchio, Michelle; Duell, Eric J; Elena, Joanne; Gaziano, J Michael; Giovannucci, Edward L; Goggins, Michael; Hallmans, Göran; Hassan, Manal; Hutchinson, Amy; Hunter, David J; Kooperberg, Charles; Kurtz, Robert C; Liu, Simin; Overvad, Kim; Palli, Domenico; Patel, Alpa V; Rabe, Kari G; Shu, Xiao-Ou; Slimani, Nadia; Tobias, Geoffrey S; Trichopoulos, Dimitrios; Van Den Eeden, Stephen K; Vineis, Paolo; Virtamo, Jarmo; Wactawski-Wende, Jean; Wolpin, Brian M; Yu, Herbert; Yu, Kai; Zeleniuch-Jacquotte, Anne; Chanock, Stephen J; Hoover, Robert N; Hartge, Patricia; Kraft, Peter

    2013-01-01

    We developed an absolute risk model to identify individuals in the general population at elevated risk of pancreatic cancer. Using data on 3,349 cases and 3,654 controls from the PanScan Consortium, we developed a relative risk model for men and women of European ancestry based on non-genetic and genetic risk factors for pancreatic cancer. We estimated absolute risks based on these relative risks and population incidence rates. Our risk model included current smoking (multivariable adjusted odds ratio (OR) and 95% confidence interval: 2.20 [1.84-2.62]), heavy alcohol use (>3 drinks/day) (OR: 1.45 [1.19-1.76]), obesity (body mass index >30 kg/m(2)) (OR: 1.26 [1.09-1.45]), diabetes >3 years (nested case-control OR: 1.57 [1.13-2.18], case-control OR: 1.80 [1.40-2.32]), family history of pancreatic cancer (OR: 1.60 [1.20-2.12]), non-O ABO genotype (AO vs. OO genotype) (OR: 1.23 [1.10-1.37]) to (BB vs. OO genotype) (OR 1.58 [0.97-2.59]), rs3790844(chr1q32.1) (OR: 1.29 [1.19-1.40]), rs401681(5p15.33) (OR: 1.18 [1.10-1.26]) and rs9543325(13q22.1) (OR: 1.27 [1.18-1.36]). The areas under the ROC curve for risk models including only non-genetic factors, only genetic factors, and both non-genetic and genetic factors were 58%, 57% and 61%, respectively. We estimate that fewer than 3/1,000 U.S. non-Hispanic whites have more than a 5% predicted lifetime absolute risk. Although absolute risk modeling using established risk factors may help to identify a group of individuals at higher than average risk of pancreatic cancer, the immediate clinical utility of our model is limited. However, a risk model can increase awareness of the various risk factors for pancreatic cancer, including modifiable behaviors.

  8. An absolute risk model to identify individuals at elevated risk for pancreatic cancer in the general population.

    Directory of Open Access Journals (Sweden)

    Alison P Klein

    Full Text Available We developed an absolute risk model to identify individuals in the general population at elevated risk of pancreatic cancer. Using data on 3,349 cases and 3,654 controls from the PanScan Consortium, we developed a relative risk model for men and women of European ancestry based on non-genetic and genetic risk factors for pancreatic cancer. We estimated absolute risks based on these relative risks and population incidence rates. Our risk model included current smoking (multivariable adjusted odds ratio (OR) and 95% confidence interval: 2.20 [1.84-2.62]), heavy alcohol use (>3 drinks/day) (OR: 1.45 [1.19-1.76]), obesity (body mass index >30 kg/m2) (OR: 1.26 [1.09-1.45]), diabetes >3 years (nested case-control OR: 1.57 [1.13-2.18], case-control OR: 1.80 [1.40-2.32]), family history of pancreatic cancer (OR: 1.60 [1.20-2.12]), non-O ABO genotype (AO vs. OO genotype) (OR: 1.23 [1.10-1.37]) to (BB vs. OO genotype) (OR: 1.58 [0.97-2.59]), rs3790844 (chr1q32.1) (OR: 1.29 [1.19-1.40]), rs401681 (5p15.33) (OR: 1.18 [1.10-1.26]) and rs9543325 (13q22.1) (OR: 1.27 [1.18-1.36]). The areas under the ROC curve for risk models including only non-genetic factors, only genetic factors, and both non-genetic and genetic factors were 58%, 57% and 61%, respectively. We estimate that fewer than 3/1,000 U.S. non-Hispanic whites have more than a 5% predicted lifetime absolute risk. Although absolute risk modeling using established risk factors may help to identify a group of individuals at higher than average risk of pancreatic cancer, the immediate clinical utility of our model is limited. However, a risk model can increase awareness of the various risk factors for pancreatic cancer, including modifiable behaviors.
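
    To make the arithmetic behind such a model concrete, the sketch below combines a few of the reported odds ratios multiplicatively and scales them by an assumed baseline lifetime incidence. This is only a rough illustration of absolute risk estimation: it treats ORs as relative risks, assumes independence of factors, and uses an assumed baseline value, so it is not the calibrated PanScan model.

```python
import math

# Odds ratios quoted in the abstract (non-genetic factors only).
odds_ratios = {
    "current_smoking": 2.20,
    "heavy_alcohol": 1.45,
    "obesity": 1.26,
    "diabetes_gt_3y": 1.57,
    "family_history": 1.60,
}

def approx_lifetime_risk(factors_present, baseline=0.013):
    """Crude absolute risk: baseline lifetime incidence (assumed ~1.3%)
    scaled by the product of the ORs for the factors present."""
    rr = math.prod(odds_ratios[f] for f in factors_present)
    return baseline * rr

print(approx_lifetime_risk(["current_smoking", "family_history"]))
```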

  9. Fuzzy Risk Graph Model for Determining Safety Integrity Level

    Directory of Open Access Journals (Sweden)

    R. Nait-Said

    2008-01-01

    Full Text Available The risk graph is one of the most popular methods used to determine the safety integrity level for safety instrumented functions. However, the conventional risk graph as described in the IEC 61508 standard is subjective and suffers from an interpretation problem of risk parameters. Thus, it can lead to inconsistent outcomes that may result in conservative SILs. To overcome this difficulty, a modified risk graph using a fuzzy rule-based system is proposed. This novel version of the risk graph uses fuzzy scales to assess risk parameters, and calibration may be made by varying risk parameter values. Furthermore, the outcomes, which are numerical values of the risk reduction factor (the inverse of the probability of failure on demand), can be compared directly with those given by quantitative and semiquantitative methods such as fault tree analysis (FTA), quantitative risk assessment (QRA), and layers of protection analysis (LOPA).
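
    For context, the risk reduction factor output feeds into the standard IEC 61508 mapping for low-demand mode from probability of failure on demand (PFD) to SIL, with RRF = 1/PFD. A minimal sketch of that final mapping step (the fuzzy risk graph itself is not reproduced here):

```python
# IEC 61508 low-demand SIL bands: (SIL, PFD lower bound, PFD upper bound).
SIL_BANDS = [
    (4, 1e-5, 1e-4),
    (3, 1e-4, 1e-3),
    (2, 1e-3, 1e-2),
    (1, 1e-2, 1e-1),
]

def sil_from_rrf(rrf):
    """Map a risk reduction factor to a SIL; RRF = 1 / PFD."""
    pfd = 1.0 / rrf
    for sil, lo, hi in SIL_BANDS:
        if lo <= pfd < hi:
            return sil
    return None  # outside the low-demand SIL 1-4 range

print(sil_from_rrf(5000))   # PFD = 2e-4 -> SIL 3
```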

  10. Using toxicokinetic-toxicodynamic modeling as an acute risk assessment refinement approach in vertebrate ecological risk assessment.

    Science.gov (United States)

    Ducrot, Virginie; Ashauer, Roman; Bednarska, Agnieszka J; Hinarejos, Silvia; Thorbek, Pernille; Weyman, Gabriel

    2016-01-01

    Recent guidance identified toxicokinetic-toxicodynamic (TK-TD) modeling as a relevant approach for risk assessment refinement. Yet, its added value compared to other refinement options is not detailed, and how to conduct the modeling appropriately is not explained. This case study addresses these issues through 2 examples of individual-level risk assessment for 2 hypothetical plant protection products: 1) evaluating the risk for small granivorous birds and small omnivorous mammals of a single application, as a seed treatment in winter cereals, and 2) evaluating the risk for fish after a pulsed treatment in the edge-of-field zone. Using acute test data, we conducted the first tier risk assessment as defined in the European Food Safety Authority (EFSA) guidance. When first tier risk assessment highlighted a concern, refinement options were discussed. Cases where the use of models should be preferred over other existing refinement approaches were highlighted. We then practically conducted the risk assessment refinement by using 2 different models as examples. In example 1, a TK model accounting for toxicokinetics and relevant feeding patterns in the skylark and in the wood mouse was used to predict internal doses of the hypothetical active ingredient in individuals, based on relevant feeding patterns in an in-crop situation, and identify the residue levels leading to mortality. In example 2, a TK-TD model accounting for toxicokinetics, toxicodynamics, and relevant exposure patterns in the fathead minnow was used to predict the time-course of fish survival for relevant FOCUS SW exposure scenarios and identify which scenarios might lead to mortality. Models were calibrated using available standard data and implemented to simulate the time-course of internal dose of active ingredient or survival for different exposure scenarios. Simulation results were discussed and used to derive the risk assessment refinement endpoints used for decision. Finally, we compared the

  11. Modelling the risk of airborne infectious disease using exhaled air.

    Science.gov (United States)

    Issarow, Chacha M; Mulder, Nicola; Wood, Robin

    2015-05-07

    In this paper we develop and demonstrate a flexible mathematical model that predicts the risk of airborne infectious diseases, such as tuberculosis, under steady state and non-steady state conditions by monitoring exhaled air by infectors in a confined space. In the development of this model, we used the rebreathed air accumulation rate concept to directly determine the average volume fraction of exhaled air in a given space. From a biological point of view, exhaled air by infectors contains airborne infectious particles that cause airborne infectious diseases such as tuberculosis in confined spaces. Since not all infectious particles can reach the target infection site, we took into account that the infectious particles that commence the infection are determined by the respiratory deposition fraction, which is the probability of each infectious particle reaching the target infection site of the respiratory tract and causing infection. Furthermore, we compute the quantity of carbon dioxide as a marker of exhaled air, which can be inhaled in the room with high likelihood of causing airborne infectious disease given the presence of infectors. We demonstrated mathematically and schematically the correlation between TB transmission probability and airborne infectious particle generation rate, ventilation rate, average volume fraction of exhaled air, TB prevalence and duration of exposure to infectors in a confined space. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
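
    The quantities named here (particle generation rate, ventilation rate, exhaled-air fraction, exposure duration) fit the classic Wells-Riley form of airborne transmission risk. The sketch below uses that form extended with a deposition fraction; it is an illustration consistent with the abstract, not necessarily the paper's exact equation, and all parameter values are assumed.

```python
import math

def infection_probability(I, q, p, Q, f, t):
    """Wells-Riley-style risk: I infectors, quantum generation rate q [1/h],
    breathing rate p [m3/h], ventilation rate Q [m3/h], respiratory
    deposition fraction f, exposure time t [h]."""
    return 1 - math.exp(-f * I * q * p * t / Q)

# Illustrative scenario: one infector, 8 h exposure in a poorly
# ventilated room.
print(infection_probability(I=1, q=10, p=0.5, Q=60, f=0.5, t=8))
```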

  12. Training Systems Modelers through the Development of a Multi-scale Chagas Disease Risk Model

    Science.gov (United States)

    Hanley, J.; Stevens-Goodnight, S.; Kulkarni, S.; Bustamante, D.; Fytilis, N.; Goff, P.; Monroy, C.; Morrissey, L. A.; Orantes, L.; Stevens, L.; Dorn, P.; Lucero, D.; Rios, J.; Rizzo, D. M.

    2012-12-01

    The goal of our NSF-sponsored Division of Behavioral and Cognitive Sciences grant is to create a multidisciplinary approach to develop spatially explicit models of vector-borne disease risk using Chagas disease as our model. Chagas disease is a parasitic disease endemic to Latin America that afflicts an estimated 10 million people. The causative agent (Trypanosoma cruzi) is most commonly transmitted to humans by blood feeding triatomine insect vectors. Our objectives are: (1) advance knowledge on the multiple interacting factors affecting the transmission of Chagas disease, and (2) provide next generation genomic and spatial analysis tools applicable to the study of other vector-borne diseases worldwide. This funding is a collaborative effort between the RSENR (UVM), the School of Engineering (UVM), the Department of Biology (UVM), the Department of Biological Sciences (Loyola (New Orleans)) and the Laboratory of Applied Entomology and Parasitology (Universidad de San Carlos). Throughout this five-year study, multi-educational groups (i.e., high school, undergraduate, graduate, and postdoctoral) will be trained in systems modeling. This systems approach challenges students to incorporate environmental, social, and economic as well as technical aspects and enables modelers to simulate and visualize topics that would either be too expensive, complex or difficult to study directly (Yasar and Landau 2003). We launch this research by developing a set of multi-scale, epidemiological models of Chagas disease risk using STELLA® software v.9.1.3 (isee systems, inc., Lebanon, NH). We use this particular system dynamics software as a starting point because of its simple graphical user interface (e.g., behavior-over-time graphs, stock/flow diagrams, and causal loops). To date, high school and undergraduate students have created a set of multi-scale (i.e., homestead, village, and regional) disease models. Modeling the system at multiple spatial scales forces recognition that

  13. Using multi-state Markov models to identify credit card risk

    Directory of Open Access Journals (Sweden)

    Daniel Evangelista Régis

    2016-06-01

    Full Text Available Abstract The main interest of this work is to analyze the application of multi-state Markov models to evaluate credit card risk by investigating the characteristics of different state transitions in client-institution relationships over time, thereby generating score models for various purposes. We also used logistic regression models to compare the results with those obtained using multi-state Markov models. The models were applied to an actual database of a Brazilian financial institution. In this application, multi-state Markov models performed better than logistic regression models in predicting default risk, and logistic regression models performed better in predicting cancellation risk.
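
    The multi-state Markov machinery described here can be sketched directly: estimate a transition matrix from observed monthly account states by maximum likelihood, then raise it to a power for a multi-month default probability. The states and histories below are invented for illustration; the paper's state definitions and estimation details differ.

```python
import numpy as np

# Illustrative states: 0 = up to date, 1 = in arrears, 2 = default,
# 3 = cancelled. Each history is one account's monthly state sequence.
histories = [
    [0, 0, 1, 0, 0],
    [0, 1, 1, 2, 2],
    [0, 0, 0, 3, 3],
]

n_states = 4
counts = np.zeros((n_states, n_states))
for h in histories:
    for a, b in zip(h[:-1], h[1:]):
        counts[a, b] += 1

# Maximum-likelihood transition matrix: row-normalised transition counts.
P = counts / counts.sum(axis=1, keepdims=True)

# Probability of being in default within 6 months, starting up to date.
print(np.linalg.matrix_power(P, 6)[0, 2])
```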

  14. Nitrogen Risk Assessment Model for Scotland: II. Hydrological transport and model testing

    Directory of Open Access Journals (Sweden)

    S. M. Dunn

    2004-01-01

    Full Text Available The amount and concentration of N in catchment runoff is strongly controlled by a number of hydrological influences, such as leaching rates and the rate of transport of N from the land to surface water bodies. This paper describes how the principal hydrological controls at a catchment scale have been represented within the Nitrogen Risk Assessment Model for Scotland (NIRAMS); it demonstrates their influence through application of the model to eight Scottish catchments, contrasting in terms of their land use, climate and topography. Calculation of N leaching rates, described in the preceding paper (Dunn et al., 2004), is based on soil water content determined by application of a weekly water balance model. This model uses national scale datasets and has been developed and applied to the whole of Scotland using five years of historical meteorological data. A catchment scale transport model, constructed from a 50m digital elevation model, routes flows of N through the sub-surface and groundwater to the stream system. The results of the simulations carried out for eight different catchments demonstrate that the NIRAMS model is capable of predicting time-series of weekly stream flows and N concentrations, to an acceptable degree of accuracy. The model provides an appropriate framework for risk assessment applications requiring predictions in ungauged catchments and at a national scale. Analysis of the model behaviour shows that streamwater N concentrations are controlled both by the rate of supply of N from leaching as well as the rate of transport of N from the land to the water. Keywords: nitrogen, diffuse pollution, hydrology, model, transport, catchment
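
    As a toy illustration of the weekly water balance that drives leaching in a model of this kind, the sketch below uses a single soil-water bucket in which drainage occurs only above field capacity and leached N is proportional to drainage. The structure and all numbers are assumptions for illustration, not NIRAMS code.

```python
def weekly_step(soil_water, rain, pet, field_capacity, n_conc):
    """One week of a single-bucket water balance (all depths in mm)."""
    et = min(pet, soil_water + rain)             # actual evapotranspiration
    soil_water = soil_water + rain - et
    drainage = max(0.0, soil_water - field_capacity)
    soil_water -= drainage
    n_leached = drainage * n_conc                # leached N proportional to drainage
    return soil_water, drainage, n_leached

sw = 80.0                                        # initial soil water [mm]
for rain, pet in [(25, 10), (60, 8), (5, 12)]:   # three illustrative weeks
    sw, d, n = weekly_step(sw, rain, pet, field_capacity=100.0, n_conc=0.02)
    print(f"SW={sw:.0f} mm, drainage={d:.0f} mm, N leached={n:.2f}")
```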

  15. Modelling of Risk Management for Product Development of Yogurt Drink Using House of Risk (HOR) Method

    Directory of Open Access Journals (Sweden)

    Nur Eko Wahyudin

    2016-12-01

    Full Text Available Abstract. Product development is essential for a company to stay steady or advance in the market competition. However, the main challenge in product development is associated with the uncertainty risks that may appear during the design process. Such risks may affect the success rate of the product development and may contribute either to a small or a huge loss, which can also affect the sustainability of the company. To mitigate the risks in product development, risk management is critical. CV. XYZ is a company producing dairy-based products such as mozzarella cheese and yogurt. This research aims to identify the potential risks, to arrange the priority order of risk agents and to conceptualize the risk mitigation strategy to be applied. A yogurt drink product development is needed by CV. XYZ to support the company goals of product and market expansion. The House of Risk (HOR) method was used in this research. Two phases were included in the identification process, namely marketing and product development design. The research identified 20 risks with 27 risk agents. Using the Aggregate Risk Potential (ARP) value and the Pareto 80:20 principle, this study provides a strategic guideline on how to mitigate the top-three identified risk agents. Keywords: House of Risk (HOR), Risk Management, Product Development, Yogurt Drink
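
    Phase 1 of the House of Risk method ranks risk agents by their Aggregate Risk Potential, ARP_j = O_j * sum_i(S_i * R_ij), where O_j is the occurrence of agent j, S_i the severity of risk event i, and R_ij the agent-event correlation. A sketch with illustrative scores (not the 20 risks / 27 agents of the study):

```python
import numpy as np

severity = np.array([7, 5, 9])            # S_i for three risk events (1-10)
occurrence = np.array([4, 6, 2, 5])       # O_j for four risk agents (1-10)
relation = np.array([                     # R_ij, conventionally in {0, 1, 3, 9}
    [9, 1, 0, 3],
    [3, 9, 1, 0],
    [0, 3, 9, 1],
])

# ARP_j = O_j * sum_i S_i * R_ij
arp = occurrence * (severity @ relation)
ranking = np.argsort(arp)[::-1]           # Pareto ordering: treat top agents first
print(arp, ranking)
```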

  16. Development of relative risk model for regional groundwater risk assessment: a case study in the lower Liaohe River Plain, China.

    Directory of Open Access Journals (Sweden)

    Xianbo Li

    Full Text Available Increasing pressure on water supply worldwide, especially in arid areas, has resulted in groundwater overexploitation and contamination, and subsequent deterioration of the groundwater quality and threats to public health. Environmental risk assessment of regional groundwater is an important tool for groundwater protection. This study presents a new approach for assessing the environmental risk of regional groundwater. It was carried out with a relative risk model (RRM) coupled with a series of indices, such as a groundwater vulnerability index, and includes receptor analysis, risk source analysis, risk exposure and hazard analysis, risk characterization, and management of groundwater. The risk map is a product of the probability of environmental contamination and impact. The reliability of the RRM was verified using Monte Carlo analysis. This approach was applied to the lower Liaohe River Plain (LLRP), northeastern China, which covers 23604 km2. A spatial analysis tool within GIS was used to interpolate and manipulate the data and to develop environmental risk maps of regional groundwater, dividing the level of risk from high to low into five ranks (V, IV, III, II, I). The results indicate that areas of relative risk rank (RRR) V cover 2324 km2, covering 9.8% of the area; RRR IV covers 3986 km2, accounting for 16.9% of the area. It is a new and appropriate method for regional groundwater resource management and land use planning, and is a rapid and effective tool for improving strategic decision making to protect groundwater and reduce environmental risk.

  17. Development of Relative Risk Model for Regional Groundwater Risk Assessment: A Case Study in the Lower Liaohe River Plain, China

    Science.gov (United States)

    Li, Xianbo; Zuo, Rui; Teng, Yanguo; Wang, Jinsheng; Wang, Bin

    2015-01-01

    Increasing pressure on water supply worldwide, especially in arid areas, has resulted in groundwater overexploitation and contamination, and subsequent deterioration of the groundwater quality and threats to public health. Environmental risk assessment of regional groundwater is an important tool for groundwater protection. This study presents a new approach for assessing the environmental risk of regional groundwater. It was carried out with a relative risk model (RRM) coupled with a series of indices, such as a groundwater vulnerability index, and includes receptor analysis, risk source analysis, risk exposure and hazard analysis, risk characterization, and management of groundwater. The risk map is a product of the probability of environmental contamination and impact. The reliability of the RRM was verified using Monte Carlo analysis. This approach was applied to the lower Liaohe River Plain (LLRP), northeastern China, which covers 23604 km2. A spatial analysis tool within GIS was used to interpolate and manipulate the data and to develop environmental risk maps of regional groundwater, dividing the level of risk from high to low into five ranks (V, IV, III, II, I). The results indicate that areas of relative risk rank (RRR) V cover 2324 km2, covering 9.8% of the area; RRR IV covers 3986 km2, accounting for 16.9% of the area. It is a new and appropriate method for regional groundwater resource management and land use planning, and is a rapid and effective tool for improving strategic decision making to protect groundwater and reduce environmental risk. PMID:26020518

  18. Climate and weather risk in natural resource models

    Science.gov (United States)

    Merrill, Nathaniel Henry

    This work, consisting of three manuscripts, addresses natural resource management under risk due to variation in climate and weather. In three distinct but theoretically related applications, I quantify the role of natural resources in stabilizing economic outcomes. In Manuscript 1, we address policy designed to affect the risk of cyanobacteria blooms in a drinking water reservoir through watershed wide policy. Combining a hydrologic and economic model for a watershed in Rhode Island, we solve for the efficient allocation of best management practices (BMPs) on livestock pastures to meet a monthly risk-based as well as mean-based water quality objective. In order to solve for the efficient allocations of nutrient control effort, we optimize a probabilistically constrained integer-programming problem representing the choices made on each farm and the resultant conditions that support cyanobacteria blooms. In doing so, we employ a genetic algorithm (GA). We hypothesize that management based on controlling the upper tail of the probability distribution of phosphorus loading implies different efficient management actions as compared to controlling mean loading. We find a shift to more intense effort on fewer acres when a probabilistic objective is specified, with cost savings of up to 25% relative to policies based on mean loading. Additionally, we illustrate the relative cost effectiveness of various policies designed to meet this risk-based objective. Rainfall and the subsequent overland runoff is the source of transportation of nutrients to a receiving water body, with larger amounts of phosphorus moving in more intense rainfall events. We highlight the importance of this transportation mechanism by comparing policies under climate change scenarios, where the intensity of rainfall is projected to increase and the time series process of rainfall to change. In Manuscript 2, we introduce a new economic groundwater model that incorporates the gradual shift
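
    The probabilistic constraint at the heart of Manuscript 1 can be illustrated with a Monte Carlo feasibility check: a BMP allocation is acceptable only if the simulated probability that monthly phosphorus loading exceeds a threshold stays below a chosen risk level. The toy loading model and every number below are placeholders, not the Rhode Island watershed model.

```python
import numpy as np

rng = np.random.default_rng(0)

def loading(bmp_on, rainfall):
    """Toy loading model: a BMP scales down its pasture's P export."""
    base_export = np.array([1.2, 0.8, 2.0, 1.5])     # kg P per mm runoff (assumed)
    reduction = np.where(bmp_on, 0.4, 1.0)           # assumed 60% removal if BMP on
    return rainfall * (base_export * reduction).sum()

def chance_constraint_ok(bmp_on, threshold=150.0, alpha=0.10, n=10_000):
    """Feasible iff estimated P(loading > threshold) <= alpha."""
    rain = rng.gamma(shape=2.0, scale=15.0, size=n)  # monthly rainfall [mm], assumed
    exceed = np.mean([loading(bmp_on, r) > threshold for r in rain])
    return exceed <= alpha

# One candidate allocation (BMP on pastures 1, 3 and 4), as a GA would evaluate it.
print(chance_constraint_ok(np.array([1, 0, 1, 1])))
```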

  19. A model for the optimal risk management of farm firms

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    2012-01-01

    Risk management is an integrated part of business or firm management and deals with the problem of how to avoid the risk of economic losses when the objective is to maximize expected profit. This paper will focus on the identification, assessment, and prioritization of risks in agriculture followed... by a description of procedures for coordinated and economical application of resources to control the probability and/or impact of unfortunate events. Besides identifying the major risk factors and tools for risk management in agricultural production, the paper will look critically into the current methods... for risk management. Risk management is typically based on numerical analysis and the concept of efficiency. None of the methods developed so far actually solve the basic question of how the individual manager should behave so as to optimise the balance between expected profit/income and risk. In the paper...

  20. Modeling Commercial Turbofan Engine Icing Risk With Ice Crystal Ingestion

    Science.gov (United States)

    Jorgenson, Philip C. E.; Veres, Joseph P.

    2013-01-01

    The occurrence of ice accretion within commercial high bypass aircraft turbine engines has been reported under certain atmospheric conditions. Engine anomalies have taken place at high altitudes that have been attributed to ice crystal ingestion, partially melting, and ice accretion on the compression system components. The result was degraded engine performance, and one or more of the following: loss of thrust control (roll back), compressor surge or stall, and flameout of the combustor. As ice crystals are ingested into the fan and low pressure compression system, the increase in air temperature causes a portion of the ice crystals to melt. It is hypothesized that this allows the ice-water mixture to cover the metal surfaces of the compressor stationary components which leads to ice accretion through evaporative cooling. Ice accretion causes a blockage which subsequently results in the deterioration in performance of the compressor and engine. The focus of this research is to apply an engine icing computational tool to simulate the flow through a turbofan engine and assess the risk of ice accretion. The tool comprises an engine system thermodynamic cycle code, a compressor flow analysis code, and an ice particle melt code that has the capability of determining the rate of sublimation, melting, and evaporation through the compressor flow path, without modeling the actual ice accretion. A commercial turbofan engine which has previously experienced icing events during operation in a high altitude ice crystal environment has been tested in the Propulsion Systems Laboratory (PSL) altitude test facility at NASA Glenn Research Center. The PSL has the capability to produce a continuous ice cloud, which is ingested by the engine during operation over a range of altitude conditions. The PSL test results confirmed that there was ice accretion in the engine due to ice crystal ingestion, at the same simulated altitude operating conditions as experienced previously in

  1. Software Development Risk Management Model – A Goal Driven Approach

    OpenAIRE

    Islam, Shareeful

    2009-01-01

    Software development projects are often faced with unanticipated problems which pose potential risks within the development environment. Controlling these risks, which arise from both the technical and non-technical development components, from the early stages of development onwards is crucial to arrive at a successful project. Therefore, software development risk management is becoming recognized as a best practice in the software industry for reducing these risks before they occur. This the...

  2. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guess work and uncertainty from the vulnerability and risk assessment activities of web based applications while providing users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on the research done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that can't be determined without guess work, and it was tested in vulnerability assessment activities on real production systems and in theory by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
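
    The kind of Monte Carlo treatment described can be sketched on an OWASP-style rating, where likelihood and impact are each the average of several 0-9 factor scores. Here the uncertain factors are drawn from discrete uniform distributions and the severity class distribution is estimated by simulation; the factor ranges and class cut points are illustrative, not the paper's calibration.

```python
import random

# Assumed uncertainty ranges for four likelihood and four impact factors.
likelihood_ranges = [(3, 7), (2, 5), (6, 9), (1, 4)]
impact_ranges     = [(4, 8), (2, 6), (5, 9), (3, 7)]

def classify(score):
    """OWASP-style banding of a 0-9 average score."""
    return "LOW" if score < 3 else "MEDIUM" if score < 6 else "HIGH"

counts = {}
for _ in range(100_000):
    likelihood = sum(random.randint(a, b) for a, b in likelihood_ranges) / 4
    impact = sum(random.randint(a, b) for a, b in impact_ranges) / 4
    sev = (classify(likelihood), classify(impact))
    counts[sev] = counts.get(sev, 0) + 1

# Most probable (likelihood, impact) class after simulation.
print(max(counts, key=counts.get))
```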

  3. Enterprise Risk Management and firm performance: an integrated model for the banking sector

    National Research Council Canada - National Science Library

    Alaa Soliman; Mukhtar Adam

    2017-01-01

    This study investigates how the implementation of an Enterprise Risk Management program affects the performance of firms using an Enterprise Risk Management model for the banking sector and an integrated...

  4. Using Probablilistic Risk Assessment to Model Medication System Failures in Long-Term Care Facilities

    National Research Council Canada - National Science Library

    Comden, Sharon C; Marx, David; Murphy-Carley, Margaret; Hale, Misti

    2005-01-01

    Objective: State agencies and Oregon's long-term care providers cosponsored this developmental study to explore the creation of two statewide medication system risk models using sociotechnical probabilistic risk assessment (ST-PRA...

  5. Risk prediction models for acute kidney injury following major noncardiac surgery: systematic review

    National Research Council Canada - National Science Library

    Wilson, Todd; Quan, Samuel; Cheema, Kim; Zarnke, Kelly; Quinn, Rob; de Koning, Lawrence; Dixon, Elijah; Pannu, Neesh; James, Matthew T

    Acute kidney injury (AKI) is a serious complication of major noncardiac surgery. Risk prediction models for AKI following noncardiac surgery may be useful for identifying high-risk patients to target with prevention strategies...

  6. I-RaCM: A Fully Integrated Risk and Lifecycle Cost Model Project

    Data.gov (United States)

    National Aeronautics and Space Administration — SpaceWorks Engineering, Inc. (SEI) proposes development of the Integrated Risk and Cost Model I-RaCM, as the innovation to meet the need for integrated cost and risk...

  7. Modelling and Prioritization of System Risks in Early Project Phases

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza

    2016-01-01

    The rising complexity of products and systems demands further attention to potential risks. While researchers explore tools and methods to identify system risk, its prioritization remains a challenging task in a multistakeholder environment. Hazard is the source of risk and causes harm. Harm may have

  8. Modeling Linkage Disequilibrium Increases Accuracy of Polygenic Risk Scores

    DEFF Research Database (Denmark)

    Vilhjálmsson, Bjarni J; Yang, Jian; Finucane, Hilary K

    2015-01-01

    Polygenic risk scores have shown great promise in predicting complex disease risk and will become more accurate as training sample sizes increase. The standard approach for calculating risk scores involves linkage disequilibrium (LD)-based marker pruning and applying a p value threshold to associ...

  9. Modeling Linkage Disequilibrium Increases Accuracy of Polygenic Risk Scores

    NARCIS (Netherlands)

    Vilhjálmsson, Bjarni J.; Yang, Jian; Finucane, Hilary K.; Gusev, Alexander; Lindström, Sara; Ripke, Stephan; Genovese, Giulio; Loh, Po-Ru; Bhatia, Gaurav; Do, Ron; Hayeck, Tristan; Won, Hong-Hee; Kathiresan, Sekar; Pato, Michele; Pato, Carlos; Tamimi, Rulla; Stahl, Eli; Zaitlen, Noah; Pasaniuc, Bogdan; Belbin, Gillian; Kenny, Eimear E.; Schierup, Mikkel H.; de Jager, Philip; Patsopoulos, Nikolaos A.; McCarroll, Steve; Daly, Mark; Purcell, Shaun; Chasman, Daniel; Neale, Benjamin; Goddard, Michael; Visscher, Peter M.; Kraft, Peter; Patterson, Nick; Price, Alkes L.; Neale, Benjamin M.; Corvin, Aiden; Walters, James T. R.; Farh, Kai-How; Holmans, Peter A.; Lee, Phil; Bulik-Sullivan, Brendan; Collier, David A.; Huang, Hailiang; Pers, Tune H.; Agartz, Ingrid; Agerbo, Esben; Albus, Margot; Alexander, Madeline; Amin, Farooq; Bacanu, Silviu A.; Begemann, Martin; Belliveau, Richard A.; Bene, Judit; Bergen, Sarah E.; Bevilacqua, Elizabeth; Bigdeli, Tim B.; Black, Donald W.; Bruggeman, Richard; Buccola, Nancy G.; Buckner, Randy L.; Byerley, William; Cahn, Wiepke; Cai, Guiqing; Campion, Dominique; Cantor, Rita M.; Carr, Vaughan J.; Carrera, Noa; Catts, Stanley V.; Chambert, Kimberly D.; Chan, Raymond C. K.; Chen, Ronald Y. L.; Chen, Eric Y. H.; Cheng, Wei; Cheung, Eric F. C.; Chong, Siow Ann; Cloninger, C. Robert; Cohen, David; Cohen, Nadine; Cormican, Paul; Craddock, Nick; Crowley, James J.; Curtis, David; Davidson, Michael; Davis, Kenneth L.; Degenhardt, Franziska; del Favero, Jurgen; DeLisi, Lynn E.; Demontis, Ditte; Dikeos, Dimitris; Dinan, Timothy; Djurovic, Srdjan; Donohoe, Gary; Drapeau, Elodie; Duan, Jubao; Dudbridge, Frank; Durmishi, Naser; Eichhammer, Peter; Eriksson, Johan; Escott-Price, Valentina; Essioux, Laurent; Fanous, Ayman H.; Farrell, Martilias S.; Frank, Josef; Franke, Lude; Freedman, Robert; Freimer, Nelson B.; Friedl, Marion; Friedman, Joseph I.; Fromer, Menachem; Georgieva, Lyudmila; Gershon, Elliot S.; Giegling, Ina; Giusti-Rodrguez, Paola; Godard, Stephanie; Goldstein, Jacqueline I.; Golimbet, Vera; Gopal, Srihari; Gratten, Jacob; Grove, Jakob; de Haan, Lieuwe; Hammer, Christian; Hamshere, Marian L.; Hansen, Mark; Hansen, Thomas; Haroutunian, Vahram; Hartmann, Annette M.; Henskens, Frans A.; Herms, Stefan; Hirschhorn, Joel N.; Hoffmann, Per; Hofman, Andrea; Hollegaard, Mads V.; Hougaard, David M.; Ikeda, Masashi; Joa, Inge; Julia, Antonio; Kahn, Rene S.; Kalaydjieva, Luba; Karachanak-Yankova, Sena; Karjalainen, Juha; Kavanagh, David; Keller, Matthew C.; Kelly, Brian J.; Kennedy, James L.; Khrunin, Andrey; Kim, Yunjung; Klovins, Janis; Knowles, James A.; Konte, Bettina; Kucinskas, Vaidutis; Kucinskiene, Zita Ausrele; Kuzelova-Ptackova, Hana; Kahler, Anna K.; Laurent, Claudine; Keong, Jimmy Lee Chee; Lee, S. Hong; Legge, Sophie E.; Lerer, Bernard; Li, Miaoxin; Li, Tao; Liang, Kung-Yee; Lieberman, Jeffrey; Limborska, Svetlana; Loughland, Carmel M.; Lubinski, Jan; Lnnqvist, Jouko; Macek, Milan; Magnusson, Patrik K. 
E.; Maher, Brion S.; Maier, Wolfgang; Mallet, Jacques; Marsal, Sara; Mattheisen, Manuel; Mattingsdal, Morten; McCarley, Robert W.; McDonald, Colm; McIntosh, Andrew M.; Meier, Sandra; Meijer, Carin J.; Melegh, Bela; Melle, Ingrid; Mesholam-Gately, Raquelle I.; Metspalu, Andres; Michie, Patricia T.; Milani, Lili; Milanova, Vihra; Mokrab, Younes; Morris, Derek W.; Mors, Ole; Mortensen, Preben B.; Murphy, Kieran C.; Murray, Robin M.; Myin-Germeys, Inez; Mller-Myhsok, Bertram; Nelis, Mari; Nenadic, Igor; Nertney, Deborah A.; Nestadt, Gerald; Nicodemus, Kristin K.; Nikitina-Zake, Liene; Nisenbaum, Laura; Nordin, Annelie; O'Callaghan, Eadbhard; O'Dushlaine, Colm; O'Neill, F. Anthony; Oh, Sang-Yun; Olincy, Ann; Olsen, Line; van Os, Jim; Pantelis, Christos; Papadimitriou, George N.; Papiol, Sergi; Parkhomenko, Elena; Pato, Michele T.; Paunio, Tiina; Pejovic-Milovancevic, Milica; Perkins, Diana O.; Pietilinen, Olli; Pimm, Jonathan; Pocklington, Andrew J.; Powell, John; Price, Alkes; Pulver, Ann E.; Purcell, Shaun M.; Quested, Digby; Rasmussen, Henrik B.; Reichenberg, Abraham; Reimers, Mark A.; Richards, Alexander L.; Roffman, Joshua L.; Roussos, Panos; Ruderfer, Douglas M.; Salomaa, Veikko; Sanders, Alan R.; Schall, Ulrich; Schubert, Christian R.; Schulze, Thomas G.; Schwab, Sibylle G.; Scolnick, Edward M.; Scott, Rodney J.; Seidman, Larry J.; Shi, Jianxin; Sigurdsson, Engilbert; Silagadze, Teimuraz; Silverman, Jeremy M.; Sim, Kang; Slominsky, Petr; Smoller, Jordan W.; So, Hon-Cheong; Spencer, Chris C. A.; Stahl, Eli A.; Stefansson, Hreinn; Steinberg, Stacy; Stogmann, Elisabeth; Straub, Richard E.; Strengman, Eric; Strohmaier, Jana; Stroup, T. Scott; Subramaniam, Mythily; Suvisaari, Jaana; Svrakic, Dragan M.; Szatkiewicz, Jin P.; Sderman, Erik; Thirumalai, Srinivas; Toncheva, Draga; Tooney, Paul A.; Tosato, Sarah; Veijola, Juha; Waddington, John; Walsh, Dermot; Wang, Dai; Wang, Qiang; Webb, Bradley T.; Weiser, Mark; Wildenauer, Dieter B.; Williams, Nigel M.; Williams, Stephanie; Witt, Stephanie H.; Wolen, Aaron R.; Wong, Emily H. M.; Wormley, Brandon K.; Wu, Jing Qin; Xi, Hualin Simon; Zai, Clement C.; Zheng, Xuebin; Zimprich, Fritz; Wray, Naomi R.; Stefansson, Kari; Adolfsson, Rolf; Andreassen, Ole A.; Blackwood, Douglas H. R.; Bramon, Elvira; Buxbaum, Joseph D.; Børglum, Anders D.; Cichon, Sven; Darvasi, Ariel; Domenici, Enrico; Ehrenreich, Hannelore; Esko, Tonu; Gejman, Pablo V.; Gill, Michael; Gurling, Hugh; Hultman, Christina M.; Iwata, Nakao; Jablensky, Assen V.; Jonsson, Erik G.; Kendler, Kenneth S.; Kirov, George; Knight, Jo; Lencz, Todd; Levinson, Douglas F.; Li, Qingqin S.; Liu, Jianjun; Malhotra, Anil K.; McCarroll, Steven A.; McQuillin, Andrew; Moran, Jennifer L.; Mowry, Bryan J.; Nthen, Markus M.; Ophoff, Roel A.; Owen, Michael J.; Palotie, Aarno; Pato, Carlos N.; Petryshen, Tracey L.; Posthuma, Danielle; Rietschel, Marcella; Riley, Brien P.; Rujescu, Dan; Sham, Pak C.; Sklar, Pamela; St Clair, David; Weinberger, Daniel R.; Wendland, Jens R.; Werge, Thomas; Daly, Mark J.; Sullivan, Patrick F.; O'Donovan, Michael C.; Hunter, David J.; Adank, Muriel; Ahsan, Habibul; Aittomäki, Kristiina; Baglietto, Laura; Berndt, Sonja; Blomquist, Carl; Canzian, Federico; Chang-Claude, Jenny; Chanock, Stephen J.; Crisponi, Laura; Czene, Kamila; Dahmen, Norbert; Silva, Isabel Dos Santos; Easton, Douglas; Eliassen, A. 
Heather; Figueroa, Jonine; Fletcher, Olivia; Garcia-Closas, Montserrat; Gaudet, Mia M.; Gibson, Lorna; Haiman, Christopher A.; Hall, Per; Hazra, Aditi; Hein, Rebecca; Henderson, Brian E.; Hofman, Albert; Hopper, John L.; Irwanto, Astrid; Johansson, Mattias; Kaaks, Rudolf; Kibriya, Muhammad G.; Lichtner, Peter; Lund, Eiliv; Makalic, Enes; Meindl, Alfons; Meijers-Heijboer, Hanne; Müller-Myhsok, Bertram; Muranen, Taru A.; Nevanlinna, Heli; Peeters, Petra H.; Peto, Julian; Prentice, Ross L.; Rahman, Nazneen; Sánchez, María José; Schmidt, Daniel F.; Schmutzler, Rita K.; Southey, Melissa C.; Travis, Ruth; Turnbull, Clare; Uitterlinden, Andre G.; van der Luijt, Rob B.; Waisfisz, Quinten; Wang, Zhaoming; Whittemore, Alice S.; Yang, Rose; Zheng, Wei

    2015-01-01

    Polygenic risk scores have shown great promise in predicting complex disease risk and will become more accurate as training sample sizes increase. The standard approach for calculating risk scores involves linkage disequilibrium (LD)-based marker pruning and applying a p value threshold to
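
    For reference, the standard pruning-and-thresholding score that both of these records describe can be sketched as an effect-size-weighted sum of risk-allele counts over SNPs passing a p-value threshold; the paper's contribution, modeling LD explicitly, replaces this hard selection with posterior mean effect sizes. The arrays below are illustrative only.

```python
import numpy as np

# Toy GWAS summary statistics for 4 SNPs (assumed already LD-pruned).
effect_sizes = np.array([0.12, -0.08, 0.30, 0.05])   # e.g. log odds ratios
p_values     = np.array([1e-8, 0.2, 1e-5, 0.04])

# Risk-allele counts (0/1/2) for 2 individuals.
genotypes = np.array([
    [2, 1, 0, 1],
    [0, 2, 1, 2],
])

keep = p_values < 1e-4                               # p-value thresholding
prs = genotypes[:, keep] @ effect_sizes[keep]        # per-individual score
print(prs)
```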

  10. Risk and Management Control: A Partial Least Square Modelling Approach

    DEFF Research Database (Denmark)

    Nielsen, Steen; Pontoppidan, Iens Christian

    Risk and economic theory go many years back (e.g. to Keynes & Knight, 1921) and risk/uncertainty is one of the explanations for the existence of the firm (Coase, 1937). The financial crisis of the past years has re-accentuated risk and the need for coherence and interrela... ... techniques are in many respects a major impact factor.

  11. An integrated breast cancer risk assessment and management model based on fuzzy cognitive maps.

    Science.gov (United States)

    Subramanian, Jayashree; Karmegam, Akila; Papageorgiou, Elpiniki; Papandrianos, Nikolaos; Vasukie, A

    2015-03-01

    There is a growing demand for women to be classified into different risk groups of developing breast cancer (BC). The focus of the reported work is on the development of an integrated risk prediction model using a two-level fuzzy cognitive map (FCM) model. The proposed model combines the results of the initial screening mammogram of the given woman with her demographic risk factors to predict the post-screening risk of developing BC. The level-1 FCM models the demographic risk profile. A nonlinear Hebbian learning algorithm is used to train this model and thus help predict the BC risk grade based on demographic risk factors identified by domain experts. The risk grades estimated by the proposed model are validated using two standard BC risk assessment models, viz. Gail and Tyrer-Cuzick. The level-2 FCM models the features of the screening mammogram concerning normal, benign and malignant cases. The data-driven nonlinear Hebbian learning algorithm (DDNHL) is used to train this model in order to predict the BC risk grade based on these mammographic image features. An overall risk grade is calculated by combining the outcomes of these two FCMs. The main limitation of the Gail model of underestimating the risk level of women with strong family history is overcome by the proposed model. IBIS is a hard computing tool based on the Tyrer-Cuzick model that is comprehensive enough in covering a wide range of demographic risk factors including family history, but it generates results in terms of a numeric risk score based on predefined formulae. Thus the outcome is difficult to interpret by naive users. Besides, these models are based only on the demographic details and do not take into account the findings of the screening mammogram. The proposed integrated model overcomes the above described limitations of the existing models and predicts the risk level in terms of qualitative grades. The predictions of the proposed NHL-FCM model comply with the Tyrer-Cuzick model for 36 out of
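
    A generic fuzzy cognitive map inference step of the kind used in such two-level models iterates concept activations through a squashed weighted sum until they settle. The weight matrix, initial activations, and sigmoid steepness below are illustrative placeholders, not the trained level-1 or level-2 maps of the paper.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

# W[i, j]: assumed causal influence of concept i on concept j.
W = np.array([
    [0.0, 0.6, -0.3],
    [0.0, 0.0,  0.8],
    [0.0, 0.0,  0.0],
])
A = np.array([0.7, 0.2, 0.5])   # initial activations (e.g. risk factor levels)

# Iterate the standard FCM update A <- f(A + W^T A) toward a fixed point.
for _ in range(20):
    A = sigmoid(A + W.T @ A)

print(A[-1])                    # activation of the output (risk grade) concept
```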

  12. Gender attitudes, sexual power, HIV risk: a model for understanding HIV risk behavior of South African men.

    Science.gov (United States)

    Kaufman, Michelle R; Shefer, Tamara; Crawford, Mary; Simbayi, Leickness C; Kalichman, Seth C

    2008-04-01

    The Gender Attitudes-Power-Risk (GAPR) model of HIV risk behavior was tested using survey data collected from among 309 men who were attending STI services in a primary health care clinic in Cape Town, South Africa. Results showed that negative attitudes towards women were significantly positively associated with a high level of HIV risk behavior, and that endorsement of traditional male roles was negatively associated with HIV risk behavior. Endorsement of traditional male gender roles was also inversely related to relationship control but positively to a high degree of decision-making dominance in one's relationship. Sexual relationship power did not significantly mediate the relationships between gender attitudes and HIV risk behavior. A better understanding of gender roles and ideologies in combination with one's power in sexual relationships as they relate to HIV risk behavior among men could better inform future HIV prevention interventions.

  13. Qualitative Event-based Diagnosis with Possible Conflicts: Case Study on the Third International Diagnostic Competition

    Data.gov (United States)

National Aeronautics and Space Administration — We describe two model-based diagnosis algorithms entered into the Third International Diagnostic Competition. We focus on the first diagnostic problem of the...

  14. People's Risk Recognition Preceding Evacuation and Its Role in Demand Modeling and Planning.

    Science.gov (United States)

    Urata, Junji; Pel, Adam J

    2017-10-30

Evacuation planning and management involves estimating the travel demand in the event that such action is required. This is usually done as a function of people's decision to evacuate, which we show is strongly linked to their risk awareness. We use an empirical data set on tsunami evacuation behavior to demonstrate that risk recognition is not synonymous with objective risk, but is instead determined by a combination of factors including risk education, information, and sociodemographics, and that it changes dynamically over time. Based on these findings, we formulate an ordered logit model to describe risk recognition combined with a latent class model to describe evacuation choices. Our proposed evacuation choice model along with a risk recognition class can quantitatively evaluate the influence of disaster mitigation measures, risk education, and risk information. The results obtained from the risk recognition model show that risk information has a greater impact, in the sense that people better recognize their high risk. The results of the evacuation choice model show that people who are unaware of their risk take a longer time to evacuate. © 2017 Society for Risk Analysis.
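
A minimal sketch of the ordered logit component described above, assuming hypothetical covariates (risk education, information received, scaled age), coefficients, and cutpoints; the latent class layer and the estimation procedure are omitted.

```python
import numpy as np
from scipy.stats import logistic

def ordered_logit_probs(x, beta, cuts):
    """Class probabilities for an ordered logit: P(y <= k) = F(c_k - x'beta)."""
    eta = float(np.dot(x, beta))
    c = np.concatenate(([-np.inf], np.sort(cuts), [np.inf]))
    cum = logistic.cdf(c - eta)   # cumulative probabilities P(y <= k)
    return np.diff(cum)           # P(y == k) for k = 0 .. K-1

# Hypothetical covariates and parameters (illustrative only).
x = np.array([1.0, 1.0, 0.4])
beta = np.array([0.8, 1.2, -0.3])
cuts = np.array([-0.5, 1.0, 2.5])          # K = 4 ordered risk-recognition levels
print(ordered_logit_probs(x, beta, cuts))  # sums to 1
```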

  15. Tsunami Risk Assessment Modelling in Chabahar Port, Iran

    Science.gov (United States)

    Delavar, M. R.; Mohammadi, H.; Sharifi, M. A.; Pirooz, M. D.

    2017-09-01

The well-known historical tsunami in the Makran Subduction Zone (MSZ) region was generated by the earthquake of November 28, 1945 on the Makran Coast in the north of the Oman Sea. This destructive tsunami killed over 4,000 people in southern Pakistan and India, and caused great loss of life and devastation along the coasts of western India, Iran and Oman. According to the report "Remembering the 1945 Makran Tsunami", compiled by the Intergovernmental Oceanographic Commission (UNESCO/IOC), the maximum inundation at Chabahar port was 367 m inland, at a height of 3.6 meters above sea level. In addition, the maximum inundation at Pasni (Pakistan) reached 3 km from the coastline. For the two beaches of Gujarat (India) and Oman the maximum run-up height was 3 m above sea level. In this paper, we first use the Makran 1945 seismic parameters to simulate the tsunami in its generation, propagation and inundation phases. The effect of the tsunami on Chabahar port is simulated using the ComMIT model, which is based on the Method of Splitting Tsunami (MOST). In this process the results are compared with documented eyewitness accounts and several research reports for calibration and validation. Next, we use the model to perform a risk assessment for Chabahar port in the south of Iran under the worst-case tsunami scenario. The simulated results show that the tsunami waves would reach the Chabahar coastline 11 minutes after generation and that 9 minutes later, over 9.4 km² of dry land would be flooded, with maximum wave amplitude reaching up to 30 meters.

  16. Geographical modeling of exposure risk to cyanobacteria for epidemiological purposes.

    Science.gov (United States)

    Serrano, Tania; Dupas, Rémi; Upegui, Erika; Buscail, Camille; Grimaldi, Catherine; Viel, Jean François

    2015-08-01

The cyanobacteria-derived neurotoxin β-methylamino-L-alanine (BMAA) represents a plausible environmental trigger for amyotrophic lateral sclerosis (ALS), a debilitating and fatal neuromuscular disease. With the eutrophication of water bodies, cyanobacterial blooms and their toxins are becoming increasingly prevalent in France, especially in the Brittany region. Cyanobacteria are monitored at only a few recreational sites, preventing an estimation of exposure of the human population. By contrast, phosphorus, a limiting nutrient for cyanobacterial growth and thus considered a good proxy for cyanobacteria exposure, is monitored in many but not all surface water bodies. Our goal was to develop a geographic exposure indicator that could be used in epidemiological research. We considered the total phosphorus (TP) concentration (mg/L) of samples collected between October 2007 and September 2012 at 179 monitoring stations distributed throughout the Brittany region. Using readily available spatial data, we computed environmental descriptors at the watershed level with a Geographic Information System. Then, these descriptors were introduced into a backward stepwise linear regression model to predict the median TP concentration in unmonitored surface water bodies. TP concentrations in surface water follow an increasing gradient from west to east and inland to coast. The empirical concentration model included five predictor variables with a fair coefficient of determination (R² = 0.51). The specific total runoff and the watershed slope correlated negatively with the TP concentrations. The resulting indicator of cyanobacteria exposure can be used along with other risk factors in further ALS epidemiologic case-control studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
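
The backward stepwise selection step can be sketched as below, dropping the least significant watershed descriptor until every remaining p-value clears a threshold. The synthetic data, descriptor names, and significance level are placeholders, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 179  # one row per monitoring station, mirroring the study design
X = pd.DataFrame(rng.normal(size=(n, 4)),
                 columns=["runoff", "slope", "agri_share", "pop_density"])
y = 0.3 - 0.2 * X["runoff"] - 0.1 * X["slope"] + rng.normal(0, 0.1, n)

def backward_stepwise(X, y, alpha=0.05):
    """Repeatedly drop the predictor with the largest p-value above alpha."""
    keep = list(X.columns)
    while len(keep) > 1:
        fit = sm.OLS(y, sm.add_constant(X[keep])).fit()
        pvals = fit.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha:
            break
        keep.remove(worst)
    return sm.OLS(y, sm.add_constant(X[keep])).fit(), keep

fit, selected = backward_stepwise(X, y)
print(selected, round(fit.rsquared, 2))
```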

  17. TSUNAMI RISK ASSESSMENT MODELLING IN CHABAHAR PORT, IRAN

    Directory of Open Access Journals (Sweden)

    M. R. Delavar

    2017-09-01

Full Text Available The well-known historical tsunami in the Makran Subduction Zone (MSZ) region was generated by the earthquake of November 28, 1945 on the Makran Coast in the north of the Oman Sea. This destructive tsunami killed over 4,000 people in southern Pakistan and India, and caused great loss of life and devastation along the coasts of western India, Iran and Oman. According to the report "Remembering the 1945 Makran Tsunami", compiled by the Intergovernmental Oceanographic Commission (UNESCO/IOC), the maximum inundation at Chabahar port was 367 m inland, at a height of 3.6 meters above sea level. In addition, the maximum inundation at Pasni (Pakistan) reached 3 km from the coastline. For the two beaches of Gujarat (India) and Oman the maximum run-up height was 3 m above sea level. In this paper, we first use the Makran 1945 seismic parameters to simulate the tsunami in its generation, propagation and inundation phases. The effect of the tsunami on Chabahar port is simulated using the ComMIT model, which is based on the Method of Splitting Tsunami (MOST). In this process the results are compared with documented eyewitness accounts and several research reports for calibration and validation. Next, we use the model to perform a risk assessment for Chabahar port in the south of Iran under the worst-case tsunami scenario. The simulated results show that the tsunami waves would reach the Chabahar coastline 11 minutes after generation and that 9 minutes later, over 9.4 km² of dry land would be flooded, with maximum wave amplitude reaching up to 30 meters.

  18. GIS modeling for canine dirofilariosis risk assessment in central Italy

    Directory of Open Access Journals (Sweden)

    Michele Mortarino

    2008-05-01

Full Text Available A survey was conducted in an area of central Italy in order to study the prevalence of Dirofilaria immitis and D. repens in dogs. Blood samples were collected from 283 dogs and examined using a modified Knott's technique. In addition, in order to detect D. immitis occult infection, 203 serum samples were also analysed for D. immitis antigen detection. The results were analyzed in order to evaluate behavioural and attitudinal risk factors. A geographical information system (GIS) for the study area was constructed, utilizing the following data layers: administrative boundaries, elevation, temperature, rainfall and humidity. Microfilariae were detected in 32 of the 283 dogs surveyed, constituting a total Dirofilaria prevalence of 11.3%. In particular, 20 dogs (7.1%) were positive for D. immitis and 12 dogs (4.2%) for D. repens microfilariae. One case of D. immitis occult infection was also detected. Choroplethic municipal maps were drawn within the GIS in order to display the distribution of each Dirofilaria species in the study area. Statistical analysis showed a significant association between Dirofilaria infection and animal attitude (hunting/truffle dogs showed a higher prevalence compared to guard/pet dogs). A higher prevalence was also recorded in 2- to 5-year-old dogs. Furthermore, a GIS-based modelling of climatic data, collected from 5 meteorological stations in the study area, was performed to estimate the yearly number of D. immitis generations in the mosquito vector. The results of the model as depicted by GIS analysis were highly concordant with the territorial distribution of positive dogs and showed that D. immitis spread is markedly influenced by season. The potential transmission period in the study area was found to be confined to the summer months, with a peak in July and August, as expected for a temperate region where the summer season is the most favourable period for the parasite.

  19. Ecological models for regulatory risk assessments of pesticides: Developing a strategy for the future.

    NARCIS (Netherlands)

    Thorbek, P.; Forbes, V.; Heimbach, F.; Hommen, U.; Thulke, H.H.; Brink, van den P.J.

    2010-01-01

    Ecological Models for Regulatory Risk Assessments of Pesticides: Developing a Strategy for the Future provides a coherent, science-based view on ecological modeling for regulatory risk assessments. It discusses the benefits of modeling in the context of registrations, identifies the obstacles that

  20. Identifiability of the Sign of Covariate Effects in the Competing Risks Model

    DEFF Research Database (Denmark)

    Lo, Simon M.S.; Wilke, Ralf

    2017-01-01

    We present a new framework for the identification of competing risks models, which also include Roy models. We show that by establishing a Hicksian-type decomposition, the direction of covariate effects on the marginal distributions of the competing risks model can be identified under weak restri...

  1. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the frame of the EU project PRATIQUE (KBBE-2007-212459 Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  2. Identifiability of the Sign of Covariate Effects in the Competing Risks Model

    DEFF Research Database (Denmark)

    Lo, Simon M.S.; Wilke, Ralf

    2017-01-01

    We present a new framework for the identification of competing risks models, which also include Roy models. We show that by establishing a Hicksian-type decomposition, the direction of covariate effects on the marginal distributions of the competing risks model can be identified under weak...

  3. Lifetime and 5 years risk of breast cancer and attributable risk factor according to Gail model in Iranian women

    Directory of Open Access Journals (Sweden)

    Abolfazl Mohammadbeigi

    2015-01-01

Full Text Available Introduction: Breast cancer is the most commonly diagnosed cancer in women worldwide and in Iran. It is expected to account for 29% of all new cancers in women in 2015. This study aimed to assess the 5-year and lifetime risks of breast cancer according to the Gail model, and to evaluate the effect of additional risk factors on the Gail risk. Materials and Methods: A cross-sectional study was conducted on 296 women aged over 34 years in Qom, in the center of Iran. The Breast Cancer Risk Assessment Tool calculated the Gail risk for each subject. Data were analyzed by paired t-test, independent t-test, and analysis of variance in a bivariate approach to evaluate the effect of each factor on the Gail risk. Multiple linear regression models with the stepwise method were used to predict the effect of each variable on the Gail risk. Results: The mean age of the participants was 47.8 ± 8.8 years and 47% were of Fars ethnicity. The 5-year and lifetime risks were 0.37 ± 0.18 and 4.48 ± 0.925%, respectively. These were lower than the average risk for women of the same race and age (P < 0.001). Being single, a positive family history of breast cancer, a positive history of biopsy or radiotherapy, and the use of nonhormonal contraceptives were related to higher lifetime risk (P < 0.05). Moreover, a significant direct correlation was observed between lifetime risk and body mass index, age at first live birth, and menarche age, while an inverse correlation was observed between lifetime risk of breast cancer and total months of breastfeeding and age. Conclusion: Based on our results, the 5-year and lifetime risks of breast cancer according to the Gail model were lower than those for women of the same race and age. Moreover, compared with national epidemiologic indicators of breast cancer morbidity and mortality, the Gail model seems to overestimate the risk of breast cancer in Iranian women.

  4. Identifying High-Risk Women for Endometrial Cancer Prevention Strategies: Proposal of an Endometrial Cancer Risk Prediction Model.

    Science.gov (United States)

    Kitson, Sarah J; Evans, D Gareth; Crosbie, Emma J

    2017-01-01

    Already the fourth most common cancer in women in the developed world, the incidence of endometrial cancer is increasing rapidly, in line with the increasing prevalence of obesity. Relatively few studies have been undertaken of risk-reducing interventions aimed at limiting the impact of the disease on both individuals and the health service. Those that have been performed have demonstrated only modest results due to their application in relatively unselected populations. A validated risk prediction model is therefore urgently required to identify individuals at particularly high risk of endometrial cancer who may benefit from targeted primary prevention strategies and to guide trial eligibility. On the basis of a systematic review of the literature, the evidence for inclusion of measures of obesity, reproduction, insulin resistance, and genetic risk in such a model is discussed, and the strength of association between these risk factors and endometrial cancer is used to guide the development of a pragmatic risk prediction scoring system that could be implemented in the general population. Provisional cutoff values are described pending refinement of the model and external validation in large prospective cohorts. Potential risk-reducing interventions are suggested, highlighting the need for future studies in this area if the increasing tide of endometrial cancer is to be stemmed. Cancer Prev Res; 10(1); 1-13. ©2016 AACR. ©2016 American Association for Cancer Research.

  5. Study on the Confidence and Reliability of the Mean Seismic Probability Risk Model

    OpenAIRE

    Wang, Xiao-Lei; Lu, Da-Gang

    2017-01-01

The mean seismic probability risk model has been widely used in seismic design and safety evaluation of critical infrastructures. In this paper, the confidence levels of the mean seismic probability risk model are analyzed and its error equations are derived. It is found that the confidence levels and error values of the mean seismic probability risk model vary across sites, and that the confidence levels are low and the error values are large for most sites. Meanwhile...

  6. Assessing portfolio market risk in the BRICS economies: use of multivariate GARCH models

    OpenAIRE

    Bonga-Bonga, Lumengo; Nleya, Lebogang

    2016-01-01

    This paper compares the performance of the different models used to estimate portfolio value-at-risk (VaR) in the BRICS economies. Portfolio VaR is estimated with three different multivariate risk models, namely the constant conditional correlation (CCC), the dynamic conditional correlation (DCC) and asymmetric DCC (ADCC) GARCH models. Risk performance measures such as the average deviations, quadratic probability function score and the root mean square error are used to back-test the perform...
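
Whichever multivariate GARCH variant supplies the forecast covariance matrix, the parametric portfolio VaR itself reduces to a one-liner. A sketch under Gaussian assumptions, with an illustrative constant-correlation matrix standing in for a fitted CCC/DCC/ADCC forecast:

```python
import numpy as np
from scipy.stats import norm

def parametric_var(w, mu, H, alpha=0.01):
    """One-step-ahead portfolio VaR at level alpha, given a forecast mean
    vector mu and conditional covariance matrix H; reported as a positive loss."""
    m = w @ mu
    s = np.sqrt(w @ H @ w)
    return -(m + norm.ppf(alpha) * s)

# H would come from a CCC/DCC/ADCC-GARCH fit; a fixed matrix stands in here.
w = np.array([0.25, 0.25, 0.25, 0.25])            # equally weighted portfolio
mu = np.array([0.0004, 0.0003, 0.0005, 0.0002])   # daily mean returns (illustrative)
sd = np.array([0.015, 0.020, 0.018, 0.022])       # conditional std. deviations
R = np.full((4, 4), 0.3) + 0.7 * np.eye(4)        # constant correlation, as in CCC
H = np.outer(sd, sd) * R
print(f"99% one-day VaR: {parametric_var(w, mu, H):.4f}")
```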

  7. A Social Ecological Model of Syndemic Risk affecting Women with and At-Risk for HIV in Impoverished Urban Communities.

    Science.gov (United States)

    Batchelder, A W; Gonzalez, J S; Palma, A; Schoenbaum, E; Lounsbury, D W

    2015-12-01

    Syndemic risk is an ecological construct, defined by co-occurring interdependent socio-environmental, interpersonal and intrapersonal determinants. We posited syndemic risk to be a function of violence, substance use, perceived financial hardship, emotional distress and self-worth among women with and at-risk for HIV in an impoverished urban community. In order to better understand these interrelationships, we developed and validated a system dynamics (SD) model based upon peer-reviewed literature; secondary data analyses of a cohort dataset including women living with and at-risk of HIV in Bronx, NY (N = 620); and input from a Bronx-based community advisory board. Simulated model output revealed divergent levels and patterns of syndemic risk over time across different sample profiles. Outputs generated new insights about how to effectively explore multicomponent multi-level programs in order to strategically develop more effective services for this population. Specifically, the model indicated that effective multi-level interventions might bolster women's resilience by increasing self-worth, which may result in decreased perceived financial hardship and risk of violence. Overall, our stakeholder-informed model depicts how self-worth may be a major driver of vulnerability and a meaningful addition to syndemic theory affecting this population.

  8. Bayesian network as a modelling tool for risk management in agriculture

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Madsen, Anders Læsø; Lund, Mogens

    The importance of risk management increases as farmers become more exposed to risk. But risk management is a difficult topic because income risk is the result of the complex interaction of multiple risk factors combined with the effect of an increasing array of possible risk management tools....... In this paper we use Bayesian networks as an integrated modelling approach for representing uncertainty and analysing risk management in agriculture. It is shown how historical farm account data may be efficiently used to estimate conditional probabilities, which are the core elements in Bayesian network models....... We further show how the Bayesian network model RiBay is used for stochastic simulation of farm income, and we demonstrate how RiBay can be used to simulate risk management at the farm level. It is concluded that the key strength of a Bayesian network is the transparency of assumptions...

  9. Value-at-risk modeling and forecasting with range-based volatility models: empirical evidence

    Directory of Open Access Journals (Sweden)

    Leandro dos Santos Maciel

Full Text Available ABSTRACT This article considers range-based volatility modeling for identifying and forecasting conditional volatility models based on returns. It suggests the inclusion of a range measure, defined as the difference between the maximum and minimum price of an asset within a time interval, as an exogenous variable in generalized autoregressive conditional heteroscedasticity (GARCH) models. The motivation is to evaluate whether the range provides additional information to the volatility process (intraday variability) and improves forecasting, when compared to GARCH-type approaches and the conditional autoregressive range (CARR) model. The empirical analysis uses data from the main stock market indexes of the U.S. and Brazilian economies, i.e. the S&P 500 and IBOVESPA, respectively, within the period from January 2004 to December 2014. Performance is compared in terms of accuracy, by means of value-at-risk (VaR) modeling and forecasting. The out-of-sample results indicate that range-based volatility models provide more accurate VaR forecasts than GARCH models.
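
For concreteness, the classic Parkinson (1980) estimator is one standard way to turn the high-low range into a volatility measure before feeding it to a GARCH-type model as an exogenous regressor; the price data below are made up.

```python
import numpy as np

def parkinson_volatility(high, low):
    """Parkinson (1980) range-based estimator:
    sigma^2 = mean(log(H/L)^2) / (4 * log(2))."""
    r = np.log(np.asarray(high) / np.asarray(low))
    return np.sqrt(np.mean(r ** 2) / (4.0 * np.log(2.0)))

# Illustrative daily highs/lows for an index level near 100.
high = np.array([101.2, 100.8, 102.1, 101.5])
low = np.array([99.5, 99.9, 100.4, 100.2])
print(parkinson_volatility(high, low))  # per-day volatility estimate
```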

  10. Systemic risk: the dynamics of model banking systems.

    Science.gov (United States)

    May, Robert M; Arinaminpathy, Nimalan

    2010-05-06

    The recent banking crises have made it clear that increasingly complex strategies for managing risk in individual banks have not been matched by corresponding attention to overall systemic risks. We explore some simple mathematical caricatures for 'banking ecosystems', with emphasis on the interplay between the characteristics of individual banks (capital reserves in relation to total assets, etc.) and the overall dynamical behaviour of the system. The results are discussed in relation to potential regulations aimed at reducing systemic risk.
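
A toy version of such a "banking ecosystem" experiment can be run as a default cascade on an interbank exposure matrix; the ring topology, exposure sizes, and capital levels below are illustrative inventions, not the authors' model.

```python
import numpy as np

def default_cascade(L, capital, seed_bank, lgd=1.0):
    """Toy contagion: L[i, j] is bank i's exposure to bank j; a bank fails
    when credit losses from failed counterparties exhaust its capital."""
    n = len(capital)
    failed = np.zeros(n, dtype=bool)
    failed[seed_bank] = True
    while True:
        losses = lgd * (L * failed[None, :]).sum(axis=1)
        newly = (~failed) & (losses >= capital)
        if not newly.any():
            return failed
        failed |= newly

# Five banks in a ring: each lends 10 to its neighbour (illustrative numbers).
L = 10.0 * np.roll(np.eye(5), 1, axis=1)
capital = np.array([8.0, 12.0, 8.0, 8.0, 8.0])
print(default_cascade(L, capital, seed_bank=0))  # the well-capitalized bank survives
```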

  11. Measuring the coupled risks: A copula-based CVaR model

    Science.gov (United States)

    He, Xubiao; Gong, Pu

    2009-01-01

Integrated risk management for financial institutions requires an approach for aggregating risk types (such as market and credit) whose distributional shapes vary considerably. Financial institutions often ignore the coupling influence of risks and thus underestimate the overall financial risk. We constructed a copula-based Conditional Value-at-Risk (CVaR) model for market and credit risks. This technique allows us to incorporate realistic marginal distributions that capture essential empirical features of these risks, such as skewness and fat tails, while allowing for a rich dependence structure. Finally, a numerical simulation method is used to implement the model. Our results indicate that the coupled risks for a listed company's stock may be undervalued if credit risk is ignored, especially for listed companies with bad credit quality.
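
A minimal Monte Carlo sketch of the idea, assuming a Gaussian copula with a thin-tailed normal marginal for market losses and a fat-tailed Student-t marginal for credit losses (the paper's exact marginals and copula family may differ); VaR and CVaR are read off the simulated aggregate loss.

```python
import numpy as np
from scipy.stats import norm, t

def gaussian_copula_cvar(rho=0.4, alpha=0.99, n=200_000, seed=0):
    """VaR and CVaR of an aggregate market + credit loss whose dependence
    is a Gaussian copula and whose marginals differ in tail shape."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = norm.cdf(z)                                  # copula samples in [0, 1]^2
    market = norm.ppf(u[:, 0], loc=0.0, scale=0.02)  # thin-tailed marginal
    credit = 0.01 * t.ppf(u[:, 1], df=4)             # fat-tailed marginal
    loss = market + credit
    var = np.quantile(loss, alpha)
    cvar = loss[loss >= var].mean()                  # mean loss beyond VaR
    return var, cvar

print(gaussian_copula_cvar())
```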

  12. A Fuzzy Comprehensive Evaluation Model for Sustainability Risk Evaluation of PPP Projects

    Directory of Open Access Journals (Sweden)

    Libiao Bai

    2017-10-01

Full Text Available Evaluating the sustainability risk level of public–private partnership (PPP) projects can reduce project risk incidents and achieve the sustainable development of the organization. However, the existing studies on PPP project risk management mainly focus on exploring the impact of financial and revenue risks but ignore sustainability risks, causing the concept of "sustainability" to be missing when evaluating the risk level of PPP projects. To evaluate the sustainability risk level, and to achieve the most important objective of providing a reference for the public and private sectors when making decisions on PPP project management, this paper constructs a factor system of sustainability risk of PPP projects based on an extensive literature review and develops a mathematical model based on the fuzzy comprehensive evaluation model (FCEM) and failure mode, effects and criticality analysis (FMECA) for evaluating the sustainability risk level of PPP projects. In addition, this paper conducts a computational experiment based on a questionnaire survey to verify the effectiveness and feasibility of the proposed model. The results suggest that this model is reasonable for evaluating the sustainability risk level of PPP projects. To our knowledge, this paper is the first study to evaluate the sustainability risk of PPP projects; it not only enriches the theories of project risk management, but also serves as a reference for the public and private sectors in sustainable planning and development. Keywords: sustainability risk eva
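
The core FCEM composition step is a weighted aggregation of a fuzzy membership matrix. A sketch with made-up factor weights, membership degrees, and three risk grades:

```python
import numpy as np

# Factor weights (e.g., from expert scoring) and fuzzy membership matrix R,
# where R[i, j] is the degree to which risk factor i belongs to grade j.
w = np.array([0.3, 0.5, 0.2])
R = np.array([[0.1, 0.6, 0.3],
              [0.2, 0.5, 0.3],
              [0.4, 0.4, 0.2]])

B = w @ R                    # weighted-average composition operator M(*, +)
B = B / B.sum()              # normalize the evaluation vector
grade = int(B.argmax())      # maximum-membership principle
print(B, "-> grade", grade)  # grades might read low / medium / high
```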

  13. Probabilistic forecasting of extreme weather events based on extreme value theory

    Science.gov (United States)

    Van De Vyver, Hans; Van Schaeybroeck, Bert

    2016-04-01

Extreme events in weather and climate, such as high wind gusts, heavy precipitation or extreme temperatures, are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X, Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance wind speed), and study the combined events in which Y exceeds a high threshold y and the corresponding forecast X exceeds a high forecast threshold x. More specifically, two problems are addressed: (1) Given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}. (2) Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis in Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X, Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008). Coles, S. (2001) An Introduction to Statistical Modelling of Extreme Values. Springer-Verlag. Ferro, C.A.T. (2007) A probability model for verifying deterministic
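
Before fitting a parametric bivariate tail model, Problem 1 can be approximated empirically by pooling forecast-observation pairs near x_0, as in this sketch with synthetic wind-speed data; the tolerance, thresholds, and distributions are arbitrary choices, and a bivariate extreme-value fit would replace the counting far in the tails, where such pairs become too rare.

```python
import numpy as np

def conditional_exceedance(forecast, observed, x0, y, tol=0.5):
    """Empirical estimate of Pr(Y > y | X = x0), pooling forecasts near x0."""
    near = np.abs(forecast - x0) <= tol
    if near.sum() == 0:
        return np.nan
    return float((observed[near] > y).mean())

rng = np.random.default_rng(3)
x = rng.gamma(4.0, 3.0, size=5000)          # synthetic wind-speed forecasts
obs = x + rng.normal(0.0, 2.0, size=5000)   # observations with forecast error
print(conditional_exceedance(x, obs, x0=20.0, y=18.0))
```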

  14. Space Weather Influence on Power Systems: Prediction, Risk Analysis, and Modeling

    Science.gov (United States)

    Yatsenko, Vitaliy

    2016-04-01

This report concentrates on dynamic probabilistic risk analysis of optical elements for complex characterization of damage, using a physical model of solid-state lasers and predictable levels of ionizing radiation and space weather. The following main subjects are covered: (a) a solid-state laser model; (b) mathematical models for dynamic probabilistic risk assessment; and (c) software for modeling and prediction of ionizing radiation. A probabilistic risk assessment method for solid-state lasers is presented with consideration of deterministic and stochastic factors. Probabilistic risk assessment is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in solid-state lasers for the purpose of cost-effectively improving their safety and performance. The method is based on the Conditional Value-at-Risk measure (CVaR) and the expected loss exceeding Value-at-Risk (VaR). We propose a new dynamical-information approach for assessing the risk of radiation damage to laser elements by cosmic radiation. Our approach includes the following steps: laser modeling, modeling of ionizing radiation influences on laser elements, probabilistic risk assessment methods, and risk minimization. For computer simulation of damage processes at microscopic and macroscopic levels the following methods are used: (a) statistical; (b) dynamical; (c) optimization; (d) acceleration modeling; and (e) mathematical modeling of laser functioning. Mathematical models of the influence of space ionizing radiation on laser elements were developed for risk assessment in laser safety analysis. This is a so-called 'black box' or 'input-output' model, which seeks only to reproduce the behaviour of the system's output in response to changes in its inputs. The model inputs are radiation influences on laser systems and the output parameters are dynamical characteristics of the solid-state laser. Algorithms and software for optimal structure and parameters of

  15. How crucial is it to account for the antecedent moisture conditions in flood forecasting? Comparison of event-based and continuous approaches on 178 catchments

    Directory of Open Access Journals (Sweden)

    L. Berthet

    2009-06-01

Full Text Available This paper compares event-based and continuous hydrological modelling approaches for real-time forecasting of river flows. Both approaches are compared using a lumped hydrologic model (whose structure includes a soil moisture accounting (SMA) store and a routing store) on a data set of 178 French catchments. The main focus of this study was to investigate the actual impact of soil moisture initial conditions on the performance of flood forecasting models and the possible compensations with updating techniques. The rainfall-runoff model assimilation technique we used does not impact the SMA component of the model but only its routing part. Tests were made by running the SMA store continuously or on an event basis, everything else being equal. The results show that the continuous approach remains the reference to ensure good forecasting performance. We show, however, that the possibility to assimilate the last observed flow considerably reduces the differences in performance. Last, we present a robust alternative to initialize the SMA store where continuous approaches are impossible because of data availability problems.
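
To make the event-based versus continuous distinction concrete, here is a toy soil-moisture accounting store; the production and evapotranspiration functions are invented stand-ins, not the paper's model structure. Run continuously, the store state at an event's onset is simply inherited, which is exactly what an event-based variant has to approximate or assimilate.

```python
def sma_step(S, precip, pet, capacity):
    """One step of a toy soil-moisture accounting (SMA) store: effective
    rainfall grows with saturation, and actual ET scales with saturation."""
    runoff = precip * (S / capacity) ** 2         # production function (toy)
    S = min(capacity, S + precip - runoff)        # fill the store
    S = max(0.0, S - pet * (S / capacity))        # deplete by evapotranspiration
    return S, runoff

S = 150.0                                         # initial moisture (mm)
for p, e in [(0.0, 2.0), (12.0, 1.0), (35.0, 0.5), (5.0, 1.5)]:
    S, q = sma_step(S, p, e, capacity=300.0)
    print(f"store={S:6.1f} mm  effective rain={q:5.2f} mm")
```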

  16. Adolescent mental health and academic functioning: empirical support for contrasting models of risk and vulnerability.

    Science.gov (United States)

    Lucier-Greer, Mallory; O'Neal, Catherine W; Arnold, A Laura; Mancini, Jay A; Wickrama, Kandauda K A S

    2014-11-01

    Adolescents in military families contend with normative stressors that are universal and exist across social contexts (minority status, family disruptions, and social isolation) as well as stressors reflective of their military life context (e.g., parental deployment, school transitions, and living outside the United States). This study utilizes a social ecological perspective and a stress process lens to examine the relationship between multiple risk factors and relevant indicators of youth well-being, namely depressive symptoms and academic performance, as well as the mediating role of self-efficacy (N = 1,036). Three risk models were tested: an additive effects model (each risk factor uniquely influences outcomes), a full cumulative effects model (the collection of risk factors influences outcomes), a comparative model (a cumulative effects model exploring the differential effects of normative and military-related risks). This design allowed for the simultaneous examination of multiple risk factors and a comparison of alternative perspectives on measuring risk. Each model was predictive of depressive symptoms and academic performance through persistence; however, each model provides unique findings about the relationship between risk factors and youth outcomes. Discussion is provided pertinent to service providers and researchers on how risk is conceptualized and suggestions for identifying at-risk youth. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  17. Event-based nonpoint source pollution prediction in a scarce data catchment

    Science.gov (United States)

    Chen, Lei; Sun, Cheng; Wang, Guobo; Xie, Hui; Shen, Zhenyao

    2017-09-01

Quantifying the rainfall-runoff-pollutant (R-R-P) process is key to regulating non-point source (NPS) pollution; however, the impacts of scarce measured data on R-R-P simulations have not yet been reported. In this study, we conducted a comprehensive study of scarce data that addressed both the rainfall-runoff and runoff-pollutant processes, whereby the impacts of data scarcity on two commonly used methods, the Unit Hydrograph (UH) and the Loads Estimator (LOADEST), were quantified. A case study was performed in a typical small catchment of the Three Gorges Reservoir Region (TGRR) of China. Based on our results, the classification of rainfall patterns should be carried out first when analyzing modeling results. Compared with the missing-data rate and the location of missing records, the loss of key information has a greater impact on the simulated flow and NPS loads. When the scarcity rate exceeds a certain threshold (20% in this study), the level of measured-data scarcity has a clear impact on the model's accuracy. As the model for total nitrogen (TN) consistently performs better under different data scarcity conditions, researchers are encouraged to pay more attention to the continuous monitoring of total phosphorus (TP) for better NPS-TP predictions. The results of this study serve as baseline information for hydrologic forecasting and for the further control of NPS pollutants.
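
The two methods under study can be caricatured in a few lines: a unit-hydrograph convolution for the rainfall-runoff step, and a log-linear rating curve of the kind LOADEST fits for the runoff-pollutant step. All ordinates and coefficients below are illustrative, not calibrated values.

```python
import numpy as np

def simulate_flow(effective_rain, uh):
    """Runoff hydrograph as the discrete convolution of effective rainfall
    with a unit hydrograph (UH)."""
    return np.convolve(effective_rain, uh)[: len(effective_rain)]

def rating_curve_load(flow, a=-1.0, b=1.3):
    """Pollutant load from ln(load) = a + b * ln(Q), the regression form at
    the core of LOADEST-style estimators (coefficients illustrative)."""
    return np.exp(a + b * np.log(flow))

rain = np.array([0.0, 4.0, 12.0, 6.0, 1.0, 0.0, 0.0, 0.0])  # effective rainfall (mm)
uh = np.array([0.1, 0.35, 0.3, 0.15, 0.1])                  # UH ordinates sum to 1
q = simulate_flow(rain, uh) + 0.5                           # add a small baseflow
print(rating_curve_load(q).round(2))
```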

  18. Monetary Incentive Effects on Event-Based Prospective Memory Three Months after Traumatic Brain Injury in Children

    OpenAIRE

    McCauley, Stephen R.; Pedroza, Claudia; Chapman, Sandra B.; Cook, Lori G.; Vásquez, Ana C.; Levin, Harvey S.

    2011-01-01

    Information regarding the remediation of event-based prospective memory (EB-PM) impairments following pediatric traumatic brain injury (TBI) is scarce. Addressing this, two levels of monetary incentives were used to improve EB-PM in children ages 7 to 16 years with orthopedic injuries (OI, n = 51), or moderate (n = 25), and severe (n = 39) TBI at approximately three months postinjury. The EB-PM task consisted of the child giving a specific verbal response to a verbal cue from the examiner whi...

  19. A probabilistic model for the seismic risk of buildings: application to assess the seismic risk of buildings in urban areas

    OpenAIRE

    Aguilar, Armando; Pujades Beneit, Lluís; Barbat Barbat, Horia Alejandro; Lantada Zarzosa, Maria de Las Nieves

    2010-01-01

A probabilistic model to estimate the seismic risk of buildings is evaluated, and a specific methodology is proposed for this purpose. The developed methodology allows one to explicitly consider important uncertainties present in the main elements used to estimate the seismic risk of buildings. One of these elements is the seismic vulnerability of each building, which is mainly represented in the proposed methodology through probability density functions that describ...

  20. Risk assessment and model for community-based construction ...

    African Journals Online (AJOL)

    However, the involvement of local communities introduces a number of risks during the execution of projects as the individuals involved may not be conversant with construction and the procedures involved in the procurement processes of projects. The consequences of not assessing and managing construction risks are ...

  1. Modeling non-monotone risk aversion using SAHARA utility functions

    NARCIS (Netherlands)

    Chen, A.; Pelsser, A.; Vellekoop, M.

    2011-01-01

    We develop a new class of utility functions, SAHARA utility, with the distinguishing feature that it allows absolute risk aversion to be non-monotone and implements the assumption that agents may become less risk averse for very low values of wealth. The class contains the well-known exponential and

  2. Risk-Averse Newsvendor Model with Strategic Consumer Behavior

    Directory of Open Access Journals (Sweden)

    Tie Wang

    2013-01-01

Full Text Available The classic newsvendor problem focuses on maximizing the expected profit or minimizing the expected cost when the newsvendor faces myopic customers. However, it ignores the customers' bargain-hunting behavior and the newsvendor's risk preference. As a result, we carry out a rational expectation (RE) equilibrium analysis for a risk-averse newsvendor facing forward-looking customers who anticipate future sales and choose their purchase timing to maximize their expected surplus. We propose the equations satisfied by the RE equilibrium price and quantity for the risk-averse retailer in a general setting, and the explicit equilibrium decisions for the case where demand follows the uniform distribution and utility is a general power function. We identify the impacts of the system parameters on the RE equilibrium for this specific situation. In particular, we show that the RE equilibrium price for some risk-averse newsvendors is lower than for a risk-neutral retailer, and the RE equilibrium stocking quantity for some risk-averse newsvendors is higher than for a risk-neutral retailer. We also find that the RE equilibrium sale price for a risk-averse newsvendor is decreasing in the salvage price in some situations.
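
For reference, the risk-neutral benchmark against which the paper's RE equilibrium is compared is the classic critical-fractile solution: stock the corresponding quantile of demand. The uniform demand and the prices below are placeholders.

```python
from scipy.stats import uniform

def newsvendor_quantity(price, cost, salvage, demand):
    """Risk-neutral newsvendor: stock the critical-fractile quantile of demand."""
    fractile = (price - cost) / (price - salvage)
    return demand.ppf(fractile)

demand = uniform(loc=50, scale=100)  # demand ~ U(50, 150), as a stand-in
print(newsvendor_quantity(price=10.0, cost=6.0, salvage=2.0, demand=demand))  # 100.0
```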

  3. Validation Techniques of the Intern Models for Credit Risk

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-09-01

Given the large-scale development of complex risk measurement and management methodologies by credit institutions, the simple and static rules of the first accord have become less and less relevant in recent years. The need for an own-funds adequacy framework that is much more risk-sensitive and provides incentives for credit institutions to improve their risk measurement and management systems was thus met by the approval of the Basel II Accord, which will therefore strengthen financial stability. The revision of the Accord focused mainly on deepening risk analysis and internal measurement, and the changes made to their estimation allow banks to create their own methodological framework to calculate capital requirements (also considering each credit institution's risk appetite).

  4. Probabilistic Modeling Of Ocular Biomechanics In VIIP: Risk Stratification

    Science.gov (United States)

    Feola, A.; Myers, J. G.; Raykin, J.; Nelson, E. S.; Mulugeta, L.; Samuels, B.; Ethier, C. R.

    2016-01-01

the peak strains, we ranked and then normalized these coefficients, considering that normalized values ≥ 0.5 implied a substantial influence on the range of the peak strains in the optic nerve head (ONH). IOP and ICP were found to have a major influence on the peak strains in the ONH, as did optic nerve and LC stiffness. Interestingly, the stiffness of the sclera far from the scleral canal did not have a large influence on peak strains in ONH tissues; however, the collagen fiber stiffness in the peripapillary sclera and annular ring both influenced the peak strains within the ONH. We have created a physiologically relevant model that incorporates collagen fibers to study the effects of elevated ICP. Elevated ICP resulted in strains in the optic nerve that are not predicted to occur on earth in either the upright or supine condition. We found that IOP, ICP, lamina cribrosa stiffness and optic nerve stiffness had the highest association with these extreme strains in the ONH. These extreme strains may activate mechanosensitive cells that induce tissue remodeling and are a risk factor for the development of VIIP.

  5. The Non-Parametric Identification of the Mixed Proportional Hazards Competing Risks Model

    NARCIS (Netherlands)

    Abbring, J.H.; Berg, van den G.J.

    2000-01-01

We prove identification of dependent competing risks models in which each risk has a mixed proportional hazard specification with regressors, and the risks are dependent by way of the unobserved heterogeneity, or frailty, components. We show that the conditions for non-parametric identification given

  6. Adolescent Risk-Taking and the Five-Factor Model of Personality.

    Science.gov (United States)

    Gullone, Eleonora; Moore, Susan

    2000-01-01

    Investigates the links between adolescent risk-taking and personality, as conceptualized using the Five-factor Model of personality (N=459). Results reveal that risk judgments, personality factors, age and sex were significant predictors of risk behaviors; however, the personality factor of significance was found to differ depending upon the risk…

  7. Students' Mental Models with Respect to Flood Risk in the Netherlands

    Science.gov (United States)

    Bosschaart, Adwin; Kuiper, Wilmad; van der Schee, Joop

    2015-01-01

    Until now various quantitative studies have shown that adults and students in the Netherlands have low flood risk perceptions. In this study we interviewed fifty 15-year-old students in two different flood prone areas. In order to find out how they think and reason about the risk of flooding, the mental model approach was used. Flood risk turned…

  8. Abdominal wound dehiscence in adults: Development and validation of a risk model

    NARCIS (Netherlands)

    G.H. van Ramshorst (Gabrielle); J. Nieuwenhuizen (Jeroen); W.C.J. Hop (Wim); P. Arends (Pauline); J. Boom (Johan); J. Jeekel (Hans); J.F. Lange (Johan)

    2010-01-01

Background: Several studies have been performed to identify risk factors for abdominal wound dehiscence. No risk model had yet been developed for the general surgical population. The objective of the present study was to identify independent risk factors for abdominal wound dehiscence

  9. Risk assessment and food allergy: the probabilistic model applied to allergens

    NARCIS (Netherlands)

    Spanjersberg, M.Q.I.; Kruizinga, A.G.; Rennen, M.A.J.; Houben, G.F.

    2007-01-01

In order to assess the risk of unintended exposure to food allergens, traditional deterministic risk assessment is usually applied, leading to inconsequential conclusions such as 'an allergic reaction cannot be excluded'. TNO therefore developed a quantitative risk assessment model for allergens based on

  10. Skin cancer risks avoided by the Montreal Protocol--worldwide modeling integrating coupled climate-chemistry models with a risk model for UV.

    Science.gov (United States)

    van Dijk, Arjan; Slaper, Harry; den Outer, Peter N; Morgenstern, Olaf; Braesicke, Peter; Pyle, John A; Garny, Hella; Stenke, Andrea; Dameris, Martin; Kazantzidis, Andreas; Tourpali, Kleareti; Bais, Alkiviadis F

    2013-01-01

The assessment model for ultraviolet radiation and risk "AMOUR" is applied to output from two chemistry-climate models (CCMs). Results from the UK Chemistry and Aerosols CCM are used to quantify the worldwide skin cancer risk avoided by the Montreal Protocol and its amendments: by the year 2030, two million cases of skin cancer will have been prevented yearly, 14% fewer skin cancer cases per year. In the "World Avoided," excess skin cancer incidence will continue to grow dramatically after 2030. Results from the CCM E39C-A are used to estimate the skin cancer risk that had already been inevitably committed once ozone depletion was recognized: excess incidence will peak mid-21st century and then recover, or even super-recover, at the end of the century. When compared with a "No Depletion" scenario, with ozone undepleted and cloud characteristics as in the 1960s throughout, the excess incidence (extra yearly cases of skin cancer per million people) of the "Full Compliance with Montreal Protocol" scenario is in the ranges: New Zealand: 100-150, Congo: -10-0, Patagonia: 20-50, Western Europe: 30-40, China: 90-120, South-West USA: 80-110, Mediterranean: 90-100 and North-East Australia: 170-200. This is up to 4% of total local incidence in the Full Compliance scenario in the peak year. © 2012 Wiley Periodicals, Inc. Photochemistry and Photobiology © 2012 The American Society of Photobiology.

  11. Motion Entropy Feature and Its Applications to Event-Based Segmentation of Sports Video

    Science.gov (United States)

    Chen, Chen-Yu; Wang, Jia-Ching; Wang, Jhing-Fa; Hu, Yu-Hen

    2008-12-01

    An entropy-based criterion is proposed to characterize the pattern and intensity of object motion in a video sequence as a function of time. By applying a homoscedastic error model-based time series change point detection algorithm to this motion entropy curve, one is able to segment the corresponding video sequence into individual sections, each consisting of a semantically relevant event. The proposed method is tested on six hours of sports videos including basketball, soccer, and tennis. Excellent experimental results are observed.
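
A sketch of the per-frame feature, assuming motion vectors are already extracted (e.g., from block matching): the entropy of their direction histogram is low for coherent motion such as a camera pan and high for scattered motion such as a rally. The bin count and the synthetic vectors are arbitrary choices, not the paper's settings.

```python
import numpy as np

def motion_entropy(motion_vectors, bins=16):
    """Shannon entropy of the motion-vector direction histogram for one frame;
    tracked over time, this yields the motion entropy curve to be segmented."""
    angles = np.arctan2(motion_vectors[:, 1], motion_vectors[:, 0])
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
chaotic = rng.normal(size=(500, 2))           # scattered motion (high entropy)
coherent = np.tile([1.0, 0.1], (500, 1))      # near-uniform pan (entropy ~ 0)
print(motion_entropy(chaotic), motion_entropy(coherent))
```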

  12. Development and validation of a lung cancer risk prediction model for African-Americans.

    Science.gov (United States)

    Etzel, Carol J; Kachroo, Sumesh; Liu, Mei; D'Amelio, Anthony; Dong, Qiong; Cote, Michele L; Wenzlaff, Angela S; Hong, Waun Ki; Greisinger, Anthony J; Schwartz, Ann G; Spitz, Margaret R

    2008-09-01

Because existing risk prediction models for lung cancer were developed in white populations, they may not be appropriate for predicting risk among African-Americans. Therefore, a need exists to construct and validate a risk prediction model for lung cancer that is specific to African-Americans. We analyzed data from 491 African-Americans with lung cancer and 497 matched African-American controls to identify specific risks and incorporate them into a multivariable risk model for lung cancer and estimate the 5-year absolute risk of lung cancer. We performed internal and external validations of the risk model using data on additional cases and controls from the same ongoing multiracial/ethnic lung cancer case-control study from which the model-building data were obtained, as well as data from two different lung cancer studies in metropolitan Detroit, respectively. We also compared our African-American model with our previously developed risk prediction model for whites. The final risk model included smoking-related variables [smoking status, pack-years smoked, age at smoking cessation (former smokers), and number of years since smoking cessation (former smokers)], self-reported physician diagnoses of chronic obstructive pulmonary disease or hay fever, and exposures to asbestos or wood dusts. Our risk prediction model for African-Americans exhibited good discrimination [0.75 (95% confidence interval, 0.67-0.82)] on our internal data and moderate discrimination [0.63 (95% confidence interval, 0.57-0.69)] for the external data group, which is an improvement over the Spitz model for white subjects. Existing lung cancer prediction models may not be appropriate for predicting risk for African-Americans because (a) they were developed using white populations, (b) the level of risk is different for risk factors that African-Americans share with whites, and (c) unique group-specific risk factors exist for African-Americans. This study developed and validated a risk prediction model

  13. Testing Models Relating Rejection, Depression, Interpersonal Needs, and Psychache to Suicide Risk in Nonclinical Individuals.

    Science.gov (United States)

    Campos, Rui C; Holden, Ronald R

    2015-10-01

    Using structural equation modeling, we tested a primary model of suicide risk and 3 competing, alternative models based on 4 psychological variables deemed important in the literature (perception of parental rejection, depression, interpersonal needs comprising perceived burdensomeness and thwarted belongingness, and psychache), in a nonclinical sample of Portuguese adults. A convenience sample of 203 adults (100 men, 103 women; aged 18-65 years) participated in this study. Analyses demonstrated that the proposed primary model had the best fit to the observed data. The differences in fit indexes for this model and one of the alternative models, however, were not substantial. Perceived parental rejection related directly to suicide risk and indirectly via depression and interpersonal needs. Depression linked indirectly to suicide risk via interpersonal needs and psychache. Interpersonal needs related directly to suicide risk and indirectly via psychache, which related directly to suicide risk. © 2015 Wiley Periodicals, Inc.

  14. An events based algorithm for distributing concurrent tasks on multi-core architectures

    Science.gov (United States)

    Holmes, David W.; Williams, John R.; Tilke, Peter

    2010-02-01

    In this paper, a programming model is presented which enables scalable parallel performance on multi-core shared memory architectures. The model has been developed for application to a wide range of numerical simulation problems. Such problems involve time stepping or iteration algorithms where synchronization of multiple threads of execution is required. It is shown that traditional approaches to parallelism including message passing and scatter-gather can be improved upon in terms of speed-up and memory management. Using spatial decomposition to create orthogonal computational tasks, a new task management algorithm called H-Dispatch is developed. This algorithm makes efficient use of memory resources by limiting the need for garbage collection and takes optimal advantage of multiple cores by employing a "hungry" pull strategy. The technique is demonstrated on a simple finite difference solver and results are compared to traditional MPI and scatter-gather approaches. The H-Dispatch approach achieves near linear speed-up with results for efficiency of 85% on a 24-core machine. It is noted that the H-Dispatch algorithm is quite general and can be applied to a wide class of computational tasks on heterogeneous architectures involving multi-core and GPGPU hardware.
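
The "hungry" pull strategy can be imitated with a shared task queue from which idle workers fetch work, as in this simplified Python sketch; this is not the authors' H-Dispatch code, and the squaring task standing in for a spatial sub-domain update is invented.

```python
import queue
import threading

def worker(tasks, results):
    while True:
        item = tasks.get()          # "hungry" pull: an idle core fetches work
        if item is None:            # poison pill ends the thread
            break
        results.put(item * item)    # stand-in for one sub-domain update

def pull_dispatch(n_tasks=1000, n_workers=8):
    tasks, results = queue.Queue(), queue.Queue()
    threads = [threading.Thread(target=worker, args=(tasks, results))
               for _ in range(n_workers)]
    for t in threads:
        t.start()
    for i in range(n_tasks):        # one task per spatial decomposition cell
        tasks.put(i)
    for _ in threads:
        tasks.put(None)
    for t in threads:
        t.join()
    return [results.get() for _ in range(n_tasks)]

print(sum(pull_dispatch()))
```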

  15. A TCP model for external beam treatment of intermediate-risk prostate cancer.

    LENUS (Irish Health Repository)

    Walsh, Seán

    2013-03-01

    Biological models offer the ability to predict clinical outcomes. The authors describe a model to predict the clinical response of intermediate-risk prostate cancer to external beam radiotherapy for a variety of fractionation regimes.

  16. Development of a Coronary Heart Disease Risk Prediction Model for Type 1 Diabetes: The Pittsburgh CHD in Type 1 Diabetes Risk Model

    Science.gov (United States)

    Zgibor, Janice C.; Ruppert, Kristine; Orchard, Trevor J.; Soedamah-Muthu, Sabita S.; Fuller, John; Chaturvedi, Nish; Roberts, Mark S.

    2010-01-01

Aim To create a Coronary Heart Disease (CHD) risk prediction model specific to type 1 diabetes. Methods Development of the model used data from the Pittsburgh Epidemiology of Diabetes Complications Study (EDC). EDC subjects had type 1 diabetes diagnosed between 1950 and 1980, received their first study exam between 1986 and 1988, and have been followed biennially since. The final cohort for model development consisted of 603 subjects and 46 incident events. Hard CHD was defined as CHD death, fatal/non-fatal MI or Q-waves. Baseline CHD risk factors were tested bivariately and introduced into a Weibull model. The prediction model was externally validated in the EURODIAB Prospective Complications Study. Results In males, predictors were higher white blood cell count, micro- or macroalbuminuria, lower HDLc and longer diabetes duration. In females, larger waist/hip ratio, higher non-HDLc, higher systolic blood pressure, use of blood pressure medication, and longer diabetes duration were included. Models were robust to internal and external validation procedures. Conclusions CHD risk prediction models for hard CHD in those with type 1 diabetes should include risk factors not considered by existing models. Using models specifically developed for predicting CHD in type 1 diabetes may allow for more targeted prevention strategies. PMID:20236721
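
A sketch of how a t-year absolute risk falls out of a Weibull survival model, using one common proportional-hazards parameterization; the covariates, coefficients, and baseline parameters below are invented for illustration, not the Pittsburgh model's estimates.

```python
import numpy as np

def weibull_absolute_risk(x, beta, lam, k, t=10.0):
    """t-year absolute risk under a proportional-hazards Weibull model:
    S(t | x) = exp(-lam * t**k * exp(x'beta)); risk = 1 - S."""
    return 1.0 - np.exp(-lam * t ** k * np.exp(np.dot(x, beta)))

# Hypothetical male risk profile: [log WBC, albuminuria flag, low-HDLc score,
# diabetes duration in years] with made-up coefficients.
x = np.array([0.4, 1.0, 0.6, 25.0])
beta = np.array([0.5, 0.9, 0.7, 0.03])
print(weibull_absolute_risk(x, beta, lam=1e-4, k=1.2))  # ~0.015, i.e. 1.5%
```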

  17. Validation of risk stratification models in acute myeloid leukemia using sequencing-based molecular profiling.

    Science.gov (United States)

    Wang, M; Lindberg, J; Klevebring, D; Nilsson, C; Mer, A S; Rantalainen, M; Lehmann, S; Grönberg, H

    2017-10-01

Risk stratification of acute myeloid leukemia (AML) patients needs improvement. Several AML risk classification models based on somatic mutations or gene-expression profiling have been proposed. However, systematic and independent validation of these models is required for future clinical implementation. We performed whole-transcriptome RNA-sequencing and panel-based deep DNA sequencing of 23 genes in 274 intensively treated AML patients (Clinseq-AML). We also utilized The Cancer Genome Atlas (TCGA) AML study (N=142) as a second validation cohort. We evaluated six previously proposed molecular-based models for AML risk stratification and two revised risk classification systems combining molecular and clinical data. Risk groups stratified by five out of six models showed different overall survival in cytogenetically normal AML patients in the Clinseq-AML cohort. Risk classification systems integrating mutational or gene-expression data were found to add prognostic value to the current European LeukemiaNet (ELN) risk classification. The prognostic value varied between models and across cohorts, highlighting the importance of independent validation to establish evidence of efficacy and general applicability. All but one model replicated in the Clinseq-AML cohort, indicating the potential for molecular-based AML risk models. Risk classification based on a combination of molecular and clinical data holds promise for improved AML patient stratification in the future.

  18. Wide Area Protection Scheme Preventing Cascading Events based on Improved Impedance relay

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun

    2013-01-01

Load flow transferring after an initial contingency is regarded as one of the main causes of unexpected cascading trips. A multi-agent system (MAS) based wide area protection strategy is proposed in this paper to predict the load flow transferring from the point of view of impedance relays......, and to prevent unexpected relay operations accordingly. The variations of node injections during the post-fault transient period will also be considered in the prediction algorithm. The power system of Eastern Denmark, modeled in a real-time digital simulator (RTDS), will be used to demonstrate the proposed...... strategy. The simulation results indicate this strategy can successfully predict and prevent the unexpected relay operation caused by load flow transferring....

  19. Portfolio Sensitivity Model for Analyzing Credit Risk Caused by Structural and Macroeconomic Changes

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2008-12-01

    Full Text Available This paper proposes a new model for portfolio sensitivity analysis. The model is suitable for decision support in financial institutions, specifically for portfolio planning and portfolio management. The basic advantage of the model is the ability to create simulations for credit risk predictions in cases when we virtually change portfolio structure and/or macroeconomic factors. The model takes a holistic approach to portfolio management consolidating all organizational segments in the process such as marketing, retail and risk.

  20. Cardiovascular disease risk score prediction models for women and its applicability to Asians

    Directory of Open Access Journals (Sweden)

    Goh LGH

    2014-03-01

Full Text Available Louise GH Goh,1 Satvinder S Dhaliwal,1 Timothy A Welborn,2 Peter L Thompson,2–4 Bruce R Maycock,1 Deborah A Kerr,1 Andy H Lee,1 Dean Bertolatti,1 Karin M Clark,1 Rakhshanda Naheed,1 Ranil Coorey,1 Phillip R Della5 1School of Public Health, Curtin Health Innovation Research Institute, Curtin University, Perth, WA, Australia; 2Sir Charles Gairdner Hospital, Nedlands, Perth, WA, Australia; 3School of Population Health, University of Western Australia, Perth, WA, Australia; 4Harry Perkins Institute for Medical Research, Perth, WA, Australia; 5School of Nursing and Midwifery, Curtin Health Innovation Research Institute, Curtin University, Perth, WA, Australia Purpose: Although elevated cardiovascular disease (CVD) risk factors are associated with a higher risk of developing heart conditions across all ethnic groups, variations exist between groups in the distribution and association of risk factors, and also in risk levels. This study assessed the 10-year predicted risk in a multiethnic cohort of women and compared the differences in risk between Asian and Caucasian women. Methods: Information on demographics, medical conditions and treatment, smoking behavior, dietary behavior, and exercise patterns was collected. Physical measurements were also taken. The 10-year risk was calculated using the Framingham model, the SCORE (Systematic COronary Risk Evaluation) risk charts for low-risk and high-risk regions, and the general CVD and simplified general CVD risk score models, in 4,354 females aged 20–69 years with no heart disease, diabetes, or stroke at baseline from the third Australian Risk Factor Prevalence Study. Country of birth was used as a surrogate for ethnicity. Nonparametric statistics were used to compare risk levels between ethnic groups. Results: Asian women generally had a lower risk of CVD when compared to Caucasian women. The 10-year predicted risk was, however, similar between Asian and Australian women for some models. These findings were