WorldWideScience

Sample records for gcr event-based risk

  1. Development of a GCR Event-based Risk Model

    Science.gov (United States)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress, or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well-known energy loss processes to develop a stochastic Monte Carlo-based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing it to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue, including correlated secondary ions, often of high multiplicity. Conventional space radiation risk assessment employs average quantities and assumes linearity and additivity of responses over the complete range of GCR charges and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times, including the ATM, TGF-Smad, and WNT signaling pathways. We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how
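
    The event-based transport idea described above can be illustrated with a toy Monte Carlo: an ion loses energy quasi-continuously while the distance to its next nuclear interaction is sampled from an exponential free-path distribution, and each interaction is recorded as a depth-ordered event. This is a minimal sketch with placeholder stopping-power and mean-free-path values, not QMSFRG physics:

    ```python
    import math, random

    def transport_ion(energy_mev_n, depth_cm, mfp_cm=15.0, step_cm=0.1):
        """Walk one ion through depth_cm of material; return depth-tagged events."""
        events, x = [], 0.0
        # sample the distance to the next nuclear interaction (exponential law)
        dist_to_interaction = -mfp_cm * math.log(1.0 - random.random())
        while x < depth_cm and energy_mev_n > 1.0:
            x += step_cm
            # crude continuous slowing-down: dE/dx rises as the ion slows (toy model)
            energy_mev_n -= step_cm * 2.0 * (1000.0 / energy_mev_n) ** 0.5
            dist_to_interaction -= step_cm
            if dist_to_interaction <= 0.0:
                events.append({"depth_cm": round(x, 2),
                               "energy_mev_n": round(energy_mev_n, 1),
                               "type": "nuclear_interaction"})
                dist_to_interaction = -mfp_cm * math.log(1.0 - random.random())
        return events

    print(transport_ion(600.0, 30.0))
    ```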

  2. Mixed-field GCR Simulations for Radiobiological Research Using Ground Based Accelerators

    Science.gov (United States)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis A.

    2014-01-01

    Space radiation comprises a large number of particle types and energies, with differential ionization power, from high-energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground-based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and for dosimetry, electronics parts, and shielding testing, using mono-energetic beams of single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements a rapid switching mode and higher-energy beam extraction up to 1.5 GeV/u, can integrate multiple ions into a single simulation to create a GCR Z-spectrum in major energy bins. After considering the GCR environment and the energy limitations of NSRL, a GCR reference field is proposed following extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding to within 20% accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to relate chronic GCR exposure of up to 3 years to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposures of up to a few weeks and fractionation approaches at a GCR simulator.
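
    The reference-field design problem sketched above is, at its core, a weighted-superposition fit: choose beam fluences so that the summed field reproduces a target spectrum bin by bin. A minimal least-squares sketch with invented beams, bins, and target values (the actual field design relied on GERMcode simulation studies):

    ```python
    import numpy as np

    # rows = energy bins, columns = candidate beams: fraction of each beam's
    # particles contributing to each bin (e.g. after fragmentation in shielding)
    response = np.array([[1.0, 0.2, 0.0],
                         [0.0, 0.7, 0.3],
                         [0.0, 0.1, 0.7]])
    target = np.array([0.5, 0.3, 0.2])      # target GCR fluence per bin (toy)

    weights, *_ = np.linalg.lstsq(response, target, rcond=None)
    weights = np.clip(weights, 0.0, None)   # beam fluences cannot be negative
    achieved = response @ weights
    print("beam weights:", weights)
    print("relative error per bin:", (achieved - target) / target)
    ```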

  3. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    Science.gov (United States)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients in cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. The prior art of transport codes calculates the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as bystander signals. These are ignored or impossible in the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of hits for a specified cellular area, cell survival curves, and DNA damage yields per cell. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes the numerical estimates of basic
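
    One of the biophysical outputs mentioned above, the Poisson distribution of particle traversals per cell, follows directly from mean hits = fluence x sensitive area. A minimal sketch, with a hypothetical fluence and nucleus area:

    ```python
    import math

    def hit_probabilities(fluence_per_cm2, area_um2, max_hits=5):
        """Poisson probabilities of 0..max_hits traversals of a cell nucleus."""
        mean = fluence_per_cm2 * area_um2 * 1e-8   # 1 um^2 = 1e-8 cm^2
        return [math.exp(-mean) * mean**k / math.factorial(k)
                for k in range(max_hits + 1)]

    # e.g. 1e6 ions/cm^2 onto a 100 um^2 nucleus -> mean of 1 traversal per cell
    for k, p in enumerate(hit_probabilities(1e6, 100.0)):
        print(f"P({k} hits) = {p:.3f}")
    ```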

  4. Skin-Based DNA Repair Phenotype for Cancer Risk from GCR in Genetically Diverse Populations

    Science.gov (United States)

    Guiet, Elodie; Viger, Louise; Snijders, Antoine; Costes, Sylvian V.

    2017-01-01

    Predicting cancer risk associated with cosmic radiation remains a mission-critical challenge for NASA radiation health scientists and mission planners. Epidemiological data are lacking, and current risk methods do not take individual radiation sensitivity into account. In our approach, we hypothesize that genetic factors strongly influence the risk of cancer from space radiation and that biomarkers reflecting DNA damage and cell death are ideal tools to predict risk and monitor potential health effects post-flight. At this workshop, we will report the work done over the first 9 months of this proposal. Skin cells from 15 different strains of mice already characterized for radiation-induced cancer sensitivity (B6C3F, BALB/cByJ, C57BL/6J, CBA/CaJ, C3H/HeMsNrsf), and 10 strains from the DOE collaborative cross mouse model, were expanded from ear biopsies and cultivated until passage 3. On average, 3 males and 3 females of each strain were expanded and frozen for further characterization at the NSRL beam line during the NSRL16C run, for three LETs (350 MeV/n Si, 350 MeV/n Ar, and 600 MeV/n Fe) and two ion fluences (1 and 3 particles per cell). The mouse work has established new metrics for the use of radiation-induced foci as a marker for various aspects of DNA repair deficiency. In year 2, we propose to continue characterization of the mouse lines with low LET to identify loci specific to high- versus low-LET radiation and to establish genetic linkage for the various DNA repair biomarkers. Correlation with cancer risk for each animal strain and sex will also be investigated. On the human side, we will start characterizing the DNA damage response induced ex vivo in blood from 200 human donors screened for radiation sensitivity, with a target of 500 donors by the end of this project. All ex vivo phenotypic data will be correlated with the genetic characterization of each individual human donor using SNP array characterization, as done for the mice. Similarly, ex vivo phenotypic features from mice will

  5. EQRM: An open-source event-based earthquake risk modeling program

    Science.gov (United States)

    Robinson, D. J.; Dhu, T.; Row, P.

    2007-12-01

    Geoscience Australia's Earthquake Risk Model (EQRM) is an event-based tool for earthquake scenario ground motion and scenario loss modeling, as well as probabilistic seismic hazard (PSHA) and risk (PSRA) modeling. It has been used to conduct PSHA and PSRA for many of Australia's largest cities, and it has become an important tool for the emergency management community, which uses it for scenario response planning. It has the potential to link with earthquake monitoring programs to provide automatic loss estimates from network-recorded events. An open-source alpha-release version of the software is freely available on SourceForge. It can be used for hazard or risk analyses in any region of the world by supplying appropriately formatted input files. Source code is also supplied, so advanced users can modify individual components to suit their needs.
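
    The event-based mechanics behind tools of this kind can be illustrated in a few lines: each synthetic event carries an annual occurrence rate and a simulated loss, and the loss exceedance curve is the summed rate of all events at or above a threshold. This is a conceptual sketch with an invented catalog, not EQRM's API:

    ```python
    events = [  # (annual_rate, loss_in_millions) -- invented example catalog
        (0.010, 500.0), (0.050, 120.0), (0.200, 30.0), (0.500, 5.0),
    ]

    def exceedance_rate(threshold):
        """Annual rate at which total loss meets or exceeds `threshold`."""
        return sum(rate for rate, loss in events if loss >= threshold)

    for threshold in (1.0, 10.0, 100.0, 400.0):
        print(f"annual rate of loss >= {threshold:>5} M: "
              f"{exceedance_rate(threshold):.3f}")
    ```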

  6. Personalized Event-Based Surveillance and Alerting Support for the Assessment of Risk

    CERN Document Server

    Stewart, Avaré; Diaz-Aviles, Ernesto; Dolog, Peter

    2011-01-01

    In a typical Event-Based Surveillance setting, a stream of web documents is continuously monitored for disease reporting. A structured representation of the disease reporting events is extracted from the raw text, and the events are then aggregated to produce signals, which are intended to represent early warnings against potential public health threats. To public health officials, these warnings represent an overwhelming list of "one-size-fits-all" information for risk assessment. To reduce this overload, two techniques are proposed. First, filtering signals according to the user's preferences (e.g., location, disease, symptoms, etc.) helps reduce the undesired noise. Second, re-ranking the filtered signals, according to an individual's feedback and annotation, allows a user-specific, prioritized ranking of the most relevant warnings. We introduce an approach that takes into account this two-step process of: 1) filtering and 2) re-ranking the results of reporting signals. For this, Collaborative Filtering an...
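
    A minimal sketch of the two-step pipeline described above, using a hypothetical signal and preference data model rather than the authors' system:

    ```python
    # 1) filter signals by user preferences, 2) re-rank survivors by a
    # feedback-derived relevance score (all records below are invented)
    signals = [
        {"disease": "influenza", "location": "DE", "feedback_score": 0.9},
        {"disease": "cholera",   "location": "IN", "feedback_score": 0.4},
        {"disease": "influenza", "location": "FR", "feedback_score": 0.7},
    ]
    prefs = {"disease": {"influenza"}, "location": {"DE", "FR"}}

    filtered = [s for s in signals
                if s["disease"] in prefs["disease"]
                and s["location"] in prefs["location"]]
    ranked = sorted(filtered, key=lambda s: s["feedback_score"], reverse=True)
    for s in ranked:
        print(s)
    ```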

  7. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibration to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  8. GCR Environmental Models I: Sensitivity Analysis for GCR Environments

    Science.gov (United States)

    Slaba, Tony C.; Blattnig, Steve R.

    2014-01-01

    Accurate galactic cosmic ray (GCR) models are required to assess crew exposure during long-duration missions to the Moon or Mars. Many of these models have been developed and compared to available measurements, with uncertainty estimates usually stated to be less than 15%. However, when the models are evaluated over a common epoch and propagated through to effective dose, relative differences exceeding 50% are observed. This indicates that the metrics used to communicate GCR model uncertainty can be better tied to exposure quantities of interest for shielding applications. This is the first of three papers focused on addressing this need. In this work, the focus is on quantifying the extent to which each GCR ion and energy group, prior to entering any shielding material or body tissue, contributes to effective dose behind shielding. Results can be used to more accurately calibrate model-free parameters and provide a mechanism for refocusing validation efforts on measurements taken over important energy regions. Results can also be used as references to guide future nuclear cross-section measurements and radiobiology experiments. It is found that GCR with Z>2 and boundary energies below 500 MeV/n induce less than 5% of the total effective dose behind shielding. This finding is important given that most of the GCR models are developed and validated against Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer (ACE/CRIS) measurements taken below 500 MeV/n. It is therefore possible for two models to very accurately reproduce the ACE/CRIS data while inducing very different effective dose values behind shielding.
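
    The weighting idea can be made concrete with a toy table: assign each boundary (Z, E) group its contribution to effective dose behind shielding and normalize. The numbers below are invented placeholders, chosen only to echo the qualitative finding that the Z>2, sub-500 MeV/n group contributes little:

    ```python
    # boundary (Z group, energy bin) -> contribution to effective dose (toy units)
    groups = {("H",   "<0.5 GeV/n"): 4.0,  ("H",   ">0.5 GeV/n"): 10.0,
              ("He",  "<0.5 GeV/n"): 1.5,  ("He",  ">0.5 GeV/n"): 3.0,
              ("Z>2", "<0.5 GeV/n"): 0.4,  ("Z>2", ">0.5 GeV/n"): 6.0}

    total = sum(groups.values())
    for key, dose in sorted(groups.items(), key=lambda kv: -kv[1]):
        print(f"{key}: {100 * dose / total:.1f}% of effective dose")
    ```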

  9. Some implications of an event-based definition of exposure to the risk of road accident

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes a new definition of exposure to the risk of road accident as any event, limited in space and time, representing a potential for an accident to occur by bringing road users close to each other in time or space, or by requiring a road user to take action to avoid leaving the roadway. A typology of events representing a potential for an accident is proposed. Each event can be interpreted as a trial as defined in probability theory. Risk is the proportion of events that result in an accident. Defining exposure as events demanding the attention of road users implies that road...
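
    Under this definition, risk estimation reduces to a binomial proportion over observed events. A minimal sketch with invented counts and a normal-approximation confidence interval:

    ```python
    import math

    events_observed = 20000   # e.g. recorded close encounters at a site (invented)
    accidents = 12            # events that ended in an accident (invented)

    risk = accidents / events_observed               # risk per event (trial)
    se = math.sqrt(risk * (1 - risk) / events_observed)
    print(f"risk per event: {risk:.2e} "
          f"(95% CI: {risk - 1.96*se:.2e} .. {risk + 1.96*se:.2e})")
    ```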

  10. Assessment of the risk factors of coronary heart events based on data mining with decision trees.

    Science.gov (United States)

    Karaolis, Minas A; Moutiris, Joseph A; Hadjipanayi, Demetra; Pattichis, Constantinos S

    2010-05-01

    Coronary heart disease (CHD) is one of the major causes of disability in adults, as well as one of the main causes of death in developed countries. Although significant progress has been made in the diagnosis and treatment of CHD, further investigation is still needed. The objective of this study was to develop a data-mining system for the assessment of heart event-related risk factors, targeting the reduction of CHD events. The risk factors investigated were: 1) before the event: a) nonmodifiable: age, sex, and family history of premature CHD; b) modifiable: smoking before the event, history of hypertension, and history of diabetes; and 2) after the event: modifiable: smoking after the event, systolic blood pressure, diastolic blood pressure, total cholesterol, high-density lipoprotein, low-density lipoprotein, triglycerides, and glucose. The events investigated were: myocardial infarction (MI), percutaneous coronary intervention (PCI), and coronary artery bypass graft surgery (CABG). A total of 528 cases were collected from the Paphos district in Cyprus, most of them with more than one event. Data-mining analysis was carried out using the C4.5 decision tree algorithm for the aforementioned three events using five different splitting criteria. The most important risk factors, as extracted from the classification rule analysis, were: 1) for MI: age, smoking, and history of hypertension; 2) for PCI: family history, history of hypertension, and history of diabetes; and 3) for CABG: age, history of hypertension, and smoking. Most of these risk factors were also extracted by other investigators. The highest percentages of correct classifications achieved were 66%, 75%, and 75% for the MI, PCI, and CABG models, respectively. It is anticipated that data mining could help in the identification of high- and low-risk subgroups of subjects, a decisive factor for the selection of therapy, i.e., medical or surgical. However, further investigation with larger datasets is
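
    As a hedged illustration of the data-mining step, the sketch below trains scikit-learn's DecisionTreeClassifier (a CART implementation; the study itself used C4.5) on a synthetic table of risk factors. The data are random placeholders, not the Cyprus dataset:

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    n = 528                              # same sample size as the study
    X = np.column_stack([
        rng.integers(35, 80, n),         # age
        rng.integers(0, 2, n),           # smoking (0/1)
        rng.integers(0, 2, n),           # history of hypertension (0/1)
    ])
    # synthetic rule standing in for real MI labels
    y = (X[:, 0] > 60) & (X[:, 1] == 1)

    # entropy criterion = information gain, the splitting measure C4.5 uses
    tree = DecisionTreeClassifier(max_depth=3, criterion="entropy").fit(X, y)
    print(export_text(tree, feature_names=["age", "smoking", "hypertension"]))
    ```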

  11. GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose

    Science.gov (United States)

    Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

    2014-01-01

    This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic ray (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding, since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurement uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 models and the Matthia GCR model are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. BON2010 and BON2011 also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and the 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.
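
    The abstract does not spell out the validation metric; a common construction, shown here only as an illustrative stand-in, folds measurement uncertainty into a per-point z-score and summarizes its RMS over the database:

    ```python
    import math

    data = [  # (measured_flux, sigma_measured, model_flux) -- invented numbers
        (1.00, 0.05, 1.08), (0.60, 0.04, 0.66), (0.30, 0.03, 0.28),
    ]
    # z > 0 means the model overestimates relative to the measurement error bar
    z = [(model - meas) / sigma for meas, sigma, model in data]
    rms = math.sqrt(sum(v * v for v in z) / len(z))
    print("per-point z:", [f"{v:+.2f}" for v in z], " rms z:", f"{rms:.2f}")
    ```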

  12. Event-based Lateral Collision Risk Modeling Research

    Institute of Scientific and Technical Information of China (English)

    吴金栋; 聂润兔

    2011-01-01

    Airspace capacity has become one of the major constraints on the rapid development of the aviation industry. With current equipment, reducing separation minima is the most practical way to expand airspace capacity. Studying aircraft deviations from the planned route and modeling collision risk form the theoretical basis of separation-minimum research. This paper analyzes the Reich model and proposes an event-based collision risk model, which obtains results comparable to the Reich model while considering the influencing factors more comprehensively.
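
    One ingredient shared by the Reich model and event-based variants is the lateral overlap probability P_y(S): the chance that two aircraft assigned to parallel tracks separated by S are laterally within about a wingspan of each other. The sketch below assumes double-exponential (Laplace) lateral deviations, a common choice in this literature; all parameter values are illustrative:

    ```python
    import math

    def lateral_overlap_probability(separation_nm, scale_nm=0.5, wingspan_nm=0.016):
        """P_y(S) for two independent Laplace(0, b) lateral deviations."""
        b, S = scale_nm, separation_nm
        # density of the difference of two iid Laplace(0, b) variables at S:
        # f(S) = (1/(4b)) * (1 + S/b) * exp(-S/b)
        density_at_S = (1.0 / (4.0 * b)) * (1.0 + S / b) * math.exp(-S / b)
        # integrate the density over a window of roughly one wingspan each side
        return 2.0 * wingspan_nm * density_at_S

    print(f"P_y(S = 8 nm): {lateral_overlap_probability(8.0):.2e}")
    ```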

  13. Directly spheroidizing during hot deformation in GCr15 steels

    Institute of Scientific and Technical Information of China (English)

    Guo-hui ZHU; Gang ZHENG

    2008-01-01

    Spheroidizing heat treatment is normally required prior to cold forming of GCr15 steel in order to improve its machinability. In the conventional spheroidizing process, a very long annealing time, generally more than 10 h, is needed to assure proper spheroidizing. This results in low productivity, high cost, and especially high energy consumption. Therefore, the possibility of direct spheroidizing during hot deformation of GCr15 steel is preliminarily explored. The effect of hot deformation parameters on the final microstructure and hardness is investigated systematically in order to develop a direct spheroidizing technology. Experimental results illustrate that a low deformation temperature and a slow cooling rate favor direct softening and/or spheroidizing during hot deformation, which allows the properties of as-rolled GCr15 to be suitable for post-machining without a prior annealing requirement.

  14. Evaluation of abrasion of a drainage mixture modified with crushed rubber waste (GCR)

    Directory of Open Access Journals (Sweden)

    Yee Wan Yung Vargas

    2017-02-01

    Conclusion: The results showed a marked influence of the mixing temperature (between asphalt and GCR) and the compaction temperature (modified asphalt and aggregate) on the behavior of the MD modified with GCR.

  15. GCR Transport in the Brain: Assessment of Self-Shielding, Columnar Damage, and Nuclear Reactions on Cell Inactivation Rates

    Science.gov (United States)

    Shavers, M. R.; Atwell, W.; Cucinotta, F. A.; Badhwar, G. D. (Technical Monitor)

    1999-01-01

    Radiation shield design is driven by the need to limit radiation risks while optimizing risk reduction against launch mass/expense penalties. Both limitation and optimization objectives require the development of accurate and complete means for evaluating the effectiveness of various shield materials and body self-shielding. For galactic cosmic rays (GCR), biophysical response models indicate that track structure effects lead to substantially different assessments of shielding effectiveness relative to assessments based on LET-dependent quality factors. Methods for assessing risk to the central nervous system (CNS) from heavy ions are poorly understood at this time. High-energy and charge (HZE) ions can produce tissue events resulting in damage to clusters of cells in a columnar fashion, especially for stopping heavy ions. Grahn (1973) and Todd (1986) have discussed a microlesion concept or model of stochastic tissue events in analyzing damage from HZEs. Some tissues, including the CNS, may be sensitive to microlesions or stochastic tissue events in a manner not illuminated by either conventional dosimetry or fluence-based risk factors. HZE ions may also produce important lateral damage to adjacent cells. Fluences of high-energy protons and alpha particles in the GCR are many times higher than those of HZE ions. Behind spacecraft and body self-shielding, the ratio of protons, alpha particles, and neutrons to HZE ions increases several-fold from free-space values. Models of GCR damage behind shielding have placed large concern on the role of target fragments produced from tissue atoms. The self-shielding of the brain reduces the number of heavy ions reaching the interior regions by a large amount, and the remaining light-particle environment (protons, neutrons, deuterons, and alpha particles) may be the greatest concern. Tracks of high-energy protons produce nuclear reactions in tissue, which can deposit doses of more than 1 Gy within 5-10 cell layers. Information on rates of

  16. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related activit...

  17. GCR as a source for Inner radiation belt of Saturn.

    Science.gov (United States)

    Kotova, A.; Roussos, E.; Krupp, N.; Dandouras, I. S.

    2014-12-01

    During the insertion orbit of Cassini in 2004, the Ion and Neutral Camera measured significant fluxes of energetic neutral atoms (ENA) coming from the area between the D-ring and Saturn's atmosphere, which raised the idea of a possible innermost radiation belt in this narrow gap (1). There are two main sources of energetic charged particles for such an inner radiation belt: the interaction of galactic cosmic rays (GCR) with Saturn's atmosphere and rings, which through the CRAND process can produce keV-MeV ions and electrons in the region, and the double charge exchange of ENAs coming from the middle magnetosphere, which can bring keV ions to the region of interest. Using the particle tracer developed in our group, together with the GEANT4 software, we study these two processes in detail. With the particle tracer we evaluate GCR access to Saturn's atmosphere and rings. Simulation of GCR trajectories allows us to calculate the energy spectra of the arriving energetic particles, which is much more accurate than the analytically predicted spectra from Stoermer theory, since the simulation includes the effects of the ring shadow and non-dipolar processes in the magnetosphere. Using the GEANT4 software, the penetration of GCR through the ring material was simulated and the production of secondary particles estimated. Finally, the motion of the secondaries was simulated with the particle tracer, and the energy spectrum was evaluated for the neutrons whose decay leads to the production of the final CRAND elements in the inner Saturnian radiation belts. We show that for the inner radiation belt, the most energetic ions come from GCR interactions with the rings, their penetration, and the interactions of secondaries with Saturn's atmosphere. This simulation allows us to predict the fluxes of energetic ions and electrons that the particle detector MIMI/LEMMS onboard Cassini can measure during the so-called "proximal
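
    As a hedged illustration of what such a particle tracer does at its core, the sketch below pushes a proton through a planetary dipole field with the standard Boris algorithm. It is non-relativistic and dipole-only, with placeholder start conditions; Saturn's dipole moment of roughly 4.6e25 A m^2 is the one physical input taken from the literature:

    ```python
    import numpy as np

    Q_M = 9.58e7   # proton charge-to-mass ratio [C/kg]

    def dipole_B(r_vec, m_moment=np.array([0.0, 0.0, 4.6e25])):
        """Magnetic field of a dipole (SI units, moment along z)."""
        r = np.linalg.norm(r_vec)
        return 1e-7 * (3.0 * r_vec * np.dot(m_moment, r_vec) / r**5
                       - m_moment / r**3)

    def boris_step(x, v, dt):
        """One Boris push (no electric field): rotate v around B, then drift."""
        t = 0.5 * dt * Q_M * dipole_B(x)
        s = 2.0 * t / (1.0 + np.dot(t, t))
        v_new = v + np.cross(v + np.cross(v, t), s)
        return x + v_new * dt, v_new

    x = np.array([1.2e8, 0.0, 0.0])   # start ~2 Saturn radii out [m]
    v = np.array([0.0, 1.0e7, 0.0])   # 10,000 km/s (toy, non-relativistic)
    for _ in range(1000):
        x, v = boris_step(x, v, 1e-4)
    print("final position [m]:", x)
    ```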

  18. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  19. Reconstruction of flood events based on documentary data and transnational flood risk analysis of the Upper Rhine and its French and German tributaries since AD 1480

    Science.gov (United States)

    Himmelsbach, I.; Glaser, R.; Schoenbein, J.; Riemann, D.; Martin, B.

    2015-10-01

    This paper presents a long-term analysis of flood occurrence along the southern part of the Upper Rhine River system and 14 of its tributaries in France and Germany, covering the period since AD 1480. Special focus is given to the temporal and spatial variations of flood events and their underlying meteorological causes over time. Examples are presented of how long-term information about flood events and knowledge of the historical aspects of flood protection in a given area can help to improve the understanding of risk analysis and therefore transnational risk management. Within this context, special focus is given to flood vulnerability by comparing selected historical and modern extreme events and establishing a common evaluation scheme. The transnational aspect becomes especially evident when analyzing the tributaries: on this scale, flood protection developed strikingly differently on the French and German sides. We argue that the high technological standards of flood protection initiated by the dukes of Baden on the German side starting in the early 19th century misled people into the common belief that mechanical means of flood protection, such as dams and barrages, can guarantee security from floods and their impacts. This led to widespread settlement and the establishment of infrastructure as well as modern industries in potentially unsafe areas, continuing to this day. The legal situation in Alsace on the French side of the Rhine did not allow for continuous flood protection measures, leaving a constant, and probably in the end annoying, reminder that the floodplains are a potentially unsafe place to be. From a modern perspective of flood risk management, this has led to a significantly lower accumulation of value in the floodplains of the small rivers in Alsace compared to those on the Baden side, an interesting fact, especially if the modern European Flood Directive is taken into account.

  20. A Tailorable Structural Composite for GCR and Albedo Neutron Protection on the Lunar Surface Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A tailorable structural composite that will provide protection from the lunar radiation environment, including GCR and albedo neutrons will be developed. This...

  1. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  2. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event-based network monitoring tool and to assess its effects on host performance. Current host-based network monitoring tools work by polling, which can miss activity that occurs between polls. Instead of polling, a tool could be developed that uses event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered: first, the new Windows Event Logging API, and second, the ALE API within WFP. Any future work should focus on these methods.
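
    The polling-versus-events distinction can be illustrated with a toy in-process event source (this is conceptual Python, not any of the OS network-event APIs named above): activity that is faster than the polling interval is still captured when it is queued as events:

    ```python
    import queue, threading, time

    events = queue.Queue()

    def toy_network_activity():
        """Simulated short-lived network activity, delivered asynchronously."""
        for i in range(5):
            events.put(f"connection {i} opened+closed")
            time.sleep(0.01)            # far faster than the polling interval

    threading.Thread(target=toy_network_activity).start()
    time.sleep(0.2)                     # one slow "poll" interval elapses
    while not events.empty():
        print("event log:", events.get())   # nothing was missed
    ```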

  3. Rolling contact fatigue life of ion-implanted GCr15

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Presents experimental research into the rolling contact fatigue life of GCr15 steel with TixN, TixN + Ag, and TixN + DLC layers ion-implanted using plasma ion-implantation technology, tested on a ball-rod type high-speed contact fatigue tester. The test results show that the fatigue life increases to varying degrees with TixN, TixN + Ag, and TixN + DLC layers implanted, increasing 1.8 times with the TixN + Ag layer; that hairline cracks grow continuously into fatigue pits under the action of shear stress in the superficial layer of the material; and that ion implantation acts to prevent the initiation of cracks and slow down their propagation.

  4. On the description of the GCR intensity in the last three solar minima

    CERN Document Server

    Kalinin, M S; Krainev, M B; Svirzhevskaya, A K; Svirzhevsky, N S

    2014-01-01

    We discuss the main characteristic features of the heliospheric parameters important for GCR intensity modulation for the last three solar minima (1986-1987, 1996-1997, and 2008-2009). A model for GCR intensity modulation is considered, and a set of model parameters is chosen that allows the description of the observed GCR intensity distributions at the moments of maximum GCR intensity in the two solar minima (1987 and 1997) that were normal for the second half of the last century. We then try to describe, with the above model and parameter set, the unusually soft GCR energy spectra at the moment of maximum GCR intensity in the last solar minimum, between cycles 23 and 24 (2009). Our main conclusion is that the simplest way to do so is to reduce the size of the modulation region and, probably, change the rigidity dependence of the diffusion coefficient. The change of both parameters is supported by observations of the solar wind and heliospheric magnetic field.
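
    The paper fits a full modulation model; as a hedged illustration of how a single parameter reshapes the low-energy GCR spectrum, the much simpler force-field approximation can be sketched (this is not the authors' model, and the power-law local interstellar spectrum below is invented):

    ```python
    import math

    E0 = 938.0   # proton rest energy [MeV]

    def j_lis(E):
        """Toy local interstellar proton spectrum (invented power law)."""
        return 1.0e4 * (E + E0) ** -2.7

    def j_modulated(E, phi):
        """Force-field approximation for protons, modulation potential phi [MV]."""
        return (j_lis(E + phi)
                * (E * (E + 2 * E0)) / ((E + phi) * (E + phi + 2 * E0)))

    for E in (100.0, 1000.0, 10000.0):
        ratio = j_modulated(E, 300.0) / j_modulated(E, 1200.0)
        print(f"E = {E:7.0f} MeV: J(solar min)/J(solar max) = {ratio:.2f}")
    ```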

  5. Early Results from the Advanced Radiation Protection Thick GCR Shielding Project

    Science.gov (United States)

    Norman, Ryan B.; Clowdsley, Martha; Slaba, Tony; Heilbronn, Lawrence; Zeitlin, Cary; Kenny, Sean; Crespo, Luis; Giesy, Daniel; Warner, James; McGirl, Natalie

    2017-01-01

    The Advanced Radiation Protection Thick Galactic Cosmic Ray (GCR) Shielding Project leverages experimental and modeling approaches to validate a predicted minimum in the radiation exposure versus shielding depth curve. Preliminary results of space radiation models indicate that a minimum in the dose equivalent versus aluminum shielding thickness may exist in the 20-30 g/cm2 region. For greater shield thickness, dose equivalent increases due to secondary neutron and light particle production. This result goes against the long-held belief in the space radiation shielding community that increasing shielding thickness will decrease risk to crew health. A comprehensive modeling effort was undertaken to verify the preliminary modeling results using multiple Monte Carlo and deterministic space radiation transport codes. These results verified the preliminary findings of a minimum and helped drive the design of the experimental component of the project. In first-of-their-kind experiments performed at the NASA Space Radiation Laboratory, neutrons and light ions were measured between large thicknesses of aluminum shielding. Both an upstream and a downstream shield were incorporated into the experiment to represent the radiation environment inside a spacecraft. These measurements are used to validate the Monte Carlo codes and derive uncertainty distributions for exposure estimates behind thick shielding similar to that provided by spacecraft on a Mars mission. Preliminary results for all aspects of the project will be presented.

  6. GCR intensity during the sunspot maximum phase and the inversion of the heliospheric magnetic field

    CERN Document Server

    Krainev, M; Kalinin, M; Svirzhevskaya, A; Svirzhevsky, N

    2015-01-01

    The maximum phase of the solar cycle is characterized by several interesting features in solar activity, heliospheric characteristics, and the galactic cosmic ray (GCR) intensity. Recently the maximum phase of the current solar cycle (SC) 24, in many respects anomalous when compared with the solar cycles of the second half of the 20th century, came to an end. The corresponding phase in the GCR intensity cycle is still in progress. In this paper we study different aspects of sunspot, heliospheric, and GCR behavior around this phase. Our main conclusions are as follows: 1) The maximum phase of sunspot SC 24 ended in June 2014, the development of the sunspot cycle being similar to those of SC 14 and 15 (the Gleissberg minimum). The maximum phase of SC 24 in the GCR intensity is still in progress. 2) The inversion of the heliospheric magnetic field consists of three stages, characterized by the appearance of the global heliospheric current sheet (HCS) connecting all longitudes. In two transition dipole stages ...

  7. New Technology of GCr15 Steel's Spheroidizing Annealing in a Pulsed Electric Field

    Institute of Scientific and Technical Information of China (English)

    曹丽云; 王建中; 曹力生

    2002-01-01

    A new spheroidizing annealing process for GCr15 steel under a pulsed electric field was studied. The results show that, under a pulsed electric field, the spheroidizing annealing process of GCr15 steel can be simplified: while still producing a good spheroidized microstructure, the heating and isothermal temperatures can be lowered and the holding time of the spheroidizing anneal shortened.

  8. Development of Large-Size Bearing Steel GCr15SiMn

    Institute of Scientific and Technical Information of China (English)

    闻小德

    2014-01-01

    The Special Steel Department of Laiwu Steel produced Φ120 mm GCr15SiMn bearing steel using the process route: hot metal charging + steel scrap → 100 t EAF smelting → LF refining → VD vacuum degassing → continuous casting (Φ500 mm) → slow cooling in pit → heating → rolling (Φ1350 × 1 + Φ950 × 4 + Φ800 × 2) → slow cooling in pit → finishing. By optimizing the smelting process and adopting protective casting, weak secondary cooling, controlled heating, and large-compression-ratio rolling, the developed GCr15SiMn bearing steel had a uniform composition and high purity. The oxygen content was controlled to (9-10)×10-6, the carbide banding was below grade 1.5, the carbide liquation was grade 0.5, and all indicators fully met the requirements of the technical standard.

  9. Spheroidizing annealing technology for GCr15 bearing steel

    Institute of Scientific and Technical Information of China (English)

    孙明义; 杜振民; 郑秀仿; 秦文明; 郭俊成

    2013-01-01

    The most effective way for steel wire to obtain a fine spheroidal pearlite structure is spheroidizing annealing. A spheroidizing process for bearing steel was designed according to the target standard: GCr15 hot-rolled pickled wire rod was selected as the raw material, and a roller-hearth short-cycle protective-atmosphere heat treatment furnace (STC furnace) imported from South Korea was used, with the furnace temperature controlled to within ±5 °C. Two processes were tested, and the mechanical properties and metallographic structures of the treated GCr15 wire rod were analyzed and compared. The results indicate that both processes reached the design goal, but Process No. 1 (hot charging, heating to 795 °C, holding for 7 h, fast cooling to 720 °C, holding for 5 h, cooling to 650 °C at 20 °C/h, then discharging) gave the better spheroidizing effect and can be used for initial production.

  10. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically, a four-cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion), and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is accepted practice to sample the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately, the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine...... problems on accurate air/fuel ratio control of a spark ignition (SI) engine.

  11. Landscape of international event-based biosurveillance.

    Science.gov (United States)

    Hartley, DM; Nelson, NP; Walters, R; Arthur, R; Yangarber, R; Madoff, L; Linge, JP; Mawudeku, A; Collier, N; Brownstein, JS; Thinus, G; Lightfoot, N

    2010-01-01

    Event-based biosurveillance is a scientific discipline in which diverse sources of data, many of which are available from the Internet, are characterized prospectively to provide information on infectious disease events. Biosurveillance complements traditional public health surveillance to provide both early warning of infectious disease events and situational awareness. The Global Health Security Action Group of the Global Health Security Initiative is developing a biosurveillance capability that integrates and leverages component systems from member nations. This work discusses these biosurveillance systems and identifies needed future studies.

  12. On the GCR intensity and the inversion of the heliospheric magnetic field during the periods of the high solar activity

    CERN Document Server

    Krainev, M B

    2014-01-01

    We consider the long-term behavior of solar and heliospheric parameters and the GCR intensity during periods of high solar activity and inversions of the heliospheric magnetic field (HMF). The classification of HMF polarity structures and the meaning of the HMF inversion are discussed. A procedure is presented for using the known HMF polarity distribution in GCR intensity modeling during periods of high solar activity. We also briefly discuss the development and near future of sunspot activity and the GCR intensity in the current unusual solar cycle 24.

  13. Asynchronous event-based binocular stereo matching.

    Science.gov (United States)

    Rogister, Paul; Benosman, Ryad; Ieng, Sio-Hoi; Lichtsteiner, Patrick; Delbruck, Tobi

    2012-02-01

    We present a novel event-based stereo matching algorithm that exploits the asynchronous visual events from a pair of silicon retinas. Unlike conventional frame-based cameras, recent artificial retinas transmit their outputs as a continuous stream of asynchronous temporal events, in a manner similar to the output cells of the biological retina. Our algorithm uses the timing information carried by this representation in addressing the stereo-matching problem on moving objects. Using the high temporal resolution of the acquired data stream for the dynamic vision sensor, we show that matching on the timing of the visual events provides a new solution to the real-time computation of 3-D objects when combined with geometric constraints using the distance to the epipolar lines. The proposed algorithm is able to filter out incorrect matches and to accurately reconstruct the depth of moving objects despite the low spatial resolution of the sensor. This brief sets up the principles for further event-based vision processing and demonstrates the importance of dynamic information and spike timing in processing asynchronous streams of visual events.
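
    A simplified sketch of the matching principle, not the authors' full algorithm: for a rectified pair of sensors, each left event is matched to the right event on the same row (the epipolar constraint) whose timestamp is closest within a coincidence window. The event data below are invented:

    ```python
    # toy events: (timestamp_ms, row, column)
    left  = [(12.000, 5, 40), (12.003, 9, 70)]
    right = [(12.001, 5, 31), (12.050, 9, 66)]

    WINDOW_MS = 5.0   # temporal coincidence window
    for tl, rl, cl in left:
        # candidates on the same row, close enough in time
        candidates = [(abs(tr - tl), cr) for tr, rr, cr in right
                      if rr == rl and abs(tr - tl) <= WINDOW_MS]
        if candidates:
            _, cr = min(candidates)         # closest in time wins
            print(f"row {rl}: disparity = {cl - cr} pixels")
    ```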

  14. Evaluation of SPE and GCR Radiation Effects in Inflatable, Space Suit and Composite Habitat Materials Project

    Science.gov (United States)

    Waller, Jess M.; Nichols, Charles

    2016-01-01

    The radiation resistance of polymeric and composite materials to space radiation is currently assessed by irradiating materials with Co-60 gamma radiation to the equivalent total ionizing dose (TID) expected during a mission. This is an approximation, since gamma radiation is not truly representative of the particle species encountered in space, namely Solar Particle Event (SPE) protons and Galactic Cosmic Ray (GCR) nucleons. In general, SPE and GCR particle energies are much higher than those of Co-60 gamma-ray photons, and since the particles have mass, there is a displacement effect due to nuclear collisions between the particle species and the target material. This effort specifically bridges the gap between estimated service lifetimes based on decades-old Co-60 gamma-radiation data and newer assessments based on irradiation with particle species that are more representative of the space radiation environment.

  15. A model for GCR-particle fluxes in stony meteorites and production rates of cosmogenic nuclides

    Science.gov (United States)

    Reedy, R. C.

    1985-02-01

    A model is presented for the differential fluxes of galactic-cosmic-ray (GCR) particles with energies above 1 MeV inside any spherical stony meteorite as a function of the meteorite's radius and the sample's depth. This model is based on the Reedy-Arnold equations for the energy-dependent fluxes of GCR particles in the moon and is an extension of flux parameters that were derived for several meteorites of various sizes. This flux is used to calculate the production rates of many cosmogenic nuclides as a function of radius and depth. The peak production rates for most nuclides made by the reactions of energetic GCR particles occur near the centers of meteorites with radii of 40 to 70 g/cm2. Although the model has some limitations, it reproduces well the basic trends for the depth-dependent production of cosmogenic nuclides in stony meteorites of various radii. These production profiles agree fairly well with measurements of cosmogenic nuclides in meteorites. Some of these production profiles differ from those calculated by others. The chemical dependence of the production rates for several nuclides varies with size and depth.
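
    The production-rate calculation such flux models feed into has the form P = N * integral(sigma(E) * phi(E) dE) over the GCR energy range. A crude trapezoid-rule sketch with invented flux and excitation-function values:

    ```python
    energies = [10, 50, 100, 500, 1000, 3000]       # MeV
    flux     = [0.2, 1.5, 2.0, 1.0, 0.5, 0.1]       # particles/(cm^2 s MeV), toy
    sigma    = [0.0, 5.0, 20.0, 35.0, 30.0, 25.0]   # mb, toy excitation function

    N_atoms = 1.0e22                                 # target atoms per gram (toy)
    rate = 0.0
    for i in range(len(energies) - 1):
        dE = energies[i + 1] - energies[i]
        # trapezoid rule; 1 mb = 1e-27 cm^2
        f0 = flux[i] * sigma[i] * 1e-27
        f1 = flux[i + 1] * sigma[i + 1] * 1e-27
        rate += 0.5 * (f0 + f1) * dE
    print(f"production rate: {N_atoms * rate:.3e} atoms/(g s)")
    ```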

  16. On the mechanisms of the quasi-biennial oscillations in the GCR intensity

    CERN Document Server

    Krainev, M; Kalinin, M; Svirzhevskaya, A; Svirzhevsky, N

    2015-01-01

    Quasi-biennial oscillation (QBO) is a well-known quasi-periodic variation with a characteristic time of 0.5-4 years in different solar, heliospheric, and cosmic ray characteristics. In this paper a hypothesis is checked on the causes of the apparent lack of correlation between solar and heliospheric QBOs; the possible mechanisms of QBO in the GCR intensity are then discussed, as well as the idea of a common nature of the step-like changes and Gnevyshev Gap effects in the GCR intensity. Our main conclusions are as follows: 1) To a first approximation, the hypothesis is justified that the change in the sunspot and QBO cycles in the transition from the Sun to the heliosphere is due to a) the different magnitudes and time behavior of the large-scale and small-scale photospheric solar magnetic fields and b) the stronger attenuation of the small-scale fields in this transition. 2) As the QBO in the HMF strength influences both the diffusion coefficients and the drift velocity, it can give rise to the complex QBO in the GCR ...

  17. An Event Based Approach To Situational Representation

    CERN Document Server

    Ashish, Naveen; Mehrotra, Sharad; Venkatasubramanian, Nalini

    2009-01-01

    Many application domains require representing interrelated real-world activities and/or evolving physical phenomena. In the crisis response domain, for instance, one may be interested in representing the state of the unfolding crisis (e.g., forest fire), the progress of the response activities such as evacuation and traffic control, and the state of the crisis site(s). Such a situation representation can then be used to support a multitude of applications, including situation monitoring, analysis, and planning. In this paper, we make a case for an event-based representation of situations, where events are defined as domain-specific significant occurrences in space and time. We argue that events offer a unifying and powerful abstraction for building situational awareness applications. We identify challenges in building an Event Management System (EMS) for which traditional data and knowledge management systems prove to be limited, and suggest possible directions and technologies to address the challenges.

  18. Solidified crust mechanism of refining slag for GCr15 bearing steel

    Institute of Scientific and Technical Information of China (English)

    刘志宏; 张兴中

    2015-01-01

    Severe crusting of the refining slag during production of GCr15 bearing steel causes substantial gas pickup by the molten steel and an increase in inclusions. Industrial slag samples with different degrees of crusting were examined by chemical analysis, melting tests, SEM, and XRD to study the effects of chemical composition, melting behavior, and microstructure on crust formation. The results show that the crust consists primarily of calcium aluminates, calcium oxide, magnesium-aluminate spinel, and dicalcium silicate. The high-melting-point calcium oxide, spinel, and dicalcium silicate precipitate ahead of the low-melting-point calcium aluminates and are embedded within them, strengthening the bond between calcium aluminate crystals and causing the slag to crust. Crusting can be prevented by optimizing the slag composition into the low-melting calcium aluminate region of the CaO-SiO2-Al2O3-MgO phase diagram, which reduces the precipitation of high-melting phases during cooling.

  19. Microstructural evolution of GCr15 steel during austenitizing and quenching considering C and Cr content

    Institute of Scientific and Technical Information of China (English)

    刘青龙; 钱东升; 魏文婷

    2016-01-01

    The microstructural evolution of GCr15 steels with different C and Cr contents during austenitizing and quenching was studied. Thermodynamic analysis of cementite dissolution was applied to obtain the critical temperature. The coordination number x in FexCr3-xC and the volume fraction of undissolved cementite were computed according to element conservation and the equilibrium phase diagram. The MS (martensite transformation start temperature) was calculated using an empirical formula. The retained austenite content was calculated with further consideration of the quenching temperature. The results showed that the coordination number and the undissolved cementite content increase with the austenitizing temperature and the carbon content of the steel. Increasing the Cr content reduces the coordination number. GCr15 steels with different compositions had nearly the same MS when austenitized at 830 °C to 860 °C. The interaction of C and Cr complicates the evolution of MS and the retained austenite content. The results were in good agreement with the literature and can guide the attainment of specified retained austenite and/or carbide contents.
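
    The abstract cites an empirical MS formula without naming it; a widely used one is Andrews' (1965) linear relation, sketched below for illustration. It may not be the formula the authors used, and only the carbon actually dissolved in austenite (not the steel's nominal ~1 wt%) should be entered:

    ```python
    def ms_andrews(C, Mn=0.3, Cr=1.5, Ni=0.0, Mo=0.0):
        """Andrews (1965) linear MS estimate [deg C]; composition in wt%."""
        return 539 - 423*C - 30.4*Mn - 12.1*Cr - 17.7*Ni - 7.5*Mo

    # GCr15 nominal Cr ~1.5 wt%; dissolved C varies with austenitizing conditions
    for c_in_austenite in (0.5, 0.6, 0.7):
        print(f"C in austenite = {c_in_austenite} wt%: "
              f"MS ~ {ms_andrews(c_in_austenite):.0f} C")
    ```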

  20. Miniaturized Hollow-Waveguide Gas Correlation Radiometer (GCR) for Trace Gas Detection in the Martian Atmosphere

    Science.gov (United States)

    Wilson, Emily L.; Georgieva, E. M.; Melroy, H. R.

    2012-01-01

    Gas correlation radiometry (GCR) has been shown to be a sensitive and versatile method for detecting trace gases in Earth's atmosphere. Here, we present a miniaturized and simplified version of this instrument capable of mapping multiple trace gases and identifying active regions on the Mars surface. Reduction of the size and mass of the GCR instrument has been achieved by implementing a lightweight, 1 mm inner diameter hollow-core optical fiber (hollow waveguide) for the gas correlation cell. Based on a comparison with an Earth-orbiting CO2 gas correlation instrument, replacement of the 10-meter multipass cell with a hollow waveguide of equivalent pathlength reduces the cell mass from approx. 150 kg to approx. 0.5 kg, and reduces the volume from 1.9 m x 1.3 m x 0.86 m to a small bundle of fiber coils approximately 1 meter in diameter by 0.05 m in height (mass and volume reductions of >99%). This modular instrument technique can be expanded to include measurements of additional species of interest, including nitrous oxide (N2O), hydrogen sulfide (H2S), methanol (CH3OH), and sulfur dioxide (SO2), as well as carbon dioxide (CO2) for a simultaneous measure of mass balance.

  1. Elemental GCR Observations during the 2009-2010 Solar Minimum Period

    Science.gov (United States)

    Lave, K. A.; Israel, M. H.; Binns, W. R.; Christian, E. R.; Cummings, A. C.; Davis, A. J.; deNolfo, G. A.; Leske, R. A.; Mewaldt, R. A.; Stone, E. C.

    2013-01-01

    Using observations from the Cosmic Ray Isotope Spectrometer (CRIS) onboard the Advanced Composition Explorer (ACE), we present new measurements of the galactic cosmic ray (GCR) elemental composition and energy spectra for the species B through Ni in the energy range approx. 50-550 MeV/nucleon during the record-setting 2009-2010 solar minimum period. These data are compared with our observations from the 1997-1998 solar minimum period, when solar modulation in the heliosphere was somewhat higher. For these species, we find that the intensities during the 2009-2010 solar minimum were approx. 20% higher than those in the previous solar minimum, and in fact were the highest GCR intensities recorded during the space age. Relative abundances for these species during the two solar minimum periods differed by small but statistically significant amounts, which are attributed to the combination of spectral shape differences between primary and secondary GCRs in the interstellar medium and differences between the levels of solar modulation in the two solar minima. We also present the secondary-to-primary ratios B/C and (Sc+Ti+V)/Fe for both solar minimum periods, and demonstrate that these ratios are reasonably well fit by a simple "leaky-box" galactic transport model combined with a spherically symmetric solar modulation model.
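
    The "leaky-box" estimate behind such fits can be written in one line: with escape grammage lambda_esc, a secondary-to-primary ratio is roughly (lambda_esc/lambda_prod) / (1 + lambda_esc/lambda_S), where lambda_prod and lambda_S are the production and destruction grammages. The values below are illustrative placeholders, not the paper's fitted parameters:

    ```python
    def secondary_to_primary(lam_esc, lam_prod=25.0, lam_s=10.0):
        """Steady-state leaky-box secondary/primary ratio (grammages in g/cm^2)."""
        return (lam_esc / lam_prod) / (1.0 + lam_esc / lam_s)

    for lam_esc in (2.0, 5.0, 10.0):
        print(f"lambda_esc = {lam_esc:4.1f} g/cm^2 -> "
              f"B/C ~ {secondary_to_primary(lam_esc):.2f}")
    ```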

  2. Friction and Wear Behavior of GCr15 Under Multiple Movement Condition

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The friction and wear of GCr15 under cross-sliding conditions were tested on a ball-on-disc wear test machine. The results show that cross-sliding of the friction pair leads to different friction and wear behavior. For the conditions described in this paper, the friction coefficients with the ball reciprocating are smaller than those without ball reciprocating, and the friction coefficients increase with increasing reciprocating frequency. The wear weight loss of the ball subjected to reciprocating sliding decreases, while the wear weight loss of the disc against the reciprocating ball increases. In cross-sliding friction, the worn surfaces of the ball show a crinkled appearance along the circumferential sliding traces. Delamination of small strip debris forms along the plowing traces on the worn disc surface. The plowing furrows on the disc surfaces are deeper and wider than those without reciprocating sliding, and the wear particles from cross-sliding wear are larger than those without reciprocating sliding.

  3. Secondary Cosmic Ray Particles Due to GCR Interactions in the Earth's Atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Battistoni, G.; /Milan U. /INFN, Milan; Cerutti, F.; /CERN; Fasso, A.; /SLAC; Ferrari, A.; /CERN; Garzelli, M.V.; /Milan U. /INFN, Milan; Lantz, M.; /Goteborg, ITP; Muraro, S. /Milan U. /INFN, Milan; Pinsky, L.S.; /Houston U.; Ranft, J.; /Siegen U.; Roesler, S.; /CERN; Sala, P.R.; /Milan U. /INFN, Milan

    2009-06-16

    Primary GCR interact with the Earth's atmosphere, initiating atmospheric showers and thus giving rise to fluxes of secondary particles in the atmosphere. Electromagnetic and hadronic interactions interplay in the production of these particles, whose detection is performed by means of complementary techniques in different energy ranges and at different depths in the atmosphere, down to the Earth's surface. Monte Carlo codes are essential calculation tools which can describe the complexity of the physics of these phenomena, thus allowing the analysis of experimental data. However, these codes are affected by important uncertainties concerning, in particular, hadronic physics at high energy. In this paper we report some results concerning inclusive particle fluxes and atmospheric shower properties as obtained using the FLUKA transport and interaction code. Some emphasis is also given to the validation of the physics models of FLUKA involved in these calculations.

  4. Hot deformation behaviors and flow stress model of GCr15 bearing steel

    Institute of Scientific and Technical Information of China (English)

    LIAO Shu-lun; ZHANG Li-wen; YUE Chong-xiang; PEI Ji-bin; GAO Hui-ju

    2008-01-01

    The hot deformation behaviors of GCr15 bearing steel were investigated by isothermal compression tests performed on a Gleeble-3800 thermal-mechanical simulator at temperatures between 950 °C and 1150 °C and strain rates between 0.1 and 10 s-1. The peak stress and peak strain as functions of processing parameters were obtained. The dependence of peak stress on strain rate and temperature obeys a hyperbolic sine equation with a Zener-Hollomon parameter. By regression analysis, in the temperature range of 950-1150 °C and strain rate range of 0.1-10 s-1, the mean activation energy and the stress exponent were determined to be 351 kJ/mol and 4.728, respectively. Meanwhile, models of flow stress and dynamic recrystallization (DRX) grain size were also established. The model predictions show good agreement with experimental results.
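
    In the usual Sellars-Tegart form, the hyperbolic sine relation mentioned above reads as follows (a standard formulation; A and alpha are generic material constants, not values reported in the abstract):

        \[
        Z = \dot{\varepsilon}\,\exp\!\left(\frac{Q}{RT}\right)
          = A\,\bigl[\sinh(\alpha\,\sigma_p)\bigr]^{n},
        \qquad
        \sigma_p = \frac{1}{\alpha}\,
          \sinh^{-1}\!\Bigl[\bigl(Z/A\bigr)^{1/n}\Bigr]
        \]

    with Q = 351 kJ/mol and n = 4.728 as determined above, R the gas constant, T the absolute temperature, and sigma_p the peak stress.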

  5. Phase Transformation and Microstructure Evolution of GCr15 Steel during Continuous Cooling

    Institute of Scientific and Technical Information of China (English)

    张小垒; 李辉; 徐士新; 李志超; 米振莉

    2014-01-01

    The continuous cooling transformation (CCT) curves of GCr15 steel were determined by the thermal dilation method combined with microstructure examination and hardness measurement, and the phase transformations and microstructures at different heating temperatures and continuous cooling rates were analyzed. The results show that the hardness of GCr15 steel increases with increasing cooling rate. When the heating temperature is raised from the intercritical region to the fully austenitic region, the pearlite transformation zone in the CCT curve shifts to the lower right, the pearlite transformation is delayed and its transformation temperature range expands. With increasing austenitizing temperature, the grains coarsen and the start temperatures of the pearlite and martensite transformations decrease.

  6. Effect of Austenitizing Process on Quick Spheroidizing Result for GCr15 Steel

    Institute of Scientific and Technical Information of China (English)

    袁晓敏; 陈明华

    2014-01-01

    The effects of austenitizing temperature, holding time and cooling rate in the two-phase region of a quick spheroidizing process on the amount and distribution of residual carbide particles in GCr15 steel were investigated. A new quick spheroidizing process was worked out on the basis of the DET (divorced eutectoid transformation) and of the effect of the austenite state on residual carbide particles in GCr15 steel. The experiments show that after quick spheroidizing by austenitizing at 790 °C for 10 min, furnace cooling to 720 °C and holding for 60 min, then furnace cooling, the spheroidized microstructure is rated grade 2.5 and the total annealing cycle is 3.5 h, appreciably superior to the traditional spheroidizing process.

  7. 27-day variation of the GCR intensity based on corrected and uncorrected for geomagnetic disturbances data of neutron monitors

    CERN Document Server

    Alania, M V; Wawrzynczak, A; Sdobnov, V E; Kravtsova, M V

    2015-01-01

    We study the 27-day variations of the galactic cosmic ray (GCR) intensity for the 2005-2008 period of solar cycle 23, using neutron monitor (NM) data corrected and uncorrected for geomagnetic disturbances. Besides the limited time intervals when the 27-day variations are clearly established, there always exist some feeble 27-day variations in the GCR intensity related to the constantly present weak heliolongitudinal asymmetry in the heliosphere. We calculate the amplitudes of the 27-day variation of the GCR intensity based on the NM data corrected and uncorrected for geomagnetic disturbances. We show that these amplitudes do not differ for NMs with cut-off rigidities smaller than 4-5 GV compared with NMs of higher cut-off rigidities. The rigidity spectrum of the 27-day variation of the GCR intensity found in the uncorrected data is soft, while it is hard in the case of the corrected data. For both cases there exists a definite tendency of softening of the temporal changes of the 27-day variation's rigidity spectrum in period ...

  8. Peculiarities of Galactic Cosmic Ray (GCR) anisotropy variation in connection with the recurrent and sporadic Forbush effects

    Science.gov (United States)

    Naskidashvili, B. D.; Nachkebia, N. A.; Tsereteli, G. L.; Shatashvili, L. K.

    1985-01-01

    It has been established that the change of the solar-diurnal anisotropy vector of galactic cosmic rays (GCR) begins ahead of the arrival of a disturbed region (DR) of the solar wind whose lifetime is tau >= 8 days. The meridional density gradient during the recurrent Forbush decreases (FD) is also evaluated.

  9. Results of Simulated Galactic Cosmic Radiation (GCR) and Solar Particle Events (SPE) on Spectra Restraint Fabric

    Science.gov (United States)

    Peters, Benjamin; Hussain, Sarosh; Waller, Jess

    2017-01-01

    Spectra or similar ultra-high-molecular-weight polyethylene (UHMWPE) fabric is the likely choice for future structural space suit restraint materials due to its high strength-to-weight ratio, abrasion resistance, and dimensional stability. During long duration space missions, space suits will be subjected to significant amounts of high-energy radiation from several different sources. To ensure that pressure garment designs properly account for the effects of radiation, it is important to characterize the mechanical changes to structural materials after they have been irradiated. White Sands Test Facility (WSTF) collaborated with the Crew and Thermal Systems Division at the Johnson Space Center (JSC) to irradiate and test various space suit materials, examining their tensile properties through blunt probe puncture testing and single fiber tensile testing after the materials had been dosed at various levels with simulated GCR and SPE iron and proton beams at Brookhaven National Laboratory. The dosages were chosen based on a simulation developed by the Structural Engineering Division at JSC for the expected radiation dosages received by space suit softgoods on a Mars reference mission. Spectra fabric tested in this effort saw equivalent dosages at 2x, 10x, and 20x the predicted dose, as well as a simulated 50-year exposure, to examine the range of effects on the material and whether any degradation due to GCR would be present if the suit softgoods were stored in deep space for a long period of time. This paper presents the results of this work and outlines the impact on space suit pressure garment design for long duration deep space missions.

  10. A Stochastic Model of Space Radiation Transport as a Tool in the Development of Time-Dependent Risk Assessment

    Science.gov (United States)

    Kim, Myung-Hee Y.; Nounu, Hatem N.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2011-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) [1] for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of heavy ions in tissue and shielding materials is made with a stochastic approach that includes both ion track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model [2]. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections.
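
    A minimal sketch of the stochastic ingredient such a Monte Carlo transport is built on: sampling nuclear interaction points along an ion's path from an exponential free-path distribution. The cross section and target values below are assumptions for illustration; GERMcode's actual physics is far richer.

        import random

        N_A = 6.022e23      # Avogadro's number, 1/mol
        rho = 1.0           # g/cm^3, water-like tissue (assumed)
        A_t = 18.0          # g/mol, effective target molar mass (assumed)
        sigma = 1.0e-24     # cm^2, assumed total nuclear cross section

        mfp = A_t / (N_A * rho * sigma)   # mean free path, cm

        def interaction_depths(track_length_cm):
            # successive free paths are exponentially distributed
            depths, x = [], 0.0
            while True:
                x += random.expovariate(1.0 / mfp)
                if x > track_length_cm:
                    return depths
                depths.append(x)

        print(interaction_depths(30.0))  # interaction depths over a 30 cm path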

  11. Modeling the time and energy behavior of the GCR intensity in the periods of low activity around the last three solar minima

    CERN Document Server

    Krainev, M B; Kalinin, M S; Svirzhevskaya, A K; Svirzhevsky, N S

    2014-01-01

    Using a simple model for the description of GCR modulation in the heliosphere and the sets of parameters discussed in the accompanying paper, we model some features of the time and energy behavior of the GCR intensity near the Earth observed during periods of low solar activity around the last three solar minima. In order to understand the mechanisms underlying these features in the GCR behavior, we use the previously suggested decomposition of the calculated intensity into the partial intensities corresponding to the main processes (diffusion, adiabatic losses, convection and drifts).
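
    The four processes named above are the terms of the standard Parker transport equation for the omnidirectional distribution function f(r, p, t), of which the paper's simple model is presumably a reduced form:

        \[
        \frac{\partial f}{\partial t}
        = \nabla\cdot(\boldsymbol{K}\cdot\nabla f)
        - (\boldsymbol{V}_{sw} + \langle\boldsymbol{v}_d\rangle)\cdot\nabla f
        + \frac{1}{3}(\nabla\cdot\boldsymbol{V}_{sw})\,\frac{\partial f}{\partial \ln p}
        \]

    where the first term on the right is diffusion (tensor K), the second combines convection with the solar wind V_sw and gradient/curvature drifts, and the third describes adiabatic energy losses.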

  12. Monte Carlo simulation of GCR neutron capture production of cosmogenic nuclides in stony meteorites and lunar surface

    Science.gov (United States)

    Kollár, D.; Michel, R.; Masarik, J.

    2006-03-01

    A purely physical model based on a Monte Carlo simulation of galactic cosmic ray (GCR) particle interaction with meteoroids is used to investigate neutron interactions down to thermal energies. Experimental and/or evaluated excitation functions are used to calculate neutron capture production rates as a function of the size of the meteoroid and the depth below its surface. Presented are the depth profiles of cosmogenic radionuclides 36Cl, 41Ca, 60Co, 59Ni, and 129I for meteoroid radii from 10 cm up to 500 cm and a 2π irradiation. Effects of bulk chemical composition on n-capture processes are studied and discussed for various chondritic and lunar compositions. The mean GCR particle flux over the last 300 ka was determined from the comparison of simulations with measured 41Ca activities in the Apollo 15 drill core. The determined value significantly differs from that obtained using equivalent models of spallation residue production.
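
    The production-rate calculation described above reduces, for each nuclide j, to folding the depth-dependent neutron flux with the capture excitation function (a standard relation, stated here with generic symbols):

        \[
        P_j(R, d) \;=\; N_j \int_0^{\infty} \sigma_j(E)\,\phi_n(E, R, d)\,\mathrm{d}E
        \]

    where N_j is the number of target atoms of species j per unit mass, sigma_j(E) the neutron capture cross section, and phi_n(E, R, d) the neutron flux at depth d in a meteoroid of radius R.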

  13. MHD compressor---expander conversion system integrated with GCR inside a deployable reflector

    Energy Technology Data Exchange (ETDEWEB)

    Tuninetti, G. (Ansaldo S.p.A., Genoa (Italy). Research Div.); Botta, E.; Criscuolo, C.; Riscossa, P. (Ansaldo S.p.A., Genoa (Italy). Nuclear Div.); Giammanco, F. (Pisa Univ. (Italy). Dipt. di Fisica); Rosa-Clot, M. (Florence Univ. (Italy). Dipt. di Fisica)

    1989-04-20

    This work originates from the proposal "MHD Compressor-Expander Conversion System Integrated with a GCR Inside a Deployable Reflector". The proposal concerned an innovative concept of a nuclear, closed-cycle MHD converter for power generation on space-based systems in the multi-megawatt range. The basic element of this converter is the Power Conversion Unit (PCU), consisting of a gas core reactor directly coupled to an MHD expansion channel. Integrated with the PCU, a deployable reflector provides reactivity control. The working fluid could be either uranium hexafluoride or a mixture of uranium hexafluoride and helium, added to enhance the heat transfer properties. The original Statement of Work, which concerned the whole conversion system, was subsequently redirected and focused on the basic mechanisms of neutronics, reactivity control, ionization and electrical conductivity in the PCU. Furthermore, the study was required to be inherently generic, such that the analysis and results can be applied to various nuclear reactor and/or MHD channel designs.

  14. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes XML in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
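
    For readers unfamiliar with the SAX style discussed above, the snippet below shows event-based parsing in Python's xml.sax, a generic illustration of the streaming model (the paper's analysis targets SAX applications, not this code):

        import xml.sax

        class CountHandler(xml.sax.ContentHandler):
            # the parser streams the input and fires these callbacks;
            # no document tree is ever materialized
            def __init__(self):
                self.depth, self.elements = 0, 0
            def startElement(self, name, attrs):
                self.depth += 1
                self.elements += 1
            def endElement(self, name):
                self.depth -= 1  # well-formed output must balance these

        handler = CountHandler()
        xml.sax.parseString(b"<a><b/><b/></a>", handler)
        print(handler.elements)  # 3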

  15. Inclusions in GCr15 Bearing Steel Produced by a 120 t LD-LF-VD-CC Process

    Institute of Scientific and Technical Information of China (English)

    范植金; 罗国华; 冯文圣; 朱玉秀

    2011-01-01

    Inclusions in GCr15 bearing steel produced by a 120 t LD-LF-VD-CC process were investigated using metallographic microscopy, electron probe microanalysis and electrolytic extraction of inclusions. The results indicate that the A-type inclusion grade is 0.5-1.5, the B-type grade 0.5-1.5, the C-type grade 0, and the D-type grade not more than 0.5, fully satisfying the requirements of the GB/T 18254—2002 standard. The average total mass fraction of oxide inclusions in GCr15 bearing steel produced by the LD-LF-VD-CC process is 0.0037%, markedly lower than in Al-killed GCr15 steel produced in an electric furnace. The inclusions are mainly strip-like manganese sulfide, chain-like alumina, spindle-shaped manganese sulfide enwrapping granular alumina, granular calcium aluminate, granular calcium sulfide enwrapping calcium aluminate, granular magnesium-aluminum spinel, granular calcium sulfide enwrapping magnesium-aluminum spinel, and cuboid titanium nitride (carbonitride).

  16. G-protein signalling components GCR1 and GPA1 mediate responses to multiple abiotic stresses in Arabidopsis

    Directory of Open Access Journals (Sweden)

    Navjyoti Chakraborty

    2015-11-01

    G-protein signalling components have been implicated in some individual stress responses in Arabidopsis, but have not been comprehensively evaluated at the genetic and biochemical level. Stress emerged as the largest functional category in our whole-transcriptome analyses of knock-out mutants of GCR1 and/or GPA1 in Arabidopsis (Chakraborty et al., 2015a, PLoS ONE 10, e0117819; Chakraborty et al., 2015b, Plant Mol. Biol., doi: 10.1007/s11103-015-0374-2). This led us to ask whether G-protein signalling components offer converging points in the plant's response to multiple abiotic stresses. To test this hypothesis, we carried out a detailed analysis of the stress category in the present study, which revealed 144 differentially expressed genes (DEGs) spanning a wide range of abiotic stresses, including heat, cold, salt and light stress. Only 10 of these DEGs are shared by all three mutants, while the single mutants (GCR1/GPA1) share more DEGs between themselves than with the double mutant (GCR1-GPA1). RT-qPCR validation of 28 of these genes spanning different stresses revealed identical regulation of the DEGs shared between the mutants. We also validated the effects of cold, heat and salt stresses in all three mutants and the wild type in terms of % germination, root and shoot length, relative water content, proline content, lipid peroxidation and activities of catalase, ascorbate peroxidase and superoxide dismutase. All three mutants showed evidence of stress tolerance, especially to cold, followed by heat and salt, in terms of all the above parameters. This clearly shows, for the first time, the role of GCR1 and GPA1 in mediating the plant's response to multiple abiotic stresses, especially cold, heat and salt, and implies a role for classical G-protein signalling pathways in the stress sensitivity of normal Arabidopsis plants. This is also the first genetic and biochemical evidence of abiotic stress tolerance conferred by knock-out of these G-protein signalling components.

  17. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate.
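
    For contrast with the event-based networks described above, the conventional matrix description of the two gates they reproduce is shown below (a reference calculation, not the paper's method):

        import numpy as np

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
        CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                         [0, 0, 0, 1], [0, 0, 1, 0]])     # controlled-NOT gate

        # start from |00>, apply H to the first qubit, then CNOT
        state = np.kron(H @ np.array([1.0, 0.0]), np.array([1.0, 0.0]))
        state = CNOT @ state
        print(np.round(state, 3))  # (|00> + |11>)/sqrt(2), a Bell state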

  18. Training Team Problem Solving Skills: An Event-Based Approach.

    Science.gov (United States)

    Oser, R. L.; Gualtieri, J. W.; Cannon-Bowers, J. A.; Salas, E.

    1999-01-01

    Discusses how to train teams in problem-solving skills. Topics include team training, the use of technology, instructional strategies, simulations and training, theoretical framework, and an event-based approach for training teams to perform in naturalistic environments. Contains 68 references. (Author/LRW)

  19. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    2011-01-01

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a unified description of these quantum optics experiments.

  1. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified description of these quantum optics experiments.

  2. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier Lagorce

    2015-02-01

    Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  3. Preliminary Research on the Involvement of Arabidopsis GCR2 in Responding to N-Butyryl-DL-homoserine Lactone

    Institute of Scientific and Technical Information of China (English)

    艾秋实; 张哲; 屈凌波; 刘方; 赵芊; 宋水山

    2015-01-01

    N-Butyryl-DL-homoserine lactone (C4-HSL) is a major quorum-sensing signal of gram-negative bacteria. It can promote plant growth and development and activate Ca2+ channels at the cell membrane, thereby participating in the regulation of plant physiology and metabolism; however, the molecular mechanism by which plants sense C4-HSL is unclear. Arabidopsis thaliana GCR2, a receptor of abscisic acid (ABA), is important for plant metabolism. This work aimed to explore whether GCR2 is involved in the sensing of C4-HSL by Arabidopsis. qRT-PCR showed that GCR2 expression was significantly up-regulated 1 h after C4-HSL treatment and peaked at 6 h, indicating that C4-HSL can regulate GCR2. ELISA showed that GCR2 protein expression also peaked at 6 h. GCR2 expressed in vitro was purified and concentrated to 0.6 mg/mL for microscale thermophoresis (MST) measurements. MST gave a dissociation constant (Kd) of 166 nmol/L between C4-HSL and GCR2, indicating strong binding affinity. With BSA as a negative control, the binding of C4-HSL to GCR2 was shown to be specific. These results suggest that GCR2 may be involved in the sensing of C4-HSL by Arabidopsis.
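
    For a simple 1:1 binding model, the measured Kd translates into the fractional occupancy below (a textbook relation added for orientation, not a calculation from the paper):

        # theta = [L] / (Kd + [L]) for 1:1 ligand-receptor binding
        Kd = 166e-9  # mol/L, from the MST measurement
        for L in (50e-9, 166e-9, 1e-6):
            print(f"[C4-HSL] = {L:.0e} M -> occupancy = {L / (Kd + L):.2f}")
        # at [C4-HSL] = Kd the receptor is half-occupied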

  4. Evaluating Shielding Effectiveness for Reducing Space Radiation Cancer Risks

    Science.gov (United States)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Ren, Lei

    2007-01-01

    We discuss calculations of probability distribution functions (PDFs) representing uncertainties in projecting fatal cancer risk from galactic cosmic rays (GCR) and solar particle events (SPE). The PDFs are used in significance tests of the effectiveness of potential radiation shielding approaches. Uncertainties in risk coefficients determined from epidemiology data, dose and dose-rate reduction factors, quality factors, and physics models of radiation environments are considered in models of cancer risk PDFs. Competing mortality risks and functional correlations in radiation quality factor uncertainties are treated in the calculations. We show that the cancer risk uncertainty, defined as the ratio of the 95% confidence level (CL) to the point estimate, is about 4-fold for lunar and Mars mission risk projections. For short-stay lunar missions, SPE risks can be reduced effectively by shielding, especially by carbon composite structures with high hydrogen content. In contrast, for long duration lunar (>180 d) or Mars missions, GCR risks may exceed radiation risk limits, with 95% CLs exceeding 10% fatal risk for males and females on a Mars mission. For reducing GCR cancer risks, shielding materials are marginally effective because of the penetrating nature of GCR and the secondary radiation produced in tissue by relativistic particles. At the present time, polyethylene or carbon composite shielding cannot be shown to significantly reduce risk compared to aluminum shielding, based on a significance test that accounts for radiobiology uncertainties in GCR risk projection.
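
    A toy version of the Monte Carlo propagation behind such risk PDFs. The distributions and the point estimate below are assumptions for illustration; they are not the paper's inputs, so the printed ratio will not reproduce the reported ~4-fold value.

        import random

        random.seed(1)
        point_estimate = 0.032  # assumed fatal-risk point value

        def sample_risk():
            # multiplicative uncertainty factors, each lognormal (assumed)
            q   = random.lognormvariate(0.0, 0.5)  # quality factor
            eps = random.lognormvariate(0.0, 0.3)  # epidemiology
            phy = random.lognormvariate(0.0, 0.2)  # physics/environment
            return point_estimate * q * eps * phy

        risks = sorted(sample_risk() for _ in range(100_000))
        cl95 = risks[int(0.95 * len(risks))]
        print(f"95% CL / point estimate = {cl95 / point_estimate:.1f}")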

  5. Determining the Magnitude of Neutron and Galactic Cosmic Ray (GCR) Fluxes at the Moon using the Lunar Exploration Neutron Detector during the Historic Space-Age Era of High GCR Flux

    Science.gov (United States)

    Chin, G.; Sagdeev, R.; Boynton, W. V.; Mitrofanov, I. G.; Milikh, G. M.; Su, J. J.; Livengood, T. A.; McClanahan, T. P.; Evans, L.; Starr, R. D.; litvak, M. L.; Sanin, A.

    2013-12-01

    The Lunar Reconnaissance Orbiter (LRO) was launched June 18, 2009 during an historic space-age era of minimum solar activity [1]. The lack of solar sunspot activity signaled a complex set of heliospheric phenomena [2,3,4] that also gave rise to a period of unprecedentedly high Galactic Cosmic Ray (GCR) flux [5]. These events coincided with the primary mission of the Lunar Exploration Neutron Detector (LEND, [6]) onboard LRO in a nominal 50-km circular orbit of the Moon [7]. Methods to calculate the emergent neutron albedo population using Monte Carlo techniques [8] rely on an estimate of the GCR flux and spectra calibrated at differing periods of solar activity [9,10,11]. Estimating the actual GCR flux at the Moon during LEND's initial period of operation requires a correction using a model-dependent heliospheric transport modulation parameter [12] to adjust the GCR flux appropriate to this unique solar cycle. These corrections have inherent uncertainties depending on model details [13]. Precisely determining the absolute neutron and GCR fluxes is especially important in understanding the emergent lunar neutrons measured by LEND and subsequently in estimating the hydrogen/water content of the lunar regolith [6]. LEND is constructed with a set of neutron detectors for differing purposes [6]. Specifically, there are two detector systems that measure the flux of epithermal neutrons: (a) the uncollimated Sensor for Epi-Thermal Neutrons (SETN) and (b) the Collimated Sensor for Epi-Thermal Neutrons (CSETN). LEND SETN and CSETN observations form a complementary set of simultaneous measurements that determine the absolute scale of the emergent lunar neutron flux in an unambiguous fashion and without the need for correcting to differing solar-cycle conditions. LEND measurements are combined with a detailed understanding of the sources of instrumental background and the performance of CSETN and SETN. This comparison allows us to calculate a constant scale factor

  6. Mars Science Laboratory; A Model for Event-Based EPO

    Science.gov (United States)

    Mayo, Louis; Lewis, E.; Cline, T.; Stephenson, B.; Erickson, K.; Ng, C.

    2012-10-01

    The NASA Mars Science Laboratory (MSL) and its Curiosity rover, part of NASA's Mars Exploration Program, represent the most ambitious undertaking to date to explore the red planet. MSL/Curiosity was designed primarily to determine whether Mars ever had an environment capable of supporting microbial life. NASA's MSL education program was designed to take advantage of existing, highly successful event-based education programs to communicate Mars science and education themes to worldwide audiences through live webcasts, video interviews with scientists, TV broadcasts, professional development for teachers, and the latest social media frameworks. We report here on the success of the MSL education program and discuss how this methodological framework can be used to enhance other event-based education programs.

  7. Calculation of the Growth Rate of the Divorced Eutectoid Transformation Front in GCr15 Steel

    Institute of Scientific and Technical Information of China (English)

    丁美良; 关建辉

    2013-01-01

    The divorced eutectoid transformation of GCr15 steel was studied by interrupted quenching tests and scanning electron microscopy. The growth rates of the lamellar pearlite transformation front and of the divorced eutectoid transformation front were calculated using phase transformation kinetics. The results show that the smaller the spacing between residual carbide particles in the undercooled austenite, the larger the critical undercooling for the divorced eutectoid transformation.

  8. Measurement of the Contact Fatigue P-S-N Curve for Specially Strengthened GCr15 Steel Balls

    Institute of Scientific and Technical Information of China (English)

    高元安; 韩红民; 张晓旭

    2005-01-01

    The contact fatigue P-S-N curves of specially strengthened GCr15 steel balls were measured, and the undetermined parameters C and m in the relation N = C·S^(-m) between test stress S and specimen life N were estimated. The relation between test stress S and life N at different failure probabilities was obtained, providing an experimental basis for the use and product design of these steel balls.
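
    Fitting the relation N = C·S^(-m) cited above is a linear least-squares problem in log-log space; the sketch below shows the procedure with made-up stress/life pairs (placeholders, not the paper's data):

        import math

        S = [4200.0, 4500.0, 4800.0, 5100.0]  # contact stress, MPa (assumed)
        N = [2.1e7, 9.5e6, 4.4e6, 2.0e6]      # cycles to failure (assumed)

        xs = [math.log(s) for s in S]
        ys = [math.log(n) for n in N]
        k = len(xs)
        xbar, ybar = sum(xs) / k, sum(ys) / k
        slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
                 / sum((x - xbar) ** 2 for x in xs))
        m = -slope                       # since log N = log C - m log S
        C = math.exp(ybar + m * xbar)
        print(f"m = {m:.2f}, C = {C:.3e}")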

  9. DNA binding of the cell cycle transcriptional regulator GcrA depends on N6-adenosine methylation in Caulobacter crescentus and other Alphaproteobacteria.

    Science.gov (United States)

    Fioravanti, Antonella; Fumeaux, Coralie; Mohapatra, Saswat S; Bompard, Coralie; Brilli, Matteo; Frandi, Antonio; Castric, Vincent; Villeret, Vincent; Viollier, Patrick H; Biondi, Emanuele G

    2013-05-01

    Several regulators are involved in the control of cell cycle progression in the bacterial model system Caulobacter crescentus, which divides asymmetrically into a vegetative G1-phase (swarmer) cell and a replicative S-phase (stalked) cell. Here we report a novel functional interaction between the enigmatic cell cycle regulator GcrA and the N6-adenosine methyltransferase CcrM, both highly conserved proteins among Alphaproteobacteria, that are activated early and at the end of S-phase, respectively. As no direct biochemical and regulatory relationship between GcrA and CcrM were known, we used a combination of ChIP (chromatin-immunoprecipitation), biochemical and biophysical experimentation, and genetics to show that GcrA is a dimeric DNA-binding protein that preferentially targets promoters harbouring CcrM methylation sites. After tracing CcrM-dependent N6-methyl-adenosine promoter marks at a genome-wide scale, we show that these marks recruit GcrA in vitro and in vivo. Moreover, we found that, in the presence of a methylated target, GcrA recruits the RNA polymerase to the promoter, consistent with its role in transcriptional activation. Since methylation-dependent DNA binding is also observed with GcrA orthologs from other Alphaproteobacteria, we conclude that GcrA is the founding member of a new and conserved class of transcriptional regulators that function as molecular effectors of a methylation-dependent (non-heritable) epigenetic switch that regulates gene expression during the cell cycle.

  11. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia, using propofol administration and the bispectral index as the controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line using genetic algorithms on a given data set of patients. The effectiveness and robustness of the method is verified in simulation by implementing a Monte Carlo method to address intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection, and effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable.
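
    A minimal send-on-delta controller sketch, to make the event-based idea concrete (this is not the paper's PIDPlus or its noise-filtering event generator; gains and threshold are placeholders):

        class EventPI:
            def __init__(self, kp, ki, delta):
                self.kp, self.ki, self.delta = kp, ki, delta
                self.integral, self.last_e, self.u = 0.0, 0.0, 0.0

            def step(self, setpoint, measurement, dt):
                e = setpoint - measurement
                if abs(e - self.last_e) >= self.delta:  # event fires
                    self.integral += e * dt
                    self.u = self.kp * e + self.ki * self.integral
                    self.last_e = e
                return self.u  # output held constant between events

        ctrl = EventPI(kp=1.2, ki=0.3, delta=0.5)
        print(ctrl.step(setpoint=50.0, measurement=42.0, dt=1.0))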

  12. The bacterial cell cycle regulator GcrA is a σ70 cofactor that drives gene expression from a subset of methylated promoters.

    Science.gov (United States)

    Haakonsen, Diane L; Yuan, Andy H; Laub, Michael T

    2015-11-01

    Cell cycle progression in most organisms requires tightly regulated programs of gene expression. The transcription factors involved typically stimulate gene expression by binding specific DNA sequences in promoters and recruiting RNA polymerase. Here, we found that the essential cell cycle regulator GcrA in Caulobacter crescentus activates the transcription of target genes in a fundamentally different manner. GcrA forms a stable complex with RNA polymerase and localizes to almost all active σ(70)-dependent promoters in vivo but activates transcription primarily at promoters harboring certain DNA methylation sites. Whereas most transcription factors that contact σ(70) interact with domain 4, GcrA interfaces with domain 2, the region that binds the -10 element during strand separation. Using kinetic analyses and a reconstituted in vitro transcription assay, we demonstrated that GcrA can stabilize RNA polymerase binding and directly stimulate open complex formation to activate transcription. Guided by these studies, we identified a regulon of ∼ 200 genes, providing new insight into the essential functions of GcrA. Collectively, our work reveals a new mechanism for transcriptional regulation, and we discuss the potential benefits of activating transcription by promoting RNA polymerase isomerization rather than recruitment exclusively.

  13. Event-based Corpuscular Model for Quantum Optics Experiments

    CERN Document Server

    Michielsen, K; De Raedt, H

    2010-01-01

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a unified description of multiple-beam fringes of a plane parallel plate, single-photon Mach-Zehnder interferometer, Wheeler's delayed choice, photon tunneling, quantum erasers, two-beam interference, double-slit, and Einstein-Podolsky-Rosen-Bohm and Hanbury Brown-Twiss experiments.

  14. Event-based Implicit Invocation Decentralized in Ada

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Event-based implicit invocation, a useful architectural pattern, is attracting increasing attention because of the loose coupling between components and the reactive integration it brings to software systems. Analyzing object-oriented interaction between objects, this paper presents, based upon principles of software architecture, an approach to an event-based object model with Ada exception handling. Consequently, it becomes possible to improve traditional programming languages into architectural description languages by adding specific architectural patterns.
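
    Implicit invocation in miniature: components register interest in events and are invoked indirectly when events are announced. Below is a generic publish-subscribe sketch in Python, not the paper's Ada design.

        class EventBus:
            def __init__(self):
                self.handlers = {}

            def subscribe(self, event, handler):
                self.handlers.setdefault(event, []).append(handler)

            def announce(self, event, payload):
                # the announcer never names the receivers: loose coupling
                for handler in self.handlers.get(event, []):
                    handler(payload)

        bus = EventBus()
        bus.subscribe("file_saved", lambda p: print("indexer saw", p))
        bus.subscribe("file_saved", lambda p: print("backup saw", p))
        bus.announce("file_saved", "notes.txt")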

  15. Event-based processing of neutron scattering data

    Science.gov (United States)

    Peterson, Peter F.; Campbell, Stuart I.; Reuter, Michael A.; Taylor, Russell J.; Zikovsky, Janik

    2015-12-01

    Many of the world's time-of-flight spallation neutron sources are migrating to recording individual neutron events. This provides new opportunities in data processing, not least of which is to filter the events by correlating them with logs of the sample environment and other ancillary equipment. This paper describes techniques for processing neutron scattering data acquired in event mode which preserve event information all the way to a final spectrum, including any necessary corrections or normalizations. This results in smaller final uncertainties compared to traditional methods, while significantly reducing processing time and memory requirements in typical experiments. Results with traditional histogramming techniques are shown for comparison.
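
    The filtering idea described above, in miniature: keep only the neutron events recorded while a sample-environment log was within a band. All names and numbers below are invented for illustration.

        events = [(0.8, 1002), (2.1, 1404), (3.4, 998), (5.7, 1822)]  # (wall time s, TOF us)
        temperature_log = [(0.0, 298.0), (2.0, 305.0), (5.0, 299.5)]  # (wall time s, K)

        def temp_at(t):
            # value of the most recent log entry at wall time t
            current = temperature_log[0][1]
            for ts, val in temperature_log:
                if ts <= t:
                    current = val
            return current

        kept = [ev for ev in events if abs(temp_at(ev[0]) - 300.0) < 2.5]
        print(kept)  # only events taken while the sample was near 300 K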

  16. Differential regulation of the overlapping Kaposi's sarcoma-associated herpesvirus vGCR (orf74) and LANA (orf73) promoters.

    Science.gov (United States)

    Jeong, J; Papin, J; Dittmer, D

    2001-02-01

    Similar to that of other herpesviruses, Kaposi's sarcoma-associated herpesvirus (KSHV/HHV-8) lytic replication destroys the host cell, while the virus can persist in a latent state in synchrony with the host. During latency only a few genes are transcribed, and the question becomes one of what determines latent versus lytic gene expression. Here we undertake a detailed analysis of the latency-associated nuclear antigen (LANA [orf73]) promoter (LANAp). We characterized a minimal region that is necessary and sufficient to maintain high-level transcription in all tissues tested, including primary endothelial cells and B cells, which are the suspected natural host for KSHV. We show that in transient-transfection assays LANAp mimics the expression pattern observed for the authentic promoter in the context of the KSHV episome. Unlike other KSHV promoters tested thus far, LANAp is not affected by tetradecanoyl phorbol acetate or viral lytic cycle functions. It is, however, subject to control by LANA itself and cellular regulatory factors, such as p53. This is in contrast to the K14/vGCR (orf74) promoter, which overlaps LANAp and directs transcription on the opposite strand. We isolated a minimal cis-regulatory region sufficient for K14/vGCR promoter activity and show that it, too, mimics the regulation observed for the authentic viral promoter. In particular, we demonstrate that its activity is absolutely dependent on the immediate-early transactivator orf50, the KSHV homolog of the Epstein-Barr virus Rta transactivator.

  17. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  18. Event-based incremental updating of spatio-temporal database

    Institute of Scientific and Technical Information of China (English)

    周晓光; 陈军; 蒋捷; 朱建军; 李志林

    2004-01-01

    Based on the relationships among geographic events, spatial changes and database operations, a new automatic (semi-automatic) incremental updating approach for spatio-temporal databases (STDB), named event-based incremental updating (E-BIU), is proposed in this paper. First, the relationships among events, spatial changes and database operations are analyzed; then a total architecture for E-BIU implementation is designed, which includes an event queue, three managers and two sets of rules, with each component presented in detail. The process of E-BIU of a master STDB is then described. An example of incremental updating of buildings is given to illustrate this approach at the end. The result shows that E-BIU is an efficient automatic updating approach for a master STDB.

  19. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.
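
    A generic form of such an event-triggered rule, written here only to illustrate the structure described above (the constants and the exact neighbor terms are placeholders, not the paper's condition):

        \[
        t^{i}_{k+1} = \inf\Bigl\{ t > t^{i}_{k} :
          \|e_i(t)\| \ge
          c_1 \sum_{j \in \mathcal{N}_i} \|\hat{x}_j(t) - \hat{x}_i(t)\|
          + c_2\, e^{-\alpha t} \Bigr\}
        \]

    where e_i(t) is the gap between node i's state at its last event and its current state, the sum collects the neighbors' last-broadcast states, and the exponential decay term keeps inter-event times positive, ruling out Zeno behavior.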

  20. Metacognitive awareness of event-based prospective memory.

    Science.gov (United States)

    Thadeus Meeks, J; Hicks, Jason L; Marsh, Richard L

    2007-12-01

    This study examined people's ability to predict and postdict their performance on an event-based prospective memory task. Using nonfocal cues, one group of participants predicted their success at finding animal words and a different group predicted their ability to find words with a particular syllable in them. The authors also administered a self-report questionnaire on everyday prospective and retrospective memory failures. Based on the different strategies adopted by the two groups and correlations among the dependent variables, the authors concluded that people have a basic awareness of their prospective memory abilities, but that this awareness is far from accurate. The importance of metamemory concerning one's prospective memory is discussed in terms of how it influences the strategies people might choose for completing their various everyday intentions.

  1. Draft Title 40 CFR 191 compliance certification application for the Waste Isolation Pilot Plant. Volume 6: Appendix GCR Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-31

    The Geological Characterization Report (GCR) for the WIPP site presents, in one document, a compilation of geologic information available to August, 1978, which is judged to be relevant to studies for the WIPP. The Geological Characterization Report for the WIPP site is neither a preliminary safety analysis report nor an environmental impact statement; these documents, when prepared, should be consulted for appropriate discussion of safety analysis and environmental impact. The Geological Characterization Report of the WIPP site is a unique document and at this time is not required by regulatory process. An overview is presented of the purpose of the WIPP, the purpose of the Geological Characterization Report, the site selection criteria, the events leading to studies in New Mexico, status of studies, and the techniques employed during geological characterization.

  2. Study on the Components and Performance of GCr15 Bearing Steel Surface by Gas Multi-elements Penetrating

    Institute of Scientific and Technical Information of China (English)

    ZHOU Hai; CHEN Fei; YAO Bin; ZHANG Jian-jun; CHEN Li

    2004-01-01

    Gas multi-element penetration is a new surface-hardening technology to improve surface performance. In this paper, we focus on the influence of multi-element penetration on the hardness of the GCr15 bearing steel surface after a C-N-O multi-element penetrating treatment, and analyze the three elements C, N and O in the surface with EDX. Analysis of SEM images shows that a penetrated layer about 75 μm thick forms on the surface, in which 0-30 μm is the passivation layer, 30-60 μm the bright layer, and 60-75 μm the transition layer.

  4. Managing Lunar and Mars Mission Radiation Risks. Part 1; Cancer Risks, Uncertainties, and Shielding Effectiveness

    Science.gov (United States)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Ren, Lei

    2005-01-01

    This document addresses calculations of probability distribution functions (PDFs) representing uncertainties in projecting fatal cancer risk from galactic cosmic rays (GCR) and solar particle events (SPEs). PDFs are used to test the effectiveness of potential radiation shielding approaches. Monte Carlo techniques are used to propagate uncertainties in risk coefficients determined from epidemiology data, dose and dose-rate reduction factors, quality factors, and physics models of radiation environments. Competing mortality risks and functional correlations in radiation quality factor uncertainties are treated in the calculations. The cancer risk uncertainty is about four-fold for lunar and Mars mission risk projections. For short-stay lunar missions, SPE risks can be reduced effectively by shielding. For long-duration (>180 d) lunar or Mars missions, GCR risks may exceed radiation risk limits. While shielding materials are marginally effective in reducing GCR cancer risks because of the penetrating nature of GCR and the secondary radiation produced in tissue by relativistic particles, polyethylene or carbon composite shielding cannot be shown to significantly reduce risk compared to aluminum shielding. Therefore, improving our knowledge of space radiobiology to narrow the uncertainties that lead to wide PDFs is the best approach to ensure radiation protection goals are met for space exploration.

  5. Event based classification of Web 2.0 text streams

    CERN Document Server

    Bauer, Andreas

    2012-01-01

    Web 2.0 applications like Twitter or Facebook create a continuous stream of information. This demands new ways of analysis in order to offer insight into this stream right at the moment of the creation of the information, because much of this data is only relevant within a short period of time. To address this problem, real-time search engines have recently received increased attention. They take into account the continuous flow of information differently than traditional web search by incorporating temporal and social features that describe the context of the information at the time of its creation. Standard approaches, where data first gets stored and then is processed from persistent storage, suffer from latency. We want to address the fluent and rapid nature of text streams by providing an event-based approach that analyses the stream of information directly. In a first step we want to define the difference between real-time search and traditional search to clarify the demands in modern text filtering. In a second s...

  6. Event-based internet biosurveillance: relation to epidemiological observation

    Directory of Open Access Journals (Sweden)

    Noele P. Nelson

    2012-06-01

    Background: The World Health Organization (WHO) collects and publishes surveillance data and statistics for select diseases, but traditional methods of gathering such data are time- and labor-intensive. Event-based biosurveillance, which utilizes a variety of Internet sources, complements traditional surveillance. In this study we assess the reliability of Internet biosurveillance and evaluate disease-specific alert criteria against epidemiological data. Methods: We reviewed and compared WHO epidemiological data and Argus biosurveillance system data for pandemic (H1N1) 2009 (April 2009 - January 2010) from 8 regions and 122 countries to: identify reliable alert criteria among 15 Argus-defined categories; determine the degree of data correlation for disease progression; and assess timeliness of Internet information. Results: Argus generated a total of 1,580 unique alerts; 5 alert categories generated statistically significant correlations with the epidemiological data. Conclusion: Confirmed pandemic (H1N1) 2009 cases collected by Argus and WHO methods returned consistent results and confirmed the reliability and timeliness of Internet information. Disease-specific alert criteria provide situational awareness and may serve as proxy indicators of event progression and escalation in lieu of traditional surveillance data; alerts may identify early-warning indicators of another pandemic, preparing the public health community for disease events.

  7. Research on the Behavior of Non-metallic Inclusions in GCr15 Bearing Steel

    Institute of Scientific and Technical Information of China (English)

    张仰东; 吴晓东; 谈盛康

    2011-01-01

    Based on the BOF→LF→RH→CC production process for GCr15 bearing steel at Huaigang Steel, the size, composition and morphology of non-metallic inclusions in the molten steel were studied using metallography and SEM-EDS analysis, and the changes of the inclusions at different refining-slag basicities were investigated. The main inclusions in the rolled product were mixed oxides and sulfides, and composition diagrams of the inclusions were analyzed and calculated. The results show that micro-inclusions decrease from 23.34 to 14.02 per mm2 after LF refining, decrease slightly after RH degassing, and decrease slightly again during rolling. As smelting proceeds, large inclusions are effectively removed by the steel flow in the ladle, while the proportion of fine inclusions gradually rises. The inclusions present in the steel are mainly oxides, sulfides and CaO(CaS)-Al2O3-MgO complex inclusions.

  8. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in developing an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with opportunities for hands-on demonstrations. Brief descriptions of each tool: ARRBOD, organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR, projection of cancer risk from exposure to space radiation; HemoDose, retrospective dose estimation using multi-type blood cell counts; GERMcode, basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS, simulation of heavy-ion and delta-ray track structure, radiation chemistry, DNA structure and DNA damage at the molecular scale; NASARTI, modeling of the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  9. GERMcode: A Stochastic Model for Space Radiation Risk Assessment

    Science.gov (United States)

    Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2012-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and high charge and energy (HZE) particles that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of HZE particles in tissue and shielding materials is made with a stochastic approach that includes both particle track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes, by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections. For NSRL applications, the GERMcode evaluates a set of biophysical properties, such as the Poisson distribution of particle or delta-ray hits for a given cellular area and particle dose, the radial dose in tissue, and the frequency distribution of energy deposition in a DNA volume. By utilizing the ProE/Fishbowl ray-tracing analysis, the GERMcode will be used as a bi-directional radiation transport model for future spacecraft shielding analysis in support of Mars mission risk assessments. Recent radiobiological experiments suggest the need for new approaches to risk assessment that include time-dependent biological events due to the signaling times for activation and relaxation of biological processes in cells and tissue. Thus, the tracking of the temporal and spatial distribution of events in tissue is a major goal of the GERMcode in support of the simulation of biological processes important in GCR risk assessments. In order to validate our approach, basic radiobiological responses such as cell survival curves, mutation, and chromosomal aberrations
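
    The Poisson hit bookkeeping mentioned above follows from fluence alone (a textbook relation with illustrative numbers, not GERMcode output):

        import math

        fluence = 1.0e6   # particles/cm^2 (assumed)
        area = 1.0e-6     # nucleus cross-sectional area, cm^2 (= 100 um^2)
        mean_hits = fluence * area

        def p_hits(n):
            # probability that a nucleus is traversed exactly n times
            return math.exp(-mean_hits) * mean_hits ** n / math.factorial(n)

        print(f"mean hits per nucleus = {mean_hits:.2f}")
        for n in range(4):
            print(n, f"{p_hits(n):.3f}")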

  10. Practice of Cold-Drawing Process Optimization for GCr15 Bearing Steel Rod

    Institute of Scientific and Technical Information of China (English)

    王莹莹; 杨鹏远

    2015-01-01

    The effects of reduction of area, drawing die angle, sizing zone length and drawing speed on the drawing stress and drawn surface quality of Φ11 mm GCr15 bearing steel (1.01% C, 1.58% Cr) wire rod were analyzed. An optimized process for cold drawing Φ11 mm rod to Φ10.2 mm wire was obtained: Φ11 mm rod (HB 193) - isothermal spheroidizing annealing (785 °C for 4.5 h, then 750 °C for 3 h) - cold drawing to Φ10.4 mm - stress-relief annealing at 740 °C for 4.5 h - cold drawing to the finished Φ10.2 mm bar (HB 205). A drawing speed of 35 m/min and oil lubrication yield good surface quality.

  11. Spatial gradients of GCR protons in the inner heliosphere derived from Ulysses COSPIN/KET and PAMELA measurements

    CERN Document Server

    Gieseler, Jan

    2016-01-01

    During the transition from solar cycle 23 to 24 from 2006 to 2009, the Sun was in an unusual solar minimum with very low activity over a long period. These exceptional conditions included a very low interplanetary magnetic field (IMF) strength and a high tilt angle, which both play an important role in the modulation of galactic cosmic rays (GCR) in the heliosphere. Thus, the radial and latitudinal gradients of GCRs are very much expected to depend not only on the solar magnetic epoch, but also on the overall modulation level. We determine the non-local radial and the latitudinal gradients of protons in the rigidity range from ~0.45 to 2 GV. This was accomplished by using data from the satellite-borne experiment Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) at Earth and the Kiel Electron Telescope (KET) onboard Ulysses on its highly inclined Keplerian orbit around the Sun with the aphelion at Jupiter's orbit. In comparison to the previous A>0 solar magnetic epoch, we find th...
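
    Non-local gradients of this kind are typically extracted from the Ulysses-to-Earth intensity ratio (the standard ansatz in such studies, shown here for orientation; it is not quoted from the abstract):

        \[
        \frac{J_U(t)}{J_E(t)}
        = \exp\bigl[G_r\,\Delta r(t) + G_\theta\,\Delta\theta(t)\bigr]
        \]

    where Delta r and Delta theta are the radial and absolute latitudinal separations of the two spacecraft, and G_r (%/AU) and G_theta (%/degree) are the radial and latitudinal gradients to be fitted.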

  12. Effect of friction heat on tribological behavior of M2 steel against GCr15 steel in dry sliding systems

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Tribological behavior depends significantly on friction heat at high sliding velocities. Many factors influence the conduction rate of friction heat, such as the thermophysical properties of the pair, the composition of the interface film, and the surrounding medium. Through theoretical and experimental studies of surface temperature, heat partition approaches were applied to the pair of M2 steel against GCr15 steel to compare and discuss their tribological behavior in dry sliding contact. The results indicate that the contact pressure has little effect on the heat partition at a high sliding velocity of 40 m/s. Furthermore, the dynamic temperature correlates clearly with the friction coefficient, and the degree of correlation increases as the pressure grows. A close correlation also exists among the temperatures measured at different points of the pin specimen. Finally, X-ray diffraction analysis indicates that secondary M6C carbides precipitate during the friction process.

  13. In vitro manganese-dependent cross-talk between Streptococcus mutans VicK and GcrR: implications for overlapping stress response pathways.

    Directory of Open Access Journals (Sweden)

    Jennifer S Downey

    Streptococcus mutans, a major acidogenic component of the dental plaque biofilm, has a key role in caries etiology. Previously, we demonstrated that the VicRK two-component signal transduction system modulates biofilm formation, oxidative stress and acid tolerance responses in S. mutans. Using in vitro phosphorylation assays, here we demonstrate for the first time that, in addition to activating its cognate response regulator protein, the sensor kinase VicK can transphosphorylate a non-cognate stress regulatory response regulator, GcrR, in the presence of manganese. Manganese is an important micronutrient that has been previously correlated with caries incidence, and which serves as an effector of SloR-mediated metalloregulation in S. mutans. Our findings support regulatory effects of manganese on VicRK, GcrR and SloR, and suggest that the cross-regulatory networks formed by these components are more complex than previously appreciated. Using DNaseI footprinting we observed overlapping DNA binding specificities for VicR and GcrR in native promoters, consistent with these proteins being part of the same transcriptional regulon. Our results also support a role for SloR as a positive regulator of the vicRK two-component signaling system, since its transcription was drastically reduced in a SloR-deficient mutant. These findings demonstrate the regulatory complexities of the S. mutans manganese-dependent response, which involves cross-talk between non-cognate signal transduction systems (VicRK and GcrR) to modulate stress response pathways.

  14. Ionizing Radiation Environments and Exposure Risks

    Science.gov (United States)

    Kim, M. H. Y.

    2015-12-01

    Space radiation environments for historically large solar particle events (SPE) and galactic cosmic rays (GCR) are simulated to characterize exposures to radio-sensitive organs for missions to low-Earth orbit (LEO), the moon, a near-Earth asteroid, and Mars. Primary and secondary particles for SPE and GCR are transported through the respective atmospheres of Earth or Mars, the space vehicle, and the astronaut's body tissues using NASA's HZETRN/QMSFRG computer code. Space radiation protection methods, which are derived largely from ground-based methods recommended by the National Council on Radiation Protection and Measurements (NCRP) or the International Commission on Radiological Protection (ICRP), are built on the principles of risk justification, limitation, and ALARA (as low as reasonably achievable). However, because of the large uncertainties in high charge and energy (HZE) particle radiobiology and the small population of space crews, NASA has developed distinct methods to implement a space radiation protection program. For fatal cancer risk, which has been considered the dominant risk for GCR, the NASA Space Cancer Risk (NSCR) model has been developed from NCRP recommendations and has undergone external review by the National Research Council (NRC), the NCRP, and peer-reviewed publication. The NSCR model uses GCR environmental models, particle transport codes describing the GCR modification by atomic and nuclear interactions in atmospheric shielding coupled with spacecraft and tissue shielding, and NASA-defined quality factors for solid cancer and leukemia risk estimates for HZE particles. By implementing the NSCR model, exposure risks under various heliospheric conditions are assessed for the radiation environments of various mission classes, to inform architectures and strategies of human exploration missions and ultimately to contribute to optimizing the radiation safety and well-being of crewmembers participating in long-term space missions.

  15. Cleanliness of GCr15 Bearing Steel Produced by the Hot Metal Pretreatment→120 t LD→LF→RH→CC Process

    Institute of Scientific and Technical Information of China (English)

    范植金; 罗国华; 徐志东; 王瑞敏

    2011-01-01

    The oxygen, nitrogen, and residual element contents, non-metallic inclusion ratings, and inclusion morphologies and compositions (by electron probe and energy-dispersive spectroscopy, plus electrolytic extraction of inclusions) were analyzed for seven heats of GCr15 bearing steel produced by the hot metal pretreatment→120 t LD→LF→RH→CC process. The results indicate that the cleanliness of the steel produced by this route is rather high; in particular, the nitrogen and residual element contents are markedly lower than in GCr15 bearing steel produced by the electric furnace route. The inclusions are mainly calcium aluminate and alumina, with small amounts of sulfide and nitride, and they always occur as complex inclusions.

  16. Oil Spill! An Event-Based Science Module. Student Edition. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  17. Oil Spill!: An Event-Based Science Module. Teacher's Guide. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  18. Blight! An Event-Based Science Module. Student Edition. Plants and Plant Diseases Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  19. Asteroid! An Event-Based Science Module. Student Edition. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  20. Asteroid! An Event-Based Science Module. Teacher's Guide. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  3. Gold Rush!: An Event-Based Science Module. Student Edition. Rocks and Minerals Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  4. Gold Rush!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  5. Blight! An Event-Based Science Module. Teacher's Guide. Plants and Plant Diseases Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  6. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  7. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  8. Discharges of past flood events based on historical river profiles

    Directory of Open Access Journals (Sweden)

    D. Sudhaus

    2008-10-01

    This paper presents a case study on the estimation of peak discharges of extreme flood events of the Neckar River, located in south-western Germany, during the 19th century. It was carried out as part of the BMBF (German Federal Ministry of Education and Research) research project RIMAX (Risk Management of Extreme Flood Events). The discharge estimations were made for the 1824 and 1882 flood events and are based on historical cross profiles. The 1-D model Hydrologic Engineering Center's River Analysis System (HEC-RAS) was applied with different roughness coefficients to determine these estimations. The results are compared (i) with contemporary historical calculations for the 1824 and 1882 flood events and (ii), in the case of the 1824 flood event, with the discharge simulation by the water balance model LARSIM (Large Area Runoff Simulation Model). These calculations are matched by the HEC-RAS simulation based on the standard roughness coefficients.
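
    For context (standard open-channel hydraulics, not an expression quoted from the paper), 1-D models such as HEC-RAS relate discharge to channel geometry and roughness through Manning-type relations, which is why the choice of roughness coefficient dominates the discharge estimate:

      Q = \frac{1}{n} A R^{2/3} S^{1/2}

    where Q is the discharge, n Manning's roughness coefficient, A the flow cross-sectional area taken from the historical profile, R the hydraulic radius, and S the energy slope.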

  9. Effects on Structure and Abrasion Resistance of GCr15 Steel by Surface Gas-Phase RE Diffused Permeation with Laser Melting Solidification

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The effects of a compound treatment, surface gas-phase rare earth (RE) permeation combined with laser melting modification, on the abrasion resistance and microstructure of the GCr15 steel surface were studied. The results show that after the compound treatment the abrasion resistance of the samples is improved significantly, with the weight loss reduced to 14% of that of the blank sample; the microstructure is denser and more uniform than that of the untreated steel; meanwhile, the grain is refined and the concentration gradients of the permeated elements are decreased markedly.

  10. Qualitative Event-based Diagnosis with Possible Conflicts Applied to Spacecraft Power Distribution Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based diagnosis enables efficient and safe operation of engineered systems. In this paper, we describe two algorithms based on a qualitative event-based fault...

  11. Decentralized Event-Based Communication Strategy on Leader-Follower Consensus Control

    OpenAIRE

    Duosi Xie; Xiaochun Yin; Jianquan Xie

    2016-01-01

    This paper addresses the leader-follower consensus problem of networked systems by using a decentralized event-based control strategy. The event-based control strategy makes the controllers of agents update at aperiodic event instants. Two decentralized event functions are designed to generate these event instants. In particular, the second event function only uses its own information and the neighbors’ states at their latest event instants. By using this event function, no continuous communi...

  12. Efficiency of Event-Based Sampling According to Error Energy Criterion

    OpenAIRE

    Marek Miskowicz

    2010-01-01

    The paper belongs to the studies that deal with the effectiveness of the particular event-based sampling scheme compared to the conventional periodic sampling as a reference. In the present study, the event-based sampling according to a constant energy of sampling error is analyzed. This criterion is suitable for applications where the energy of sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling c...

  13. Assessing the continuum of event-based biosurveillance through an operational lens.

    Science.gov (United States)

    Corley, Courtney D; Lancaster, Mary J; Brigantic, Robert T; Chung, James S; Walters, Ronald A; Arthur, Ray R; Bruckner-Lea, Cynthia J; Calapristi, Augustin; Dowling, Glenn; Hartley, David M; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K; McKenzie, Taylor; Nelson, Noele P; Olsen, Jennifer; Pancerella, Carmen; Quitugua, Teresa N; Reed, Jeremy Todd; Thomas, Carla S

    2012-03-01

    This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have a significant impact on the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance technologies is unclear. This article frames the continuum of event-based biosurveillance systems (that fuse media reports from the internet), models (i.e., computational models that forecast disease occurrence), and constructs (i.e., descriptive analytical reports) through an operational lens (i.e., aspects and attributes associated with operational considerations in the development, testing, and validation of the event-based biosurveillance methods and models and their use in an operational environment). A workshop was held in 2010 to scientifically identify, develop, and vet a set of attributes for event-based biosurveillance. Subject matter experts were invited from 7 federal government agencies and 6 different academic institutions pursuing research in biosurveillance event detection. We describe 8 attribute families for the characterization of event-based biosurveillance: event, readiness, operational aspects, geographic coverage, population coverage, input data, output, and cost. Ultimately, the analyses provide a framework from which the broad scope, complexity, and relevant issues germane to event-based biosurveillance useful in an operational environment can be characterized.

  14. Effect of Nano-additives on Friction and Wear Performance of GCr15/1045 Steels

    Institute of Scientific and Technical Information of China (English)

    李征; 王文健; 刘启跃

    2011-01-01

    With the development of nanotechnology and growing understanding of the special properties of nanomaterials, nanomaterials are increasingly used as additives in studies of machine lubrication, anti-wear behavior, and self-repair. Using a PLINT Deltalab NENE-7 wear tester, the effects of nano aluminum nitride, nano silicon carbide, and oil-soluble nano copper alloy additives on the sliding friction and wear behavior of a GCr15/1045 steel pair were studied, with the medium-duty industrial enclosed gear oil L-CKC220 (Lanzhou Lubricating Oil Plant, PetroChina) as the base oil. The effects of the different nanomaterials were assessed from the friction coefficient curves, worn surface morphology (SEM), and EDX spectra. The results show that all three nano-additives markedly reduce the friction coefficient of the pair. Nano aluminum nitride and the oil-soluble nano copper alloy show good friction-reducing and anti-wear performance as additives, lowering the friction coefficient by 33.3% and 28.6%, respectively, and depositing visibly on the surfaces of the friction pair, whereas nano silicon carbide performs worse.

  15. How safe is safe enough? Radiation risk for a human mission to Mars.

    Directory of Open Access Journals (Sweden)

    Francis A Cucinotta

    Astronauts on a mission to Mars would be exposed for up to 3 years to galactic cosmic rays (GCR), made up of high-energy protons and high charge (Z) and energy (E) nuclei (HZE). GCR exposure rate increases about three times as spacecraft venture out of Earth orbit into deep space, where the protection of the Earth's magnetosphere and solid body is lost. NASA's radiation standard limits astronaut exposures to a 3% risk of exposure-induced death (REID) at the upper 95% confidence interval (CI) of the risk estimate. Fatal cancer risk has been considered the dominant risk for GCR; however, recent epidemiological analyses of radiation risks for circulatory diseases allow predictions of REID for circulatory diseases to be included with cancer risk predictions for space missions. Using NASA's models of risks and uncertainties, we predicted that central estimates for radiation-induced mortality and morbidity could exceed 5% and 10%, with upper 95% CIs near 10% and 20%, respectively, for a Mars mission. Additional risks to the central nervous system (CNS) and qualitative differences in the biological effects of GCR compared to terrestrial radiation may significantly increase these estimates and will require new knowledge to evaluate.

  16. Are Time- and Event-based Prospective Memory Comparably Affected in HIV Infection?†

    Science.gov (United States)

    Zogg, Jennifer B.; Woods, Steven Paul; Weber, Erica; Doyle, Katie; Grant, Igor; Atkinson, J. Hampton; Ellis, Ronald J.; McCutchan, J. Allen; Marcotte, Thomas D.; Hale, Braden R.; Ellis, Ronald J.; McCutchan, J. Allen; Letendre, Scott; Capparelli, Edmund; Schrier, Rachel; Heaton, Robert K.; Cherner, Mariana; Moore, David J.; Jernigan, Terry; Fennema-Notestine, Christine; Archibald, Sarah L.; Hesselink, John; Annese, Jacopo; Taylor, Michael J.; Masliah, Eliezer; Everall, Ian; Langford, T. Dianne; Richman, Douglas; Smith, David M.; McCutchan, J. Allen; Everall, Ian; Lipton, Stuart; McCutchan, J. Allen; Atkinson, J. Hampton; Ellis, Ronald J.; Letendre, Scott; Atkinson, J. Hampton; von Jaeger, Rodney; Gamst, Anthony C.; Cushman, Clint; Masys, Daniel R.; Abramson, Ian; Ake, Christopher; Vaida, Florin

    2011-01-01

    According to the multi-process theory of prospective memory (ProM), time-based tasks rely more heavily on strategic processes dependent on prefrontal systems than do event-based tasks. Given the prominent frontostriatal pathophysiology of HIV infection, one would expect HIV-infected individuals to demonstrate greater deficits in time-based versus event-based ProM. However, the two prior studies examining this question have produced variable results. We evaluated this hypothesis in 143 individuals with HIV infection and 43 demographically similar seronegative adults (HIV−) who completed the research version of the Memory for Intentions Screening Test, which yields parallel subscales of time- and event-based ProM. Results showed main effects of HIV serostatus and cue type, but no interaction between serostatus and cue. Planned pair-wise comparisons showed a significant effect of HIV on time-based ProM and a trend-level effect on event-based ProM that was driven primarily by the subset of participants with HIV-associated neurocognitive disorders. Nevertheless, time-based ProM was more strongly correlated with measures of executive functions, attention/working memory, and verbal fluency in HIV-infected persons. Although HIV-associated deficits in time- and event-based ProM appear to be of comparable severity, the cognitive architecture of time-based ProM may be more strongly influenced by strategic monitoring and retrieval processes. PMID:21459901

  17. Event-Based Robust Control for Uncertain Nonlinear Systems Using Adaptive Dynamic Programming.

    Science.gov (United States)

    Zhang, Qichao; Zhao, Dongbin; Wang, Ding

    2016-10-18

    In this paper, the robust control problem for a class of continuous-time nonlinear systems with unmatched uncertainties is investigated using an event-based control method. First, the robust control problem is transformed into a corresponding optimal control problem with an augmented control and an appropriate cost function. Under the event-based mechanism, we prove that the solution of the optimal control problem can asymptotically stabilize the uncertain system with an adaptive triggering condition. That is, the designed event-based controller is robust to the original uncertain system. Note that the event-based controller is updated only when the triggering condition is satisfied, which can save communication resources between the plant and the controller. Then, a single-network adaptive dynamic programming structure with an experience replay technique is constructed to approximate the optimal control policies. The stability of the closed-loop system with the event-based control policy and the augmented control policy is analyzed using the Lyapunov approach. Furthermore, we prove that the minimal intersample time is bounded below by a nonzero positive constant, which excludes Zeno behavior during the learning process. Finally, two simulation examples are provided to demonstrate the effectiveness of the proposed control scheme.

  18. Efficiency of event-based sampling according to error energy criterion.

    Science.gov (United States)

    Miskowicz, Marek

    2010-01-01

    The paper belongs to the studies that deal with the effectiveness of the particular event-based sampling scheme compared to the conventional periodic sampling as a reference. In the present study, the event-based sampling according to a constant energy of sampling error is analyzed. This criterion is suitable for applications where the energy of sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends a range of event-based sampling schemes and makes the choice of particular sampling criterion more flexible to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than the periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the signal time-derivative square in the analyzed time interval. Furthermore, it is shown that the sampling according to energy criterion is less effective than the send-on-delta scheme but more effective than the sampling according to integral criterion. On the other hand, it is indicated that higher effectiveness in sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error defined as the sum of errors for all the samples taken.
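
    Restated compactly in LaTeX (notation assumed here, with x(t) the sampled signal and T the analysis interval; the paper's own symbols may differ), the quoted effectiveness gain over periodic sampling is

      \eta = \frac{\max_{t \in T} \sqrt[3]{\dot{x}(t)^{2}}}{\frac{1}{|T|} \int_{T} \sqrt[3]{\dot{x}(t)^{2}} \, dt}

    i.e., the ratio of the maximum to the mean of the cubic root of the squared signal time-derivative over the interval.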

  19. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhury, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  20. An Event-based Fast Movement Detection Algorithm for a Positioning Robot Using POWERLINK Communication

    OpenAIRE

    Barrios-Avilés, Juan; Iakymchuk, Taras; Samaniego, Jorge; Rosado-Muñoz, Alfredo

    2017-01-01

    This work develops a tracking system based on an event-based camera. A bioinspired filtering algorithm that reduces noise and transmitted data while keeping the main features of the scene is implemented in an FPGA, which also serves as a network node. The POWERLINK IEC 61158 industrial network is used to communicate the FPGA with a controller connected to a self-developed two-axis servo-controlled robot. The FPGA includes the network protocol to integrate the event-based camera as any other existing ne...

  1. Mind the gap: modelling event-based and millennial-scale landscape dynamics

    NARCIS (Netherlands)

    Baartman, J.E.M.

    2012-01-01

    This research looks at landscape dynamics – erosion and deposition – from two different perspectives: long-term landscape evolution over millennial timescales on the one hand, and short-term event-based erosion and deposition on the other. For the first, landscape evolution models (LEMs)...

  2. Event-based prospective memory in mildly and severely autistic children

    NARCIS (Netherlands)

    Sheppard, D.P.; Kvavilashvili, L.; Ryder, N.

    2016-01-01

    Background: There is a growing body of research into the development of prospective memory (PM) in typically developing children, but research is limited in autistic children (Aut) and rarely includes children with more severe symptoms. Aims: This study is the first to specifically compare event-based...

  3. Improving the Critic Learning for Event-Based Nonlinear H∞ Control Design.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-01-30

    In this paper, we aim at improving the critic learning criterion to cope with the event-based nonlinear H∞ state feedback control design. First of all, the H∞ control problem is regarded as a two-player zero-sum game and the adaptive critic mechanism is used to achieve the minimax optimization under event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. The initial stabilizing control is no longer required during the implementation process of the new algorithm. Next, the closed-loop system is formulated as an impulsive model and its stability issue is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the present event-based design is also avoided through theoretical analysis on the lower bound of the minimal intersample time. Finally, the applications to an aircraft dynamics and a robot arm plant are carried out to verify the efficient performance of the present novel design method.

  4. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERPs) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  5. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and probability theory...

  6. An Event-Based Methodology to Generate Class Diagrams and its Empirical Evaluation

    Directory of Open Access Journals (Sweden)

    Sandeep K. Singh

    2010-01-01

    Problem statement: Event-based systems have importance in many application domains, ranging from real-time monitoring systems in production, logistics, medical devices, and networking to complex event processing in finance and security. The increasing popularity of event-based systems has opened new challenging issues for them. One such issue is to carry out requirements analysis of event-based systems and build conceptual models. Currently, Object-Oriented Analysis (OOA) using the Unified Modeling Language (UML) is the most popular requirements analysis approach, for which several OOA tools and techniques have been proposed. But none of the techniques and tools, to the best of our knowledge, have focused on event-based requirements analysis; rather, all are behavior-based approaches. Approach: This study describes a requirements analysis approach specifically for event-based systems. The proposed approach starts from events occurring in the system and derives an importable class diagram specification in XML Metadata Interchange (XMI) format for the ArgoUML tool. Requirements of the problem domain are captured as events in restricted natural language using the proposed Event Templates in order to reduce ambiguity. Results: Rules were designed to extract a domain model specification (analysis-level class diagram) from Event Templates. A prototype tool, 'EV-ClassGEN', was also developed to provide automation support to extract events from requirements, document the extracted events in Event Templates, and implement rules to derive a specification for an analysis-level class diagram. The proposed approach was also validated through a controlled experiment by applying it to many cases from different application domains, like real-time systems, business applications, and gaming. Conclusion: Results of the controlled experiment show that, after studying and applying the event-based approach, students' perception of the ease of use and usefulness of the OOA technique has...

  7. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence eBaer

    2013-05-01

    Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  8. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM) or at a specific time (i.e., time-based PM) while performing an ongoing activity. Strategic monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of maintaining the intention active in memory; and target checking, engaged for verifying the presence of the PM cue in the environment. The present study aims to provide the first evidence of event-related potentials (ERPs) associated with time-based PM, and to examine differences and commonalities in the ERPs related to strategic monitoring mechanisms between event-based and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, which might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM: the "readiness mode".

  9. A simulation based approach to quantify the difference between event-based and routine water quality monitoring schemes

    Directory of Open Access Journals (Sweden)

    J.S. Lessels

    2015-09-01

    New hydrological insights for the region: The inclusion of event-based sampling improved annual load estimates of all sites with a maximum RMSE difference of 16.11 tonnes between event-based and routine sampling. Based on the accuracy of annual loads, event-based sampling was found to be more important in catchments with a large relief and high annual rainfall in this region. Using this approach, different sampling schemes can be compared based on limited historical data.

  10. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked over classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a unique or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution has a higher efficiency in communication resource usage than the classical discrete-time strategy with the same accuracy.

  11. Event-Based Control Strategy for Mobile Robots in Wireless Environments

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-01-01

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked over classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a unique or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution has a higher efficiency in communication resource usage than the classical discrete-time strategy with the same accuracy. PMID:26633412

  12. Use of unstructured event-based reports for global infectious disease surveillance.

    Science.gov (United States)

    Keller, Mikaela; Blench, Michael; Tolentino, Herman; Freifeld, Clark C; Mandl, Kenneth D; Mawudeku, Abla; Eysenbach, Gunther; Brownstein, John S

    2009-05-01

    Free or low-cost sources of unstructured information, such as Internet news and online discussion sites, provide detailed local and near real-time data on disease outbreaks, even in countries that lack traditional public health surveillance. To improve public health surveillance and, ultimately, interventions, we examined 3 primary systems that process event-based outbreak information: Global Public Health Intelligence Network, HealthMap, and EpiSPIDER. Despite similarities among them, these systems are highly complementary because they monitor different data types, rely on varying levels of automation and human analysis, and distribute distinct information. Future development should focus on linking these systems more closely to public health practitioners in the field and establishing collaborative networks for alert verification and dissemination. Such development would further establish event-based monitoring as an invaluable public health resource that provides critical context and an alternative to traditional indicator-based outbreak reporting.

  13. Event-Based Control Strategy for Mobile Robots in Wireless Environments

    Directory of Open Access Journals (Sweden)

    Rafael Socas

    2015-12-01

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked over classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a unique or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution has a higher efficiency in communication resource usage than the classical discrete-time strategy with the same accuracy.

  14. ARTUS - A Framework for Event-based Data Analysis in High Energy Physics

    CERN Document Server

    Berger, Joram; Friese, Raphael; Haitz, Dominik; Hauth, Thomas; Müller, Thomas; Quast, Günter; Sieber, Georg

    2015-01-01

    ARTUS is an event-based data-processing framework for high energy physics experiments. It is designed for large-scale data analysis in a collaborative environment. The architecture design choices take into account typical challenges and are based on experiences with similar applications. The structure of the framework and its advantages are described. An example use case and performance measurements are presented. The framework is well-tested and successfully used by several analysis groups.

  15. An event-based approach for comparing the performance of methods for prospective medical product monitoring

    Science.gov (United States)

    Gagne, Joshua J.; Walker, Alexander M.; Glynn, Robert J.; Rassen, Jeremy A.; Schneeweiss, Sebastian

    2012-01-01

    Prospective medical product monitoring is intended to alert stakeholders about whether and when safety problems are identifiable in a continuous stream of longitudinal electronic healthcare data. In comparing the performance of methods to generate these alerts, three factors must be considered: (1) accuracy in alerting; (2) timeliness of alerting; and (3) the trade-offs between the costs of false negative and false positive alerting. Using illustrative examples, we show that traditional scenario-based measures of accuracy, such as sensitivity and specificity, which classify only at the end of monitoring, fail to appreciate timeliness of alerting. We propose an event-based approach that classifies exposed outcomes according to whether or not a prior alert was generated. We provide event-based extensions to existing metrics and discuss why these metrics are limited in this setting because of inherent tradeoffs that they impose between the relative consequences of false positives versus false negatives. We provide an expression that summarizes event-based sensitivity (the proportion of exposed events that occur after alerting among all exposed events in scenarios with true safety issues) and event-based specificity (the proportion of exposed events that occur in the absence of alerting among all exposed events in scenarios with no true safety issues) by taking an average weighted by the relative costs of false positive and false negative alerting. This approach explicitly accounts for accuracy in alerting, timeliness in alerting, and the trade-offs between the costs of false negative and false positive alerting. Subsequent work will involve applying the metric to simulated data. PMID:22223544
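
    One way to write such a cost-weighted summary (an illustration consistent with the abstract's description, not necessarily the paper's exact expression; Se and Sp denote event-based sensitivity and specificity, and c_{FN}, c_{FP} the assumed relative costs of false negative and false positive alerting):

      S = \frac{c_{FN}\, Se + c_{FP}\, Sp}{c_{FN} + c_{FP}}

    Setting c_{FN} = c_{FP} recovers the unweighted mean, while raising c_{FN} shifts weight toward timely, sensitive alerting.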

  16. An event-based neurobiological recognition system with orientation detector for objects in multiple orientations

    Directory of Open Access Journals (Sweden)

    Hanyu Wang

    2016-11-01

    A new multiple-orientation event-based neurobiological recognition system, integrating recognition and tracking functions, is proposed in this paper for use with asynchronous address-event representation (AER) image sensors. The system is able to recognize objects in multiple orientations with training samples moving in a single orientation only. The system extracts multi-scale and multi-orientation line features inspired by models of the primate visual cortex. An orientation detector based on a modified Gaussian blob tracking algorithm is introduced for object tracking and orientation detection. The orientation detector and the feature extraction block work simultaneously, without any increase in categorization time. An address lookup table (address LUT) is also presented to adjust the feature maps by address mapping and reordering, and the maps are categorized in the trained spiking neural network. The recognition system is evaluated on the MNIST dataset, which has played an important role in the development of computer vision, and the accuracy increases owing to the use of both ON and OFF events. AER data acquired by a DVS, such as moving digits, poker cards, and vehicles, are also tested on the system. The experimental results show that the proposed system can realize event-based multi-orientation recognition. The work presented in this paper makes a number of contributions to event-based vision processing systems for multi-orientation object recognition. It develops a new tracking-recognition architecture for the feedforward categorization system and an address-reordering approach to classify multi-orientation objects using event-based data. It provides a new way to recognize objects in multiple orientations with training samples in a single orientation only.

  17. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Rapid progress in intelligent sensing technology has created new interest in the analysis and design of non-conventional sampling schemes. An investigation of event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined, and related work in adaptive sampling is summarized. Analytical closed-form formulas for the evaluation of the mean rate of event-based traffic and the asymptotic integral sampling effectiveness are derived, and simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness is exemplified for common signals which model the state evolution of dynamic systems in time.

  18. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Young Soo Suh

    2009-04-01

    Full Text Available This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD, sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than the time-triggered one in some situations, especially in network bandwidth improvement. However, it cannot detect packet dropout situations because data transmission and reception do not use a periodical time-stamp mechanism as found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme called modified SOD in which sensor data are sent when either the change of sensor output exceeds a given threshold or the time elapses more than a given interval. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen.

  19. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    Science.gov (United States)

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

    The involvement or noninvolvement of a clock-like neural process, an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a. doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist's flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of the rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence in a different way the timing mechanisms for repetitive IWFEs. Sets of IWFEs were analyzed by the windowed (lag one) autocorrelation wγ(1), a statistical method recently introduced for the distinction between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, not counteracting the natural shift of wγ(1) toward positive values as frequency of movements increase.

  20. Analysis on Fatigue Failure of GCr15 Steel Bearing by Laser Hardening

    Institute of Scientific and Technical Information of China (English)

    雷声; 黄曼平; 薛正堂; 吴跃波

    2013-01-01

    The laser surface hardening of GCr15 steel bearing raceways was tested using a CO2 continuous-wave laser; hardened-layer depths of 0.50 mm on the inner rings and 0.45 mm on the outer rings were achieved. The phase composition and modification mechanism of the hardened layers were investigated by optical microscopy, scanning electron microscopy (SEM), and X-ray diffraction. The results show that laser transformation hardening of the bearing surface produces a cryptocrystalline martensite microstructure with more retained austenite and finer carbides, thereby increasing the hardness of the bearing raceway surface. Contact fatigue tests of the laser-hardened bearing steel were then carried out. Microscopic examination of the failed bearing surfaces verified that the fatigue failure mode of the laser-quenched ring surfaces is still surface spalling. The main causes of early fatigue failure of the laser-quenched rings are incomplete overlap between the start and end of the laser scan and non-uniform hardened-layer depth.

  1. A coupled model of TiN inclusion growth in GCr15SiMn during solidification in the electroslag remelting process

    Institute of Scientific and Technical Information of China (English)

    Liang Yang; Guo-guang Cheng; Shi-jian Li; Min Zhao; Gui-ping Feng; Tao Li

    2015-01-01

    TiN inclusions observed in an ingot produced by electroslag remelting (ESR) are extremely harmful to GCr15SiMn steel. Therefore, accurate predictions of the growth size of these inclusions during steel solidification are significant for clean ESR ingot production. On the basis of our previous work, a coupled model of solute microsegregation and TiN inclusion growth during solidification has been established. The results demonstrate that, compared to a non-coupled model, the coupled model's predictions of the size of TiN inclusions are in good agreement with experimental results obtained using scanning electron microscopy with energy-dispersive spectroscopy (SEM-EDS). Because of the high cooling rate, the sizes of TiN inclusions in the edge area of the ingots are relatively small compared to those in the center area. During the ESR process, controlling the Ti content of the steel is a feasible and effective method of decreasing the sizes of TiN inclusions.
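
    For context, a classical microsegregation relation of the kind such coupled models build on is the Scheil equation (the paper's own formulation may add back-diffusion and growth kinetics; symbols here are the standard ones):

      C_L = C_0 (1 - f_s)^{k-1}

    where C_L is the solute concentration (e.g., Ti or N) in the residual liquid, C_0 the nominal content, f_s the solid fraction, and k the equilibrium partition coefficient. TiN can precipitate once the product [Ti][N] in the enriched residual liquid exceeds the temperature-dependent solubility product, which is why limiting the Ti content delays precipitation and reduces the final inclusion size.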

  2. Event-based prospective memory deficits in individuals with high depressive symptomatology: problems controlling attentional resources?

    Science.gov (United States)

    Li, Yanqi Ryan; Loft, Shayne; Weinborn, Michael; Maybery, Murray T

    2014-01-01

    Depression has been found to be related to neurocognitive deficits in areas important to successful prospective memory (PM) performance, including executive function, attention, and retrospective memory. However, research specific to depression and PM has produced a mixed pattern of results. The current study further examined the task conditions in which event-based PM deficits may emerge in individuals with high depressive symptomatology (HDS) relative to individuals with low depressive symptomatology (LDS) and the capacity of HDS individuals to allocate attentional resources to event-based PM tasks. Sixty-four participants (32 HDS, 32 LDS) were required to make a PM response when target words were presented during an ongoing lexical decision task. When the importance of the ongoing task was emphasized, response time costs to the ongoing task, and PM accuracy, did not differ between the HDS and LDS groups. This finding is consistent with previous research demonstrating that event-based PM task accuracy is not always impaired by depression, even when the PM task is resource demanding. When the importance of the PM task was emphasized, costs to the ongoing task further increased for both groups, indicating an increased allocation of attentional resources to the PM task. Crucially, while a corresponding improvement in PM accuracy was observed in the LDS group when the importance of the PM task was emphasized, this was not true for the HDS group. The lack of improved PM accuracy in the HDS group compared with the LDS group despite evidence of increased cognitive resources allocated to PM tasks may have been due to inefficiency in the application of the allocated attention, a dimension likely related to executive function difficulties in depression. Qualitatively different resource allocation patterns may underlie PM monitoring in HDS versus LDS individuals.

  3. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  4. Fast vision through frameless event-based sensing and convolutional processing: application to texture recognition.

    Science.gov (United States)

    Perez-Carrasco, Jose Antonio; Acha, Begona; Serrano, Carmen; Camunas-Mesa, Luis; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabe

    2010-04-01

    Address-event representation (AER) is an emergent hardware technology which shows high potential for providing, in the near future, a solid technological substrate for emulating brain-like processing structures. When used for vision, AER sensors and processors are not restricted to capturing and processing still image frames, as in commercial frame-based video technology, but sense and process visual information in a pixel-level, event-based, frameless manner. As a result, vision processing is practically simultaneous with vision sensing, since there is no need to wait for full frames to be sensed. Also, only meaningful information is sensed, communicated, and processed. Of special interest for brain-like vision processing are some already reported AER convolutional chips, which have revealed a very high computational throughput as well as the possibility of assembling large convolutional neural networks in a modular fashion. It is expected that in the near future we may witness the appearance of large-scale convolutional neural networks with hundreds or thousands of individual modules. In the meantime, some research is needed to investigate how to assemble and configure such large-scale convolutional networks for specific applications. In this paper, we analyze AER spiking convolutional neural networks for texture recognition hardware applications. Based on the performance figures of already available individual AER convolution chips, we emulate large-scale networks using a custom-made event-based behavioral simulator. We have developed a new event-based processing architecture that emulates with AER hardware Manjunath's frame-based feature recognition software algorithm, and have analyzed its performance using our behavioral simulator. Recognition rate performance is not degraded. However, regarding speed, we show that recognition can be achieved before an equivalent frame is fully sensed and transmitted.
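
    The frameless operating principle described above can be sketched in a few lines: each incoming address-event integrates a convolution kernel around its pixel, and any accumulator crossing threshold immediately emits an output address-event. This is a schematic illustration only (real AER convolution chips also implement leak and other dynamics); all names and values are assumptions.

        import numpy as np

        def process_event(state, kernel, x, y, threshold=1.0):
            """Integrate one input address-event at (x, y); return output events."""
            kh, kw = kernel.shape
            H, W = state.shape
            out_events = []
            for dy in range(kh):
                for dx in range(kw):
                    i, j = y + dy - kh // 2, x + dx - kw // 2
                    if 0 <= i < H and 0 <= j < W:
                        state[i, j] += kernel[dy, dx]  # integrate the kernel tap
                        if state[i, j] >= threshold:
                            out_events.append((j, i))  # emit output address-event
                            state[i, j] = 0.0          # reset the fired accumulator
            return out_events

        state = np.zeros((64, 64))
        kernel = np.full((3, 3), 0.4)          # toy kernel
        outputs = []
        for _ in range(3):                     # three input events at one address
            outputs += process_event(state, kernel, x=10, y=12)
        print(len(outputs))                    # covered pixels fire on the third event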

  5. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size in order to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting the performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
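
    The efficiency argument can be made concrete with a toy Monte Carlo under the constant-event-time assumption: vector efficiency grows with the ratio of bank size to vector width because partially filled vector operations become rarer as the bank drains. The sketch below is illustrative only, not the authors' model or OpenMC data; the survival probability and sizes are invented.

        import math, random

        def vector_efficiency(bank_size, vector_width, survival=0.9, seed=0):
            """Fraction of the ideal W-fold speedup achieved by event batching."""
            rng = random.Random(seed)
            # Each particle history undergoes a geometric number of events.
            remaining = [1 + int(math.log(1.0 - rng.random()) / math.log(survival))
                         for _ in range(bank_size)]
            total_events = sum(remaining)       # scalar cost: one op per event
            vector_ops = 0
            while remaining:
                # One event-iteration: all live particles, one lane per particle.
                vector_ops += math.ceil(len(remaining) / vector_width)
                remaining = [r - 1 for r in remaining if r > 1]
            return (total_events / vector_ops) / vector_width

        for bank in (8, 64, 160, 640):
            print(bank, round(vector_efficiency(bank, vector_width=8), 2))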

  6. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  7. Infectious diseases prioritisation for event-based surveillance at the European Union level for the 2012 Olympic and Paralympic Games.

    Science.gov (United States)

    Economopoulou, A; Kinross, P; Domanovic, D; Coulombier, D

    2014-04-17

    In 2012, London hosted the Olympic and Paralympic Games (the Games), with events occurring throughout the United Kingdom (UK) between 27 July and 9 September 2012. Public health surveillance was performed by the Health Protection Agency (HPA). Collaboration between the HPA and the European Centre for Disease Prevention and Control (ECDC) was established for the detection and assessment of significant infectious disease events (SIDEs) occurring outside the UK during the time of the Games. Additionally, ECDC undertook an internal prioritisation exercise to facilitate ECDC’s decisions on which SIDEs should have preferentially enhanced monitoring through epidemic intelligence activities for detection and reporting in daily surveillance in the European Union (EU). A team of ECDC experts evaluated potential public health risks to the Games, selecting and prioritising SIDEs for event-based surveillance with regard to their potential for importation to the Games, occurrence during the Games or export to the EU/European Economic Area from the Games. The team opted for a multilevel approach including comprehensive disease selection, development and use of a qualitative matrix scoring system and a Delphi method for disease prioritisation. The experts selected 71 infectious diseases to enter the prioritisation exercise of which 27 were considered as priority for epidemic intelligence activities by ECDC for the EU for the Games.

  8. Exploiting different active silicon detectors in the International Space Station: ALTEA and DOSTEL galactic cosmic radiation (GCR) measurements

    Science.gov (United States)

    Narici, Livo; Berger, Thomas; Burmeister, Sönke; Di Fino, Luca; Rizzo, Alessandro; Matthiä, Daniel; Reitz, Günther

    2017-08-01

    Human exploration of the solar system requires successfully dealing with the radiation exposure issue. The scientific aspect of this issue is twofold: knowing the radiation environment the astronauts are going to face, and linking radiation exposure to health risks. Here we focus on the first issue. It is generally agreed that the final tool to describe the radiation environment in a space habitat will be a model featuring the amount of detail needed to perform a meaningful risk assessment. The model should also take into account shield changes due to the movement of materials inside the habitat, which in turn produce changes in the radiation environment. This model will have to undergo a final validation with a radiation field of similar complexity. The International Space Station (ISS) is a space habitat whose internal radiation environment is similar to what will be found in habitats in deep space, if we use measurements acquired only during high-latitude passages (where the effects of the Earth's magnetic field are reduced). Active detectors, providing time information that makes it easy to select data from different orbital sections, are the ones best fulfilling the requirements for these kinds of measurements. The exploitation of the radiation measurements performed in the ISS by all the available instruments is therefore mandatory to provide the largest possible database to the scientific community, to be merged with detailed Computer Aided Design (CAD) models, in the quest for a full model validation. While some efforts in comparing results from multiple active detectors have been attempted, a thorough study of a procedure to merge data into a single data matrix, in order to provide the best validation set for radiation environment models, has never been attempted. The aim of this paper is to provide such a procedure and to apply it to two of the most performing active detector systems in the ISS: the Anomalous Long Term Effects in Astronauts (ALTEA) detector and the DOSimetry TELescope (DOSTEL).

  9. An efficient hybrid causative event-based approach for deriving the annual flood frequency distribution

    Science.gov (United States)

    Thyer, Mark; Li, Jing; Lambert, Martin; Kuczera, George; Metcalfe, Andrew

    2015-04-01

    Flood extremes are driven by highly variable and complex climatic and hydrological processes. Derived flood frequency methods are often used to predict the flood frequency distribution (FFD) because they can provide predictions in ungauged catchments and evaluate the impact of land-use or climate change. This study presents recent work on the development of a new derived flood frequency method called the hybrid causative events (HCE) approach. The advantage of the HCE approach is that it combines the accuracy of the continuous simulation approach with the computational efficiency of the event-based approaches. Derived flood frequency methods can be divided into two classes. Event-based approaches provide fast estimation, but can also lead to prediction bias due to limitations of the inherent assumptions required for obtaining input information (rainfall and catchment wetness) for events that cause large floods. Continuous simulation produces more accurate predictions, however, at the cost of massive computational time. The HCE method uses a short continuous simulation to provide inputs for a rainfall-runoff model running in an event-based fashion. A proof-of-concept pilot study showed that the HCE produces estimates of the flood frequency distribution with accuracy similar to continuous simulation, but with dramatically reduced computation time. Recent work incorporated seasonality into the HCE approach and evaluated it with a more realistic set of eight sites from a wide range of climate zones, typical of Australia, using a virtual catchment approach. The seasonal hybrid-CE provided accurate predictions of the FFD for all sites. Comparison with the existing non-seasonal hybrid-CE showed that for some sites the non-seasonal hybrid-CE significantly over-predicted the FFD. Analysis of whether a site had a high, low or no need for seasonality found that the underlying cause was a combination of reasons that were difficult to predict a priori. Hence it is recommended

  10. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes representing the events that can happen and arrows representing four relations between events: condition, response, include, and exclude. Distributed DCR Graphs are then obtained by assigning roles to events and principals. We give a graphical notation inspired by related work by van der Aalst et al. We exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally, we provide a mapping from DCR Graphs to Büchi automata.
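
    A minimal executable reading of the DCR Graph semantics summarized above: a marking of executed, pending, and included events, with the four relations driving enabledness and updates. This sketch reflects one reading of the abstract, not the authors' full formalism (roles, principals, and the Büchi acceptance condition are omitted).

        class DCRGraph:
            def __init__(self, events, condition=(), response=(), include=(), exclude=()):
                self.events = set(events)
                self.condition = set(condition)  # (a, b): a must have run before b
                self.response = set(response)    # (a, b): running a makes b pending
                self.include = set(include)      # (a, b): running a includes b
                self.exclude = set(exclude)      # (a, b): running a excludes b
                self.executed, self.pending = set(), set()
                self.included = set(events)

            def enabled(self, e):
                if e not in self.included:
                    return False
                return all(c in self.executed or c not in self.included
                           for (c, t) in self.condition if t == e)

            def execute(self, e):
                assert self.enabled(e), f"{e} is not enabled"
                self.executed.add(e)
                self.pending.discard(e)
                self.pending |= {t for (s, t) in self.response if s == e}
                self.included -= {t for (s, t) in self.exclude if s == e}
                self.included |= {t for (s, t) in self.include if s == e}

        # Tiny workflow: 'review' conditions 'approve'; approving demands archiving.
        g = DCRGraph({"review", "approve", "archive"},
                     condition={("review", "approve")},
                     response={("approve", "archive")})
        g.execute("review"); g.execute("approve")
        print(g.pending)  # {'archive'}: a pending response still to be fulfilled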

  11. Pinning cluster synchronization in an array of coupled neural networks under event-based mechanism.

    Science.gov (United States)

    Li, Lulu; Ho, Daniel W C; Cao, Jinde; Lu, Jianquan

    2016-04-01

    Cluster synchronization is a typical collective behavior in coupled dynamical systems, where the synchronization occurs within one group, while there is no synchronization among different groups. In this paper, under event-based mechanism, pinning cluster synchronization in an array of coupled neural networks is studied. A new event-triggered sampled-data transmission strategy, where only local and event-triggering states are utilized to update the broadcasting state of each agent, is proposed to realize cluster synchronization of the coupled neural networks. Furthermore, a self-triggered pinning cluster synchronization algorithm is proposed, and a set of iterative procedures is given to compute the event-triggered time instants. Hence, this will reduce the computational load significantly. Finally, an example is given to demonstrate the effectiveness of the theoretical results.

  12. Experimental Test of an Event-Based Corpuscular Model Modification as an Alternative to Quantum Mechanics

    Science.gov (United States)

    Brida, Giorgio; Degiovanni, Ivo Pietro; Genovese, Marco; Migdall, Alan; Piacentini, Fabrizio; Polyakov, Sergey V.; Traina, Paolo

    2013-03-01

    We present the first experimental test that distinguishes between an event-based corpuscular model (EBCM) [H. De Raedt et al.: J. Comput. Theor. Nanosci. 8 (2011) 1052] of the interaction of photons with matter and quantum mechanics. The test looks at the interference that results as a single photon passes through a Mach-Zehnder interferometer [H. De Raedt et al.: J. Phys. Soc. Jpn. 74 (2005) 16]. The experimental results, obtained with a low-noise single-photon source [G. Brida et al.: Opt. Express 19 (2011) 1484], agree with the predictions of standard quantum mechanics with a reduced χ2 of 0.98 and falsify the EBCM with a reduced χ2 of greater than 20.
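
    For reference, the figure of merit quoted above is the reduced chi-square, i.e., the chi-square statistic divided by the degrees of freedom, with values near 1 indicating agreement within the stated uncertainties. A generic sketch (the data values are invented, not the experiment's):

        import numpy as np

        def reduced_chi2(observed, predicted, sigma, n_params):
            """Chi-square per degree of freedom for a model with n_params parameters."""
            resid = (np.asarray(observed) - np.asarray(predicted)) / np.asarray(sigma)
            return float(np.sum(resid ** 2) / (len(resid) - n_params))

        # Hypothetical interference-fringe counts vs. model predictions.
        obs = [105.0, 88.0, 52.0, 21.0, 49.0, 90.0]
        pred = [100.0, 85.0, 50.0, 25.0, 50.0, 85.0]
        sigma = [10.0, 9.0, 7.0, 5.0, 7.0, 9.0]
        print(round(reduced_chi2(obs, pred, sigma, n_params=2), 2))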

  13. Experimental Test of an Event-Based Corpuscular Model Modification as an Alternative to Quantum Mechanics

    CERN Document Server

    Brida, Giorgio; Genovese, Marco; Migdall, Alan; Piacentini, Fabrizio; Polyakov, Sergey V; Traina, Paolo

    2013-01-01

    We present the first experimental test that distinguishes between an event-based corpuscular model (EBCM) [H. De Raedt et al.: J. Comput. Theor. Nanosci. 8 (2011) 1052] of the interaction of photons with matter and quantum mechanics. The test looks at the interference that results as a single photon passes through a Mach-Zehnder interferometer [H. De Raedt et al.: J. Phys. Soc. Jpn. 74 (2005) 16]. The experimental results, obtained with a low-noise single-photon source [G. Brida et al.: Opt. Express 19 (2011) 1484], agree with the predictions of standard quantum mechanics with a reduced χ2 of 0.98 and falsify the EBCM with a reduced χ2 of greater than 20.

  14. Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition

    Science.gov (United States)

    Daigle, Matthew; Roychoudhury, Indranil

    2010-01-01

    We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition, in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on provided example scenarios, and discuss the issues faced and lessons learned in implementing the approach.
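
    The qualitative event-based isolation step can be pictured as matching the observed sequence of measurement deviations against per-fault signatures and pruning inconsistent candidates. The signatures below are invented for illustration and are not taken from the competition's electrical power testbed.

        # Each fault maps to an expected sequence of qualitative deviations (+/-).
        FAULT_SIGNATURES = {
            "battery_degraded": [("voltage", "-"), ("current", "-")],
            "relay_stuck_open": [("current", "-"), ("load_power", "-")],
            "sensor_bias": [("voltage", "+")],
        }

        def isolate(observed):
            """Keep faults whose signature begins with the observed deviations."""
            return [fault for fault, sig in FAULT_SIGNATURES.items()
                    if sig[:len(observed)] == list(observed)]

        print(isolate([("voltage", "-")]))                       # ['battery_degraded']
        print(isolate([("current", "-"), ("load_power", "-")]))  # ['relay_stuck_open']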

  15. Stabilization of Networked Distributed Systems with Partial and Event-Based Couplings

    Directory of Open Access Journals (Sweden)

    Sufang Zhang

    2015-01-01

    The stabilization problem of networked distributed systems with partial and event-based couplings is investigated. The channels, which are used to transmit different levels of information about agents, are considered. The channel matrix is introduced to indicate the working state of the channels. An event condition is designed for each channel to govern the sampling instants of the channel. Since the event conditions are given separately for different channels, the sampling instants of the channels are mutually independent. To stabilize the system, state feedback controllers are implemented in the system. The control signals also suffer from the two communication constraints. Sufficient conditions in terms of linear matrix inequalities are proposed to ensure the stabilization of the controlled system. Finally, a numerical example is given to demonstrate the advantage of our results.

  16. Event based self-supervised temporal integration for multimodal sensor data.

    Science.gov (United States)

    Barakova, Emilia I; Lourens, Tino

    2005-06-01

    A method for synergistic integration of multimodal sensor data is proposed in this paper. This method is based on two aspects of the integration process: (1) achieving synergistic integration of two or more sensory modalities, and (2) fusing the various information streams at particular moments during processing. Inspired by psychophysical experiments, we propose a self-supervised learning method for achieving synergy with combined representations. Evidence from temporal registration and binding experiments indicates that different cues are processed individually at specific time intervals. Therefore, an event-based temporal co-occurrence principle is proposed for the integration process. This integration method was applied to a mobile robot exploring unfamiliar environments. Simulations showed that integration enhanced route recognition with many perceptual similarities; moreover, they indicate that a perceptual hierarchy of knowledge about instant movement contributes significantly to short-term navigation, but that visual perceptions have bigger impact over longer intervals.

  17. Robust spike-train learning in spike-event based weight update.

    Science.gov (United States)

    Shrestha, Sumit Bam; Song, Qing

    2017-09-12

    Supervised learning algorithms in a spiking neural network either learn a spike-train pattern for a single neuron receiving input spike-trains from multiple input synapses or learn to output the first spike time in a feedforward network setting. In this paper, we build upon a spike-event based weight update strategy to learn continuous spike-trains in a spiking neural network with a hidden layer, using a dead-zone on-off based adaptive learning rate rule which ensures convergence of the learning process in the sense of weight convergence and robustness of the learning process to external disturbances. Based on different benchmark problems, we compare this new method with other relevant spike-train learning algorithms. The results show that the speed of learning is much improved and the rate of successful learning is also greatly improved.
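
    As a rough illustration of a dead-zone, on-off learning-rate rule of the kind mentioned above: the weight update is simply switched off while the timing error sits inside a dead zone, a standard device for guaranteeing weight convergence and robustness to disturbances. This generic sketch is not the authors' algorithm; every name and constant is an assumption.

        def dead_zone_update(weights, grads, error, dead_zone=0.05, lr_on=0.01):
            """Apply the update only when the error is outside the dead zone."""
            lr = lr_on if abs(error) > dead_zone else 0.0   # on-off learning rate
            return [w - lr * g for w, g in zip(weights, grads)]

        print(dead_zone_update([0.2, -0.1], grads=[1.5, -0.4], error=0.12))  # updates
        print(dead_zone_update([0.2, -0.1], grads=[1.5, -0.4], error=0.02))  # frozen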

  18. Behavior of Non-Metallic Inclusions in GCr15 Bearing Steel Made by the Scrap + Hot Metal-50 t EAF-LF-VD Flow Sheet

    Institute of Scientific and Technical Information of China (English)

    杨密平; 吴兵; 林腾昌; 安杰; 马传庆

    2014-01-01

    With a 50 t EAF charged with 30-40 t of hot metal and 12-16 t of high-quality scrap, slag-free EBT tapping, pre-deoxidation with 150-200 kg of steel-cored aluminum, diffusion deoxidation with SiC in the LF, refining slag basicity controlled at 4.0-5.9, soft argon blowing before and after VD, and protected pouring and electromagnetic stirring during continuous casting, the oxygen content of rolled GCr15 bearing steel reached 8×10-6 to 9×10-6. Analysis shows that from before LF refining to after VD the inclusions in the steel are generally ≤10 μm in size, with a maximum of 40 μm and most inclusions in the 3-6 μm range. Before LF the main inclusions are Al2O3, magnesium aluminate spinel, sulfides, Cr2O3 and TiO2; before and after VD they are magnesium aluminate spinel, CaS and MgO.

  19. Influence of Interlayer Thickness on the Strength of HIP Diffusion Bonding Joints Between P/M TC4 Alloy and GCr15 Bearing Steel

    Institute of Scientific and Technical Information of China (English)

    郎泽保; 吕宏军; 王亮

    2009-01-01

    Using electroplated nickel layers of different thicknesses as the interlayer, titanium-steel diffusion-bonded joints were prepared from TC4 pre-alloyed powder and GCr15 bearing steel by hot isostatic pressing at 900 °C for 4 h under a pressure of 150 MPa. The joints were examined by optical microscopy, scanning electron microscopy, XRD and tensile testing. The results show that without an interlayer the joint strength reached 564 MPa; with an interlayer, the highest joint strength, 502 MPa, was obtained at an interlayer thickness of 150 μm. An interlayer that is either too thick or too thin lowers the joint strength.

  20. Constructing An Event Based Aerosol Product Under High Aerosol Loading Conditions

    Science.gov (United States)

    Levy, R. C.; Shi, Y.; Mattoo, S.; Remer, L. A.; Zhang, J.

    2016-12-01

    High aerosol loading events, such as Indonesia's forest fires in fall 2015 or the persistent wintertime haze near Beijing, attract tremendous interest due to their large impact on regional visibility and air quality. Understanding the optical properties of these events, and further being able to simulate and predict them, is beneficial. However, it is a great challenge to consistently identify and then retrieve aerosol optical depth (AOD) from passive sensors during heavy aerosol events. Some reasons include: 1) large differences between the optical properties of high-loading aerosols and those under normal conditions; 2) spectral signals of optically thick aerosols can be mistaken for surface signals, depending on aerosol type; and 3) extremely optically thick aerosol plumes can be misidentified as clouds due to their high optical thickness. Thus, even under clear-sky conditions, the global distribution of extreme aerosol events is not well captured in datasets such as the MODIS Dark-Target (DT) aerosol product. In this study, with the combined use of the OMI Aerosol Index, the MODIS cloud product, and the operational DT product, heavy smoke events over the seven sea region are identified and retrieved over the dry season. An event-based aerosol product that would complement the standard "global" aerosol retrieval will be created and evaluated. The impact of missing high-AOD retrievals on the regional aerosol climatology will be studied using this newly developed research product.

  1. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, only individual sensors are usually located at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, and it is desirable, from security and economic points of view, to reduce the number of commutations of the control signals. Therefore, in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where variables with low-frequency dynamics have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs related to wear by minimizing actuator commutations and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results.
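
    The event-based control idea sketched above amounts to a send-on-delta rule: the controller only issues an actuator command when the measured variable leaves a dead band around the setpoint, so actuator commutations track disturbances rather than a fixed sampling clock. A minimal sketch, with all names and numbers invented:

        def event_based_thermostat(readings, setpoint=22.0, delta=1.5):
            """Return (sample index, command) pairs actually transmitted."""
            commands = []
            last_cmd = None
            for k, temp in enumerate(readings):
                error = temp - setpoint
                if abs(error) <= delta:
                    cmd = "idle"
                elif error > 0:
                    cmd = "ventilate"
                else:
                    cmd = "heat"
                if cmd != last_cmd:      # event: the state crosses the dead band
                    commands.append((k, cmd))
                    last_cmd = cmd
            return commands

        print(event_based_thermostat([21.8, 22.4, 24.1, 24.3, 22.0, 19.9, 20.2]))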

  2. Hydrologic Modeling in the Kenai River Watershed using Event Based Calibration

    Science.gov (United States)

    Wells, B.; Toniolo, H. A.; Stuefer, S. L.

    2015-12-01

    Understanding hydrologic changes is key for preparing for possible future scenarios. On the Kenai Peninsula in Alaska the yearly salmon runs provide a valuable stimulus to the economy; they are the focus of a large commercial fishing fleet and also a prime tourist attraction. Modeling of anadromous waters provides a tool that assists in the prediction of future salmon run size. Beaver Creek, in Kenai, Alaska, is a lowland stream that has been modeled using the Army Corps of Engineers event-based modeling package HEC-HMS. With the use of historic precipitation and discharge data, the model was calibrated to observed discharge values. The hydrologic parameters were measured in the field or calculated, while soil parameters were estimated and adjusted during the calibration. With the calibrated parameters for HEC-HMS, discharge estimates can be used by other researchers studying the area and can help guide communities and officials to make better-educated decisions regarding the changing hydrology of the area and the economic drivers tied to it.

  3. Assessing the Continuum of Event-Based Biosurveillance Through an Operational Lens

    Energy Technology Data Exchange (ETDEWEB)

    Corley, Courtney D.; Lancaster, Mary J.; Brigantic, Robert T.; Chung, James S.; Walters, Ronald A.; Arthur, Ray; Bruckner-Lea, Cindy J.; Calapristi, Augustin J.; Dowling, Glenn; Hartley, David M.; Kennedy, Shaun; Kircher, Amy; Klucking, Sara; Lee, Eva K.; McKenzie, Taylor K.; Nelson, Noele P.; Olsen, Jennifer; Pancerella, Carmen M.; Quitugua, Teresa N.; Reed, Jeremy T.; Thomas, Carla S.

    2012-03-28

    This research follows the Updated Guidelines for Evaluating Public Health Surveillance Systems, Recommendations from the Guidelines Working Group, published by the Centers for Disease Control and Prevention nearly a decade ago. Since then, models have been developed and complex systems have evolved with a breadth of disparate data to detect or forecast chemical, biological, and radiological events that have significant impact in the One Health landscape. How the attributes identified in 2001 relate to the new range of event-based biosurveillance (EBB) technologies is unclear. This manuscript frames the continuum of EBB methods, models, and constructs through an operational lens (i.e., aspects and attributes associated with operational considerations in the development, testing, and validation of the EBB methods and models and their use in an operational environment). A 2-day subject matter expert workshop was held to scientifically identify, develop, and vet a set of attributes for the broad range of such operational considerations. Workshop participants identified and described comprehensive attributes for the characterization of EBB. The identified attributes are: (1) event, (2) readiness, (3) operational aspects, (4) geographic coverage, (5) population coverage, (6) input data, (7) output, and (8) cost. Ultimately, the analyses herein discuss the broad scope, complexity, and relevant issues germane to EBB useful in an operational environment.

  4. A review of evaluations of electronic event-based biosurveillance systems.

    Science.gov (United States)

    Gajewski, Kimberly N; Peterson, Amy E; Chitale, Rohit A; Pavlin, Julie A; Russell, Kevin L; Chretien, Jean-Paul

    2014-01-01

    Electronic event-based biosurveillance systems (EEBS's) that use near real-time information from the internet are an increasingly important source of epidemiologic intelligence. However, there has not been a systematic assessment of EEBS evaluations, which could identify key uncertainties about current systems and guide EEBS development to most effectively exploit web-based information for biosurveillance. To conduct this assessment, we searched PubMed and Google Scholar to identify peer-reviewed evaluations of EEBS's. We included EEBS's that use publicly available internet information sources, cover events that are relevant to human health, and have global scope. To assess the publications using a common framework, we constructed a list of 17 EEBS attributes from published guidelines for evaluating health surveillance systems. We identified 11 EEBS's and 20 evaluations of these EEBS's. The number of published evaluations per EEBS ranged from 1 (Gen-Db, GODsN, MiTAP) to 8 (GPHIN, HealthMap). The median number of evaluation variables assessed per EEBS was 8 (range, 3-15). Ten published evaluations contained quantitative assessments of at least one key variable. No evaluations examined usefulness by identifying specific public health decisions, actions, or outcomes resulting from EEBS outputs. Future EEBS assessments should identify and discuss critical indicators of public health utility, especially the impact of EEBS's on public health response.

  5. A review of evaluations of electronic event-based biosurveillance systems.

    Directory of Open Access Journals (Sweden)

    Kimberly N Gajewski

    Electronic event-based biosurveillance systems (EEBS's) that use near real-time information from the internet are an increasingly important source of epidemiologic intelligence. However, there has not been a systematic assessment of EEBS evaluations, which could identify key uncertainties about current systems and guide EEBS development to most effectively exploit web-based information for biosurveillance. To conduct this assessment, we searched PubMed and Google Scholar to identify peer-reviewed evaluations of EEBS's. We included EEBS's that use publicly available internet information sources, cover events that are relevant to human health, and have global scope. To assess the publications using a common framework, we constructed a list of 17 EEBS attributes from published guidelines for evaluating health surveillance systems. We identified 11 EEBS's and 20 evaluations of these EEBS's. The number of published evaluations per EEBS ranged from 1 (Gen-Db, GODsN, MiTAP) to 8 (GPHIN, HealthMap). The median number of evaluation variables assessed per EEBS was 8 (range, 3-15). Ten published evaluations contained quantitative assessments of at least one key variable. No evaluations examined usefulness by identifying specific public health decisions, actions, or outcomes resulting from EEBS outputs. Future EEBS assessments should identify and discuss critical indicators of public health utility, especially the impact of EEBS's on public health response.

  6. A Saccade Based Framework for Real-Time Motion Segmentation Using Event Based Vision Sensors

    Science.gov (United States)

    Mishra, Abhishek; Ghosh, Rohan; Principe, Jose C.; Thakor, Nitish V.; Kukreja, Sunil L.

    2017-01-01

    Motion segmentation is a critical pre-processing step for autonomous robotic systems to facilitate tracking of moving objects in cluttered environments. Event based sensors are low power analog devices that represent a scene by means of asynchronous information updates of only the dynamic details at high temporal resolution and, hence, require significantly less calculations. However, motion segmentation using spatiotemporal data is a challenging task due to data asynchrony. Prior approaches for object tracking using neuromorphic sensors perform well while the sensor is static or a known model of the object to be followed is available. To address these limitations, in this paper we develop a technique for generalized motion segmentation based on spatial statistics across time frames. First, we create micromotion on the platform to facilitate the separation of static and dynamic elements of a scene, inspired by human saccadic eye movements. Second, we introduce the concept of spike-groups as a methodology to partition spatio-temporal event groups, which facilitates computation of scene statistics and characterize objects in it. Experimental results show that our algorithm is able to classify dynamic objects with a moving camera with maximum accuracy of 92%. PMID:28316563

  7. Too exhausted to remember: ego depletion undermines subsequent event-based prospective memory.

    Science.gov (United States)

    Li, Jian-Bin; Nie, Yan-Gang; Zeng, Min-Xia; Huntoon, Meghan; Smith, Jessi L

    2013-01-01

    Past research has consistently found that people are likely to do worse on high-level cognitive tasks after exerting self-control on previous actions. However, little has been unraveled about to what extent ego depletion affects subsequent prospective memory. Drawing upon the self-control strength model and the relationship between self-control resources and executive control, this study proposes that the initial actions of self-control may undermine subsequent event-based prospective memory (EBPM). Ego depletion was manipulated through watching a video requiring visual attention (Experiment 1) or completing an incongruent Stroop task (Experiment 2). Participants were then tested on EBPM embedded in an ongoing task. As predicted, the results showed that after ruling out possible intervening variables (e.g. mood, focal and nonfocal cues, and characteristics of ongoing task and ego depletion task), participants in the high-depletion condition performed significantly worse on EBPM than those in the low-depletion condition. The results suggested that the effect of ego depletion on EBPM was mainly due to an impaired prospective component rather than to a retrospective component.

  8. Framework for event-based semidistributed modeling that unifies the SCS-CN method, VIC, PDM, and TOPMODEL

    Science.gov (United States)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-09-01

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and are missing ready-to-use analytical expressions that are analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions called VICx, PDMx, and TOPMODELx also are extended with a spatial description of the runoff concept of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (fraction of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths that are cumulative values for the storm duration, and the average unit area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
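
    For orientation, the SCS-CN runoff curve that anchors the unified framework takes its standard form: potential retention S = 25400/CN - 254 (in mm), initial abstraction Ia = 0.2S, and event runoff Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else Q = 0. A direct transcription:

        def scs_cn_runoff(P_mm, CN, ia_ratio=0.2):
            """Event runoff depth (mm) from storm rainfall P_mm and curve number CN."""
            S = 25400.0 / CN - 254.0          # potential maximum retention (mm)
            Ia = ia_ratio * S                 # initial abstraction (mm)
            if P_mm <= Ia:
                return 0.0
            return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

        print(round(scs_cn_runoff(P_mm=80.0, CN=75), 1))  # runoff for an 80 mm storm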

  9. Agreement between event-based and trend-based glaucoma progression analyses.

    Science.gov (United States)

    Rao, H L; Kumbar, T; Kumar, A U; Babu, J G; Senthil, S; Garudadri, C S

    2013-07-01

    To evaluate the agreement between event- and trend-based analyses to determine visual field (VF) progression in glaucoma. VFs of 175 glaucoma eyes with ≥5 VFs were analyzed by the proprietary software of the VF analyzer to determine progression. Agreement (κ) between trend-based analysis of the VF index (VFI) and event-based analysis (glaucoma progression analysis, GPA) was evaluated. For eyes progressing by both event- and trend-based methods, time to progression by the two methods was calculated. Median number of VFs per eye was 7 and median follow-up 7.5 years. GPA classified 101 eyes (57.7%) as stable, 30 eyes (17.1%) as possible and 44 eyes (25.2%) as likely progression. Trend-based analysis classified 122 eyes (69.7%) as stable (slope >-1% per year, or any slope magnitude with P>0.05) and 53 eyes (30.3%) as progressing. Agreement (κ) between sensitive criteria of GPA (possible clubbed with likely progression) and trend-based analysis was 0.48, and between specific criteria of GPA (possible clubbed with no progression) and trend-based analysis was 0.50. In eyes progressing by sensitive criteria of both methods (42 eyes), median time to progression by GPA (4.9 years) was similar (P=0.30) to that by the trend-based method (5.0 years). This was also similar in eyes progressing by specific criteria of both methods (25 eyes; 5.6 years versus 5.9 years, P=0.23). Agreement between event- and trend-based progression analysis was moderate. GPA seemed to detect progression earlier than trend-based analysis, but this wasn't statistically significant.
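
    The agreement statistic used above is Cohen's κ, the excess of observed over chance agreement on the paired stable/progressing classifications. A minimal sketch with hypothetical counts (not the study's data):

        import numpy as np

        def cohens_kappa(confusion):
            """Cohen's kappa from a square confusion matrix of paired ratings."""
            m = np.asarray(confusion, dtype=float)
            n = m.sum()
            po = np.trace(m) / n                        # observed agreement
            pe = (m.sum(0) * m.sum(1)).sum() / n ** 2   # chance agreement
            return (po - pe) / (1.0 - pe)

        # Rows: method A (stable, progressing); columns: method B. Invented counts.
        print(round(cohens_kappa([[95, 27], [10, 43]]), 2))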

  10. Coupling urban event-based and catchment continuous modelling for combined sewer overflow river impact assessment

    Science.gov (United States)

    Andrés-Doménech, I.; Múnera, J. C.; Francés, F.; Marco, J. B.

    2010-10-01

    Since the Water Framework Directive (WFD) was passed in the year 2000, the conservation of water bodies in the EU must be understood in a completely different way. With regard to combined sewer overflows (CSOs) from urban drainage networks, the WFD implies that CSOs cannot be judged acceptable or not by their intrinsic features alone; they must be assessed for their impact on the receiving water bodies in agreement with specific environmental aims. Consequently, both the urban system and the receiving water body must be jointly analysed to evaluate the environmental impact generated on the latter. In this context, a coupled scheme is presented in this paper to assess the CSOs impact on a river system in Torrelavega (Spain). First, an urban model is developed to statistically characterise the CSOs frequency, volume and duration. The main feature of this first model is that it is event-based: the system is modelled with built synthetic storms which adequately cover the probability range of the main rainfall descriptors, i.e., rainfall event volume and peak intensity. Thus, CSOs are characterised in terms of their occurrence probability. Secondly, a continuous and distributed basin model is built to assess river response at different points in the river network. This model was calibrated initially on a daily scale and downscaled later to an hourly scale. The main objective of this second element of the scheme is to provide the most likely state of the receiving river when a CSO occurs. By combining results of both models, CSO and river flows are homogeneously characterised from a statistical point of view. Finally, results from both models were coupled to estimate the final concentration of some analysed pollutants (biochemical oxygen demand, BOD, and total ammonium, NH4+) within the river just after the spills.

  11. Coupling urban event-based and catchment continuous modelling for combined sewer overflow river impact assessment

    Directory of Open Access Journals (Sweden)

    I. Andrés-Doménech

    2010-10-01

    Since the Water Framework Directive (WFD) was passed in the year 2000, the conservation of water bodies in the EU must be understood in a completely different way. With regard to combined sewer overflows (CSOs) from urban drainage networks, the WFD implies that CSOs cannot be judged acceptable or not by their intrinsic features alone; they must be assessed for their impact on the receiving water bodies in agreement with specific environmental aims. Consequently, both the urban system and the receiving water body must be jointly analysed to evaluate the environmental impact generated on the latter. In this context, a coupled scheme is presented in this paper to assess the CSOs impact on a river system in Torrelavega (Spain). First, an urban model is developed to statistically characterise the CSOs frequency, volume and duration. The main feature of this first model is that it is event-based: the system is modelled with built synthetic storms which adequately cover the probability range of the main rainfall descriptors, i.e., rainfall event volume and peak intensity. Thus, CSOs are characterised in terms of their occurrence probability. Secondly, a continuous and distributed basin model is built to assess river response at different points in the river network. This model was calibrated initially on a daily scale and downscaled later to an hourly scale. The main objective of this second element of the scheme is to provide the most likely state of the receiving river when a CSO occurs. By combining results of both models, CSO and river flows are homogeneously characterised from a statistical point of view. Finally, results from both models were coupled to estimate the final concentration of some analysed pollutants (biochemical oxygen demand, BOD, and total ammonium, NH4+) within the river just after the spills.

  12. Coupling urban event-based and catchment continuous modelling for combined sewer overflow river impact assessment

    Directory of Open Access Journals (Sweden)

    I. Andrés-Doménech

    2010-05-01

    Since the Water Framework Directive (WFD) was passed in the year 2000, the protection of water bodies in the EU must be understood in a completely different way. With regard to combined sewer overflows (CSOs) from urban drainage networks, the WFD implies that CSOs cannot be judged acceptable by their intrinsic features alone, but must be assessed for their impact on the receiving water bodies in agreement with specific environmental aims. Consequently, both the urban system and the receiving one must be jointly analysed to evaluate their impact. In this context, a coupled scheme is presented in this paper to assess the CSOs impact in a river system in Torrelavega (Spain). First, an urban model is developed to characterise statistically the CSOs frequency, volume and duration. The main feature of this first model is that it is event-based: the system is modelled with built synthetic storms which adequately cover the probability range of the main rainfall descriptors, i.e., rainfall event volume and peak intensity. Thus, CSOs are characterised in terms of their occurrence probability. Secondly, a continuous and distributed basin model is built to assess the river response at different points in the river network. This model was calibrated initially on a daily scale and downscaled later to the hourly scale. The main objective of this second element of the scheme is to provide the most likely state of the receiving river when a CSO occurs. By combining results of both models, CSO and river flows are homogeneously characterised from a statistical point of view. Finally, results from both models were coupled to estimate the final concentration of some analysed pollutants (the biochemical oxygen demand, BOD, and the total ammonium, NH4+) in the river just after the spills.

  13. Generalization of the Event-Based Carnevale-Hines Integration Scheme for Integrate-and-Fire Models

    NARCIS (Netherlands)

    van Elburg, Ronald A. J.; van Ooyen, Arjen

    2009-01-01

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on th

  14. Assessing distractors and teamwork during surgery: developing an event-based method for direct observation.

    Science.gov (United States)

    Seelandt, Julia C; Tschan, Franziska; Keller, Sandra; Beldi, Guido; Jenni, Nadja; Kurmann, Anita; Candinas, Daniel; Semmer, Norbert K

    2014-11-01

    To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes.

  15. On the significance of the Nash-Sutcliffe efficiency measure for event-based flood models

    Science.gov (United States)

    Moussa, Roger

    2010-05-01

    When modelling flood events, the important challenge that awaits the modeller is first to choose a rainfall-runoff model, then to calibrate a set of parameters that can accurately simulate a number of flood events and related hydrograph shapes, and finally to evaluate the model performance separately on each event using multi-criteria functions. This study analyses the significance of the Nash-Sutcliffe efficiency (NSE) and proposes a new method to assess the performance of flood event models (see Moussa, 2010, "When monstrosity can be beautiful while normality can be ugly: assessing the performance of event-based flood models", Hydrological Sciences Journal, in press). We focus on the specific cases of events that are difficult to model and characterized by low NSE values, which we call "monsters". The properties of the NSE were analysed as a function of the calculated hydrograph shape and of the benchmark reference model. As an application case, a multi-criteria analysis method to assess the model performance on each event is proposed and applied on the Gardon d'Anduze catchment. This paper first discusses the significance of the well-known Nash-Sutcliffe efficiency (NSE) criterion when calculated separately on flood events. The NSE is a convenient and normalized measure of model performance, but does not provide a reliable basis for comparing the results of different case studies. We show that simulated hydrographs with low or negative values of NSE, called "monsters", can be due solely to a simple lag translation or a homothetic ratio of the observed hydrograph which reproduces the dynamics of the hydrograph, with acceptable errors on other criteria. Conversely, results show that simulations with an NSE close to 1 can become "monsters" and give very low (even negative) values of the criteria function G if the average observed discharge used as a benchmark reference model in the NSE is modified. This paper argues that the definition of an appropriate benchmark
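
    In formula form, the criterion under discussion is NSE = 1 - Σ(Qobs - Qsim)² / Σ(Qobs - mean(Qobs))², with the observed mean acting as the benchmark model. The sketch below, on an invented hydrograph, reproduces the paper's point: a simulation that is merely a one-step lag of the observations preserves the hydrograph shape yet scores an NSE near zero.

        import numpy as np

        def nse(obs, sim):
            """Nash-Sutcliffe efficiency of a simulated series against observations."""
            obs, sim = np.asarray(obs, float), np.asarray(sim, float)
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        obs = np.array([1, 2, 8, 20, 12, 6, 3, 2], float)   # hypothetical flood event
        lagged = np.roll(obs, 1)                            # same shape, one-step lag
        print(round(nse(obs, obs * 0.9), 2),                # approx 0.98
              round(nse(obs, lagged), 2))                   # approx 0.02: a "monster"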

  16. Event-based prospective memory among veterans: The role of posttraumatic stress disorder symptom severity in executing intentions.

    Science.gov (United States)

    McFarland, Craig P; Clark, Justin B; Lee, Lewina O; Grande, Laura J; Marx, Brian P; Vasterling, Jennifer J

    2016-01-01

    Posttraumatic stress disorder (PTSD) has been linked with neuropsychological deficits in several areas, including attention, learning and memory, and cognitive inhibition. Although memory dysfunction is among the most commonly documented deficits associated with PTSD, our existing knowledge pertains only to retrospective memory. The current study investigated the relationship between PTSD symptom severity and event-based prospective memory (PM). Forty veterans completed a computerized event-based PM task, a self-report measure of PTSD, and measures of retrospective memory. Hierarchical regression analysis revealed that PTSD symptom severity accounted for 16% of the variance in PM performance, F(3, 36) = 3.47, p < .05, over and above that accounted for by retrospective memory. Additionally, each of the three PTSD symptom clusters was related, to varying degrees, to PM performance. Results suggest that elevated PTSD symptoms may be associated with more difficulties completing tasks requiring PM. Further examination of PM in PTSD is warranted, especially in regard to its impact on everyday functioning.

  17. Evaluations of Risks from the Lunar and Mars Radiation Environments

    Science.gov (United States)

    Kim, Myung-Hee; Hayat, Matthew J.; Feiveson, Alan H.; Cucinotta, Francis A.

    2008-01-01

    Protecting astronauts from the space radiation environments requires accurate projections of radiation in future space missions. Characterization of the ionizing radiation environment is challenging because the interplanetary plasma and radiation fields are modulated by solar disturbances, and the radiation doses received by astronauts in interplanetary space are likewise influenced. The galactic cosmic radiation (GCR) flux for the next solar cycle was estimated as a function of the interplanetary deceleration potential, which has been derived from GCR flux and Climax neutron monitor rate measurements over the last 4 decades. Owing to the chaotic nature of solar particle event (SPE) occurrence, the mean frequency of SPEs at any given proton fluence threshold during a defined mission duration was obtained from a Poisson process model using proton fluence measurements of SPEs during the past 5 solar cycles (19-23). Analytic energy spectra of 34 historically large SPEs were constructed over broad energy ranges extending to GeV. Using an integrated space radiation model (which includes the transport codes HZETRN [1] and BRYNTRN [2], and the quantum nuclear interaction model QMSFRG [3]), the propagation and interaction properties of the energetic nucleons through various media were predicted. Risk from GCR and SPEs was assessed for specific organs inside a typical spacecraft using the CAM [4] model. The representative risk level at each event size and its standard deviation were obtained from the analysis of the 34 SPEs. Risks from different event sizes and their frequencies of occurrence in a specified mission period were evaluated to address concerns about acute health effects, especially during extra-vehicular activities (EVA). The results will be useful for the development of an integrated strategy for optimizing radiation protection on lunar and Mars missions. Keywords: Space Radiation Environments; Galactic Cosmic Radiation; Solar Particle Event; Radiation Risk; Risk
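
    One consequence of the Poisson-process element is worth making explicit: if SPEs above a given proton-fluence threshold occur at a mean rate λ per year, the probability of at least one such event during a mission of T years is 1 - exp(-λT). A sketch with an illustrative rate (not a value derived in the paper):

        import math

        def prob_at_least_one_spe(rate_per_year, mission_years):
            """P(at least one SPE above threshold) under a Poisson process."""
            return 1.0 - math.exp(-rate_per_year * mission_years)

        # Illustrative: 0.8 qualifying events/year over a 6-month mission.
        print(round(prob_at_least_one_spe(rate_per_year=0.8, mission_years=0.5), 2))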

  18. The evolution of inclusions in high-quality GCr15 bearing steel during the secondary refining process

    Institute of Scientific and Technical Information of China (English)

    朱诚意; 吴炳新; 张志成; 李光强; 潘明旭

    2015-01-01

    Based on the BOF-LF-RH-CC production route for GCr15 bearing steel, the evolution and mechanism of inclusions during the LF and RH refining processes were studied using FE-SEM/EDS. The results show that the complex inclusions in the steel evolve in two ways: Al2O3 → MgO·Al2O3 → (CaO-MgO-Al2O3-(CaS)) complex oxide inclusions, and Al2O3 → (Al2O3-MnS) → (Al2O3-MnS-Ti(C,N)) complex oxide-sulfide-carbonitride inclusions. Desulfurization is pronounced during LF refining, and the number of sulfide inclusions in the steel decreases substantially. At the beginning of LF refining the steel mainly contains single-phase inclusions such as MnS, Al2O3 and TiN. By the end of LF refining the inclusions have evolved into complex inclusions with Al2O3 cores surrounded by oxides and by MnS, TiN, Ti(C,N) and CaS. CaO in the refining slag and MgO in the refractory, once reduced, react with dissolved oxygen in the steel, so that D-type inclusions increase at the end of LF refining. RH treatment and soft argon blowing further enhance the removal of sulfides, but the content of D-type inclusions, and of D-type inclusions combined with A- and T-type inclusions, increases. During the LF stage, inclusion sizes are concentrated mainly in the 1-3 μm range, whereas by the RH stage they are concentrated mainly below 1 μm. The maximum inclusion size decreases from 10.79 μm to 5.68 μm, and the number of inclusions per unit area from 372 /mm2 to 258 /mm2. RH treatment and soft argon blowing effectively reduce inclusions larger than 3 μm.

  19. Effect of Ca-Si Wire Feeding on Oxygen Content and Inclusions in Al-Deoxidized and Vacuum-Degassed GCr15 Steel

    Institute of Scientific and Technical Information of China (English)

    范植金; 冯文圣; 罗国华; 朱玉秀

    2011-01-01

    The effect of Ca-Si wire feeding on oxygen content and inclusions in Al-deoxidized and vacuum-degassed GCr15 steel was investigated by oxygen content analysis, inclusion rating and electron probe observation. The results show that feeding Ca-Si wire to Al-deoxidized GCr15 steel after LF and VD secondary refining produced no appreciable difference in oxygen content or D-type point inclusion rating compared with the same steel without Ca-Si wire feeding. The problem of nozzle clogging during continuous casting could be solved by feeding Ca-Si wire. Granular calcium-aluminate inclusions, often coated with a layer of CaS, were formed in GCr15 steel refined with a CaO-SiO2-Al2O3 slag.

  20. Risk

    Science.gov (United States)

    Barshi, Immanuel

    2016-01-01

    Speaking up, i.e. expressing one's concerns, is a critical piece of effective communication. Yet, we see many situations in which crew members have concerns and still remain silent. Why would that be the case? How can we assess the risks of speaking up versus the risks of keeping silent? And once we do make up our minds to speak up, how should we go about it? Our workshop aims to answer these questions and to provide us all with practical tools for effective risk assessment and effective speaking-up strategies.

  1. Event-Based Operational Semantics and a Consistency Result for Real-Time Concurrent Processes with Action Refinement

    Institute of Scientific and Technical Information of China (English)

    Xiu-Li Sun; Wen-Yin Zhang; Jin-Zhao Wu

    2004-01-01

    In this paper an event-based operational interleaving semantics is proposed for real-time processes, for which action refinement and a denotational true concurrency semantics are developed and defined in terms of timed event structures. The authors characterize the timed event traces that are generated by the operational semantics in a denotational way, and show that this operational semantics is consistent with the denotational semantics in the sense that they generate the same set of timed event traces, thereby eliminating the gap between the true concurrency and interleaving semantics.

  2. Concentration of electrostatic solitary waves around magnetic nulls within magnetic reconnection diffusion region: single-event-based statistics

    Science.gov (United States)

    Li, Shiyou; Zhang, Shifeng; Cai, Hong; Yu, Sufang

    2014-12-01

    It is important to study the 'concentrated' electrostatic solitary waves/structures (ESWs) associated with magnetic reconnection. In the published literature on this topic, very few studies have reported the observation of such a large number of ESWs in a single magnetic reconnection event. In this work, we report our observation of a large number of ESWs around the magnetic null-pairs within the magnetic reconnection ion diffusion region of Earth's magnetosphere on 10 September 2001. With more than 9,600 cases of ESWs observed around magnetic null-pairs and more than 97,600 cases observed during the ion diffusion region crossing time span, the observation of such a large number of ESWs in the diffusion region has not often been reported in published works. We further perform a single-event-based statistical analysis of the characteristics of the ESWs around magnetic null-pairs. Based on the statistical result, we speculate that the two-stream instability originating from the magnetic null and traveling outward along the plasma sheet boundary layer (PSBL) is the candidate mechanism for the large number of observed ESWs. Our observation and analysis in this work suggest that even with the presence of a complex magnetic structure around a magnetic null-pair in the three-dimensional regime, concentrated ESWs can be observed. This single-reconnection-event-based statistical result of ESWs around the magnetic null-pairs can aid in understanding the microdynamics associated with three-dimensional (3D) magnetic reconnection.

  3. Incentive Effects on Event-Based Prospective Memory Performance in Children and Adolescents with Traumatic Brain Injury

    Science.gov (United States)

    McCauley, Stephen R.; McDaniel, Mark A.; Pedroza, Claudia; Chapman, Sandra B.; Levin, Harvey S.

    2011-01-01

    Prospective memory (PM) is the formation of an intention and remembering to perform this intention at a future time or in response to specific cues. PM tasks are a ubiquitous part of daily life. Currently, there is a paucity of information regarding PM impairments in children with traumatic brain injury (TBI) and less empirical evidence regarding effective remediation strategies to mitigate these impairments. The present study employed two levels of a motivational enhancement (i.e., a monetary incentive) to determine if event-based PM could be improved in children with severe TBI. In a cross-over design, children with orthopedic injuries and mild or severe TBI were compared on two levels of incentive (dollars versus pennies) given in response to accurate performance. All three groups performed significantly better under the high- versus low-motivation conditions. However, the severe TBI group’s high-motivation condition performance remained significantly below the low-motivation condition performance of the orthopedic injury group. PM scores were positively and significantly related to age-at-test, but there were no age-at-injury or time-postinjury effects. Overall, these results suggest that event-based PM can be significantly improved in children with severe TBI. PMID:19254093

  4. Breaking The Millisecond Barrier On SpiNNaker: Implementing Asynchronous Event-Based Plastic Models With Microsecond Resolution

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-06-01

    Spike-based neuromorphic sensors such as retinas and cochleas change the way in which the world is sampled. Instead of producing data sampled at a constant rate, these sensors output spikes that are asynchronous and event-driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms towards those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution in the order of microseconds. This high temporal resolution is a crucial factor in learning tasks. It is also widely used in the field of biological neural networks. Sound localization, for instance, relies on detecting time lags between the two ears, which, in the barn owl, reaches a temporal resolution of 5 microseconds. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution in the order of milliseconds that is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous and event-based scheme running with a microsecond time resolution. Results on two example networks using this new implementation are presented.
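
    The shift the record describes, from a fixed millisecond tick to handling each spike at its own microsecond timestamp, is essentially an event-driven loop over a time-ordered queue. A minimal sketch under that reading; the neuron names and the 150-microsecond delay are hypothetical, not part of the SpiNNaker framework:

        import heapq

        def run_event_driven(spikes, delay_us=150):
            # Events are processed strictly in timestamp order (microseconds),
            # not on a fixed millisecond tick; `spikes` holds (t_us, source).
            queue = list(spikes)
            heapq.heapify(queue)
            while queue:
                t_us, src = heapq.heappop(queue)
                print(f"t={t_us} us: spike from {src}")
                # schedule a delayed downstream event at full us resolution
                if src.startswith("in"):
                    heapq.heappush(queue, (t_us + delay_us, f"post({src})"))

        run_event_driven([(0, "in0"), (5, "in1"), (1000, "in0")])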

  5. Distributed Event-Based Set-Membership Filtering for a Class of Nonlinear Systems With Sensor Saturations Over Sensor Networks.

    Science.gov (United States)

    Ma, Lifeng; Wang, Zidong; Lam, Hak-Keung; Kyriakoulis, Nikos

    2016-07-07

    In this paper, the distributed set-membership filtering problem is investigated for a class of discrete time-varying systems with an event-based communication mechanism over sensor networks. The system under consideration is subject to sector-bounded nonlinearity, unknown but bounded noises and sensor saturations. Each intelligent sensing node transmits data to its neighbors only when a certain triggering condition is violated. By means of a set of recursive matrix inequalities, sufficient conditions are derived for the existence of the desired distributed event-based filter which is capable of confining the system state in certain ellipsoidal regions centered at the estimates. Within the established theoretical framework, two additional optimization problems are formulated: one is to seek the minimal ellipsoids (in the sense of matrix trace) for the best filtering performance, and the other is to maximize the triggering threshold so as to reduce the triggering frequency while retaining satisfactory filtering performance. A numerically attractive chaos algorithm is employed to solve the optimization problems. Finally, an illustrative example is presented to demonstrate the effectiveness and applicability of the proposed algorithm.

  6. Incentive effects on event-based prospective memory performance in children and adolescents with traumatic brain injury.

    Science.gov (United States)

    McCauley, Stephen R; McDaniel, Mark A; Pedroza, Claudia; Chapman, Sandra B; Levin, Harvey S

    2009-03-01

    Prospective memory (PM) is the formation of an intention and remembering to perform this intention at a future time or in response to specific cues. PM tasks are a ubiquitous part of daily life. Currently, there is a paucity of information regarding PM impairments in children with traumatic brain injury (TBI) and less empirical evidence regarding effective remediation strategies to mitigate these impairments. The present study employed two levels of a motivational enhancement (i.e., a monetary incentive) to determine whether event-based PM could be improved in children with severe TBI. In a crossover design, children with orthopedic injuries and mild or severe TBI were compared on two levels of incentive (dollars vs. pennies) given in response to accurate performance. All three groups performed significantly better under the high- versus low-motivation conditions. However, the severe TBI group's high-motivation condition performance remained significantly below the low-motivation condition performance of the orthopedic injury group. PM scores were positively and significantly related to age-at-test, but there were no age-at-injury or time-postinjury effects. Overall, these results suggest that event-based PM can be significantly improved in children with severe TBI.

  7. Intensity changes in future extreme precipitation: A statistical event-based approach.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2017-04-01

    method, unchanged. The advantage of the suggested Pi-Td method of projecting future precipitation events from historic events is that it is simple to use and less expensive in time, computation and resources than a numerical model. The outcome can be used directly for hydrological and climatological studies and for impact analyses such as flood risk assessments.
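
    The abstract is truncated, so the exact Pi-Td relation is not shown here; a common convention for this kind of event-based projection is to scale observed intensities with dew-point temperature change at roughly the Clausius-Clapeyron rate. A sketch under that assumption only:

        import numpy as np

        def scale_event(intensities_mm_h, delta_td_K, cc_rate=0.07):
            # Scale each observed intensity by (1 + cc_rate) per kelvin of
            # dew-point change; 7 %/K is the Clausius-Clapeyron benchmark.
            # The paper's actual Pi-Td relation may differ.
            return np.asarray(intensities_mm_h) * (1.0 + cc_rate) ** delta_td_K

        event = [2.0, 8.5, 14.0, 6.0, 1.5]         # 10-min intensities (mm/h)
        print(scale_event(event, delta_td_K=2.0))  # +2 K dew-point scenario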

  8. A study of preservice elementary teachers enrolled in a discrepant-event-based physical science class

    Science.gov (United States)

    Lilly, James Edward

    This research evaluated the POWERFUL IDEAS IN PHYSICAL SCIENCE (PIiPS) curriculum model used to develop a physical science course taken by preservice elementary teachers. The focus was on the evaluation of discrepant events used to induce conceptual change in relation to students' ideas concerning heat, temperature, and specific heat. Both quantitative and qualitative methodologies were used for the analysis. Data were collected during the 1998 Fall semester using two classes of physical science for elementary school teachers. The traditionally taught class served as the control group, and the class using the PIiPS curriculum model was the experimental group. The PIiPS curriculum model was evaluated quantitatively for its influence on students' attitude toward science, anxiety toward teaching science, self-efficacy toward teaching science, and content knowledge. An analysis of covariance was performed on the quantitative data to test for significant differences between the means of the posttests for the control and experimental groups while controlling for pretest. It was found that there were no significant differences between the means of the control and experimental groups with respect to changes in their attitude toward science, anxiety toward teaching science and self-efficacy toward teaching science. A significant difference between the means of the content examination was found (F(1,28) = 14.202 and p = 0.001); however, the result is questionable. The heat and energy module was the target for qualitative scrutiny. Coding for discrepant events was adapted from Appleton's 1996 work on students' responses to discrepant-event science lessons. The following qualitative questions were posed for the investigation: (1) what were the ideas of the preservice elementary students prior to entering the classroom regarding heat and energy, (2) how effective were the discrepant events as presented in the PIiPS heat and energy module, and (3) how much does the "risk taking

  9. Event-based knowledge elicitation of operating room management decision-making using scenarios adapted from information systems data

    Directory of Open Access Journals (Sweden)

    Epstein Richard H

    2011-01-01

    Background: No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. Methods: A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Results: Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of

  10. Evidence Report: Risk of Radiation Carcinogenesis

    Science.gov (United States)

    Huff, Janice; Carnell, Lisa; Blattnig, Steve; Chappell, Lori; Kerry, George; Lumpkins, Sarah; Simonsen, Lisa; Slaba, Tony; Werneth, Charles

    2016-01-01

    As noted by Durante and Cucinotta (2008), cancer risk caused by exposure to space radiation is now generally considered a main hindrance to interplanetary travel for the following reasons: large uncertainties are associated with the projected cancer risk estimates; no simple and effective countermeasures are available; and significant uncertainties prevent scientists from determining the effectiveness of countermeasures. Optimizing operational parameters such as the length of space missions, crew selection for age and sex, or applying mitigation measures such as radiation shielding or use of biological countermeasures can be used to reduce risk, but these procedures have inherent limitations and are clouded by uncertainties. Space radiation is comprised of high-energy protons, neutrons and high charge (Z) and energy (E) nuclei (HZE). The ionization patterns and resulting biological insults of these particles in molecules, cells, and tissues are distinct from typical terrestrial radiation, which is largely X-rays and gamma-rays, and generally characterized as low linear energy transfer (LET) radiation. Galactic cosmic rays (GCR) are comprised mostly of highly energetic protons with a small component of high charge and energy (HZE) nuclei. Prominent HZE nuclei include He, C, O, Ne, Mg, Si, and Fe. GCR ions have median energies near 1 GeV/n, and energies as high as 10 GeV/n make important contributions to the total exposure. Ionizing radiation is a well-known carcinogen on Earth (BEIR 2006). The risks of cancer from X-rays and gamma-rays have been established at doses above 50 mSv (5 rem), although there are important uncertainties and ongoing scientific debate about cancer risk at lower doses and at low dose rates, which leads to significant uncertainties in projecting cancer risks during space exploration (Cucinotta and Durante 2006; Durante and Cucinotta 2008).

  11. Event-based state estimation for a class of complex networks with time-varying delays: A comparison principle approach

    Science.gov (United States)

    Zhang, Wenbing; Wang, Zidong; Liu, Yurong; Ding, Derui; Alsaadi, Fuad E.

    2017-01-01

    The paper is concerned with the state estimation problem for a class of time-delayed complex networks with an event-triggering communication protocol. A novel event generator function, which depends not only on the measurement output but also on a predefined positive constant, is proposed in the hope of reducing the communication burden. A new concept of exponentially ultimate boundedness is provided to quantify the estimation performance. By means of the comparison principle, some sufficient conditions are obtained to guarantee that the estimation error is exponentially ultimately bounded, and the estimator gains are then obtained in terms of the solution of certain matrix inequalities. Furthermore, a rigorous proof shows that the designed triggering condition is free of Zeno behavior. Finally, a numerical example is given to illustrate the effectiveness of the proposed event-based estimator.
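
    The mixed relative/absolute event generator described above can be illustrated with a send-on-innovation rule around a toy scalar estimator; the system, gain and thresholds below are invented for illustration (in the paper the gains come from the matrix inequalities):

        import numpy as np

        a, c = 0.95, 1.0          # toy scalar dynamics x+ = a x + w, y = c x + v
        L_gain = 0.5              # estimator gain (illustrative)
        sigma, delta = 0.1, 0.05  # trigger: relative term + positive constant

        rng = np.random.default_rng(0)
        x, xhat, y_last, sent = 1.0, 0.0, None, 0
        for k in range(50):
            x = a * x + 0.01 * rng.standard_normal()
            y = c * x + 0.01 * rng.standard_normal()
            # transmit only when the deviation from the last sent output
            # exceeds the mixed threshold (the event generator function)
            if y_last is None or (y - y_last) ** 2 > sigma * y ** 2 + delta:
                y_last, sent = y, sent + 1
            # the estimator always runs on the most recently sent output
            xhat = a * xhat + L_gain * (y_last - c * xhat)
        print(f"transmitted {sent}/50 samples, final error {abs(x - xhat):.3f}")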

  12. Event-based versus process-based informed consent to address scientific evidence and uncertainties in ionising medical imaging.

    Science.gov (United States)

    Recchia, Virginia; Dodaro, Antonio; Braga, Larissa

    2013-10-01

    Inappropriate ionising medical imaging has been escalating over the last decades. This trend leads to potential damage to health and has been associated with bioethical and legal issues of patient autonomy. While the doctrine underlines the importance of using informed consent to improve patient autonomy and physician-patient communication, some researchers have argued that it often falls short of this aim. There are basically two different informed consent practices. The first - the so-called "event-based model" - regards informed consent as a passive signature of a standard unreadable template, performed only once in each medical pathway. The second - the so-called "process-based model" - integrates information into the continuing dialogue between physician and patient, vital for diagnosis and treatment. Current medical behaviour often embraces the event-based model, which is considered ineffective and contributes to inappropriateness. We sought, in this review, to analyse from juridical and communication standpoints whether process-based informed consent can deal with scientific uncertainties in radiological decision-making. Informed consent is still a distinctive process in defence of both patients' and physicians' health and dignity in rule-of-law states and consequently in curtailing the abuse of ionising medical radiation. • Inappropriate ionising medical imaging is widespread and increasing worldwide. • This trend leads to noteworthy damage to health and is linked to the issue of patient autonomy. • Some authors have argued that informed consent often falls short of improving patient autonomy. • Process-based informed consent can deal with scientific uncertainties to counter inappropriateness. • Informed consent is still a distinctive process in defence of both patients and physicians.

  13. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    Directory of Open Access Journals (Sweden)

    Wooyeon Sunwoo

    2017-01-01

    Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall event. Soil moisture is one of the most important variables in rainfall–runoff modeling, and remotely sensed soil moisture is recognized as an effective way to improve the accuracy of runoff prediction. In this study, the IWC was evaluated based on remotely sensed soil moisture by using the Soil Conservation Service Curve Number (SCS-CN) method, which is one of the representative event-based models used for reducing the uncertainty of runoff prediction. Four proxy variables for the IWC were determined from the measurements of total rainfall depth (API5), ground-based soil moisture (SSMinsitu), remotely sensed surface soil moisture (SSM), and the soil water index (SWI) provided by the advanced scatterometer (ASCAT). To obtain a robust IWC framework, this study consists of two main parts: the validation of remotely sensed soil moisture, and the evaluation of runoff prediction using the four proxy variables with a set of rainfall–runoff events in the East Asian monsoon region. The results showed an acceptable agreement between remotely sensed soil moisture (SSM and SWI) and ground-based soil moisture data (SSMinsitu). In the proxy variable analysis, the SWI indicated the optimal value among the proposed proxy variables. In the runoff prediction analysis considering various infiltration conditions, the SSM and SWI proxy variables significantly reduced the runoff prediction error as compared with API5, by 60% and 66%, respectively. Moreover, the proposed IWC framework with
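
    The SCS-CN step named above turns an event rainfall depth into runoff through the curve number; remotely sensed wetness then conditions the curve number. A minimal sketch, where the dry/wet CN bounds and the linear SWI interpolation are assumptions for illustration, not the paper's fitted relation:

        def scs_cn_runoff(P_mm, CN):
            # Classic SCS-CN event runoff: Q = (P - Ia)^2 / (P - Ia + S)
            S = 25400.0 / CN - 254.0   # potential retention (mm)
            Ia = 0.2 * S               # initial abstraction
            return 0.0 if P_mm <= Ia else (P_mm - Ia) ** 2 / (P_mm - Ia + S)

        def cn_from_wetness(CN_avg, swi):
            # Hypothetical IWC adjustment: interpolate CN between dry and
            # wet bounds using a 0-1 soil water index from remote sensing.
            CN_dry, CN_wet = CN_avg - 15.0, CN_avg + 10.0
            return CN_dry + swi * (CN_wet - CN_dry)

        for swi in (0.2, 0.5, 0.8):
            print(swi, round(scs_cn_runoff(60.0, cn_from_wetness(75.0, swi)), 1))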

  14. Stochastic generation of daily rainfall events based on rainfall pattern classification and Copula-based rainfall characteristics simulation

    Science.gov (United States)

    Xu, Y. P.; Gao, C.

    2016-12-01

    To deal with the problem of having no or insufficiently long rainfall records, developing a stochastic rainfall model is essential. This study first proposed a stochastic model of daily rainfall events based on the classification and simulation of different rainfall patterns and copula-based joint simulation of rainfall characteristics. Compared with current stochastic rainfall models, this new model not only keeps the dependence structure of rainfall characteristics by using copula functions, but also takes into consideration various rainfall patterns that may cause different hydrological responses in a watershed. In order to determine the appropriate number of representative rainfall patterns in an objective way, we also introduced clustering validation measures into the stochastic model. Afterwards, the developed stochastic rainfall model was applied to 39 gauged meteorological stations in Zhejiang province, East China, and was then extended to ungauged stations for validation by applying the self-organizing map (SOM) method. The final results show that the 39 stations can be classified into seven regions that further fall into three categories based on rainfall generation mechanisms, i.e., a plum-rain control region, a typhoon-rain control region and a typhoon-plum-rain compatible region. Rainfall patterns of each station can be classified into five or six types based on clustering validation measures. This study shows that the stochastic rainfall model is robust and can be applied to both gauged and ungauged stations for generating long rainfall records.
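
    The copula step preserves the dependence between rainfall characteristics while leaving each marginal free. A minimal sketch with a Gaussian copula and illustrative exponential/gamma marginals (the record does not state which copula family or marginals were fitted):

        import numpy as np
        from scipy import stats

        def simulate_events(n, rho=0.6, seed=1):
            # Correlated normals -> uniforms (the copula) -> marginals
            rng = np.random.default_rng(seed)
            z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], n)
            u = stats.norm.cdf(z)
            duration = stats.expon.ppf(u[:, 0], scale=6.0)      # hours
            depth = stats.gamma.ppf(u[:, 1], a=1.5, scale=8.0)  # mm
            return duration, depth

        d, h = simulate_events(1000)
        print(np.corrcoef(d, h)[0, 1])   # dependence survives the marginals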

  15. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sachs subgraphs, MACCS, E-state and substructure fingerprints and, finally, Ghose and Crippen atom types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties, log P and log K, of 34 derivatives of 2-furylethylenes demonstrates similar or better predictive ability than the latter.
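
    At the core of these indices is Shannon's entropy computed over the frequencies of substructures "discovered" under a given event. A minimal sketch of that step, with made-up substructure labels standing in for a row of the relations frequency matrix F:

        import math
        from collections import Counter

        def shannon_entropy_ifi(substructures):
            # H = -sum p_i log2 p_i over the relative substructure frequencies
            counts = Counter(substructures)
            total = sum(counts.values())
            return -sum((n / total) * math.log2(n / total)
                        for n in counts.values())

        print(shannon_entropy_ifi(["C-C", "C-C", "C=O", "C-N", "C-C", "C=O"]))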

  16. SEDIMENT YIELD MODELING FOR SINGLE STORM EVENTS BASED ON HEAVY-DISCHARGE STAGE CHARACTERIZED BY STABLE SEDIMENT CONCENTRATION

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The relation between runoff volume and sediment yield for individual events in a given watershed has received little attention compared to the relation between water discharge and sediment yield, though it may underlie event-based sediment-yield models for large watersheds. Data observed at 12 experimental subwatersheds in the Dalihe river watershed in the hilly areas of the Loess Plateau, North China, were selected to develop and validate the relation. The peak flow is often considered an important factor affecting event sediment yield. However, in the study areas, sediment concentration remains relatively constant when water discharge exceeds a certain critical value, implying that heavier flow is not accompanied by higher sediment transport capacity. Hence, only the runoff volume factor was considered in the sediment-yield model. As both the total sediment and runoff discharge were largely produced during the heavy-discharge stage, and the sediment concentration was negligibly variable during this stage, a proportional function can be used to model the relation between event runoff volume and sediment yield for a given subwatershed. The applicability of this model at larger spatial scales was also discussed, and it was found that for the Yaoxinzhuang station in the Puhe River basin, which controls a drainage area of 2264 km2, a directly proportional relation between event runoff volume and sediment yield may also exist.
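
    With sediment concentration effectively constant above the critical discharge, the event model reduces to a straight line through the origin, Y = kR. A minimal sketch of fitting that proportional relation (the numbers are illustrative, not the Dalihe observations):

        import numpy as np

        # Event runoff volume R and sediment yield Y (arbitrary units)
        R = np.array([1.2, 3.5, 0.8, 5.1, 2.4, 7.9])
        Y = np.array([0.9, 2.8, 0.6, 4.0, 1.9, 6.3])

        # Least squares with no intercept: k = sum(R*Y) / sum(R^2),
        # consistent with a stable event sediment concentration
        k = (R @ Y) / (R @ R)
        print(f"k = {k:.3f}; fitted Y: {np.round(k * R, 2)}")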

  17. A spiking neural network model of 3D perception for event-based neuromorphic stereo vision systems

    Science.gov (United States)

    Osswald, Marc; Ieng, Sio-Hoi; Benosman, Ryad; Indiveri, Giacomo

    2017-01-01

    Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems. PMID:28079187
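
    The stereo-correspondence problem the network solves can be caricatured as coincidence detection between left and right event streams: events on the same retinal row that arrive close in time and within a disparity limit are matched. A toy sketch of that idea (the spiking network in the paper replaces this explicit search):

        def match_events(left, right, dt_us=500, max_disp=20):
            # left/right: lists of (t_us, x, y) events from the two sensors
            matches = []
            for tL, xL, yL in left:
                cands = [(abs(tL - tR), xL - xR)
                         for tR, xR, yR in right
                         if yR == yL and abs(tL - tR) <= dt_us
                         and 0 <= xL - xR <= max_disp]
                if cands:
                    matches.append(min(cands)[1])  # disparity of best match
            return matches

        left = [(100, 30, 5), (900, 31, 5)]
        right = [(120, 22, 5), (880, 23, 5)]
        print(match_events(left, right))   # -> [8, 8]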

  19. Unit-Specific Event-Based and Slot-Based Hybrid Model Framework with Hierarchical Structure for Short-Term Scheduling

    Directory of Open Access Journals (Sweden)

    Yue Wang

    2015-01-01

    The unit-specific event-based continuous-time model suffers from inaccurate calculations when resource constraints are involved, due to the heterogeneous locations of the event points for different units. In order to address this limitation, a continuous-time unit-specific event-based and slot-based hybrid model framework with a hierarchical structure is proposed in this work. A unit-specific event-based model without utility constraints is formulated in the upper layer, and a slot-based model is introduced in the lower layer. In the hierarchical structure, the two layers jointly address the short-term production scheduling problem of batch plants under utility considerations. The key features of this work include: (a) eliminating overstrict constraints on utility resources, (b) solving multiple counting problems, and (c) considering the duration time of event points when calculating the utility utilization level. The effectiveness and advantages of the proposed model are illustrated through two benchmark examples from the literature.

  20. Prototype Biology-Based Radiation Risk Module Project

    Science.gov (United States)

    Terrier, Douglas; Clayton, Ronald G.; Patel, Zarana; Hu, Shaowen; Huff, Janice

    2015-01-01

    Biological effects of space radiation and risk mitigation are strategic knowledge gaps for the Evolvable Mars Campaign. The current epidemiology-based NASA Space Cancer Risk (NSCR) model contains large uncertainties (HAT #6.5a) due to lack of information on the radiobiology of galactic cosmic rays (GCR) and lack of human data. The use of experimental models that most accurately replicate the response of human tissues is critical for precision in risk projections. Our proposed study will compare DNA damage, histological, and cell kinetic parameters after irradiation in normal 2D human cells versus 3D tissue models, and it will use a multi-scale computational model (CHASTE) to investigate various biological processes that may contribute to carcinogenesis, including radiation-induced cellular signaling pathways. This cross-disciplinary work, with biological validation of an evolvable mathematical computational model, will help reduce uncertainties within NSCR and aid risk mitigation for radiation-induced carcinogenesis.

  1. Cancer Risk Map for the Surface of Mars

    Science.gov (United States)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2011-01-01

    We discuss calculations of the median and 95th percentile cancer risks on the surface of Mars for different solar conditions. The NASA Space Radiation Cancer Risk 2010 model is used to estimate gender- and age-specific cancer incidence and mortality risks for astronauts exploring Mars. Organ-specific fluence spectra and doses for large solar particle events (SPE) and galactic cosmic rays (GCR) at various levels of solar activity are simulated using the HZETRN/QMSFRG computer code and the 2010 version of the Badhwar and O'Neill GCR model. The NASA JSC propensity model of SPE fluence and occurrence is used to consider upper bounds on SPE fluence for increasing mission lengths. In the transport of particles through the Mars atmosphere, a vertical distribution of Mars atmospheric thickness is calculated from the temperature and pressure data of Mars Global Surveyor, and the directional cosine distribution is implemented to describe the spherically distributed atmospheric distance along the slant path at each elevation on Mars. The resultant directional shielding by the Mars atmosphere at each elevation is coupled with vehicle and body shielding for organ dose estimates. Astronaut cancer risks are mapped on the global topography of Mars, which was measured by the Mars Orbiter Laser Altimeter. Variation of cancer risk on the surface of Mars is due to a 16-km elevation range, and a large difference is obtained between the Tharsis Montes (Ascraeus, Pavonis, and Arsia) and the Hellas impact basin. Cancer incidence risks are found to be about 2-fold higher than mortality risks, with a disproportionate increase in skin and thyroid cancers for all astronauts and breast cancer risk for female astronauts. The number of safe days on Mars to remain below radiation limits at the 95% confidence level is reported for several mission design scenarios.
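
    The directional-shielding idea, more atmosphere along slant paths than overhead, can be illustrated with a uniform spherical shell in place of the record's pressure- and temperature-based atmospheric model. A rough sketch under that simplifying assumption:

        import numpy as np

        R_MARS_KM = 3389.5

        def slant_factor(zenith_deg, shell_km=11.0, elev_km=0.0):
            # Path length through a uniform shell of thickness shell_km,
            # relative to the vertical path, from a point at elev_km.
            r = R_MARS_KM + elev_km
            top = r + shell_km
            mu = np.cos(np.radians(zenith_deg))
            L = np.sqrt(top**2 - r**2 * (1.0 - mu**2)) - r * mu
            return L / shell_km

        for z in (0, 60, 85):
            print(z, "deg:", round(float(slant_factor(z)), 2))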

  2. Benefits and limitations of data assimilation for discharge forecasting using an event-based rainfall–runoff model

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2013-03-01

    Mediterranean catchments in southern France are threatened by potentially devastating fast floods which are difficult to anticipate. In order to improve the skill of rainfall-runoff models in predicting such flash floods, hydrologists use data assimilation techniques to provide real-time updates of the model using observational data. This approach seeks to reduce the uncertainties present in different components of the hydrological model (forcing, parameters or state variables) in order to minimize the error in simulated discharges. This article presents a data assimilation procedure, the best linear unbiased estimator (BLUE), used with the goal of improving the peak discharge predictions generated by an event-based hydrological model Soil Conservation Service lag and route (SCS-LR). For a given prediction date, selected model inputs are corrected by assimilating discharge data observed at the basin outlet. This study is conducted on the Lez Mediterranean basin in southern France. The key objectives of this article are (i) to select the parameter(s) which allow for the most efficient and reliable correction of the simulated discharges, (ii) to demonstrate the impact of the correction of the initial condition upon simulated discharges, and (iii) to identify and understand conditions in which this technique fails to improve the forecast skill. The correction of the initial moisture deficit of the soil reservoir proves to be the most efficient control parameter for adjusting the peak discharge. Using data assimilation, this correction leads to an average of 12% improvement in the flood peak magnitude forecast in 75% of cases. The investigation of the other 25% of cases points out a number of precautions for the appropriate use of this data assimilation procedure.
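
    The BLUE update itself is one line of linear algebra: the background initial condition is nudged by the gain-weighted innovation between observed and simulated discharge. A minimal sketch with invented scalar values (the real observation operator comes from the SCS-LR model):

        import numpy as np

        def blue_update(xb, B, y, H, R):
            # xa = xb + K (y - H xb), with K = B H^T (H B H^T + R)^-1
            K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
            return xb + K @ (y - H @ xb)

        xb = np.array([120.0])    # background initial moisture deficit (mm)
        B = np.array([[400.0]])   # background error variance
        H = np.array([[-0.8]])    # linearized deficit -> discharge sensitivity
        R = np.array([[25.0]])    # observation error variance
        y = np.array([-80.0])     # observed discharge anomaly (illustrative)
        print(blue_update(xb, B, y, H, R))   # corrected initial condition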

  3. Benefits and limitations of data assimilation for discharge forecasting using an event-based rainfall-runoff model

    Science.gov (United States)

    Coustau, M.; Ricci, S.; Borrell-Estupina, V.; Bouvier, C.; Thual, O.

    2013-03-01

    Mediterranean catchments in southern France are threatened by potentially devastating fast floods which are difficult to anticipate. In order to improve the skill of rainfall-runoff models in predicting such flash floods, hydrologists use data assimilation techniques to provide real-time updates of the model using observational data. This approach seeks to reduce the uncertainties present in different components of the hydrological model (forcing, parameters or state variables) in order to minimize the error in simulated discharges. This article presents a data assimilation procedure, the best linear unbiased estimator (BLUE), used with the goal of improving the peak discharge predictions generated by an event-based hydrological model Soil Conservation Service lag and route (SCS-LR). For a given prediction date, selected model inputs are corrected by assimilating discharge data observed at the basin outlet. This study is conducted on the Lez Mediterranean basin in southern France. The key objectives of this article are (i) to select the parameter(s) which allow for the most efficient and reliable correction of the simulated discharges, (ii) to demonstrate the impact of the correction of the initial condition upon simulated discharges, and (iii) to identify and understand conditions in which this technique fails to improve the forecast skill. The correction of the initial moisture deficit of the soil reservoir proves to be the most efficient control parameter for adjusting the peak discharge. Using data assimilation, this correction leads to an average of 12% improvement in the flood peak magnitude forecast in 75% of cases. The investigation of the other 25% of cases points out a number of precautions for the appropriate use of this data assimilation procedure.

  4. Astrobiology and the Risk Landscape

    Science.gov (United States)

    Cirkovic, M. M.

    2013-09-01

    We live in the epoch of explosive development of astrobiology, a novel interdisciplinary field dealing with the origin, evolution, and the future of life. While at first glance its relevance for risk analysis is small, there is an increasing number of crossover problems and thematic areas which stem from considerations of observation selection effects and the cosmic future of humanity, as well as better understanding of our astrophysical environment and the open nature of the Earth system. In considering the totality of risks facing any intelligent species in the most general cosmic context (a natural generalization of the concept of global catastrophic risks or GCRs), there is a complex dynamical hierarchy of natural and anthropogenic risks, often tightly interrelated. I shall argue that this landscape-like structure can be defined in the space of astrobiological/SETI parameters and that it is a concept capable of unifying different strands of thought and research, a working concept and not only a metaphor. Fermi's Paradox or the "Great Silence" problem represents the crucial boundary condition on generic evolutionary trajectories of individual intelligent species; I briefly consider the conditions of its applicability as far as quantification of GCRs is concerned. Overall, such a perspective would strengthen foundations upon which various numerical models of the future of humanity can be built; the lack of such quantitative models has often been cited as the chief weakness of the entire GCR enterprise.

  5. Uncertainties in Estimates of the Risks of Late Effects from Space Radiation

    Science.gov (United States)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.

    2002-01-01

    The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a Maximum Likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including ISS, a lunar station, a deep space outpost, and Mars missions of 360, 660, and 1000 days duration. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.
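
    The Monte-Carlo step is a product of factors, each drawn from its own subjective uncertainty distribution, summarized by its median and upper percentile. A minimal sketch; every distribution and parameter below is illustrative, not the model's calibrated input:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        dose_gy = rng.normal(0.4, 0.04, N)             # physics: well known
        quality = rng.lognormal(np.log(10.0), 0.5, N)  # quality factor: wide
        alpha = rng.lognormal(np.log(0.02), 0.3, N)    # risk per Sv
        ddref = rng.lognormal(np.log(2.0), 0.25, N)    # dose-rate factor

        risk = dose_gy * quality * alpha / ddref       # e.g. REID as a fraction
        print(f"median {np.median(risk):.2%}, "
              f"95th pct {np.percentile(risk, 95):.2%}")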

  6. Time-Based and Event-Based Prospective Memory in Autism Spectrum Disorder: The Roles of Executive Function and Theory of Mind, and Time-Estimation

    Science.gov (United States)

    Williams, David; Boucher, Jill; Lind, Sophie; Jarrold, Christopher

    2013-01-01

    Prospective memory (remembering to carry out an action in the future) has been studied relatively little in ASD. We explored time-based (carry out an action at a pre-specified time) and event-based (carry out an action upon the occurrence of a pre-specified event) prospective memory, as well as possible cognitive correlates, among 21…

  7. Flood modelling with a distributed event-based parsimonious rainfall-runoff model: case of the karstic Lez river catchment

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2012-04-01

    Rainfall-runoff models are crucial tools for the statistical prediction of flash floods and real-time forecasting. This paper focuses on a karstic basin in the South of France and proposes a distributed parsimonious event-based rainfall-runoff model, coherent with the poor knowledge of both evaporative and underground fluxes. The model combines an SCS runoff model and a Lag and Route routing model for each cell of a regular grid mesh. The model is assessed not only on its ability to satisfactorily simulate floods but also on its ability to provide robust relationships between the initial condition of the model and various predictors of the initial wetness state of the basin, such as the base flow, the Hu2 index from the Meteo-France SIM model and the piezometric levels of the aquifer. The advantage of using meteorological radar rainfall in flood modelling is also assessed. Model calibration proved satisfactory at an hourly time step, with Nash criterion values ranging between 0.66 and 0.94 for eighteen of the twenty-one selected events. The radar rainfall inputs significantly improved the simulations or the assessment of the initial condition of the model for 5 events at the beginning of autumn, mostly in September–October (mean improvement of Nash is 0.09; correction in the initial condition ranges from −205 to 124 mm), but were less efficient for the events at the end of autumn. In this period, the weak vertical extension of the precipitation system and the low altitude of the 0 °C isotherm could affect the efficiency of radar measurements due to the distance between the basin and the radar (~60 km). The model initial condition S is correlated with the three tested predictors (R2 > 0.6). The interpretation of the model suggests that groundwater does not affect the first peaks of the flood, but can strongly impact subsequent peaks in the case of a multi-storm event. Because this kind of model is based on a limited

  8. A stochastical event-based continuous time step rainfall generator based on Poisson rectangular pulse and microcanonical random cascade models

    Science.gov (United States)

    Pohle, Ina; Niebisch, Michael; Zha, Tingting; Schümberg, Sabine; Müller, Hannes; Maurer, Thomas; Hinz, Christoph

    2017-04-01

    weights, which we implemented through sigmoid functions. Secondly, the branching of the first and last box is constrained to preserve the rainfall event durations generated by the Poisson rectangular pulse model. The event-based continuous time step rainfall generator has been developed and tested using 10 min and hourly rainfall data from four stations in North-Eastern Germany. The model performs well in comparison to observed rainfall in terms of event durations and mean event intensities, as well as wet spell and dry spell durations. It is currently being tested using data from other stations across Germany and in different climate zones. Furthermore, the rainfall event generator is being applied in modelling approaches aimed at understanding the impact of rainfall variability on hydrological processes. Reference: Olsson, J.: Evaluation of a scaling cascade model for temporal rainfall disaggregation, Hydrology and Earth System Sciences, 2, 19–30, 1998.
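
    The two building blocks named in the record can be sketched in a few lines: exponential pulse occurrence, duration and intensity, then a mass-conserving binary cascade onto finer steps. The sketch uses plain uniform cascade weights where the generator uses sigmoid-based weights, and all parameter values are illustrative:

        import numpy as np

        rng = np.random.default_rng(7)

        def poisson_rect_pulses(n, gap_h=30.0, dur_h=6.0, int_mm_h=2.0):
            # Poisson rectangular pulses: exponential dry gaps, durations
            # and mean intensities (parameters illustrative)
            return (rng.exponential(gap_h, n), rng.exponential(dur_h, n),
                    rng.exponential(int_mm_h, n))

        def microcanonical_cascade(total_mm, levels=4):
            # Each branching splits the parent mass by w and 1 - w, so the
            # event total is preserved exactly (microcanonical property)
            amounts = np.array([total_mm])
            for _ in range(levels):
                w = rng.uniform(0.2, 0.8, amounts.size)
                amounts = np.column_stack((w * amounts,
                                           (1 - w) * amounts)).ravel()
            return amounts

        gaps, durs, ints = poisson_rect_pulses(3)
        total = durs[0] * ints[0]
        fine = microcanonical_cascade(total)
        print(round(total, 2), round(fine.sum(), 2))   # mass conserved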

  9. Biological Based Risk Assessment for Space Exploration

    Science.gov (United States)

    Cucinotta, Francis A.

    2011-01-01

    Exposures from galactic cosmic rays (GCR) - made up of high-energy protons and high charge and energy (HZE) nuclei - and solar particle events (SPEs) - comprised largely of low- to medium-energy protons - are the primary health concern for astronauts on long-term space missions. Experimental studies have shown that HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, making risk assessments for cancer and degenerative risks, such as central nervous system effects and heart disease, highly uncertain. The goal for space radiation protection at NASA is to reduce the uncertainties in risk assessments for Mars exploration enough to ensure that acceptable levels of risk are not exceeded and to adequately assess the efficacy of mitigation measures such as shielding or biological countermeasures. We review the recent BEIR VII and UNSCEAR-2006 models of cancer risks and their uncertainties. These models are shown to have an inherent 2-fold uncertainty, as defined by the ratio of the 95% confidence level to the mean projection, even before radiation quality is considered. In order to overcome the uncertainties in these models, new approaches to risk assessment are warranted. We consider new computational biology approaches to modeling cancer risks. A basic program of research is described, ranging from stochastic descriptions of the physics and chemistry of radiation tracks and the biochemistry of metabolic pathways to the emerging biological understanding of cellular and tissue modifications leading to cancer.

  10. Getting ready for the manned mission to Mars: the astronauts' risk from space radiation.

    Science.gov (United States)

    Hellweg, Christine E; Baumstark-Khan, Christa

    2007-07-01

    Space programmes are shifting towards planetary exploration and, in particular, towards missions by human beings to the Moon and to Mars. Radiation is considered to be one of the major hazards for personnel in space and has emerged as the most critical issue to be resolved for long-term missions both orbital and interplanetary. The two cosmic sources of radiation that could impact a mission outside the Earth's magnetic field are solar particle events (SPE) and galactic cosmic rays (GCR). Exposure to the types of ionizing radiation encountered during space travel may cause a number of health-related problems, but the primary concern is related to the increased risk of cancer induction in astronauts. Predictions of cancer risk and acceptable radiation exposure in space are extrapolated from minimal data and are subject to many uncertainties. The paper describes present-day estimates of equivalent doses from GCR and solar cosmic radiation behind various shields and radiation risks for astronauts on a mission to Mars.

  11. Space Radiation: The Number One Risk to Astronaut Health beyond Low Earth Orbit

    Directory of Open Access Journals (Sweden)

    Jeffery C. Chancellor

    2014-09-01

    Projecting a vision for space radiobiological research necessitates understanding the nature of the space radiation environment and how radiation risks influence mission planning, timelines and operational decisions. Exposure to space radiation increases the risks of astronauts developing cancer, experiencing central nervous system (CNS) decrements, exhibiting degenerative tissue effects or developing acute radiation syndrome. One or more of these deleterious health effects could develop during future multi-year space exploration missions beyond low Earth orbit (LEO). Shielding is an effective countermeasure against solar particle events (SPEs), but is ineffective in protecting crew members from the biological impacts of fast moving, highly-charged galactic cosmic radiation (GCR) nuclei. Astronauts traveling on a protracted voyage to Mars may be exposed to SPE radiation events, overlaid on a more predictable flux of GCR. Therefore, ground-based research studies employing model organisms seeking to accurately mimic the biological effects of the space radiation environment must concatenate exposures to both proton and heavy ion sources. New techniques in genomics, proteomics, metabolomics and other "omics" areas should also be intelligently employed and correlated with phenotypic observations. This approach will more precisely elucidate the effects of space radiation on human physiology and aid in developing personalized radiological countermeasures for astronauts.

  12. Space Radiation: The Number One Risk to Astronaut Health beyond Low Earth Orbit.

    Science.gov (United States)

    Chancellor, Jeffery C; Scott, Graham B I; Sutton, Jeffrey P

    2014-09-11

    Projecting a vision for space radiobiological research necessitates understanding the nature of the space radiation environment and how radiation risks influence mission planning, timelines and operational decisions. Exposure to space radiation increases the risks of astronauts developing cancer, experiencing central nervous system (CNS) decrements, exhibiting degenerative tissue effects or developing acute radiation syndrome. One or more of these deleterious health effects could develop during future multi-year space exploration missions beyond low Earth orbit (LEO). Shielding is an effective countermeasure against solar particle events (SPEs), but is ineffective in protecting crew members from the biological impacts of fast moving, highly-charged galactic cosmic radiation (GCR) nuclei. Astronauts traveling on a protracted voyage to Mars may be exposed to SPE radiation events, overlaid on a more predictable flux of GCR. Therefore, ground-based research studies employing model organisms seeking to accurately mimic the biological effects of the space radiation environment must concatenate exposures to both proton and heavy ion sources. New techniques in genomics, proteomics, metabolomics and other "omics" areas should also be intelligently employed and correlated with phenotypic observations. This approach will more precisely elucidate the effects of space radiation on human physiology and aid in developing personalized radiological countermeasures for astronauts.

  13. Event-based biosurveillance of respiratory disease in Mexico, 2007-2009: connection to the 2009 influenza A(H1N1) pandemic?

    Science.gov (United States)

    Nelson, N P; Brownstein, J S; Hartley, D M

    2010-07-29

    The emergence of the 2009 pandemic influenza A(H1N1) virus in North America and its subsequent global spread highlights the public health need for early warning of infectious disease outbreaks. Event-based biosurveillance, based on local- and regional-level Internet media reports, is one approach to early warning as well as to situational awareness. This study analyses media reports in Mexico collected by the Argus biosurveillance system between 1 October 2007 and 31 May 2009. Results from Mexico are compared with the United States and Canadian media reports obtained from the HealthMap system. A significant increase in reporting frequency of respiratory disease in Mexico during the 2008-9 influenza season relative to that of 2007-8 was observed (p<0.0001). The timing of events, based on media reports, suggests that respiratory disease was prevalent in parts of Mexico, and was reported as unusual, much earlier than the microbiological identification of the pandemic virus. Such observations suggest that abnormal respiratory disease frequency and severity was occurring in Mexico throughout the winter of 2008-2009, though its connection to the emergence of the 2009 pandemic influenza A(H1N1) virus remains unclear.

  14. Influence of intra-event-based flood regime on sediment flow behavior from a typical agro-catchment of the Chinese Loess Plateau

    Science.gov (United States)

    Zhang, Le-Tao; Li, Zhan-Bin; Wang, He; Xiao, Jun-Bo

    2016-07-01

    The pluvial erosion process is significantly affected by tempo-spatial patterns of flood flows. However, despite their importance, only a few studies have investigated the sediment flow behavior that is driven by different flood regimes. The study aims to investigate the effect of intra-event-based flood regimes on the dynamics of sediment exports at Tuanshangou catchment, a typical agricultural catchment (unmanaged) in the hilly loess region on the Chinese Loess Plateau. Measurements of 193 flood events and 158 sediment-producing events were collected from Tuanshangou station between 1961 and 1969. The combined methods of hierarchical clustering approach, discriminant analysis and One-Way ANOVA were used to classify the flood events in terms of their event-based flood characteristics, including flood duration, peak discharge, and event flood runoff depth. The 193 flood events were classified into five regimes, and the mean statistical features of each regime significantly differed. Regime A includes flood events with the shortest duration (76 min), minimum flood crest (0.045 m3 s-1), least runoff depth (0.2 mm), and highest frequency. Regime B includes flood events with a medium duration (274 min), medium flood crest (0.206 m3 s-1), and minor runoff depth (0.7 mm). Regime C includes flood events with the longest duration (822 min), medium flood crest (0.236 m3 s-1), and medium runoff depth (1.7 mm). Regime D includes flood events with a medium duration (239 min), large flood crest (4.21 m3 s-1), and large runoff depth (10 mm). Regime E includes flood events with a medium duration (304 min), maximum flood crest (8.62 m3 s-1), and largest runoff depth (25.9 mm). The sediment yield by different flood regimes is ranked as follows: Regime E > Regime D > Regime B > Regime C > Regime A. In terms of event-based average and maximum suspended sediment concentration, these regimes are ordered as follows: Regime E > Regime D > Regime C > Regime B > Regime A. Regimes D and E
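
    The regime classification amounts to hierarchical clustering on standardized event features (duration, peak discharge, runoff depth) followed by cutting the tree into a fixed number of groups. A minimal sketch with a handful of invented events, not the 193-event Tuanshangou record:

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        # duration (min), peak discharge (m3/s), runoff depth (mm)
        events = np.array([
            [76, 0.05, 0.2], [90, 0.04, 0.3], [274, 0.21, 0.7],
            [822, 0.24, 1.7], [239, 4.21, 10.0], [304, 8.62, 25.9],
        ])

        # Standardize so no feature dominates the distance, then cluster
        z = (events - events.mean(0)) / events.std(0)
        labels = fcluster(linkage(z, method="ward"), t=4, criterion="maxclust")
        print(labels)   # regime label per event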

  15. An Overview of NASA's Risk of Cardiovascular Disease from Radiation Exposure

    Science.gov (United States)

    Patel, Zarana S.; Huff, Janice L.; Simonsen, Lisa C.

    2015-01-01

    The association between high doses of radiation exposure and cardiovascular damage is well established. Patients who have undergone radiotherapy for primary cancers of the head and neck and mediastinal regions have shown increased risk of heart and vascular damage and long-term development of radiation-induced heart disease [1]. In addition, recent meta-analyses of epidemiological data from atomic bomb survivors and nuclear industry workers have shown that acute and chronic radiation exposures are strongly correlated with an increased risk of circulatory disease at doses above 0.5 Sv [2]. However, these analyses are confounded at lower doses by lifestyle factors, such as drinking, smoking, and obesity. The types of radiation found in the space environment are significantly more damaging than those found on Earth and include galactic cosmic radiation (GCR), solar particle events (SPEs), and trapped protons and electrons. Beyond the low-LET data, only a few studies have examined the effects of heavy ion radiation on atherosclerosis, and at lower, space-relevant doses, the association between exposure and cardiovascular pathology is more varied and unclear. Understanding the qualitative differences in biological responses produced by GCR compared to Earth-based radiation is a major focus of space radiation research and is imperative for accurate risk assessment for long duration space missions. Other knowledge gaps for the risk of radiation-induced cardiovascular disease include the existence of a dose threshold, low dose rate effects, and potential synergies with other spaceflight stressors. The Space Radiation Program Element within NASA's Human Research Program (HRP) is managing the research and risk mitigation strategies for these knowledge gaps. In this presentation, we review the evidence and present an overview of the HRP Risk of Cardiovascular Disease and Other Degenerative Tissue Effects from Radiation Exposure.

  16. Third generation cephalosporin resistant Enterobacteriaceae and multidrug resistant gram-negative bacteria causing bacteremia in febrile neutropenia adult cancer patients in Lebanon, broad spectrum antibiotics use as a major risk factor, and correlation with poor prognosis

    Directory of Open Access Journals (Sweden)

    Rima eMoghnieh

    2015-02-01

    Full Text Available Bacteremia remains a major cause of life-threatening complications in patients receiving anticancer chemotherapy. The spectrum and susceptibility profiles of causative microorganisms differ with time and place. Data from Lebanon are scarce. We aim at evaluating the epidemiology of bacteremia in cancer patients in a university hospital in Lebanon, emphasizing antibiotic resistance and risk factors of multi-drug resistant organism (MDRO)-associated bacteremia. This is a retrospective study of 75 episodes of bacteremia occurring in febrile neutropenic patients admitted to the hematology-oncology unit at Makassed General Hospital, Lebanon, from October 2009 to January 2012. It corresponds to epidemiological data on bacteremia episodes in febrile neutropenic cancer patients including antimicrobial resistance and identification of risk factors associated with third generation cephalosporin resistance (3GCR) and MDRO-associated bacteremia. Out of 75 bacteremias, 42.7% were gram-positive (GP), and 57.3% were gram-negative (GN). GP bacteremias were mostly due to methicillin-resistant coagulase negative staphylococci (28% of total bacteremias and 66% of GP bacteremias). Among the GN bacteremias, Escherichia coli (22.7% of total, 39.5% of GN organisms) and Klebsiella pneumoniae (13.3% of total, 23.3% of GN organisms) were the most important causative agents. GN bacteremia due to 3GC sensitive (3GCS) bacteria represented 28% of total bacteremias, while 29% were due to 3GCR bacteria and 9% were due to carbapenem-resistant organisms. There was a significant correlation between bacteremia with MDRO and subsequent intubation, sepsis and mortality. Among potential risk factors, only broad spectrum antibiotic intake >4 days before bacteremia was found to be statistically significant for acquisition of 3GCR bacteria. Using carbapenems or piperacillin/tazobactam >4 days before bacteremia was significantly associated with the emergence of MDRO (p value<0.05).

  17. On event based state estimation

    NARCIS (Netherlands)

    Sijs, J.; Lazar, M.

    2009-01-01

    To reduce the amount of data transfer in networked control systems and wireless sensor networks, measurements are usually taken only when an event occurs, rather than at each synchronous sampling instant. However, this complicates estimation and control problems considerably. The goal of this paper
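
    A minimal sketch of the idea the abstract describes, using a send-on-delta event trigger with a scalar Kalman filter: the sensor transmits only when the measurement deviates enough from the last transmitted value, and the estimator runs prediction-only steps in between. The model, noise levels and threshold are illustrative:

        # Event-based (send-on-delta) scalar Kalman filtering sketch.
        import numpy as np

        rng = np.random.default_rng(1)
        a, q, r, delta = 0.95, 0.05, 0.1, 0.5  # dynamics, noises, threshold

        x, xhat, p = 0.0, 0.0, 1.0
        last_sent, sent = None, 0
        for t in range(200):
            x = a * x + rng.normal(0, np.sqrt(q))   # true state
            y = x + rng.normal(0, np.sqrt(r))       # local measurement
            xhat, p = a * xhat, a * a * p + q       # predict every instant
            # Transmit (and update) only when an event fires.
            if last_sent is None or abs(y - last_sent) > delta:
                last_sent = y
                sent += 1
                k = p / (p + r)                     # standard Kalman gain
                xhat, p = xhat + k * (y - xhat), (1 - k) * p

        print(f"transmitted {sent} of 200 measurements")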

  19. Comment on "Event-based soil loss models for construction sites" by Trenouth and Gharabaghi, J. Hydrol. doi: 10.1016/j.jhydrol.2015.03.010

    Science.gov (United States)

    Kinnell, P. I. A.

    2015-09-01

    Trenouth and Gharabaghi (2015) present two models which replace the EI30 index used as the event erosivity index in the USLE/RUSLE with ones that include runoff and values of EI30 raised to powers that differ from 1.0 as the event erosivity factor in modelling soil loss for construction sites. Their analysis of the application of these models focused on data from 5 locations as a whole but did not show how the models worked at each location. Practically, the ability to predict sediment yields at a specific location is more relevant than the capacity of a model to predict sediment yields globally. Also, the mathematical structure of their proposed models shows little regard for the physical processes involved in causing erosion and sediment yield. There is still a need to develop event-based empirical models for construction sites that are robust because they give proper consideration to the erosion processes involved, and take account of the fact that sediment yield is usually determined from measurements of suspended load, whereas soil loss at the scale for which the USLE/RUSLE model was developed includes both suspended load and bed load.
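
    The models under discussion generalize the USLE/RUSLE event erosivity index to runoff-dependent forms such as A = α Q^a (EI30)^b. A minimal sketch of fitting such a power-law event model by log-linear least squares; the coefficients and data are entirely hypothetical:

        # Fit A = alpha * Q**a * EI30**b by least squares in log space.
        import numpy as np

        rng = np.random.default_rng(2)
        n = 60
        Q = rng.lognormal(0.0, 0.8, n)      # event runoff (arbitrary units)
        EI30 = rng.lognormal(3.0, 0.6, n)   # event erosivity index
        A = 0.4 * Q**0.9 * EI30**1.2 * rng.lognormal(0.0, 0.2, n)

        X = np.column_stack([np.ones(n), np.log(Q), np.log(EI30)])
        coef, *_ = np.linalg.lstsq(X, np.log(A), rcond=None)
        alpha, a, b = np.exp(coef[0]), coef[1], coef[2]
        print(f"alpha={alpha:.2f}, a={a:.2f}, b={b:.2f}")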

  20. How crucial is it to account for the antecedent moisture conditions in flood forecasting? Comparison of event-based and continuous approaches on 178 catchments

    Science.gov (United States)

    Berthet, L.; Andréassian, V.; Perrin, C.; Javelle, P.

    2009-06-01

    This paper compares event-based and continuous hydrological modelling approaches for real-time forecasting of river flows. Both approaches are compared using a lumped hydrologic model (whose structure includes a soil moisture accounting (SMA) store and a routing store) on a data set of 178 French catchments. The main focus of this study was to investigate the actual impact of soil moisture initial conditions on the performance of flood forecasting models and the possible compensations with updating techniques. The rainfall-runoff model assimilation technique we used does not impact the SMA component of the model but only its routing part. Tests were made by running the SMA store continuously or on an event basis, everything else being equal. The results show that the continuous approach remains the reference to ensure good forecasting performance. We show, however, that the possibility to assimilate the last observed flow considerably reduces the differences in performance. Lastly, we present a robust alternative to initialize the SMA store where continuous approaches are impossible because of data availability problems.
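
    A minimal sketch of the initialization question posed above: a continuous run carries the soil moisture accounting (SMA) store through the whole record, while an event-based run must initialize the store, here from an antecedent precipitation index (API). The toy store and its parameters are illustrative, not the GR-type model used in the paper:

        # Continuous vs event-based initialization of a toy SMA store.
        import numpy as np

        rng = np.random.default_rng(3)
        rain = rng.gamma(0.3, 8.0, 365)   # synthetic daily rainfall (mm)
        cap, k = 200.0, 0.95              # store capacity, daily recession

        def step(s, p):
            """One day of the store: geometric drainage, then rainfall."""
            return min(cap, k * s + p)

        # Continuous run: propagate the state through the full record.
        s, states = cap / 2, []
        for p in rain:
            s = step(s, p)
            states.append(s)

        # Event-based run: initialize from a 30-day API at the event start.
        t0 = 300                                     # hypothetical event day
        api = sum(k**i * rain[t0 - 1 - i] for i in range(30))
        s_event = min(cap, api)

        print(f"continuous state: {states[t0]:.1f} mm, "
              f"API initialization: {s_event:.1f} mm")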

  1. How crucial is it to account for the antecedent moisture conditions in flood forecasting? Comparison of event-based and continuous approaches on 178 catchments

    Directory of Open Access Journals (Sweden)

    L. Berthet

    2009-06-01

    Full Text Available This paper compares event-based and continuous hydrological modelling approaches for real-time forecasting of river flows. Both approaches are compared using a lumped hydrologic model (whose structure includes a soil moisture accounting (SMA) store and a routing store) on a data set of 178 French catchments. The main focus of this study was to investigate the actual impact of soil moisture initial conditions on the performance of flood forecasting models and the possible compensations with updating techniques. The rainfall-runoff model assimilation technique we used does not impact the SMA component of the model but only its routing part. Tests were made by running the SMA store continuously or on an event basis, everything else being equal. The results show that the continuous approach remains the reference to ensure good forecasting performance. We show, however, that the possibility to assimilate the last observed flow considerably reduces the differences in performance. Lastly, we present a robust alternative to initialize the SMA store where continuous approaches are impossible because of data availability problems.

  2. Comparison of event-based analysis of glaucoma progression assessed subjectively on visual fields and retinal nerve fibre layer attenuation measured by optical coherence tomography.

    Science.gov (United States)

    Kaushik, Sushmita; Mulkutkar, Samyak; Pandav, Surinder Singh; Verma, Neelam; Gupta, Amod

    2015-02-01

    The purpose is to study the ability of an event-based analysis of retinal nerve fibre layer (RNFL) attenuation, measured by Stratus® optical coherence tomography (OCT), to detect progression across the spectrum of glaucoma. Adult glaucoma suspects, ocular hypertensives and glaucoma patients who had undergone baseline RNFL thickness measurement on Stratus OCT and reliable automated visual field examination by Humphrey's visual field analyser prior to March 2007, and who had 5-year follow-up data, were recruited. Progression on OCT was defined by two criteria: a decrease in average RNFL thickness from baseline by at least 10 µm and by at least 20 µm. Visual field progression was defined by the modified Hodapp-Parrish-Anderson criteria. Absolute and percentage change in RNFL thickness from baseline was compared in progressors and non-progressors on visual fields. Concordance between structural and functional progression was analysed. 318 eyes of 162 patients were analysed. 35 eyes (11%) progressed by visual fields, 8 (2.5%) progressed using the 20 µm loss criterion, while 30 eyes (9.4%) progressed using the 10 µm loss criterion. In glaucoma suspects, mean absolute RNFL attenuation was 8.6 µm (12.1% of baseline) in those who progressed to glaucoma by visual fields. OCT was more useful for detecting progression in early glaucoma, but performed poorly in advanced glaucoma. The 10 µm criterion appears to be closer to visual field progression. However, the ability to detect progression varies considerably between functional and structural tools depending upon the severity of the disease.

  3. Detection of prospective memory deficits in mild cognitive impairment of suspected Alzheimer's disease etiology using a novel event-based prospective memory task.

    LENUS (Irish Health Repository)

    Blanco-Campal, Alberto

    2009-01-01

    We investigated the relative discriminatory efficacy of an event-based prospective memory (PM) task, in which specificity of the instructions and perceptual salience of the PM cue were manipulated, compared with two widely used retrospective memory (RM) tests (Rivermead Paragraph Recall Test and CERAD-Word List Test), when detecting mild cognitive impairment of suspected Alzheimer's disease etiology (MCI-AD) (N = 19) from normal controls (NC) (N = 21). Statistical analyses showed high discriminatory capacity of the PM task for detecting MCI-AD. The Non-Specific-Non-Salient condition proved particularly useful in detecting MCI-AD, possibly reflecting the difficulty of the task, requiring more strategic attentional resources to monitor for the PM cue. With a cutoff score of <4/10, the Non-Specific-Non-Salient condition achieved a sensitivity = 84% and a specificity = 95%, superior to the most discriminative RM test used (CERAD-Total Learning: sensitivity = 83%; specificity = 76%). Results suggest that PM is an early sign of memory failure in MCI-AD and may be a more pronounced deficit than retrospective failure, probably reflecting the greater self-initiated retrieval demands involved in the PM task used. Limitations include the relatively small sample size and the use of a convenience sample (i.e. memory clinic attenders and healthy active volunteers), reducing the generalizability of the results, which should be regarded as preliminary. (JINS, 2009, 15, 154-159.)

  4. Space radiation risks to the central nervous system

    Science.gov (United States)

    Cucinotta, Francis A.; Alp, Murat; Sulzman, Frank M.; Wang, Minli

    2014-07-01

    Central nervous system (CNS) risks from space radiation exposure, both during space missions and over astronauts' lifetimes, are of concern for long-term exploration missions to Mars or other destinations. Possible CNS risks during a mission are altered cognitive function, including detriments in short-term memory, reduced motor function, and behavioral changes, which may affect performance and human health. The late CNS risks are possible neurological disorders such as premature aging and Alzheimer's disease (AD) or other dementia. Radiation safety requirements are intended to prevent all clinically significant acute risks. However, the definition of clinically significant CNS risks and their dependence on dose, dose-rate and radiation quality is poorly understood at this time. For late CNS effects such as increased risk of AD, the disease is fatal, with a mean time from diagnosis of early-stage AD to death of about 8 years. Therefore, if AD risk or other late CNS risks from space radiation occur at mission-relevant doses, they would naturally be included in the overall acceptable risk of exposure induced death (REID) probability for space missions. Important progress has been made in understanding CNS risks due to space radiation exposure; however, in general the doses used in experimental studies have been much higher than the annual galactic cosmic ray (GCR) dose (∼0.1 Gy/y at solar maximum and ∼0.2 Gy/y at solar minimum, with less than 50% from HZE particles). In this report we summarize recent space radiobiology studies of CNS effects from particle accelerators simulating space radiation using experimental models, and make a critical assessment of their relevance relative to the doses and dose-rates to be incurred on a Mars mission. Prospects for understanding the dose, dose-rate and radiation quality dependencies of CNS effects and extrapolation to human risk assessments are described.

  5. Evidence Report: Risk of Acute and Late Central Nervous System Effects from Radiation Exposure

    Science.gov (United States)

    Nelson, Gregory A.; Simonsen, Lisa; Huff, Janice L.

    2016-01-01

    Possible acute and late risks to the central nervous system (CNS) from galactic cosmic rays (GCR) and solar particle events (SPE) are concerns for human exploration of space. Acute CNS risks may include: altered cognitive function, reduced motor function, and behavioral changes, all of which may affect performance and human health. Late CNS risks may include neurological disorders such as Alzheimer's disease (AD), dementia and premature aging. Although detrimental CNS changes are observed in humans treated with high-dose radiation (e.g., gamma rays and protons) for cancer and are supported by experimental evidence showing neurocognitive and behavioral effects in animal models, the significance of these results for the morbidity of astronauts has not been elucidated. There is a lack of human epidemiology data on which to base CNS risk estimates; therefore, risk projection based on scaling to human data, as done for cancer risk, is not possible for CNS risks. Research specific to the spaceflight environment using animal and cell models must be compiled to quantify the magnitude of CNS changes in order to estimate this risk and to establish the validity of the current permissible exposure limits (PELs). In addition, the impact of radiation exposure in combination with individual sensitivity or other space flight factors, as well as assessment of the need for biological/pharmaceutical countermeasures, will be considered after further definition of CNS risk occurs.

  6. Modeller subjectivity and calibration impacts on hydrological model applications: an event-based comparison for a road-adjacent catchment in south-east Norway.

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve W; Jansson, Per-Erik; Stolte, Jannes; French, Helen K; Folkeson, Lennart; Sassner, Mona

    2015-01-01

    Identifying a 'best' performing hydrologic model in a practical sense is difficult due to the potential influences of modeller subjectivity on, for example, calibration procedure and parameter selection. This is especially true for model applications at the event scale, where the prevailing catchment conditions can have a strong impact on apparent model performance and suitability. In this study, two lumped models (CoupModel and HBV) and two physically based distributed models (LISEM and MIKE SHE) were applied to a small catchment upstream of a road in south-eastern Norway. All models were calibrated to a single event representing typical winter conditions in the region and then applied to various other winter events to investigate the potential impact of calibration period and methodology on model performance. Peak flow and event-based hydrographs were simulated differently by all models, leading to differences in apparent model performance under this application. In this case study, the lumped models appeared better suited for hydrological events that differed from the calibration event (i.e., events when runoff was generated from rain on non-frozen soils rather than from rain and snowmelt on frozen soil), while the more physically based approaches appeared better suited during snowmelt and frozen soil conditions more consistent with the event-specific calibration. This was due to the combination of variations in subsurface conditions over the eight events considered, the subsequent ability of the models to represent the impact of those conditions (particularly when subsurface conditions varied greatly from the calibration event), and the different approaches adopted to calibrate the models. These results indicate that hydrologic models may not only need to be selected on a case-by-case basis but also have their performance evaluated on an application-by-application basis, since how a model is applied can be as important as the inherent model structure.

  7. The Influence of the Annual Number of Storms on the Derivation of the Flood Frequency Curve through Event-Based Simulation

    Directory of Open Access Journals (Sweden)

    Alvaro Sordo-Ward

    2016-08-01

    Full Text Available This study addresses the question of how to select the minimum set of storms that should be simulated each year in order to estimate an accurate flood frequency curve for return periods ranging between 1 and 1000 years. The Manzanares basin (Spain) was used as a study case. A continuous 100,000-year hourly rainfall series was generated using the stochastic spatial-temporal model RanSimV3. Individual storms were extracted from the series by applying the exponential method. For each year, the extracted storms were transformed into hydrographs by applying an hourly time-step semi-distributed event-based rainfall-runoff model, and the maximum peak flow per year was determined to generate the reference flood frequency curve. Then, different flood frequency curves were obtained considering the N storms with the maximum rainfall depth per year, with 1 ≤ N ≤ total number of storms. The main results show that: (a) the degree of alignment between the calculated flood frequency curves and the reference flood frequency curve depends on the return period considered, with accuracy increasing for higher return periods; (b) for the analyzed case studies, the flood frequency curve for medium and high return periods (50 ≤ return period ≤ 1000 years) can be estimated with a difference lower than 3% (compared to the reference flood frequency curve) by considering the three storms with the maximum total rainfall depth each year; (c) when considering only the greatest storm of the year, for return periods higher than 10 years, the difference in the estimation of the flood frequency curve is lower than 10%; and (d) when considering the three greatest storms each year, for return periods higher than 100 years, the probability of simultaneously achieving a hydrograph with the annual maximum peak flow and the maximum volume is 94%.
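
    A minimal sketch of the paper's core bookkeeping: keep only the N storms with the largest rainfall depth each year, take the largest resulting peak as that year's annual maximum, and compare the flood frequency curve against one built from all storms. The depth-to-peak transformation below is a crude stand-in for the paper's rainfall-runoff model:

        # Flood frequency from the N largest storms per year vs all storms.
        import numpy as np

        rng = np.random.default_rng(4)
        years, storms_per_year = 10_000, 25
        depths = rng.gamma(1.5, 12.0, (years, storms_per_year))  # mm
        peaks = 0.8 * depths**1.3 * rng.lognormal(0.0, 0.15, depths.shape)

        def ffc(annual_max):
            """Empirical flood frequency curve (Weibull plotting positions)."""
            q = np.sort(annual_max)                  # ascending quantiles
            n = len(q)
            T = (n + 1) / (n - np.arange(n))         # matching return periods
            return T, q

        ref_T, ref_q = ffc(peaks.max(axis=1))        # reference: all storms
        i100 = np.searchsorted(ref_T, 100.0)         # ~100-year flood index

        for N in (1, 3):
            idx = np.argsort(depths, axis=1)[:, -N:]        # N deepest storms
            amax = np.take_along_axis(peaks, idx, axis=1).max(axis=1)
            _, q = ffc(amax)
            err = abs(q[i100] - ref_q[i100]) / ref_q[i100]
            print(f"N={N}: difference at the 100-year flood: {100*err:.1f}%")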

  8. Flood-event based metal distribution patterns in water as approach for source apportionment of pollution on catchment scale: Examples from the River Elbe

    Science.gov (United States)

    Baborowski, Martina; Einax, Jürgen W.

    2016-04-01

    With the implementation of the European Water Framework Directive (EU-WFD), the pollution sources in the River Elbe were assessed by the River Basin Community Elbe (RBC Elbe). Contaminated old sediments played the most significant role for inorganic and organic pollution. For further improvement of the water quality in the river system, a prioritization of the known pollution sources is necessary with respect to the expected effect of their remediation. This requires information on the mobility of contaminated sediments. To create a tool that allows the assessment of pollution trends in the catchment area, event-based flood investigations were carried out at a sampling site in the Middle Elbe. The investigations were based on a comparable, discharge-related sampling strategy. Four campaigns were performed between 1995 and 2006. The majority (>80%) of the 16 elements studied more intensively in 2006 reached their maximum concentrations during the first five days of the event. Only the concentrations of B, Cl-, and U declined with increasing discharge during the flood. The aim of the study was to verify that each flood event is characterized by an internal structure of the water quality. This structure is formed by the appearance of maximum values of water quality parameters at different times during the event and could be detected by descriptive and multivariate statistical methods. As a result, the internal structure of the water quality during the flood was influenced primarily by the source of the metals in the catchment area and its distance from the sampling point. Whether the metals were transported in dissolved, colloidal or particulate form, and changes in their ratios during the flood, were not decisive for the formation of the structure. Our results show that comparing the structures obtained from events in different years is indicative of the pollution trend in the catchment area. As an example, the trend of the metal pollution in the

  9. Advanced Radiation Protection (ARP): Thick GCR Shield Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The Advanced Radiation Protection project to date has focused on SEP events. For long duration missions outside Earth's geomagnetic field, the galactic cosmic ray...

  10. GCR-Induced Photon Luminescence of the Moon

    Science.gov (United States)

    Lee, K. T.; Wilson, T. L.

    2008-01-01

    It is shown that the Moon has a ubiquitous photon luminescence induced by Galactic cosmic rays (GCRs), using the Monte Carlo particle-physics program FLUKA. Both the fluence and the flux of the radiation can be determined by this method, but only the fluence will be presented here. This is in addition to the thermal radiation emitted due to the Moon's internal temperature and radioactivity. This study is a follow-up to an earlier discussion [1] that addressed several misconceptions regarding Moonshine in the Earth-Moon system (Figure 1) and predicted this effect. There also exists a related x-ray fluorescence induced by solar energetic particles (SEPs, <350 MeV) and solar photons at lower x-ray energies, although this latter fluorescence was studied on Apollo 15 and 16 [2-5], Lunar Prospector [6], and even EGRET [7].

  11. STUDY ON SUB-DRY CUTTING GCr12

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A comparative study of cutting force, cutting temperature and machined surface quality between sub-dry cutting and the traditional cooling method shows that sub-dry cutting can retard tool wear. It is beneficial for realizing pollution-free production and meets the demands of a clean environment.

  12. What Spectrum Should the GCR Acceleration Theory Explain?

    Science.gov (United States)

    Grigorov, N.; Tolstaya, E.

    2001-08-01

    The all-particle spectra measured by three different instruments ('Proton-1,2,3', 'Proton-4', and TIC) are discussed. It is shown that all three experiments reveal a 'knee' in the all-particle spectrum at ~1 TeV. Analysis of all these experimental data proves that in the energy range E > 1 TeV the all-particle spectrum is the sum of two power-law spectra: one with spectral index β=2.6 and the other one with β

  13. Improved version of BTOPMC model and its application in event-based hydrologic simulations

    Institute of Scientific and Technical Information of China (English)

    王国强; 周买春; 竹内邦良; 石平博

    2007-01-01

    In this paper, a grid-based distributed hydrological model, BTOPMC (Block-wise use of TOPMODEL), which was developed from the original TOPMODEL, is introduced. In order to broaden the model's application to arid regions, an improvement methodology is also implemented. Canopy interception and soil infiltration processes were incorporated into the original BTOPMC to model event-based runoff in large arid regions. An infiltration model applying the time compression approximation method is emphasized and validated to improve the model's performance in event-based hydrological simulations, with a case study of the Lushi River basin.
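
    A minimal sketch of infiltration-excess runoff with the time compression approximation (TCA) mentioned above: under variable rainfall, the infiltration capacity is indexed by cumulative infiltration rather than clock time. The Philip-equation parameters and rainfall series are illustrative:

        # Time compression approximation for infiltration under variable rain.
        import numpy as np

        S, K = 10.0, 2.0     # sorptivity (mm/h^0.5), conductivity (mm/h)
        dt = 0.1             # time step (h)
        rng = np.random.default_rng(5)
        rain = rng.gamma(2.0, 4.0, 60)   # rainfall intensities (mm/h)

        def capacity(F):
            """Philip infiltration capacity at cumulative infiltration F."""
            if F <= 0.0:
                return np.inf            # dry soil absorbs everything
            # Invert F = S*sqrt(t) + K*t for the compressed time u = sqrt(t).
            u = (-S + np.sqrt(S * S + 4.0 * K * F)) / (2.0 * K)
            return 0.5 * S / u + K

        F = runoff = 0.0
        for r in rain:
            f = min(r, capacity(F))      # supply- or capacity-limited rate
            F += f * dt
            runoff += (r - f) * dt

        print(f"infiltration {F:.1f} mm, runoff {runoff:.1f} mm")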

  14. Evidence Report: Risk of Acute Radiation Syndromes Due to Solar Particle Events

    Science.gov (United States)

    Carnell, Lisa; Blattnig, Steve; Hu, Shaowen; Huff, Janice; Kim, Myung-Hee; Norman, Ryan; Patel, Zarana; Simonsen, Lisa; Wu, Honglu

    2016-01-01

    Crew health and performance may be impacted by a major solar particle event (SPE), multiple SPEs, or the cumulative effect of galactic cosmic rays (GCR) and SPEs. Beyond low-Earth orbit, the protection of the Earth's magnetosphere is no longer available, such that increased shielding and protective mechanisms are necessary in order to prevent acute radiation sickness and impacts to mission success or crew survival. While operational monitoring and shielding are expected to minimize radiation exposures, there are EVA scenarios outside of low-Earth orbit where the risk of prodromal effects, including nausea, vomiting, anorexia, and fatigue, as well as skin injury and depletion of the blood-forming organs (BFO), may occur. There is a reasonable concern that a compromised immune system due to high skin doses from an SPE or due to synergistic space flight factors (e.g., microgravity) may lead to increased risk to the BFO. The primary data available at present are derived from analyses of medical patients and persons accidentally exposed to acute, high doses of low-linear energy transfer (LET) (or terrestrial) radiation. Data more specific to the space flight environment must be compiled to quantify the magnitude of increase of this risk and to develop appropriate protection strategies. In particular, information addressing the distinct differences between solar proton exposures and terrestrial exposure scenarios, including radiation quality, dose-rate effects, and non-uniform dose distributions, is required for accurate risk estimation.

  15. Risk management

    OpenAIRE

    McManus, John

    2009-01-01

    Few projects are completed on time, on budget, and to their original requirements or specifications. Focusing on what project managers need to know about risk in the pursuit of delivering projects, Risk Management covers key components of the risk management process and the software development process, as well as best practices for risk identification, risk planning, and risk analysis. The book examines risk planning, risk analysis, responses to risk, the tracking and modelling of risks, intel...

  16. Review the number of accidents in Tehran over a two-year period and prediction of the number of events based on a time-series model

    Science.gov (United States)

    Teymuri, Ghulam Heidar; Sadeghian, Marzieh; Kangavari, Mehdi; Asghari, Mehdi; Madrese, Elham; Abbasinia, Marzieh; Ahmadnezhad, Iman; Gholizadeh, Yavar

    2013-01-01

    Background: One of the significant dangers that threaten people's lives is the increased risk of accidents. Annually, more than 1.3 million people die around the world as a result of accidents, and it has been estimated that approximately 300 deaths occur daily due to traffic accidents in the world, with more than 50% of that number being people who were not even passengers in the cars. The aim of this study was to examine traffic accidents in Tehran and forecast the number of future accidents using a time-series model. Methods: The study was a cross-sectional study conducted in 2011. The sample population was all traffic accidents that caused death and physical injuries in Tehran in 2010 and 2011, as registered in the Tehran Emergency ward. The present study used Minitab 15 software to provide a description of accidents in Tehran for the specified time period as well as those that occurred during April 2012. Results: The results indicated that the average number of daily traffic accidents in Tehran in 2010 was 187 with a standard deviation of 83.6. In 2011, there was an average of 180 daily traffic accidents with a standard deviation of 39.5. One-way analysis of variance indicated that the average number of accidents in the city differed across months of the year (P < 0.05). Most of the accidents occurred in March, July, August, and September; thus, more accidents occurred in the summer than in the other seasons. The number of accidents was predicted based on an autoregressive moving average (ARMA) model for April 2012. The number of accidents displayed a seasonal trend. The prediction of the number of accidents in the city during April of 2012 indicated that a total of 4,459 accidents would occur, with a mean of 149 accidents per day during these three months. Conclusion: The number of accidents in Tehran displayed a seasonal trend, and the number of accidents was different for different seasons of the year. PMID:26120405
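
    A minimal sketch of this kind of forecast with an ARMA-type model; the synthetic daily counts, seasonal shape and model order are illustrative, not the study's fitted model:

        # Forecast daily accident counts with an ARMA-type time-series model.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(6)
        days = pd.date_range("2010-01-01", "2011-12-31", freq="D")
        seasonal = 30 * np.sin(2 * np.pi * days.dayofyear / 365.25)
        counts = 185 + seasonal + rng.normal(0, 20, len(days))

        series = pd.Series(counts, index=days)
        fit = ARIMA(series, order=(2, 0, 1)).fit()

        forecast = fit.forecast(steps=30)   # the next 30 days
        print(f"predicted 30-day total: {forecast.sum():.0f}, "
              f"mean per day: {forecast.mean():.1f}")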

  17. Third generation cephalosporin resistant Enterobacteriaceae and multidrug resistant gram-negative bacteria causing bacteremia in febrile neutropenia adult cancer patients in Lebanon, broad spectrum antibiotics use as a major risk factor, and correlation with poor prognosis.

    Science.gov (United States)

    Moghnieh, Rima; Estaitieh, Nour; Mugharbil, Anas; Jisr, Tamima; Abdallah, Dania I; Ziade, Fouad; Sinno, Loubna; Ibrahim, Ahmad

    2015-01-01

    Bacteremia remains a major cause of life-threatening complications in patients receiving anticancer chemotherapy. The spectrum and susceptibility profiles of causative microorganisms differ with time and place. Data from Lebanon are scarce. We aim at evaluating the epidemiology of bacteremia in cancer patients in a university hospital in Lebanon, emphasizing antibiotic resistance and risk factors of multi-drug resistant organism (MDRO)-associated bacteremia. This is a retrospective study of 75 episodes of bacteremia occurring in febrile neutropenic patients admitted to the hematology-oncology unit at Makassed General Hospital, Lebanon, from October 2009 to January 2012. It corresponds to epidemiological data on bacteremia episodes in febrile neutropenic cancer patients including antimicrobial resistance and identification of risk factors associated with third generation cephalosporin resistance (3GCR) and MDRO-associated bacteremia. Out of 75 bacteremias, 42.7% were gram-positive (GP), and 57.3% were gram-negative (GN). GP bacteremias were mostly due to methicillin-resistant coagulase negative staphylococci (28% of total bacteremias and 66% of GP bacteremias). Among the GN bacteremias, Escherichia coli (22.7% of total, 39.5% of GN organisms) and Klebsiella pneumoniae (13.3% of total, 23.3% of GN organisms) were the most important causative agents. GN bacteremia due to 3GC sensitive (3GCS) bacteria represented 28% of total bacteremias, while 29% were due to 3GCR bacteria and 9% were due to carbapenem-resistant organisms. There was a significant correlation between bacteremia with MDRO and subsequent intubation, sepsis and mortality. Among potential risk factors, only broad spectrum antibiotic intake >4 days before bacteremia was found to be statistically significant for acquisition of 3GCR bacteria. Using carbapenems or piperacillin/tazobactam >4 days before bacteremia was significantly associated with the emergence of MDRO (p < 0.05). Our findings have major

  18. NASA space cancer risk model-2014: Uncertainties due to qualitative differences in biological effects of HZE particles

    Science.gov (United States)

    Cucinotta, Francis

    Uncertainties in estimating health risks from exposures to galactic cosmic rays (GCR), comprised of protons and high-charge-and-energy (HZE) nuclei, are an important limitation to long duration space travel. HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation, leading to large uncertainties in predicting risks to humans. Our NASA Space Cancer Risk Model-2012 (NSCR-2012) for estimating lifetime cancer risks from space radiation included several new features compared to earlier models from the National Council on Radiation Protection and Measurements (NCRP) used at NASA. New features of NSCR-2012 included the introduction of NASA-defined radiation quality factors based on track structure concepts, a Bayesian analysis of the dose and dose-rate reduction effectiveness factor (DDREF) and its uncertainty, and the use of a never-smoker population to represent astronauts. However, NSCR-2012 did not include estimates of the role of qualitative differences between HZE particles and low-LET radiation. In this report we discuss evidence for non-targeted effects increasing cancer risks at space-relevant HZE particle absorbed doses in tissue. Risk predictions for Mars exploration will be described and compared to those of our earlier NSCR-2012 model.

  19. Concentration risk

    Directory of Open Access Journals (Sweden)

    Matić Vesna

    2016-01-01

    Full Text Available Concentration risk has been gaining a special dimension in the contemporary financial and economic environment. Financial institutions are exposed to this risk mainly in the field of lending, mostly through their credit activities and the concentration of credit portfolios. This refers to the concentration of different exposures within a single risk category (credit risk, market risk, operational risk, liquidity risk).

  20. Implementation of a Keyboard-Event-Based Computer Forensics Process Model

    Institute of Scientific and Technical Information of China (English)

    刘文俭

    2014-01-01

    This paper gives a brief introduction to computer forensics and to the influence of computer anti-forensics techniques on forensic results. Combining dynamic and static forensics with conventional evidence extraction techniques, a keyboard-event-based forensic process flow chart is derived from a revised law-enforcement process model and implemented.

  1. Evidence Report: Risk of Cardiovascular Disease and Other Degenerative Tissue Effects from Radiation Exposure

    Science.gov (United States)

    Patel, Zarana; Huff, Janice; Saha, Janapriya; Wang, Minli; Blattnig, Steve; Wu, Honglu; Cucinotta, Francis

    2015-01-01

    Occupational radiation exposure from the space environment may result in non-cancer or non-CNS degenerative tissue diseases, such as cardiovascular disease, cataracts, and respiratory or digestive diseases. However, the magnitude of influence and the mechanisms of action of radiation leading to these diseases are not well characterized. Radiation and synergistic effects of radiation cause DNA damage, persistent oxidative stress, chronic inflammation, and accelerated tissue aging and degeneration, which may lead to acute or chronic disease of susceptible organ tissues. In particular, cardiovascular pathologies such as atherosclerosis are of major concern following gamma-ray exposure. This provides evidence for possible degenerative tissue effects following exposures to ionizing radiation in the form of the GCR or SPEs expected during long-duration spaceflight. However, the existence of low dose thresholds and dose-rate and radiation quality effects, as well as mechanisms and major risk pathways, are not well characterized. Degenerative disease risks are difficult to assess because multiple factors, including radiation, are believed to play a role in the etiology of the diseases. As additional evidence points to lower, space-relevant thresholds for these degenerative effects, particularly for cardiovascular disease, additional research with cell and animal studies is required to quantify the magnitude of this risk, understand mechanisms, and determine if additional protection strategies are required. The NASA PELs (Permissible Exposure Limits) for cataract and cardiovascular risks are based on existing human epidemiology data. Although animal and clinical astronaut data show a significant increase in cataracts following exposure, and a reassessment of atomic bomb (A-bomb) data suggests an increase in cardiovascular disease from radiation exposure, additional research is required to fully understand and quantify these adverse outcomes at lower doses (less than 0.5 gray

  2. An event-based account of conformity.

    Science.gov (United States)

    Kim, Diana; Hommel, Bernhard

    2015-04-01

    People often change their behavior and beliefs when confronted with deviating behavior and beliefs of others, but the mechanisms underlying such phenomena of conformity are not well understood. Here we suggest that people cognitively represent their own actions and others' actions in comparable ways (theory of event coding), so that they may fail to distinguish these two categories of actions. If so, other people's actions that have no social meaning should induce conformity effects, especially if those actions are similar to one's own actions. We found that female participants adjusted their manual judgments of the beauty of female faces in the direction consistent with distracting information without any social meaning (numbers falling within the range of the judgment scale) and that this effect was enhanced when the distracting information was presented in movies showing the actual manual decision-making acts. These results confirm that similarity between an observed action and one's own action matters. We also found that the magnitude of the standard conformity effect was statistically equivalent to the movie-induced effect.

  3. Event-based modularization of reactive systems

    NARCIS (Netherlands)

    Malakuti, Somayeh; Aksit, Mehmet

    2014-01-01

    There are a large number of complex software systems that have reactive behavior. As for any other software system, reactive systems are subject to evolution demands. This paper defines a set of requirements that must be fulfilled so that reuse of reactive software systems can be increased. Detailed ana

  4. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    Full Text Available We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e., all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  5. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.
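
    A minimal sketch of the framing described above, with terrorism risk expressed as an expected annual monetary loss driven by attribute-based indicators of population concentration and critical infrastructure; every weight, probability and consequence figure is made up for illustration:

        # Regional risk as expected annual loss: p(event) * consequence,
        # with likelihood scaled by a simple attribute-based indicator.
        regions = {
            # name: (population density index, infrastructure index), 0..1
            "region A": (0.9, 0.7),
            "region B": (0.3, 0.5),
        }
        base_annual_prob = 1e-4    # hypothetical baseline attack probability
        consequence_usd = 5e9      # hypothetical monetary consequence

        for name, (pop, infra) in regions.items():
            score = 0.6 * pop + 0.4 * infra       # weighted attribute score
            expected_loss = base_annual_prob * score * consequence_usd
            print(f"{name}: expected annual loss ${expected_loss:,.0f}")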

  6. The potential impact of bystander effects on radiation risks in a Mars mission

    Science.gov (United States)

    Brenner, D. J.; Elliston, C. D.; Hall, E. I. (Principal Investigator)

    2001-01-01

    Densely ionizing (high-LET) galactic cosmic rays (GCR) contribute a significant component of the radiation risk in free space. Over a period of a few months, sufficient for the early stages of radiation carcinogenesis to occur, a significant proportion of cell nuclei will not be traversed. There is convincing evidence, at least in vitro, that irradiated cells can send out signals that can result in damage to nearby unirradiated cells. This observation can hold even when the unirradiated cells have been exposed to low doses of low-LET radiation. We discuss here a quantitative model based on the BaD (bystander and direct) formalism, an approach that incorporates radiobiological damage both from a bystander response to signals emitted by irradiated cells and from direct traversal of high-LET radiation through cell nuclei. The model produces results that are consistent with those of a series of studies of the bystander phenomenon using a high-LET microbeam, with the end point of in vitro oncogenic transformation. According to this picture, for exposure to high-LET particles such as galactic cosmic rays other than protons, the bystander effect is significant primarily at low fluences, i.e., exposures where there are significant numbers of untraversed cells. If the mechanisms postulated here were applicable in vivo, using a linear extrapolation of risks derived from studies using intermediate doses of high-LET radiation (where the contribution of the bystander effect may be negligible) to estimate risks at very low doses (where the bystander effect may be dominant) could underestimate the true risk from low doses of high-LET radiation. It would be highly premature simply to abandon current risk projections for high-LET, low-dose radiation; however, these considerations suggest caution in applying results derived from experiments using high-LET radiation at fluences above approximately 1 particle per nucleus to risk estimation for a Mars mission.
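
    An illustrative sketch of the abstract's quantitative point, not the authors' parameterization: if the response is a direct linear term plus a saturating bystander term, then a linear fit through intermediate-fluence data underestimates the low-fluence risk:

        # Direct + bystander dose response vs linear extrapolation.
        import numpy as np

        alpha, sigma, f0 = 1.0, 0.3, 0.5   # hypothetical model parameters

        def response(F):
            """Direct (linear) plus bystander (saturating) components."""
            return alpha * F + sigma * (1.0 - np.exp(-F / f0))

        # "Measure" at intermediate fluences, extrapolate linearly to zero.
        F_mid = np.array([2.0, 4.0, 8.0])
        slope = F_mid @ response(F_mid) / (F_mid @ F_mid)

        for F in (0.05, 0.1, 0.5):
            true, lin = response(F), slope * F
            print(f"F={F:4.2f}: true={true:.3f}, linear={lin:.3f}, "
                  f"underestimate x{true / lin:.2f}")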

  7. Global cardiovascular risk stratification among hypertensive patients treated in a Family Health Unit of Parnaíba, Piauí

    Directory of Open Access Journals (Sweden)

    Elce de Seixas Nascimento

    2012-09-01

    Full Text Available Objective: To stratify the global cardiovascular risk among hypertensive patients attended in a Family Health Unit (FHU). Methods: A quantitative, cross-sectional and descriptive study with a population of hypertensive patients undergoing treatment in a FHU, module 34, in Parnaíba, Piauí, Brazil, in the period from July to August 2011. The sample consisted of 45 volunteers, selected by free demand conglomerate, who filled in a form with questions that support the analysis and Global Cardiovascular Risk stratification (GCR), according to the VI Brazilian Guidelines on Hypertension (VI BGH, 2010), the European Society of Cardiology (ESC) and the European Society of Hypertension (ESH, 2007). The subjects were then submitted to measurement of blood pressure (BP), waist circumference (WC) and body mass index (BMI). Results: The most evident risk factor in the sample was overweight/obesity in 75.5% (n=34), followed by sedentary lifestyle in 73.3% (n=33) and hypercholesterolemia in 55.5% (n=25). The data collected resulted in a stratification in which 84.4% (n=38) presented high added risk and 15.5% (n=7) a very high added risk of presenting cardiovascular events in the next 10 years. Conclusion: The stratification in the population studied indicated a high incidence of such factors, pointing to the need of interfering in this population segment, in order to promote changes in lifestyle that generate prevention and control of cardiovascular diseases.

  8. First field-based observations of δ²H and δ¹⁸O values of event-based precipitation, rivers and other water bodies in the Dzungarian Gobi, SW Mongolia.

    Science.gov (United States)

    Burnik Šturm, Martina; Ganbaatar, Oyunsaikhan; Voigt, Christian C; Kaczensky, Petra

    2017-05-01

    For certain remote areas like Mongolia, field-based precipitation, surface and ground water isotopic data are scarce. So far no such data exist for the Mongolian Gobi desert, which hinders the understanding of isotopic fractionation processes in this extreme, arid region. We collected 26 event-based precipitation samples, 39 Bij river samples, and 75 samples from other water bodies in the Dzungarian Gobi in SW Mongolia over a period of 16 months for hydrogen and oxygen stable isotope analysis. δ²H and δ¹⁸O values in precipitation show high seasonal variation and cover an extreme range: 175‰ for δ²H and 24‰ for δ¹⁸O values. The calculated local meteoric water line (LMWL) shows the isotopic characteristics of precipitation in an arid region. Individual water samples fall into one of three groups: within, above or below the 95% confidence interval of the LMWL. The data presented provide a basis for future studies in this region.
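
    A minimal sketch of deriving a local meteoric water line (LMWL) from event-based precipitation samples and comparing it with the global meteoric water line (δ²H = 8 δ¹⁸O + 10); the sample values below are synthetic:

        # Fit a local meteoric water line: d2H = a * d18O + b.
        import numpy as np

        rng = np.random.default_rng(7)
        d18O = rng.uniform(-20.0, 4.0, 26)              # 26 samples (per mil)
        d2H = 7.2 * d18O + 2.0 + rng.normal(0, 3, 26)   # synthetic arid line

        a, b = np.polyfit(d18O, d2H, 1)
        print(f"LMWL: d2H = {a:.2f} * d18O + {b:.2f}")
        print("GMWL: d2H = 8.00 * d18O + 10.00 (global reference)")
        # Samples plotting below the LMWL typically indicate evaporative
        # enrichment, which is common in arid regions such as the Gobi.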

  9. Volatility Risk

    OpenAIRE

    Zhiguang Wang

    2009-01-01

    Classical capital asset pricing theory tells us that risk-averse investors would require higher returns to compensate for higher risk on an investment. One type of risk is price (return) risk, which reflects uncertainty in the price level and is measured by the volatility (standard deviation) of asset returns. Volatility itself is also known to be random and hence is perceived as another type of risk. Investors can bear price risk in exchange for a higher return. But are investors willing to p...

  10. Managing project risks and uncertainties

    Directory of Open Access Journals (Sweden)

    Mike Mentis

    2015-01-01

    Full Text Available This article considers threats to a project slipping on budget, schedule and fit-for-purpose. Threat is used here as the collective term for risks (quantifiable bad things that can happen) and uncertainties (poorly or not quantifiable possible bad events). Based on experience with projects in developing countries, this review considers that (a) project slippage is due to uncertainties rather than risks, (b) while the eventuation of some bad things is beyond control, managed execution and oversight are still the primary means of keeping within budget, on time and fit-for-purpose, (c) improving project delivery is less about bigger and more complex and more about coordinated focus, effectiveness and developing thought-out heuristics, and (d) projects take longer and cost more partly because threat identification is inaccurate, the scope of identified threats is too narrow, and the threat assessment product is not integrated into overall project decision-making and execution. Almost by definition, what is poorly known is likely to cause problems. Yet it is not just the unquantifiability and intangibility of uncertainties that causes project slippage, but that they are insufficiently taken into account in project planning and execution, causing budget and time overruns. Improving project performance requires purpose-driven and managed deployment of scarce seasoned professionals. This can be aided by independent oversight from deeply experienced panelists who contribute technical insights and can potentially show that diligence is seen to be done.

  11. Cancer Risk in Astronauts: A Constellation of Uncommon Consequences

    Science.gov (United States)

    Milder, Caitlin M.; Elgart, S. Robin; Chappell, Lori; Charvat, Jaqueline M.; Van Baalen, Mary; Huff, Janice L.; Semones, Edward J.

    2017-01-01

    Excess cancers resulting from external radiation exposures have been noted since the early 1950s, when a rise in leukemia rates was first reported in young atomic bomb survivors [1]. Further studies in atomic bomb survivors, cancer patients treated with radiotherapy, and nuclear power plant workers have confirmed that radiation exposure increases the risk of not only leukemia, but also a wide array of solid cancers [2,3]. NASA has long been aware of this risk and limits astronauts' risk of exposure-induced death (REID) from cancer by specifying permissible mission durations (PMD) for astronauts on an individual basis. While cancer is present among astronauts, current data does not suggest any excess of known radiation-induced cancers relative to a comparable population of U.S. adults; however, very uncommon cancers have been diagnosed in astronauts including nasopharyngeal cancer, lymphoma of the brain, and acral myxoinflammatory fibroblastic sarcoma. In order to study cancer risk in astronauts, a number of obstacles must be overcome. Firstly, several factors make the astronaut cohort considerably different from the cohorts that have previously been studied for effects resulting from radiation exposure. The high rate of accidents and the much healthier lifestyle of astronauts compared to the U.S. population make finding a suitable comparison population a problematic task. Space radiation differs substantially from terrestrial radiation exposures studied in the past; therefore, analyses of galactic cosmic radiation (GCR) in animal models must be conducted and correctly applied to the human experience. Secondly, a large enough population of exposed astronauts must exist in order to obtain the data necessary to see any potential statistically significant differences between the astronauts and the control population. Thirdly, confounders and effect modifiers, such as smoking, diet, and other space stressors, must be correctly identified and controlled for in those

  12. Risk Management

    OpenAIRE

    Černák, Peter

    2009-01-01

    The Master's Thesis deals with the topic of risk management in a non-financial company. The goal of this Thesis is to create a framework for the review of the risk management process and to apply it practically in a case study. The objectives of the theoretical part are: stating the reasons for risk management in non-financial companies, addressing the main parts of risk management and providing guidance for the review of the risk management process. Special attention is paid to financial risks. The practical...

  13. Surrounding Risks

    Directory of Open Access Journals (Sweden)

    Mogens Steffensen

    2013-05-01

    Full Text Available Research in insurance and finance has always intersected, although the two fields were originally, and are generally, viewed as separate disciplines. Insurance is about transferring risks between parties such that the burdens of risks are borne by those who can bear them. This makes insurance transactions a beneficial activity for society. It calls for the detection, modelling, valuation, and controlling of risks. One of the main sources of control is diversification of risks, and in that respect it becomes an issue in itself to clarify the diversifiability of risks. However, many diversifiable risks are not, by nature or by contract design, separable from non-diversifiable risks, which are, on the other hand, sometimes traded in financial markets and sometimes not. A key observation is that the economic risk came before the insurance contract: Mother Earth destroys and kills incidentally and mercilessly, but the uncertainty of economic consequences can be more or less cleverly distributed by the introduction of an insurance market.

  14. Risk assessment

    DEFF Research Database (Denmark)

    Pedersen, Liselotte; Rasmussen, Kirsten; Elsass, Peter

    2010-01-01

    International research suggests that using formalized risk assessment methods may improve the predictive validity of professionals' predictions of the risk of future violence. This study presents data on forensic psychiatric patients discharged from a forensic unit in Denmark in 2001-2002 (n=107…). All patients were assessed for risk of future violence utilizing a structured professional judgment model: the Historical-Clinical-Risk Management-20 (HCR-20) violence risk assessment scheme. After a follow-up period of 5.6 years, recidivism outcomes were obtained from the Danish National Crime… predictive of violent recidivism compared to static items. In sum, the findings support the use of structured professional judgment models of risk assessment and in particular the HCR-20 violence risk assessment scheme. Findings regarding the importance of the (clinical) structured final risk judgment

  15. Risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gittus, J.H.

    1986-03-01

    The article deals with the calculation of risks, as applied to living near to a) a nuclear reactor or b) an industrial complex. The application of risk assessment techniques to the pressurised water reactor (PWR) is discussed with respect to: containment, frequencies of degraded core accidents, release of radioisotopes, consequences and risk to society, and uncertainties. The risk assessment for an industrial complex concerns the work of the Safety and Reliability Directorate for the chemical complex on Canvey Island. (U.K.).

  16. Risk Love.

    Science.gov (United States)

    Asch, Peter; Quandt, Richard E.

    1990-01-01

    Notes that attitudes toward risk comprise an important topic in economics courses, whereas risk love receives limited attention, perhaps because of the lack of clear and appealing examples for teaching. Provides a definition for the term risk love and includes illustrations drawn from empirical studies of racetrack betting for teaching this…

  17. Global cardiovascular risk stratification among hypertensive patients treated in a Family Health Unit of Parnaíba, Piauí - doi: 10.5020/18061230.2012.p287

    Directory of Open Access Journals (Sweden)

    Elce de Seixas Nascimento

    2012-11-01

    Full Text Available Objective: To stratify the global cardiovascular risk among hypertensive patients attended in a Family Health Unit (FHU. Methods: A quantitative, cross-sectional and descriptive study with population of hypertensive patients undergoing treatment in a FHU, module 34, in Parnaíba, Piauí, Brazil, in the period from July to August 2011. The sample consisted of 45 volunteers, selected by free demand conglomerate, who filled a form with questions that support the analysis and Global Cardiovascular Risk stratification (GCR, according to the VI Brazilian Guidelines on Hypertension (VI BGH - 2010, The European Society of Cardiology (ESC and European Society of Hypertension (ESH - 2007. The subjects were then submitted to measurement of blood pressure (BP, waist circumference (WC and body mass index (BMI. Results: The most evident risk factor in the sample was overweight/obesity in 75.5% (n=34, followed by sedentary lifestyle in 73.3% (n=33 and hypercholesterolemia in 55.5% (n=25. The data collected resulted in a stratification in which 84.4% (n=38 presented high added risk and 15.5% (n=7 a very high added risk of presenting cardiovascular events in the next 10 years. Conclusion: The stratification in the population studied indicated high incidence of such factors, pointing to the need of interfering in this population segment, in order to promote changes in lifestyle that generate prevention and control of cardiovascular diseases.

  18. Risk management.

    Science.gov (United States)

    Chambers, David W

    2010-01-01

    Every plan contains risk. To proceed without planning some means of managing that risk is to court failure. The basic logic of risk is explained: it consists of identifying a threshold where some corrective action is necessary, the probability of exceeding that threshold, and the attendant cost should the undesired outcome occur. This is the probable cost of failure. Various risk categories in dentistry are identified, including lack of liquidity; poor quality; equipment or procedure failures; employee slips; competitive environments; new regulations; unreliable suppliers, partners, and patients; and threats to one's reputation. It is prudent to make investments in risk management to the extent that the cost of managing the risk is less than the probable loss due to risk failure, and when risk management strategies can be matched to the type of risk. Four risk management strategies are discussed: insurance, reducing the probability of failure, reducing the costs of failure, and learning. A risk management accounting of the financial meltdown of October 2008 is provided.
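
    A minimal numerical sketch of the expected-loss logic this abstract describes (probability of exceeding the corrective-action threshold times the cost of the undesired outcome, with mitigation justified only when it costs less than the loss it removes); all figures here are hypothetical, not from the article:

        # Probable cost of failure = P(exceeding the threshold) * cost of the outcome.
        def probable_cost_of_failure(p_failure: float, cost_of_failure: float) -> float:
            return p_failure * cost_of_failure

        # Invest in managing a risk only if the investment is cheaper than the
        # expected loss it eliminates.
        def worthwhile(p_before: float, p_after: float, cost: float, investment: float) -> bool:
            saved = (probable_cost_of_failure(p_before, cost)
                     - probable_cost_of_failure(p_after, cost))
            return investment < saved

        print(probable_cost_of_failure(0.05, 200_000))   # 10000.0
        print(worthwhile(0.05, 0.01, 200_000, 5_000))    # True: 5,000 < 8,000 saved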

  19. Access, management and visualization of annual report text events based on SOS

    Institute of Scientific and Technical Information of China (English)

    杜腾飞; 毛建华; 刘学锋

    2016-01-01

    To meet users' demand for annual report text events, and drawing on sensor networks, this paper proposes a Sensor Observation Service (SOS)-based scheme for accessing, managing and visualizing annual report text events. On top of the SOS platform, a map application is designed and implemented that retrieves the text information of enterprise virtual sensors. The application mainly comprises: extraction of text events from the DOM tree, enterprise virtual sensors, standardized management of text events, and visualization of text events. With this application, users can browse the annual report information they are interested in by text event type, display it graphically on a map, and compare text events across the same industry. The results show that the scheme can retrieve many kinds of annual report text event information on a map terminal and is both feasible and practical.

  20. Internal component analysis of event-based prospective memory in schizophrenia patients

    Institute of Scientific and Technical Information of China (English)

    刘玉山; 杨柳; 廉志凯; 孙静; 王丽婷; 卜春莹

    2015-01-01

    Objective: To compare prospective memory (PM) task performance between schizophrenia patients and healthy controls, and to use the multinomial processing tree (MPT) model to analyze the internal cognitive components involved in completing PM tasks and to identify the specific components affecting PM in schizophrenia. Methods: 17 schizophrenia patients and 17 control participants matched on neuropsychological background completed an event-based PM task embedded within an ongoing computer-based color-matching task; the internal components of PM were analyzed with the MPT model. Results: PM scores were (21.83±2.46)% in patients versus (38.81±2.26)% in controls, a statistically significant difference (t=2.11, P<0.05); ongoing-task scores were (75.88±0.43)% versus (71.44±0.45)% (t=2.79, P<0.05); patients' reaction times were significantly longer than controls' ((2.34±1.41) s vs (1.81±1.19) s, t=11.24, P<0.05); the prospective-component parameter was significantly lower in patients than in controls (0.82 vs 0.97, G2(1)=9.56, P<0.01), while the retrospective-component parameter was significantly higher (0.54 vs 0.41, G2(1)=14.17, P<0.01). Conclusion: The prospective component of event-based PM is lower in schizophrenia patients than in controls, which reduces their PM performance.

  1. Enhanced intestinal tumor multiplicity and grade in vivo after HZE exposure: mouse models for space radiation risk estimates.

    Science.gov (United States)

    Trani, Daniela; Datta, Kamal; Doiron, Kathryn; Kallakury, Bhaskar; Fornace, Albert J

    2010-08-01

    Carcinogenesis induced by space radiation is considered a major risk factor in manned interplanetary and other extended missions. The models presently used to estimate the risk for cancer induction following deep space radiation exposure are based on data from A-bomb survivor cohorts and do not account for important biological differences existing between high-linear energy transfer (LET) and low-LET-induced DNA damage. High-energy and charge (HZE) radiation, the main component of galactic cosmic rays (GCR), causes highly complex DNA damage compared to low-LET radiation, which may lead to increased frequency of chromosomal rearrangements, and contribute to carcinogenic risk in astronauts. Gastrointestinal (GI) tumors are frequent in the United States, and colorectal cancer (CRC) is the third most common cancer accounting for 10% of all cancer deaths. On the basis of the aforementioned epidemiological observations and the frequency of spontaneous precancerous GI lesions in the general population, even a modest increase in incidence by space radiation exposure could have a significant effect on health risk estimates for future manned space flights. Ground-based research is necessary to reduce the uncertainties associated with projected cancer risk estimates and to gain insights into molecular mechanisms involved in space-induced carcinogenesis. We investigated in vivo differential effects of gamma-rays and HZE ions on intestinal tumorigenesis using two different murine models, ApcMin/+ and Apc1638N/+. We showed that gamma- and/or HZE exposure significantly enhances development and progression of intestinal tumors in a mutant-line-specific manner, and identified suitable models for in vivo studies of space radiation-induced intestinal tumorigenesis.

  2. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how individual risk and overall public safety can be affected by the operation of a gas pipeline. If the individual or societal risks are considered intolerable compared with international standards, measures are recommended to mitigate the risk associated with the operation down to levels compatible with best practice in the industry. A quantitative risk analysis calculates the probability of occurrence of an event from its frequency of occurrence, and requires complex mathematical treatment. The present work develops a calculation methodology based on the previously mentioned publication. The methodology centres on defining the frequencies of occurrence of events from databases representative of each case under study, and establishes the consequences according to the characteristics of each area and the different possible interferences with the gas pipeline under study. For each interference, a typical curve of ignition probability is developed as a function of distance to the pipe. (author)
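
    A minimal sketch of the frequency-times-consequence calculation this abstract outlines; the exponential ignition-probability curve and all rates below are hypothetical placeholders, not the authors' calibrated values:

        import math

        # Hypothetical monotone-decreasing ignition probability vs. distance to the pipe.
        def ignition_probability(distance_m: float, scale_m: float = 50.0) -> float:
            return math.exp(-distance_m / scale_m)

        # Individual risk (per year) = event frequency * P(ignition) * P(fatality | ignition).
        def individual_risk(failure_freq_per_km_yr: float, length_km: float,
                            distance_m: float, p_fatality_given_ignition: float) -> float:
            frequency = failure_freq_per_km_yr * length_km
            return frequency * ignition_probability(distance_m) * p_fatality_given_ignition

        # A receptor 100 m from 10 km of pipeline with a 1e-4 /km/yr failure rate:
        print(f"{individual_risk(1e-4, 10.0, 100.0, 0.5):.2e}")   # ~6.8e-05 per year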

  3. Analysis of Operational Risks in Shipbuilding Industry

    Directory of Open Access Journals (Sweden)

    Daniela MATEI

    2012-11-01

    Our paper emphasizes the opportunities that a proposed model for analyzing operational risks offers both to academic research and to companies, in business in general and in the shipbuilding industry in particular. The model aims to derive the loss distribution from operational risk for each business line/event type, based on estimates of the frequency and severity of events. These estimates are derived mainly from internal loss-event logs. The calculations extend over a given future period at a given confidence level. It should also be mentioned that the proposed model estimates unexpected losses without making any assumptions about the values of expected and unexpected losses. Several ideas extracted from theoretical models in the available literature were analyzed and synthesized in order to develop a model for operational risk analysis adapted to shipbuilding. This paper describes this new model, which can be applied in the naval industry to quantify operational risks.
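
    A Monte Carlo sketch of the loss-distribution idea described above, assuming (as is common in operational-risk work, though not necessarily the authors' choice) Poisson event frequency and lognormal severity for one business line/event type; all parameters are hypothetical:

        import math
        import random

        def poisson(lam: float) -> int:
            # Knuth's algorithm for a Poisson-distributed annual event count.
            limit, k, p = math.exp(-lam), 0, 1.0
            while p > limit:
                k += 1
                p *= random.random()
            return k - 1

        def annual_losses(lam: float, mu: float, sigma: float, n_sims: int = 50_000):
            # Aggregate loss per simulated year: a Poisson number of lognormal losses.
            return sorted(sum(random.lognormvariate(mu, sigma) for _ in range(poisson(lam)))
                          for _ in range(n_sims))

        losses = annual_losses(lam=3.0, mu=10.0, sigma=1.2)
        expected = sum(losses) / len(losses)
        quantile = losses[int(0.999 * len(losses))]     # loss at the 99.9% confidence level
        print(f"expected loss {expected:,.0f}; unexpected loss {quantile - expected:,.0f}")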

  4. Embracing risk

    Directory of Open Access Journals (Sweden)

    Ross Cagan

    2015-08-01

    I entered the science field because I imagined that scientists were society's “professional risk takers”, that they like surfing out on the edge. I understood that a lot of science – perhaps even most science – has to be a solid exploration of partly understood phenomena. But any science that confronts a difficult problem has to start with risk. Most people are at least a bit suspicious of risk, and scientists such as myself are no exception. Recently, risk-taking has been under attack financially, but this Editorial is not about that. I am writing about the long view and the messages we send to our trainees. I am Senior Associate Dean of the graduate school at Mount Sinai and have had the privilege to discuss these issues with the next generation of scientists, for whom I care very deeply. Are we preparing you to embrace risk?

  5. Cardiovascular risk factors and estimated 10-year risk of fatal cardiovascular events using various equations in Greeks with metabolic syndrome.

    Science.gov (United States)

    Chimonas, Theodoros; Athyros, Vassilios G; Ganotakis, Emmanouel; Nicolaou, Vassilios; Panagiotakos, Demosthenes B; Mikhailidis, Dimitri P; Elisaf, Moses

    2010-01-01

    We investigated cardiovascular disease (CVD) risk factors in 1501 Greeks (613 men and 888 women, aged 40-65 years) referred to outpatient clinics with metabolic syndrome (MetS) and without diabetes mellitus or CVD. The 10-year risk of fatal CVD events was calculated using the European Society of Cardiology Systematic Coronary Risk Estimation (ESC SCORE), HellenicSCORE, and Framingham equations. Raised blood pressure (BP) and hypertriglyceridemia were more common in men (89.6% vs 84.2% and 86.8% vs 74.2%, respectively; P < .001). Low high-density lipoprotein cholesterol (HDL-C) and abdominal obesity were more common in women (58.2% vs 66.2% and 85.8% vs 97.1%, respectively; P < .001). The 10-year risk of fatal CVD events using HellenicSCORE was higher in men (6.3% ± 4.3% vs 2.7% ± 2.1%; P < .001). ESC SCORE and Framingham yielded similar results. The risk equations gave similar assessments in this European Mediterranean population, except that HellenicSCORE identified more women with MetS as requiring risk modification. This might justify local risk engine evaluation in event-based studies. (ClinicalTrials.gov ID: NCT00416741)

  6. Risk factors.

    Science.gov (United States)

    Robbins, Catherine J; Connors, K C; Sheehan, Timothy J; Vaughan, James S

    2005-06-01

    Minimize surprises on your financial statement by adopting a model for integrated risk management that: examines interrelationships among operations, investments, and financing; and incorporates concepts of the capital asset pricing model to manage unexpected volatility.

  7. Quantitative Risks

    Science.gov (United States)

    2015-02-24

    ... program risk evaluation frameworks and criteria set out, in formal documentation, for proposal and contract execution evaluation as determined by the PMO ... phase until late December 2015. During this time there was turnover in the PMO. We decided that we needed to reaffirm the collaboration agreement ... pertinent to developing RLI. We reviewed prior analyses of root causes of adverse program outcomes. We identified Program Management Office (PMO) risk ...

  8. RISK MANAGEMENT AND FINANCIAL RISKS

    Directory of Open Access Journals (Sweden)

    Caruntu Andreea Laura

    2011-12-01

    Throughout recent years the financial and economic crisis has seriously affected economies worldwide, and we know that risk is a phenomenon that appears daily in every company's activity. It is seen as the probability of an event and its consequences. Each company needs to take into consideration that good risk management is necessary if it wants to survive in the economic environment. The purpose of this article is to bring to readers' attention already known notions relating to risk and risk management, so that it is understood how important it is to take immediate measures in risky situations. Unfortunately, many companies do not take such a serious aspect into consideration, and this only leads to serious financial problems.

  9. Managing Risks

    DEFF Research Database (Denmark)

    Alaranta, Maria Eliisa; Mathiassen, Lars

    2014-01-01

    Mergers and acquisitions (M&A) require organizations to blend together different information system (IS) configurations. Unfortunately, less than 50 percent of M&As achieve their goals, with IS integration being a major problem. Here, the authors offer a framework to help managers prepare for, analyze, and mitigate risks during post-merger IS integration. They identify key risks relating to IS integration content, process, and context, and present five strategies for mitigating those risks. Their framework aims to help managers proactively reduce the impact of adverse events. Adopting the framework supported by their templates is straightforward, and the time and resources required are minimal. When properly executed, adoption increases the likelihood of successful merger outcomes; the framework is thus a valuable addition to the management tool box and can be applied in collaboration ...

  10. Risk Management.

    Science.gov (United States)

    Rakich, Ronald

    1982-01-01

    Beginning on the front page, this article explains ways of establishing a sound risk management insurance program that can improve a school district's financial position. Organizations that can help are listed. Available from the American Association of School Administrators, 1801 North Moore Street, Arlington, VA 22209. (MLF)

  11. Biofilm Risks

    DEFF Research Database (Denmark)

    Wirtanen, Gun Linnea; Salo, Satu

    2016-01-01

    This chapter on biofilm risks deals with biofilm formation of pathogenic microbes, sampling and detection methods, biofilm removal, and prevention of biofilm formation. Several common pathogens produce sticky and/or slimy structures in which the cells are embedded, that is, biofilms, on various s...

  12. Heart disease - risk factors

    Science.gov (United States)

    Heart disease - prevention; CVD - risk factors; Cardiovascular disease - risk factors; Coronary artery disease - risk factors; CAD - risk ... a certain health condition. Some risk factors for heart disease you cannot change, but some you can. ...

  13. Biophysics Representation of the Two-Hit Model of Alzheimer's Disease for the Exploration of Late CNS Risks from Space Radiation

    Science.gov (United States)

    Cucinotta, Francis A.; Ponomarev, Artem

    2009-01-01

    A concern for long-term space travel outside the Earth's magnetic field is late effects to the central nervous system (CNS) from galactic cosmic rays (GCR) or solar particle events (SPE). Human epidemiology data are severely limited for making CNS risk estimates, and it is not clear that such effects occur following low-LET exposures. We are developing systems biology models based on biological information on specific diseases and on experimental data for proton and heavy ion radiation. A two-hit model of Alzheimer's disease (AD) has been proposed by Zhu et al. (1), which is the framework of our model. Of importance is that over 50% of the US population over the age of 75 y has mild to severe forms of AD. Therefore we recommend that risk assessment for a potential AD risk from space radiation should focus on the projection of an earlier age of onset of AD and the prevention of this possible acceleration through countermeasures. In the two-hit model, oxidative stress and aberrant cell cycle-related abnormalities leading to amyloid-beta plaques and neurofibrillary tangles are necessary and invariant steps in AD. We have formulated a stochastic cell kinetics version of the two-hit AD model. In our model a population of neuronal cells is allowed to undergo renewal through neurogenesis and is susceptible to oxidative stress or cell cycle abnormalities, with age-specific accumulation of damage. Baseline rates are fitted to AD population data for specific ages, gender, and persons with an apolipoprotein E4 allele. We then explore how low-LET radiation or heavy ions may increase either of the two hits or alter neurogenesis, whether through persistent oxidative stress, direct mutation, or changes to the micro-environment, and suggest possible ways to develop accurate quantitative estimates of these processes for predicting AD risks following long-term space travel.
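
    A toy stochastic sketch of the two-hit kinetics summarized above: each cell in a slowly renewing population can independently acquire hit 1 (oxidative stress) or hit 2 (a cell-cycle abnormality), and cells carrying both hits are scored. All rates and the population size are illustrative placeholders, not the fitted baseline rates mentioned in the abstract:

        import random

        def simulate(n_cells=10_000, years=40, p_hit1=0.002, p_hit2=0.001, p_renewal=0.0005):
            cells = [[False, False] for _ in range(n_cells)]   # [hit1, hit2] per cell
            for _ in range(years):
                for cell in cells:
                    if random.random() < p_renewal:            # neurogenesis replaces the cell
                        cell[0] = cell[1] = False
                        continue
                    if not cell[0] and random.random() < p_hit1:
                        cell[0] = True
                    if not cell[1] and random.random() < p_hit2:
                        cell[1] = True
            return sum(1 for c in cells if c[0] and c[1]) / n_cells

        # Radiation exposure could be explored by transiently raising p_hit1/p_hit2.
        print(f"fraction of two-hit cells after 40 y: {simulate():.4f}")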

  14. Risk Management Technologies With Logic and Probabilistic Models

    CERN Document Server

    Solozhentsev, E D

    2012-01-01

    This book presents intellectual, innovative, information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. The volume describes the following components of risk management technologies: LP-calculus; classes of LP-models of risk and efficiency; procedures for different classes; special software for different classes; examples of applications; and methods for the estimation of probabilities of events based on expert information. Also described are a variety of training courses in these topics. The classes of risk models treated here are: LP-modeling, LP-classification, LP-efficiency, and LP-forecasting. Particular attention is paid to LP-models of risk of failure to resolve difficult economic and technical problems. Amongst the discussed procedures of I3-technologies are the construction of LP-models, ...

  15. Verbal risk in communicating risk

    Energy Technology Data Exchange (ETDEWEB)

    Walters, J.C. [Northern Arizona Univ., Flagstaff, AZ (United States). School of Communication; Reno, H.W. [EG and G Idaho, Inc., Idaho Falls, ID (United States). Idaho National Engineering Lab.

    1993-03-01

    When persons in the waste management industry converse about matters of the industry, the thoughts being communicated are understood among those in the industry. However, when persons in waste management communicate with those outside the industry, communication may suffer simply because of poor practices such as the use of jargon, euphemisms, acronyms and abbreviations, careless language usage, not knowing the audience, and neglect of public perception. This paper deals with ways the waste management industry can communicate risk to the public without obfuscating issues. The waste management industry should feel obligated to communicate certain meanings within specific contexts and, then, if the context changes, should not put forth a new, more appropriate meaning for the language already used. Communication from the waste management industry does not have to be provisional. The authors suggest that verbal risks in communicating risk can be reduced significantly or eliminated by following a few basic communication principles. The authors make suggestions and give examples of ways to improve communication with the general public by avoiding or reducing jargon, euphemisms, and acronyms; knowing the audience; avoiding presumptive knowledge held by the audience; and understanding public perception of waste management issues.

  16. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance, one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems, or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed ...
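
    A crude Monte Carlo sketch of the kind of tail probability the thesis concerns, using Pareto summands and comparing the estimate with the single-big-jump asymptotic n·P(X > x) typical of subexponential claims; all parameters are illustrative:

        import random

        def pareto(alpha: float, xm: float = 1.0) -> float:
            # Inverse-transform sampling of a Pareto(alpha, xm) variable.
            return xm / random.random() ** (1.0 / alpha)

        def tail_prob(n_terms: int, x: float, alpha: float, n_sims: int = 200_000) -> float:
            hits = sum(sum(pareto(alpha) for _ in range(n_terms)) > x
                       for _ in range(n_sims))
            return hits / n_sims

        alpha, n, x = 1.5, 10, 100.0
        print(tail_prob(n, x, alpha))        # Monte Carlo estimate of P(S_n > x)
        print(n * (1.0 / x) ** alpha)        # heavy-tail asymptotic n * P(X > x)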

  17. Publishing Risks

    Directory of Open Access Journals (Sweden)

    Mogens Steffensen

    2014-02-01

    “What is complicated is not necessarily insightful and what is insightful is not necessarily complicated: Risks welcomes simple manuscripts that contribute with insight, outlook, understanding and overview”—a quote from the first editorial of this journal [1]. Good articles are not characterized by their level of complication but by their level of imagination, innovation, and power of penetration. Creativity sessions and innovative tasks are most elegant and powerful when they are delicately simple. This is why the articles you most remember are not the complicated ones that you struggled to digest, but the simpler ones you enjoyed swallowing.

  18. Risk Analysis

    Science.gov (United States)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Challenger environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in Hubble's estimated service life, the panel opted for the human mission to save one of the major achievements of the American space program, in the words of Louis J. Lanzerotti, its chairman.

  19. Lymphedema Risk Reduction Practices

    Science.gov (United States)


  20. Thyroid Cancer Risk Factors

    Science.gov (United States)

    A risk factor is anything that ...

  1. No-migration variance petition: Draft. Volume 4, Appendices DIF, GAS, GCR (Volume 1)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-31

    The Department of Energy is responsible for the disposition of transuranic (TRU) waste generated by national defense-related activities. Approximately 2.6 million cubic feet of these wastes have been generated and are stored at various facilities across the country. The Waste Isolation Pilot Plant (WIPP) was sited and constructed to meet stringent disposal requirements. In order to permanently dispose of TRU waste, the DOE has elected to petition the US EPA for a variance from the Land Disposal Restrictions of RCRA. This document fulfills the reporting requirements for the petition. This report is volume 4 of the petition, which presents details about the transport characteristics across drum filter vents and polymer bags; gas generation reactions and rates during long-term WIPP operation; and geological characterization of the WIPP site.

  2. Luminescence dating on Mars: OSL characteristics of Martian analogue materials and GCR dosimetry

    DEFF Research Database (Denmark)

    Jain, M.; Andersen, C.E.; Bøtter-Jensen, L.

    2006-01-01

    Luminescence chronology may be the key to understanding climatically and tectonically driven changes on Mars. However, the success of Martian luminescence dating will depend upon our understanding of the luminescence properties of silicates such as olivine, pyroxenes and plagioclases, and sedimen... -aerial transport; this may allow the possibility of using deep traps for extending the age range on Mars. Dose rates on Mars are largely due to charged particles present in the galactic cosmic rays. Some new results on proton dosimetry with Al2O3:C (Bragg curve and luminescence efficiency as a function of LET), and the effect of radiation damage due to cosmic ray exposure, are presented and implications discussed for Martian luminescence dating.

  3. Possible effects on Earth's climate due to reduced atmospheric ionization by GCR during Forbush Decreases

    Science.gov (United States)

    Portugal, Williamary; Echer, Ezequiel; Pereira de Souza Echer, Mariza; Pacini, Alessandra Abe

    2017-10-01

    This work presents the first results of a study on possible effects on surface temperature during short periods of lower galactic cosmic ray (GCR) flux at Earth, called Forbush decreases (FDs). There is a hypothesis that decreases in the GCR flux cause changes in the physical-chemical properties of the atmosphere. We investigated these possible effects over several latitudinal regions around the ten strongest FDs that occurred from 1987 to 2015. We found a possible increase in surface temperature at middle and high latitudes during the occurrence of these events.

  4. SCR and GCR exposure ages of plagioclase grains from lunar soil

    Science.gov (United States)

    Etique, P.; Baur, H.; Signer, P.; Wieler, R.

    1986-01-01

    The concentrations of solar wind implanted Ar-36 in mineral grains extracted from lunar soils show that they were exposed to the solar wind on the lunar surface for an integrated time of 10^4 to 10^5 years. From the bulk soil 61501, plagioclase separates of 8 grain size ranges were prepared. Depletion of the implanted gases was achieved by etching aliquot samples of 4 grain sizes to various degrees. The experimental results pertinent to the present discussion are as follows. The spallogenic Ne is, as in most plagioclases from lunar soils, affected by diffusive losses and of no use. The Ar-36 of solar wind origin amounts to (2030 ± 100) x 10^-8 ccSTP/g in the 150 to 200 µm size fraction and shows that these grains were exposed to the solar wind for at least 10,000 years. The Ne-21/Ne-22 ratio of the spallogenic Ne is 0.75 ± 0.01, in very good agreement with the value of this ratio in a plagioclase separate from rock 76535. This rock has had a simple exposure history, and its plagioclases have a chemical composition quite similar to those studied. In addition to the noble gases, the heavy particle tracks in an aliquot of the 150 to 200 µm plagioclase separate were investigated; 92% of the grains were found to contain more than 10^8 tracks/cm^2, corresponding to a mean track density of (5 ± 1) x 10^8 tracks/cm^2. The exposure histories of the plagioclase separates from soil 61501 do not contradict the model for regolith dynamics, but also fail to prove it.

  5. Project Risk Management Phases

    OpenAIRE

    Claudiu-George BOCEAN

    2008-01-01

    Risk management is the human activity which integrates recognition of risk, risk assessment, developing strategies to manage it, and mitigation of risk using managerial resources. Notwithstanding the domain of activities where they are conducted, projects often entail risks, and risk management has been widely recognized as a success factor in project management. Following a concept clarification on project risk management, this paper presents a generic list of steps in the risk management process ...

  6. Managing risk and uncertainty

    OpenAIRE

    Coulson-Thomas, Colin

    2015-01-01

    Examines risk management and contemporary issues concerning risk governance from a board perspective, including risk tolerance, innovation, insurance, balancing risks and other factors, risk and strategies of diversification or focus, increasing flexibility to cope with uncertainty, periodic planning versus intelligent steering, and limiting downside risks and adverse consequences.

  7. Risk Assessment Terminology: Risk Communication Part 2.

    Science.gov (United States)

    Liuzzo, Gaetano; Bentley, Stefano; Giacometti, Federica; Piva, Silvia; Serraino, Andrea

    2016-04-19

    The paper describes the terminology of risk communication from the viewpoint of food safety: different aspects of risk perception (perceived risk, media triggers, the psychometric paradigm, fright factors and cultural determinants of risk perception) are described. The risk profile elements are illustrated in the manuscript: hazard-food commodity combination(s) of concern; description of the public health problem; food production, processing, distribution and consumption; needs and questions for the risk assessors; available information and major knowledge gaps; and other risk profile elements.

  8. Risk assessment terminology: risk communication part 2

    Directory of Open Access Journals (Sweden)

    Gaetano Liuzzo

    2016-04-01

    The paper describes the terminology of risk communication from the viewpoint of food safety: different aspects of risk perception (perceived risk, media triggers, the psychometric paradigm, fright factors and cultural determinants of risk perception) are described. The risk profile elements are illustrated in the manuscript: hazard-food commodity combination(s) of concern; description of the public health problem; food production, processing, distribution and consumption; needs and questions for the risk assessors; available information and major knowledge gaps; and other risk profile elements.

  9. Indicators of Disaster Risk and Risk Management

    OpenAIRE

    Omar D. Cardona

    2005-01-01

    This document is the summary report of the IDB-sponsored system of disaster risk and risk management indicators presented at the World Conference on Disaster Reduction in Kobe, Japan, 2005. The indices estimate disaster risk loss, distribution, vulnerability and management for 12 countries in Latin America and the Caribbean. The objective of this program is to facilitate access to relevant information on disaster risk and risk management by national decision-makers, thus making possible the i...

  10. NASA's Risk Management System

    Science.gov (United States)

    Perera, Jeevan S.

    2013-01-01

    A phased approach to implementing risk management is necessary. The risk management system will be simple and accessible, and will promote communication of information to all relevant stakeholders for optimal resource allocation and risk mitigation. Risk management should be used by all team members to manage risks, not just risk office personnel. Each group/department is assigned Risk Integrators, who are facilitators for effective risk management. Risks will be managed at the lowest level feasible; only those risks that require coordination or management from above are elevated. Risk-informed decision making should be introduced at all levels of management, providing the necessary checks and balances to ensure that risks are identified and dealt with in a timely manner. Many supporting tools, processes and training must be deployed for effective risk management implementation. Process improvement must be included in the risk processes.

  11. Decreasing relative risk premium

    DEFF Research Database (Denmark)

    Hansen, Frank

    2007-01-01

    ... such that the corresponding relative risk premium is a decreasing function of present wealth, and we determine the set of associated utility functions. We find a new characterization of risk vulnerability and determine a large set of utility functions, closed under summation and composition, which are both risk vulnerable and have decreasing relative risk premium. We finally introduce the notion of partial risk neutral preferences on binary lotteries and show that partial risk neutrality is equivalent to preferences with decreasing relative risk premium.
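
    For reference, the standard Arrow-Pratt definitions behind these notions, written as a sketch in common notation (the paper's exact conventions may differ): for wealth $w$, utility $u$ and a multiplicative risk $\tilde{\varepsilon}$, the relative risk premium $\pi(w)$ is defined by

        \[
          \mathbb{E}\,u\bigl(w(1+\tilde{\varepsilon})\bigr)
            = u\bigl(w(1 + \mathbb{E}\tilde{\varepsilon} - \pi(w))\bigr),
        \]
        \[
          \pi(w) \approx \tfrac{1}{2}\,\operatorname{Var}(\tilde{\varepsilon})\,R(w),
          \qquad
          R(w) = -\,\frac{w\,u''(w)}{u'(w)},
        \]

    so "decreasing relative risk premium" concerns how $\pi(w)$, and through the small-risk approximation the relative risk aversion $R(w)$, behaves as present wealth grows.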

  12. Downside Variance Risk Premium

    OpenAIRE

    Feunou, Bruno; Jahan-Parvar, Mohammad R.; Okou, Cédric

    2015-01-01

    We propose a new decomposition of the variance risk premium in terms of upside and downside variance risk premia. The difference between upside and downside variance risk premia is a measure of skewness risk premium. We establish that the downside variance risk premium is the main component of the variance risk premium, and that the skewness risk premium is a priced factor with significant prediction power for aggregate excess returns. Our empirical investigation highlights the positive and s...
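
    The decomposition sketched in the abstract, written out in common notation (assumed here, not copied from the paper), with upside/downside semivariances of returns above/below a threshold such as zero:

        \[
          \mathrm{VRP}_t = \mathrm{VRP}_t^{U} + \mathrm{VRP}_t^{D},
          \qquad
          \mathrm{VRP}_t^{U,D}
            = \mathbb{E}_t^{\mathbb{Q}}\!\bigl[\sigma_{t,t+1}^{2\,(U,D)}\bigr]
            - \mathbb{E}_t^{\mathbb{P}}\!\bigl[\sigma_{t,t+1}^{2\,(U,D)}\bigr],
        \]
        \[
          \mathrm{SRP}_t = \mathrm{VRP}_t^{U} - \mathrm{VRP}_t^{D},
        \]

    where $\mathbb{Q}$ and $\mathbb{P}$ denote the risk-neutral and physical measures and $\mathrm{SRP}_t$ is the skewness risk premium that the abstract reports as a priced factor.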

  13. Event-based stormwater management pond runoff temperature model

    Science.gov (United States)

    Sabouri, F.; Gharabaghi, B.; Sattar, A. M. A.; Thompson, A. M.

    2016-09-01

    Stormwater management wet ponds are generally very shallow and hence can significantly increase (about 5.4 °C on average in this study) runoff temperatures in summer months, which adversely affects receiving urban stream ecosystems. This study uses gene expression programming (GEP) and artificial neural network (ANN) modeling techniques to advance our knowledge of the key factors governing the thermal enrichment effects of stormwater ponds. The models developed in this study build upon and complement the ANN model developed by Sabouri et al. (2013) that predicts the catchment event mean runoff temperature entering the pond as a function of event climatic and catchment characteristic parameters. The key factors that control pond outlet runoff temperature include: (1) upland catchment parameters (catchment drainage area and event mean runoff temperature inflow to the pond); (2) climatic parameters (rainfall depth, event mean air temperature, and pond initial water temperature); and (3) pond design parameters (pond length-to-width ratio, pond surface area, pond average depth, and pond outlet depth). We used monitoring data from three summers (2009 to 2011) in four stormwater management ponds, located in the cities of Guelph and Kitchener, Ontario, Canada, to develop the models. The prediction uncertainties of the developed ANN and GEP models for the case study sites are around 0.4% and 1.7% of the median value. Sensitivity analysis of the trained models indicates that the thermal enrichment of the pond outlet runoff is inversely proportional to pond length-to-width ratio and pond outlet depth, and directly proportional to event runoff volume, event mean pond inflow runoff temperature, and pond initial water temperature.
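
    A minimal sketch of the ANN part of such a model: regressing pond-outlet runoff temperature on the nine predictors listed above. The network size, the synthetic training data and the signs encoded in the toy target are placeholders; the study's calibrated model is not reproduced here:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        FEATURES = ["drainage_area", "inflow_temp", "rain_depth", "air_temp",
                    "init_pond_temp", "lw_ratio", "surface_area", "avg_depth",
                    "outlet_depth"]

        rng = np.random.default_rng(0)
        X = rng.uniform(size=(500, len(FEATURES)))
        # Hypothetical target consistent with the reported directions: warmer inflow
        # and initial pond water raise outlet temperature; larger length-to-width
        # ratio and deeper outlets lower it.
        y = 20 + 5*X[:, 1] + 3*X[:, 4] - 2*X[:, 5] - 2*X[:, 8] + rng.normal(0, 0.2, 500)

        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                             random_state=0).fit(X, y)
        print(model.predict(X[:3]))          # predicted outlet temperatures (deg C)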

  14. Event-based text mining for biology and functional genomics

    Science.gov (United States)

    Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B.

    2015-01-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of ‘events’, i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365

  15. Deterministic event-based simulation of quantum phenomena

    NARCIS (Netherlands)

    De Raedt, K; De Raedt, H; Michielsen, K

    2005-01-01

    We propose and analyse simple deterministic algorithms that can be used to construct machines that have primitive learning capabilities. We demonstrate that locally connected networks of these machines can be used to perform blind classification on an event-by-event basis, without storing the information ...

  16. An event-based hydrologic simulation model for bioretention systems.

    Science.gov (United States)

    Roy-Poirier, A; Filion, Y; Champagne, P

    2015-01-01

    Bioretention systems are designed to treat stormwater and provide attenuated drainage between storms. Bioretention has shown great potential at reducing the volume and improving the quality of stormwater. This study introduces the bioretention hydrologic model (BHM), a one-dimensional model that simulates the hydrologic response of a bioretention system over the duration of a storm event. BHM is based on the RECARGA model, but has been adapted for improved accuracy and integration of pollutant transport models. BHM contains four completely-mixed layers and accounts for evapotranspiration, overflow, exfiltration to native soils and underdrain discharge. Model results were evaluated against field data collected over 10 storm events. Simulated flows were particularly sensitive to antecedent water content and drainage parameters of bioretention soils, which were calibrated through an optimisation algorithm. Temporal disparity was observed between simulated and measured flows, which was attributed to preferential flow paths formed within the soil matrix of the field system. Modelling results suggest that soil water storage is the most important short-term hydrologic process in bioretention, with exfiltration having the potential to be significant in native soils with sufficient permeability.

  17. Using Event-Based Parsing to Support Dynamic Protocol Evolution

    Science.gov (United States)

    2003-03-01

    [Figure residue; recoverable caption: Fig. 8: Modified configuration and scenario events 9 through 19 (an HTTP 1.1 client communicating through a proxy event handler with an HTTP 1.0 server via an event-based parsing (EBP) system comprising generator, parser and composer components).]

  18. Visualization of Sedentary Behavior Using an Event-Based Approach

    Science.gov (United States)

    Loudon, David; Granat, Malcolm H.

    2015-01-01

    Visualization is commonly used in the interpretation of physical behavior (PB) data, either in conjunction with or as precursor to formal analysis. Effective representations of the data can enable the identification of patterns of behavior, and how they relate to the temporal context in a single day, or across multiple days. An understanding of…

  19. Relevant sampling applied to event-based state-estimation

    NARCIS (Netherlands)

    Marck, J.W.; Sijs, J.

    2010-01-01

    To reduce the amount of data transfer in networked control systems and wireless sensor networks, measurements are usually sampled only when an event occurs, rather than synchronous in time. Today's event sampling methodologies are triggered by the current value of the sensor. State-estimators are de
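
    A sketch of value-triggered ("send-on-delta") sampling, a common form of the event sampling this abstract refers to: the sensor transmits only when the reading moves more than delta from the last transmitted value. The threshold and test signal are illustrative:

        import math

        def send_on_delta(samples, delta):
            last_sent = None
            for t, x in enumerate(samples):
                if last_sent is None or abs(x - last_sent) > delta:
                    last_sent = x
                    yield t, x               # only these samples cross the network

        signal = [math.sin(0.05 * t) for t in range(200)]
        events = list(send_on_delta(signal, delta=0.2))
        print(f"{len(events)} of {len(signal)} samples transmitted")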

  20. Transfer of Trust in Event-based Reputation Systems

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl

    2012-01-01

    In the Global Computing scenario, trust-based systems have been proposed and studied as an alternative to traditional security mechanisms. A promising line of research concerns the so-called reputation-based computational trust. The approach here is that trust in a computing agent is defined ... choice of model from concurrency theory. In this paper, we continue this line of research, addressing the problem of how to transfer trust from one behavioural context to another. Our proposed frameworks build on morphisms between event structures, and we prove some generic results guaranteeing formal properties of transfers in the frameworks.

  1. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow specification, distributed execution and verification of pervasive event-based ... defined and executed in a standard web browser.

  2. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    ... relationships, i.e., via systems for computational trust. We focus here on systems where trust in a computational entity is interpreted as the expectation of certain future behaviour based on behavioural patterns of the past, and concern ourselves with the foundations of such probabilistic systems. In particular, we aim at establishing formal probabilistic models for computational trust and their fundamental properties. In the paper we define a mathematical measure for quantitatively comparing the effectiveness of probabilistic computational trust systems in various environments. Using it, we compare some of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible ...

  3. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow specification, distributed execution and verification of pervasive event-based ...

  4. Qualitative Event-Based Fault Isolation under Uncertain Observations

    Science.gov (United States)

    2014-10-02

    ... supplies electrical power to several loads, transmitted through several circuit breakers (CB236, CB262, CB266, and CB280) and relays (EY244, EY260, EY281 ... circuit breakers ("ESH", "ISH"). Finally, there is one sensor to report the operating state of a load (fan speed, "ST") and another to report the ... slope (rather than no change in slope) was incorrect, and so failures in the circuit breakers and relays become more likely). In this case, no ...

  5. Event-based image recognition applied in tennis training assistance

    Science.gov (United States)

    Wawrzyniak, Zbigniew M.; Kowalski, Adam

    2016-09-01

    This paper presents a concept for a real-time system for individual tennis training assistance. The system is intended to provide the user (player) with information on stroke accuracy as well as other training quality parameters, such as the velocity and rotation of the ball during its flight. The method is based on image processing techniques equipped with an explorative analysis of events and their description by movement parameters. The concept is presented for further deployment towards a complete system that could assist a tennis player during individual training.

  6. Event-based simulation of neutron interferometry experiments

    CERN Document Server

    De Raedt, Hans; Michielsen, Kristel

    2012-01-01

    A discrete-event approach, which has already been shown to give a cause-and-effect explanation of many quantum optics experiments, is applied to single-neutron interferometry experiments. The simulation algorithm yields a logically consistent description in terms of individual neutrons and does not require the knowledge of the solution of a wave equation. It is shown that the simulation method reproduces the results of several single-neutron interferometry experiments, including experiments which, in quantum theoretical language, involve entanglement. Our results demonstrate that classical (non-Hamiltonian) systems can exhibit correlations which in quantum theory are associated with interference and entanglement, also when all particles emitted by the source are accounted for.

  7. Relevant sampling applied to event-based state-estimation

    NARCIS (Netherlands)

    Marck, J.W.; Sijs, J.

    2010-01-01

    To reduce the amount of data transfer in networked control systems and wireless sensor networks, measurements are usually sampled only when an event occurs, rather than synchronous in time. Today's event sampling methodologies are triggered by the current value of the sensor. State-estimators are de

  8. An Event-based Approach to Hybrid Systems Diagnosability

    Data.gov (United States)

    National Aeronautics and Space Administration — Diagnosability is an important issue in the design of diagnostic systems, because it helps identify whether sufficient information is available to distinguish all...

  9. Event-based text mining for biology and functional genomics.

    Science.gov (United States)

    Ananiadou, Sophia; Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B

    2015-05-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of 'events', i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research.
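
    A minimal sketch of the 'event' representation described above: a categorised, structured relation over text-bound entities (a trigger plus typed arguments), in the spirit of the BioNLP shared-task format. Class and field names are illustrative, not this article's schema:

        from dataclasses import dataclass, field

        @dataclass
        class Entity:
            id: str
            type: str                  # e.g. "Protein"
            text: str
            span: tuple[int, int]      # character offsets in the source text

        @dataclass
        class Event:
            id: str
            type: str                  # e.g. "Phosphorylation"
            trigger: Entity            # the word(s) expressing the reaction
            args: dict[str, Entity] = field(default_factory=dict)   # role -> participant

        p53 = Entity("T1", "Protein", "p53", (18, 21))
        trig = Entity("T2", "Trigger", "phosphorylation", (0, 15))
        event = Event("E1", "Phosphorylation", trig, {"Theme": p53})
        print(event.type, "of", event.args["Theme"].text)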

  10. Event based state estimation with time synchronous updates

    NARCIS (Netherlands)

    Sijs, J.; Lazar, M.

    2012-01-01

    To reduce the amount of data transfer in networked systems, measurements are usually taken only when an event occurs rather than at each synchronous sample instant. However, this complicates estimation problems considerably, especially in the situation when no measurement is received anymore. The go

  11. An event-based architecture for solving constraint satisfaction problems

    Science.gov (United States)

    Mostafa, Hesham; Müller, Lorenz K.; Indiveri, Giacomo

    2015-12-01

    Constraint satisfaction problems are ubiquitous in many domains. They are typically solved using conventional digital computing architectures that do not reflect the distributed nature of many of these problems, and are thus ill-suited for solving them. Here we present a parallel analogue/digital hardware architecture specifically designed to solve such problems. We cast constraint satisfaction problems as networks of stereotyped nodes that communicate using digital pulses, or events. Each node contains an oscillator implemented using analogue circuits. The non-repeating phase relations among the oscillators drive the exploration of the solution space. We show that this hardware architecture can yield state-of-the-art performance on random SAT problems under reasonable assumptions on the implementation. We present measurements from a prototype electronic chip to demonstrate that a physical implementation of the proposed architecture is robust to practical non-idealities and to validate the theory proposed.

  12. Technical Evaluation of the NASA Model for Cancer Risk to Astronauts Due to Space Radiation

    Science.gov (United States)

    2012-01-01

    The approach to estimating risk and uncertainty in the proposed model is broadly similar to that used for the current (2005) NASA model and is based on recommendations by the National Council on Radiation Protection and Measurements (NCRP, 2000, 2006). However, NASA's proposed model has significant changes with respect to the following: the integration of new findings and methods into its components by taking into account newer epidemiological data and analyses, new radiobiological data indicating that quality factors differ for leukemia and solid cancers, an improved method for specifying quality factors in terms of radiation track structure concepts as opposed to the previous approach based on linear energy transfer, the development of a new solar particle event (SPE) model, and the updates to galactic cosmic ray (GCR) and shielding transport models. The newer epidemiological information includes updates to the cancer incidence rates from the life span study (LSS) of the Japanese atomic bomb survivors (Preston et al., 2007), transferred to the U.S. population and converted to cancer mortality rates from U.S. population statistics. In addition, the proposed model provides an alternative analysis applicable to lifetime never-smokers (NSs). Details of the uncertainty analysis in the model have also been updated and revised. NASA's proposed model and associated uncertainties are complex in their formulation and as such require a very clear and precise set of descriptions. The committee found the 2011 NASA report challenging to review largely because of the lack of clarity in the model descriptions and derivation of the various parameters used. The committee requested some clarifications from NASA throughout its review and was able to resolve many, but not all, of the ambiguities in the written description.

  14. Risks of advanced technology - Nuclear: risk comparison

    Energy Technology Data Exchange (ETDEWEB)

    Latarjet, R. (Institut du Radium, Orsay (France))

    The author presents a general definition of the concept of risk and makes a distinction between the various types of risk - the absolute and the relative; the risk for oneself and for others. The quantitative comparison of risks presupposes their "interchangeability". In the case of major risks in the long term - or genotoxic risks - there is a certain degree of interchangeability which makes this quantitative comparison possible. It is expressed by the concept of rad-equivalence, which the author defines and explains, giving as a concrete example the work conducted on ethylene and ethylene oxide.

  15. An operational procedure for rapid flood risk assessment in Europe

    Science.gov (United States)

    Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc

    2017-07-01

    The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructure and cities. Extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimates. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.
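
    A schematic sketch (with placeholder arrays and a hypothetical linear depth-damage function) of the mapping-and-overlay step described above: the forecast return period selects a pre-computed hazard map, which is then combined with exposure layers:

        import numpy as np

        def select_hazard_map(catalogue: dict, return_period: int) -> np.ndarray:
            # Pick the pre-simulated flood-depth map (metres) closest to the forecast.
            closest = min(catalogue, key=lambda rp: abs(rp - return_period))
            return catalogue[closest]

        def impacts(depth, population, asset_value, damage_per_metre=0.25):
            flooded = depth > 0.0
            damage_frac = np.clip(depth * damage_per_metre, 0.0, 1.0)
            return flooded.sum(), population[flooded].sum(), (asset_value * damage_frac).sum()

        # Hypothetical 100x100 map catalogue and uniform exposure layers.
        rng = np.random.default_rng(42)
        catalogue = {rp: np.maximum(rng.random((100, 100)) * rp / 25 - 1.0, 0.0)
                     for rp in (10, 50, 100)}
        depth = select_hazard_map(catalogue, return_period=40)
        cells, people, damage = impacts(depth, np.full((100, 100), 5.0),
                                        np.full((100, 100), 1e4))
        print(f"{cells} flooded cells, {people:.0f} people, EUR {damage:,.0f} damage")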

  16. An experimental system for flood risk forecasting at global scale

    Science.gov (United States)

    Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.

    2016-12-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real-time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude, the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructures and cities. To further increase the reliability of the proposed methodology, we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.
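
    A complementary sketch, again with invented numbers: GloFAS-style forecasts are ensembles, so a natural trigger for the impact assessment is the fraction of ensemble members exceeding a flood threshold. The probability cut-off below is an assumption, not the system's actual setting.

        import numpy as np

        def exceedance_probability(ensemble: np.ndarray, threshold: float) -> float:
            """Fraction of ensemble members at or above the flood threshold."""
            return float((ensemble >= threshold).mean())

        ensemble_forecast = np.array([900., 1250., 1400., 800., 1600., 1100.])  # m^3/s
        p = exceedance_probability(ensemble_forecast, threshold=1200.0)
        if p >= 0.5:  # assumed probability cut-off for issuing an impact forecast
            print(f"Trigger rapid impact assessment (P = {p:.2f})")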

  17. Risk Characterization Handbook

    Science.gov (United States)

    This Handbook has two parts. The first is the Risk Characterization guidance itself. The second part comprises the Appendices which contain the Risk Characterization Policy, the risk characterization case studies and references.

  18. Cardiovascular risk calculation

    African Journals Online (AJOL)

    James A. Ker

    2014-08-20

    ... combined effects of several risk factors. Risk prediction ... but they have high intrinsic variance for the prediction of risk when applied to a given ... motivating patients to improve their vascular age by better adhering to therapy.

  19. Risk Factors for Scleroderma

    Science.gov (United States)

    The cause of scleroderma is ... what biological factors contribute to scleroderma pathogenesis. Genetic Risk: Scleroderma does not tend to run in families ...

  20. Nanoparticles: Uncertainty Risk Analysis

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Hansen, Steffen Foss; Baun, Anders

    2012-01-01

    Scientific uncertainty plays a major role in assessing the potential environmental risks of nanoparticles. Moreover, there is uncertainty within fundamental data and information regarding the potential environmental and health risks of nanoparticles, hampering risk assessments based on standard a...

  1. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Contents: Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index.

  2. RISK MANAGEMENT IN PHARMACEUTICALS

    Directory of Open Access Journals (Sweden)

    V. SIVA RAMA KRISHNA

    2014-04-01

    Objective: To review the risks in pharmaceutical industries and the risk management process and tools. There is always risk in anything we do, and all industries perform actions that involve risk; risk is only dangerous when there is no anticipation to manage it. Risks that are assessed and controlled properly protect industries against failure and make them stronger. Risk should not be treated as inherently bad, but assessed as an opportunity to make operations resilient through proper management. Risk management can protect industries from disasters, whether natural or man-made. The impact of a risk should be assessed in order to plan alternatives and minimize its effect. Risk in the pharmaceutical industry is very high because it involves research, money and health; the impact is severe and risks materialize often. Risk management plans and control measures help companies perform better in times of uncertainty and create opportunities to turn risks into benefits that maximize quality. Materials and Methods: The information was collected and compiled from scientific literature in different databases, articles and books. Results: The risk management process and tools help to minimize risk and its effects. Conclusion: Risk management is at the core of any organization and should be part of its culture; it is a wise investment if properly processed.
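
    One widely used tool in this family is failure mode and effects analysis (FMEA) with a risk priority number. A minimal sketch follows, with invented scales and failure modes rather than any particular company's register.

        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            name: str
            severity: int       # 1 (negligible) .. 10 (catastrophic)
            occurrence: int     # 1 (rare) .. 10 (frequent)
            detectability: int  # 1 (always detected) .. 10 (undetectable)

            @property
            def rpn(self) -> int:
                """Risk priority number: higher means mitigate first."""
                return self.severity * self.occurrence * self.detectability

        register = [
            FailureMode("API cross-contamination", severity=9, occurrence=2, detectability=4),
            FailureMode("Label mix-up", severity=7, occurrence=3, detectability=2),
        ]
        for fm in sorted(register, key=lambda m: m.rpn, reverse=True):
            print(fm.name, fm.rpn)  # -> API cross-contamination 72, Label mix-up 42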

  3. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: an up-to-date presentation of how to understand, define and …

  4. On portfolio risk diversification

    Science.gov (United States)

    Takada, Hellinton H.; Stern, Julio M.

    2017-06-01

    The first portfolio risk diversification strategy was put into practice by the All Weather fund in 1996. The idea of risk diversification is related to the risk contribution of each available asset class or investment factor to the total portfolio risk. The maximum diversification, or risk parity allocation, is achieved when the set of risk contributions is given by a uniform distribution. Meucci (2009) introduced the maximization of the Rényi entropy as part of a leverage-constrained optimization problem to achieve such diversified risk contributions when dealing with uncorrelated investment factors. A generalization of risk parity is risk budgeting, where there is a prior for the distribution of the risk contributions. Our contribution is the generalization of the existing optimization frameworks to be able to solve the risk budgeting problem. In addition, our framework does not possess any leverage constraint.
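
    The risk-contribution decomposition behind risk parity is easy to state in code. The sketch below, using a toy two-asset covariance matrix rather than any data from the paper, computes each asset's contribution to portfolio volatility and solves numerically for the weights that equalize the contributions.

        import numpy as np
        from scipy.optimize import minimize

        cov = np.array([[0.04, 0.006],
                        [0.006, 0.01]])  # toy covariance of two asset classes

        def risk_contributions(w: np.ndarray, cov: np.ndarray) -> np.ndarray:
            """RC_i = w_i (Sigma w)_i / sigma_p; contributions sum to sigma_p."""
            sigma_p = np.sqrt(w @ cov @ w)
            return w * (cov @ w) / sigma_p

        def parity_objective(w: np.ndarray) -> float:
            rc = risk_contributions(w, cov)
            return float(((rc - rc.mean()) ** 2).sum())  # zero at equal contributions

        res = minimize(parity_objective, x0=np.array([0.5, 0.5]),
                       bounds=[(0.0, 1.0)] * 2,
                       constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
        print(res.x, risk_contributions(res.x, cov))

    A risk-budgeting variant would replace rc.mean() with a prior vector of target contributions, which is the generalization the abstract refers to.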

  5. Assessment of cardiovascular risk.

    LENUS (Irish Health Repository)

    Cooney, Marie Therese

    2010-10-01

    Atherosclerotic cardiovascular disease (CVD) is the most common cause of death worldwide. Usually atherosclerosis is caused by the combined effects of multiple risk factors. For this reason, most guidelines on the prevention of CVD stress the assessment of total CVD risk. The most intensive risk factor modification can then be directed towards the individuals who will derive the greatest benefit. To assist the clinician in calculating the effects of these multiple interacting risk factors, a number of risk estimation systems have been developed. This review addresses several issues regarding total CVD risk assessment: Why should total CVD risk be assessed? What risk estimation systems are available? How well do these systems estimate risk? What are the advantages and disadvantages of the current systems? What are the current limitations of risk estimation systems and how can they be resolved? What new developments have occurred in CVD risk estimation?
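
    For illustration only: multi-factor risk engines of this kind typically combine risk factors through a regression link such as the logistic function below. The coefficients are invented and do not correspond to SCORE, Framingham or any validated system.

        import math

        def ten_year_cvd_risk(age: float, sbp: float, total_chol: float, smoker: bool) -> float:
            """Toy logistic risk engine; coefficients are illustrative placeholders."""
            z = -12.0 + 0.08 * age + 0.015 * sbp + 0.2 * total_chol + 0.7 * smoker
            return 1.0 / (1.0 + math.exp(-z))  # log-odds -> probability

        # Example: a 60-year-old smoker with SBP 150 mmHg and cholesterol 6.0 mmol/L
        print(f"{ten_year_cvd_risk(age=60, sbp=150, total_chol=6.0, smoker=True):.1%}")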

  6. Pipeline risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kariyawasam, S. [TransCanada PipeLines Ltd., Calgary, AB (Canada); Weir, D. [Enbridge Pipelines Inc., Calgary, AB (Canada)] (comps.)

    2009-07-01

    Risk assessments and risk analysis are system-wide activities that include site-specific risk and reliability-based decision-making, implementation, and monitoring. This working group discussed the risk management process in the pipeline industry, including reliability-based integrity management and risk control processes. Attendees discussed reliability-based decision support and performance measurements designed to support corporate risk management policies. New developments and technologies designed to optimize risk management procedures were also presented. The group was divided into 3 sessions: (1) current practice, strengths and limitations of system-wide risk assessments for facility assets; (2) accounting for uncertainties to assure safety; and (3) reliability-based excavation repair criteria and removing potentially unsafe corrosion defects. Presentations of risk assessment procedures used at various companies were given. The role of regulators, best practices, and effective networking environments in ensuring the success of risk assessment policies was discussed. Risk assessment models were also reviewed.

  7. VALUE AT RISK - CORPORATE RISK MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Anis Cecilia-Nicoleta

    2011-12-01

    The notion of 'risk' is used in a number of sciences. The Faculty of Law studies risk depending on its legality. Accident theory applies the term to describe damage and disasters. One can find studies on risk in works of psychology, philosophy and medicine, and within each of these areas the study of risk is based on the given science's subject and, of course, on its methods and approaches. Such a variety of risk study is explained by the diversity of this phenomenon. Under market economy conditions, risk is an essential component of any economic agent's management policy, of the approach it develops, a strategy that depends almost entirely on the individual ability and capacity to anticipate its evolution and exploit its opportunities, while assuming a so-called 'risk of business failure'. There are several ways to measure the risks in projects, one of the most used being Value at Risk (VaR). Value at Risk was made famous by JP Morgan in the mid 1990s through the introduction of the RiskMetrics approach, and has since been sanctioned by several governing bodies throughout the banking world. In short, it measures the risk capital of holdings over a given period at a certain probability of loss. This measurement can be modified for risk applications, for example by stating potential loss values over a certain amount of time during the economic life of the project; clearly, a project with a lower VaR is better. It should be noted that it is not always possible or advisable for a company to limit itself to the isolated analysis of each risk, because risks and their effects are interdependent and constitute a system. In addition, there are risks which, in combination with other risks, tend to produce effects which they would not have caused by themselves, and risks that tend to offset and even cancel each other out.
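
    The two textbook VaR estimators behind the abstract's description can be sketched in a few lines; the return series and portfolio value below are simulated placeholders.

        import numpy as np

        rng = np.random.default_rng(0)
        returns = rng.normal(0.0005, 0.01, 1000)  # simulated daily returns
        value = 1_000_000.0                       # portfolio value

        # Parametric (variance-covariance) 95% VaR under a normal assumption
        z95 = 1.645
        var_parametric = value * (z95 * returns.std() - returns.mean())

        # Historical-simulation 95% VaR: loss at the 5th percentile of returns
        var_historical = -value * np.quantile(returns, 0.05)

        print(f"parametric: {var_parametric:,.0f}  historical: {var_historical:,.0f}")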

  8. Trading off dietary choices, physical exercise and cardiovascular disease risks.

    Science.gov (United States)

    Grisolía, José M; Longo, Alberto; Boeri, Marco; Hutchinson, George; Kee, Frank

    2013-09-01

    Despite several decades of decline, cardiovascular diseases are still the most common causes of death in Western societies. Sedentary living and high fat diets contribute to the prevalence of cardiovascular diseases. This paper analyses the trade-offs between lifestyle choices defined in terms of diet, physical activity, cost, and risk of cardiovascular disease that a representative sample of the population of Northern Ireland aged 40-65 are willing to make. Using computer assisted personal interviews, we survey 493 individuals at their homes using a Discrete Choice Experiment (DCE) questionnaire administered between February and July 2011 in Northern Ireland. Unlike most DCE studies for valuing public health programmes, this questionnaire uses a tailored exercise, based on the individuals' baseline choices. A "fat screener" module in the questionnaire links personal cardiovascular disease risk to each specific choice set in terms of dietary constituents. Individuals are informed about their real status quo risk of a fatal cardiovascular event, based on an initial set of health questions. Thus, actual risks, real diet and exercise choices are the elements that constitute the choice task. Our results show that our respondents are willing to pay for reducing mortality risk and, more importantly, are willing to change physical exercise and dietary behaviours. In particular, we find that to improve their lifestyles, overweight and obese people would be more likely to do more physical activity than to change their diets. Therefore, public policies aimed to target obesity and its related illnesses in Northern Ireland should invest public money in promoting physical activity rather than healthier diets.

  9. Identifying and Managing Risk.

    Science.gov (United States)

    Abraham, Janice M.

    1999-01-01

    The role of the college or university chief financial officer in institutional risk management is (1) to identify risk (physical, casualty, fiscal, business, reputational, workplace safety, legal liability, employment practices, general liability), (2) to develop a campus plan to reduce and control risk, (3) to transfer risk, and (4) to track and…

  10. Entrepreneurs Facing Risk

    DEFF Research Database (Denmark)

    Zichella, Giulio; Reichstein, Toke

    … to choose risk vis-à-vis certainty. Drawing on prospect theory, we formulate hypotheses about the greater likelihood that entrepreneurs (compared to others) will choose risk immediately after a positive gain, but will shy away from risk compared to others as the degree of risk increases. The hypotheses …

  11. Credit Risk Management

    Directory of Open Access Journals (Sweden)

    Viorica IOAN

    2012-11-01

    The bank is exposed to credit risk, the risk of not being able to recover debtor claims arising from the activity of granting loans to its clientele. Credit risk may also manifest due to investments in other local and foreign credit institutions. Credit risk may be minimized through the careful evaluation of credit applicants, through their monitoring over the duration of the loan, and through the establishment of risk exposure limits and significant risk margins, as well as an acceptable balance between risk and profit.

  12. Risk Measures and Risk Capital Allocation

    OpenAIRE

    Karabey, U.

    2014-01-01

    Risk Measures and Risk Capital Allocation. Measuring and allocating risk are fundamental problems encountered in portfolio and risk management. Many risk measures offer solutions for quantifying risk, but in recent years risk measures have been strongly criticized and a new approach known as coherent risk measures has emerged. At the same time, risk capital allocation distributes the benefits arising from diversification among the units that make up the portfolio. In this study, risk measurement and risk capital allocation methods …

  13. Risk Management In SMEs

    OpenAIRE

    Agarwal, Riya

    2010-01-01

    Small and medium enterprises are the backbone of economic development. They provide employment, contribute to GDP and are an important source of revenue, especially in India, where they employ approximately 30 million people and generate 40% of the export surplus. However, SMEs face many operational risks: credit risk, liquidity risk, foreign exchange risk, interest rate risk, and competition from MNCs and foreign buyers. Despite the failure of several SMEs, those pul...

  14. Issues for Simulation of Galactic Cosmic Ray Exposures for Radiobiological Research at Ground Based Accelerators

    Directory of Open Access Journals (Sweden)

    Myung-Hee Y Kim

    2015-06-01

    For research on the health risks of galactic cosmic rays (GCR), ground-based accelerators have been used for radiobiology research with mono-energetic beams of single high charge, Z, and energy, E (HZE) particles. In this paper we consider the pros and cons of a GCR reference field at a particle accelerator. At the NASA Space Radiation Laboratory (NSRL) we have proposed a GCR simulator, which implements a new rapid switching mode and higher-energy beam extraction to 1.5 GeV/u, in order to integrate multiple ions into a single simulation within hours or longer for chronic exposures. After considering the GCR environment and energy limitations of NSRL, we performed extensive simulation studies using the stochastic transport code GERMcode (GCR Event Risk Model) to define a GCR reference field using 9 HZE particle beam-energy combinations, each with a unique absorber thickness to provide fragmentation, and 10 or more energies of proton and 4He beams. The reference field is shown to represent well the charge dependence of GCR dose in several energy bins behind shielding compared to a simulated GCR environment. However, a more significant challenge for space radiobiology research is to consider chronic GCR exposure of up to 3 years in relation to simulations with animal models of human risks. We discuss issues in approaches to map important biological time scales in experimental models using ground-based simulation with extended exposure of up to a few weeks using chronic or fractionation exposures. A kinetics model of HZE particle hit probabilities suggests that experimental simulations of several weeks will be needed to avoid high fluence rate artifacts, which places limitations on the experiments to be performed. Ultimately risk estimates are limited by theoretical understanding, and focus on improving understanding of mechanisms and development of experimental models to improve this understanding should remain the highest priority for space radiobiology research.
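
    The hit-kinetics argument can be illustrated with a simple Poisson model: if a cell nucleus of cross-sectional area A sees a fluence rate phi, the expected number of traversals in t days is phi*A*t. The numbers below are rough assumptions for low-fluence HZE particles, not values taken from the paper or from GERMcode.

        import math

        AREA_UM2 = 100.0         # assumed nuclear cross-section (um^2)
        FLUX_PER_UM2_DAY = 4e-5  # assumed HZE fluence rate (particles/um^2/day)

        def p_at_least_one_hit(days: float) -> float:
            """Poisson probability that a nucleus is traversed at least once."""
            mean_hits = FLUX_PER_UM2_DAY * AREA_UM2 * days
            return 1.0 - math.exp(-mean_hits)

        for d in (1, 7, 30, 180):
            print(f"{d:>3} days: P(hit) = {p_at_least_one_hit(d):.3f}")

    With numbers of this order, only a small fraction of nuclei are hit per day, which is why acute exposures of minutes compress dose-rate effects that would unfold over weeks in flight.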

  15. Raising risk preparedness through flood risk communication

    Science.gov (United States)

    Maidl, E.; Buchecker, M.

    2014-01-01

    During the last decade, most European countries have produced risk maps of natural hazards, but little is known about how to communicate these maps most effectively to the public. In October 2011, Zurich's local authorities informed owners of buildings located in the urban flood hazard area about potential flood damage, the probability of flood events and protection measures. The campaign was based on the assumptions that informing citizens increases their risk awareness and that citizens who are aware of risks are more likely to undertake appropriate actions to protect themselves and their property. This study is intended as a contribution to a better understanding of the factors influencing flood risk preparedness, with a special focus on the effects of such a one-way risk communication strategy. We conducted a standardized mail survey of 1500 property owners in the hazard areas in Zurich. The questionnaire comprised items measuring respondents' risk awareness, risk preparedness, flood experience, information seeking behaviour, knowledge about flood risk, evaluation of the information material, risk acceptance, kind of property owned, attachment to the property, trust in local authorities, and socio-demographic variables. Multivariate data analysis revealed that the average level of risk awareness and preparedness was low, but our results confirmed that the campaign had a statistically significant effect on the level of preparedness. The main factors influencing the respondents' intention to prepare for a flood were the extent to which they evaluated the information material positively and their risk awareness. Those who had never taken any interest in floods previously were less likely to read the material. For future campaigns, we therefore recommend repeated communication of relevant information tailored to the needs of the target population.

  16. NASA study of cataract in astronauts (NASCA). Report 1: Cross-sectional study of the relationship of exposure to space radiation and risk of lens opacity.

    Science.gov (United States)

    Chylack, Leo T; Peterson, Leif E; Feiveson, Alan H; Wear, Mary L; Manuel, F Keith; Tung, William H; Hardy, Dale S; Marak, Lisa J; Cucinotta, Francis A

    2009-07-01

    The NASA Study of Cataract in Astronauts (NASCA) is a 5-year longitudinal study of the effect of space radiation exposure on the severity/progression of nuclear, cortical and posterior subcapsular (PSC) lens opacities. Here we report on baseline data that will be used over the course of the longitudinal study. Participants include 171 consenting astronauts who flew at least one mission in space and a comparison group made up of three components: (a) 53 astronauts who had not flown in space, (b) 95 military aircrew personnel, and (c) 99 non-aircrew ground-based comparison subjects. Continuous measures of nuclear, cortical and PSC lens opacities were derived from Nidek EAS 1000 digitized images. Age, demographics, general health, nutritional intake and solar ocular exposure were measured at baseline. Astronauts who flew at least one mission were matched to comparison subjects using propensity scores based on demographic characteristics and medical history stratified by gender and smoking (ever/never). The cross-sectional data for matched subjects were analyzed by fitting customized non-normal regression models to examine the effect of space radiation on each measure of opacity. The variability and median of cortical cataracts were significantly higher for exposed astronauts than for nonexposed astronauts and comparison subjects with similar ages (P=0.015). Galactic cosmic radiation (GCR) may be linked to increased PSC area (P=0.056) and the number of PSC centers (P=0.095). Within the astronaut group, PSC size was greater in subjects with higher space radiation doses (P=0.016). No association was found between space radiation and nuclear cataracts. Cross-sectional data analysis revealed a small deleterious effect of space radiation for cortical cataracts and possibly for PSC cataracts. These results suggest increased cataract risks at smaller radiation doses than have been reported previously.

  17. Probabilistic risk analysis and terrorism risk.

    Science.gov (United States)

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  18. Risk assessment and risk management of mycotoxins.

    Science.gov (United States)

    2012-01-01

    Risk assessment is the process of quantifying the magnitude and exposure, or probability, of a harmful effect to individuals or populations from certain agents or activities. Here, we summarize the four steps of risk assessment: hazard identification, dose-response assessment, exposure assessment, and risk characterization. Risk assessments using these principles have been conducted on the major mycotoxins (aflatoxins, fumonisins, ochratoxin A, deoxynivalenol, and zearalenone) by various regulatory agencies for the purpose of setting food safety guidelines. We critically evaluate the impact of these risk assessment parameters on the estimated global burden of the associated diseases as well as the impact of regulatory measures on food supply and international trade. Apart from the well-established risk posed by aflatoxins, many uncertainties still exist about risk assessments for the other major mycotoxins, often reflecting a lack of epidemiological data. Differences exist in the risk management strategies and in the ways different governments impose regulations and technologies to reduce levels of mycotoxins in the food-chain. Regulatory measures have very little impact on remote rural and subsistence farming communities in developing countries, in contrast to developed countries, where regulations are strictly enforced to reduce and/or remove mycotoxin contamination. However, in the absence of the relevant technologies or the necessary infrastructure, we highlight simple intervention practices to reduce mycotoxin contamination in the field and/or prevent mycotoxin formation during storage.
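
    The exposure-assessment step described above reduces, in its simplest form, to summing concentration times consumption over foods and comparing the result with a health-based guidance value. In the sketch below, the concentrations, portions and tolerable daily intake are illustrative placeholders, not regulatory figures.

        # mycotoxin concentration (ug/kg) and daily consumption (kg) per food
        foods = {
            "maize": (2.0, 0.15),
            "peanuts": (5.0, 0.02),
        }
        body_weight_kg = 60.0
        tdi_ug_per_kg_bw = 0.017  # assumed tolerable daily intake

        exposure = sum(c * q for c, q in foods.values()) / body_weight_kg
        verdict = "exceeds" if exposure > tdi_ug_per_kg_bw else "is below"
        print(f"exposure {exposure:.4f} ug/kg bw/day {verdict} the TDI")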

  19. Risk assessment terminology: risk communication part 1

    Directory of Open Access Journals (Sweden)

    Gaetano Liuzzo

    2016-03-01

    The paper describes the terminology of risk communication from the perspective of food safety: stakeholder theory, citizens' involvement, and community interest and consultation are reported. Different aspects of risk communication (public communication, scientific uncertainty, trust, care, consensus and crisis communication) are discussed.

  20. Decreasing Relative Risk Premium

    DEFF Research Database (Denmark)

    Hansen, Frank

    We consider the risk premium demanded by a decision maker with wealth x in order to be indifferent between obtaining a new level of wealth y1 with certainty, or participating in a lottery which either results in unchanged present wealth or a level of wealth y2 > y1. We define the relative risk premium as the quotient between the risk premium and the increase in wealth y1–x which the decision maker puts on the line by choosing the lottery in place of receiving y1 with certainty. We study preferences such that the relative risk premium is a decreasing function of present wealth, and we determine the set of associated utility functions. We show that decreasing relative risk premium in the small implies decreasing relative risk premium in the large, and decreasing relative risk premium everywhere implies risk aversion. We finally show that preferences with decreasing relative risk premium may be equivalently expressed in terms of certain preferences on risky …
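
    The verbal definition compresses to a single expression (notation ours, sketched from the abstract):

        % pi(x, y_1): risk premium making the decision maker indifferent between
        % y_1 for sure and the lottery over {x, y_2}; y_1 - x: wealth at stake.
        \[
          \mathrm{RRP}(x, y_1) \;=\; \frac{\pi(x, y_1)}{y_1 - x},
          \qquad x < y_1 < y_2 .
        \]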

  1. Diversity in Risk Communication

    Directory of Open Access Journals (Sweden)

    Agung Nur Probohudono

    2013-03-01

    This study analyses the communication of the five major categories of risk (business, strategy, operating, market and credit risk) disclosure over the volatile 2007-2009 Global Financial Crisis (GFC) time period in key South East Asian countries' manufacturing listed companies. This study is important as it contributes to the literature by providing insights into voluntary risk disclosure practices using sample countries with different economic scenarios. Key findings are that business risk is the most disclosed category and strategy risk is the least disclosed. Business and credit risk disclosure consistently increase over the three-year period, while operating, market and strategy risk disclosure increase in 2008, but then decrease slightly in 2009. Statistical analysis reveals that country of incorporation and size help predict risk disclosure levels. The overall low disclosure levels (26-29%) highlight the potential for far higher communication of key risk factors.

  2. Preferences over Social Risk

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Lau, Morten; Rutström, E. Elisabet;

    2013-01-01

    We elicit individual preferences over social risk. We identify the extent to which these preferences are correlated with preferences over individual risk and the well-being of others. We examine these preferences in the context of laboratory experiments over small, anonymous groups, although the methodological issues extend to larger groups that form endogenously (e.g., families, committees, communities). Preferences over social risk can be closely approximated by individual risk attitudes when subjects have no information about the risk preferences of other group members. We find no evidence that subjects systematically reveal different risk attitudes in a social setting with no prior knowledge about the risk preferences of others compared to when they solely bear the consequences of the decision. However, we also find that subjects are significantly more risk averse when they know the risk …

  3. Risk, Resources and Structures

    DEFF Research Database (Denmark)

    Lyng Jensen, Jesper; Ponsaing, Claus Due; Thrane, Sof

    2012-01-01

    In this article, we describe a basic mechanism by which risk events can induce indirect value losses to the risk owner: a value loss arising from a risk event interfering with activities that have no logical connection with the risk event other than that of having the same owner; a mechanism we have named Structural Risk. This effect is caused by the occurrence of a resource fluctuation which challenges the risk owner's ability to gain control of adequate resources, thus forcing the risk owner to prioritize and terminate other activities and projects. In this process value is destroyed … and implications for ERM, suggesting the addition of a risk resource forecast and discussing implications for four types of risk mitigation strategies: capital requirements, risk diversification, network relations and insurance.

  4. Exploration Health Risks: Probabilistic Risk Assessment

    Science.gov (United States)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest among the human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short- and long-duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focussed on quantifying the probabilities of medical risks qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of …

  5. Risk perception and credibility of risk communication

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeberg, L

    1992-10-01

    Experts and the public frequently disagree when it comes to risk assessment. The reasons for such disagreement are discussed, and it is pointed out that disagreement among experts and lack of full understanding of real risks contribute to skepticism among the public. The notion that people in general react in a highly emotional, non-rational, phobic manner is rejected. The very conditions under which risk assessments are presented to the public, and common-sense cognitive dynamics, are better explanations of risk perception, as are some social psychological concepts. If trust is to be established in a country where it is quite low, some kind of politically regulated public influence on decision making and risk monitoring is probably needed, e.g. by means of a publicly elected and responsible ombudsman. 57 refs, 8 figs, 5 tabs.

  6. Raising risk preparedness by flood risk communication

    Science.gov (United States)

    Maidl, E.; Buchecker, M.

    2015-07-01

    During the last decade, most European countries have produced hazard maps of natural hazards, but little is known about how to communicate these maps most efficiently to the public. In October 2011, Zurich's local authorities informed owners of buildings located in the urban flood hazard zone about potential flood damage, the probability of flood events and protection measures. The campaign was based on the assumptions that informing citizens increases their risk awareness and that citizens who are aware of risks are more likely to undertake actions to protect themselves and their property. This study is intended as a contribution to a better understanding of the factors that influence flood risk preparedness, with a special focus on the effects of such a one-way risk communication strategy. We conducted a standardized mail survey of 1500 property owners in the hazard zones in Zurich (main survey response rate: 34%). The questionnaire included items to measure respondents' risk awareness, risk preparedness, flood experience, information-seeking behaviour, knowledge about flood risk, evaluation of the information material, risk acceptance, attachment to the property and trust in local authorities. Data about the type of property and socio-demographic variables were also collected. Multivariate data analysis revealed that the average level of risk awareness and preparedness was low, but the results confirmed that the campaign had a statistically significant effect on the level of preparedness. The main influencing factors on the intention to prepare for a flood were the extent to which respondents evaluated the information material positively as well as their risk awareness. Respondents who had never taken any previous interest in floods were less likely to read the material. For future campaigns, we therefore recommend repeated communication that is tailored to the information needs of the target population.

  7. Observations on risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, W.A. Jr.

    1979-11-01

    This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested.

  8. Model Building for the Behavior of Reporting Nursing Adverse Events Based on the Theory of Planned Behavior and Its Empirical Research%基于计划行为理论的护理人员不良事件报告行为模型构建及实证研究

    Institute of Scientific and Technical Information of China (English)

    曹志辉; 陈丽丽; 张金燕; 郑贺英; 陈桂芝

    2015-01-01

    Objective: To build a model of the behavior of reporting nursing adverse events guided by the theory of planned behavior, and to test it empirically. Methods: In April 2014, a questionnaire survey was conducted among 436 in-service nurses in a hospital in Hebei Province. The questionnaire covered the nurses' basic information, a scale of the behavior of reporting nursing adverse events, a scale of the intention to report nursing adverse events, a questionnaire on nurses' attitudes towards reporting nursing adverse events, a questionnaire on norms for reporting nursing adverse events, and a questionnaire on perceived behavioral control over reporting nursing adverse events. The theoretical model was tested using Spearman rank correlation and multiple linear regression. Results: The partial regression coefficients of attitude, descriptive norm and perceived behavioral control on the intention to report nursing adverse events were 0.278, 0.193 and 0.315, respectively, and the partial regression coefficient of reporting intention on reporting behavior was 0.496; all were statistically significant (P<0.05). Reporting intention mediated the effects of attitude, descriptive norm and perceived behavioral control on reporting behavior. Conclusion: The model of nurses' reporting of nursing adverse events based on the theory of planned behavior explains and predicts reporting behavior well; effective intervention strategies should address attitude, injunctive norms, descriptive norms and perceived behavioral control.
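
    As a hedged illustration of the mediation logic tested above (predictor -> intention -> behavior), the sketch below fits the two regressions on simulated data; variable names and effect sizes are invented, not the survey's.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 436
        attitude = rng.normal(size=n)
        intention = 0.3 * attitude + rng.normal(scale=0.8, size=n)
        behaviour = 0.5 * intention + rng.normal(scale=0.8, size=n)

        a_path = sm.OLS(intention, sm.add_constant(attitude)).fit()
        full = sm.OLS(behaviour,
                      sm.add_constant(np.column_stack([attitude, intention]))).fit()
        # Mediation is suggested when attitude predicts intention (a-path) and its
        # direct effect on behaviour shrinks once intention is controlled for.
        print(a_path.params, full.params)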

  9. Quantitative microbiological risk assessment.

    Science.gov (United States)

    Hoornstra, E; Notermans, S

    2001-05-21

    The production of safe food is being increasingly based on the use of risk analysis, and this process is now in use to establish national and international food safety objectives. It is also being used more frequently to guarantee that safety objectives are met and that such guarantees are achieved in a cost-effective manner. One part of the overall risk analysis procedure, risk assessment, is the scientific process in which the hazards and risk factors are identified, and the risk estimate or risk profile is determined. Risk assessment is an especially important tool for governments when food safety objectives have to be developed in the case of 'new' contaminants in known products or known contaminants causing trouble in 'new' products. Risk assessment is also an important approach for food companies (i) during product development, (ii) during (hygienic) process optimization, and (iii) as an extension (validation) of the more qualitative HACCP plan. This paper discusses these two different types of risk assessment, and uses probability distribution functions to assess the risks posed by Escherichia coli O157:H7 in each case. Such approaches are essential elements of risk management, as they draw on all available information to derive accurate and realistic estimations of the risk posed. The paper also discusses the potential of scenario analysis in simulating the impact of different or modified risk factors during the consideration of new or improved control measures.
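
    The probabilistic step the authors describe is typically a Monte Carlo draw over exposure feeding a dose-response curve. The sketch below uses a Beta-Poisson form with invented parameters and exposure distributions; it is not the paper's model.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        conc = rng.lognormal(mean=0.0, sigma=1.0, size=n)       # CFU/g in servings
        serving_g = rng.normal(50.0, 10.0, size=n).clip(min=0)  # grams per serving
        dose = conc * serving_g

        alpha, beta = 0.05, 1.0  # assumed Beta-Poisson dose-response parameters
        p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)

        print(f"mean per-serving risk of illness: {p_ill.mean():.3e}")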

  10. Risk and cognition

    CERN Document Server

    Faucher, Colette

    2015-01-01

    This book presents recent research using cognitive science to apprehend risk situations and elaborate new organizations, new systems and new methodological tools in response. The book demonstrates the reasons, advantages and implications of the association of the concepts of cognition and risk. It is shown that this association has strong consequences for how to apprehend critical situations that emerge within various activity domains, and how to elaborate responses to these critical situations. The following topics are covered by the book: the influence of culture on risk management; the influence of risk communication on risk management; user-centred design to improve risk situation management; designing new tools to assist risk situation management; and risk prevention in industrial activities.

  11. Risk a multidisciplinary introduction

    CERN Document Server

    Straub, Daniel; Welpe, Isabell

    2014-01-01

    This is a unique book addressing the integration of risk methodology from various fields. It stimulates intellectual debate and communication across disciplines, promotes better risk management practices and contributes to the development of risk management methodologies. Book chapters explain fundamental risk models and measurement, and address risk and security issues from diverse areas such as finance and insurance, health sciences, life sciences, engineering and information science. Integrated Risk Sciences is an emerging field that considers risks in different fields, aiming at a common language and at sharing and improving methods developed in different fields. Readers should have a Bachelor degree and at least one basic university course in statistics and probability. The main goal of the book is to provide basic knowledge on risk and security in a common language; the authors have taken particular care to ensure that each chapter can be understood by doctoral students and researchers across disciplin...

  12. Risk Factors and Prevention

    Science.gov (United States)

    Even people who look healthy and free of ... as possible.

  13. Depression and Suicide Risk

    Science.gov (United States)

    Depression and Suicide Risk (2014). Definition: A mood disorder that causes a persistent feeling of sadness and ... Prevalence: Ranges of lifetime risk for depression: from 6.7% overall to 40% in men, ...

  14. Decreasing relative risk premium

    DEFF Research Database (Denmark)

    Hansen, Frank

    2007-01-01

    We consider the risk premium demanded by a decision maker in order to be indifferent between obtaining a new level of wealth with certainty, or to participate in a lottery which either results in unchanged wealth or an even higher level than what can be obtained with certainty. We study preferences such that the corresponding relative risk premium is a decreasing function of present wealth, and we determine the set of associated utility functions. We find a new characterization of risk vulnerability and determine a large set of utility functions, closed under summation and composition, which are both risk vulnerable and have decreasing relative risk premium. We finally introduce the notion of partial risk neutral preferences on binary lotteries and show that partial risk neutrality is equivalent to preferences with decreasing relative risk premium.

  15. High-Risk Pregnancy

    Science.gov (United States)

    ... NICHD Research Information Clinical Trials Resources and Publications High-Risk Pregnancy: Condition Information Skip sharing on social media links Share this: Page Content A high-risk pregnancy refers to anything that puts the ...

  16. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Science.gov (United States)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.

  17. Measuring Systemic Risk

    DEFF Research Database (Denmark)

    Acharya, Viral V.; Heje Pedersen, Lasse; Philippon, Thomas

    2017-01-01

    We present an economic model of systemic risk in which undercapitalization of the financial sector as a whole is assumed to harm the real economy, leading to a systemic risk externality. Each financial institution's contribution to systemic risk can be measured as its systemic expected shortfall (SES), i.e., its propensity to be undercapitalized when the system as a whole is undercapitalized. … We demonstrate empirically the ability of components of SES to predict emerging systemic risk during the financial crisis of 2007–2009.
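
    One ingredient of SES, the marginal expected shortfall, is simply a firm's average return on the market's worst days; a sketch on simulated returns follows, with the 5% tail cut-off used as a common convention and the data entirely made up.

        import numpy as np

        rng = np.random.default_rng(1)
        market = rng.normal(0.0, 0.01, 2000)                # simulated market returns
        firm = 1.2 * market + rng.normal(0.0, 0.005, 2000)  # correlated firm returns

        cutoff = np.quantile(market, 0.05)                  # worst 5% of market days
        mes = firm[market <= cutoff].mean()
        print(f"MES: {mes:.4f}  (more negative = larger systemic contribution)")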

  18. Risks and perceptions

    Energy Technology Data Exchange (ETDEWEB)

    Lee, T.

    1987-11-01

    The article on the risks and perceptions of nuclear power was previously published in the Times Higher Education Supplement, May 1987. The public attitude towards risks associated with nuclear power, compared with other risks in everyday life, is examined. Results of psychological studies of the perceived risk of nuclear power are also discussed. The author argues that fear of nuclear catastrophe is not one which can be brushed aside by statistics or punditry.

  19. International Correlation Risk

    OpenAIRE

    Philippe Mueller; Andreas Stathopoulos; Andrea Vedolin

    2012-01-01

    Foreign exchange correlation is a key driver of risk premia in the cross-section of carry trade returns. First, we show that the correlation risk premium, defined as the difference between the risk-neutral and objective measure correlation, is large (15% per year) and highly time-varying. Second, sorting currencies according to their exposure to correlation innovations yields portfolios with attractive risk and return characteristics. We also find that high (low) interest rate currencies have …

  20. Parametric Risk Parity

    OpenAIRE

    Lorenzo Mercuri; Edit Rroji

    2014-01-01

    Any optimization algorithm based on the risk parity approach requires the formulation of portfolio total risk in terms of marginal contributions. In this paper we use the independence of the underlying factors in the market to derive the centered moments required in the risk decomposition process when the modified versions of Value at Risk and Expected Shortfall are considered. The choice of the Mixed Tempered Stable distribution seems adequate for fitting skewed and heavy tailed distribution...

  1. Risk Analysis in Action

    Institute of Scientific and Technical Information of China (English)

    KYU-HWAN YANG

    2001-01-01

    Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).……

  2. Risk Analysis in Action

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).

  3. Risk Analysis in Action

    Institute of Scientific and Technical Information of China (English)

    KYU-HWAN YANG

    2001-01-01

    Risk analysis is a useful tool for making good decisions on the risks of certain potentially hazardous agents and suggests a safe margin through scientific processes using toxicological data, contaminant residue levels, statistical tools, exposure values and relevant variants. Risk managers consider scientific evidence and risk estimates, along with statutory, engineering, economic, social, and political factors, in evaluating alternative regulatory options and choosing among those options (NRC, 1983).

  4. IT Risk register

    OpenAIRE

    2011-01-01

    The theoretical part of the thesis analyzes several selected methodologies and best practices related to information technology risk management, with a focus on documents and guidance developed by ISACA. It builds a set of ideas and basic requirements for an effective model of an IT risk register. Strong emphasis is placed on mapping CobiT 4.1 based Risk IT to COBIT 5. The practical part describes the implementation of an exploratory web-based IT risk register in the Python programming language utilizing...

  5. Catastrophic Medical Expenditure Risk

    NARCIS (Netherlands)

    G. Flores (Gabriela); O.A. O'Donnell (Owen)

    2012-01-01

    Medical expenditure risk can pose a major threat to living standards. We derive decomposable measures of catastrophic medical expenditure risk from reference-dependent utility with loss aversion. We propose a quantile regression based method of estimating risk exposure from cross-section …

  6. Financial Risk Management

    OpenAIRE

    Catalin-Florinel Stanescu; Laurentiu Mircea Simion

    2011-01-01

    Concern about financial risk is increasing. In this climate, companies of all types and sizes want a robust framework for financial risk management that meets compliance requirements, contributes to better decision making and increases performance. Financial risk management professionals work with financial institutions and other corporate clients to achieve these objectives.

  8. Risk communication, risk perception, and public health.

    Science.gov (United States)

    Aakko, Eric

    2004-01-01

    Risk communication is about building trust while deploying an interactive and ongoing communication process in which audience members are active participants. This interactive participation may not solve a public health crisis, but it will help reduce unwarranted fear, anxiety and distrust. Consequently, if a government agency fails to understand how to effectively communicate about health risks, their trustworthiness and credibility may suffer, and a crisis event may go from bad to worse.

  9. Risk Assessment Overview

    Science.gov (United States)

    Prassinos, Peter G.; Lyver, John W., IV; Bui, Chinh T.

    2011-01-01

    Risk assessment is used in many industries to identify and manage risks. Initially developed for use on aeronautical and nuclear systems, risk assessment has been applied to transportation, chemical, computer, financial, and security systems among others. It is used to gain an understanding of the weaknesses or vulnerabilities in a system so modifications can be made to increase operability, efficiency, and safety and to reduce failure and down-time. Risk assessment results are primary inputs to risk-informed decision making, where risk information including uncertainty is used along with other pertinent information to assist management in the decision-making process. Therefore, to be useful, a risk assessment must be directed at specific objectives. As the world embraces the globalization of trade and manufacturing, understanding the associated risk becomes important to decision making. Applying risk assessment techniques to a global system of development, manufacturing, and transportation can provide insight into how the system can fail, the likelihood of system failure and the consequences of system failure. The risk assessment can identify those elements that contribute most to risk and identify measures to prevent and mitigate failures, disruptions, and damaging outcomes. In addition, risks associated with public and environmental impact can be identified. The risk insights gained can be applied to making decisions concerning suitable development and manufacturing locations, supply chains, and transportation strategies. While risk assessment has been mostly applied to mechanical and electrical systems, the concepts and techniques can be applied across other systems and activities. This paper provides a basic overview of the development of a risk assessment.

  10. Risk Management Implementation Tool

    Science.gov (United States)

    Wright, Shayla L.

    2004-01-01

    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making: to assess continually what could go wrong, determine which risks are important to deal with, implement strategies to deal with those risks, and measure the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. With these steps and the various methods and tools that go along with them, identifying and dealing with risk is clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertise and advocate the use of RMIT at each NASA center.

  11. Risk management in Takaful

    OpenAIRE

    2010-01-01

    Risk management is of vital importance in Islam and Takāful provides a way to manage risks in business according to Sharī’ah principles. This research paper attempts to identify various types of risks involved in Takāful business that affect operational and investment functions of Takāful operators across the globe. It lays down criteria for Takāful operator to manage those risks effectively. However, Takāful operators often face difficulty in managing market and credit risks as Sharī’ah comp...

  12. Measuring Idiosyncratic Risk

    DEFF Research Database (Denmark)

    Sunesen, Eva Rytter

    This paper offers two refinements of the traditional risk measure based on the volatility of growth. First, we condition GDP growth on structural characteristics of the host country that move only slowly and therefore can be partly predicted by an investor. Second, we adjust conditional risk … and Latin American countries, but the idiosyncratic risk factor also represents a larger share than in other developing countries. As a final contribution, we search the empirical literature on foreign direct investment and risk in order to determine which of the suggested risk measures provide the best …

  13. Agile risk management

    CERN Document Server

    Moran, Alan

    2014-01-01

    This work is the definitive guide for IT managers and agile practitioners. It elucidates the principles of agile risk management and how these relate to individual projects. Explained in clear and concise terms, this synthesis of project risk management and agile techniques is illustrated using the major methodologies such as XP, Scrum and DSDM. Although the agile community frequently cites risk management, research suggests that risk is often narrowly defined and, at best, implicitly treated, which in turn leads to an inability to make informed decisions concerning risk and reward and a poor u

  14. Enterprise Risk Management Models

    CERN Document Server

    Olson, David L

    2010-01-01

    Enterprise risk management has always been important. However, the events of the 21st Century have made it even more critical. The top level of business management became suspect after scandals at Enron, WorldCom, and other business entities. Financially, many firms experienced difficulties from bubbles. The problems of interacting cultures demonstrated risk from terrorism as well, with numerous terrorist attacks, including 9/11 in the U.S. Risks can arise in many facets of business. Businesses in fact exist to cope with risk in their area of specialization. Financial risk management has focu

  15. Individual Property Risk Management

    Directory of Open Access Journals (Sweden)

    Michael S. Finke

    2010-01-01

    Full Text Available This paper reviews household property risk management and estimates normatively optimal choice under theoretical assumptions. Although risk retention limits are common in the financial planning industry, estimates of optimal risk retention that include both financial and human wealth far exceed limits commonly recommended. Households appear to frame property losses differently from other wealth losses leading to wealth-reducing, excess risk transfer. Possible theoretical explanations for excess sensitivity to loss are reviewed. Differences between observed and optimal risk management imply a large potential gain from improved choice.

  16. Liquidity and Risk Management

    OpenAIRE

    Holmström, Bengt; Tirole, Jean

    2007-01-01

    This paper provides a model of the interaction between risk-management practices and market liquidity. On one hand, tighter risk management reduces the maximum position an institution can take, thus the amount of liquidity it can offer to the market. On the other hand, risk managers can take into account that lower liquidity amplifies the effective risk of a position by lengthening the time it takes to sell it. The main result of the paper is that a feedback effect can arise: tighter risk man...

  17. Models of Credit Risk Measurement

    OpenAIRE

    Hagiu Alina

    2011-01-01

    Credit risk is defined as the risk of financial loss caused by the failure of a counterparty. According to statistics, for financial institutions credit risk is considerably more important than market risk; reduced diversification of credit risk is the main cause of bank failures. Only recently did the banking industry begin to measure credit risk in the context of a portfolio, along with the development of risk management that started with value at risk (VaR) models. Once measured, credit risk can be diversif...

  18. MASTERING SUPPLY CHAIN RISKS

    Directory of Open Access Journals (Sweden)

    Borut Jereb

    2012-11-01

    Full Text Available Risks in supply chains represent one of the major business issues today. Since every organization strives for success and uninterrupted operations, efficient supply chain risk management is crucial. During supply chain risk research at the Faculty of Logistics in Maribor (Slovenia) some key issues in the field were identified, the major being the lack of instruments which can make risk management in an organization easier and more efficient. Consequently, a model which captures and describes risks in an organization and its supply chain was developed. It is in accordance with the general risk management and supply chain security standards, the ISO 31000 and ISO 28000 families. It also incorporates recent findings from the risk management field, especially from the viewpoint of segmenting of the public. The model described in this paper focuses on the risks themselves by defining them by different key dimensions, so that risk management is simplified and can be undertaken in every supply chain and the organizations within them. Based on our model and consequent practical research in actual organizations, a freely accessible risk catalog has been assembled and published online from the risks that have been identified so far. This catalog can serve as a checklist and a starting point in supply chain risk management in organizations. It also incorporates experts from the field into a community, in order to assemble an ever growing list of possible risks and to provide insight into the model and its value in practice.

  19. Biosafety Risk Assessment Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Caskey, Susan Adele [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). International Biological Threat Reduction Program; Gaudioso, Jennifer M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). International Biological Threat Reduction Program; Salerno, Reynolds Mathewson [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). International Biological Threat Reduction Program; Wagner, Stefan M. [Public Health Agency of Canada, Winnipeg, MB (Canada). Canadian Science Centre for Human and Animal Health (CSCHAH); Shigematsu, Mika [National Inst. of Infectious Diseases (NIID), Tokyo (Japan); Risi, George [Infectious Disease Specialists, P.C, Missoula, MT (United States); Kozlovac, Joe [US Dept. of Agriculture (USDA)., Beltsville, MD (United States); Halkjaer-Knudsen, Vibeke [Statens Serum Inst., Copenhagen (Denmark); Prat, Esmeralda [Bayer CropScience, Monheim am Rhein (Germany)

    2010-10-01

    Laboratories that work with biological agents need to manage the safety risks to persons working in the laboratories and to the human and animal communities in the surrounding areas. Biosafety guidance defines a wide variety of biosafety risk mitigation measures, which fall under the following categories: engineering controls, procedural and administrative controls, and the use of personal protective equipment; the determination of which mitigation measures should be used to address the specific laboratory risks is dependent upon a risk assessment. Ideally, a risk assessment should be conducted in a standardized and systematic manner that allows it to be repeatable and comparable. A risk assessment should clearly define the risk being assessed and avoid overcomplication.

  20. Offshore risk assessment

    CERN Document Server

    Vinnem, Jan-Erik

    2014-01-01

    Offshore Risk Assessment was the first book to deal with quantified risk assessment (QRA) as applied specifically to offshore installations and operations. Risk assessment techniques have been used for more than three decades in the offshore oil and gas industry, and their use is set to expand increasingly as the industry moves into new areas and faces new challenges in older regions. This updated and expanded third edition has been informed by a major R&D program on offshore risk assessment in Norway and summarizes research from 2006 to the present day. Rooted in a thorough discussion of risk metrics and risk analysis methodology, subsequent chapters are devoted to analytical approaches to escalation, escape, evacuation and rescue analysis of safety and emergency systems. Separate chapters analyze the main hazards of offshore structures: fire, explosion, collision, and falling objects, as well as structural and marine hazards. Risk mitigation and control are discussed, as well as an illustrat...

  1. Low Risk Anomalies?

    DEFF Research Database (Denmark)

    Schneider, Paul; Wagner, Christian; Zechner, Josef

    This paper shows theoretically and empirically that beta- and volatility-based low risk anomalies are driven by return skewness. The empirical patterns concisely match the predictions of our model that endogenizes the role of skewness for stock returns through default risk. With increasing downside risk, the standard capital asset pricing model (CAPM) increasingly overestimates expected equity returns relative to firms' true (skew-adjusted) market risk. Empirically, the profitability of betting against beta/volatility increases with firms' downside risk, and the risk-adjusted return differential of betting against beta/volatility among low skew firms compared to high skew firms is economically large. Our results suggest that the returns to betting against beta or volatility do not necessarily pose asset pricing puzzles but rather that such strategies collect premia that compensate for skew risk.

  2. Risk Factors in Derivatives Markets

    Directory of Open Access Journals (Sweden)

    Raimonda Martinkutė-Kaulienė

    2015-02-01

    Full Text Available The objective of the article is to analyse and present a classification of the risks relevant to derivative securities. The analysis is based on findings in the classical and modern literature and on the latest statistical data. The analysis led to the conclusion that the main risks typical of derivatives contracts and their traders are market risk, liquidity risk, credit and counterparty risk, legal risk and transactions risk. Pricing risk and systemic risk are also quite important. The analysis showed that market risk is the most important kind of risk, one that in many situations influences the level of the remaining risks.

  3. Risk factors for diseases of the cardiovascular system among Catholics living in areas of southern Poland

    Directory of Open Access Journals (Sweden)

    Anna Majda

    2017-06-01

    Full Text Available Introduction: Cardiovascular diseases (CVD) are the most frequent cause of mortality among Polish residents. In Poland, there are few publications regarding research on the influence of people’s religiosity on their health. Aim of the research: To determine selected cardiovascular risk factors and the risk of cardiovascular events among Catholics. Material and methods: This cross-sectional study was conducted among 134 randomly selected Catholics and was based on the results of a questionnaire survey, anthropometric measurements, physical examination, laboratory tests (CRP, homocysteine, glucose, total cholesterol, HDL, and triglycerides), and an assessment of the risk of cardiovascular events based on the SCORE scale. Statistical analysis was based on the χ² test. The assumed significance level was 0.05. Results: More than half of the respondents were diagnosed with an elevated homocysteine level and gluteal-femoral obesity. A little more than half of those surveyed had elevated total cholesterol levels and increased blood pressure, a little more than one quarter had raised triglyceride levels, and one tenth had elevated glucose and C-reactive protein levels. The higher the age of the respondents, the more often their biochemical results exceeded the standards. Over half of those examined were diagnosed with overweight or obesity; among them, gynoid obesity prevailed over android obesity. The CVD risk assessment of the Catholics revealed that, among the modifiable factors, the biochemical homocysteine level proved to be the most important new risk factor, while among the classic factors it was the blood pressure value. More than half of the respondents in the study group had a moderate risk of cardiovascular events. Conclusions: Nurses should promote pro-health attitudes, and should encourage the elimination of risk factors and biochemical testing and measurement among Catholics, who are a religious group at higher risk of cardiovascular
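
    The abstract reports that the statistical analysis relied on the χ² test at a 0.05 significance level. A minimal sketch of such a test on a hypothetical 2×2 contingency table follows (counts invented for illustration; SciPy assumed available):

        from scipy.stats import chi2_contingency

        # Hypothetical table: elevated homocysteine (yes/no) by age group.
        table = [[28, 22],   # younger respondents
                 [45, 39]]   # older respondents

        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2={chi2:.3f}, p={p:.3f}, dof={dof}")
        if p < 0.05:
            print("Reject independence at the 0.05 level.")
        else:
            print("No significant association at the 0.05 level.")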

  4. Fuel related risks; Braenslerisker

    Energy Technology Data Exchange (ETDEWEB)

    Englund, Jessica; Sernhed, Kerstin; Nystroem, Olle; Graveus, Frank (Grontmij AB, (Sweden))

    2012-02-15

    The project within which this report was prepared aimed to complement the Vaermeforsk publication 'Handbook of fuels' with fuel-related risks and measures to reduce those risks. The fuels examined in this project were the fuels included in the first version of the handbook from 2005, plus four additional fuels that will be included in the second edition of the handbook. The following fuels were included: woodfuels (sawdust, wood chips, powder, briquettes), slash, recycled wood, salix, bark, hardwood, stumps, straw, reed canary grass, hemp, cereal, cereal waste, olive waste, cocoa beans, citrus waste, shea, sludge, forest industrial sludge, manure, Paper Wood Plastic, tyre, leather waste, cardboard rejects, meat and bone meal, liquid animal and vegetable wastes, tall oil pitch, peat, residues from the food industry, biomal (including slaughterhouse waste) and lignin. The report includes two main chapters: a general risk chapter and a chapter on fuel-specific risks. The first deals with the general concept of risk, highlights laws and rules relevant for risk management, and discusses general risks related to the different steps of fuel handling, i.e. unloading, storing, processing the fuel, transportation within the facility, combustion, and handling of ashes. The information used to produce this chapter was gathered through a literature review, site visits, and the project group's experience of risk management. The other main chapter deals with fuel-specific risks and measures to reduce the risks for the steps of unloading, storing, processing the fuel, internal transportation, combustion and handling of the ashes. Risks and measures were considered for all the biofuels included in the second version of the handbook of fuels. Information about the risks and risk management was gathered through interviews with people working with different kinds of fuels in electricity and heat plants in Sweden. The information from

  5. Seismic risk perception test

    Science.gov (United States)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro

    2013-04-01

    The perception of risks involves the process of collecting, selecting and interpreting signals about uncertain impacts of events, activities or technologies. In the natural sciences the term risk seems to be clearly defined: it means the probability distribution of adverse effects; but the everyday use of risk has different connotations (Renn, 2008). The two terms, hazards and risks, are often used interchangeably by the public. Knowledge, experience, values, attitudes and feelings all influence the thinking and judgement of people about the seriousness and acceptability of risks. Within the social sciences, however, the terminology of 'risk perception' has become the conventional standard (Slovic, 1987). The mental models and other psychological mechanisms which people use to judge risks (such as cognitive heuristics and risk images) are internalized through social and cultural learning and constantly moderated (reinforced, modified, amplified or attenuated) by media reports, peer influences and other communication processes (Morgan et al., 2001). Yet a theory of risk perception that offers an integrative, as well as empirically valid, approach to understanding and explaining risk perception is still missing. To understand the perception of risk it is necessary to consider several areas: social, psychological, cultural, and their interactions. Among the various international research on the perception of natural hazards, the semantic differential method seemed promising (Osgood, C.E., Suci, G., & Tannenbaum, P., 1957, The measurement of meaning. Urbana, IL: University of Illinois Press). The test on seismic risk perception has been constructed by the method of the semantic differential, comparing opposite adjectives or terms on a seven-point Likert scale. The test consists of an informative part and six sections respectively dedicated to: hazard; vulnerability (home and workplace); exposed value (with reference to
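
    A semantic differential test of this kind is typically scored by averaging the seven-point ratings between opposite adjectives. A minimal scoring sketch follows; the adjective pairs and responses are invented for illustration and are not the instrument described in the abstract.

        import numpy as np

        # Each row is one respondent; each column one bipolar adjective pair,
        # rated on a seven-point scale (1 = negative pole, 7 = positive pole).
        pairs = ["safe-dangerous", "controllable-uncontrollable", "known-unknown"]
        ratings = np.array([
            [2, 3, 4],
            [1, 2, 3],
            [3, 3, 5],
        ])

        # Profile of mean ratings per adjective pair (the semantic profile).
        for pair, mean in zip(pairs, ratings.mean(axis=0)):
            print(f"{pair:30s} mean = {mean:.2f}")

        # One overall perception score per respondent: the average across pairs.
        print("per-respondent scores:", ratings.mean(axis=1))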

  6. Sociocultural definitions of risk

    Energy Technology Data Exchange (ETDEWEB)

    Rayner, S.

    1990-10-01

    Public constituencies frequently are criticized by technical experts as being irrational in their response to low-probability risks. This presentation argued that most people are concerned with a variety of risk attributes other than probability, and that it is rather irrational to exclude these from the definition and analysis of technological risk. Risk communication, which is at the heart of the right-to-know concept, is described as the creation of shared meaning rather than the mere transmission of information. A case study of utilities, public utility commissions, and public interest groups illustrates how the diversity of institutional cultures in modern society leads to problems for the creation of shared meanings in establishing trust, distributing liability, and obtaining consent to risk. This holistic approach to risk analysis is most appropriate under conditions of high uncertainty and/or high decision stakes. 1 fig., 5 tabs.

  7. Ransomware: Minimizing the Risks.

    Science.gov (United States)

    Pope, Justin

    2016-01-01

    This ongoing column is dedicated to providing information to our readers on managing legal risks associated with medical practice. We invite questions from our readers. The answers are provided by PRMS, Inc. (www.prms.com), a manager of medical professional liability insurance programs with services that include risk management consultation, education and onsite risk management audits, and other resources to healthcare providers to help improve patient outcomes and reduce professional liability risk. The answers published in this column represent those of only one risk management consulting company. Other risk management consulting companies or insurance carriers may provide different advice, and readers should take this into consideration. The information in this column does not constitute legal advice. For legal advice, contact your personal attorney. Note: The information and recommendations in this article are applicable to physicians and other healthcare professionals so "clinician" is used to indicate all treatment team members.

  8. Entrepreneurs Facing Risk

    DEFF Research Database (Denmark)

    Zichella, Giulio; Reichstein, Toke

    Theory conjectures that entrepreneurs are more likely than others to make risky choices. However, the empirical evidence is mixed. This paper offers new insights into entrepreneurs’ tendencies to make risky choices, by investigating the circumstances in which entrepreneurs are more/less likely to choose risk vis-à-vis certainty. Drawing on prospect theory, we formulate hypotheses about the greater likelihood that entrepreneurs (compared to others) will choose risk immediately after a positive gain, but will shy away from risk compared to others as the degree of risk increases. The hypotheses are tested using data collected in laboratory-based real money games experiments. We find support for our hypotheses, indicating that entrepreneurs’ bias towards risk is circumstantial. These results have fundamental implications for our understanding of factors guiding entrepreneurial choices under risk.

  9. Microbiological Quantitative Risk Assessment

    Science.gov (United States)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  10. Risk, Uncertainty, and Entrepreneurship

    DEFF Research Database (Denmark)

    Koudstaal, Martin; Sloof, Randolph; Van Praag, Mirjam

    2016-01-01

    Theory predicts that entrepreneurs have distinct attitudes toward risk and uncertainty, but empirical evidence is mixed. To better understand these mixed results, we perform a large “lab-in-the-field” experiment comparing entrepreneurs to managers (a suitable comparison group) and employees (n = 21,288). The results indicate that entrepreneurs perceive themselves as less risk averse than managers and employees, in line with common wisdom. However, when using experimental incentivized measures, the differences are subtler. Entrepreneurs are only found to be unique in their lower degree of loss aversion, and not in their risk or ambiguity aversion. This combination of results might be explained by our finding that perceived risk attitude is not only correlated to risk aversion but also to loss aversion. Overall, we therefore suggest using a broader definition of risk that captures this unique...

  11. Managing Risk and Opportunity

    DEFF Research Database (Denmark)

    Andersen, Torben Juul; Garvey, Maxine; Roggi, Oliviero

    This book promotes good risk governance and risk management practices to corporate managers, executives, and directors wherever they operate around the world. The major corporate scandals have their roots in governance failure, pointing to the link between risk governance and good performance outcomes. This topic is timely and of interest both to the academic community as well as to practicing managers, executives, and directors. The volume focuses on contemporary risk leadership issues based on recent research insights but avoids excessive technical language and mathematical formulas. The book... The underlying logic is built on the principles of financial economics where benefits derive from reducing bankruptcy costs and increasing future cash inflows. This provides a stringent framework for analyzing the effect of different risk management actions and behaviors in effective risk-taking organizations...

  12. Risks, risk assessment and risk competence in toxicology.

    Science.gov (United States)

    Stahlmann, Ralf; Horvath, Aniko

    2015-01-01

    Understanding the toxic effects of xenobiotics requires sound knowledge of physiology and biochemistry. The often-described lack of understanding of pharmacology/toxicology is therefore primarily caused by the general absence of the necessary fundamental knowledge. Since toxic effects depend on exposure (or dosage), assessing the risks arising from toxic substances also requires quantitative reasoning. Typically, public discussions nearly always neglect quantitative aspects, and laypersons tend to disregard dose-effect relationships. One of the main reasons for such disregard is the fact that exposures often occur at extremely low concentrations that can only be perceived intellectually but not by the human senses. However, thresholds in the low exposure range are often scientifically disputed. At the same time, ignorance of known dangers is widespread. Thus, enhancing the risk competence of laypersons will initially have to be restricted to increasing awareness of existing problems.

  13. Risks, risk assessment and risk competence in toxicology

    Directory of Open Access Journals (Sweden)

    Stahlmann, Ralf

    2015-07-01

    Full Text Available Understanding the toxic effects of xenobiotics requires sound knowledge of physiology and biochemistry. The often-described lack of understanding of pharmacology/toxicology is therefore primarily caused by the general absence of the necessary fundamental knowledge. Since toxic effects depend on exposure (or dosage), assessing the risks arising from toxic substances also requires quantitative reasoning. Typically, public discussions nearly always neglect quantitative aspects, and laypersons tend to disregard dose-effect relationships. One of the main reasons for such disregard is the fact that exposures often occur at extremely low concentrations that can only be perceived intellectually but not by the human senses. However, thresholds in the low exposure range are often scientifically disputed. At the same time, ignorance of known dangers is widespread. Thus, enhancing the risk competence of laypersons will initially have to be restricted to increasing awareness of existing problems.

  14. Risk governance in agriculture

    OpenAIRE

    Bachev, Hrabrin

    2008-01-01

    This paper identifies and assesses the efficiency of major modes of risk governance in agriculture on the basis of Bulgarian dairy farming. First, the New Institutional and Transaction Costs Economics is applied and a framework for analysis of the governance of natural, market, private, and social (institutional) risks is presented. Next, the pace and challenges of dairy farming development during the post-communist transition and EU integration are outlined. Third, major types of risk...

  15. Liquidity risk management

    Directory of Open Access Journals (Sweden)

    Milošević Miloš

    2014-01-01

    Full Text Available Liquidity risk management is a major activity of every bank. To be able to honor its matured liabilities, a bank strives to provide and maintain the required level of liquidity on a daily basis. Although each commercial bank has its own methodology of calculating the required liquidity level, in line with its adopted policies, the central bank has enacted the Decision on Liquidity Risk Management, prescribing the obligatory liquidity risk management policy.

  16. Cardiovascular risk prediction

    DEFF Research Database (Denmark)

    Graversen, Peter; Abildstrøm, Steen Z; Jespersen, Lasse

    2016-01-01

    AIM: European Society of Cardiology (ESC) guidelines recommend that cardiovascular disease (CVD) risk stratification in asymptomatic individuals is based on the Systematic Coronary Risk Evaluation (SCORE) algorithm, which estimates individual 10-year risk of death from CVD. We assessed ... electrocardiogram (ECG) abnormalities, heart rate, family history (of ischaemic heart disease), body mass index (BMI), waist-hip ratio, walking duration and pace, leisure time physical activity, forced expiratory volume (FEV)1%pred, household income, education, vital exhaustion, high-density lipoprotein (HDL...

  17. ACCOUNTING AUTOMATION RISKS

    OpenAIRE

    Муравський, В. В.; Хома, Н. Г.

    2015-01-01

    Accountants play an active role in organizing automated accounting when information systems are introduced into enterprise activity. Effective accounting automation requires the identification of, and early warning about, organizational risks. The authors researched, classified and generalized the risks of introducing information accounting systems. Ways of eliminating the sources of organizational risks and minimizing their consequences are given. The method of the effective con...

  18. Asset Class Liquidity Risk

    OpenAIRE

    Ronnie Sadka

    2014-01-01

    This paper studies liquidity risk, as measured by the covariation of returns with unexpected changes in aggregate liquidity, across 106 indices covering global equity, industry sectors, fixed income, and hedge funds. Roughly 20% of all sample indices, and over 50% of hedge-fund indices, display a significant exposure to liquidity risk. The annualized cross-sectional liquidity risk premium is estimated at about 2%. The results are robust to various controls and methodological choices. Practica...
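
    Liquidity risk as described here is the covariation of an index's returns with unexpected changes in aggregate liquidity, i.e. a liquidity beta from a time-series regression. A minimal sketch with simulated data follows; all inputs are hypothetical, not the paper's data or estimator.

        import numpy as np

        rng = np.random.default_rng(1)
        T = 120  # months

        # Hypothetical unexpected (innovation) component of aggregate liquidity.
        liq_shock = rng.normal(0, 1, T)

        # Hypothetical index returns loading on the liquidity shock.
        returns = 0.005 + 0.4 * liq_shock + rng.normal(0, 0.02, T)

        # Liquidity beta: slope of returns on unexpected liquidity changes.
        X = np.column_stack([np.ones(T), liq_shock])
        alpha, liq_beta = np.linalg.lstsq(X, returns, rcond=None)[0]
        print(f"alpha = {alpha:.4f}, liquidity beta = {liq_beta:.3f}")

        # A cross-sectional premium (about 2% per year in the paper) would then
        # be estimated by regressing average returns on such betas across indices.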

  19. GM Risk Assessment

    Science.gov (United States)

    Sparrow, Penny A. C.

    GM risk assessments play an important role in the decision-making process surrounding the regulation, notification and permission to handle Genetically Modified Organisms (GMOs). Ultimately the role of a GM risk assessment will be to ensure the safe handling and containment of the GMO; and to assess any potential impacts on the environment and human health. A risk assessment should answer all ‘what if’ scenarios, based on scientific evidence.

  20. Approaches to acceptable risk

    Energy Technology Data Exchange (ETDEWEB)

    Whipple, C.

    1997-04-30

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of the acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  1. Agricultural risk management

    DEFF Research Database (Denmark)

    Lund, Mogens; Oksen, Arne; Larsen, Torben U.

    2005-01-01

    A new model for risk management in agriculture is described in the paper. The risk model is constructed as a context dependent process, which includes four main phases. The model is aimed at agricultural advisors, who wish to facilitate and disseminate risk management to farmers. It is developed and tested by an action research approach in an attempt to make risk management more applicable on family farms. Our experiences indicate that farmers don't apply probabilistic thinking and other concepts according to formal decision theory.

  2. Mortgage Default Risk

    DEFF Research Database (Denmark)

    Chauvet, Marcelle; Gabriel, Stuart; Lutz, Chandler

    2016-01-01

    We use Google search query data to develop a broad-based and real-time index of mortgage default risk. Unlike established indicators, our Mortgage Default Risk Index (MDRI) directly reflects households’ concerns regarding their risk of mortgage default. The MDRI predicts housing returns, mortgage delinquency indicators, and subprime credit default swaps. These results persist both in- and out-of-sample and at multiple data frequencies. Together, research findings suggest internet search queries yield valuable new insights into household mortgage default risk.

  3. Risks: diagnosing and eliminating

    Directory of Open Access Journals (Sweden)

    Yuriy A. Tikhomirov

    2016-01-01

    Full Text Available Objective: to develop conceptual theoretical and legal provisions and scientific recommendations on the identification, analysis and elimination of risk. Methods: the universal dialectic method of cognition, as well as scientific and private research methods based on it. Results: the system of risk diagnostics in the legal sphere was researched, along with the mechanism of influencing "risk situations" and their consequences: damage to the environment and harm to society. The concept of risk in the legal sphere was formulated, and the author's classification of risks in the legal sphere is presented. The rules of analysis, evaluation and prevention of risks and the model risk management framework are elaborated. Scientific novelty: the mechanism for the identification, analysis and elimination of risk has been developed and introduced into scientific circulation; the author has proposed the classification and types of risks, and the reasons and conditions promoting risk occurrence. Practical significance: the provisions and conclusions of the article can be used in scientific, lawmaking and law-enforcement activity, as well as in the educational process of higher educational establishments.

  4. Systemic risk measures

    Science.gov (United States)

    Guerra, Solange Maria; Silva, Thiago Christiano; Tabak, Benjamin Miranda; de Souza Penaloza, Rodrigo Andrés; de Castro Miranda, Rodrigo César

    2016-01-01

    In this paper we present systemic risk measures based on the contingent claims approach and the banking sector's multivariate density. We also apply network measures to analyze banks' common risk exposure. The proposed measures aim to capture credit risk stress and its potential to become systemic. These indicators capture not only individual bank vulnerability, but also the stress dependency structure between them. Furthermore, these measures can be quite useful for identifying systemically important banks. The empirical results show that these indicators capture with considerable fidelity the moments of increasing systemic risk in the Brazilian banking sector in recent years.

  5. From Hazard to Risk - Assessing the Risk

    NARCIS (Netherlands)

    Madsen, C.B.; Houben, G.; Hattersley, S.; Crevel, R.W.R.; Remington, B.C.; Baumert, J.L.

    2013-01-01

    Regulatory thresholds for allergenic foods have not yet been developed. This means that public and industrial risk managers do not have regulatory thresholds to decide if a content or level of contamination is acceptable or not. For a long time, data have been inadequate to define safe thresholds fo

  6. Risks from nuclear waste

    Energy Technology Data Exchange (ETDEWEB)

    Liljenzin, J.O.; Rydberg, J. [Radiochemistry Consultant Group, Vaestra Froelunda (Sweden)

    1996-11-01

    The first part of this review discusses the importance of risk. If there is any relation between emotional and rational risk perceptions (for example, it is believed that increased knowledge will decrease emotion), then it is a desirable goal for society, and the nuclear industry in particular, to improve laymen's understanding of the rational risks of nuclear energy. This review surveys various paths to a more common comprehension - perhaps a consensus - of the nuclear waste risks. The second part discusses radioactivity as a risk factor and concludes that it has no relation in itself to risk, but must be connected to exposure leading to a dose risk, i.e. a health detriment, which is commonly expressed in terms of cancer induction rate. Dose-effect relations are discussed in light of recent scientific debate. The third part of the report describes a number of hazard indexes for nuclear waste found in the literature and distinguishes between absolute and relative risk scales. The absolute risks as well as the relative risks have changed over time due to changes in radiological and metabolic data and changes in the mode of calculation. To judge from the literature, the risk discussion is huge, even when limited to nuclear waste. It would be very difficult to make a comprehensive review and extract the essentials from it. Therefore, we have chosen to select some publications, out of over 100, which we summarize rather comprehensively; in some cases we also include our remarks. 110 refs, 22 figs.

  7. Judging risk behaviour and risk preference: the role of the evaluative connotation of risk terms.

    NARCIS (Netherlands)

    van Schie, E.C.M.; van der Pligt, J.; van Baaren, K.

    1993-01-01

    Two experiments investigated the impact of the evaluative connotation of risk terms on the judgment of risk behavior and on risk preference. Exp 1 focused on the evaluation congruence of the risk terms with a general risk norm and with Ss' individual risk preference, and its effects on the extremity

  8. Measuring Idiosyncratic Risk

    DEFF Research Database (Denmark)

    Sunesen, Eva Rytter

    This paper offers two refinements of the traditional risk measure based on the volatility of growth. First, we condition GDP growth on structural characteristics of the host country that move only slowly and therefore can be partly predicted by an investor. Second, we adjust conditional risk for ...

  9. Risks of Large Portfolios.

    Science.gov (United States)

    Fan, Jianqing; Liao, Yuan; Shi, Xiaofeng

    2015-06-01

    The risk of a large portfolio is often estimated by substituting a good estimator of the volatility matrix. However, the accuracy of such a risk estimator is largely unknown. We study factor-based risk estimators with a large number of assets, and introduce a high-confidence level upper bound (H-CLUB) to assess the estimation. The H-CLUB is constructed using the confidence interval of risk estimators with either known or unknown factors. We derive the limiting distribution of the estimated risks in high dimensionality. We find that when the dimension is large, the factor-based risk estimators have the same asymptotic variance no matter whether the factors are known or not, which is slightly smaller than that of the sample covariance-based estimator. Numerically, H-CLUB outperforms the traditional crude bounds, and provides an insightful risk assessment. In addition, our simulated results quantify the relative error in the risk estimation, which is usually negligible using 3-month daily data.
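
    A factor-based risk estimator replaces the raw sample covariance with the structured covariance implied by a factor model, Sigma = B Sigma_f B' + D. The sketch below is a generic factor-model risk estimate on simulated data, not the paper's H-CLUB procedure; all dimensions and inputs are hypothetical.

        import numpy as np

        rng = np.random.default_rng(2)
        N, K, T = 200, 3, 250           # assets, factors, observations

        # Simulate factor returns, loadings, and asset returns.
        factors = rng.normal(0, 0.01, (T, K))
        B = rng.normal(0, 1, (N, K))
        eps = rng.normal(0, 0.02, (T, N))
        R = factors @ B.T + eps

        # Estimate loadings by OLS and idiosyncratic variances from residuals.
        B_hat = np.linalg.lstsq(factors, R, rcond=None)[0].T   # N x K
        resid = R - factors @ B_hat.T
        Sigma_f = np.cov(factors, rowvar=False)
        D = np.diag(resid.var(axis=0, ddof=K))

        # Factor-based covariance and the risk of an equal-weighted portfolio.
        Sigma = B_hat @ Sigma_f @ B_hat.T + D
        w = np.full(N, 1.0 / N)
        print("portfolio volatility:", np.sqrt(w @ Sigma @ w))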

  10. Health Risk of Radon

    Science.gov (United States)

    ... Radon in Homes (EPA 402-R-03-003); summary fact sheet on the updated risk assessment. Former U.S. Surgeon General ... The World Health Organization (WHO) launched an international radon project to help countries increase ... reduce radon-related risks. The U.S. EPA is one of several government ...

  11. Mind Your Risks

    Science.gov (United States)

    ... dementia later in life. STEPS TO MANAGE YOUR RISKS: Control high blood pressure. Know your blood pressure! If left unchecked, high ... on a regular basis will significantly lower your risk for heart disease, high blood pressure, type 2 diabetes, and other chronic and debilitating ...

  12. Insurance Sector Risk

    NARCIS (Netherlands)

    J.F. Slijkerman

    2006-01-01

    We model and measure simultaneous large losses of the market value of insurers to understand the impact of shocks on the insurance sector. The downside risk of insurers is explicitly modelled by common and idiosyncratic risk factors. Since reinsurance is important for the capacity of ins

  13. Measuring Systemic Risk

    DEFF Research Database (Denmark)

    Acharya, Viral V.; Heje Pedersen, Lasse; Philippon, Thomas

    We present a simple model of systemic risk and we show that each financial institution's contribution to systemic risk can be measured as its systemic expected shortfall (SES), i.e., its propensity to be undercapitalized when the system as a whole is undercapitalized. SES increases...
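
    A standard empirical proxy related to the SES idea is a bank's marginal expected shortfall: its average equity return on the days when the system as a whole does worst. The sketch below uses simulated returns and is an illustration of that proxy, not the paper's estimator.

        import numpy as np

        rng = np.random.default_rng(3)
        T, n_banks = 1000, 5

        # Simulated daily equity returns with a common systemic component.
        common = rng.normal(0, 0.01, T)
        bank_returns = common[:, None] + rng.normal(0, 0.01, (T, n_banks))
        system = bank_returns.mean(axis=1)

        # Worst 5% of system days; a bank's marginal expected shortfall is its
        # average return on those days, i.e. its propensity to do badly exactly
        # when the system is doing badly.
        threshold = np.quantile(system, 0.05)
        bad_days = system <= threshold
        mes = bank_returns[bad_days].mean(axis=0)
        for i, m in enumerate(mes):
            print(f"bank {i}: MES = {m:.4f}")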

  14. Smokers at Risk.

    Science.gov (United States)

    Wilner, Susan

    1984-01-01

    Discusses current information on the health consequences of smoking and two types of risks: those associated with all smokers and the higher risks faced by particular groups, such as pregnant women, teenagers, heavy smokers, those with cardiovascular disease, users of alcohol, and smokers in certain occupations. (SK)

  15. Risk capital allocation

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Smilgins, Aleksandrs

    Risk capital allocation problems have been widely discussed in the academic literature. We consider a company with multiple subunits having individual portfolios. Hence, when portfolios of subunits are merged, a diversification benefit arises: the risk of the company as a whole is smaller than...
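
    The diversification benefit referred to here can be seen in a small numeric sketch: the volatility of the merged company falls below the sum of the subunits' stand-alone volatilities, and the allocation problem is how to split the smaller total back onto the subunits. This illustrates only the effect, not the allocation methods discussed in the paper; all data are hypothetical.

        import numpy as np

        rng = np.random.default_rng(4)

        # Hypothetical P&L of three subunits with imperfect correlation.
        pnl = rng.multivariate_normal(
            mean=[0.0, 0.0, 0.0],
            cov=[[1.0, 0.3, 0.1],
                 [0.3, 2.0, 0.2],
                 [0.1, 0.2, 0.5]],
            size=10_000,
        )

        standalone = pnl.std(axis=0, ddof=1)     # risk of each subunit alone
        merged = pnl.sum(axis=1).std(ddof=1)     # risk of the company as a whole

        print("sum of stand-alone risks:", standalone.sum())
        print("risk of merged company:  ", merged)
        print("diversification benefit: ", standalone.sum() - merged)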

  16. Perspectives: Intellectual Risk Management

    Science.gov (United States)

    Hall, James C.

    2013-01-01

    Ask a college administrator about students and risk management, and you're likely to get a quick and agitated speech about alcohol consumption and bad behavior or a meditation on mental health and campus safety. But in colleges and universities, we manage intellectual risk-taking too. Bring that up, and you'll probably get little out of that same…

  17. Risk capital allocation

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Smilgins, Aleksandrs

    Risk capital allocation problems have been widely discussed in the academic literature. We consider a company with multiple subunits having individual portfolios. Hence, when portfolios of subunits are merged, a diversification benefit arises: the risk of the company as a whole is smaller than...

  18. Consumer perception of risk

    DEFF Research Database (Denmark)

    Scholderer, Joachim

    2001-01-01

    ...in risk perception research covering structure, process, and the social dynamics of risk debates. After that I will present results from a recently completed research project. In this project, we specifically looked into consumers' perceptions of gene technology applied to brewing, and how these perceptions related to consumers' attitudes and choice behavior....

  19. Melanoma Risk Prediction Models

    Science.gov (United States)

    Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling on behavioral changes to decrease risk.

  20. Cardiac Risk Assessment

    Science.gov (United States)

    ... Risk Assessment. Related tests: Lipid Profile, VLDL Cholesterol, hs-CRP, Lp(a). What ... cardiac risk include: High-sensitivity C-reactive protein (hs-CRP): Studies have shown that measuring CRP with a ...

  1. ORGANIZATIONAL RISK COMMUNICATION

    Science.gov (United States)

    Risk communication tools in organizations differ in several ways from many of the tools and techniques developed for public meetings. The traditional view of risk communication seeks to manage the public outrage associated with site-based issues. Organizational risk communication seek...

  2. Perspectives: Intellectual Risk Management

    Science.gov (United States)

    Hall, James C.

    2013-01-01

    Ask a college administrator about students and risk management, and you're likely to get a quick and agitated speech about alcohol consumption and bad behavior or a meditation on mental health and campus safety. But in colleges and universities, we manage intellectual risk-taking too. Bring that up, and you'll probably get little out of that same…

  3. Managing Risk and Opportunity

    DEFF Research Database (Denmark)

    Andersen, Torben Juul; Garvey, Maxine; Roggi, Oliviero

    Hence, the book addresses the potential for upside gains as much as the threats of downside losses that represent the conventional risk perspectives. It states the simple fact that you must be willing to take risk to increase strategic responsiveness and corporate manoeuvrability. The text builds

  4. Risk Management and Simulation

    DEFF Research Database (Denmark)

    Skovmand, David

    2014-01-01

    Review of: Risk Management and Simulation / Aparna Gupta. Boca Raton, FL: CRC Press, 2013, xxix + 491 pp., $99.95(H), ISBN: 978-1-4398-3594-4.

  5. Information Security Risk Analysis

    CERN Document Server

    Peltier, Thomas R

    2010-01-01

    Offers readers the knowledge and the skill-set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.

  6. Decreasing Relative Risk Premium

    DEFF Research Database (Denmark)

    Hansen, Frank

    We consider the risk premium demanded by a decision maker with wealth x in order to be indifferent between obtaining a new level of wealth y1 with certainty, and participating in a lottery which either results in unchanged present wealth or a level of wealth y2 > y1. We define the relative risk...
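
    The indifference condition described above can be written out as a short sketch (x denotes the unchanged present wealth, p the lottery's probability of that outcome, and u the decision maker's utility function; the notation is ours, not necessarily the paper's):

        u(y_1) = p\,u(x) + (1 - p)\,u(y_2), \qquad y_2 > y_1

    On a standard reading, the risk premium is then the gap between the lottery's expected wealth p x + (1 - p) y_2 and the certain wealth y_1 that the decision maker accepts in its place.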

  7. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.
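
    One common way to quantify risk on a single scale across hazards is annualized expected loss, the product of a hazard's annual likelihood and its consequence. This is a sketch only; the abstract calls for consistent quantification but does not prescribe this formula, and all numbers below are hypothetical.

        # Hypothetical hazards for one asset: (annual probability, consequence in $M).
        hazards = {
            "earthquake":     (0.01, 50.0),
            "flood":          (0.05, 10.0),
            "cyber attack":   (0.30, 5.0),
            "insider threat": (0.10, 8.0),
        }

        # Annualized expected loss = likelihood x consequence, comparable across hazards.
        ranked = sorted(hazards.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
        for name, (p, loss) in ranked:
            print(f"{name:15s} expected loss = ${p * loss:.2f}M/yr")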

  8. Mapping of nitrogen risks

    DEFF Research Database (Denmark)

    Blicher-Mathiesen, Gitte; Andersen, Hans Estrup; Carstensen, Jacob

    2014-01-01

    The objectives of this study were to develop and apply an N risk tool to the entire agricultural land area in Denmark. The purpose of the tool is to identify high risk areas, i.e. areas which contribute disproportionately much to diffuse N losses to the marine recipient, and to suggest cost-effective measures to reduce losses from risk areas. Such measures will be more effective if they are implemented in N loss hot spots or risk areas. Additionally, the highly variable N reduction in groundwater and surface waters needs to be taken into account as this strongly influences the resulting effect of mitigation measures. In the N risk mapping part of the tool, we combined a modelled root zone N leaching with a catchment-specific N reduction factor which in combination determines the N load to the marine recipient. N leaching was calculated using detailed information of agricultural management from national databases as well...

  9. Explosion risks from nanomaterials

    Science.gov (United States)

    Bouillard, Jacques; Vignes, Alexis; Dufaud, Olivier; Perrin, Laurent; Thomas, Dominique

    2009-05-01

    Emerging nanomanufactured products are being incorporated in a variety of consumer products ranging from close body-contact products (i.e. cosmetics, sunscreens, toothpastes, pharmaceuticals, clothing) to more remote body-contact products (electronics, plastics, tires, automotive and aeronautical), hence posing potential health and environmental risks. The new field of nanosafety has emerged and needs to be explored now, rather than after problems become so ubiquitous and difficult to treat that their trend becomes irreversible. Such an endeavour necessitates a transdisciplinary approach. A commonly forgotten and/or misunderstood risk is that of explosion/detonation of nanopowders, due to their high specific active surface areas. Such risk is emphasized and illustrated here through the development of an appropriate risk analysis. For this particular risk, a review of characterization methods and their limitations with regard to nanopowders is presented and illustrated for a few organic and metallic nanopowders.

  10. Markovian risk process

    Institute of Scientific and Technical Information of China (English)

    WANG Han-xing; YAN Yun-zhi; ZHAO Fei; FANG Da-fan

    2007-01-01

    A Markovian risk process, a generalization of the classical risk model, is considered in this paper. It is appropriate to model a risk process with large claims as a Markovian risk model. In such a model, the occurrence of claims is described by a point process {N(t)}t≥0, with N(t) being the number of jumps during the interval (0, t] for a Markov jump process. The ruin probability Ψ(u) of a company facing such a risk model is the main object of study. An integral equation satisfied by the ruin probability function Ψ(u) is obtained, and bounds for the convergence rate of the ruin probability Ψ(u) are given using a generalized renewal technique developed in the paper.
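
    For intuition, the ruin probability Ψ(u) can be estimated by Monte Carlo. The sketch below uses the classical compound Poisson special case that the paper generalizes (a Markov jump process would modulate the claim arrivals); all parameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(5)

        def ruin_probability(u, c=1.5, lam=1.0, mean_claim=1.0,
                             horizon=100.0, n_paths=5_000):
            # Classical surplus process U(t) = u + c*t - S(t). Since premiums
            # accrue continuously, ruin can only occur at claim instants, so we
            # simulate claim arrivals directly.
            ruined = 0
            for _ in range(n_paths):
                t, surplus = 0.0, u
                while True:
                    dt = rng.exponential(1.0 / lam)      # time to next claim
                    t += dt
                    if t > horizon:
                        break
                    surplus += c * dt - rng.exponential(mean_claim)
                    if surplus < 0:
                        ruined += 1
                        break
            return ruined / n_paths

        print("estimated Psi(5.0) over the horizon:", ruin_probability(u=5.0))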

  11. Supply chain risk management

    Directory of Open Access Journals (Sweden)

    Christian Hollstein

    2013-03-01

    Full Text Available Background: Supply chain risk management increasingly gains prominence in many international industries. In order to strengthen supply chain structures, processes, and networks, adequate potentials for risk management need to be built (focus on effective logistics) and to be utilized (focus on efficient logistics). Natural disasters, such as the case of Fukushima, illustrate how crucial risk management is. Method: By aligning a theoretical-conceptual framework with empirical-inductive findings, it may be hypothesized that logistical systems do have a positive effect on supply chain risk management activities. Result/conclusion: Flexibility and capacity, as well as redundancy and standardization, are often viewed as conflicting. It shows, however, that in the light of supply chain risk management, those factors may yield a common benefit if proper logistics systems are applied.

  12. Communication of Audit Risk to Students.

    Science.gov (United States)

    Alderman, C. Wayne; Thompson, James H.

    1986-01-01

    This article focuses on audit risk by examining it in terms of its components: inherent risk, control risk, and detection risk. Discusses applying audit risk, a definition of audit risk, and components of audit risk. (CT)
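
    The components named here are commonly combined in the multiplicative audit risk model, AR = IR x CR x DR; the article may present it differently, so the sketch below is illustrative only, with invented assessment values.

        def allowable_detection_risk(target_audit_risk, inherent_risk, control_risk):
            """Audit risk model AR = IR * CR * DR, solved for detection risk DR."""
            return target_audit_risk / (inherent_risk * control_risk)

        # Hypothetical assessments: 5% target AR, high inherent risk, moderate controls.
        dr = allowable_detection_risk(0.05, inherent_risk=0.8, control_risk=0.5)
        print(f"allowable detection risk = {dr:.1%}")  # lower DR means more substantive testing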

  13. Internal Audit and Risk Management

    OpenAIRE

    Constantin Nicolae Vasile; Alexandru Georgiana

    2011-01-01

    Internal audit and risk management have the same goal: the control of risk. There are various roles for the internal audit in respect of risk management. The main limitations of internal audit in respect of risk management regards assuming risk management tasks. One of the main issues regarding risk management is to make sure that the key risks are taken into consideration and that the management and the board of the organization take action as needed. Internal audit could give advice to mana...

  14. Space Radiation Cancer Risks

    Science.gov (United States)

    Cucinotta, Francis A.

    2007-01-01

    Space radiation presents major challenges to astronauts on the International Space Station and for future missions to the Earth's moon or Mars. Methods used to project risks on Earth need to be modified because of the large uncertainties in projecting cancer risks from space radiation, which impact safety factors. We describe NASA's unique approach to radiation safety, which applies uncertainty-based criteria within the occupational health program for astronauts: the two terrestrial criteria of a point estimate of a maximum acceptable level of risk and application of the principle of As Low As Reasonably Achievable (ALARA) are supplemented by a third requirement that protects against risk projection uncertainties using the upper 95% confidence level (CL) in the radiation cancer projection model. NASA's acceptable level of risk for ISS and the new lunar program has been set at a point estimate of a 3-percent risk of exposure-induced death (REID). Tissue-averaged organ dose-equivalents are combined with age-at-exposure and gender-dependent risk coefficients to project the cumulative occupational radiation risks incurred by astronauts. The 95% CL criterion is in practice a stronger criterion than ALARA, but not an absolute cut-off as is applied to a point projection of a 3% REID. We describe the most recent astronaut dose limits, and present a historical review of astronaut organ dose estimates from the Mercury program through the current ISS program, together with future projections for lunar and Mars missions. NASA's 95% CL criterion is linked to a vibrant ground-based radiobiology program investigating the radiobiology of high-energy protons and heavy ions. The near-term goal of research is new knowledge leading to the reduction of uncertainties in projection models. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. The current model for projecting space radiation
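
    The projection approach described, combining tissue-averaged organ dose-equivalents with risk coefficients and applying the acceptability test to the upper 95% CL rather than the point estimate, can be sketched as follows. All doses, coefficients, and the fold-factor are invented for illustration and are not NASA's actual values.

        # Hypothetical organ dose-equivalents (Sv) for a mission and illustrative
        # risk coefficients (% REID per Sv) for a given age and gender.
        organ_doses = {"lung": 0.12, "stomach": 0.10, "colon": 0.11, "bone marrow": 0.09}
        risk_coeff = {"lung": 2.0, "stomach": 1.1, "colon": 1.3, "bone marrow": 0.9}

        # Point estimate of cumulative REID: sum over organs of dose x coefficient.
        reid = sum(organ_doses[o] * risk_coeff[o] for o in organ_doses)

        # Illustrative uncertainty handling: the limit is applied to the upper
        # 95% confidence level of the projection, not to the point estimate.
        fold_uncertainty = 3.0          # hypothetical 95% CL fold-factor
        reid_upper95 = reid * fold_uncertainty

        print(f"point REID = {reid:.2f}%, upper 95% CL = {reid_upper95:.2f}%")
        print("within limit" if reid_upper95 <= 3.0 else "exceeds 3% REID limit")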

  15. An integrated risk estimation methodology: Ship specific incident type risk

    NARCIS (Netherlands)

    S. Knapp (Sabine)

    2013-01-01

    Shipping activity has increased worldwide, including parts of Australia, and maritime administrations are trying to gain a better understanding of total risk exposure in order to mitigate risk. Total risk exposure integrates risk at the individual ship level, risk due to vessel traffic d

  16. Risk and Risk society in Historical Perspective

    OpenAIRE

    2007-01-01

    Since the mid-1980s “risk” has constituted a sort of banner to which the social sciences have rallied. It has given rise to a whole range of research in the political science, sociology and economics spheres. This paper is a general introduction to a History and Technology special issue which attempts to construct analytical frameworks and research proposals that may contribute to a historization and a denaturalization of risk. This paper considers the role of history i...

  17. Introduction: Learning about Risk

    Directory of Open Access Journals (Sweden)

    Jens O. Zinn

    2006-01-01

    Full Text Available The special issue "Learning about Risk" draws on the launch conference of the ESRC "Social Contexts and Responses to Risk" (SCARR) network, held on 28th–29th January 2005 in Canterbury. The SCARR network is an interdisciplinary network on risk which examines perceptions of and responses to risk in a range of areas, including sexual behaviour and partnering choices, the mass media, faith and ethnicity, pensions and financial planning, industrial pollution, crime, transport, energy policy and environmental hazards. The network's launch conference reflected the interdisciplinary character of risk research, including a range of different methods and approaches to risk, directed at diverse objects of interest. The idea of the special issue is to link together this diversity and interdisciplinarity in risk research, and to encourage perspectives that look beyond the boundaries of single disciplines and methodological approaches. The papers in this publication demonstrate the value of insights from different disciplinary backgrounds in this area and point to the opportunities and challenges in the work that remains to be done in drawing these several perspectives more closely together. URN: urn:nbn:de:0114-fqs0601246

  18. Adaptation and risk management

    Energy Technology Data Exchange (ETDEWEB)

    Preston, Benjamin L [ORNL

    2011-01-01

    Adaptation assessment methods are compatible with the international risk management standard ISO 31000. Risk management approaches are increasingly being recommended for adaptation assessments at both national and local levels. Two orientations to assessments can commonly be identified: top-down and bottom-up, and prescriptive and diagnostic. Combinations of these orientations favor different types of assessments. The choice of orientation can be related to uncertainties in prediction and taking action, in the type of adaptation and in the degree of system stress. Adopting multiple viewpoints is to be encouraged, especially in complex situations. The bulk of current guidance material is consistent with top-down and predictive approaches, and thus is most suitable for risk scoping and identification. A broad range of material from within and beyond the climate change literature can be used to select methods to be used in assessing and implementing adaptation. The framing of risk, correct formulation of the questions being investigated and assessment methodology are critical aspects of the scoping phase. Only when these issues have been addressed should the issue of specific methods and tools be addressed. The reorientation of adaptation from an assessment focused solely on anthropogenic climate change to broader issues of vulnerability/resilience, sustainable development and disaster risk, especially through a risk management framework, can draw from existing policy and management understanding in communities, professions and agencies, incorporating existing agendas, knowledge, risks, and issues they already face.

  19. Targeted assets risk analysis.

    Science.gov (United States)

    Bouwsema, Barry

    2013-01-01

    Risk assessments utilising the consolidated risk assessment process as described by Public Safety Canada and the Centre for Security Science utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely, prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. The TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians.

  20. Analysis on Stress Distribution for Steel Balls Made of GCr15 Under Different Treatment States

    Institute of Scientific and Technical Information of China (English)

    刘传铭; 杨建虹; 雷建中; 王浩; 郭浩

    2015-01-01

    The stress distribution state was tested for steel balls under different heat treatment conditions and for finished steel balls before and after the surface shot peening strengthening process. The results show that the stress distribution state of steel balls is related to the difference between the carbon potential in the heat treatment furnace and the carbon content in the matrix after heat treatment. The surface of steel balls has higher compressive stress before the strengthening process, and a strengthening peak value appears in the subsurface of steel balls after the strengthening process, which helps to improve the life of steel balls.

  1. PREVENTION OF COMPANY RISKS

    Directory of Open Access Journals (Sweden)

    SUCI U GHEORGHE

    2014-10-01

    Full Text Available A company’s manager has to create and maintain a healthy internal control system. An efficient internal control system implies the implementation of risk management in the company. Each company, but also each individual, who tries to attain certain objectives, establishes the activities which lead to the achievement of goals and, at the same time, tries to identify as many “threats” as possible, in order to take the necessary measures to eliminate them. Thus, even if one is not familiar with the concepts of risk and risk management, one acts, consciously or not, for that purpose.

  2. Low Risk Anomalies?

    DEFF Research Database (Denmark)

    Schneider, Paul; Wagner, Christian; Zechner, Josef

    This paper shows theoretically and empirically that beta- and volatility-based low risk anomalies are driven by return skewness. The empirical patterns concisely match the predictions of our model that endogenizes the role of skewness for stock returns through default risk. With increasing downside...... of betting against beta/volatility among low skew firms compared to high skew firms is economically large. Our results suggest that the returns to betting against beta or volatility do not necessarily pose asset pricing puzzles but rather that such strategies collect premia that compensate for skew risk...

  3. Landslide risk assessment

    Science.gov (United States)

    Lessing, P.; Messina, C.P.; Fonner, R.F.

    1983-01-01

    Landslide risk can be assessed by evaluating geological conditions associated with past events. A sample of 2,416 slides from urban areas in West Virginia, each with 12 associated geological factors, has been analyzed using SAS computer methods. In addition, selected data have been normalized to account for areal distribution of rock formations, soil series, and slope percents. Final calculations yield landslide risk assessments of 1.50 = high risk. The simplicity of the method provides for a rapid, initial assessment prior to financial investment. However, it does not replace on-site investigations, nor excuse poor construction. © 1983 Springer-Verlag New York Inc.

  4. Assessment of fracture risk

    Energy Technology Data Exchange (ETDEWEB)

    Kanis, John A. [WHO Collaborating Centre for Metabolic Bone Diseases, University of Sheffield Medical School, Beech Hill Road, Sheffield S10 2RX (United Kingdom)], E-mail: w.j.pontefract@sheffield.ac.uk; Johansson, Helena; Oden, Anders [WHO Collaborating Centre for Metabolic Bone Diseases, University of Sheffield Medical School, Beech Hill Road, Sheffield S10 2RX (United Kingdom); McCloskey, Eugene V. [WHO Collaborating Centre for Metabolic Bone Diseases, University of Sheffield Medical School, Beech Hill Road, Sheffield S10 2RX (United Kingdom); Osteoporosis Centre, Northern General Hospital, Sheffield (United Kingdom)

    2009-09-15

    Fractures are a common complication of osteoporosis. Although osteoporosis is defined by bone mineral density (BMD) at the femoral neck, other sites and validated techniques can be used for fracture prediction. Several clinical risk factors contribute to fracture risk independently of BMD. These include age, prior fragility fracture, smoking, excess alcohol, family history of hip fracture, rheumatoid arthritis and the use of oral glucocorticoids. These risk factors in conjunction with BMD can be integrated to provide estimates of fracture probability using the FRAX tool. Fracture probability rather than BMD alone can be used to fashion strategies for the assessment and treatment of osteoporosis.

  5. High risk pregnancy

    Directory of Open Access Journals (Sweden)

    Bernardita Donoso Bernales

    2012-06-01

    Full Text Available It is estimated that roughly 20% of pregnancies fall into the high risk category, and these in turn are responsible for over 80% of adverse perinatal outcomes. Modern obstetrics has been very successful in reducing maternal morbidity and mortality. It has focused mainly on fetal and neonatal aspects, and on identifying the subgroup of pregnant women who need greater surveillance and care because of clearly identifiable risk factors. The article describes preconceptional advice, its components and recommendations for its implementation, as well as its role in maternal and perinatal risk assessment. These interventions attempt to reduce the rates of maternal and perinatal mortality.

  6. Measuring Systemic Risk

    DEFF Research Database (Denmark)

    Acharya, Viral V.; Heje Pedersen, Lasse; Philippon, Thomas

    We present a simple model of systemic risk and we show that each financial institution's contribution to systemic risk can be measured as its systemic expected shortfall (SES), i.e., its propensity to be undercapitalized when the system as a whole is undercapitalized. SES increases...... with the institution's leverage and with its expected loss in the tail of the system's loss distribution. Institutions internalize their externality if they are ‘taxed’ based on their SES. We demonstrate empirically the ability of SES to predict emerging risks during the financial crisis of 2007-2009, in particular...

  7. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  8. Patient caries risk assessment

    DEFF Research Database (Denmark)

    Twetman, Svante; Fontana, Margherita

    2009-01-01

    Risk assessment is an essential component in the decision-making process for the correct prevention and management of dental caries. Multiple risk factors and indicators have been proposed as targets in the assessment of risk of future disease, varying sometimes based on the age group at which...... for prediction purposes, as measured until now in the literature, is at best questionable in schoolchildren, adolescents and adults. That is not to say these additional factors should not be assessed to help understand the strength of their associations with the disease experience in a particular patient......, and aid in the development of an individualized and targeted preventive and management plan....

  9. Modelling and Simulating of Risk Behaviours in Virtual Environments Based on Multi-Agent and Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Linqin Cai

    2013-11-01

    Full Text Available Due to safety and ethical issues, traditional experimental approaches to modelling underground risk behaviours can be costly, dangerous and even impossible to realize. Based on multi-agent technology, a virtual coalmine platform for risk behaviour simulation is presented to model and simulate the human-machine-environment related risk factors in underground coalmines. To reveal mine workers’ risk behaviours, a fuzzy emotional behaviour model is proposed to simulate underground miners’ responding behaviours to potential hazardous events based on cognitive appraisal theories and fuzzy logic techniques. The proposed emotion model can generate more believable behaviours for virtual miners according to personalized emotion states, internal motivation needs and behaviour selection thresholds. Finally, typical accident cases of underground hazard spotting and locomotive transport were implemented. The behaviour believability of virtual miners was evaluated with a user assessment method. Experimental results show that the proposed models can create more realistic and reasonable behaviours in virtual coalmine environments, which can improve miners’ risk awareness and further train miners’ emergent decision-making ability when facing unexpected underground situations.
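
    The abstract describes the fuzzy emotional behaviour model only at a high level. As a rough illustration of the general technique (fuzzy appraisal of a situation followed by threshold-based behaviour selection), a minimal Python sketch follows; the membership functions, the single rule and the threshold are illustrative assumptions, not the authors' model.

        def tri(x, a, b, c):
            """Triangular membership function with support (a, c) and peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def fear_level(hazard_proximity, experience):
            """Fuzzy appraisal: a nearby hazard and low experience raise fear.
            Mamdani-style AND (min) for the rule 'IF near AND novice THEN afraid'."""
            near = tri(hazard_proximity, -0.1, 0.0, 0.6)   # proximity 0 = at the hazard
            novice = tri(experience, -0.1, 0.0, 0.7)       # experience on a 0-1 scale
            return min(near, novice)

        def select_behaviour(fear, flee_threshold=0.5):
            """Behaviour selection by threshold on the appraised emotion."""
            return "flee" if fear >= flee_threshold else "continue_work"

        fear = fear_level(hazard_proximity=0.2, experience=0.3)
        print(f"fear={fear:.2f} -> {select_behaviour(fear)}")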

  10. Risk Communication in Vaccine-related Events

    Institute of Scientific and Technical Information of China (English)

    郑登峰

    2012-01-01

    The public has lost confidence in vaccine safety due to vaccine-related events, whose impact is far greater than the events themselves. When an event happens, a rational and objective interpretation of the event by the public is an important basis for responding to the crisis. Based on an analysis of domestic and international risk communication experience and research progress, risk communication is introduced into the risk management of vaccine-related events. Risk communication strategies for vaccine-related events are discussed from three aspects: preventive communication, emergency risk communication and post-event handling.

  11. Alternative Perspectives on Risk

    Science.gov (United States)

    Davison, Jeannie; Orasanu, J.; Connors, Mary M. (Technical Monitor)

    1997-01-01

    The goal of the commercial air transport system is to provide air transportation to the flying public at an acceptable cost with minimal risk. In an ideal situation these three goals would support each other. In fact, it is sometimes the case that the goals conflict: getting passengers to their destinations on time may conflict with fixing a minor mechanical malfunction that may or may not impact safety; flying a route that will avoid turbulence, thereby providing passengers with a more comfortable ride, may consume more fuel; managing traffic density may mean aircraft are delayed or must use an approach that will result in a long taxi to their gates, costing time and fuel. Various players in the system--pilots, dispatchers, controllers, as well as managers in the airline carriers and traffic management system--make decisions every day that involve trade-offs of benefits and costs. The prospect of revisions in the air traffic management system, with shifts in responsibilities from controllers to users, including airline operations center personnel and pilots, means that individuals may be performing either new jobs or old jobs under new guidance. It will be essential to know how the various players (a) perceive the risks and benefits associated with the decisions they will make under the old and new control structures, and (b) how much risk they are willing to accept in making decisions. Risk is here defined as the probability and magnitude of negative events (after Slovic, 1987). Of primary interest are risks associated with traffic, weather, and operational factors such as schedule, fuel consumption, and passenger service. Previous research has documented differences between groups in perceptions of risks associated with both everyday and aviation related situations. Risk perception varies as a function of familiarity with the situation, degree to which one is potentially affected by the risk, the level of control one has over the situation, and one's level of

  12. Benchmarking an operational procedure for rapid flood mapping and risk assessment in Europe

    Science.gov (United States)

    Dottori, Francesco; Salamon, Peter; Kalas, Milan; Bianchi, Alessandra; Feyen, Luc

    2016-04-01

    The development of real-time methods for rapid flood mapping and risk assessment is crucial to improve emergency response and mitigate flood impacts. This work describes the benchmarking of an operational procedure for rapid flood risk assessment based on the flood predictions issued by the European Flood Awareness System (EFAS). The daily forecasts produced for the major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations, based on the hydro-meteorological dataset of EFAS. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in near real-time in terms of flood prone areas, potential economic damage, affected population, infrastructures and cities. An extensive testing of the operational procedure is carried out using the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-derived flood footprints, while ground-based estimations of economic damage and affected population are compared against modelled estimates. We evaluated the skill of flood hazard and risk estimations derived from EFAS flood forecasts with different lead times and combinations. The assessment includes a comparison of several alternative approaches to produce and present the information content, in order to meet the requests of EFAS users. The tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management.
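
    As a toy illustration of the combination step (an event-based hazard map overlaid with exposure and vulnerability layers to yield affected population and potential damage), a short Python sketch follows; the grids and the depth-damage curve are invented for illustration and are not the EFAS layers.

        import numpy as np

        water_depth = np.array([[0.0, 0.5], [1.2, 2.5]])   # m, event-based hazard map
        population  = np.array([[120, 300], [80,  40]])    # persons per cell (exposure)
        asset_value = np.array([[1e6, 2e6], [5e5, 3e5]])   # EUR per cell (exposure)

        def damage_fraction(depth):
            """Piecewise-linear depth-damage curve: full damage at 3 m (assumed)."""
            return np.clip(depth / 3.0, 0.0, 1.0)

        flooded = water_depth > 0.0
        affected_population = int(population[flooded].sum())
        economic_damage = float((asset_value * damage_fraction(water_depth)).sum())

        print(f"affected population: {affected_population}")
        print(f"potential damage: {economic_damage:,.0f} EUR")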

  13. An experimental system for flood risk forecasting and monitoring at global scale

    Science.gov (United States)

    Dottori, Francesco; Alfieri, Lorenzo; Kalas, Milan; Lorini, Valerio; Salamon, Peter

    2017-04-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by a wide range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasting, combining streamflow estimations with expected inundated areas and flood impacts. Finally, emerging technologies such as crowdsourcing and social media monitoring can play a crucial role in flood disaster management and preparedness. Here, we present some recent advances of an experimental procedure for near-real time flood mapping and impact assessment. The procedure translates in near real-time the daily streamflow forecasts issued by the Global Flood Awareness System (GloFAS) into event-based flood hazard maps, which are then combined with exposure and vulnerability information at global scale to derive risk forecasts. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To increase the reliability of our forecasts we propose the integration of model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification and correction of impact forecasts. Finally, we present the results of preliminary tests which show the potential of the proposed procedure in supporting emergency response and management.

  14. Risk management with options and futures under liquidity risk

    OpenAIRE

    Adam-Müller, A F A; Panaretou, A

    2009-01-01

    Futures hedging creates liquidity risk through marking to market. Liquidity risk matters if interim losses on a futures position have to be financed at a markup over the risk-free rate. This study analyzes the optimal risk management and production decisions of a firm facing joint price and liquidity risk. It provides a rationale for the use of options on futures in imperfect capital markets. If liquidity risk materializes, the firm sells options on futures in order to partly cover this liqui...

  15. Alcohol and Cancer Risk

    Science.gov (United States)

    ... is through the activity of an enzyme called alcohol dehydrogenase, or ADH. Many individuals of Chinese, Korean, and ... Abstract] Yokoyama A, Omori T. Genetic polymorphisms of alcohol and aldehyde dehydrogenases and risk for esophageal and head and neck ...

  16. Lymphedema Risk Reduction Practices

    Science.gov (United States)

    ... 2017 NLN International Conference Position Paper: Lymphedema Risk Reduction Practices ... and water, pat dry, then apply a topical antibacterial. d. Wear non-constricting protective gear over the ...

  17. Risks of underage drinking

    Science.gov (United States)

    ... a higher risk of depression, anxiety, and low self-esteem. Drinking during puberty can also change hormones in ... Abuse, Kokotailo PK. Alcohol use by youth and adolescents: a pediatric concern. Pediatrics. 2010;125(5):1078- ...

  18. Social identities and risk

    DEFF Research Database (Denmark)

    Blok, Anders; Jensen, Mette; Kaltoft, Pernille

    2008-01-01

    Expert-based environmental and health risk regulation is widely believed to suffer from a lack of public understanding and legitimacy. On controversial issues such as genetically modified organisms and food-related chemicals, a "lay-expert discrepancy" in the assessment of risks is clearly visible...... of social identities. On the basis of qualitative interviews with citizens and experts, respectively, we focus on the multiple ways in which identities come to be employed in actors' risk accounts. Empirically, we identify salient characteristics of "typical" imagined experts and lay-people, while arguing...... that these conceptions vary identifiably in-between four groups of citizens and experts. On the basis of our findings, some implications for bridging the lay-expert discrepancy on risk issues are sketched out....

  19. Risk for Travelers

    Science.gov (United States)

    ... with BSE transmitted disease to highly BSE-sensitive transgenic mice at a rate indicative of trace levels ... for human exposure to BSE (see Health Canada's Food Directorate Policy on Specified Risk Material (SRM) in ...

  20. Obesity and Cancer Risk

    Science.gov (United States)

    ... GS. Inflammatory mechanisms in obesity. Annual Review of Immunology 2011; 29:415-445. [PubMed Abstract] Randi G, Franceschi S, La Vecchia C. Gallbladder cancer worldwide: geographical distribution and risk factors. International Journal ...

  1. Risk Factors for Thrombosis

    Institute of Scientific and Technical Information of China (English)

    包承鑫

    2002-01-01

    Thrombotic disease is a multifactorial disease; multiple interactions between genetic and environmental factors contribute to its development. This review summarizes some risk factors reported for arterial thrombosis and venous thrombosis in recent years.

  2. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the developed risk analysis model, it is decided whether control measures are suitable for implementation. The analysis also determines whether the benefits of a given control option outweigh its implementation costs.

  3. Exchange Risk Management Policy

    CERN Document Server

    2005-01-01

    At the Finance Committee of March 2005, following a comment by the CERN Audit Committee, the Chairman invited the Management to prepare a document on exchange risk management policy. The Finance Committee is invited to take note of this document.

  4. Heart Disease Risk Factors

    Science.gov (United States)


  5. Risks of tobacco

    Science.gov (United States)

    URL of this page: //medlineplus.gov/ency/article/002032.htm

  6. Histoplasmosis Risk and Prevention

    Science.gov (United States)

    ... Who gets histoplasmosis? Anyone can get histoplasmosis if they’ve been ...

  7. Between Imperative and Risk

    DEFF Research Database (Denmark)

    Andersen, Steen

    2011-01-01

    While companies from small neutral states are frequently more vulnerable to the risks of doing business with or under dictatorial regimes than are companies from great powers, they are not helpless. This article shows that the strategy that both Danish and Swedish companies selected according...... game but were capable, to a certain degree, of promoting their own interests. This article reveals that the political imperative is not only a matter of political risk but also of political opportunity. The history of Christiani & Nielsen offers a useful case of the political risks and fiscal...... opportunities faced by multinationals working in dictatorial settings. This article concludes that, in a choice between a forestalling strategy and an absorption strategy, the latter offers a better way of managing such risks and to minimize exposure. This becomes especially clear in a comparison with Swedish...

  8. Reduce HIV Risk

    Science.gov (United States)

    "... Our research has demonstrated remarkable success in reducing HIV risk-associated sexual behaviors among African American adolescents and adults."

  9. Econometrics of risk

    CERN Document Server

    Kreinovich, Vladik; Sriboonchitta, Songsak; Suriya, Komsan

    2015-01-01

    This edited book contains several state-of-the-art papers devoted to econometrics of risk. Some papers provide theoretical analysis of the corresponding mathematical, statistical, computational, and economical models. Other papers describe applications of the novel risk-related econometric techniques to real-life economic situations. The book presents new methods developed just recently, in particular, methods using non-Gaussian heavy-tailed distributions, methods using non-Gaussian copulas to properly take into account dependence between different quantities, methods taking into account imprecise ("fuzzy") expert knowledge, and many other innovative techniques. This versatile volume helps practitioners to learn how to apply new techniques of econometrics of risk, and researchers to further improve the existing models and to come up with new ideas on how to best take into account economic risks.

  10. Cultural differences in risk

    Directory of Open Access Journals (Sweden)

    Do-Yeong Kim

    2010-08-01

    Full Text Available We compared South Koreans with Australians in order to characterize cultural differences in attitudes and choices regarding risk, at both the individual and group levels. Our results showed that Australians, when assessed individually, consistently self-reported higher preference for risk than South Koreans, regardless of gender. The data revealed that South Koreans, regardless of gender composition, were willing to take greater risks when making decisions in group decision-making situations than when they were alone. This is a different pattern from that seen in the Australian sample, in which a risky shift was noted only among males. This difference was attributed to the influence of various cultural orientations (independent vs. interdependent relationship styles. This study also provides a discussion of the implications of these results in terms of cultural differences in attitudes and decisions regarding risk.

  11. From Hazard to Risk

    DEFF Research Database (Denmark)

    Madsen, Charlotte Bernhard; Houben, Geert; Hattersley, Sue

    2013-01-01

    Regulatory thresholds for allergenic foods have not yet been developed. This means that public and industrial risk managers do not have regulatory thresholds to decide if a content or level of contamination is acceptable or not. For a long time, data have been inadequate to define safe thresholds...... for food allergens. More and more challenge data from food allergic patients are now available, and this opens the possibility to perform more advanced food allergy safety and risk assessments. These can be used to inform risk management decisions and ultimately to form the basis for regulatory thresholds....... In the chapter we describe three different approaches for safety/risk assessment based on no/low observed adverse effect level, benchmark dose, or probabilistic modeling. These methods are illustrated by examples from real life and the possibilities and limitations are discussed....

  12. Pancreatic Cancer Risk Factors

    Science.gov (United States)

    ... risks of other cancers (or other health problems). Examples of genetic syndromes that can cause exocrine pancreatic cancer include: Hereditary breast and ovarian cancer syndrome, caused by mutations in the BRCA1 or BRCA2 genes Familial atypical ...

  13. Pregnancy - health risks

    Science.gov (United States)

    ... provider before trying to get pregnant. Seeing a prenatal provider before trying to get pregnant or early in the pregnancy can help prevent, or detect and control health risks to the mother and unborn baby ...

  14. Risk Management Plan Rule

    Science.gov (United States)

    RMP implements Section 112(r) of the 1990 Clean Air Act amendments, and requires facilities that use extremely hazardous substances to develop a Risk Management Plan and revise/resubmit every five years. Find guidance, factsheets, training, and assistance.

  15. Resourceful or At Risk

    DEFF Research Database (Denmark)

    Højen-Sørensen, Anna-Katharina

    Introduction: Social categories are used to determine which individuals are at an increased risk of unfavorable outcomes and they are a vital tool for the development of targeted interventions. This presentation takes a critical look at the Resourceful and At Risk categories that are often employed in research and social work, and investigates the possible consequences of the preconceptions born out of these categories....

  16. IT-Risk-Management

    OpenAIRE

    Lin, Yimei

    2014-01-01

    E-business risk is a combination of business model and technology. The object of effective and efficient IT risk management is therefore the connection between business model and technology. It is understood as the totality of all measures, processes and institutions that are directed at deliberately shaping the threats to the business processes and the overall risk position of the economic entity that arise from information management. In addition to the technological ...

  17. College Risk and Return

    OpenAIRE

    Gonzalo Castex

    2011-01-01

    Attending college is thought of as a very profitable investment decision, as its estimated annualized return ranges from 8% to 13%. However, a large fraction of high school graduates do not enroll in college. I reconcile the observed high average returns to schooling with relatively low attendance rates when considering college as a risky investment decision. A high dropout risk has two important effects on the estimated average returns to college: selection bias and risk premium. In order to...

  18. Operational Risk Modeling

    Directory of Open Access Journals (Sweden)

    Gabriela ANGHELACHE

    2011-06-01

    Full Text Available Losses resulting from operational risk events arise from a complex interaction between organizational factors, personnel and market participants that does not fit a simple classification scheme. Taking into account past losses (e.g. Barings, Daiwa, etc.), we can say that operational risk is a major source of financial losses in the banking sector, although until recently it has been underestimated, on the view that such losses are generally minor and do not threaten the survival of a bank.

  19. Risk in financial reporting

    OpenAIRE

    Tsatsaronis, Kostas; Claudio E. V. Borio

    2015-01-01

    Advances in risk measurement technology have reshaped financial markets and the functioning of the financial system. More recently, they have been reshaping the prudential framework. Looking forward, they have the potential to reshape financial reporting too. Recent initiatives to improve financial reporting standards have brought to the fore significant differences in perspective between accounting standard setters and prudential authorities. Building on previous work, we argue that risk mea...

  20. Fuzzy based risk register for construction project risk assessment

    Science.gov (United States)

    Kuchta, Dorota; Ptaszyńska, Ewa

    2017-07-01

    The paper presents a fuzzy-based risk register used to identify risks which appear in construction projects and to assess their attributes. Risk is considered here as a possible event with negative consequences for the project [4]. We use different risk attributes in the proposed risk register. Values of risk attributes are generated by using fuzzy numbers. Specific risk attributes have different importance for project managers of construction projects. To compare specific risk attributes we use methods of fuzzy number ranking. The main strengths of the proposed concept in managing construction projects are also presented in the paper.
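
    As a minimal illustration of the kind of register the paper describes, the sketch below stores one risk attribute (probability) as triangular fuzzy numbers and ranks the risks by centroid; the risks, values and the centroid ranking method are illustrative assumptions, not the authors' exact scheme.

        # A triangular fuzzy number (a, b, c): support [a, c], peak at b.
        risk_register = {
            "ground conditions worse than surveyed": (0.4, 0.6, 0.9),
            "late delivery of structural steel":     (0.2, 0.4, 0.6),
            "design change requested by client":     (0.1, 0.3, 0.8),
        }

        def centroid(tfn):
            """Centre of gravity of a triangular fuzzy number."""
            a, b, c = tfn
            return (a + b + c) / 3.0

        # Rank risks from highest to lowest centroid score.
        for name, tfn in sorted(risk_register.items(),
                                key=lambda kv: centroid(kv[1]), reverse=True):
            print(f"{centroid(tfn):.2f}  {name}")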

  1. Northwest Climate Risk Assessment

    Science.gov (United States)

    Mote, P.; Dalton, M. M.; Snover, A. K.

    2012-12-01

    As part of the US National Climate Assessment, the Northwest region undertook a process of climate risk assessment. This process included an expert evaluation of previously identified impacts, their likelihoods, and consequences, and engaged experts from both academia and natural resource management practice (federal, tribal, state, local, private, and non-profit) in a workshop setting. An important input was a list of 11 risks compiled by state agencies in Oregon and similar adaptation efforts in Washington. By considering jointly the likelihoods, consequences, and adaptive capacity, participants arrived at an approximately ranked list of risks which was further assessed and prioritized through a series of risk scoring exercises to arrive at the top three climate risks facing the Northwest: 1) changes in amount and timing of streamflow related to snowmelt, causing far-reaching ecological and socioeconomic consequences; 2) coastal erosion and inundation, and changing ocean acidity, combined with low adaptive capacity in the coastal zone to create large risks; and 3) the combined effects of wildfire, insect outbreaks, and diseases will cause large areas of forest mortality and long-term transformation of forest landscapes.

  2. Draft Title 40 CFR 191 compliance certification application for the Waste Isolation Pilot Plant. Volume 7: Appendix GCR Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-31

    This report contains the second part of the geological characterization report for the Waste Isolation Pilot Plant. Both hydrology and geochemistry are evaluated. The following aspects of hydrology are discussed: surface hydrology; ground water hydrology; and hydrology drilling and testing. Hydrologic studies at the site and adjacent site areas have concentrated on defining the hydrogeology and associated salt dissolution phenomena. The geochemical aspects include a description of chemical properties of geologic media presently found in the surface and subsurface environments of southeastern New Mexico in general, and of the proposed WIPP withdrawal area in particular. The characterization does not consider any aspect of artificially-introduced material, temperature, pressure, or any other physico-chemical condition not native to the rocks of southeastern New Mexico.

  3. Differential Regulation of the Overlapping Kaposi's Sarcoma-Associated Herpesvirus vGCR (orf74) and LANA (orf73) Promoters

    OpenAIRE

    Jeong, Joseph; Papin, James; Dittmer, Dirk

    2001-01-01

    Similar to that of other herpesviruses, Kaposi's sarcoma-associated herpesvirus (KSHV/HHV-8) lytic replication destroys the host cell, while the virus can persist in a latent state in synchrony with the host. During latency only a few genes are transcribed, and the question becomes one of what determines latent versus lytic gene expression. Here we undertake a detailed analysis of the latency-associated nuclear antigen (LANA [orf73]) promoter (LANAp). We characterized a minimal region that is...

  4. Colorectal Cancer Risk Assessment Tool

    Science.gov (United States)


  5. Risk in the Weapons Stockpile

    Energy Technology Data Exchange (ETDEWEB)

    Noone, Bailey C [Los Alamos National Laboratory

    2012-08-14

    When it comes to the nuclear weapons stockpile, risk must be as low as possible. Design and care to keep the stockpile healthy involve all aspects of risk management. Design diversity is a method that helps to mitigate risk.

  6. Health risks of alcohol use

    Science.gov (United States)

    Alcoholism - risks; Alcohol abuse - risks; Alcohol dependence - risks; Risky drinking ... Beer, wine, and liquor all contain alcohol. If you are drinking any of these, you are using alcohol. Your drinking patterns may vary, depending on who you are with ...

  7. Simplified seismic risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pellissetti, Manuel; Klapp, Ulrich [AREVA NP GmbH, Erlangen (Germany)

    2011-07-01

    Within the context of probabilistic safety analysis (PSA) for nuclear power plants (NPPs), seismic risk assessment has the purpose to demonstrate that the contribution of seismic events to overall risk is not excessive. The most suitable vehicle for seismic risk assessment is a full scope seismic PSA (SPSA), in which the frequency of core damage due to seismic events is estimated. An alternative method is represented by seismic margin assessment (SMA), which aims at showing sufficient margin between the site-specific safe shutdown earthquake (SSE) and the actual capacity of the plant. Both methods are based on system analysis (fault-trees and event-trees) and hence require fragility estimates for safety relevant systems, structures and components (SSCs). If the seismic conditions at a specific site of a plant are not very demanding, then it is reasonable to expect that the risk due to seismic events is low. In such cases, the cost-benefit ratio for performing a full scale, site-specific SPSA or SMA will be excessive, considering the ultimate objective of seismic risk analysis. Rather, it will be more rational to rely on a less comprehensive analysis, used as a basis for demonstrating that the risk due to seismic events is not excessive. The present paper addresses such a simplified approach to seismic risk assessment which is used in AREVA to: - estimate seismic risk in early design stages, - identify needs to extend the design basis, - define a reasonable level of seismic risk analysis. Starting from a conservative estimate of the overall plant capacity, in terms of the HCLPF (High Confidence of Low Probability of Failure), and utilizing a generic value for the variability, the seismic risk is estimated by convolution of the hazard and the fragility curve. Critical importance is attached to the selection of the plant capacity in terms of the HCLPF, without performing extensive fragility calculations of seismically relevant SSCs. A suitable basis
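
    The convolution the abstract refers to can be sketched numerically as below: a lognormal fragility curve is anchored at an assumed HCLPF with a generic variability and integrated against the slope of an assumed power-law hazard curve; all numbers are illustrative, not AREVA values.

        import numpy as np
        from scipy.stats import norm

        hclpf = 0.3     # g; high confidence of low probability of failure (assumed)
        beta_c = 0.4    # generic composite variability (assumed)
        # HCLPF = Am * exp(-2.326 * beta_c) on the composite fragility curve.
        Am = hclpf * np.exp(2.326 * beta_c)

        def fragility(a):
            """Lognormal conditional probability of failure at acceleration a."""
            return norm.cdf(np.log(a / Am) / beta_c)

        def hazard(a):
            """Annual exceedance frequency; simple power law (assumed)."""
            return 1e-4 * (a / 0.3) ** -2.5

        a = np.linspace(0.05, 3.0, 2000)
        dHda = np.gradient(hazard(a), a)           # negative slope of the hazard curve
        annual_failure_freq = np.trapz(fragility(a) * (-dHda), a)
        print(f"annual failure frequency: {annual_failure_freq:.2e} /yr")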

  8. Credit risk management in banks

    OpenAIRE

    Pětníková, Tereza

    2014-01-01

    The subject of this diploma thesis is managing credit risk in banks, as the most significant risk faced by banks. The aim of this work is to define the basic techniques, tools and methods that are used by banks to manage credit risk. The first part of this work focuses on defining these procedures and describes the entire process of credit risk management, from the definition of credit risk, describing credit strategy and policy, organizational structure, defining the most used credit risk mi...

  9. RISK MANAGEMENT IN CONSTRUCTION INDUSTRY

    OpenAIRE

    V. Sankara Subramaniyan; K. Veerakumar

    2017-01-01

    Construction industry is highly risk prone, with complex and dynamic project environments which create an atmosphere of high uncertainty and risk. The industry is vulnerable to various technical, socio-political and business risks. The track record to cope with these risks has not been very good in construction industry. Risk management is a concept which becomes very popular in a number of businesses. Many companies often establish a risk management procedure in their projects for improving ...

  10. RISK REGION. POINTS OF VIEW

    Directory of Open Access Journals (Sweden)

    VICTOR SOROCOVSCHI

    2016-03-01

    Full Text Available The paper deals with three fundamental issues related to natural risks. The first issue concerns the definition and characteristics of the risk region. The second issue concerns the identification of criteria that underlie the demarcation and ranking of risk regions. The analysis of European risk regions exposed to major natural risks and the identification of the frequency of natural risks affecting major regions of Romania are the topics addressed in the last part of the paper.

  11. Can Public Health Risk Assessment Using Risk Matrices Be Misleading?

    Science.gov (United States)

    Vatanpour, Shabnam; Hrudey, Steve E; Dinu, Irina

    2015-08-14

    The risk assessment matrix is a widely accepted, semi-quantitative tool for assessing risks and setting priorities in risk management. Although the method can be useful to promote discussion to distinguish high risks from low risks, a published critique described a problem when the frequency and severity of risks are negatively correlated. A theoretical analysis showed that risk predictions could be misleading. We evaluated a practical public health example because it provided experiential risk data that allowed us to assess the practical implications of the published concern that risk matrices would make predictions that are worse than random. We explored this predicted problem by constructing a risk assessment matrix using a public health risk scenario (tainted blood transfusion infection risk) that provides negative correlation between harm frequency and severity. We estimated the risk from the experiential data and compared these estimates with those provided by the risk assessment matrix. Although we validated the theoretical concern, for these authentic experiential data, the practical scope of the problem was limited. The risk matrix has been widely used in risk assessment. This method should not be abandoned wholesale, but users must address the source of the problem, apply the risk matrix with a full understanding of this problem and use matrix predictions to inform, but not drive, decision-making.
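
    The core of the published concern is easy to reproduce: when frequency and severity are negatively correlated, a matrix can rank hazards in the opposite order of their expected loss. The sketch below uses invented bin edges and hazards purely for illustration.

        def matrix_score(prob, severity):
            """3x3 risk matrix: score = likelihood bin * severity bin (1..3 each).
            Bin edges are illustrative assumptions."""
            p_bin = 1 if prob < 1e-4 else 2 if prob < 1e-2 else 3
            s_bin = 1 if severity < 1e4 else 2 if severity < 1e6 else 3
            return p_bin * s_bin

        # (annual probability, consequence in arbitrary loss units)
        hazards = {
            "moderately frequent, moderate harm": (5e-3, 5e5),
            "very rare, catastrophic":            (5e-5, 1e8),
        }

        # The matrix scores these 4 vs 3, yet the expected losses are 2,500 vs
        # 5,000: the matrix ranking reverses the expected-loss ranking.
        for name, (p, s) in hazards.items():
            print(f"{name}: matrix score {matrix_score(p, s)}, "
                  f"expected loss {p * s:,.0f}")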

  12. RISK TRANSFER AND RISK REDUCTION OF ATHLETES

    Directory of Open Access Journals (Sweden)

    Željko Vojinović

    2011-09-01

    Full Text Available One of the indispensable factors in sports is insurance. Accidents affect not only health, permanently or temporarily; they also affect financial resources, more or less, depending on the recovery time of the injuries. The insurer in this case pays the agreed amount (the agreed compensation) to the insured. Each participant in a sporting competition should have personal insurance. The aim of this theme is to explain how athletes can reduce the risks they are exposed to in their activities, training and competition, and in other moments of life. Every man has a need for certainty in the future, regardless of the category in which he works and the values and skills available; the only difference is in absolute values, and everyone has his own need. Athletes, from the less successful to the most successful ones, whose transfers or fees are in the millions, all think about the future and, of course, about how to save and invest the funds they have earned. They can find a solution in insurance, as an institution that takes over their risks, taking care of the invested money and the benefits of those stakes. When there is uncertainty in our lives we seek security and see it as a basic need. Insurers claim that insurance offers just that - the security of property and life.

  13. Microbial Risk Assessment

    Science.gov (United States)

    Ott, C. M.; Mena, K. D.; Nickerson, C.A.; Pierson, D. L.

    2009-01-01

    Historically, microbiological spaceflight requirements have been established in a subjective manner based upon expert opinion of both environmental and clinical monitoring results and the incidence of disease. The limited amount of data, especially from long-duration missions, has created very conservative requirements based primarily on the concentration of microorganisms. Periodic reevaluations of new data from later missions have allowed some relaxation of these stringent requirements. However, the requirements remain very conservative and subjective in nature, and the risk of crew illness due to infectious microorganisms is not well defined. The use of modeling techniques for microbial risk has been applied in the food and potable water industries and has exceptional potential for spaceflight applications. From a productivity standpoint, this type of modeling can (1) decrease unnecessary costs and resource usage and (2) prevent inadequate or inappropriate data for health assessment. In addition, a quantitative model has several advantages for risk management and communication. By identifying the variable components of the model and the knowledge associated with each component, this type of modeling can: (1) Systematically identify and close knowledge gaps, (2) Systematically identify acceptable and unacceptable risks, (3) Improve communication with stakeholders as to the reasons for resource use, and (4) Facilitate external scientific approval of the NASA requirements. The modeling of microbial risk involves the evaluation of several key factors including hazard identification, crew exposure assessment, dose-response assessment, and risk characterization. Many of these factors are similar to conditions found on Earth; however, the spaceflight environment is very specialized as the inhabitants live in a small, semi-closed environment that is often dependent on regenerative life support systems. To further complicate modeling efforts, microbial dose
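
    A minimal sketch of the dose-response step in such a model is shown below, using the standard exponential dose-response form P(infection) = 1 - exp(-r * dose); the pathogen parameter, exposure level and mission length are illustrative assumptions, not NASA requirement values.

        import math

        def p_infection(dose, r):
            """Exponential dose-response: organisms act independently,
            each with per-organism infection probability r."""
            return 1.0 - math.exp(-r * dose)

        r = 0.005            # assumed per-organism infectivity
        daily_dose = 1.0     # assumed ingested organisms per day
        p_day = p_infection(daily_dose, r)
        # Risk over a 180-day mission, assuming independent daily exposures.
        p_mission = 1.0 - (1.0 - p_day) ** 180
        print(f"per-day risk {p_day:.4f}, 180-day mission risk {p_mission:.3f}")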

  14. Risk management: concepts and guidance

    National Research Council Canada - National Science Library

    Pritchard, Carl L

    2015-01-01

    .... Supplying comprehensive coverage of risk management tools, practices, and protocols, the book presents powerful techniques that can enhance organizational risk identification, assessment, and management ...

  15. [Cardiovascular risk factors in women].

    Science.gov (United States)

    Cengel, Atiye

    2010-03-01

    It is estimated that at least 80% of patients with cardiovascular disease (CVD) have conventional risk factors, and optimization of these risk factors can reduce morbidity and mortality due to this disease considerably. Contemporary women have an increased burden of some of these risk factors, such as obesity, metabolic syndrome and smoking. Turkish women have a worse CV risk profile than Turkish men in some aspects. Risk stratification systems such as Framingham have a tendency to underestimate the risk in women. Coronary artery disease remains confined to the vessel wall for a longer period of time in women; therefore obstructive disease appears later in their lifespan, necessitating risk stratification systems for estimating their lifetime risk.

  16. Current Chemical Risk Reduction Activities

    Science.gov (United States)

    EPA's existing chemicals programs address pollution prevention, risk assessment, hazard and exposure assessment and/or characterization, and risk management for chemical substances in commercial use.

  17. RISK MANAGEMENT: AN INTEGRATED APPROACH TO RISK MANAGEMENT AND ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Szabo Alina

    2012-12-01

    Full Text Available Purpose: The objective of this paper is to offer an overview of the risk management cycle by focusing on prioritization and treatment, in order to ensure an integrated approach to risk management and assessment, and to establish the ‘top 8-12’ risks report within the organization. The interface with Internal Audit is ensured by the implementation of the scoring method to prioritize risks collected from the previously generated risk report. Methodology/approach: Using evidence from other research in the area and professional expertise, this article outlines an integrated approach to risk assessment and risk management reporting processes, by separating risks into two main categories: strategic and operational. The focus is on risk prioritization and scoring; the final output will comprise a mix of strategic and operational (‘top 8-12’) risks, which should be used to establish the annual Internal Audit plan. Originality/value: Using an integrated approach to risk assessment and risk management eliminates the need for a separate Internal Audit risk assessment over prevailing risks. It reduces the level of risk assessment overlap by different functions (Tax, Treasury, Information System) over the same risk categories, as a single methodology is used, and it aligns the timings of risk assessment exercises. The risk prioritization by usage of risk and control scoring criteria highlights the combination of financial and non-financial impact criteria, allowing risks that do not naturally lend themselves to a financial amount to be assessed consistently. The usage of the scoring method to prioritize the risks included in the annual audit plan is emphasized in order to increase accuracy and timeliness.
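
    The scoring-based prioritization the paper outlines might be sketched as below: each strategic or operational risk receives impact and likelihood scores, their product ranks all risks on one list, and the top slice feeds the Internal Audit plan. The scales and example risks are illustrative assumptions.

        risks = [
            # (name, category, impact 1-5, likelihood 1-5)
            ("FX exposure on long-term contracts", "strategic",   4, 3),
            ("ERP access rights not reviewed",     "operational", 3, 4),
            ("Key supplier concentration",         "strategic",   5, 2),
            ("Tax filing errors",                  "operational", 2, 3),
        ]

        scored = sorted(risks, key=lambda r: r[2] * r[3], reverse=True)
        top = scored[:3]   # in practice the 'top 8-12'; three here for brevity
        for name, category, impact, likelihood in top:
            print(f"{impact * likelihood:>2}  [{category}] {name}")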

  18. Reputation and its risks.

    Science.gov (United States)

    Eccles, Robert G; Newquist, Scott C; Schatz, Roland

    2007-02-01

    Regulators, industry groups, consultants, and individual companies have developed elaborate guidelines over the years for assessing and managing risks in a wide range of areas, from commodity prices to natural disasters. Yet they have all but ignored reputational risk, mostly because they aren't sure how to define or measure it. That's a big problem, say the authors. Because so much market value comes from hard-to-assess intangible assets like brand equity and intellectual capital, organizations are especially vulnerable to anything that damages their reputations. Moreover, companies with strong positive reputations attract better talent and are perceived as providing more value in their products and services, which often allows them to charge a premium. Their customers are more loyal and buy broader ranges of products and services. Since the market believes that such companies will deliver sustained earnings and future growth, they have higher price-earnings multiples and market values and lower costs of capital. Most companies, however, do an inadequate job of managing their reputations in general and the risks to their reputations in particular. They tend to focus their energies on handling the threats to their reputations that have already surfaced. That is not risk management; it is crisis management--a reactive approach aimed at limiting the damage. The authors provide a framework for actively managing reputational risk. They introduce three factors (the reputation-reality gap, changing beliefs and expectations, and weak internal coordination) that affect the level of such risks and then explore several ways to sufficiently quantify and control those factors. The process outlined in this article will help managers do a better job of assessing existing and potential threats to their companies' reputations and deciding whether to accept a particular risk or take actions to avoid or mitigate it.

  19. [Detecting high risk pregnancy].

    Science.gov (United States)

    Doret, Muriel; Gaucherand, Pascal

    2009-12-20

    Antenatal care aims to reduce maternal and foetal mortality and morbidity. Maternal and foetal mortality can be due to different causes. Their knowledge allows identifying pregnancies (high risk pregnancies) with factors associated with an increased risk for maternal and/or foetal mortality and serious morbidity. Identification of high risk pregnancies and initiation of appropriate treatment and/or surveillance should improve maternal and/or foetal outcome. New risk factors are continuously described thanks to improvements in antenatal care and developments in biology and cytopathology, increasing the complexity of identifying high risk pregnancies. The level of risk can change over the course of the pregnancy. Ideally, it should be evaluated prior to the pregnancy and at each antenatal visit. Clinical examination can screen for intra-uterine growth restriction, pre-eclampsia and threatened preterm labour; ultrasounds help in the diagnosis of foetal morphological anomalies, foetal chromosomal anomalies, placenta praevia and abnormal foetal growth; biological exams are used to screen for pre-eclampsia, gestational diabetes, trisomy 21 (for which the screening method just changed), rhesus immunisation, seroconversion for toxoplasmosis or rubella, and unknown infectious diseases (syphilis, hepatitis B, HIV). During pregnancy, most of the preventive strategies have to be initiated during the first trimester or even before conception. Prevention of neural-tube defects, neonatal hypocalcemia and listeriosis should be performed for all women. On the opposite, some measures concern only women with risk factors, such as prevention of toxoplasmosis, rhesus immunization (which recently changed), tobacco complications, and pre-eclampsia and intra-uterine growth restriction.

  20. The Use of Dynamic Stochastic Social Behavior Models to Produce Likelihood Functions for Risk Modeling of Proliferation and Terrorist Attacks

    Energy Technology Data Exchange (ETDEWEB)

    Young, Jonathan; Thompson, Sandra E.; Brothers, Alan J.; Whitney, Paul D.; Coles, Garill A.; Henderson, Cindy L.; Wolf, Katherine E.; Hoopes, Bonnie L.

    2008-12-01

    The ability to estimate the likelihood of future events based on current and historical data is essential to the decision making process of many government agencies. Successful predictions related to terror events and characterizing the risks will support development of options for countering these events. The predictive tasks involve both technical and social component models. The social components have presented a particularly difficult challenge. This paper outlines some technical considerations of this modeling activity. Both data and predictions associated with the technical and social models will likely be known with differing certainties or accuracies – a critical challenge is linking across these model domains while respecting this fundamental difference in certainty level. This paper will describe the technical approach being taken to develop the social model and identification of the significant interfaces between the technical and social modeling in the context of analysis of diversion of nuclear material.
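
    One simple way to link estimates known with differing certainties, sketched below, is inverse-variance weighting, so that the less certain (e.g. social-model) estimate pulls the fused value only weakly; the Gaussian assumption and all numbers are illustrative, not the method of the paper.

        def fuse(mu1, var1, mu2, var2):
            """Inverse-variance weighted combination of two Gaussian estimates."""
            w1, w2 = 1.0 / var1, 1.0 / var2
            mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
            return mu, 1.0 / (w1 + w2)

        # Technical model: fairly certain; social model: much less certain.
        mu, var = fuse(mu1=0.30, var1=0.01, mu2=0.60, var2=0.09)
        print(f"fused estimate: {mu:.2f} (variance {var:.3f})")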