WorldWideScience

Sample records for emd-based event analysis

  1. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, since such events generally exert a strong influence on those markets. To better estimate the impact of events on crude oil price volatility, this study applies an EMD-based event analysis approach. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is contained in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution for estimating the impact of extreme events on crude oil price variation. (author)

  2. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, since such events generally exert a strong influence on those markets. To better estimate the impact of events on crude oil price volatility, this study applies an EMD-based event analysis approach. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is contained in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution for estimating the impact of extreme events on crude oil price variation. (author)
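The decomposition step these two records describe can be illustrated with a minimal EMD sift in Python. This is only a sketch, not the authors' implementation: the extrema detection, the linear envelopes (standard EMD uses cubic splines), and the stopping rules are simplifying assumptions; production work would use a tested EMD library.

```python
import numpy as np

def sift_imf(x, n_sifts=8):
    """One sifting pass: repeatedly subtract the mean of the upper and
    lower extrema envelopes (linear envelopes here, for simplicity)."""
    h = np.asarray(x, float).copy()
    t = np.arange(len(h))
    for _ in range(n_sifts):
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxima) < 2 or len(minima) < 2:
            break  # too few extrema to build envelopes
        upper = np.interp(t, maxima, h[maxima])
        lower = np.interp(t, minima, h[minima])
        h = h - (upper + lower) / 2.0
    return h

def emd(x, max_imfs=8):
    """Decompose x into intrinsic modes, fine to coarse, plus a residual
    trend; by construction x == sum(imfs) + residual."""
    imfs, residual = [], np.asarray(x, float).copy()
    for _ in range(max_imfs):
        r = residual
        n_ext = (np.sum((r[1:-1] > r[:-2]) & (r[1:-1] > r[2:]))
                 + np.sum((r[1:-1] < r[:-2]) & (r[1:-1] < r[2:])))
        if n_ext < 4:
            break  # residual is (nearly) monotone: treat it as the trend
        imf = sift_imf(residual)
        imfs.append(imf)
        residual = residual - imf
    return imfs, residual
```

Applied to a price-like series, the low-index IMFs carry the fast fluctuations while the residual carries the average trend the abstracts refer to.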

  3. EMD-Based Symbolic Dynamic Analysis for the Recognition of Human and Nonhuman Pyroelectric Infrared Signals

    Directory of Open Access Journals (Sweden)

    Jiaduo Zhao

    2016-01-01

    Full Text Available In this paper, we propose an effective human and nonhuman pyroelectric infrared (PIR) signal recognition method to reduce PIR detector false alarms. First, using the mathematical model of the PIR detector, we analyze the physical characteristics of human and nonhuman PIR signals; second, based on the analysis results, we propose an empirical mode decomposition (EMD)-based symbolic dynamic analysis method for the recognition of human and nonhuman PIR signals. In the proposed method, we first extract the detailed features of a PIR signal into five symbol sequences using an EMD-based symbolization method; then we generate five feature descriptors for each PIR signal by constructing five probabilistic finite state automata from the symbol sequences. Finally, we use a weighted voting classification strategy to classify the PIR signals with their feature descriptors. Comparative experiments show that the proposed method can effectively classify human and nonhuman PIR signals and reduce the PIR detector's false alarms.

  4. EMD-Based Symbolic Dynamic Analysis for the Recognition of Human and Nonhuman Pyroelectric Infrared Signals.

    Science.gov (United States)

    Zhao, Jiaduo; Gong, Weiguo; Tang, Yuzhen; Li, Weihong

    2016-01-20

    In this paper, we propose an effective human and nonhuman pyroelectric infrared (PIR) signal recognition method to reduce PIR detector false alarms. First, using the mathematical model of the PIR detector, we analyze the physical characteristics of human and nonhuman PIR signals; second, based on the analysis results, we propose an empirical mode decomposition (EMD)-based symbolic dynamic analysis method for the recognition of human and nonhuman PIR signals. In the proposed method, we first extract the detailed features of a PIR signal into five symbol sequences using an EMD-based symbolization method; then we generate five feature descriptors for each PIR signal by constructing five probabilistic finite state automata from the symbol sequences. Finally, we use a weighted voting classification strategy to classify the PIR signals with their feature descriptors. Comparative experiments show that the proposed method can effectively classify human and nonhuman PIR signals and reduce the PIR detector's false alarms.
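The symbolization-plus-automaton pipeline in these two records can be pictured with a small sketch: partition the signal amplitude into quantile bins, map samples to symbols, and estimate the transition probabilities of the resulting symbol sequence. The four-symbol alphabet and quantile binning are illustrative assumptions, not the paper's exact scheme, and the weighted-voting classifier is omitted.

```python
import numpy as np

def symbolize(x, n_symbols=4):
    """Map each sample to a symbol via equal-frequency (quantile) bins."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)  # values in 0..n_symbols-1

def pfsa_features(symbols, n_symbols=4):
    """Estimate the transition-probability matrix of the symbol sequence;
    its flattened rows act as a feature descriptor (a simple stand-in for
    the probabilistic finite state automaton)."""
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    row = counts.sum(axis=1, keepdims=True)
    # unseen states get a uniform row so every row remains a distribution
    probs = np.where(row > 0, counts / np.maximum(row, 1), 1.0 / n_symbols)
    return probs.flatten()
```

In the paper's setting this would be applied per IMF, yielding one descriptor per symbol sequence.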

  5. Forecasting crude oil price with an EMD-based neural network ensemble learning paradigm

    International Nuclear Information System (INIS)

    Yu, Lean; Wang, Shouyang; Lai, Kin Keung

    2008-01-01

    In this study, an empirical mode decomposition (EMD) based neural network ensemble learning paradigm is proposed for world crude oil spot price forecasting. For this purpose, the original crude oil spot price series is first decomposed into a finite, and often small, number of intrinsic mode functions (IMFs). Then a three-layer feed-forward neural network (FNN) model is used to model each of the extracted IMFs, so that the tendencies of these IMFs can be accurately predicted. Finally, the prediction results of all IMFs are combined by an adaptive linear neural network (ALNN) to formulate an ensemble output for the original crude oil price series. For verification and testing, two main crude oil price series, the West Texas Intermediate (WTI) and Brent crude oil spot prices, are used to test the effectiveness of the proposed EMD-based neural network ensemble learning methodology. The empirical results obtained demonstrate the attractiveness of the proposed EMD-based neural network ensemble learning paradigm. (author)
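The final combination stage of this paradigm amounts to learning linear weights over the per-IMF forecasts. As a sketch, an ordinary least-squares fit stands in for the adaptive linear neural network (ALNN); the forecast matrix here is synthetic, and the FNN stage is not reproduced.

```python
import numpy as np

def fit_ensemble_weights(imf_preds, target):
    """imf_preds: (n_samples, n_imfs) matrix of per-IMF forecasts.
    Learn combination weights plus a bias by least squares, a simple
    stand-in for training the ALNN combiner."""
    X = np.hstack([imf_preds, np.ones((len(target), 1))])  # bias column
    w, *_ = np.linalg.lstsq(X, target, rcond=None)
    return w

def ensemble_forecast(imf_preds, w):
    """Apply learned weights to new per-IMF forecasts."""
    X = np.hstack([imf_preds, np.ones((imf_preds.shape[0], 1))])
    return X @ w
```

The design point is that each IMF predictor can be trained independently, while the combiner learns how much each scale contributes to the final price forecast.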

  6. Multivariate EMD-Based Modeling and Forecasting of Crude Oil Price

    Directory of Open Access Journals (Sweden)

    Kaijian He

    2016-04-01

    Full Text Available Recent empirical studies reveal evidence of the co-existence of heterogeneous data characteristics, distinguishable by time scale, in the movements of crude oil prices. In this paper we propose a new multivariate empirical mode decomposition (EMD)-based model to take advantage of these heterogeneous characteristics of the price movement and to model them in the crude oil markets. Empirical studies in benchmark crude oil markets confirm that more diverse heterogeneous data characteristics can be revealed and modeled in the projected time-delayed domain. The proposed model demonstrates superior performance compared to the benchmark models.

  7. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

    Full Text Available ) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...

  8. Application of EMD-Based SVD and SVM to Coal-Gangue Interface Detection

    Directory of Open Access Journals (Sweden)

    Wei Liu

    2014-01-01

    Full Text Available Coal-gangue interface detection during top-coal caving mining is a challenging problem. This paper proposes a new vibration signal analysis approach to detecting the coal-gangue interface, based on singular value decomposition (SVD) techniques and support vector machines (SVMs). Due to the nonstationary characteristics of the vibration signals of the tail boom support of the longwall mining machine in this complicated environment, empirical mode decomposition (EMD) is used to decompose the raw vibration signals into a number of intrinsic mode functions (IMFs), from which the initial feature vector matrices can be formed automatically. By applying the SVD algorithm to the initial feature vector matrices, the singular values of the matrices can be obtained and used as the input feature vectors of the SVM classifier. The analysis results for vibration signals from the tail boom support of a longwall mining machine show that the method based on EMD, SVD and SVM is effective for coal-gangue interface detection even when the number of samples is small.
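The EMD-to-SVD feature step reduces each segment's IMF matrix to its singular values. A minimal sketch of that step follows; the unit-sum normalization is our own choice (to make features scale-invariant), and the SVM classifier itself is omitted.

```python
import numpy as np

def svd_features(imf_matrix):
    """imf_matrix: rows are the IMFs of one vibration segment.
    Its singular values summarize the energy structure of the segment
    and serve as the input feature vector for the classifier."""
    s = np.linalg.svd(np.asarray(imf_matrix, float), compute_uv=False)
    return s / s.sum()  # normalize so features are scale-invariant
```

Singular values come out in non-increasing order, so the leading components describe the dominant oscillation modes of the segment.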

  9. EMD-Based Predictive Deep Belief Network for Time Series Prediction: An Application to Drought Forecasting

    Directory of Open Access Journals (Sweden)

    Norbert A. Agana

    2018-02-01

    Full Text Available Drought is a stochastic natural feature that arises from an intense and persistent shortage of precipitation. Its impact is mostly manifested as agricultural and hydrological droughts following an initial meteorological phenomenon. Drought prediction is essential because it can aid in preparedness for, and impact-related management of, its effects. This study considers the drought forecasting problem by developing a hybrid predictive model using a denoised empirical mode decomposition (EMD) and a deep belief network (DBN). The proposed method first decomposes the data into several intrinsic mode functions (IMFs) using EMD, and a reconstruction of the original data is obtained by considering only the relevant IMFs. Detrended fluctuation analysis (DFA) was applied to each IMF to determine the threshold for robust denoising performance: based on their scaling exponents, irrelevant intrinsic mode functions are identified and suppressed. The proposed method was applied to predict drought indices at different time scales across the Colorado River basin, using a standardized streamflow index (SSI) as the drought index. The results obtained using the proposed method were compared with those of standard methods such as the multilayer perceptron (MLP) and support vector regression (SVR). The proposed hybrid model showed improved prediction accuracy, especially for multi-step-ahead predictions.
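The DFA step used here for IMF screening can be sketched compactly: integrate the series, detrend it linearly in windows of several sizes, and fit the log-log slope of fluctuation versus window size. The particular window sizes are an illustrative assumption; the paper's thresholding rule on the exponents is not reproduced.

```python
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """First-order detrended fluctuation analysis: returns the scaling
    exponent alpha (the slope of log RMS fluctuation vs. log scale).
    White noise gives alpha near 0.5; strongly trended or integrated
    series give larger exponents."""
    y = np.cumsum(x - np.mean(x))  # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        f2 = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)  # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope
```

In the paper's pipeline, IMFs whose exponents mark them as noise-like would be suppressed before reconstruction.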

  10. Patient-Specific Seizure Detection in Long-Term EEG Using Signal-Derived Empirical Mode Decomposition (EMD)-based Dictionary Approach.

    Science.gov (United States)

    Kaleem, Muhammad; Gurve, Dharmendra; Guergachi, Aziz; Krishnan, Sridhar

    2018-06-25

    The objective of the work described in this paper is the development of a computationally efficient methodology for patient-specific automatic seizure detection in long-term multi-channel EEG recordings. Approach: A novel patient-specific seizure detection approach based on a signal-derived empirical mode decomposition (EMD) dictionary is proposed. For this purpose, we use an empirical framework for EMD-based dictionary creation and learning, inspired by traditional dictionary learning methods, in which the EMD-based dictionary is learned from the multi-channel EEG data being analyzed for automatic seizure detection. We present the algorithm for dictionary creation and learning, whose purpose is to learn dictionaries with a small number of atoms. Using training signals belonging to seizure and non-seizure classes, an initial dictionary, termed the raw dictionary, is formed. The atoms of the raw dictionary are intrinsic mode functions obtained by decomposing the training signals with the empirical mode decomposition algorithm. The raw dictionary is then trained using a learning algorithm, resulting in a substantial decrease in the number of atoms in the trained dictionary. The trained dictionary is then used for automatic seizure detection: the coefficients of the orthogonal projections of test signals onto the trained dictionary form the features used to classify test signals into seizure and non-seizure classes. Thus no hand-engineered features have to be extracted from the data, as in traditional seizure detection approaches. Main results: The performance of the proposed approach is validated using the CHB-MIT benchmark database; averaged accuracy, sensitivity and specificity values of 92.9%, 94.3% and 91.5%, respectively, are obtained using a support vector machine classifier and five-fold cross-validation. These results are compared with other approaches using the same database, and the suitability
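The feature-extraction idea at the heart of this record (projection coefficients against a learned dictionary) can be sketched with a toy dictionary; the dictionary here is hand-made for illustration, whereas in the paper the atoms are IMFs and the dictionary is learned.

```python
import numpy as np

def projection_features(signal, dictionary):
    """dictionary: (n_atoms, n_samples) matrix of atoms. The coefficients
    of the least-squares projection of the signal onto the span of the
    atoms serve as the classification features."""
    D = np.asarray(dictionary, float)
    coeffs, *_ = np.linalg.lstsq(D.T, signal, rcond=None)
    return coeffs
```

Because the features are just projection coefficients, no hand-engineered measures (energy, line length, etc.) need to be computed from the EEG.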

  11. EVENT PLANNING USING FUNCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  12. MGR External Events Hazards Analysis

    International Nuclear Information System (INIS)

    Booth, L.

    1999-01-01

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1); the PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic way to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences, as determined during Design Basis Event (DBE) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses

  13. TEMAC, Top Event Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

    1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as that produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to basic event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem - Maxima of: 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement
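The Monte Carlo evaluation of a top event from its cut sets can be sketched in a few lines. The cut sets and probabilities below are invented for illustration; TEMAC's matrix representation of the Boolean expression is not reproduced.

```python
import numpy as np

def top_event_mc(cut_sets, probs, n=100_000, seed=0):
    """Estimate the top-event probability of a fault tree given its
    minimal cut sets (lists of basic-event indices) and the basic-event
    probabilities, by Monte Carlo sampling of basic-event states."""
    rng = np.random.default_rng(seed)
    states = rng.random((n, len(probs))) < np.asarray(probs)  # True = failed
    top = np.zeros(n, dtype=bool)
    for cs in cut_sets:
        # the top event occurs if every basic event in some cut set fails
        top |= states[:, cs].all(axis=1)
    return top.mean()
```

Repeating such evaluations with basic-event probabilities drawn from their uncertainty distributions is what turns this into the sensitivity and uncertainty analysis the record describes.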

  14. The Influence of Arginine on the Response of Enamel Matrix Derivative (EMD) Proteins to Thermal Stress: Towards Improving the Stability of EMD-Based Products.

    Directory of Open Access Journals (Sweden)

    Alessandra Apicella

    Full Text Available In a current procedure for periodontal tissue regeneration, enamel matrix derivative (EMD), the active component, is mixed with a propylene glycol alginate (PGA) gel carrier and applied directly to the periodontal defect. Exposure of EMD to physiological conditions then causes it to precipitate. However, environmental changes during manufacture and storage may result in modifications to the conformation of the EMD proteins, and eventually in premature phase separation of the gel and a loss of therapeutic effectiveness. The present work relates to efforts to improve the stability of EMD-based formulations such as Emdogain™ through the incorporation of arginine, a well-known protein stabilizer, but one that, to our knowledge, has not so far been considered for this purpose. Representative EMD-buffer solutions with and without arginine were analyzed by 3D dynamic light scattering, UV-Vis spectroscopy, transmission electron microscopy and Fourier transform infrared spectroscopy at different acidic pH values and temperatures, T, in order to simulate the effect of pH variations and thermal stress during manufacture and storage. The results provided evidence that arginine may indeed stabilize EMD against irreversible aggregation with respect to variations in pH and T under these conditions. Moreover, stopped-flow transmittance measurements indicated that arginine addition does not suppress precipitation of EMD from either the buffers or the PGA gel carrier when the pH is raised to 7, a fundamental requirement for dental applications.

  15. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
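The core reinterpretation behind BUS can be shown in miniature with a conjugate Gaussian example where the posterior is known analytically. The prior, likelihood and constant c below are illustrative; plain rejection sampling is used here, whereas the paper's point is to replace it with FORM, IS or SuS when the acceptance domain is rare.

```python
import numpy as np

def bus_rejection(prior_sampler, likelihood, c, n=200_000, seed=1):
    """BUS view of Bayesian updating: draw (theta, u) with u uniform on
    [0, 1] and keep the samples in the 'failure domain' u < L(theta)/c,
    where c >= max L. Accepted thetas are posterior samples."""
    rng = np.random.default_rng(seed)
    theta = prior_sampler(rng, n)
    u = rng.random(n)
    accepted = theta[u < likelihood(theta) / c]
    return accepted
```

With a standard normal prior and a Gaussian likelihood centered on an observation y = 1 (unit noise), the exact posterior is N(0.5, 1/2), which the accepted samples reproduce.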

  16. Trending analysis of precursor events

    International Nuclear Information System (INIS)

    Watanabe, Norio

    1998-01-01

    The Accident Sequence Precursor (ASP) Program of the United States Nuclear Regulatory Commission (U.S. NRC) identifies and categorizes operational events at nuclear power plants in terms of their potential for core damage. The ASP analysis has been performed on a yearly basis and the results have been published in annual reports. This paper describes the trends in initiating events and dominant sequences for 459 precursors identified in the ASP Program during the 1969-94 period, and also discusses a comparison with the dominant sequences predicted in past Probabilistic Risk Assessment (PRA) studies. These trends were examined for three time periods: 1969-81, 1984-87 and 1988-94. Although different models had been used in the ASP analyses for these three periods, the distributions of precursors by dominant sequence show similar trends to each other. For example, sequences involving loss of both main and auxiliary feedwater were identified in many PWR events, and those involving loss of both high- and low-pressure coolant injection were found in many BWR events. Also, it was found that these dominant sequences were comparable to those determined to be dominant in the predictions of the past PRAs. As well, a list of the 459 identified precursors is provided in the Appendix, indicating initiating event types, unavailable systems, dominant sequences, conditional core damage probabilities, and so on. (author)

  17. Event Shape Analysis in ALICE

    CERN Document Server

    AUTHOR|(CDS)2073367; Paic, Guy

    2009-01-01

    Jets are the final-state manifestation of hard parton scattering. Since at LHC energies the production of hard processes in proton-proton collisions will be copious and varied, it is important to develop methods to identify them through the study of their final states. In the present work we describe a method based on the use of some shape variables to discriminate events according to their topologies. A very attractive feature of this analysis is the possibility of using the tracking information of the TPC+ITS in order to identify specific events like jets. Through the correlation between the quantities thrust and recoil, calculated in minimum bias simulations of proton-proton collisions at 10 TeV, we show the sensitivity of the method to select specific topologies and high multiplicity. The presented results were obtained both at the generator level and after reconstruction. It remains that with any kind of jet reconstruction algorithm one will in general be confronted with overlapping jets. The present meth...
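The thrust variable mentioned in this record can be computed in the transverse plane by a brute-force scan over candidate axis directions. This is an illustrative sketch on toy momenta, not the ALICE analysis code; the angular scan resolution is an assumption.

```python
import numpy as np

def transverse_thrust(px, py, n_angles=360):
    """Scan thrust-axis directions in the transverse plane and return
    max over directions of sum(|p . n|) / sum(|p|): 1 for back-to-back
    (jet-like) events, about 2/pi for isotropic ones."""
    phis = np.linspace(0, np.pi, n_angles, endpoint=False)
    nx, ny = np.cos(phis), np.sin(phis)
    proj = np.abs(np.outer(px, nx) + np.outer(py, ny)).sum(axis=0)
    return proj.max() / np.hypot(px, py).sum()
```

Correlating this with a recoil-type variable, as the record describes, is what separates pencil-like dijet topologies from isotropic high-multiplicity events.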

  18. Joint Attributes and Event Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED), with promising results. The motivation is that multimedia events generally consist of lower-level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one can exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis, since these image-inferred attributes do not carry the dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels; we name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower-level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm established on a correlation vector that correlates them to a target event. Consequently, we can incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  19. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis

  20. Risk analysis of brachytherapy events

    International Nuclear Information System (INIS)

    Buricova, P.; Zackova, H.; Hobzova, L.; Novotny, J.; Kindlova, A.

    2005-01-01

    To prevent radiological events it is necessary to identify hazardous situations and to analyse the nature of the errors committed. Although a recommendation on the classification and prevention of radiological events (radiological accidents) was prepared in the framework of the Czech Society of Radiation Oncology, Biology and Physics and approved by the Czech regulatory body (SONS) in 1999, only a few reports have been submitted so far from brachytherapy practice. At radiotherapy departments, attention has mostly been paid to the problems of the dominant teletherapy treatments. But in the last two decades the use of brachytherapy methods has gradually increased, because the nature of this treatment as well as the capabilities of the operating facilities have changed completely: new radionuclides of high activity have been introduced and sophisticated afterloading systems controlled by computers are used. Consequently, the nature of the errors that can occur in clinical practice has also been changing. To determine the potentially hazardous parts of a procedure, a so-called 'process tree', which follows the flow of the entire treatment process, has been created for the most frequent types of application. Marking the location of errors on the process tree indicates where failures occurred, and the accumulation of marks along branches shows weak points in the process. The analysed data provide useful information for preventing medical events in brachytherapy. The results strengthen the requirements given in the Recommendations of SONS and revealed the need for its amendment. They call especially for systematic registration of the events. (authors)

  1. Surface Management System Departure Event Data Analysis

    Science.gov (United States)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on detection performance, i.e. the ability to detect departure events, as well as the prediction performance of SMS. The results show a modest overall detection performance for push-back events and a significantly higher overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%, while the overall detection performance for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events, as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  2. External events analysis for experimental fusion facilities

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1990-01-01

    External events are those off-normal events that threaten facilities either from outside or inside the building. These events, such as floods, fires, and earthquakes, are among the leading risk contributors for fission power plants, and the nature of fusion facilities indicates that they may also be leading contributors to fusion risk. This paper gives overviews of analysis methods, references good analysis guidance documents, and gives design tips for mitigating the effects of floods and fires, seismic events, and aircraft impacts. Implications for future fusion facility siting are also discussed. Sites similar to fission plant sites are recommended. 46 refs

  3. Event analysis in primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Paulasaari, H. [Tampere Univ. of Technology (Finland)

    1996-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: the first is the communication with the SRIO and the SCADA system, the second is the internal clock, which is used to produce time stamped event data, and the third is the capability to register some values during the fault. For example, during a short circuit fault the relay registers the value of the short circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage

  4. Event analysis in primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Paulasaari, H [Tampere Univ. of Technology (Finland)

    1997-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: the first is the communication with the SRIO and the SCADA system, the second is the internal clock, which is used to produce time stamped event data, and the third is the capability to register some values during the fault. For example, during a short circuit fault the relay registers the value of the short circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage

  5. External event analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1989-01-01

    The US Nuclear Regulatory Commission is sponsoring probabilistic risk assessments of six operating commercial nuclear power plants as part of a major update of the understanding of risk as provided by the original WASH-1400 risk assessments. In contrast to the WASH-1400 studies, at least two of the NUREG-1150 risk assessments will include an analysis of risks due to earthquakes, fires, floods, etc., which are collectively known as external events. This paper summarizes the methods to be used in the external event analysis for NUREG-1150 and the results obtained to date. The two plants for which external events are being considered are Surry and Peach Bottom, a PWR and a BWR, respectively. The external event analyses (through core damage frequency calculations) were completed in June 1989, with final documentation available in September. In contrast to most past external event analyses, in which rudimentary systems models were developed for each external event under consideration, the simplified NUREG-1150 analyses are based on the availability of the full internal event PRA systems models (event trees and fault trees) and make use of extensive computer-aided screening to reduce them to the sequence cut sets important to each external event. This provides two major advantages: consistency and scrutability with respect to the internal event analysis are achieved, and the full gamut of random and test/maintenance unavailabilities is automatically included, while only those that are probabilistically important survive the screening process. Thus, full benefit of the internal event analysis is obtained by performing the internal and external event analyses sequentially

  6. NPP unusual events: data, analysis and application

    International Nuclear Information System (INIS)

    Tolstykh, V.

    1990-01-01

    The subject of the paper is the IAEA's cooperative treatment of unusual-event data and the utilization of operating safety experience feedback. The Incident Reporting System (IRS) and the Analysis of Safety Significant Event Team (ASSET) are discussed. The IRS methodology for collecting, handling, assessing and disseminating data on NPP unusual events (deviations, incidents and accidents) occurring during operation, surveillance and maintenance is outlined through the report gathering and issuing practice, the expert assessment procedures and the parameters of the system. After 7 years of existence the IAEA-IRS contains over 1000 reports and receives 1.5-4% of the total information on unusual events. The author considers the reports only as detailed technical 'records' of events requiring assessment. The ASSET approach, implying an in-depth analysis of occurrences directed towards level-1 PSA utilization, is commented on. The experts evaluated root causes for the reported events, and some trends are presented. Generally, internal events due to unexpected paths of water in the nuclear installations, occurrences related to the integrity of the primary heat transport systems, events associated with the engineered safety systems and events involving the human factor represent the large groups deserving close attention. Personal recommendations are given on how to use event-related information for NPP safety improvement. 2 tabs (R.Ts)

  7. Data analysis of event tape and connection

    International Nuclear Information System (INIS)

    Gong Huili

    1995-01-01

    The data analysis on the VAX-11/780 computer is briefly described; the data come from event tapes recorded by the JUHU data acquisition system on the PDP-11/44 computer. The connection of the recorded event tapes to the XSYS data acquisition system on the VAX computer is also introduced

  8. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  9. Interpretation Analysis as a Competitive Event.

    Science.gov (United States)

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  10. Human reliability analysis using event trees

    International Nuclear Information System (INIS)

    Heslinga, G.

    1983-01-01

    The shut-down procedure of a technologically complex installation such as a nuclear power plant consists of many human actions, some of which have to be performed several times. The procedure is regarded as a chain of modules of specific actions, some of which are analyzed separately. The analysis is carried out by making a Human Reliability Analysis event tree (HRA event tree) of each action, breaking each action down into small elementary steps. Applying event trees in human reliability analysis poses more difficulties than in the case of technical systems, where event trees have mainly been used until now. The most important reason is that the operator is able to recover from a wrong performance; memory influences play a significant role. In this study these difficulties are dealt with theoretically. The following conclusions can be drawn: (1) in principle, event trees may be used in human reliability analysis; (2) although in practice the operator will only partly recover from a fault, theoretically this can be described as restarting the whole event tree; (3) compact formulas have been derived by which the probability of reaching a specific failure consequence on passing through the HRA event tree after several recoveries can be calculated. (orig.)

  11. Negated bio-events: analysis and identification

    Science.gov (United States)

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The
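    A deliberately crude baseline for the task discussed above can be sketched as follows: flag a bio-event as negated if a negation cue appears within a small window before the event trigger. The cue list, window size, and example sentence are illustrative only; the paper's actual framework uses machine learning with richer features.

```python
NEGATION_CUES = {"not", "no", "failed", "unable", "absence", "lack", "cannot"}

def is_negated(tokens, trigger_index, window=4):
    """Flag an event as negated if a cue occurs within `window`
    tokens before its trigger (a crude cue-window baseline)."""
    start = max(0, trigger_index - window)
    return any(t.lower() in NEGATION_CUES for t in tokens[start:trigger_index])

tokens = "IL-2 does not induce NF-kB activation".split()
print(is_negated(tokens, tokens.index("induce")))  # → True: cue "not" precedes the trigger
```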

  12. Statistical analysis of solar proton events

    Directory of Open Access Journals (Sweden)

    V. Kurt

    2004-06-01

    Full Text Available A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm^2 s sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. It is outlined that 231 of these proton events are flare-related and only 22 of them are not associated with Hα flares. It is also noteworthy that 42 of these events are registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with western flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  13. Sentiment analysis on tweets for social events

    DEFF Research Database (Denmark)

    Zhou, Xujuan; Tao, Xiaohui; Yong, Jianming

    2013-01-01

    Sentiment analysis or opinion mining is an important type of text analysis that aims to support decision making by extracting and analyzing opinion-oriented text, identifying positive and negative opinions, and measuring how positively or negatively an entity (i.e., a person, organization, event, location, product, topic, etc.) is regarded. As more and more users express their political and religious views on Twitter, tweets become valuable sources of people's opinions. Tweet data can be efficiently used to infer people's opinions for marketing or social studies. This paper proposes a Tweets Sentiment Analysis Model (TSAM) that can spot the societal interest and general public opinion regarding a social event. In this paper, the 2010 Australian federal election was taken as an example for sentiment analysis experiments. We are primarily interested in the sentiment of the specific…
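    The polarity-scoring step of such a model can be sketched with a minimal lexicon approach; this is a hypothetical stand-in for TSAM's actual classifier, and the word lists are invented for illustration.

```python
POSITIVE = {"great", "win", "support", "good", "love"}
NEGATIVE = {"bad", "lose", "fail", "hate", "poor"}

def tweet_sentiment(tweet):
    """Crude lexicon score: +1 per positive word, -1 per negative word;
    the sign of the total gives the tweet's polarity."""
    words = tweet.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"
```

Aggregating such per-tweet polarities over time would then track public opinion around an event such as an election.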

  14. Event analysis in a primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Jaerventausta, P; Paulasaari, H [Tampere Univ. of Technology (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    The target of the project was to develop applications which observe the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: communication with the SCADA, the internal clock to produce time stamped event data, and the capability to register certain values during the fault. Using the above features some new functions for event analysis were developed in the project

  15. Attack Graph Construction for Security Events Analysis

    Directory of Open Access Journals (Sweden)

    Andrey Alexeevich Chechulin

    2014-09-01

    Full Text Available The paper is devoted to the investigation of attack graph construction and analysis for network security evaluation and real-time security event processing. The main object of this research is the attack modeling process. The paper describes techniques for building, modifying and analysing attack graphs, as well as an overview of an implemented prototype for network security analysis based on the attack graph approach.
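    The core of attack-graph construction can be sketched as forward-chaining over exploit rules: starting from the attacker's initial foothold, add an edge whenever an exploit's precondition host is already compromised. The host names, vulnerability labels, and rule format below are illustrative assumptions, not the paper's prototype.

```python
def build_attack_graph(exploits, initial):
    """Forward-chain exploit rules (pre_host, post_host, vulnerability)
    into an attack graph: the edges actually reachable from the
    attacker's initial foothold, plus the resulting compromised set."""
    compromised, edges, frontier = set(initial), [], list(initial)
    while frontier:
        host = frontier.pop()
        for pre, post, vuln in exploits:
            if pre == host and post not in compromised:
                compromised.add(post)
                edges.append((pre, post, vuln))
                frontier.append(post)
    return compromised, edges

exploits = [("web", "db", "CVE-A"), ("db", "files", "CVE-B"), ("mail", "db", "CVE-C")]
print(build_attack_graph(exploits, {"web"}))
```

Ranking the resulting edges or paths by exploit likelihood would then support the security-evaluation use case described in the paper.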

  16. Advanced event reweighting using multivariate analysis

    International Nuclear Information System (INIS)

    Martschei, D; Feindt, M; Honc, S; Wagner-Kuhr, J

    2012-01-01

    Multivariate analysis (MVA) methods, especially discrimination techniques such as neural networks, are key ingredients in modern data analysis and play an important role in high energy physics. They are usually trained on simulated Monte Carlo (MC) samples to discriminate so-called 'signal' from 'background' events and are then applied to data to select real events of signal type. We here address procedures that improve this workflow: first, the enhancement of data/MC agreement by reweighting MC samples on a per-event basis; then, the training of MVAs on real data using the sPlot technique; finally, the construction of MVAs whose discriminator is independent of a certain control variable, i.e. cuts on this variable will not change the discriminator shape.
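    The simplest form of per-event reweighting is a binned data/MC ratio in one variable: each MC event receives the ratio of data to MC counts in its bin, so the reweighted MC reproduces the data distribution in that variable. This is a hedged one-dimensional sketch, not the multivariate reweighting procedure of the paper.

```python
import numpy as np

def per_event_weights(data, mc, bins):
    """Weight each MC event by the data/MC count ratio of its bin."""
    d_counts, edges = np.histogram(data, bins=bins)
    m_counts, _ = np.histogram(mc, bins=edges)
    # Bins with no MC events keep weight 1.0 to avoid division by zero.
    ratio = np.divide(d_counts, m_counts,
                      out=np.ones_like(d_counts, float), where=m_counts > 0)
    idx = np.clip(np.digitize(mc, edges) - 1, 0, len(ratio) - 1)
    return ratio[idx]
```

After reweighting, the weighted MC histogram matches the data histogram bin by bin wherever MC has support.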

  17. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  18. Disruptive event analysis: volcanism and igneous intrusion

    International Nuclear Information System (INIS)

    Crowe, B.M.

    1979-01-01

    Three basic topics are addressed for the disruptive event analysis: first, the range of disruptive consequences of a radioactive waste repository by volcanic activity; second, the possible reduction of the risk of disruption by volcanic activity through selective siting of a repository; and third, the quantification of the probability of repository disruption by volcanic activity

  19. Parallel processor for fast event analysis

    International Nuclear Information System (INIS)

    Hensley, D.C.

    1983-01-01

    Current maximum data rates from the Spin Spectrometer of approx. 5000 events/s (up to 1.3 MBytes/s) and minimum analysis requiring at least 3000 operations/event require a CPU cycle time near 70 ns. In order to achieve an effective cycle time of 70 ns, a parallel processing device is proposed where up to 4 independent processors will be implemented in parallel. The individual processors are designed around the Am2910 Microsequencer, the Am29116 μP, and the Am29517 Multiplier. Satellite histogramming in a mass memory system will be managed by a commercial 16-bit μP system

  20. Dynamic Event Tree Analysis Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that the timing/sequencing of events and the system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) control logic driver for the new thermo-hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, the Dynamic PRA analysis, using a Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.
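    The branching logic of a DET can be sketched in miniature: at each stochastic branching point the tree splits into an "event occurs" and an "event does not occur" branch, each carrying the product of branch probabilities. This toy enumeration is an illustration of the DET idea only; it stands in for, and is far simpler than, the RAVEN/RELAP-7 coupling, and the branch labels and probabilities are invented.

```python
def det_sequences(branch_points, prefix=(), p=1.0):
    """Enumerate all DET sequences as (outcome-tuple, probability)
    pairs, splitting the tree at each (label, p_occur) branch point."""
    if not branch_points:
        return [(prefix, p)]
    (label, p_occur), rest = branch_points[0], branch_points[1:]
    return (det_sequences(rest, prefix + ((label, True),), p * p_occur)
            + det_sequences(rest, prefix + ((label, False),), p * (1.0 - p_occur)))

# Two illustrative branch points, e.g. diesel-generator failure and
# operator recovery in a station black-out scenario.
seqs = det_sequences([("DG-FAIL", 0.1), ("OPER-REC", 0.7)])
print(seqs)
```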

  1. Multistate event history analysis with frailty

    Directory of Open Access Journals (Sweden)

    Govert Bijwaard

    2014-05-01

    Full Text Available Background: In survival analysis a large literature using frailty models, or models with unobserved heterogeneity, exists. In the growing literature and modelling on multistate models, this issue is only in its infancy. Ignoring frailty can, however, produce incorrect results. Objective: This paper presents how frailties can be incorporated into multistate models, with an emphasis on semi-Markov multistate models with a mixed proportional hazard structure. Methods: First, the aspects of frailty modeling in univariate (proportional hazard, Cox) and multivariate event history models are addressed. The implications of choosing shared or correlated frailty are highlighted. The relevant differences with recurrent-events data are covered next. Multistate models are event history models that can have both multivariate and recurrent events. Incorporating frailty in multistate models therefore brings all the previously addressed issues together. Assuming a discrete frailty distribution allows for a very general correlation structure among the transition hazards in a multistate model. Although some estimation procedures are covered, the emphasis is on conceptual issues. Results: The importance of multistate frailty modeling is illustrated with data on the labour market and migration dynamics of recent immigrants to the Netherlands.
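    The discrete-frailty idea mentioned in the Methods can be illustrated for a single transition: with frailty taking values z_k with probabilities pi_k, the marginal survival function is S(t) = sum_k pi_k * exp(-z_k * H0(t)). The constant baseline hazard below is an illustrative assumption, not the paper's semi-Markov specification.

```python
import math

def marginal_survival(t, base_rate, frailty_points, frailty_probs):
    """Marginal survival for one transition under a discrete frailty
    distribution, with an illustrative constant baseline hazard
    H0(t) = base_rate * t."""
    H0 = base_rate * t
    return sum(p * math.exp(-z * H0)
               for z, p in zip(frailty_points, frailty_probs))
```

Sharing the same frailty points across several transition hazards is what induces the general correlation structure the paper describes.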

  2. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network at the highest conceptual level and at the lowest level of local control; extended Petri nets are used for the modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper using the Visual Object Net ++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete-event-system model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple enough to run on general computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets. Discrete-event systems are a pragmatic tool for modelling industrial systems, and Petri nets suit the system at hand because it is discrete-event in nature. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the timing of the robotic, loading and transmission activities; from times measured on the spot, graphics are obtained showing the average time for the transport activity for given sets of finished products.
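    The token-game semantics underlying such Petri-net models can be sketched in a few lines: a transition is enabled when its input places hold enough tokens, and firing it consumes those tokens and produces tokens in its output places. The two-transition robot cell below (pick a part, then place it) is a hypothetical example, far simpler than the hierarchical timed nets of the paper.

```python
def enabled(marking, transition):
    """A transition is enabled if every input place holds enough tokens."""
    pre, _ = transition
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, transition):
    """Fire an enabled transition: consume input tokens, produce output tokens."""
    pre, post = transition
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

t_pick = ({"idle": 1, "parts": 1}, {"holding": 1})
t_place = ({"holding": 1}, {"idle": 1, "done": 1})
marking = {"idle": 1, "parts": 3, "holding": 0, "done": 0}
while enabled(marking, t_pick) or enabled(marking, t_place):
    for t in (t_pick, t_place):
        if enabled(marking, t):
            marking = fire(marking, t)
print(marking)  # → {'idle': 1, 'parts': 0, 'holding': 0, 'done': 3}
```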

  3. Using variable transformations to perform common event analysis

    International Nuclear Information System (INIS)

    Worrell, R.B.

    1977-01-01

    Any analytical method for studying the effect of common events on the behavior of a system is considered a form of common event analysis. The particular common events that are involved often represent quite different phenomena, and this has led to the development of different kinds of common event analysis. For example, common mode failure analysis, common cause analysis, critical location analysis, etc., are all different kinds of common event analysis for which the common events involved represent different phenomena. However, the problem that must be solved for each of these different kinds of common event analysis is essentially the same: determine the effect of common events on the behavior of a system. Thus, a technique that is useful in achieving one kind of common event analysis is often useful in achieving other kinds of common event analysis

  4. Probabilistic analysis of extreme wind events

    Energy Technology Data Exchange (ETDEWEB)

    Chaviaropoulos, P.K. [Center for Renewable Energy Sources (CRES), Pikermi Attikis (Greece)

    1997-12-31

    A vital task in wind engineering and meteorology is to understand, measure, analyse and forecast extreme wind conditions, due to their significant effects on human activities and installations like buildings, bridges or wind turbines. The latest version of the IEC standard (1996) pays particular attention to the extreme wind events that have to be taken into account when designing or certifying a wind generator. Actually, the extreme wind events within a 50-year period are those which determine the 'static' design of most of the wind turbine components. The extremes which are important for the safety of wind generators are those associated with the so-called 'survival wind speed', the extreme operating gusts and the extreme wind direction changes. A probabilistic approach for the analysis of these events is proposed in this paper. Emphasis is put on establishing the relation between extreme values and physically meaningful 'site calibration' parameters, like the probability distribution of the annual wind speed, turbulence intensity and power spectra properties. (Author)
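    A standard ingredient of such probabilistic extreme-value analyses is a Gumbel fit to annual-maximum wind speeds, from which the T-year return value follows as x_T = mu - beta * ln(-ln(1 - 1/T)). The method-of-moments fit below is a common textbook shortcut, offered as a hedged sketch rather than the paper's own procedure, and the sample data are invented.

```python
import math
import statistics

def gumbel_return_value(annual_maxima, return_period_years):
    """Method-of-moments Gumbel fit to annual maxima, then the
    T-year return value x_T = mu - beta * ln(-ln(1 - 1/T))."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi        # scale parameter
    mu = mean - 0.5772156649 * beta            # location (Euler-Mascheroni shift)
    return mu - beta * math.log(-math.log(1.0 - 1.0 / return_period_years))

maxima = [22.0, 25.0, 21.0, 28.0, 24.0, 26.0, 23.0, 27.0]  # illustrative m/s values
print(gumbel_return_value(maxima, 50))
```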

  5. Contingency Analysis of Cascading Line Outage Events

    Energy Technology Data Exchange (ETDEWEB)

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
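    The tiered development of a cascade can be illustrated with a deliberately simple model, not the paper's N-k contingency engine: lines loaded above capacity trip in tiers, and a fixed fraction of the tripped flow is redistributed evenly over the surviving lines. All line names, flows, and the redistribution rule are illustrative assumptions.

```python
def cascade_tiers(flows, capacities, redistribution=0.5):
    """Toy cascading-outage model returning the tiers of tripped lines.
    Note: mutates `flows` in place as the cascade redistributes load."""
    tiers, active = [], set(flows)
    while True:
        tripped = {l for l in active if flows[l] > capacities[l]}
        if not tripped:
            break
        tiers.append(sorted(tripped))
        active -= tripped
        if not active:
            break
        shed = sum(flows[l] for l in tripped) * redistribution
        for l in active:
            flows[l] += shed / len(active)
    return tiers

flows = {"A": 120.0, "B": 90.0, "C": 60.0}
caps = {"A": 100.0, "B": 100.0, "C": 100.0}
print(cascade_tiers(flows, caps))  # → [['A'], ['B'], ['C']]
```

Even this toy model reproduces the qualitative behaviour described above: one initiating overload can propagate tier by tier until the network breaks up.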

  6. DISRUPTIVE EVENT BIOSPHERE DOSE CONVERSION FACTOR ANALYSIS

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The Biosphere Model Report (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). 
The objective of this analysis was to develop the BDCFs for the volcanic

  7. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis report (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  8. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis report (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  9. Human reliability analysis of dependent events

    International Nuclear Information System (INIS)

    Swain, A.D.; Guttmann, H.E.

    1977-01-01

    In the human reliability analysis in WASH-1400, the continuous variable of degree of interaction among human events was approximated by selecting four points on this continuum to represent the entire continuum. The four points selected were identified as zero coupling (i.e., zero dependence), complete coupling (i.e., complete dependence), and two intermediate points--loose coupling (a moderate level of dependence) and tight coupling (a high level of dependence). The paper expands the WASH-1400 treatment of common mode failure due to the interaction of human activities. Mathematical expressions for the above four levels of dependence are derived for parallel and series systems. The psychological meaning of each level of dependence is illustrated by examples, with probability tree diagrams to illustrate the use of conditional probabilities resulting from the interaction of human actions in nuclear power plant tasks
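    One widely cited form of these dependence levels comes from the later THERP handbook (NUREG/CR-1278), which gives conditional-probability equations for the error on task B given failure on task A; the sketch below implements those equations for illustration, and should not be read as the exact expressions derived in this paper.

```python
# Conditional HEP equations by dependence level (THERP handbook form),
# where p is the basic human error probability of the dependent task.
THERP_DEPENDENCE = {
    "zero": lambda p: p,                   # zero coupling
    "low": lambda p: (1 + 19 * p) / 20,    # loose coupling
    "moderate": lambda p: (1 + 6 * p) / 7,
    "high": lambda p: (1 + p) / 2,         # tight coupling
    "complete": lambda p: 1.0,             # complete coupling
}

def conditional_hep(basic_hep, level):
    """Conditional human error probability on task B given failure
    on task A, for a stated level of dependence."""
    return THERP_DEPENDENCE[level](basic_hep)
```

For a basic HEP of 0.01, the conditional probabilities rise monotonically from 0.01 (zero dependence) through about 0.06, 0.15 and 0.505 to 1.0 (complete dependence).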

  10. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this

  11. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  12. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathway's contribution to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report and so is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  13. Analysis of external events - Nuclear Power Plant Dukovany

    International Nuclear Information System (INIS)

    Hladky, Milan

    2000-01-01

    The Level 1 PSA of external events covers internal events, floods, and fires; other external events are not yet included. The shutdown PSA takes into account internal events, floods, fires, and heavy load drop; other external events are not yet included. A final safety analysis report was produced after 10 years of operation for all Dukovany units. A probabilistic approach was used for the analysis of aircraft crash and external man-induced events. The risk caused by man-induced events was found to be negligible, and this finding was accepted by the State Office for Nuclear Safety (SONS)

  14. Event shape analysis in ultrarelativistic nuclear collisions

    OpenAIRE

    Kopecna, Renata; Tomasik, Boris

    2016-01-01

    We present a novel method for sorting events. So far, single variables such as the flow vector magnitude have been used for sorting events. Our approach takes into account the whole azimuthal angle distribution rather than a single variable. This allows us to obtain a better measure of the event shape, providing multiplicity-independent insight. We discuss the advantages and disadvantages of this approach, its possible use in femtoscopy, and other more exclusive experimental studies.

  15. Economic Multipliers and Mega-Event Analysis

    OpenAIRE

    Victor Matheson

    2004-01-01

    Critics of economic impact studies that purport to show that mega-events such as the Olympics bring large benefits to the communities “lucky” enough to host them frequently cite the use of inappropriate multipliers as a primary reason why these impact studies overstate the true economic gains to the hosts of these events. This brief paper shows in a numerical example how mega-events may lead to inflated multipliers and exaggerated claims of economic benefits.

  16. Root cause analysis of relevant events

    International Nuclear Information System (INIS)

    Perez, Silvia S.; Vidal, Patricia G.

    2000-01-01

    During 1998 the research work followed more specific guidelines, which entailed focusing exclusively on the two selected methods (ASSET and HPIP) and incorporating some additional human behaviour elements based on the documents of reference. Once resident inspectors were incorporated in the project (and trained accordingly), events occurring in Argentine nuclear power plants were analysed. Several events, all from the Atucha I and Embalse nuclear power plants, were analysed, leading to the conclusion that the systematic methodology used also allows the investigation of minor events that were precursors of the selected events. (author)

  17. Analysis for Human-related Events during the Overhaul

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Tae; Kim, Min Chull; Choi, Dong Won; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-10-15

    The event frequency due to human error has been decreasing among the 20 operating Nuclear Power Plants (NPPs), excluding the NPP in the commissioning stage since 2008 (Shin-Kori unit 1). However, events due to human error during overhaul (O/H) occur annually (see Table I). An analysis of human-related events during O/H was performed. Similar problems were identified for each event, and organizational and safety-culture factors were also identified

  18. Glaciological parameters of disruptive event analysis

    International Nuclear Information System (INIS)

    Bull, C.

    1979-01-01

    The following disruptive events caused by ice sheets are considered: continental glaciation, erosion, loading and subsidence, deep ground water recharge, flood erosion, isostatic rebound rates, melting, and periodicity of ice ages

  19. A Fourier analysis of extreme events

    DEFF Research Database (Denmark)

    Mikosch, Thomas Valentin; Zhao, Yuwei

    2014-01-01

    The extremogram is an asymptotic correlogram for extreme events constructed from a regularly varying stationary sequence. In this paper, we define a frequency domain analog of the correlogram: a periodogram generated from a suitable sequence of indicator functions of rare events. We derive basic properties of the periodogram, such as the asymptotic independence at the Fourier frequencies, and use this property to show that weighted versions of the periodogram are consistent estimators of a spectral density derived from the extremogram.
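
    The construction described in this abstract (a periodogram built from indicator functions of rare events) can be sketched in a few lines. This is a minimal illustration of the idea only, not the authors' estimator; the Pareto noise model and the 95% threshold are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Heavy-tailed (regularly varying) stationary sequence; a high empirical
# quantile defines the "rare events" (hypothetical data and threshold).
x = rng.pareto(3.0, size=4096)
u = np.quantile(x, 0.95)

ind = (x > u).astype(float)      # indicator sequence of rare events
ind -= ind.mean()                # center before Fourier analysis

# Periodogram ordinates at the Fourier frequencies
n = len(ind)
periodogram = np.abs(np.fft.rfft(ind)) ** 2 / n
freqs = np.fft.rfftfreq(n)       # in cycles per observation
```

    Smoothing `periodogram` over neighbouring Fourier frequencies (for example with a moving average) would correspond to the weighted, consistent versions mentioned in the abstract.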

  20. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Based on the statistics collected during simulation of the CPN models, key event paths are identified through a sensitivity analysis approach. This approach focuses on the logic structure of CPN models, is reliable, and can serve as the basis of structured analysis for discrete event systems. An example of a radar model illustrates the application of the approach, and the results are trustworthy.

  1. External events analysis of the Ignalina Nuclear Power Plant

    International Nuclear Information System (INIS)

    Liaukonis, Mindaugas; Augutis, Juozas

    1999-01-01

    This paper presents an analysis of the impact of external events on the safe operation of the Ignalina Nuclear Power Plant (INPP) safety systems. The analysis was based on probabilistic estimation and modelling of the external hazards. Screening criteria were applied to a number of external hazards. External events requiring further bounding study, such as aircraft crash on the INPP, external flooding, fire, and extreme winds, were analysed. Mathematical models were developed and event probabilities were calculated. The analysis showed that external events pose a rather limited danger to the Ignalina NPP. The results were compared with analogous analyses for western NPPs, and no major differences were found. The calculations performed show that external events cannot significantly influence the safety level of Ignalina NPP operation. (author)

  2. Statistical analysis of hydrodynamic cavitation events

    Science.gov (United States)

    Gimenez, G.; Sommer, R.

    1980-10-01

    The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency is distributed according to a normal law and that its parameters do not evolve in time.
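
    As a hedged sketch of this kind of analysis, one can bin collapse events into equal time windows, fit the normal law by moments, and check that the parameters are not time-evolving. The pulse times below are simulated stand-ins for measurements, with an arbitrarily chosen rate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated cavitation pulse times (hypothetical stand-in for measurements):
# exponential inter-arrival times at ~100 events per second.
pulse_times = np.cumsum(rng.exponential(scale=0.01, size=50_000))

# Number of events per 1-second window (drop the partial last window)
counts = np.bincount(pulse_times.astype(int))[:-1]

# Moment fit of the normal law, plus a crude stationarity check:
# compare the mean frequency over the two halves of the record.
mu, sigma = counts.mean(), counts.std(ddof=1)
half = len(counts) // 2
drift = abs(counts[:half].mean() - counts[half:].mean())
```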

  3. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Given that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' governance ability. Visual analysis has become an important method of event analysis because of its intuitiveness and effectiveness. To analyse events' spatio-temporal distribution characteristics, the correlations among event items, and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on the "5W" theory is designed. Six types of visual analysis are then proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the validity of the methods.

  4. Risk and sensitivity analysis in relation to external events

    International Nuclear Information System (INIS)

    Alzbutas, R.; Urbonas, R.; Augutis, J.

    2001-01-01

    This paper presents a risk and sensitivity analysis of the impact of external events on safe operation in general, and on the Ignalina Nuclear Power Plant safety systems in particular. The analysis is based on deterministic and probabilistic assumptions and assessment of the external hazards. Real statistical data are used, as well as initial external event simulation. Preliminary screening criteria are applied. For the investigated external hazards, the analysis covers the impact of external events on safe NPP operation, assessment of event occurrence, sensitivity analysis, and recommendations for safety improvements. Events such as aircraft crash, extreme rain and wind, forest fire, and flying turbine fragments are analysed. Models are developed and probabilities are calculated. As an example of sensitivity analysis, the aircraft impact model is presented. The sensitivity analysis takes into account the uncertainty introduced by an external event and its model. Even when the external events analysis shows rather limited danger, the sensitivity analysis can identify the most influential causes. Such possible future variations can be significant for safety levels and risk-based decisions. Calculations show that external events cannot significantly influence the safety level of Ignalina NPP operation; however, event occurrence and propagation can be quite uncertain. (author)

  5. Probabilistic analysis of external events with focus on the Fukushima event

    International Nuclear Information System (INIS)

    Kollasko, Heiko; Jockenhoevel-Barttfeld, Mariana; Klapp, Ulrich

    2014-01-01

    External hazards are natural or man-made hazards to a site and its facilities that originate externally to both the site and its processes, i.e. the duty holder may have very little or no control over the hazard. External hazards can have the potential to cause initiating events at the plant, typically transients such as loss of offsite power. Simultaneously, external events may affect the safety systems required to control the initiating event and, where applicable, the back-up systems implemented for risk reduction. Plant safety may especially be threatened when loads from external hazards exceed the load assumptions considered in the design of safety-related systems, structures and components. Another potential threat is posed by hazards inducing initiating events not otherwise considered in the safety demonstration. An example is loss of offsite power combined with prolonged plant isolation; offsite support usually credited in the deterministic safety analysis, e.g. delivery of diesel fuel oil, may not be possible in this case. As the Fukushima events have shown, the biggest threat is likely posed by hazards inducing both effects. Such hazards may well be dominant risk contributors even if their return period is very high. In order to identify the relevant external hazards for a given Nuclear Power Plant (NPP) location, a site-specific screening analysis is performed, both for single events and for combinations of external events. As a result of the screening analysis, risk-significant and therefore relevant (screened-in) single external events and combinations of them are identified for a site. The screened-in events are further considered in a detailed event tree analysis within the Probabilistic Safety Analysis (PSA) to calculate the core damage/large release frequency resulting from each relevant external event or combination. Screening analyses of external events performed at AREVA are based on the approach provided

  6. ANALYSIS OF EVENT TOURISM IN RUSSIA, ITS FUNCTIONS, WAYS TO IMPROVE THE EFFICIENCY OF EVENT

    Directory of Open Access Journals (Sweden)

    Mikhail Yur'evich Grushin

    2016-01-01

    This article considers one important direction for the development of the national economy in the area of tourist services: the development of event tourism in the Russian Federation. The event management market in Russia is still in the process of formation, so its impact on the socio-economic development of the regions and of Russia as a whole is minimal, and analysis of this influence is not performed. The problem is most acute in the regions of Russia that specialize in creating event-oriented tourist-recreational clusters. The article analyses the existing event management market and the functions of event tourism, provides ways to improve the efficiency of event management, and offers recommendations for event organizers in the regions. It shows the specific role of event tourism within national tourism and provides directions for developing organizational and methodical recommendations on its formation in the regions of Russia and on creating an effective management system at the regional level. The purpose of this article is to analyze the emerging event tourism market in Russia and its specifics. On the basis of these studies, the patterns of the new market are considered and its impact on the modern national tourism industry is assessed. Methodology. Comparative and economic-statistical analysis methods are used. Conclusions/significance. The practical importance of this article lies in eliminating a contradiction existing in the national tourism industry: on the one hand, a large number of events, including world-class ones, are held annually in all regions of the Russian Federation, and people speak of tourist trips to events, but event tourism does not yet exist. In all regions there is domestic and inbound tourism, but it has nothing to do with event tourism. The article has practical conclusions demonstrating the need to adapt the

  7. Second-order analysis of semiparametric recurrent event processes.

    Science.gov (United States)

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
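
    A first-pass version of the Poisson check this abstract describes can be done with a dispersion index (the variance-to-mean ratio of per-individual event counts). The sketch below uses simulated counts with an invented rate; the paper's formal second-order procedures are considerably more involved.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical recurrent-event data: number of events per individual over a
# common follow-up period.  Under a Poisson process, variance equals mean.
counts = rng.poisson(lam=4.0, size=500)

dispersion = counts.var(ddof=1) / counts.mean()   # ~1 under Poisson
# Values well above 1 suggest clustering (overdispersion) and hence a
# non-Poisson alternative, e.g. a mixed or Cox process.
```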

  8. Analysis of catchments response to severe drought event for ...

    African Journals Online (AJOL)

    Nafiisah

    The run sum analysis method was a sound method which indicates in ... intensity and duration of stream flow depletion between nearby catchments. ... threshold level analysis method, and allows drought events to be described in more.

  9. Preliminary safety analysis of unscrammed events for KLFR

    International Nuclear Information System (INIS)

    Kim, S.J.; Ha, G.S.

    2005-01-01

    The report presents the design features of the KLFR, the safety analysis code, steady-state calculation results, and analysis results for unscrammed events. Steady-state and unscrammed-event calculations were performed for the conceptual design of the KLFR using the SSC-K code. The UTOP event results in no fuel damage and no centre-line melting. The inherent safety features are demonstrated through analysis of the ULOHS event. Although the ULOF analysis involves many uncertainties in the pump design, the results show inherent safety characteristics: a natural circulation flow of about 6% of rated flow is established in the ULOF case. In the metallic fuel rod, the cladding temperature is somewhat high due to the low heat transfer coefficient of lead. The ULOHS event should be considered in the design of the RVACS for long-term cooling

  10. Top event prevention analysis: A deterministic use of PRA

    International Nuclear Information System (INIS)

    Worrell, R.B.; Blanchard, D.P.

    1996-01-01

    This paper describes an application of Top Event Prevention Analysis. The analysis finds prevention sets, which are combinations of basic events that can prevent the occurrence of a fault tree top event such as core damage. The problem analyzed in this application is that of choosing a subset of Motor-Operated Valves (MOVs) for testing under the Generic Letter 89-10 program such that the desired level of safety is achieved while providing economic relief from the burden of testing all safety-related valves. A brief summary of the method is given, and the process used to produce a core damage expression from Level 1 PRA models for a PWR is described. The analysis provides an alternative to the use of importance measures for finding the important combinations of events in a core damage expression. This application of Top Event Prevention Analysis to the MOV problem was achieved with currently available software
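
    The core idea, finding a prevention set that intersects every minimal cut set of the top event, can be illustrated with a brute-force sketch. The cut sets and MOV names below are invented for illustration; production tools use far more scalable algorithms.

```python
from itertools import combinations

# Hypothetical minimal cut sets of a core-damage top event
cut_sets = [{"MOV1", "MOV2"}, {"MOV1", "MOV3"}, {"MOV2", "MOV4"}]

# A prevention set must share at least one basic event with every minimal
# cut set: guaranteeing those events do not occur prevents the top event.
# Brute-force search for a smallest such set.
events = sorted(set().union(*cut_sets))
prevention = None
for size in range(1, len(events) + 1):
    for cand in combinations(events, size):
        if all(set(cand) & cs for cs in cut_sets):
            prevention = set(cand)
            break
    if prevention:
        break
```

    Here the smallest prevention set has two elements, so testing those two valves suffices to cover all three cut sets.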

  11. Resonant experience in emergent events of analysis

    DEFF Research Database (Denmark)

    Revsbæk, Line

    2018-01-01

    Theory, and the traditions of thought available and known to us, give shape to what we are able to notice of our field of inquiry, and so also of our practice of research. Building on G. H. Mead's Philosophy of the Present (1932), this paper draws attention to 'emergent events' of analysis when... in responsive relating to (case study) others is made generative as a dynamic in and of case study analysis. Using a case of being a newcomer (to research communities) researching newcomer innovation (of others), 'resonant experience' is illustrated as a heuristic in interview analysis to simultaneously...

  12. External events analysis for the Savannah River Site K reactor

    International Nuclear Information System (INIS)

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be ''external'' to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 x 10^-4 per year, of which seismic events are the major contributor (1.2 x 10^-4 per year). Fire-initiated events contribute 1.4 x 10^-7 per year, tornados 5.8 x 10^-7 per year, dam failures 1.5 x 10^-6 per year, and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs

  13. A Fourier analysis of extremal events

    DEFF Research Database (Denmark)

    Zhao, Yuwei

    is the extremal periodogram. The extremal periodogram shares numerous asymptotic properties with the periodogram of a linear process in classical time series analysis: the asymptotic distribution of the periodogram ordinates at the Fourier frequencies have a similar form and smoothed versions of the periodogram...

  14. Event analysis using a massively parallel processor

    International Nuclear Information System (INIS)

    Bale, A.; Gerelle, E.; Messersmith, J.; Warren, R.; Hoek, J.

    1990-01-01

    This paper describes a system for performing histogramming of n-tuple data at interactive rates using a commercial SIMD processor array connected to a workstation running the well-known Physics Analysis Workstation software (PAW). Results indicate that an order-of-magnitude performance improvement over current RISC technology is easily achievable

  15. Multi-Unit Initiating Event Analysis for a Single-Unit Internal Events Level 1 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong San; Park, Jin Hee; Lim, Ho Gon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The Fukushima nuclear accident in 2011 highlighted the importance of considering the risks from multi-unit accidents at a site. The ASME/ANS probabilistic risk assessment (PRA) standard also includes some requirements related to multi-unit aspects, one of which (IE-B5) is as follows: 'For multi-unit sites with shared systems, DO NOT SUBSUME multi-unit initiating events if they impact mitigation capability [1].' However, existing single-unit PSA models do not explicitly consider multi-unit initiating events, and hence systems shared by multiple units (e.g., the alternate AC diesel generator) are fully credited for the single unit, ignoring the need for the shared systems by other units at the same site [2]. This paper describes the results of the multi-unit initiating event (IE) analysis performed as part of the at-power internal events Level 1 probabilistic safety assessment (PSA) for an OPR1000 single unit (the 'reference unit'). In this study, a multi-unit initiating event analysis for a single-unit PSA was performed, and using the results, a dual-unit LOOP initiating event was added to the existing PSA model for the reference unit (OPR1000 type). Event trees were developed for dual-unit LOOP and dual-unit SBO, which can be transferred from dual-unit LOOP. Moreover, CCF basic events for 5 diesel generators were modelled. For simultaneous SBO occurrences in both units, this study compared two different assumptions on the availability of the AAC D/G. As a result, when the dual-unit LOOP initiating event was added to the existing single-unit PSA model, the total CDF increased by 1-2%, depending on the probability that the AAC D/G is available to a specific unit in case of simultaneous SBO in both units.

  16. Analysis of event-mode data with Interactive Data Language

    International Nuclear Information System (INIS)

    De Young, P.A.; Hilldore, B.B.; Kiessel, L.M.; Peaslee, G.F.

    2003-01-01

    We have developed an analysis package for event-mode data based on Interactive Data Language (IDL) from Research Systems Inc. This high-level language is fast, array oriented, object oriented, and has extensive visualization (multi-dimensional plotting) and mathematical functions. We have developed a general framework, written in IDL, for the analysis of a variety of experimental data that does not require significant customization for each analysis. Unlike in many traditional analysis packages, spectra and gates are applied after the data are read and are easily changed as analysis proceeds without rereading the data. The events are not sequentially processed into predetermined arrays subject to predetermined gates
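
    The workflow this abstract describes (read event-mode n-tuples once, then apply and re-apply gates in memory) looks like the following in outline. Python/NumPy stands in for IDL here, and the detector parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Event-mode data: one row per event, columns are detector parameters
# (a hypothetical n-tuple: energy in keV and time-of-flight in ns).
events = np.column_stack([rng.normal(660.0, 30.0, 10_000),
                          rng.normal(50.0, 5.0, 10_000)])

# A gate is just a boolean mask computed after the data are in memory,
# so it can be redefined at any point without rereading the event file.
energy_gate = (events[:, 0] > 600.0) & (events[:, 0] < 720.0)

# Histogram the time-of-flight of the gated events
hist, edges = np.histogram(events[energy_gate, 1], bins=100)
```

    Changing the gate and re-histogramming is then a two-line operation on the in-memory array, which is the key contrast with sequential, predetermined-gate processing.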

  17. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    ... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  18. Analysis of unprotected overcooling events in the Integral Fast Reactor

    International Nuclear Information System (INIS)

    Vilim, R.B.

    1989-01-01

    Simple analytic models are developed for predicting the response of a metal-fueled, liquid-metal-cooled reactor to unprotected overcooling events in the balance of plant. All overcooling initiators are shown to fall into two categories. The first category contains those events for which there is no final equilibrium state of constant overcooling, as is the case for a large steam leak. These events are analyzed using a non-flow control mass approach. The second category contains those events which eventually equilibrate, such as a loss of feedwater heaters. A steady-flow control volume analysis shows that these latter events ultimately affect the plant through the feedwater inlet to the steam generator. The models developed for analyzing these two categories provide upper bounds on the reactor's passive response to overcooling accident initiators. Calculation of these bounds for a prototypic plant indicates that failure limits -- eutectic melting, sodium boiling, fuel pin failure -- are not exceeded in any overcooling event. 2 refs

  19. Repeated Time-to-event Analysis of Consecutive Analgesic Events in Postoperative Pain

    DEFF Research Database (Denmark)

    Juul, Rasmus Vestergaard; Rasmussen, Sten; Kreilgaard, Mads

    2015-01-01

    BACKGROUND: Reduction in consumption of opioid rescue medication is often used as an endpoint when investigating the analgesic efficacy of adjunct treatments, but appropriate methods are needed to analyze analgesic consumption over time. Repeated time-to-event (RTTE) modeling is proposed as a way to describe analgesic consumption by analyzing the timing of consecutive analgesic events. METHODS: Retrospective data were obtained from 63 patients receiving standard analgesic treatment, including morphine on request, after surgery following hip fracture. Times of analgesic events up to 96 h after surgery were extracted from hospital medical records. Parametric RTTE analysis was performed with exponential, Weibull, or Gompertz distribution of analgesic events using NONMEM®, version 7.2 (ICON Development Solutions, USA). The potential influences of night versus day, sex, and age were investigated...
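
    For the simplest of the three distributions mentioned (exponential), the repeated time-to-event likelihood has a closed-form maximum: the event rate equals total events divided by total observed time. The sketch below uses simulated gap times with an invented rate; the paper's NONMEM analysis is far richer (censoring, covariates, alternative hazards).

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical repeated time-to-event data: for each of 63 patients, the
# gaps (hours) between consecutive analgesic requests, drawn exponential.
true_rate = 0.25                       # requests per hour (assumption)
gaps = [rng.exponential(1.0 / true_rate, size=int(rng.integers(3, 10)))
        for _ in range(63)]

# Exponential RTTE maximum likelihood: rate = total events / total time
total_events = sum(len(g) for g in gaps)
total_time = sum(g.sum() for g in gaps)
rate_hat = total_events / total_time
```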

  20. Sovereign Default Analysis through Extreme Events Identification

    Directory of Open Access Journals (Sweden)

    Vasile George MARICA

    2015-06-01

    This paper investigates contagion in international credit markets through the use of a novel jump detection technique proposed by Chan and Maheu (2002). This econometric methodology is preferred because it is non-linear by definition and not subject to volatility bias. Also, the jumps identified in CDS premiums are considered as outliers positioned beyond any stochastic movement that can be, and already is, modelled through well-known linear analysis. Though contagion is hard to define, we show that extreme discrete movements in default probabilities inferred from CDS premiums can lead to sound economic conclusions about the risk profile of sovereign nations in international bond markets. We find evidence of investor sentiment clustering for countries with unstable political regimes or that are engaged in armed conflict. Countries that have faced currency or financial crises in their recent history are less vulnerable to external unexpected shocks. First we present a brief history of sovereign defaults, with an emphasis on their increased frequency and geographical reach as financial markets become more and more integrated. We then pass to a literature review of the most important definitions of contagion and discuss what quantitative methods are available to detect its presence. The paper continues with the details of the methodology of jump detection through non-linear modelling and its use in the field of contagion identification. In the last sections we present the estimation results for simultaneous jumps between emerging market CDS and draw conclusions on the difference in behavior in times of extreme movement versus tranquil periods.

  1. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper briefly describes the author's experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...
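
    A minimal discrete event simulation of the kind such environments build, one repairable component with exponential failure and repair (rates invented for illustration), shows the flavour of the approach next to the closed-form answer a conventional reliability model would give.

```python
import random

random.seed(5)

# One repairable component: mean time to failure and to repair (assumed).
MTTF, MTTR, HORIZON = 1000.0, 10.0, 1_000_000.0

t, down_time = 0.0, 0.0
while t < HORIZON:
    t += random.expovariate(1.0 / MTTF)        # operate until failure
    repair = random.expovariate(1.0 / MTTR)    # repair duration
    down_time += min(repair, max(HORIZON - t, 0.0))
    t += repair

unavailability = down_time / HORIZON
# Analytic steady-state value for comparison: MTTR / (MTTF + MTTR) ~ 0.0099
```

    The advantage of DES appears once the logic no longer fits a formula, e.g. shared repair crews, operator actions, or state-dependent failure rates; the simulation loop above extends naturally where the closed form does not.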

  2. Glaciological parameters of disruptive event analysis

    International Nuclear Information System (INIS)

    Bull, C.

    1980-04-01

    The possibility of complete glaciation of the earth is small and probably need not be considered in the consequence analysis by the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program. However, within a few thousand years an ice sheet may well cover proposed waste disposal sites in Michigan. Those in the Gulf Coast region and New Mexico are unlikely to be ice covered. The probability of ice cover at Hanford in the next million years is finite, perhaps about 0.5. Sea level will fluctuate as a result of climatic changes. As ice sheets grow, sea level will fall. Melting of ice sheets will be accompanied by a rise in sea level. Within the present interglacial period there is a definite chance that the West Antarctic ice sheet will melt. Ice sheets are agents of erosion, and some estimates of the amount of material they erode have been made. As an average over the area glaciated by late Quaternary ice sheets, only a few tens of meters of erosion is indicated. There were perhaps 3 meters of erosion per glaciation cycle. Under glacial conditions the surface boundary conditions for ground water recharge will be appreciably changed. In future glaciations melt-water rivers generally will follow pre-existing river courses. Some salt dome sites in the Gulf Coast region could be susceptible to changes in the course of the Mississippi River. The New Mexico site, which is on a high plateau, seems to be immune from this type of problem. The Hanford Site is only a few miles from the Columbia River, and in the future, lateral erosion by the Columbia River could cause changes in its course. A prudent assumption in the AEGIS study is that the present interglacial will continue for only a limited period and that subsequently an ice sheet will form over North America. Other factors being equal, it seems unwise to site a nuclear waste repository (even at great depth) in an area likely to be glaciated

  3. Human performance analysis of industrial radiography radiation exposure events

    International Nuclear Information System (INIS)

    Reece, W.J.; Hill, S.G.

    1995-01-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures

  4. Serious adverse events with infliximab: analysis of spontaneously reported adverse events.

    Science.gov (United States)

    Hansen, Richard A; Gartlehner, Gerald; Powell, Gregory E; Sandler, Robert S

    2007-06-01

    Serious adverse events such as bowel obstruction, heart failure, infection, lymphoma, and neuropathy have been reported with infliximab. The aims of this study were to explore adverse event signals with infliximab by using a long period of post-marketing experience, stratifying by indication. The relative reporting of infliximab adverse events to the U.S. Food and Drug Administration (FDA) was assessed with the public release version of the adverse event reporting system (AERS) database from 1968 to third quarter 2005. On the basis of a systematic review of adverse events, Medical Dictionary for Regulatory Activities (MedDRA) terms were mapped to predefined categories of adverse events, including death, heart failure, hepatitis, infection, infusion reaction, lymphoma, myelosuppression, neuropathy, and obstruction. Disproportionality analysis was used to calculate the empiric Bayes geometric mean (EBGM) and corresponding 90% confidence intervals (EB05, EB95) for adverse event categories. Infliximab was identified as the suspect medication in 18,220 reports in the FDA AERS database. We identified a signal for lymphoma (EB05 = 6.9), neuropathy (EB05 = 3.8), infection (EB05 = 2.9), and bowel obstruction (EB05 = 2.8). The signal for granulomatous infections was stronger than the signal for non-granulomatous infections (EB05 = 12.6 and 2.4, respectively). The signals for bowel obstruction and infusion reaction were specific to patients with IBD; this suggests potential confounding by indication, especially for bowel obstruction. In light of this additional evidence of risk of lymphoma, neuropathy, and granulomatous infections, clinicians should stress this risk in the shared decision-making process.
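    The disproportionality idea behind the EBGM scores above can be illustrated with a simpler measure, the proportional reporting ratio (PRR). This sketch uses made-up report counts, not AERS data, and PRR rather than the empirical Bayes shrinkage actually used in the study.

```python
# Proportional reporting ratio (PRR), a simple disproportionality
# measure used here as a stand-in for the study's EBGM.
# 2x2 table of spontaneous reports (all counts hypothetical):
#                      event of interest   all other events
#   drug of interest          a                   b
#   all other drugs           c                   d

def prr(a, b, c, d):
    """PRR = [a / (a + b)] / [c / (c + d)]."""
    return (a / (a + b)) / (c / (c + d))

# e.g. 120 lymphoma reports among 18,220 infliximab reports vs
# 400 among 2,000,000 reports for all other drugs (invented numbers):
signal = prr(120, 18_220 - 120, 400, 2_000_000 - 400)
print(f"PRR = {signal:.1f}")
```

    A PRR well above 2 with a handful of cases is a common rule-of-thumb signal threshold; EBGM additionally shrinks small-count estimates toward no effect.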

  5. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    Science.gov (United States)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to

  6. Microprocessor event analysis in parallel with Camac data acquisition

    International Nuclear Information System (INIS)

    Cords, D.; Eichler, R.; Riege, H.

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a Camac System (GEC-ELLIOTT System Crate) and shares the Camac access with a Nord-1OS computer. Interfaces have been designed and tested for execution of Camac cycles, communication with the Nord-1OS computer and DMA-transfer from Camac to the MIPROC-16 memory. The system is used in the JADE data-acquisition-system at PETRA where it receives the data from the detector in parallel with the Nord-1OS computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events and the result of various checks is appended to the event. In case of spurious triggers or clear beam gas events, the Nord-1OS buffer will be reset and the event omitted from further processing. (orig.)

  7. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    Energy Technology Data Exchange (ETDEWEB)

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K. (and others)

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. © 1999 The American Astronomical Society.
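    The core of difference imaging can be sketched as follows; this toy version skips the PSF-matching convolution that real DIA pipelines (including MACHO's) apply to the reference frame, and the pixel grids are invented.

```python
# Minimal difference-image sketch: subtract a reference frame from a
# science frame and flag pixels whose residual exceeds a noise
# threshold. A brightening source (e.g. a lensing event) stands out
# without being blended with its neighbors.

reference = [
    [10, 10, 10],
    [10, 50, 10],
    [10, 10, 10],
]
science = [
    [10, 11, 10],
    [10, 95, 10],  # the central star has brightened
    [9, 10, 10],
]

threshold = 5  # counts; residuals below this are treated as noise

residual = [[s - r for s, r in zip(srow, rrow)]
            for srow, rrow in zip(science, reference)]
detections = [(i, j) for i, row in enumerate(residual)
              for j, v in enumerate(row) if abs(v) > threshold]
print(detections)
```

    In a real pipeline the reference is first convolved with a spatially varying kernel so that its PSF matches the science frame before subtraction.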

  8. Microprocessor event analysis in parallel with CAMAC data acquisition

    CERN Document Server

    Cords, D; Riege, H

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a CAMAC System (GEC-ELLIOTT System Crate) and shares the CAMAC access with a Nord-10S computer. Interfaces have been designed and tested for execution of CAMAC cycles, communication with the Nord-10S computer and DMA-transfer from CAMAC to the MIPROC-16 memory. The system is used in the JADE data-acquisition-system at PETRA where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events and the results of various checks are appended to the event. In case of spurious triggers or clear beam gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (5 refs).

  9. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accidents’ probabilities and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are remarkably meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering an icing region.

  10. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time-series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single-cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright-field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
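    The interevent-time analysis described above reduces, in its simplest homogeneous form, to computing intervals between successive mitoses; the sketch below uses invented event times, not the study's data.

```python
# Times of observed mitotic events in one field of view (hours, invented):
event_times_h = [2.1, 5.0, 6.8, 10.3, 12.0, 15.9, 17.2, 21.5]

# Inter-event intervals and the implied rate of a homogeneous
# Poisson model (the study fits a nonhomogeneous rate lambda(t)).
intervals = [t2 - t1 for t1, t2 in zip(event_times_h, event_times_h[1:])]
mean_interval = sum(intervals) / len(intervals)
rate = 1.0 / mean_interval  # mitotic events per hour

print(f"mean inter-event time = {mean_interval:.2f} h, rate = {rate:.3f} /h")
```

    Under a homogeneous Poisson model the intervals would be exponentially distributed with this rate; departures from that distribution are what reveal the synchronisation reported above.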

  11. Event history analysis and the cross-section

    DEFF Research Database (Denmark)

    Keiding, Niels

    2006-01-01

    Examples are given of problems in event history analysis, where several time origins (generating calendar time, age, disease duration, time on study, etc.) are considered simultaneously. The focus is on complex sampling patterns generated around a cross-section. A basic tool is the Lexis diagram....

  12. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    International Nuclear Information System (INIS)

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-01-01

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility
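    A back-of-the-envelope version of such an overpressure estimate assumes quasi-static adiabatic energy addition to the air of a closed room, ΔP = (γ - 1)E/V for an ideal gas; the energy and volume below are illustrative, not NIF values.

```python
GAMMA_AIR = 1.4  # ratio of specific heats for air

def overpressure_pa(energy_j, volume_m3, gamma=GAMMA_AIR):
    """Quasi-static pressure rise dP = (gamma - 1) * E / V for an ideal gas."""
    return (gamma - 1.0) * energy_j / volume_m3

# e.g. 2 MJ released into a 1000 m^3 bay (invented numbers):
dp = overpressure_pa(2.0e6, 1000.0)
print(f"overpressure ~ {dp:.0f} Pa ({dp / 1000.0:.1f} kPa)")
```

    The result would then be compared against the rated pressure tolerance of the bay walls, as the analysis above does.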

  13. Root Cause Analysis: Learning from Adverse Safety Events.

    Science.gov (United States)

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. © RSNA, 2015.

  14. Physics analysis of the gang partial rod drive event

    International Nuclear Information System (INIS)

    Boman, C.; Frost, R.L.

    1992-08-01

    During the routine positioning of partial-length control rods in Gang 3 on the afternoon of Monday, July 27, 1992, the partial-length rods continued to drive into the reactor even after the operator released the controlling toggle switch. In response to this occurrence, the Safety Analysis and Engineering Services Group (SAEG) requested that the Applied Physics Group (APG) analyze the gang partial rod drive event. Although similar accident scenarios were considered in analysis for Chapter 15 of the Safety Analysis Report (SAR), APG and SAEG conferred and agreed that this particular type of gang partial-length rod motion event was not included in the SAR. This report details this analysis

  15. LOSP-initiated event tree analysis for BWR

    International Nuclear Information System (INIS)

    Watanabe, Norio; Kondo, Masaaki; Uno, Kiyotaka; Chigusa, Takeshi; Harami, Taikan

    1989-03-01

    As a preliminary study for a 'Japanese Model Plant PSA', a LOSP (loss of off-site power)-initiated Event Tree Analysis for a typical Japanese BWR was carried out solely on the basis of open documents such as the 'Safety Analysis Report'. The objectives of this analysis are as follows; - to delineate core-melt accident sequences initiated by LOSP, - to evaluate the importance of core-melt accident sequences in terms of occurrence frequency, and - to develop a foundation of plant information and analytical procedures for efficiently performing the further 'Japanese Model Plant PSA'. This report describes the procedure and results of the LOSP-initiated Event Tree Analysis. In this analysis, two types of event trees, Functional Event Trees and Systemic Event Trees, were developed to delineate core-melt accident sequences and to quantify their frequencies. A Front-line System Event Tree was prepared as well to provide core-melt sequence delineation for the accident progression analysis of a Level 2 PSA, to follow in the future. Applying U.S. operational experience data, such as component failure rates and a LOSP frequency, we obtained the following results; - The total frequency of core-melt accident sequences initiated by LOSP is estimated at 5 x 10⁻⁴ per reactor-year. - The dominant sequences are 'Loss of Decay Heat Removal' and 'Loss of Emergency Electric Power Supply', which account for more than 90% of the total core-melt frequency. In this analysis, a LOSP frequency of 0.13/reactor-year, higher than Japanese experience, was used, and no recovery action was considered. In fact, there has been no LOSP event at Japanese nuclear power plants so far, and it is also expected that offsite power and/or the PCS would be recovered before core melt. Considering Japanese operating experience and recovery factors would reduce the total core-melt frequency to less than 10⁻⁶ per reactor-year. (J.P.N.)
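    The sequence quantification described above multiplies the initiating-event frequency by the failure probabilities along each event-tree path; the sketch below uses hypothetical branch probabilities chosen only to reproduce the order of magnitude quoted.

```python
# Event-tree sequence quantification sketch. The LOSP frequency is the
# value quoted in the abstract; the branch probabilities are invented.
losp_freq = 0.13       # LOSP initiating-event frequency (/reactor-year)
p_dhr_fail = 3.0e-3    # decay-heat-removal failure probability (hypothetical)
p_eps_fail = 1.0e-3    # emergency-power failure probability (hypothetical)

seq_dhr = losp_freq * p_dhr_fail   # 'Loss of Decay Heat Removal' sequence
seq_eps = losp_freq * p_eps_fail   # 'Loss of Emergency Electric Power Supply'
total = seq_dhr + seq_eps

print(f"total core-melt frequency ~ {total:.1e} /reactor-year")
```

    Lowering the initiating-event frequency or crediting recovery actions scales each sequence frequency directly, which is why the abstract's sensitivity argument reduces the total by two orders of magnitude.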

  16. Evaluation of Fourier integral. Spectral analysis of seismic events

    International Nuclear Information System (INIS)

    Chitaru, Cristian; Enescu, Dumitru

    2003-01-01

    Spectral analysis of seismic events represents a method for the prediction of great earthquakes. The seismic signal is not a sinusoidal signal; it is therefore necessary to find a method for the best approximation of the real signal by a sinusoidal signal. The 'Quanterra' broadband station allows data access in numerical and/or graphical form. With the numerical form we can easily make a computer program (MSOFFICE-EXCEL) for spectral analysis. (authors)

  17. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    The 9th ARRCN Symposium 2015 was held during 21–25 October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20–22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  18. Top event prevention analysis - a deterministic use of PRA

    International Nuclear Information System (INIS)

    Blanchard, D.P.; Worrell, R.B.

    1995-01-01

    Risk importance measures are popular for many applications of probabilistic analysis. Inherent in the derivation of risk importance measures are implicit assumptions that those using these numerical results should be aware of in their decision making. These assumptions and potential limitations include the following: (1) The risk importance measures are derived for a single event at a time and are therefore valid only if all other event probabilities are unchanged at their current values. (2) The results for which risk importance measures are derived may not be complete for reasons such as truncation
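    The "one event at a time" assumption noted in item (1) can be made concrete: Fussell-Vesely and risk achievement worth both re-evaluate the top-event frequency with a single basic-event probability changed while all others stay fixed. The cutsets and probabilities below are hypothetical.

```python
# Risk importance measures from minimal cutsets, using the rare-event
# approximation (top frequency = sum of cutset products).
cutsets = [("A", "B"), ("A", "C"), ("D",)]
p_base = {"A": 1e-2, "B": 1e-3, "C": 5e-3, "D": 1e-5}

def top_freq(p):
    """Rare-event approximation of the top-event frequency."""
    total = 0.0
    for cs in cutsets:
        prod = 1.0
        for e in cs:
            prod *= p[e]
        total += prod
    return total

def with_value(p, event, value):
    """Copy of p with one event's probability changed, all others fixed."""
    q = dict(p)
    q[event] = value
    return q

base = top_freq(p_base)
fv_a = (base - top_freq(with_value(p_base, "A", 0.0))) / base  # risk fraction from A
raw_a = top_freq(with_value(p_base, "A", 1.0)) / base          # risk if A were certain

print(f"base = {base:.2e}, FV(A) = {fv_a:.3f}, RAW(A) = {raw_a:.1f}")
```

    Both measures are valid only under the stated ceteris-paribus condition: changing two event probabilities at once can give a combined effect that the single-event measures do not predict.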

  19. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.

  20. Using discriminant analysis as a nucleation event classification method

    Directory of Open Access Journals (Sweden)

    S. Mikkonen

    2006-01-01

    More than three years of measurements of aerosol size distributions and various gas and meteorological parameters made in the Po Valley, Italy, were analysed in this study to examine which meteorological and trace-gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, a non-parametric density estimation method. The best classification result for our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation, whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events, but because of the great number of missing observations we had to exclude them from the final analysis.
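    A minimal sketch of kernel discriminant analysis with an Epanechnikov kernel, using a single predictor (relative humidity) and invented training values; the study used several predictors and proper bandwidth selection.

```python
def epanechnikov_kde(x, samples, h):
    """Kernel density estimate at x with an Epanechnikov kernel of bandwidth h."""
    total = 0.0
    for s in samples:
        u = (x - s) / h
        if abs(u) <= 1.0:
            total += 0.75 * (1.0 - u * u)
    return total / (len(samples) * h)

rh_event = [35, 40, 42, 48, 50]     # RH (%) on nucleation-event days (invented)
rh_nonevent = [60, 68, 72, 75, 80]  # RH (%) on non-event days (invented)

def classify(rh, h=10.0):
    """Assign the class whose estimated density at rh is larger."""
    d_event = epanechnikov_kde(rh, rh_event, h)
    d_nonevent = epanechnikov_kde(rh, rh_nonevent, h)
    return "event" if d_event > d_nonevent else "non-event"

print(classify(45), classify(70))
```

    The decision rule is Bayes-style: each class gets its own density estimate, and a new day is assigned to whichever class makes its predictor values more probable.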

  1. External events analysis in PSA studies for Czech NPPs

    International Nuclear Information System (INIS)

    Holy, J.; Hustak, S.; Kolar, L.; Jaros, M.; Hladky, M.; Mlady, O.

    2014-01-01

    The purpose of the paper is to summarize the current status of natural external hazards analysis in the PSA projects maintained in the Czech Republic for both Czech NPPs, Dukovany and Temelin. The focus of the presentation is placed on the basic milestones of the external event analysis effort: identification of external hazards important for the Czech NPP sites, screening out of irrelevant hazards, modeling of the plant response to the initiating events (including the basic activities regarding vulnerability and fragility analysis, supported with on-site analysis), quantification of accident sequences, interpretation of results, and development of measures decreasing external event risk. The following external hazards, which have been addressed during the last several years in the PSA projects for the Czech NPPs, are discussed in the paper: 1) seismicity, 2) extremely low temperature, 3) extremely high temperature, 4) extreme wind, 5) extreme precipitation (water, snow), 6) transport of dangerous substances (as an example of a man-made hazard with some differences identified in comparison with natural hazards), 7) other hazards which are not considered very important for the Czech NPPs and were screened out in the initial phase of the analysis, but are known as potential problem areas abroad. The paper is the result of a coordinated effort with participation of experts and staff from the engineering support organization UJV Rez, a.s. and the NPPs located in the Czech Republic, Dukovany and Temelin. (authors)

  2. Analysis of system and of course of events

    International Nuclear Information System (INIS)

    Hoertner, H.; Kersting, E.J.; Puetter, B.M.

    1986-01-01

    The analysis of the system and of the course of events is used to determine the frequency of core melt-out accidents and to describe the safety-related boundary conditions of appropriate accidents. The lecture is concerned with the effect of system changes in the reference plant and the effect of triggering events not assessed in detail or not sufficiently assessed in detail in phase A of the German Risk Study on the frequency of core melt-out accidents, the minimum requirements for system functions for controlling triggering events, i.e. to prevent core melt-out accidents, the reliability data important for reliability investigations and frequency assessments. (orig./DG)

  3. EVNTRE, Code System for Event Progression Analysis for PRA

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions for the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases of time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. 
Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
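    The traversal a generalized event-tree processor like EVNTRE performs can be sketched as a recursive walk over questions with split fractions, pruning paths whose frequency falls below a cutoff tolerance; the question set and numbers here are illustrative, not EVNTRE's actual input format.

```python
# Toy event-tree processor: each question has branches whose split
# fractions sum to one; leaves are enumerated with their path
# frequencies, and low-frequency paths are pruned.
questions = [
    ("power recovered?", {"yes": 0.9, "no": 0.1}),
    ("cooling works?",   {"yes": 0.95, "no": 0.05}),
]

def enumerate_leaves(init_freq=1.0, cutoff=1e-6):
    leaves = []

    def walk(i, path, freq):
        if freq < cutoff:
            return  # prune paths below the cutoff tolerance
        if i == len(questions):
            leaves.append((tuple(path), freq))
            return
        _, branches = questions[i]
        for answer, split in branches.items():
            walk(i + 1, path + [answer], freq * split)

    walk(0, [], init_freq)
    return leaves

for path, freq in enumerate_leaves():
    print(path, f"{freq:.4f}")
```

    Path-dependent split fractions and the binning of leaves by outcome attributes, as EVNTRE supports, would extend this skeleton by making each branch's probability a function of the path taken so far.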

  4. Incident sequence analysis; event trees, methods and graphical symbols

    International Nuclear Information System (INIS)

    1980-11-01

    When analyzing incident sequences, unwanted events resulting from a certain cause are looked for. Graphical symbols and explanations of graphical representations are presented. The method applies to the analysis of incident sequences in all types of facilities. By means of the incident sequence diagram, incident sequences, i.e. the logical and chronological course of repercussions initiated by the failure of a component or by an operating error, can be presented and analyzed simply and clearly

  5. Analysis of operation events for HFETR emergency diesel generator set

    International Nuclear Information System (INIS)

    Li Zhiqiang; Ji Xifang; Deng Hong

    2015-01-01

    Through statistical analysis of the historical failure data of the emergency diesel generator set, the specific mode, the attributes, and the direct and root causes of each failure are reviewed and summarized. Considering the current status of the emergency diesel generator set, preventive measures and solutions in terms of operation, handling and maintenance are proposed, and the potential events for the emergency diesel generator set are analyzed. (authors)

  6. Practical guidance for statistical analysis of operational event data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies
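    One standard "combination of failure modes" result of the kind such guidance covers: for independent exponential failure modes the rates add, so the combined mean time to failure is the reciprocal of the summed rates. The rates below are illustrative.

```python
# Independent exponential failure modes: rates add.
rates_per_hour = [1.0e-4, 5.0e-5, 2.5e-5]  # per-mode failure rates (invented)

lam_total = sum(rates_per_hour)  # combined failure rate
mttf_hours = 1.0 / lam_total     # combined mean time to failure

print(f"combined rate = {lam_total:.2e} /h, MTTF = {mttf_hours:.0f} h")
```

    The additivity holds only for independent, simultaneously active modes; common-cause or standby failure modes need the more careful moment formulas the report discusses.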

  7. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  8. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

    The interactive analysis of human error factors in NPP operation events is introduced, and 645 WANO operation event reports from 1999 to 2008 were analyzed, among which 432 were found to be related to human errors. After classifying these errors by their root causes or causal factors and then applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors; forming good personnel work practices is systematic work which needs support in many respects. (2) Verbal communications, personnel work practices, the man-machine interface, and written procedures and documents play great roles. They are four interacting factors which often come as a bundle: if some improvement needs to be made to one of them, synchronous measures are also necessary for the others. (3) Management direction and the decision process, which are related to management, have a significant interaction with personnel factors. (authors)

  9. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differences among the abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.
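    The orientation histogram at the core of the descriptor above can be sketched in a few lines: each optical flow vector votes into an orientation bin, weighted by its magnitude, and the histogram is normalized. This is a hedged reconstruction of the general idea only; the function name, bin count, and normalization scheme are assumptions, not the paper's implementation.

```python
import math

def hof_orientation_histogram(flow, n_bins=8):
    """Histogram of optical flow orientation for one frame.

    `flow` is a list of (dx, dy) motion vectors (e.g. one per pixel).
    Each vector votes into an orientation bin, weighted by its
    magnitude; the result is L1-normalized.  Illustrative sketch only.
    """
    hist = [0.0] * n_bins
    for dx, dy in flow:
        mag = math.hypot(dx, dy)
        if mag == 0.0:
            continue  # zero motion carries no orientation information
        angle = math.atan2(dy, dx) % (2 * math.pi)      # map to [0, 2*pi)
        hist[int(angle / (2 * math.pi / n_bins)) % n_bins] += mag
    total = sum(hist) or 1.0
    return [h / total for h in hist]

# A frame whose motion is uniformly rightward puts all mass in bin 0.
print(hof_orientation_histogram([(1.0, 0.0), (2.0, 0.0)]))
```

Frame-level descriptors like this would then be fed to the one-class classifier trained on normal behavior.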

  10. Detection of Abnormal Events via Optical Flow Feature Analysis

    Science.gov (United States)

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differences among the abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227

  11. Vulnerability analysis of a PWR to an external event

    International Nuclear Information System (INIS)

    Aruety, S.; Ilberg, D.; Hertz, Y.

    1980-01-01

    The vulnerability of a nuclear power plant (NPP) to external events is affected by several factors, such as: the degree of redundancy of the reactor systems, subsystems, and components; the separation of systems provided in the general layout; the extent of the vulnerable area, i.e., the area which, upon being affected by an external event, will result in system failure; and the time required to repair or replace the systems, when allowed. The present study offers a methodology, using probabilistic safety analysis, to evaluate the relative importance of the above parameters in reducing the vulnerability of reactor safety systems. Several safety systems of typical PWRs are analyzed as examples. It was found that the degree of redundancy and the physical separation of the systems have the most prominent effect on the vulnerability of the NPP

  12. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop manufacturing, but certain facilities make it suitable for FMS as well as production-line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, the use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in process), the costs, or the net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late at one machine has on the remaining machines in its route through the layout; it is these effects that cause production plans not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages with business graphics and statistical functions is convenient in the result presentation phase.
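    The kernel mechanism described above (entity processes broken down into events held in an event list) can be illustrated with a minimal sketch. This is not SIMMEK; it is a generic single-queue job-shop model with invented names, showing only how a heap-based event list drives a discrete event simulation.

```python
import heapq

def simulate(jobs, n_machines=1):
    """Minimal job-shop sketch: each job is (arrival_time, service_time).

    Events are (time, seq, kind, job) tuples kept in a heap-based event
    list; the simulator pops the earliest event, updates machine state,
    and schedules follow-on events.  Returns each job's completion time.
    """
    events = []                       # the event list
    seq = 0                           # tie-breaker for simultaneous events
    for j, (arrival, _) in enumerate(jobs):
        heapq.heappush(events, (arrival, seq, "arrive", j)); seq += 1
    free = n_machines
    queue, done = [], {}
    while events:
        t, _, kind, j = heapq.heappop(events)
        if kind == "arrive":
            queue.append(j)
        else:                         # "finish": machine released
            free += 1
            done[j] = t
        while free and queue:         # start waiting jobs on free machines
            nxt = queue.pop(0)
            free -= 1
            heapq.heappush(events, (t + jobs[nxt][1], seq, "finish", nxt))
            seq += 1
    return done

# Two jobs, one machine: the second must wait for the first to finish.
print(simulate([(0.0, 5.0), (1.0, 3.0)]))   # → {0: 5.0, 1: 8.0}
```

Throughput, WIP, and waiting-time statistics would be accumulated inside the event handlers in the same way.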

  13. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.

  14. Prism reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.

    1991-08-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristic and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. 6 refs., 4 figs

  15. PRISM reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.

    1991-01-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristic and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. (author)

  16. PRISM reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.; Rosztoczy, Z.; Lane, J.

    1991-01-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristics and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. 6 refs., 4 figs

  17. Bisphosphonates and risk of cardiovascular events: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Dae Hyun Kim

    Full Text Available Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Absolute risks over 25-36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; OR [95% CI]: 0.98 [0.84-1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92-1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69-1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82-1.19]; I2 = 5.8%), or CV death (14 trials; 0.88 [0.72-1.07]; I2 = 0.0%), with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96-1.61]; I2 = 0.0%), but not for oral bisphosphonates (26 trials; 1.02 [0.83-1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large reduction in fractures with bisphosphonates, changes in
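    The Mantel-Haenszel fixed-effects pooling named above can be sketched directly from the per-trial 2x2 tables. This is the generic textbook formula, not the authors' code, and the trial counts below are invented.

```python
def mh_odds_ratio(tables):
    """Mantel-Haenszel fixed-effects pooled odds ratio.

    Each table is (a, b, c, d): events and non-events in the treated
    arm (a, b) and in the control arm (c, d).  The pooled OR is
    sum(a*d/n) / sum(b*c/n) over trials, with n the table total.
    Illustrative sketch of the pooling step only.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two invented trials with identical tables pool to the single-table
# odds ratio ad/bc = 10*88 / (90*12) ≈ 0.815.
tables = [(10, 90, 12, 88), (10, 90, 12, 88)]
print(mh_odds_ratio(tables))
```

A confidence interval would additionally require a variance estimate for log(OR), e.g. the Robins-Breslow-Greenland formula.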

  18. Analysis of warm convective rain events in Catalonia

    Science.gov (United States)

    Ballart, D.; Figuerola, F.; Aran, M.; Rigo, T.

    2009-09-01

    Between the end of September and November, events with high amounts of rainfall are quite common in Catalonia. The high sea surface temperature of the Mediterranean Sea near the Catalan coast is one of the most important factors that help the development of this type of storm. Some of these events have particular characteristics: elevated rain rates during short time periods, not very deep convection, and low lightning activity. Consequently, the use of remote sensing tools for surveillance is of limited value. The high rain efficiency is caused by internal mechanisms of the clouds and by the air mass in which the precipitation structure develops. As mentioned, the contribution of the sea to the air mass is very relevant, not only through the supply of large condensation nuclei, but also through the high temperature of the lower layers of the atmosphere, which allows clouds with 5 or 6 km of particles in the liquid phase; in fact, the freezing level in these clouds can be found at about -15°C. Due to these characteristics, this type of rainy structure can produce large quantities of rainfall in a relatively brief period of time, and, if quasi-stationary, precipitation values at the surface can be very important. From the point of view of remote sensing tools, the nature of the clouds implies that the tools and methodologies commonly used for the analysis of heavy rain events are not useful, for the following reasons: lightning is rarely observed, the cloud-top temperatures are not cold enough to be enhanced in the satellite imagery, and radar reflectivity values are lower than in other heavy rain cases. The third point to take into account is the vulnerability of the affected areas. A high percentage of the Catalan population lives in the coastal region. In the central coast of Catalonia, the urban areas are surrounded by a not very high mountain range with small basins and

  19. Analysis of the stability of events occurred in Laguna Verde

    International Nuclear Information System (INIS)

    Castillo D, R.; Ortiz V, J.; Calleros M, G.

    2005-01-01

    The new fuel designs for longer operation cycles have larger regions of uncertainty than the old fuels, and therefore power oscillations can occur when an event causes the reactor to operate at high power and low coolant flow. During reactor startup, procedures are followed that prevent oscillations, ensuring stable reactor behavior. However, when the reactor is operating at nominal conditions and the recirculation pumps trip or are transferred to low speed, it cannot be guaranteed that the reactor will not present power oscillations on entering the restricted operation regions. Stability analysis methods commonly use neutronic noise signals that are required to be stationary, but after a transient in which the recirculation pumps are lost the signals do not have the required characteristics, so they are used with a certain level of uncertainty given the limited validity of the models. In this work the Prony method is used to determine reactor stability from transient signals, and it is compared with autoregressive models. Four events that occurred at the Laguna Verde power plant while the reactor was in the region of high power and low coolant flow are analyzed, giving satisfactory results. (Author)
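    The Prony-type estimation mentioned above can be sketched for the simplest case: fit a second-order linear-prediction model to the power signal, take the dominant complex pole, and report the decay ratio. This is an illustrative sketch under an AR(2) assumption, not the plant's analysis codes.

```python
import cmath, math

def decay_ratio(signal):
    """Estimate the decay ratio of a BWR-like power oscillation.

    Fits a second-order linear-prediction (Prony/AR(2)) model
    x[n] = a1*x[n-1] + a2*x[n-2] by least squares, takes the dominant
    complex pole z = r*exp(i*theta), and returns DR = r**(2*pi/theta),
    the amplitude ratio between consecutive oscillation peaks.
    """
    # Accumulate the 2x2 normal equations for the least-squares fit.
    s11 = s12 = s22 = b1 = b2 = 0.0
    for n in range(2, len(signal)):
        x1, x2, y = signal[n - 1], signal[n - 2], signal[n]
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        b1 += x1 * y;   b2 += x2 * y
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (s11 * b2 - s12 * b1) / det
    # Dominant root of the characteristic polynomial z^2 - a1*z - a2 = 0.
    z = (a1 + cmath.sqrt(a1 * a1 + 4 * a2)) / 2
    r, theta = abs(z), abs(cmath.phase(z))
    return r ** (2 * math.pi / theta)

# A noise-free damped oscillation x[n] = r^n cos(n*theta) satisfies the
# AR(2) recurrence exactly, so its decay ratio is recovered precisely.
r, theta = 0.98, 0.5
x = [r ** n * math.cos(n * theta) for n in range(200)]
print(round(decay_ratio(x), 4))   # → 0.7758
```

A decay ratio below 1 indicates damped (stable) oscillations; real transient signals would add noise and model-order questions that this sketch ignores.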

  20. Formal Analysis of BPMN Models Using Event-B

    Science.gov (United States)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.

  1. Root cause analysis for fire events at nuclear power plants

    International Nuclear Information System (INIS)

    1999-09-01

    Fire hazard has been identified as a major contributor to a plant's operational safety risk. The international nuclear power community (regulators, operators, designers) has been studying and developing tools for defending against this hazard. Considerable advances have been achieved during the past two decades in design and regulatory requirements for fire safety, fire protection technology, and related analytical techniques. The IAEA endeavours to provide assistance to Member States in improving fire safety in nuclear power plants. A task was launched by the IAEA in 1993 with the purpose of developing guidelines and good practices, promoting advanced fire safety assessment techniques, exchanging state-of-the-art information, and providing engineering safety advisory services and training in the implementation of internationally accepted practices. This TECDOC addresses the systematic assessment of fire events using the root cause analysis methodology, which is recognized as an important element of fire safety assessment

  2. The Run 2 ATLAS Analysis Event Data Model

    CERN Document Server

    SNYDER, S; The ATLAS collaboration; NOWAK, M; EIFERT, T; BUCKLEY, A; ELSING, M; GILLBERG, D; MOYSE, E; KOENEKE, K; KRASZNAHORKAY, A

    2014-01-01

    During the LHC's first Long Shutdown (LS1) ATLAS set out to establish a new analysis model, based on the experience gained during Run 1. A key component of this is a new Event Data Model (EDM), called the xAOD. This format, which is now in production, provides the following features: A separation of the EDM into interface classes that the user code directly interacts with, and data storage classes that hold the payload data. The user sees an Array of Structs (AoS) interface, while the data is stored in a Struct of Arrays (SoA) format in memory, thus making it possible to efficiently auto-vectorise reconstruction code. A simple way of augmenting and reducing the information saved for different data objects. This makes it possible to easily decorate objects with new properties during data analysis, and to remove properties that the analysis does not need. A persistent file format that can be explored directly with ROOT, either with or without loading any additional libraries. This allows fast interactive naviga...
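    The Array-of-Structs interface over Struct-of-Arrays storage described above can be illustrated with a short sketch, including the xAOD-style "decoration" of objects with new properties. The class and property names below are invented for illustration; the real xAOD is a C++ EDM and far richer.

```python
class SoAContainer:
    """Struct-of-Arrays storage with an Array-of-Structs view.

    Properties live in parallel per-property lists (friendly to caches
    and auto-vectorisation); element access returns a lightweight proxy
    so user code reads like object access.  Names are illustrative.
    """
    def __init__(self, **columns):
        self._cols = dict(columns)          # e.g. {"pt": [...], "eta": [...]}

    def decorate(self, name, values):
        """Augment every element with a new property, decoration-style."""
        self._cols[name] = list(values)

    def __len__(self):
        return len(next(iter(self._cols.values())))

    def __getitem__(self, i):
        container = self
        class _Proxy:                        # interface object over the payload
            def __getattr__(self, name):
                return container._cols[name][i]
        return _Proxy()

jets = SoAContainer(pt=[50.0, 30.0], eta=[0.1, -1.2])
jets.decorate("is_good", [True, False])     # added during analysis
print(jets[0].pt, jets[1].is_good)          # → 50.0 False
```

Dropping a column from `_cols` would correspond to reducing the stored information for properties an analysis does not need.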

  3. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    Science.gov (United States)

    2016-03-24

    ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION ... in the United States. AFIT-ENV-MS-16-M-166. Erich W

  4. Analysis of core damage frequency: Surry, Unit 1 internal events

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the accident sequence analysis of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 documents the risk of a selected group of nuclear power plants. The work performed and described here is an extension of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The content and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.05E-5 per year, with a 95% upper bound of 1.34E-4 and a 5% lower bound of 6.8E-6 per year. Station blackout type accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next dominant contributors were loss-of-coolant accidents (LOCAs); these sequences account for 15% of the core damage frequency. No other type of sequence accounts for more than 10% of the core damage frequency. 49 refs., 52 figs., 70 tabs

  5. Trend analysis of explosion events at overseas nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Hiroki

    2008-01-01

    We surveyed failures caused by disasters (e.g., severe storms, heavy rainfall, earthquakes, explosions, and fires) which occurred during the 13 years from 1995 to 2007 at overseas nuclear power plants (NPPs), using the nuclear information database of the Institute of Nuclear Safety System, Incorporated (INSS). The results revealed that explosions were the second most frequent type of failure after fires. We conducted a trend analysis of these explosion events. The analysis by equipment, cause, and effect on the plant showed that the explosions occurred mainly at electrical facilities, and thus maintenance management of electrical facilities is essential for preventing explosions. In addition, explosions at transformers and batteries, which have never occurred at Japan's NPPs, accounted for as much as 55% of all explosions. This suggests that the difference is attributable to the difference in maintenance methods for transformers (condition-based maintenance adopted by NPPs) and in workforce organization for batteries (inspections performed by utilities' own maintenance workers at NPPs). (author)

  6. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    Science.gov (United States)

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and the length of the free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication related patient safety events, and the models were compared. Well-performing NLP models were generated to categorize medication related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors, with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81, respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
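    As a stand-in for the NLP models described above (whose internals are not given in the abstract), a minimal bag-of-words naive Bayes classifier shows the general shape of free-text event categorization. The categories and report texts below are invented, and this is not the authors' approach or code.

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """Train a multinomial naive Bayes text classifier.

    `docs` is a list of (text, label) pairs.  A deliberately simple
    illustration of free-text categorization, not a production model.
    """
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in docs:
        tokens = text.lower().split()
        word_counts[label].update(tokens)
        label_counts[label] += 1
        vocab.update(tokens)
    return word_counts, label_counts, vocab

def classify(model, text):
    word_counts, label_counts, vocab = model
    n_docs = sum(label_counts.values())
    best, best_lp = None, -math.inf
    for label in label_counts:
        total = sum(word_counts[label].values())
        lp = math.log(label_counts[label] / n_docs)       # class prior
        for tok in text.lower().split():
            # Laplace smoothing over the shared vocabulary.
            lp += math.log((word_counts[label][tok] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = train_nb([
    ("medication delivered late to unit", "delivery delay"),
    ("pharmacy delivery delayed overnight", "delivery delay"),
    ("wrong dose dispensed by pharmacy", "dispensing error"),
    ("dispensed incorrect medication dose", "dispensing error"),
])
print(classify(model, "delivery was delayed"))   # → delivery delay
```

Real systems would use far larger training sets and evaluate with ROC curves as the paper does.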

  7. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    Directory of Open Access Journals (Sweden)

    Maria Cleofe Giorgino

    2017-11-01

    Full Text Available Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality of the information being disclosed. This article aims to address this gap by exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, the Integrated Report (IR). The features of this tool are that it aims to represent the multidimensional impact of the organization's activity and that it assumes materiality as a guiding principle of report drafting. Adopting the event study methodology together with a statistical significance test for categorical data, we verify that an organization's release of an IR is able to produce a statistically significant impact on the related share prices. Moreover, the term "integrated" assigned to the reports plays a significant role in the impact on capital markets. Our findings have beneficial implications for both researchers and practitioners, adding new evidence for the usefulness of IR as a corporate disclosure tool and for the effect of an organization's decision to disclose material information.
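    The event study methodology mentioned above can be sketched in its most common market-model form: estimate alpha and beta over a pre-event window, then cumulate abnormal returns over the event window. This is a generic sketch of the methodology, not the authors' implementation; the return series below are invented.

```python
def event_study_car(stock, market, est_n, win):
    """Market-model event study: cumulative abnormal return (CAR).

    Fits r_stock = alpha + beta * r_market by OLS over the first
    `est_n` estimation-window returns, then sums the abnormal returns
    (actual minus predicted) over the next `win` event-window returns.
    """
    xs, ys = market[:est_n], stock[:est_n]
    mx, my = sum(xs) / est_n, sum(ys) / est_n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
         / sum((x - mx) ** 2 for x in xs)
    alpha = my - beta * mx
    return sum(stock[est_n + k] - (alpha + beta * market[est_n + k])
               for k in range(win))

# If the stock tracks the market exactly except for a +2% jump on the
# event day, the CAR over the event window recovers that 2%.
mkt = [0.01, -0.02, 0.015, 0.0, 0.005, 0.01, -0.01]
stk = list(mkt)
stk[5] += 0.02                      # invented event-day abnormal return
print(round(event_study_car(stk, mkt, est_n=5, win=2), 4))   # → 0.02
```

A significance test (parametric or, as in the paper, categorical) would then be applied to the CARs across the event sample.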

  8. Pertussis outbreak in Polish shooters with adverse event analysis

    Directory of Open Access Journals (Sweden)

    Monika Skrzypiec-Spring

    2017-04-01

    Full Text Available In addition to injuries, infections are the most common reason for giving up training altogether or reducing its volume and intensity, as well as for a lack of opportunities to participate in sports competitions. Nowadays, a slow but constant re-emergence of pertussis, especially among teenagers and young adults, including athletes, can be observed. This paper describes an outbreak of pertussis among professional Polish shooters, focusing on the transmission of Bordetella pertussis infection between members of the national team, its influence on performance capacity, and adverse event analysis. From 9 June to 31 July 2015, a total of 4 confirmed and suspected cases of pertussis were reported among members of the Polish Sport Shooting National Team, their relatives, and acquaintances. Pertussis significantly decreased the exercise performance of the first athlete, a 35-year-old woman, interrupted her training, and finally resulted in failure to win a medal or quota place. Pertussis also significantly decreased the performance of the second athlete, a 25-year-old shooter. The other cases emerged in their families. Whooping cough is a real threat to athletes and should be prevented. Preventive measures include appropriate immunization, constant medical supervision, as well as early isolation, diagnostic tests, and treatment of all infected sport team members. Regular administration of booster doses of the acellular pertussis vaccine (Tdap) every 5 years seems reasonable.

  9. Analysis of Multi Muon Events in the L3 Detector

    CERN Document Server

    Schmitt, Volker

    2000-01-01

    The muon density distribution in air showers initiated by cosmic particles is sensitive to the chemical composition of cosmic rays. The density can be measured via the multiplicity distribution in a finite-size detector, as is L3. With a shallow depth of 30 meters underground, the detector provides an excellent facility to measure a high muon rate, while being shielded from the hadronic and electronic shower components. Subject of this thesis is the description of the L3 Cosmics experiment (L3+C), which has been taking data since May 1999, and the analysis of muon bundles in the large magnetic spectrometer of L3. The new cosmic trigger and readout system is briefly described. The influence of different primaries on the multiplicity distribution has been investigated using Monte Carlo event samples, generated with the CORSIKA program. The simulation results showed that L3+C measures in the region of the "knee" of the primary spectrum of cosmic rays. A new pattern recognition has been developed and added to the reconstruction code, which ...

  10. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are one of the greatest inventions of modern medicine, having contributed most to the relief of human misery and the dramatic increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculation with cowpox can protect against smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Based on worldwide vaccination, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies have been developed, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  11. Civil protection and Damaging Hydrogeological Events: comparative analysis of the 2000 and 2015 events in Calabria (southern Italy

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2017-11-01

    Full Text Available Calabria (southern Italy) is a flood-prone region, due to both its rough orography and the fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach to the comparative analysis of two events affecting the same area 15 years apart, collecting all the qualitative and quantitative features useful to describe both rain and damage. The aim is to understand whether similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, which occurred between 8 and 10 September 2000, damaged 109 out of 409 municipalities of the region and killed 13 people in a campsite due to a flood. The second event, which occurred between 30 October and 1 November 2015, damaged 79 municipalities and killed a man due to a flood. The comparative analysis highlights that, although the exceptionality of the triggering daily rain was higher in the 2015 event, the damage caused by the 2000 event to both infrastructure and belongings was higher, and it was strongly aggravated by the 13 flood victims. We conclude that, in the 2015 event, the management of the pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people in hazardous situations due to landslides or floods, contributed to reducing the number of victims.

  12. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Dale [Los Alamos National Laboratory; Selby, Neil [AWE Blacknest

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms|mb event screening hypothesis test is not fully consistent with the physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), while the improved standard error 'fails to reject' H0.
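    The two combination rules named above are simple enough to sketch directly: Fisher's statistic is -2 times the sum of log p-values, distributed chi-square with 2k degrees of freedom under H0, and Tippett's test uses the minimum p-value. This is the textbook form of the rules, not the screening code used in the study.

```python
import math

def fisher_combined_p(pvals):
    """Fisher's method: X = -2*sum(ln p_i) ~ chi-square with 2k df.

    For even degrees of freedom 2k, the chi-square survival function
    has the closed form exp(-x/2) * sum_{i<k} (x/2)^i / i!, so no
    special-function library is needed.
    """
    k = len(pvals)
    x = -2.0 * sum(math.log(p) for p in pvals)
    half = x / 2.0
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))

def tippett_combined_p(pvals):
    """Tippett's method: combined p = 1 - (1 - min p)^k."""
    return 1.0 - (1.0 - min(pvals)) ** len(pvals)

# Two moderately small single-phenomenology p-values combine into a
# smaller joint p-value under Fisher's method.
print(round(fisher_combined_p([0.05, 0.10]), 4))   # → 0.0315
```

In a screening context, each p-value would come from one phenomenology (e.g. the Ms|mb test, event depth), and the combined p-value drives the joint screening decision.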

  13. Analysis of early initiating event(s) in radiation-induced thymic lymphomagenesis

    International Nuclear Information System (INIS)

    Muto, Masahiro; Ying Chen; Kubo, Eiko; Mita, Kazuei

    1996-01-01

    Since the T cell receptor rearrangement is a sequential process and unique to the progeny of each clone, we investigated the early initiating events in radiation-induced thymic lymphomagenesis by comparing the oncogenic alterations with the pattern of γ T cell receptor (TCR) rearrangements. We reported previously that after leukemogenic irradiation, preneoplastic cells developed, albeit infrequently, from thymic leukemia antigen-2-positive (TL-2+) thymocytes. Limited numbers of TL-2+ cells from individual irradiated B10.Thy-1.1 mice were injected into B10.Thy-1.2 mice intrathymically, and the common genetic changes among the donor-type T cell lymphomas were investigated with regard to the p53 gene and chromosome aberrations. The results indicated that some mutations in the p53 gene had taken place in these lymphomas, but there was no common mutation among the donor-type lymphomas from individual irradiated mice, suggesting that these mutations were late-occurring events in the process of oncogenesis. On the other hand, there were common chromosome aberrations or translocations such as trisomy 15, t(7F; 10C), t(1A; 13D) or t(6A; XB) among the donor-type lymphomas derived from half of the individual irradiated mice. This indicated that the aberrations/translocations, which occurred in single progenitor cells at early T cell differentiation either just before or after γ T cell receptor rearrangements, might be important candidates for initiating events. In the donor-type lymphomas from the other half of the individual irradiated mice, microgenetic changes were suggested to be initial events and also might take place in single progenitor cells just before or right after γ TCR rearrangements. (author)

  14. Analysis of thermal fatigue events in light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Okuda, Yasunori [Institute of Nuclear Safety System Inc., Seika, Kyoto (Japan)

    2000-09-01

    Thermal fatigue events, which may cause the shutdown of nuclear power stations through through-wall cracks in pipes of the reactor coolant pressure boundary (RCPB), are reported by licensees in foreign countries as well as in Japan. In this paper, thermal fatigue events reported in anomaly reports of light water reactors inside and outside of Japan are investigated. As a result, it is clarified that thermal fatigue events can be classified into seven patterns by their characteristics, and that the trend of occurrence of the events in PWRs (pressurized water reactors) has a stronger correlation with operating hours than that in BWRs (boiling water reactors). It is also concluded that precise identification of the locations where thermal fatigue occurs, and their monitoring, are important to prevent thermal fatigue events caused by aging or improper modification. (author)

  15. Internal event analysis of Laguna Verde Unit 1 Nuclear Power Plant. System Analysis

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the 'Internal Event Analysis of Laguna Verde Unit 1 Nuclear Power Plant', CNSNS-TR-004, in five volumes. The reports are organized as follows: CNSNS-TR-004 Volume 1: Introduction and Methodology. CNSNS-TR-004 Volume 2: Initiating Events and Accident Sequences. CNSNS-TR-004 Volume 3: System Analysis. CNSNS-TR-004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR-004 Volume 5: Appendices A, B and C. This volume presents the results of the system analysis for the Laguna Verde Unit 1 Nuclear Power Plant. The system analysis involved the development of logical models for all the systems included in the accident sequence event tree headings, and for all the support systems required to operate the front-line systems. For the internal event analysis for Laguna Verde, 16 front-line systems and 5 support systems were included. Detailed fault trees were developed for most of the important systems. Simplified fault trees focusing on major faults were constructed for those systems that can be adequately represented using this kind of modeling. For those systems where fault tree models were not constructed, actual data were used to represent the dominant failures of the systems. The main failures included in the fault trees are hardware failures, test and maintenance unavailabilities, common cause failures, and human errors. The SETS and TEMAC codes were used to perform the qualitative and quantitative fault tree analyses. (Author)
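The quantitative step described above (fault trees evaluated with codes such as SETS and TEMAC) ultimately reduces to combining minimal-cut-set probabilities. A minimal sketch under the rare-event approximation; the cut sets and basic-event probabilities below are invented for illustration and are not taken from the Laguna Verde study.

```python
import math

def cut_set_probability(cut_set, p):
    # A minimal cut set fails the top event only when every
    # basic event in it occurs; events are assumed independent.
    return math.prod(p[e] for e in cut_set)

def top_event_probability(cut_sets, p):
    # Rare-event approximation: sum of minimal-cut-set probabilities
    # (an upper bound that is accurate when all probabilities are small).
    return sum(cut_set_probability(cs, p) for cs in cut_sets)

# Hypothetical basic events for a two-train injection system:
# hardware failures, a common-cause failure, and a shared valve.
p = {"PUMP_A": 3e-3, "PUMP_B": 3e-3, "CCF_PUMPS": 1e-4, "VALVE": 1e-3}
cut_sets = [("PUMP_A", "PUMP_B"), ("CCF_PUMPS",), ("VALVE",)]
p_top = top_event_probability(cut_sets, p)
print(p_top)  # 9e-6 + 1e-4 + 1e-3 = 1.109e-3
```

Note how the single-event cut sets (common-cause failure, shared valve) dominate the doubly redundant hardware pair, which is the usual motivation for modeling common cause failures explicitly.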

  16. Yucca Mountain Feature, Event, and Process (FEP) Analysis

    International Nuclear Information System (INIS)

    Freeze, G.

    2005-01-01

    A Total System Performance Assessment (TSPA) model was developed for the U.S. Department of Energy (DOE) Yucca Mountain Project (YMP) to help demonstrate compliance with applicable postclosure regulatory standards and support the License Application (LA). Two important precursors to the development of the TSPA model were (1) the identification and screening of features, events, and processes (FEPs) that might affect the Yucca Mountain disposal system (i.e., FEP analysis), and (2) the formation of scenarios from screened-in (included) FEPs to be evaluated in the TSPA model (i.e., scenario development). YMP FEP analysis and scenario development followed a five-step process: (1) Identify a comprehensive list of FEPs potentially relevant to the long-term performance of the disposal system. (2) Screen the FEPs using specified criteria to identify those FEPs that should be included in the TSPA analysis and those that can be excluded from the analysis. (3) Form scenarios from the screened-in (included) FEPs. (4) Screen the scenarios using the same criteria applied to the FEPs to identify any scenarios that can be excluded from the TSPA, as appropriate. (5) Specify the implementation of the scenarios in the computational modeling for the TSPA, and document the treatment of included FEPs. This paper describes the FEP analysis approach (Steps 1 and 2) for YMP, with a brief discussion of scenario formation (Step 3). Details of YMP scenario development (Steps 3 and 4) and TSPA modeling (Step 5) are beyond the scope of this paper. The identification and screening of the YMP FEPs was an iterative process based on site-specific information, design, and regulations. The process was iterative in the sense that there were multiple evaluation and feedback steps (e.g., separate preliminary, interim, and final analyses). The initial YMP FEP list was compiled from an existing international list of FEPs from other radioactive waste disposal programs and was augmented by YMP site- and design

  17. Event-shape analysis: Sequential versus simultaneous multifragment emission

    International Nuclear Information System (INIS)

    Cebra, D.A.; Howden, S.; Karn, J.; Nadasen, A.; Ogilvie, C.A.; Vander Molen, A.; Westfall, G.D.; Wilson, W.K.; Winfield, J.S.; Norbeck, E.

    1990-01-01

    The Michigan State University 4π array has been used to select central-impact-parameter events from the reaction ⁴⁰Ar + ⁵¹V at incident energies from 35 to 85 MeV/nucleon. The event shape in momentum space is an observable which is shown to be sensitive to the dynamics of the fragmentation process. A comparison of the experimental event-shape distribution to sequential- and simultaneous-decay predictions suggests that a transition in the breakup process may have occurred. At 35 MeV/nucleon, a sequential-decay simulation reproduces the data. For the higher energies, the experimental distributions fall between the two contrasting predictions.

  18. Event tree analysis for the system of hybrid reactor

    International Nuclear Information System (INIS)

    Yang Yongwei; Qiu Lijian

    1993-01-01

    The application of probabilistic risk assessment to a fusion-fission hybrid reactor is introduced. A hybrid reactor system has been analysed using event trees. According to the characteristics of the conceptual design of the Hefei Fusion-fission Experimental Hybrid Breeding Reactor, the probabilities of the event tree sequences induced by 4 typical initiating events were calculated. The results showed that the conceptual design is safe and reasonable. Through this paper, the safety characteristics of the hybrid reactor system have been understood more deeply. Some suggestions valuable to the safety design of hybrid reactors have been proposed.
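The sequence probabilities mentioned above can be illustrated with a toy event tree quantification: walk every success/failure pattern of the headings and multiply the branch probabilities. The initiating-event frequency and heading failure probabilities below are hypothetical placeholders, not values from the Hefei design.

```python
from itertools import product
from math import prod

def sequence_frequencies(init_freq, headings):
    """Enumerate all branches of a simple event tree.
    headings: {name: failure probability q}; success has probability 1 - q,
    and branch outcomes are assumed independent."""
    seqs = {}
    for pattern in product((False, True), repeat=len(headings)):  # True = failed
        p = prod(q if failed else 1.0 - q
                 for failed, q in zip(pattern, headings.values()))
        label = "/".join(("-" if failed else "+") + name
                         for failed, name in zip(pattern, headings))
        seqs[label] = init_freq * p
    return seqs

# Hypothetical initiating event (1e-2 per year) and two safety-system
# headings with assumed failure probabilities.
seqs = sequence_frequencies(1e-2, {"SCRAM": 1e-5, "COOLING": 1e-3})
print(seqs["-SCRAM/-COOLING"])  # worst sequence: 1e-2 * 1e-5 * 1e-3 = 1e-10
```

A quick sanity check is that the sequence frequencies must sum back to the initiating-event frequency.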

  19. Time to Tenure in Spanish Universities: An Event History Analysis

    Science.gov (United States)

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199

  1. Time to tenure in Spanish universities: an event history analysis.

    Directory of Open Access Journals (Sweden)

    Luis Sanz-Menéndez

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility.

  2. Nonstochastic Analysis of Manufacturing Systems Using Timed-Event Graphs

    DEFF Research Database (Denmark)

    Hulgaard, Henrik; Amon, Tod

    1996-01-01

    Using automated methods to analyze the temporal behavior of manufacturing systems has proven to be essential and quite beneficial. Popular methodologies include queueing networks, Markov chains, simulation techniques, and discrete event systems (such as Petri nets). These methodologies are primarily...

  3. Analysis of the Steam Generator Tubes Rupture Initiating Event

    International Nuclear Information System (INIS)

    Trillo, A.; Minguez, E.; Munoz, R.; Melendez, E.; Sanchez-Perea, M.; Izquierd, J.M.

    1998-01-01

    In PSA studies, event tree/fault tree techniques are used to analyse the consequences associated with the evolution of an initiating event. The event tree is built in the sequence identification stage, following the expected behaviour of the plant in a qualitative way. Computer simulation of the sequences is performed mainly to determine the time allowed for operator actions, and does not play a central role in event tree validation. The simulation of the sequence evolution can instead be performed using standard tools, helping the analyst obtain a more realistic event tree. Long-existing methods and tools can be used to automate the construction of the event tree associated with a given initiator. These methods automatically construct the event tree by simulating the plant behaviour following the initiator, allowing some of the systems to fail during the sequence evolution. Then the sequences with and without the failure are followed. The outcome of all this is a dynamic event tree. The work described here is the application of one such method to the particular case of the SGTR initiating event. The DYLAM scheduler, designed at the Ispra (Italy) JRC of the European Communities, is used to automatically drive the simulation of all the sequences constituting the event tree. As in the static event tree, each time a system is demanded, two branches are opened: one corresponding to the success and the other to the failure of the system. Both branches are followed by the plant simulator until a new system is demanded, and the process repeats. The plant simulation modelling allows the treatment of degraded sequences that enter the severe accident domain as well as of success sequences in which long-term cooling is started. (Author)
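The branching scheme described here (each time a system is demanded, open a success branch and a failure branch and keep simulating both) can be sketched as a recursive walk. The `next_demand` plant logic below stands in for the real simulator, and its systems and probabilities are invented; DYLAM itself also schedules branching times and continuous plant dynamics, which this sketch omits.

```python
def next_demand(state):
    """Toy plant logic standing in for the simulator: which system is
    demanded next depends on what has already happened (all values assumed)."""
    if "SGTR" not in state:
        return "SGTR", None            # initiating event, occurs with certainty
    if "HPI" not in state:
        return "HPI", 1e-3             # high-pressure injection demanded first
    if state.get("HPI") == "failed" and "FEED_BLEED" not in state:
        return "FEED_BLEED", 1e-2      # demanded only if HPI has failed
    return None                        # sequence complete

def explore(state=None, prob=1.0, out=None):
    """Depth-first dynamic event tree: branch on every demanded system."""
    state, out = state or {}, out if out is not None else []
    demand = next_demand(state)
    if demand is None:
        out.append((dict(state), prob))
        return out
    system, q = demand
    if q is None:                      # deterministic step (the initiator)
        explore({**state, system: "occurred"}, prob, out)
    else:
        explore({**state, system: "ok"}, prob * (1 - q), out)
        explore({**state, system: "failed"}, prob * q, out)
    return out

sequences = explore()
print(len(sequences))  # 3 end states: HPI ok; HPI failed, F&B ok; both failed
```

Because branching is driven by the simulated state, a branch point only appears when the toy "plant" actually demands a system, which is what distinguishes a dynamic event tree from the static one drawn by hand.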

  4. Detecting failure events in buildings: a numerical and experimental analysis

    OpenAIRE

    Heckman, V. M.; Kohler, M. D.; Heaton, T. H.

    2010-01-01

    A numerical method is used to investigate an approach for detecting the brittle fracture of welds associated with beam-column connections in instrumented buildings in real time through the use of time-reversed Green's functions and wave propagation reciprocity. The approach makes use of a prerecorded catalog of Green's functions for an instrumented building to detect failure events in the building during a later seismic event by screening continuous data for the presence of wavef...

  5. The analysis of a complex fire event using multispaceborne observations

    Directory of Open Access Journals (Sweden)

    Andrei Simona

    2018-01-01

    This study documents a complex fire event that occurred in October 2016 in a conflict zone in the Middle East. Two fire outbreaks were detected by different monitoring instruments on board the TERRA, CALIPSO and AURA Earth observation missions. The link with local weather conditions was examined using ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  6. The analysis of a complex fire event using multispaceborne observations

    Science.gov (United States)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a conflict zone in the Middle East. Two fire outbreaks were detected by different monitoring instruments on board the TERRA, CALIPSO and AURA Earth observation missions. The link with local weather conditions was examined using ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  7. Trend analysis of cables failure events at nuclear power plants

    International Nuclear Information System (INIS)

    Fushimi, Yasuyuki

    2007-01-01

    In this study, 152 failure events related to cables at overseas nuclear power plants were selected from the Nuclear Information Database owned by the Institute of Nuclear Safety System, and these events were analyzed in view of occurrence, causal factor, and so on. In addition, 15 failure events related to cables at domestic nuclear power plants were selected from the Nuclear Information Archives owned by JANTI, and these events were analyzed in the same manner. Comparing both trends revealed the following: 1) The cable insulator failure rate is lower at domestic nuclear power plants than at foreign ones; it is thought that deterioration diagnosis is performed broadly in Japan. 2) Buried cable failure events have occupied a significant portion of cable failure events during work activity at overseas plants, whereas none have occurred at domestic plants; it is thought that a sufficient survey is conducted before excavation activity in Japan. 3) The domestic rate of age-related cable failures in service is lower than the overseas one, while the domestic rate of improper maintenance is higher; improvement of maintenance workers' skill is expected in order to reduce improper maintenance. (author)

  8. Preliminary Analysis of the Common Cause Failure Events for Domestic Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Daeil; Han, Sanghoon

    2007-01-01

    It is known that common cause failure (CCF) events have a great effect on the safety and probabilistic safety assessment (PSA) results of nuclear power plants (NPPs). However, domestic studies have mainly focused on the analysis method and modeling of CCF events. Thus, an analysis of the CCF events at domestic NPPs was performed to establish a domestic database of CCF events and to deliver them to the operating office of the International Common Cause Failure Data Exchange (ICDE) project. This paper presents the analysis results of the CCF events for domestic nuclear power plants.

  9. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    Initiating event analysis and evaluation is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key part of nuclear safety analysis. Currently, initiating event analysis methods and experience are focused on water reactors, and no methods or theories exist for the thorium-based molten salt reactor (TMSR). With TMSR research and development underway in China, initiating event analysis and evaluation is increasingly important. The research can be developed from PWR analysis theories and methods. Based on the TMSR design, the theories and methods of its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of second- and third-generation PWRs, the high-temperature gas-cooled reactor and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are discussed and developed through logical analysis. The analysis of TMSR initiating events is preliminarily studied and described. This research is important for clarifying the event analysis rules, and useful for TMSR design and nuclear safety analysis. (authors)

  10. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Science.gov (United States)

    Fedosimova, Anastasiya; Gaitinov, Adigam; Grushevskaya, Ekaterina; Lebedev, Igor

    2017-06-01

    In this work, a study of the peculiarities of multiparticle production in interactions of asymmetric nuclei is performed, to search for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of an analysis of individual interactions of ¹⁹⁷Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC) and of mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method according to the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: events of LC type are processes of full destruction of the projectile nucleus, in which multicharged fragments are absent, whereas in events of mixed type several multicharged fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions: the mean multiplicity of LC-type events is significantly higher than in mixed-type events. Research on the dependence of multiplicity on the number of target-nucleus fragments for events of various types reveals that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e. nuclei strongly asymmetric in volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multicharged fragments.
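The Hurst method used for the event selection estimates a scaling exponent from the rescaled-range (R/S) statistic. Below is a rough standard-library sketch of R/S analysis, not the exact procedure of the paper; the doubling window sizes and the white-noise check are illustrative choices.

```python
import math
import random
from statistics import mean, pstdev

def rescaled_range(chunk):
    """R/S of one window: range of cumulative deviations from the mean,
    divided by the window's standard deviation."""
    m = mean(chunk)
    cum, dev_min, dev_max = 0.0, 0.0, 0.0
    for x in chunk:
        cum += x - m
        dev_min, dev_max = min(dev_min, cum), max(dev_max, cum)
    s = pstdev(chunk)
    return (dev_max - dev_min) / s if s > 0 else None

def hurst_rs(series, min_chunk=8):
    """Hurst exponent: slope of log(mean R/S) vs log(window size)."""
    n = len(series)
    log_sizes, log_rs = [], []
    size = min_chunk
    while size <= n // 2:
        vals = [rescaled_range(series[i:i + size])
                for i in range(0, n - size + 1, size)]
        vals = [v for v in vals if v is not None]
        if vals:
            log_sizes.append(math.log(size))
            log_rs.append(math.log(mean(vals)))
        size *= 2
    mx, my = mean(log_sizes), mean(log_rs)  # least-squares slope
    return (sum((x - mx) * (y - my) for x, y in zip(log_sizes, log_rs))
            / sum((x - mx) ** 2 for x in log_sizes))

random.seed(1)
h = hurst_rs([random.gauss(0, 1) for _ in range(2048)])
print(h)  # uncorrelated noise should give H near 0.5
```

The selection logic in the paper then classifies events by how the Hurst curve behaves: persistent scaling (H well above 0.5) signals long-range correlations, while values near 0.5 indicate their absence.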

  11. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Directory of Open Access Journals (Sweden)

    Fedosimova Anastasiya

    2017-01-01

    In this work, a study of the peculiarities of multiparticle production in interactions of asymmetric nuclei is performed, to search for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of an analysis of individual interactions of ¹⁹⁷Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC) and of mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method according to the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: events of LC type are processes of full destruction of the projectile nucleus, in which multicharged fragments are absent, whereas in events of mixed type several multicharged fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions: the mean multiplicity of LC-type events is significantly higher than in mixed-type events. Research on the dependence of multiplicity on the number of target-nucleus fragments for events of various types reveals that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e. nuclei strongly asymmetric in volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multicharged fragments.

  12. Analysis of water hammer events in nuclear power plants

    International Nuclear Information System (INIS)

    Sato, Masahiro; Yanagi, Chihiro

    1999-01-01

    The water hammer issue in nuclear power plants was one of the unresolved safety issues listed by the United States Nuclear Regulatory Commission and was regarded as resolved. Later on, however, water hammer events have still been experienced intermittently, though their number is decreasing. We collected water hammer events of PWRs in Japan and the United States and relevant documents, analyzed them, and studied corrective actions taken by Japanese plants. As a result, it is confirmed that preventive measures in design, operation, etc. have already been taken and that mitigation mechanisms against water hammer have also been considered. However, it is clarified that attention should continuously be paid to the operation of valves and/or pumps, as the prevention of water hammer still relies on operation. (author)

  13. Initiating events in the safety probabilistic analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Stasiulevicius, R.

    1989-01-01

    The importance of the initiating event in the probabilistic safety analysis of nuclear power plants is discussed, and the basic procedures necessary for preparing reports and for the quantification and grouping of the events are described. Examples of initiating events with their mean frequencies of occurrence, including those calculated for the Oconee reactor and the Angra-1 reactor, are presented. (E.G.)

  14. Event Sequence Analysis of the Air Intelligence Agency Information Operations Center Flight Operations

    National Research Council Canada - National Science Library

    Larsen, Glen

    1998-01-01

    This report applies Event Sequence Analysis, a methodology adapted from aircraft mishap investigation, to an investigation of the performance of the Air Intelligence Agency's Information Operations Center (IOC...

  15. Identification and analysis of external event combinations for Hanhikivi 1PRA

    Energy Technology Data Exchange (ETDEWEB)

    Helander, Juho [Fennovoima Oy, Helsinki (Finland)

    2017-03-15

    Fennovoima's nuclear power plant, Hanhikivi 1, Pyhäjoki, Finland, is currently in the design phase; its construction is scheduled to begin in 2018 and electricity production in 2024. The objective of this paper is to produce a preliminary list of safety-significant external event combinations, including preliminary probability estimates, to be used in the probabilistic risk assessment of the Hanhikivi 1 plant. Starting from the list of relevant single events, the relevant event combinations are identified based on seasonal variation, preconditions related to different events, and dependencies (fundamental and cascade-type) between events. This method yields 30 relevant combinations of two events for the Hanhikivi site. The preliminary probability of each combination is evaluated, and combinations with extremely low probability are excluded from further analysis. Combinations of three or more events are identified by adding possible events to the remaining combinations of two events. Finally, 10 relevant combinations of two events and three relevant combinations of three events remain. The results should be considered preliminary and will be updated after the effects of the different events on plant safety have been evaluated in more detail.
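The screening described above (form candidate pairs, filter by seasonal overlap and dependence type, drop combinations below a probability threshold) can be sketched as follows. All event frequencies, season sets, the cascade rule and the threshold are invented placeholders, not Hanhikivi 1 data.

```python
from itertools import combinations

def screen_combinations(events, seasons, dependent, threshold=1e-7):
    """Keep pairs of external events whose preliminary joint frequency
    exceeds a screening threshold. events: {name: annual frequency};
    seasons: {name: set of seasons}; dependent: set of cascade-type pairs."""
    kept = []
    for a, b in combinations(events, 2):
        if not (seasons[a] & seasons[b]):
            continue                          # cannot occur in the same season
        if (a, b) in dependent or (b, a) in dependent:
            joint = min(events[a], events[b])  # cascade: conditional prob. ~ 1
        else:
            joint = events[a] * events[b]      # independence assumption
        if joint >= threshold:
            kept.append(((a, b), joint))
    return kept

# Hypothetical coastal-site events with assumed frequencies and seasons.
events = {"high_wind": 1e-2, "sea_ice": 5e-2,
          "frazil_ice": 1e-2, "lightning": 2e-2}
seasons = {"high_wind": {"summer", "winter"}, "sea_ice": {"winter"},
           "frazil_ice": {"winter"}, "lightning": {"summer"}}
dependent = {("sea_ice", "frazil_ice")}      # cascade-type dependence
result = screen_combinations(events, seasons, dependent)
print(result)
```

The cascade rule illustrates the paper's point that dependent combinations can be orders of magnitude more frequent than the independence assumption would suggest.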

  16. Adverse events with use of antiepileptic drugs: a prescription and event symmetry analysis

    DEFF Research Database (Denmark)

    Tsiropoulos, Ioannis; Andersen, Morten; Hallas, Jesper

    2009-01-01

    Database (OPED) for the period of 1 August 1990-31 December 2006, and diagnoses from the County Hospital register for the period of 1994-2006 to perform sequence symmetry analysis. The method assesses the distribution of disease entities and prescription of other drugs (ODs), before and after initiation...
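The sequence symmetry idea referred to in this record can be reduced to a crude sequence ratio: among people exposed to both the drug and the marker outcome, compare how many received the drug first versus the marker first. A toy sketch with invented patient data; real analyses, including this one, also correct the crude ratio with a null-effect ratio that accounts for prescribing trends over calendar time, which is omitted here.

```python
def crude_sequence_ratio(pairs):
    """Crude prescription sequence symmetry: pairs is a list of
    (drug_date, marker_date) tuples, dates given as day numbers.
    A ratio well above 1 suggests the marker follows drug initiation."""
    drug_first = sum(1 for d, m in pairs if d < m)
    marker_first = sum(1 for d, m in pairs if m < d)
    return drug_first / marker_first

# hypothetical patients: day of first antiepileptic vs day of first event code
pairs = [(10, 40), (5, 90), (60, 20), (15, 30), (80, 70), (1, 50)]
ratio = crude_sequence_ratio(pairs)
print(ratio)  # 4 drug-first / 2 marker-first = 2.0
```

The appeal of the design is that each person serves as their own control, so stable confounders cancel out of the ratio.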

  17. Analysis of Paks NPP Personnel Activity during Safety Related Event Sequences

    International Nuclear Information System (INIS)

    Bareith, A.; Hollo, Elod; Karsa, Z.; Nagy, S.

    1998-01-01

    Within the AGNES Project (Advanced Generic and New Evaluation of Safety), the Level-1 PSA model of Paks NPP Unit 3 was developed in the form of a detailed event tree/fault tree structure (53 initiating events, 580 event sequences and 6300 basic events are involved). This model gives a good basis for quantitative evaluation of the potential consequences of safety-related events that have actually occurred, i.e. for precursor event studies. To make these studies possible and efficient, the current qualitative event analysis practice should be reviewed, and a new additional quantitative analysis procedure and system should be developed and applied. The present paper gives an overview of the method outlined for both qualitative and quantitative analyses of the operator crew activity during off-normal situations. First, the operator performance experienced during past operational events is discussed. Sources of raw information, the qualitative evaluation process, the follow-up actions, as well as the documentation requirements are described. Second, the general concept of the proposed precursor event analysis is described. Types of modeled interactions and the considered performance influences are presented. The quantification of the potential consequences of the identified precursor events is based on the task-oriented Level-1 PSA model of the plant unit. A precursor analysis system covering the evaluation of operator activities is now under development. Preliminary results gained during a case-study evaluation of a past operational event are presented. (authors)

  18. Events in time: Basic analysis of Poisson data

    International Nuclear Information System (INIS)

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.
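The rate estimate and interval described above can be sketched in a few lines. This is a generic illustration under assumptions (a large-sample normal-approximation interval only, with invented event counts), not the report's SAS programs or its exact or Bayesian intervals:

```python
import math

def poisson_rate_estimate(n_events, exposure_time, z=1.96):
    """Point estimate and approximate 95% CI for a Poisson occurrence rate.

    Uses the large-sample normal approximation; exact (chi-square) or
    Bayesian intervals, as discussed in the report, differ for small counts.
    """
    rate = n_events / exposure_time
    half_width = z * math.sqrt(n_events) / exposure_time
    return rate, max(0.0, rate - half_width), rate + half_width

# 12 events observed over 4 unit-years of exposure (invented numbers)
rate, lo, hi = poisson_rate_estimate(12, 4.0)  # 3.0 events/yr, CI roughly (1.30, 4.70)
```

Comparing subsets of the data, as the report describes, then amounts to checking whether such intervals for the subsets overlap (or, more formally, applying a two-sample rate test).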

  19. Applications of heavy ion microprobe for single event effects analysis

    International Nuclear Information System (INIS)

    Reed, Robert A.; Vizkelethy, Gyorgy; Pellish, Jonathan A.; Sierawski, Brian; Warren, Kevin M.; Porter, Mark; Wilkinson, Jeff; Marshall, Paul W.; Niu, Guofu; Cressler, John D.; Schrimpf, Ronald D.; Tipton, Alan; Weller, Robert A.

    2007-01-01

    The motion of ionizing-radiation-induced rogue charge carriers in a semiconductor can create unwanted voltage and current conditions within a microelectronic circuit. If sufficient unwanted charge or current occurs on a sensitive node, a variety of single event effects (SEEs) can occur with consequences ranging from trivial to catastrophic. This paper describes the application of heavy ion microprobes to assist with calibration and validation of SEE modeling approaches

  20. Events in time: Basic analysis of Poisson data

    Energy Technology Data Exchange (ETDEWEB)

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.

  1. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
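The frequency metrics whose distributions are fitted above (prefault frequency, nadir, settling frequency, and their differences) can be extracted from a measured trace roughly as follows. The naming follows the abstract, but the trailing-window settling rule and the synthetic trace are illustrative assumptions, not NREL's procedure:

```python
def frequency_event_metrics(trace, fault_index, settle_window=10):
    """Compute grid-frequency event metrics from a sampled trace (Hz).

    f_A: prefault frequency (last sample before the fault),
    f_C: frequency nadir after the fault,
    f_B: settling frequency, taken here as the mean of the final window.
    The windowing rule is an illustrative assumption.
    """
    f_a = trace[fault_index - 1]
    post = trace[fault_index:]
    f_c = min(post)
    f_b = sum(post[-settle_window:]) / settle_window
    return {"f_A": f_a, "f_B": f_b, "f_C": f_c,
            "f_C-A": f_c - f_a, "f_B-C": f_b - f_c}

# synthetic 60 Hz trace: dip to a 59.7 Hz nadir, settling near 59.9 Hz
trace = [60.0] * 5 + [59.9, 59.8, 59.7, 59.75, 59.8] + [59.9] * 10
metrics = frequency_event_metrics(trace, fault_index=5)
```

Collecting these metrics over many historical events yields the samples to which the Cauchy and normal distributions were fitted.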

  2. Common-Cause Failure Analysis in Event Assessment

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Kelly, D.L.

    2008-01-01

    This paper reviews the basic concepts of modeling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group

  3. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    Science.gov (United States)

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

    This work analyzes the thermal and mechanical effects arising in a power Diffusion Metal Oxide Semiconductor (DMOS) device during a Single Event Burnout (SEB) process. For studying these effects we propose a more detailed simulation structure than those previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat generation region, with 5 W, 10 W, 50 W and 100 W, to emulate the thermal phenomena occurring during SEB processes while avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  4. Fault trees based on past accidents. Factorial analysis of events

    International Nuclear Information System (INIS)

    Vaillant, M.

    1977-01-01

    The fault tree method is already useful in the qualitative step that precedes any reliability calculation. The construction of the tree becomes even simpler when we only want to describe how the events happened. Unlike scenarios, which introduce several possibilities by means of the conjunction OR, here there is only the conjunction AND, which therefore need not be written at all. This method is presented by INRS (1) for the study of industrial injuries; it may also be applied to material damage. (orig.) [de

  5. Analysis of loss of offsite power events reported in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Volkanovski, Andrija, E-mail: Andrija.VOLKANOVSKI@ec.europa.eu [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Ballesteros Avila, Antonio; Peinador Veira, Miguel [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Kančev, Duško [Kernkraftwerk Goesgen-Daeniken AG, CH-4658 Daeniken (Switzerland); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) gGmbH, Schwertnergasse 1, 50667 Köln (Germany); Stephan, Jean-Luc [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 – 92262 Fontenay-aux-Roses Cedex (France)

    2016-10-15

    Highlights: • Loss of offsite power events were identified in four databases. • Engineering analysis of relevant events was done. • The dominant root cause for LOOP is human failure. • Improved maintenance procedures can decrease the number of LOOP events. - Abstract: This paper presents the results of an analysis of loss of offsite power (LOOP) events in four databases of operational events. The screened databases include: the Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, the IAEA International Reporting System for Operating Experience (IRS) and the U.S. Licensee Event Reports (LER). In total, 228 relevant loss of offsite power events were identified in the IRSN database, 190 in the GRS database, 120 in the U.S. LER and 52 in the IRS database. The identified events were classified into predefined categories. The results show that the largest percentage of LOOP events occurred during the on-power operational mode and lasted for two minutes or more. Plant-centered events are the main contributor to the LOOP events identified in the IRSN, GRS and IAEA IRS databases, while switchyard-centered events are the main contributor among the events registered in the NRC LER database. The main type of failed equipment is switchyard equipment in the IRSN and IAEA IRS databases, main or secondary lines in the NRC LER database, and busbars in the GRS database. The dominant root causes of the LOOP events are human failures during test, inspection and maintenance, followed by human failures due to insufficient or wrong procedures. The largest number of LOOP events resulted in a reactor trip followed by an EDG start. Actions that can reduce the number of LOOP events and minimize the consequences for plant safety are identified and presented.

  6. Analysis of internal events for the Unit 1 of the Laguna Verde nuclear power station

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    This volume presents the results of the initiating event analysis and the event tree analysis for Unit 1 of the Laguna Verde nuclear power station. The initiating event analysis covers the identification of all internal events that disturb the normal operation of the plant and require mitigation; so-called external events remain beyond the scope of this study. For the Laguna Verde plant, eight transient categories were identified, along with three categories of loss of coolant accidents (LOCA) inside the containment, a LOCA outside the primary containment, and vessel rupture. The event tree analysis involves the development of the possible accident sequences for each category of initiating events. System event trees were constructed for the different types of LOCA and for all the transients. An event tree was constructed for the total loss of alternating current, which represents an extension of the event tree for the loss of offsite power transient. The system event tree for anticipated transients without scram (ATWS) was also developed. The event trees for the accident sequences include the evaluation of sequences with a vulnerable core, that is, sequences in which adequate core cooling is available but the residual heat removal systems have failed. To model this adequately, headings were added to the event trees to develop the sequences up to the point where the core state is resolved. This process includes: determining the failure pressure of the primary containment, evaluating the environment generated in the reactor building as a result of containment failure or leakage, determining the location of components in the reactor building, and constructing Boolean expressions to estimate the failure of components exposed to a severe environment. (Author)

  7. Trend analysis of fire events at nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Hiroki

    2007-01-01

    We performed trend analyses comparing fire events occurring overseas (1995-2005) and in Japan (1966-2006). We undertook this after extracting data on incidents (storms, heavy rain, tsunamis, fires, etc.) at overseas nuclear power plants from the Events Occurred at Overseas Nuclear Power Plants records in the Nuclear Information Database at the Institute of Nuclear Safety System (INSS) and finding that fires were the most common of these incidents. The analyses compared the number of fires occurring domestically and overseas and examined their causes and their effects on the power plants. We found that electrical fires, caused by such things as current overheating and electric arcing, account for over half of the domestic and overseas fire incidents, which indicates that maintenance management of electrical facilities is the most important aspect of fire prevention. Also, roughly the same number of operational fires occurred at domestic and overseas plants, judging from the figures for annual occurrences per unit. However, the overall number of fires per unit at domestic facilities is one fourth that of overseas facilities. We surmise that, while the management of operations involving fire is comparable at overseas and domestic plants, this disparity results from differences in how facility maintenance is carried out. (author)

  8. Multivariate Volatility Impulse Response Analysis of GFC News Events

    NARCIS (Netherlands)

    D.E. Allen (David); M.J. McAleer (Michael); R.J. Powell (Robert); A.K. Singh (Abhay)

    2015-01-01

    textabstractThis paper applies the Hafner and Herwartz (2006) (hereafter HH) approach to the analysis of multivariate GARCH models using volatility impulse response analysis. The data set features ten years of daily returns series for the New York Stock Exchange Index and the FTSE 100 index from the

  9. Multivariate Volatility Impulse Response Analysis of GFC News Events

    NARCIS (Netherlands)

    D.E. Allen (David); M.J. McAleer (Michael); R.J. Powell (Robert)

    2015-01-01

    markdownabstract__Abstract__ This paper applies the Hafner and Herwartz (2006) (hereafter HH) approach to the analysis of multivariate GARCH models using volatility impulse response analysis. The data set features ten years of daily returns series for the New York Stock Exchange Index and the

  10. Marginal regression analysis of recurrent events with coarsened censoring times.

    Science.gov (United States)

    Hu, X Joan; Rosychuk, Rhonda J

    2016-12-01

    Motivated by an ongoing pediatric mental health care (PMHC) study, this article presents weakly structured methods for analyzing doubly censored recurrent event data where only coarsened information on censoring is available. The study extracted administrative records of emergency department visits from provincial health administrative databases. The available information of each individual subject is limited to a subject-specific time window determined up to concealed data. To evaluate time-dependent effect of exposures, we adapt the local linear estimation with right censored survival times under the Cox regression model with time-varying coefficients (cf. Cai and Sun, Scandinavian Journal of Statistics 2003, 30, 93-111). We establish the pointwise consistency and asymptotic normality of the regression parameter estimator, and examine its performance by simulation. The PMHC study illustrates the proposed approach throughout the article. © 2016, The International Biometric Society.

  11. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    Science.gov (United States)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
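A single level of the Haar transform shows how wavelet coefficients yield the compact trend representation the abstract describes. This is a stdlib sketch on invented data, not the authors' MySQL schema or query framework; a real system would use a deeper, normalized transform (e.g. via PyWavelets):

```python
def haar_step(signal):
    """One level of the Haar wavelet transform (orthonormal scaling omitted).

    Returns (approximation, detail) coefficients; storing these instead of
    the raw trend halves the length per level while keeping both the
    coarse dynamics (approximation) and local changes (detail).
    """
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

# an abrupt drop in a hemodynamic trend shows up as a jump in the
# approximation coefficients at the coarser scale
trend = [90, 90, 88, 90, 60, 58, 59, 60]
approx, detail = haar_step(trend)
```

Queries over different time scales then operate on coefficients from different decomposition levels, which is what avoids the table joins and repeated scans of the raw series.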

  12. Use of PSA for the analysis of operational events in nuclear power plants

    International Nuclear Information System (INIS)

    Hulsmans, M.

    2006-01-01

    An operational event is a safety-relevant incident that occurred in an industrial installation like a nuclear power plant (NPP). The probabilistic approach to event analysis focuses on the potential consequences of an operational event. Within its scope of application, it provides a quantitative assessment of the risk significance of this event (and similar events): it calculates the risk increase induced by the event. Such analyses may result in a more objective and a more accurate event severity measure than those provided by commonly used qualitative methods. Probabilistic event analysis complements the traditional event analysis approaches that are oriented towards the understanding of the (root) causes of an event. In practice, risk-based precursor analysis consists of the mapping of an operational event on a risk model of the installation, such as a probabilistic safety analysis (PSA) model. Precursor analyses result in an objective risk ranking of safety-significant events, called accident precursors. An unexpectedly high (or low) risk increase value is in itself already an important finding. This assessment also yields a lot of information on the structure of the risk, since the underlying dominant factors can easily be determined. Relevant 'what if' studies on similar events and conditions can be identified and performed (which is generally not considered in conventional event analysis), with the potential to yield even broader findings. The findings of such a structured assessment can be used for other purposes than merely risk ranking. The operational experience feedback process can be improved by helping to identify design measures and operational practices in order to prevent re-occurrence or in order to mitigate future consequences, and even to evaluate their expected effectiveness, contributing to the validation and prioritization of corrective measures. 
Confirmed and re-occurring precursors with correlated characteristics may point out opportunities
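The core of a precursor calculation, mapping observed failures onto a risk model and recomputing the top-event probability, can be illustrated on a toy system. The two-train structure and all probabilities here are invented for illustration and are not taken from any plant PSA:

```python
def core_damage_probability(p_fail, failed=()):
    """Top-event probability for a toy two-redundant-train model.

    Core damage requires both trains to fail. Mapping an operational
    event onto the model means setting the observed failures to
    probability 1 and recomputing. All numbers are purely illustrative.
    """
    p = dict(p_fail)
    for component in failed:
        p[component] = 1.0
    return p["train_A"] * p["train_B"]

base = {"train_A": 1e-3, "train_B": 1e-3}
baseline = core_damage_probability(base)                         # 1e-6
conditional = core_damage_probability(base, failed=["train_A"])  # 1e-3
risk_increase = conditional - baseline  # the event's risk significance
```

Ranking events by this risk increase is what produces the objective precursor ranking described above; real analyses do the same computation on the full fault tree/event tree model.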

  13. Investigation and analysis of hydrogen ignition and explosion events in foreign nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okuda, Yasunori [Institute of Nuclear Safety System, Inc., Mihama, Fukui (Japan)

    2002-09-01

    Reports about hydrogen ignition and explosion events in foreign nuclear power plants from 1980 to 2001 were investigated, and 31 events were identified. Analysis showed that they were categorized in (1) outer leakage ignition events and (2) inner accumulation ignition events. The dominant event for PWR (pressurized water reactor) was outer leakage ignition in the main generator, and in BWR (boiling water reactor) it was inner accumulation ignition in the off-gas system. The outer leakage ignition was a result of work process failure with the ignition source, operator error, or main generator hydrogen leakage. The inner accumulation ignition events were caused by equipment failure or insufficient monitoring. With careful preventive measures, the factors leading to these events could be eliminated. (author)

  14. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (www) has become a prominent and huge source of user-generated content and opinionated data. Among the various social media, Twitter has gained popularity as it offers a fast and effective way of sharing users' perspectives on critical and other issues in different domains, such as the 'Political', 'Entertainment' and 'Business' domains. As this data is generated at huge scale in the cloud, it has opened doors for researchers in the field of data science and analysis. Twitter provides several APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collecting user profiles, friends and followers; 3) the Streaming API, which collects details such as tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically unfolding event. We focus on the 'Entertainment' domain, especially 'Sports', as IPL-T20 is a currently trending, ongoing event. We collect this large volume of tweets and store them in a MongoDB database, where the tweets are stored as JSON documents. On these documents we perform time-series analysis and term frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the temporal data of the event and ranking the players or teams by popularity, helping people understand key influencers on the social media platform.
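The term-frequency step over JSON tweet documents can be sketched with the standard library. The `text` field name, the sample tweets, and the stopword list are assumptions; a real pipeline would add proper tokenization, stemming and time bucketing for the temporal analysis:

```python
import json
from collections import Counter

def term_frequencies(tweet_docs,
                     stopwords=frozenset({"a", "an", "the", "is", "rt", "what"})):
    """Count terms across tweets stored as JSON documents (as in MongoDB).

    Lowercases and whitespace-splits each tweet's 'text' field, dropping
    stopwords, so popularity rankings can be read off the counts.
    """
    counts = Counter()
    for doc in tweet_docs:
        tweet = json.loads(doc)
        for token in tweet["text"].lower().split():
            if token not in stopwords:
                counts[token] += 1
    return counts

docs = ['{"text": "Kohli scores again #IPL"}',
        '{"text": "What a catch #IPL"}']
freq = term_frequencies(docs)
```

Bucketing the same counts by tweet timestamp gives the time series in which spikes mark the "interesting moments" of the event.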

  15. [Analysis on the adverse events of cupping therapy in the application].

    Science.gov (United States)

    Zhou, Xin; Ruan, Jing-wen; Xing, Bing-feng

    2014-10-01

    A detailed analysis was performed on the cases of adverse events and common injuries of cupping therapy encountered in recent years, in terms of manipulation and the patient's constitution. Adverse events of cupping therapy are commonly caused by improper manipulation by medical practitioners, by ignoring contraindications, and by the patient's constitution. Clinical practitioners should use cupping therapy cautiously, strictly follow the rules of standard manipulation and the medical core system, pay attention to contraindications, and take strict precautions against the occurrence of adverse events.

  16. Analysis of hypoglycemic events using negative binomial models.

    Science.gov (United States)

    Luo, Junxiang; Qu, Yongming

    2013-01-01

    Negative binomial regression is a standard model for analyzing hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, to which negative binomial regression is not robust because of its strong model assumptions. Some literature suggests correcting the standard error of the maximum likelihood estimator by introducing overdispersion, which can be estimated by the deviance or the Pearson chi-square. We propose conducting negative binomial regression using sandwich estimation to calculate the covariance matrix of the parameter estimates, together with Pearson overdispersion correction (denoted NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification and that estimation efficiency is improved by adjusting for baseline hypoglycemia. Copyright © 2013 John Wiley & Sons, Ltd.
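The Pearson overdispersion factor mentioned above is simple to compute by hand. This sketch uses invented event counts and a constant fitted mean; a real analysis would take fitted means from the regression and use the proper residual degrees of freedom:

```python
def pearson_dispersion(counts, fitted_means):
    """Pearson chi-square overdispersion estimate for count data.

    Under a correctly specified model the statistic divided by the
    residual degrees of freedom is near 1; values well above 1 signal
    the overdispersion the correction targets. df is simplified to
    n - 1 here (one fitted mean parameter).
    """
    n = len(counts)
    chi2 = sum((y - m) ** 2 / m for y, m in zip(counts, fitted_means))
    return chi2 / (n - 1)

# invented hypoglycemic-event counts; fitted mean = sample mean
counts = [0, 5, 1, 9, 0, 6]
mean = sum(counts) / len(counts)
disp = pearson_dispersion(counts, [mean] * len(counts))  # about 3.97
```

A dispersion near 4 says the variance is roughly four times what a Poisson mean model predicts, so naive standard errors would be badly understated, which is the situation the sandwich correction addresses.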

  17. Analysis and design of VEK for extreme events - a challenge

    International Nuclear Information System (INIS)

    Woelfel, H.P.; Technische Univ. Darmstadt

    2006-01-01

    For the analysis and design of the VEK building, especially for design against earthquake and airplane crash, a 3D integral model was developed that can yield all global response quantities (displacements, accelerations, sectional forces, response spectra, global reinforcement) for any load action from a single mathematical model. For airplane crash in particular, a so-called dynamic design yields reinforcement quantities at every time step and thus leads to a realistic and economical design. The advantages of the integral model were carried over to the design of the processing installation, where the structural analysis of steel structures, vessels and piping was handled in one integral mathematical model. (orig.)

  18. An analysis of fog events at Belgrade International Airport

    Science.gov (United States)

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.
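The complex fog criteria from the final sentence translate directly into a rule. The thresholds follow the abstract (RH above 97%, dew-point depression 0 °C, cloud base below 50 m, calm or weak wind one hour before onset), but the 2 m/s cutoff for "weak" wind is an assumption:

```python
def fog_risk(rel_humidity_pct, dewpoint_depression_c, cloud_base_m, wind_ms):
    """True when conditions match the study's high-probability fog class.

    In that class the study found a 51.2% chance of fog onset; the
    numeric thresholds mirror the abstract, except the weak-wind cutoff,
    which is an illustrative assumption.
    """
    return (rel_humidity_pct > 97.0
            and dewpoint_depression_c == 0.0
            and cloud_base_m < 50.0
            and wind_ms <= 2.0)

likely = fog_risk(98.0, 0.0, 30.0, 0.5)      # matches all criteria
unlikely = fog_risk(90.0, 1.5, 200.0, 5.0)   # dry, high cloud base, windy
```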

  19. Survival analysis using S analysis of time-to-event data

    CERN Document Server

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...
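Alongside the parametric log-linear models the text emphasizes, the basic nonparametric tool for right-censored time-to-event data is the Kaplan-Meier estimator, sketched here in plain Python on invented data (the book itself works in S/R):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates for right-censored data.

    `events[i]` is 1 for an observed event, 0 for censoring. Returns
    (time, S(t)) pairs at each distinct event time: S is multiplied by
    (1 - deaths/at_risk) at every time with at least one event.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, surv, out = n, 1.0, []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = same_time = 0
        while i < n and times[order[i]] == t:
            deaths += events[order[i]]
            same_time += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk
            out.append((t, surv))
        at_risk -= same_time
    return out

# events at t=2, 3, 5; censored observations at t=3 and 7
km = kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 1, 0])
```

Note how the censored subject at t=3 leaves the risk set without forcing a drop in S(t), which is exactly what distinguishes censoring from an event.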

  20. Organization of pulse-height analysis programs for high event rates

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, C E [Argonne National Lab., Ill. (USA)

    1976-09-01

    The ability of a pulse-height analysis program to handle high event rates can be enhanced by organizing it so as to minimize the time spent in interrupt housekeeping. Specifically, the routine that services the data-ready interrupt from the ADC should test whether another event is ready before performing the interrupt return.
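The recommended structure, re-testing for another ready event before taking the interrupt return, can be mimicked in a short sketch. The deque stands in for the ADC's data-ready flag and buffer (an assumption; a real handler polls hardware registers):

```python
from collections import deque

def service_adc(ready_events, histogram):
    """Drain every ready ADC event before 'returning from the interrupt'.

    Looping while another event is ready, instead of returning and
    re-entering the handler, saves one full interrupt entry/exit of
    housekeeping per extra event, which is the abstract's point.
    """
    handled = 0
    while ready_events:          # test again instead of returning immediately
        channel = ready_events.popleft()
        histogram[channel] = histogram.get(channel, 0) + 1
        handled += 1
    return handled

histogram = {}
pending = deque([3, 3, 7])       # three conversions already waiting
handled = service_adc(pending, histogram)
```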

  1. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes...

  2. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models...

  3. Event Reconstruction and Analysis in the R3BRoot Framework

    International Nuclear Information System (INIS)

    Kresan, Dmytro; Al-Turany, Mohammad; Bertini, Denis; Karabowicz, Radoslaw; Manafov, Anar; Rybalchenko, Alexey; Uhlig, Florian

    2014-01-01

    The R3B experiment (Reaction studies with Relativistic Radioactive Beams) will be built within the future FAIR/GSI (Facility for Antiproton and Ion Research) in Darmstadt, Germany. The international R3B collaboration has a scientific program devoted to the physics of stable and radioactive beams at energies between 150 MeV and 1.5 GeV per nucleon. In preparation for the experiment, the R3BRoot software framework is under development; it delivers detector simulation, reconstruction and data analysis. The basic functionalities of the framework are handled by the FairRoot framework, which is also used by the other FAIR experiments (CBM, PANDA, ASYEOS, etc.), while the R3B detector specifics and reconstruction code are implemented inside R3BRoot. In this contribution, first results of data analysis from the detector prototype test in November 2012 are reported; moreover, a comparison of the tracker performance against experimental data is presented

  4. Offline analysis of HEP events by ''dynamic perceptron'' neural network

    International Nuclear Information System (INIS)

    Perrone, A.L.; Basti, G.; Messi, R.; Pasqualucci, E.; Paoluzi, L.

    1997-01-01

    In this paper we start from a critical analysis of the fundamental problems of parallel computation in linear structures and of their extension to the partial solutions obtained with non-linear architectures. We then briefly present a new dynamic architecture able to overcome the limitations of the previous architectures through an automatic redefinition of the topology. This architecture is applied to the real-time recognition of particle tracks in high-energy accelerators. (orig.)

  5. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    OpenAIRE

    Maria Cleofe Giorgino; Enrico Supino; Federico Barnabè

    2017-01-01

    Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality issue referred to the information being disclosed. This article aims to address this gap, exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, Integrated Report (IR). The features of this tool are that it aims to represent the multidimensional imp...

  6. Analysis of spectral data with rare events statistics

    International Nuclear Information System (INIS)

    Ilyushchenko, V.I.; Chernov, N.I.

    1990-01-01

    We consider the case of analyzing experimental data when the results of individual experimental runs cannot be summed due to large systematic errors. A statistical analysis of the hypothesis of persistent peaks in the spectra has been performed by means of the Neyman-Pearson test. The computations demonstrate that the confidence level for the hypothesis of a persistent peak in the spectrum is proportional to the square root of the number of independent experimental runs, K. 5 refs
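The sqrt(K) growth of the confidence level has a familiar counterpart in evidence combination across independent runs, sketched here. This is a generic Stouffer-style meta-analytic illustration, not the Neyman-Pearson construction used in the paper:

```python
import math

def combined_significance(z_scores):
    """Stouffer-style combination of per-run peak significances.

    For K independent runs each giving the same z-score, the combined
    score is z * sqrt(K): evidence for a persistent peak grows with the
    square root of the number of runs even when the runs themselves
    cannot be summed.
    """
    return sum(z_scores) / math.sqrt(len(z_scores))

# nine runs, each with a marginal 1.5-sigma hint of the same peak
z_combined = combined_significance([1.5] * 9)  # 1.5 * sqrt(9) = 4.5
```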

  7. Analysis of 16 plasma vortex events in the geomagnetic tail

    International Nuclear Information System (INIS)

    Birn, J.; Hones, E.W. Jr.; Bame, S.J.; Russel, C.T.

    1985-01-01

    The analysis of 16 plasma vortex occurrences in the magnetotail plasma sheet of Hones et al. (1983) is extended. We used two- and three-dimensional plasma measurements and three-dimensional magnetic field measurements to study phase relations, energy propagation, and polarization properties. The results point toward an interpretation as a slow strongly damped MHD eigenmode which is generated by tailward traveling perturbations at the low-latitude interface between plasma sheet and magnetosheath

  8. Ultimate design load analysis of planetary gearbox bearings under extreme events

    DEFF Research Database (Denmark)

    Gallego Calderon, Juan Felipe; Natarajan, Anand; Cutululis, Nicolaos Antonio

    2017-01-01

    This paper investigates the impact of extreme events on the planet bearings of a 5 MW gearbox. The system is simulated using an aeroelastic tool, where the turbine structure is modeled, and MATLAB/Simulink, where the drivetrain (gearbox and generator) is modeled using a lumped-parameter approach. Three extreme events are assessed: low-voltage ride through, emergency stop and normal stop. The analysis is focused on finding which event has the most negative impact on the bearing extreme radial loads. The two latter events are carried out following the guidelines of the International...

  9. ALFA detector, Background removal and analysis for elastic events

    CERN Document Server

    Belaloui, Nazim

    2017-01-01

    I worked on the ALFA project, which aims to measure the total cross section in pp collisions as a function of t, the momentum transfer, by measuring the scattering angle of the protons. This measurement is done for all available energies, so far 7, 8 and 13 TeV. There are many analysis steps, and we have focused on enhancing the signal-to-noise ratio. First of all I worked on becoming more familiar with ROOT, on understanding the code used to access the data, and on plotting histograms, then on cutting off background.

  10. Integration of risk matrix and event tree analysis: a natural stone ...

    Indian Academy of Sciences (India)

    M Kemal Özfirat

    2017-09-27

    Sep 27, 2017 ... Different types of accidents may occur in natural stone facilities during movement, dimensioning, cutting ... are numerous risk analysis methods such as preliminary ..... machine type and maintenance (MM) event, block control.

  11. Political Shocks and Abnormal Returns During the Taiwan Crisis: An Event Study Analysis

    National Research Council Canada - National Science Library

    Steeves, Geoffrey

    2002-01-01

    .... Focusing on the 1996 Taiwan Crisis, by means of event study analysis, this paper attempts to determine the extent to which this political shock affected the Taiwanese, and surrounding Japanese stock markets...

  12. Mixed Methods Analysis of Medical Error Event Reports: A Report from the ASIPS Collaborative

    National Research Council Canada - National Science Library

    Harris, Daniel M; Westfall, John M; Fernald, Douglas H; Duclos, Christine W; West, David R; Niebauer, Linda; Marr, Linda; Quintela, Javan; Main, Deborah S

    2005-01-01

    .... This paper presents a mixed methods approach to analyzing narrative error event reports. Mixed methods studies integrate one or more qualitative and quantitative techniques for data collection and analysis...

  13. JINR supercomputer of the module type for event parallel analysis

    International Nuclear Information System (INIS)

    Kolpakov, I.F.; Senner, A.E.; Smirnov, V.A.

    1987-01-01

    A model of a supercomputer with 50 million operations per second is suggested. Its realization allows one to solve JINR data analysis problems for large spectrometers (in particular, the DELPHI collaboration). The suggested modular supercomputer is based on a commercially available 32-bit microprocessor with a processing rate of about 1 MFLOPS. The processors are combined by means of VME standard buses. A MicroVAX-II host computer organizes the operation of the system. Data input and output are realized via the MicroVAX-II computer periphery. Users' software is based on FORTRAN-77. The supercomputer is connected to a JINR network port, and all JINR users get access to the suggested system

  14. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    CERN Document Server

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs Boson and super-symmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate with the ATLAS detector, the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and it is one of several analysis schemes for ATLAS physics user analysis. At the core of the EventView is the representative "view" of an event, which defines the contents of event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  15. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    Science.gov (United States)

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  16. Preterm Versus Term Children: Analysis of Sedation/Anesthesia Adverse Events and Longitudinal Risk.

    Science.gov (United States)

    Havidich, Jeana E; Beach, Michael; Dierdorf, Stephen F; Onega, Tracy; Suresh, Gautham; Cravero, Joseph P

    2016-03-01

    Preterm and former preterm children frequently require sedation/anesthesia for diagnostic and therapeutic procedures. Our objective was to determine the age at which children who are born preterm are no longer at increased risk for sedation/anesthesia adverse events. Our secondary objective was to describe the nature and incidence of adverse events. This is a prospective observational study of children receiving sedation/anesthesia for diagnostic and/or therapeutic procedures outside of the operating room by the Pediatric Sedation Research Consortium. A total of 57,227 patients aged 0 to 22 years were eligible for this study. All adverse events and descriptive terms were predefined. Logistic regression and locally weighted scatterplot regression were used for analysis. Preterm and former preterm children had higher adverse event rates (14.7% vs 8.5%) compared with children born at term. Our analysis revealed a biphasic pattern for the development of adverse sedation/anesthesia events. Airway and respiratory adverse events were most commonly reported. MRI scans were the most commonly performed procedures in both categories of patients. Patients born preterm are nearly twice as likely to develop sedation/anesthesia adverse events, and this risk continues up to 23 years of age. We recommend obtaining birth history during the formulation of an anesthetic/sedation plan, with heightened awareness that preterm and former preterm children may be at increased risk. Further prospective studies focusing on the etiology and prevention of adverse events in former preterm patients are warranted. Copyright © 2016 by the American Academy of Pediatrics.

  17. An analysis of post-event processing in social anxiety disorder.

    Science.gov (United States)

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

    Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construct of repetitive self-focused thought that pertains to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  18. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    International Nuclear Information System (INIS)

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

    One Main Circulation Pump (MCP) trip is an anticipated transient with an expected frequency of approximately one event per year. There have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of a one-MCP trip event at the Ignalina NPP. According to international practice, the transient analysis should consist of deterministic analysis employing best-estimate codes and uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. Uncertainty analysis of flow energy loss in different parts of the Main Circulation Circuit, initial conditions and code-selected models was performed. Such analysis allows one to estimate the influence of separate parameters on the calculation results and to find the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for the further improvement of the model have been developed. (author)

  19. Corrective action program at the Krsko NPP. Trending and analysis of minor events

    International Nuclear Information System (INIS)

    Bach, B.; Kavsek, D.

    2007-01-01

    Industry and on-site operating experience has shown that significant events, minor events and near misses all share something in common: latent weaknesses that result in failed barriers, and the same or similar (root) causes for those failures. These types of events differ only in their resulting consequences: minor events and near misses have no immediate or significant impact on plant safety or reliability. However, significant events are usually preceded by a number of such events and could be prevented from occurring if the root cause(s) of these precursor events could be identified and eliminated. It would therefore be poor management to leave minor events and near misses unreported and unanalysed. Reporting and analysing minor events allows detection of latent weaknesses that may indicate the need for improvement. The benefit of low-level event analysis is that deficiencies can be found in barriers that normally go unchallenged, and it may not be known that they are ineffective in stopping a significant event. In addition, large numbers of minor events and near misses may increase the probability of occurrence of a significant event, which in itself is sufficient reason for addressing these types of events. However, as it is often neither practical nor feasible to perform a detailed root cause determination for every minor event, trending and trend analysis are used to identify and correct the causes before they result in a significant event. Trending is monitoring a change in the frequency of occurrence of similar minor events. An adverse trend is an increase in the frequency of minor events sorted by commonality, such as common equipment failure, human factors, common or similar causal factors, or activity, or worsening performance of processes being trended.
The primary goal of any trending programme should be to identify an adverse trend early enough that the operating organization can initiate an investigation to help

  20. Computer-aided event tree analysis by the impact vector method

    International Nuclear Information System (INIS)

    Lima, J.E.P.

    1984-01-01

    In the development of the Probabilistic Risk Analysis of Angra I, the 'large event tree/small fault tree' approach was adopted for the analysis of the plant behavior in an emergency situation. In this work, the event tree methodology is presented along with the adaptations which had to be made in order to attain a correct description of the safety system performances according to the selected analysis method. The problems appearing in the application of the methodology and their respective solutions are presented and discussed, with special emphasis on the impact vector technique. A description of the ETAP code ('Event Tree Analysis Program') developed for constructing and quantifying event trees is also given in this work. A preliminary version of the small-break LOCA analysis for Angra 1 is presented as an example of application of the methodology and of the code. It is shown that the use of the ETAP code significantly contributes to decreasing the time spent on event tree analyses, making the practical application of the analysis approach referred to above viable. (author) [pt
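The quantification step an event tree code such as the ETAP program described above performs can be illustrated with a minimal sketch. This is not the ETAP code itself; the system names and probabilities below are invented for illustration: each top event (safety system) succeeds or fails, and a sequence frequency is the initiator frequency times the product of the branch probabilities along the path.

```python
from itertools import product

def quantify_event_tree(initiator_freq, failure_probs):
    """Return {path: frequency} for every success/failure combination.

    failure_probs maps a system name to its failure probability;
    a path is a tuple of (system, 'S' or 'F') branch outcomes.
    """
    systems = list(failure_probs)
    sequences = {}
    for outcomes in product('SF', repeat=len(systems)):
        freq = initiator_freq
        for sys_name, outcome in zip(systems, outcomes):
            p_fail = failure_probs[sys_name]
            # multiply by the failure or success branch probability
            freq *= p_fail if outcome == 'F' else (1.0 - p_fail)
        sequences[tuple(zip(systems, outcomes))] = freq
    return sequences

# Small-break-LOCA-style example with made-up numbers (hypothetical
# high- and low-pressure injection systems).
seqs = quantify_event_tree(1e-3, {'HPI': 1e-2, 'LPI': 5e-3})
total = sum(seqs.values())  # all paths together recover the initiator frequency
```

Summing all sequence frequencies recovers the initiating event frequency, a useful sanity check that real codes exploit; impact vectors refine this picture by encoding how one branch failure affects downstream systems, which this sketch deliberately omits.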

  1. Analysis of events with isolated leptons and missing transverse momentum in ep collisions at HERA

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, G.

    2007-02-07

    A study of events with isolated leptons and missing transverse momentum in ep collisions is presented. Within the Standard Model (SM) such topologies are expected mainly from production of real W bosons with subsequent leptonic decay. This thesis continues the analysis of such events done in the HERA-1 period, where an excess over the SM prediction was observed for events with high hadronic transverse momentum P_T^X > 25 GeV. New data from the HERA-2 period are added. The analysed data sample recorded in e+p collisions corresponds to an integrated luminosity of 220 pb^-1, a factor of two more than in the HERA-1 analysis. The e-p data correspond to 186 pb^-1, a factor of 13 more than in HERA-1. All three lepton generations (electrons, muons and tau leptons) are analysed. In the electron and muon channels a total of 53 events are observed in 406 pb^-1. This compares well to the SM expectation of 53.7±6.5 events, dominated by W production. However, a difference in the event rate is observed for the different electron beam charges. In the e+p data the excess of events with P_T^X > 25 GeV is sustained, while the e-p data agree with the SM. In the tau channel 18 events are observed in all HERA data, with 20±3 expected from the SM. The events are dominated by irreducible background from charged currents. The contribution from W production amounts to about 22%. One event with P_T^X > 25 GeV is observed, where 1.4±0.3 are expected from the SM. (orig.)

  2. Analysis of event tree with imprecise inputs by fuzzy set theory

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Chun, Moon Hyun

    1990-01-01

    A fuzzy set theory approach is proposed as a method to analyze event trees with imprecise or linguistic input variables such as 'likely' or 'improbable' instead of numerical probabilities. In this paper, it is shown how fuzzy set theory can be applied to event tree analysis. The result of this study shows that the fuzzy set theory approach can be applied as an acceptable and effective tool for analysis of event trees with fuzzy inputs. Comparisons of the fuzzy set theory approach with the probabilistic approach, which computes probabilities of the final states of the event tree through subjective weighting factors and the LHS technique, show that the two approaches have common factors and give reasonable results
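A common way to implement this idea (a sketch under assumptions, not the authors' method) is to encode each linguistic probability as a triangular fuzzy number and propagate it along an event-tree path with alpha-cut interval arithmetic; the numbers chosen for "likely" and "improbable" below are purely illustrative.

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at level alpha in [0, 1]."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def fuzzy_path_probability(branches, alpha):
    """Multiply the branch intervals at the given alpha level.

    Interval multiplication is monotone here because all endpoints
    are non-negative probabilities.
    """
    lo, hi = 1.0, 1.0
    for tri in branches:
        l, h = alpha_cut(tri, alpha)
        lo *= l
        hi *= h
    return lo, hi

# Hypothetical memberships: a "likely" initiating condition and an
# "improbable" system failure along one event-tree path.
likely = (0.6, 0.75, 0.9)
improbable = (0.01, 0.05, 0.1)

core_alpha1 = fuzzy_path_probability([likely, improbable], alpha=1.0)
core_alpha0 = fuzzy_path_probability([likely, improbable], alpha=0.0)
# At alpha = 1 the interval collapses to the product of the modes;
# at alpha = 0 it spans the full support of the fuzzy final-state probability.
```

Sweeping alpha from 0 to 1 reconstructs the full fuzzy membership function of the final-state probability, which can then be compared against the point estimates a purely probabilistic quantification would give.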

  3. Markov chains and semi-Markov models in time-to-event analysis.

    Science.gov (United States)

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
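A minimal sketch of the kind of multistate Markov chain model the abstract describes follows; the three states and the transition probabilities are assumptions chosen for illustration, not values from the paper. The absorbing "dead" state shows how competing risks enter through the transition matrix.

```python
# States: 0 = healthy, 1 = ill, 2 = dead (absorbing).
# Transition probabilities per time step (illustrative assumptions).
P = [
    [0.90, 0.07, 0.03],  # healthy -> healthy / ill / dead
    [0.00, 0.85, 0.15],  # ill stays ill or dies (competing risk)
    [0.00, 0.00, 1.00],  # dead is absorbing
]

def occupancy(start, steps):
    """State-occupancy distribution after `steps` transitions from `start`."""
    dist = [1.0 if i == start else 0.0 for i in range(len(P))]
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist

one_step = occupancy(start=0, steps=1)
five_step = occupancy(start=0, steps=5)
# five_step[2] is the cumulative probability of death within five steps,
# a quantity Kaplan-Meier estimation alone cannot decompose by pathway.
```

Non-constant survival probabilities would simply make `P` a function of the step index, and frailty can be added by mixing over subject-specific matrices; the occupancy recursion is unchanged.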

  4. Analysis of human error and organizational deficiency in events considering risk significance

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

    In this study, we analyzed human and organizational deficiencies in the trip events of Korean nuclear power plants. K-HPES items were used in human error analysis, and the organizational factors by Jacobs and Haber were used for organizational deficiency analysis. We proposed the use of CCDP as a risk measure to consider risk information in prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the process of analysis is necessary for effective enhancement of nuclear power plant safety by focusing on causes of human error and organizational deficiencies that are associated with significant risk

  5. Regression analysis of mixed recurrent-event and panel-count data.

    Science.gov (United States)

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L

    2014-07-01

    In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied on the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and the rain flood events with a high runoff coefficient. This event-based analysis of the satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
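One common way to split matched event pairs into systematic and random error components, in the spirit of the evaluation above (the specific metric definitions here are standard choices, assumed rather than taken from the paper), is a multiplicative bias plus the scatter of residuals around it:

```python
import math

def error_metrics(satellite, reference):
    """Systematic (multiplicative bias) and random (residual RMS) error
    for matched event-cumulative depths."""
    bias = sum(satellite) / sum(reference)            # systematic component
    residuals = [s - bias * r for s, r in zip(satellite, reference)]
    rand = math.sqrt(sum(e * e for e in residuals) / len(residuals))
    return bias, rand

# Hypothetical event-cumulative rainfall depths in mm (satellite vs. a
# gauge/radar reference) for three flood events.
sat = [8.0, 15.0, 30.0]
ref = [10.0, 20.0, 40.0]
bias, rand = error_metrics(sat, ref)
# bias < 1 reflects the kind of cumulative-depth underestimation noted above.
```

Applying the same two metrics to the simulated hydrographs and comparing against the hyetograph values is one way to expose the dampening effect the study reports.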

  7. Making sense of root cause analysis investigations of surgery-related adverse events.

    Science.gov (United States)

    Cassin, Bryce R; Barach, Paul R

    2012-02-01

    This article discusses the limitations of root cause analysis (RCA) for surgical adverse events. Making sense of adverse events involves an appreciation of the unique features in a problematic situation, which resist generalization to other contexts. The top priority of adverse event investigations must be to inform the design of systems that help clinicians to adapt and respond effectively in real time to undesirable combinations of design, performance, and circumstance. RCAs can create opportunities in the clinical workplace for clinicians to reflect on local barriers and identify enablers of safe and reliable outcomes. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and use of the PSAEA (PSA-based Event Analysis) guidelines. In 2000, risk-based precursor analysis increasingly became a part of the AVN process of feedback of operating experience, and constitutes in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions allowed several potential safety issues for improvement to be identified - and in some cases also quantified. The internal evaluation of PSAEA was positive, and AVN decided to routinely perform several PSAEA studies per year. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events), resulting in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding, for instance, event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  9. Analysis of external flooding events occurred in foreign nuclear power plant sites

    International Nuclear Information System (INIS)

    Li Dan; Cai Hankun; Xiao Zhi; An Hongzhen; Mao Huan

    2013-01-01

    This paper screens and studies 17 external flooding events that occurred at foreign NPP sites and analyses the characteristics of external flooding events based on the source of the flooding, the impact on buildings, systems and equipment, as well as the threat to nuclear safety. Furthermore, based on the experience and lessons learned from the Fukushima nuclear accident relating to external flooding, and on countermeasures carried out around the world, some suggestions are proposed to improve the external flooding response capacity of Chinese NPPs. (authors)

  10. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  11. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step in transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used for identification of the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the mentioned event was reported, the details of the incident were elaborated through reviewing records and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems, and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the mentioned event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also pointed out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and introduce preventive measures to find areas where resources need to be allocated to improve patient safety.

  12. Relation of air mass history to nucleation events in Po Valley, Italy, using back trajectories analysis

    Directory of Open Access Journals (Sweden)

    L. Sogacheva

    2007-01-01

    In this paper, we study the transport of air masses to San Pietro Capofiume (SPC) in Po Valley, Italy, by means of back trajectory analysis. Our main aim is to investigate whether air masses originate over different regions on nucleation event days and on nonevent days, during three years when nucleation events have been continuously recorded at SPC. The results indicate that nucleation events occur frequently in air masses arriving from Central Europe, whereas event frequency is much lower in air transported from southern directions and from the Atlantic Ocean. We also analyzed the behaviour of meteorological parameters during the 96 h transport to SPC, and found that, on average, event trajectories undergo stronger subsidence during the last 12 h before arrival at SPC than nonevent trajectories. This causes a reversal in the temperature and relative humidity (RH) differences between event and nonevent trajectories: between 96 and 12 h back time, temperature is lower and RH is higher for event than for nonevent trajectories, and between 12 and 0 h vice versa. Boundary layer mixing is stronger along the event trajectories than along nonevent trajectories. The absolute humidity (AH) is similar for the event and nonevent trajectories between about 96 h and about 60 h back time, but after that, the event trajectories' AH becomes lower due to stronger rain. We also studied the transport of SO2 to SPC, and conclude that although sources in Po Valley most probably dominate the measured concentrations, certain Central and Eastern European sources also make a substantial contribution.

  13. Analysis of internal events for the Unit 1 of the Laguna Verde Nuclear Power Station. Appendixes

    International Nuclear Information System (INIS)

    Huerta B, A.; Lopez M, R.

    1995-01-01

    This volume contains the appendices for the accident sequence analysis of internally initiated events for the Laguna Verde Unit 1 Nuclear Power Plant. Appendix A presents the comments raised by the Sandia National Laboratories technical staff as a result of their review of the Internal Event Analysis for the Laguna Verde Unit 1 Nuclear Power Plant. This review was performed during a joint Sandia/CNSNS multi-day meeting at the end of 1992. Also included is a brief evaluation of the applicability of these comments to the present study. Appendix B presents the printed fault tree models for each of the systems included and analyzed in the Internal Event Analysis for LVNPP. Appendix C presents the outputs of the TEMAC code, used for the quantification of the dominant accident sequences as well as for the final core damage evaluation. (Author)

  15. Sources of Error and the Statistical Formulation of M_S:m_b Seismic Event Screening Analysis

    Science.gov (United States)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H_0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh-wave magnitude (denoted M_S) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H_0: explosion characteristics.
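    The screening logic described above can be sketched as a one-sided test: an event is "screened out" (judged earthquake-like) when its M_S lies far enough above an explosion population line for its m_b, measured in units of the test statistic's standard error. The slope, intercept and sigma below are illustrative placeholders, not the IDC's operational constants, and the sigma would in practice fold in the physical-correction-model inadequacy the paper discusses:

    ```python
    import math

    def ms_mb_statistic(ms, mb, slope=1.25, intercept=2.20, sigma=0.34):
        """Standardized residual of M_S against a hypothetical explosion
        population line M_S = slope * m_b - intercept.

        All three constants are illustrative assumptions, not the IDC's
        operational values; sigma stands in for the full standard error,
        including correction-model inadequacy.
        """
        return (ms - (slope * mb - intercept)) / sigma

    def screened_out(ms, mb, z_crit=1.645):
        """One-sided test: reject H0 (explosion characteristics) at ~5%
        when the standardized residual exceeds the critical value."""
        return ms_mb_statistic(ms, mb) > z_crit
    ```

    With these placeholder constants, an event with a large M_S relative to its m_b is screened out, while an M_S well below the line fails to reject H_0, mirroring the announced-test example in the abstract.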

  16. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    Science.gov (United States)

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD event risk (0.91, 0.85 to 0.97) per 0.1 mEq/L (P_nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P_nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD event risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
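    Highest-versus-lowest pooling of the kind reported above is typically done on the log scale with an inverse-variance random-effects model. A minimal DerSimonian-Laird sketch, using invented study numbers rather than the 19 studies of the meta-analysis:

    ```python
    import math

    def pooled_rr(rrs, cis, z=1.96):
        """DerSimonian-Laird random-effects pooling of relative risks.

        rrs: per-study point estimates; cis: (lower, upper) 95% limits,
        from which the log-scale standard errors are recovered.
        Returns the pooled RR and its 95% confidence interval.
        """
        logs = [math.log(r) for r in rrs]
        ses = [(math.log(u) - math.log(l)) / (2 * z) for l, u in cis]
        w = [1.0 / s**2 for s in ses]
        # Fixed-effect mean and Cochran's Q heterogeneity statistic
        mu_fe = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
        q = sum(wi * (li - mu_fe) ** 2 for wi, li in zip(w, logs))
        df = len(rrs) - 1
        c = sum(w) - sum(wi**2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)  # between-study variance
        # Random-effects weights and pooled estimate
        w_re = [1.0 / (s**2 + tau2) for s in ses]
        mu = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
        se = math.sqrt(1.0 / sum(w_re))
        return math.exp(mu), (math.exp(mu - z * se), math.exp(mu + z * se))
    ```

    The pooled estimate always lies between the study estimates, and the interval widens as the between-study variance tau2 grows.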

  17. Superposed epoch analysis of O+ auroral outflow during sawtooth events and substorms

    Science.gov (United States)

    Nowrouzi, N.; Kistler, L. M.; Lund, E. J.; Cai, X.

    2017-12-01

    Sawtooth events are repeated injections of energetic particles at geosynchronous orbit. Studies have shown that 94% of sawtooth events occurred during magnetic storm times. The main factor that causes a sawtooth event is still an open question. Simulations have suggested that heavy ions like O+ may play a role in triggering the injections. One of the sources of O+ in the Earth's magnetosphere is the nightside aurora. O+ ions coming from the nightside auroral region have direct access to the near-Earth magnetotail. A model (Brambles et al. 2013) for interplanetary coronal mass ejection-driven sawtooth events found that nightside O+ outflow caused the subsequent teeth of the sawtooth event through a feedback mechanism. This work is a superposed epoch analysis testing whether the observed auroral outflow supports this model. Using FAST spacecraft data from 1997-2007, we examine the auroral O+ outflow as a function of time relative to injection onset. We then determine whether the profile of the O+ outflow flux during sawtooth events differs from the outflow observed during isolated substorms. The auroral region boundaries are estimated using the method of Andersson et al. (2004). The O+ outflow fluxes inside these boundaries are then calculated and binned as a function of superposed epoch time for substorms and sawtooth "teeth". In this way, we will determine whether sawtooth events do in fact have greater O+ outflow, and whether that outflow is predominantly from the nightside, as suggested by the model results.
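    The core of a superposed epoch analysis is simply stacking a measured quantity around a list of onset times and averaging column-wise. A schematic version of that stacking step, not the FAST O+ pipeline itself (the window lengths and the series are placeholders):

    ```python
    def superposed_epoch(series, onsets, before=3, after=3):
        """Average a time series around a list of epoch-zero indices.

        series: regularly sampled values; onsets: indices of the events
        (e.g. injection onsets). Windows that would run off either end
        of the series are skipped. Returns the mean profile over the
        window [-before, +after] relative to onset.
        """
        stacks = []
        for t0 in onsets:
            if t0 - before >= 0 and t0 + after < len(series):
                stacks.append(series[t0 - before : t0 + after + 1])
        # Column-wise mean over all stacked windows
        return [sum(col) / len(stacks) for col in zip(*stacks)]
    ```

    Binning outflow flux by epoch time, as in the abstract, is this same operation applied per bin; comparing the resulting mean profiles for sawtooth teeth versus isolated substorms is then a direct curve comparison.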

  18. Regression analysis of mixed panel count data with dependent terminal events.

    Science.gov (United States)

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types described earlier, may occur; furthermore, as with the first two types, there may exist a dependent terminal event, which may preclude the occurrence of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to the childhood cancer study that motivated this work. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Analysis of adverse events occurred at overseas nuclear power plants in 2003

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa; Sato, Masahiro; Takagawa, Kenichi; Fushimi, Yasuyuki; Shimada, Hiroki; Shimada, Yoshio

    2004-01-01

    The adverse events that have occurred at overseas nuclear power plants can be studied to provide indications of how to improve the safety and reliability of nuclear power plants in Japan. The Institute of Nuclear Safety Systems (INSS) obtains information related to overseas adverse events and incidents and, by evaluating it, proposes improvements to prevent similar occurrences in Japanese PWR plants. In 2003, INSS obtained approximately 2800 pieces of information and, by evaluating them, proposed nine recommendations to Japanese utilities. This report presents a summary of the evaluation activity and of the tendency analysis based on the individual events analyzed in 2003. The tendency analysis covered about 1600 analyzed events, from the viewpoints of mechanics, electrics, instrumentation and control, and operations, addressing the causes, countermeasures, affected equipment and the lessons that can be learnt from overseas events. The report shows the overall tendency of overseas events and incidents for the improvement of the safety and reliability of domestic PWR plants. (author)

  20. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, "Occurrence Reporting and Processing of Operations Information", requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not reportable events) identified at INL during the period of October 2012 through September 2013.

  1. Analysis of events occurred at overseas nuclear power plants in 2004

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa; Nishioka, Hiromasa; Sato, Masahiro; Chiba, Gorou; Takagawa, Kenichi; Shimada, Hiroki

    2005-01-01

    The Institute of Nuclear Safety Systems (INSS) investigates information related to events and incidents occurring at overseas nuclear power plants and, by evaluating it, proposes recommendations for improving the safety and reliability of domestic PWR plants. Following the 2003 report, this report summarizes the evaluation activity and the tendency analysis based on about 2800 pieces of information obtained in 2004. The tendency analysis covered about 1700 analyzed events, from the viewpoints of mechanics, electrics and operations, addressing the causes, affected equipment and so on. (author)

  2. Logistic Organization of Mass Events in the Light of SWOT Analysis - Case Study

    Directory of Open Access Journals (Sweden)

    Joanna Woźniak

    2018-02-01

    Rzeszow Juwenalia is the largest free-entry student event in Subcarpathia and, at the same time, one of the best in Poland. On average, more than 25,000 people stay on the campus of Rzeszow University of Technology on every single day of the event. Such an enormous undertaking requires a strategy that makes it possible to design and coordinate the event effectively. The principal objective of this paper is therefore to present the strengths and weaknesses of Rzeszow Juwenalia, and also to attempt to verify the opportunities and threats related to the event. SWOT analysis was used to attain this objective, yielding results that make a detailed assessment of the undertaking possible. The publication also presents proposed improvement activities that may be implemented in the future.

  3. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    International Nuclear Information System (INIS)

    Attrill, Gemma D. R.

    2010-01-01

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered, employing base-difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of a purely MHD wave interpretation of the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions resulting from a misinterpretation of the running-difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  4. Analysis of events related to cracks and leaks in the reactor coolant pressure boundary

    Energy Technology Data Exchange (ETDEWEB)

    Ballesteros, Antonio, E-mail: Antonio.Ballesteros-Avila@ec.europa.eu [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Sanda, Radian; Peinador, Miguel; Zerger, Benoit [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Negri, Patrice [IRSN: Institut de Radioprotection et de Sûreté Nucléaire (France); Wenke, Rainer [GRS: Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH (Germany)

    2014-08-15

    Highlights: • The important role of Operating Experience Feedback is emphasised. • Events relating to cracks and leaks in the reactor coolant pressure boundary are analysed. • A methodology for event investigation is described. • Some illustrative results of the analysis of events for specific components are presented. - Abstract: The presence of cracks and leaks in the reactor coolant pressure boundary may jeopardise the safe operation of nuclear power plants. Analysis of crack- and leak-related events is an important task for the prevention of their recurrence, and should be performed in the context of activities on Operating Experience Feedback. In response to this concern, the EU Clearinghouse operated by the JRC-IET supports and develops technical and scientific work to disseminate the lessons learned from past operating experience. In particular, concerning cracks and leaks, the studies carried out in collaboration with IRSN and GRS have made it possible to identify the areas of the plant primary system most sensitive to degradation and to elaborate recommendations for upgrading the maintenance, ageing management and inspection programmes. An overview of the methodology used in the analysis of crack- and leak-related events is presented in this paper, together with the relevant results obtained in the study.

  5. Erectile dysfunction and cardiovascular events in diabetic men: a meta-analysis of observational studies.

    Directory of Open Access Journals (Sweden)

    Tomohide Yamada

    BACKGROUND: Several studies have shown that erectile dysfunction (ED) influences the risk of cardiovascular events (CV events). However, a meta-analysis of the overall risk of CV events associated with ED in patients with diabetes has not been performed. METHODOLOGY/PRINCIPAL FINDINGS: We searched MEDLINE and the Cochrane Library for pertinent articles (including references) published between 1951 and April 22, 2012. English language reports of original observational cohort studies and cross-sectional studies were included. Pooled effect estimates were obtained by random effects meta-analysis. A total of 3,791 CV events were reported in 3 cohort studies and 9 cross-sectional studies (covering 22,586 subjects). Across the cohort studies, the overall odds ratio (OR) of diabetic men with ED versus those without ED was 1.74 (95% confidence interval [CI]: 1.34-2.27; P<0.05). Moreover, meta-regression analysis found no relationship between the method used to assess ED (questionnaire or interview), mean age, mean hemoglobin A(1c), mean body mass index, or mean duration of diabetes and the risk of CV events or CHD. In the cross-sectional studies, the OR of diabetic men with ED versus those without ED was 3.39 (95% CI: 2.58-4.44; P<0.001) for CV events (N = 9), 3.43 (95% CI: 2.46-4.77; P<0.001) for CHD (N = 7), and 2.63 (95% CI: 1.41-4.91; P = 0.002) for peripheral vascular disease (N = 5). CONCLUSION/SIGNIFICANCE: ED was associated with an increased risk of CV events in diabetic patients. Prevention and early detection of cardiovascular disease are important in the management of diabetes, especially in view of the rapid increase in its prevalence.

  6. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self- Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  7. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual. Companion CD

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self- Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  8. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    Science.gov (United States)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events as captured by so-called event synchronization have been proposed as powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme floods. Rooted in the analysis of spike-train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically - or even dynamically - interrelated) or not, without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in the case of the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in the case of climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctly different type of distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two
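    The two similarity measures discussed, event coincidence rates with a fixed tolerable delay and event synchronization with a dynamic, data-adaptive delay, can be contrasted in a few lines. This is a sketch of simplified, directional textbook forms of both measures under invented event times, not the authors' implementation:

    ```python
    import math

    def event_coincidence_rate(a, b, delta_t):
        """Fraction of events in a that are followed within delta_t by at
        least one event in b (a precursor-style, directional rate).
        delta_t is the manually chosen tolerable maximum delay."""
        if not a:
            return 0.0
        hits = sum(1 for ta in a if any(0 <= tb - ta <= delta_t for tb in b))
        return hits / len(a)

    def event_synchronization(a, b):
        """Directional event synchronization with a dynamic delay: an
        event pair coincides if its separation is below half the minimum
        adjacent inter-event interval, so no delay parameter is chosen."""
        def tau(x, i, y, j):
            gaps = []
            if i > 0:
                gaps.append(x[i] - x[i - 1])
            if i + 1 < len(x):
                gaps.append(x[i + 1] - x[i])
            if j > 0:
                gaps.append(y[j] - y[j - 1])
            if j + 1 < len(y):
                gaps.append(y[j + 1] - y[j])
            return min(gaps) / 2 if gaps else float("inf")

        if not a or not b:
            return 0.0
        c = sum(1 for i, ta in enumerate(a) for j, tb in enumerate(b)
                if 0 < tb - ta < tau(a, i, b, j))
        return c / math.sqrt(len(a) * len(b))
    ```

    The contrast the abstract draws is visible here: the coincidence rate depends explicitly on delta_t, while event synchronization lets the local inter-event intervals set the tolerable delay, which is exactly what becomes problematic for heavy-tailed precipitation waiting times.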

  9. The limiting events transient analysis by RETRAN02 and VIPRE01 for an ABWR

    International Nuclear Information System (INIS)

    Tsai Chiungwen; Shih Chunkuan; Wang Jongrong; Lin Haotzu; Jin Jiunan; Cheng Suchin

    2009-01-01

    This paper describes the transient analysis of the generator load rejection (LR) and One Turbine Control Valve Closure (OTCVC) events for Lungmen nuclear power plant (LMNPP). According to the Critical Power Ratio (CPR) criterion, the Preliminary Safety Analysis Report (PSAR) concluded that LR and OTCVC are the first and second limiting events, respectively. In addition, the fuel type has now been changed from GE12 to GE14, so it is necessary to re-analyze these two events for safety considerations. In this study, to quantify the impact on the reactor, the difference between the initial critical power ratio (ICPR) and the minimum critical power ratio (MCPR), i.e. ΔCPR, is calculated. The ΔCPRs of the LR and OTCVC events are calculated with the combination of the RETRAN02 and VIPRE01 codes. In the RETRAN02 calculation, a thermal-hydraulic model was prepared for the transient analysis. The resulting data, including upper plenum pressure, core inlet flow, normalized power, and axial power shapes during the transient, are then passed to VIPRE01 for the ΔCPR calculation. In the VIPRE01 calculation, a hot-channel model was built to simulate the hottest fuel bundle; based on the thermal-hydraulic data from RETRAN02, the ΔCPRs are calculated with this model. Additionally, different TCV control modes are considered to study the influence of different TCV closure curves on the transient behavior. Sensitivity studies covering different initial system pressures and different initial power/flow conditions are also included. Based on this analysis, the maximum ΔCPRs for LR and OTCVC are 0.162 and 0.191, respectively. According to the CPR criterion, this result shows that the impact of the OTCVC event is larger than that of the LR event. (author)

  10. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) mbH, Schwetnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)

    2014-07-01

    Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper studies the operating experience related to emergency diesel generators (EDGs) at nuclear power plants collected over the past 20 years. Events related to EDG failures and/or unavailability, as well as all the supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures and the attributes that contributed to them, characterize the failure modes (potential or real), discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper centres on a statistical analysis of the operating experience. For the purpose of this study, an EDG failure is defined as an EDG failure to function on demand (i.e. fail to start, fail to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied to each of the four databases is presented. Further on, analyses aimed at delineating the causes, root causes, contributing factors and consequences are performed. A statistical analysis was performed on the chronology of events, the types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing
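    Failure counts of the kind screened here are commonly summarised as a failure-on-demand probability with a confidence interval. A minimal sketch using the Wilson score interval; the counts in the test are invented for illustration and are not taken from the databases the paper screens:

    ```python
    import math

    def failure_on_demand(failures, demands, z=1.96):
        """Point estimate and Wilson score interval for a failure-to-start
        probability estimated from demand counts.

        failures: number of failures on demand; demands: total demands.
        Returns (p_hat, (lower, upper)) at the ~95% level for z=1.96.
        """
        p = failures / demands
        denom = 1 + z**2 / demands
        centre = (p + z**2 / (2 * demands)) / denom
        half = z * math.sqrt(p * (1 - p) / demands
                             + z**2 / (4 * demands**2)) / denom
        return p, (centre - half, centre + half)
    ```

    Unlike the naive normal interval, the Wilson interval stays inside [0, 1] even for the small failure counts typical of EDG demand data.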

  12. Time-to-event analysis of mastitis at first-lactation in Valle del Belice ewes

    NARCIS (Netherlands)

    Portolano, B.; Firlocchiaro, R.; Kaam, van J.B.C.H.M.; Riggio, V.; Maizon, D.O.

    2007-01-01

    A time-to-event study of mastitis at first lactation in Valle del Belice ewes was conducted, using survival analysis with an animal model. The goals were to evaluate the effects of lambing season and level of milk production on the time from lambing to the day when a ewe experienced a test-day with
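    Time-to-event data of this kind are commonly summarised with the Kaplan-Meier estimator before (or alongside) fitting a survival model. A generic sketch of that estimator; it does not reproduce the animal-model analysis of the paper, and the times and censoring flags in the test are invented:

    ```python
    def kaplan_meier(times, observed):
        """Kaplan-Meier survivor curve for right-censored time-to-event data.

        times: time from entry (e.g. lambing) to the event or to censoring;
        observed: True if the event was seen, False if the record was
        censored. Returns (time, survival probability) pairs at event times.
        """
        s = 1.0
        curve = []
        for t in sorted(set(times)):
            # d: events at time t; n: subjects still at risk just before t
            d = sum(1 for ti, ev in zip(times, observed) if ti == t and ev)
            n = sum(1 for ti in times if ti >= t)
            if d:
                s *= 1 - d / n
                curve.append((t, s))
        return curve
    ```

    Censored records contribute to the risk sets without forcing the curve down, which is the property that makes the estimator suitable for artificially terminated follow-up.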

  13. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Science.gov (United States)

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...

  14. Analysis of operational events by ATHEANA framework for human factor modelling

    International Nuclear Information System (INIS)

    Bedreaga, Luminita; Constntinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that current methods have not correctly represented the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods of modelling the human factor have not taken into account human performance and reliability as observed in operational events. ATHEANA (A Technique for Human Error ANAlysis) is a new methodology for human factor analysis that incorporates specific data from operational events as well as psychological models of human behaviour. The method introduces new elements such as the unsafe action and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred at different nuclear power plants during 1979-2002. The analysis of operational events consisted of: identification of the unsafe actions; classification of each unsafe action as omission or commission; establishing the type of error corresponding to the unsafe action (slip, lapse, mistake or circumvention); and establishing the influence of performance shaping factors and some corrective actions. (authors)

  15. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    Science.gov (United States)

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  16. The Analysis of the Properties of Super Solar Proton Events and the Associated Phenomena

    Science.gov (United States)

    Cheng, L. B.; Le, G. M.; Lu, Y. P.; Chen, M. H.; Li, P.; Yin, Z. Q.

    2014-05-01

    The solar flare, the propagation speed of the shock driven by the coronal mass ejection (CME) from the Sun to the Earth, the source longitudes and Carrington longitudes, and the geomagnetic storms associated with each super solar proton event (peak flux equal to or exceeding 10000 pfu) have been investigated. The analysis shows that the source longitudes of super solar proton events ranged from E30° to W75°. The Carrington longitudes of the source regions were distributed in two longitude bands, 130°-220° and 260°-320°, respectively. All super solar proton events were accompanied by major solar flares and fast CMEs. The average speeds of the shocks propagating from the Sun to the Earth were greater than 1200 km/s. Eight super solar proton events were followed by major geomagnetic storms (Dst ≤ -100 nT). One super solar proton event was followed by a geomagnetic storm with Dst = -96 nT.

  17. Probabilistic safety analysis for fire events for the NPP Isar 2

    International Nuclear Information System (INIS)

    Schmaltz, H.; Hristodulidis, A.

    2007-01-01

    The 'Probabilistic Safety Analysis for Fire Events' (Fire-PSA KKI2) for the NPP Isar 2 was performed in addition to the PSA for full power operation and considers all possible events that can be initiated by a fire. The aim of the plant-specific Fire-PSA was a quantitative, state-of-the-art assessment of fire events during full power operation. Based on simplified assumptions regarding fire-induced failures, the influence of system and component failures on the frequency of core damage states was analysed. The Fire-PSA considers events in which fire-induced failures of equipment lead to a SCRAM, as well as events with no direct operational effects in which the plant is nevertheless shut down as a precautionary measure because of the fire-induced failure of safety-related installations. The latter events are considered because they may have a non-negligible influence on the frequency of core damage states if failures occur during plant shutdown, when the redundancy of safety-related systems is reduced. (orig.)

  18. Fuel element thermo-mechanical analysis during transient events using the FMS and FETMA codes

    International Nuclear Information System (INIS)

    Hernandez Lopez Hector; Hernandez Martinez Jose Luis; Ortiz Villafuerte Javier

    2005-01-01

    In the Instituto Nacional de Investigaciones Nucleares of Mexico, the Fuel Management System (FMS) software package has been used for a long time to simulate the operation of a BWR nuclear power plant in steady state as well as during transient events. To evaluate the thermo-mechanical performance of fuel elements during transient events, an interface between the FMS codes and our own Fuel Element Thermo Mechanical Analysis (FETMA) code is currently being developed and implemented. In this work, the thermo-mechanical behavior of fuel rods in the hot channel is analysed during the simulation of transient events of a BWR nuclear power plant. The transient events considered are a load rejection and a feedwater control failure, which are among the most important events that can occur in a BWR. The results showed that conditions leading to fuel rod failure did not appear at any time in either event. It is also shown that a load rejection transient is more demanding in terms of safety than a failure of the feedwater controller. (authors)

  19. Root cause analysis of critical events in neurosurgery, New South Wales.

    Science.gov (United States)

    Perotti, Vanessa; Sheridan, Mark M P

    2015-09-01

    Adverse events reportedly occur in 5% to 10% of health care episodes. Not all adverse events are the result of error; they may arise from systemic faults in the delivery of health care. Catastrophic events are not only physically devastating to patients, but they also attract medical liability and increase health care costs. Root cause analysis (RCA) has become a key tool for health care services to understand those adverse events. This study is a review of all the RCA case reports involving neurosurgical patients in New South Wales between 2008 and 2013. The case reports and data were obtained from the Clinical Excellence Commission database. The data was then categorized by the root causes identified and the recommendations suggested by the RCA committees. Thirty-two case reports were identified in the RCA database. Breaches in policy account for the majority of root causes identified, for example, delays in transfer of patients or wrong-site surgery, which always involved poor adherence to correct patient and site identification procedures. The RCA committees' recommendations included education for staff, and improvements in rostering and procedural guidelines. RCAs have improved the patient safety profile; however, the RCA committees have no power to enforce any recommendation or ensure compliance. A single RCA may provide little learning beyond the unit and staff involved. However, through aggregation of RCA data and dissemination strategies, health care workers can learn from adverse events and prevent future events from occurring. © 2015 Royal Australasian College of Surgeons.

  20. Hazard analysis of typhoon-related external events using extreme value theory

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Chan; Jang, Seung Cheol [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lim, Tae Jin [Dept. of Industrial Information Systems Engineering, Soongsil University, Seoul (Korea, Republic of)

    2015-02-15

    After the Fukushima accident, the importance of hazard analysis for extreme external events was raised. To analyze typhoon-induced hazards, which are one of the significant disasters of East Asian countries, a statistical analysis using the extreme value theory, which is a method for estimating the annual exceedance frequency of a rare event, was conducted for an estimation of the occurrence intervals or hazard levels. For the four meteorological variables, maximum wind speed, instantaneous wind speed, hourly precipitation, and daily precipitation, the parameters of the predictive extreme value theory models were estimated. The 100-year return levels for each variable were predicted using the developed models and compared with previously reported values. It was also found that there exist significant long-term climate changes of wind speed and precipitation. A fragility analysis should be conducted to ensure the safety levels of a nuclear power plant for high levels of wind speed and precipitation, which exceed the results of a previous analysis.
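    The return-level estimation described in this record can be sketched with a GEV fit; the synthetic data, distribution parameters, and 100-year horizon below are illustrative assumptions, not the study's fitted values:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual-maximum daily precipitation (mm); a stand-in for observed block maxima.
rng = np.random.default_rng(0)
annual_max = genextreme.rvs(c=-0.1, loc=120, scale=35, size=60, random_state=rng)

# Fit a GEV distribution to the block maxima by maximum likelihood.
shape, loc, scale = genextreme.fit(annual_max)

# The T-year return level is the quantile exceeded with annual probability 1/T.
T = 100
return_level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
print(f"100-year return level: {return_level:.1f} mm")
```

    In practice the same fit would be repeated for each of the four meteorological variables, with the fitted return levels compared against previously reported values as in the abstract.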

  1. Screening Analysis of Criticality Features, Events, and Processes for License Application

    International Nuclear Information System (INIS)

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in ''Configuration Generator Model'', a component of the methodology from ''Disposal Criticality Analysis Methodology Topical Report''. The total probability of criticality per package is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be

  2. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable to a diverse set of satellite data and will be made publicly available for scientists in early 2017.

  3. Identification of fire modeling issues based on an analysis of real events from the OECD FIRE database

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, Dominik [Swiss Federal Nuclear Safety Inspectorate ENSI, Brugg (Switzerland)

    2017-03-15

    Precursor analysis is widely used in the nuclear industry to judge the safety significance of events. However, for events that may damage equipment through effects that are not ordinary functional dependencies, the analysis may not always fully capture the potential for further evolution of the event. For fires, which are one class of such events, this paper discusses modelling challenges that need to be overcome when performing a probabilistic precursor analysis. The events analysed are selected from the Organisation for Economic Cooperation and Development (OECD) Fire Incidents Records Exchange (FIRE) Database.

  4. Is the efficacy of antidepressants in panic disorder mediated by adverse events? A mediational analysis.

    Directory of Open Access Journals (Sweden)

    Irene Bighelli

    It has been hypothesised that the perception of adverse events in placebo-controlled antidepressant clinical trials may lead patients to conclude that they have been randomized to the active arm of the trial, breaking the blind. This may enhance the expectancies for improvement and the therapeutic response. The main objective of this study is to test the hypothesis that the efficacy of antidepressants in panic disorder is mediated by the perception of adverse events. The present analysis is based on a systematic review of published and unpublished randomised trials comparing antidepressants with placebo for panic disorder. The Baron and Kenny approach was applied to investigate the mediational role of adverse events in the relationship between antidepressant treatment and efficacy. Fourteen placebo-controlled antidepressant trials were included in the analysis. We found that: (a) antidepressant treatment was significantly associated with better treatment response (β = 0.127, 95% CI 0.04 to 0.21, p = 0.003); (b) antidepressant treatment was not associated with adverse events (β = 0.094, 95% CI -0.05 to 0.24, p = 0.221); (c) adverse events were negatively associated with treatment response (β = -0.035, 95% CI -0.06 to -0.05, p = 0.022). Finally, after adjustment for adverse events, the relationship between antidepressant treatment and treatment response remained statistically significant (β = 0.122, 95% CI 0.01 to 0.23, p = 0.039). These findings do not support the hypothesis that the perception of adverse events in placebo-controlled antidepressant clinical trials leads to the breaking of blind and to an artificial inflation of the efficacy measures. Based on these results, we argue that the moderate therapeutic effect of antidepressants in individuals with panic disorder is not an artefact, and therefore reflects a genuine effect that doctors can expect to replicate under real-world conditions.
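    The Baron and Kenny steps applied in this record can be sketched on synthetic trial data; the sample size and data-generating coefficients below are illustrative assumptions, not values from the meta-analysis:

```python
import numpy as np

def ols(y, X):
    """OLS coefficients for y ~ 1 + X (intercept prepended)."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

rng = np.random.default_rng(1)
n = 500
treatment = rng.integers(0, 2, n).astype(float)  # 1 = antidepressant, 0 = placebo
adverse = rng.normal(size=n)                     # mediator: here independent of treatment
response = 0.5 * treatment - 0.2 * adverse + rng.normal(size=n)

b_total = ols(response, treatment)[1]                # (a) total treatment -> response effect
b_path_a = ols(adverse, treatment)[1]                # (b) treatment -> mediator path
b_direct = ols(response,
               np.column_stack([treatment, adverse]))[1]  # (c) effect adjusted for mediator

# Mediation would show b_direct shrinking toward zero relative to b_total;
# with a mediator independent of treatment, the two stay close - the pattern
# the abstract reports for adverse events.
print(round(b_total, 2), round(b_path_a, 2), round(b_direct, 2))
```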

  5. A multiprocessor system for the analysis of pictures of nuclear events

    CERN Document Server

    Bacilieri, P; Matteuzzi, P; Sini, G P; Zanotti, U

    1979-01-01

    The pictures of nuclear events obtained from the bubble chambers such as Gargamelle and BEBC at CERN, and others from Serpukhov, are geometrically processed at CNAF (Centro Nazionale Analisi Fotogrammi) in Bologna. The analysis system includes an Erasme table and a CRT flying spot digitizer. The difficulties connected with the pictures of the four stereoscopic views of the bubble chambers are overcome by the choice of a strongly interactive system. (0 refs).

  6. Working group of experts on rare events in human error analysis and quantification

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1977-01-01

    In dealing with the reference problem of rare events in nuclear power plants, the group has concerned itself with the man-machine system and, in particular, with human error analysis and quantification. The Group was requested to review methods of human reliability prediction, to evaluate the extent to which such analyses can be formalized and to establish criteria to be met by task conditions and system design which would permit a systematic, formal analysis. Recommendations are given on the Fessenheim safety system

  7. Identification of homogeneous regions for rainfall regional frequency analysis considering typhoon event in South Korea

    Science.gov (United States)

    Heo, J. H.; Ahn, H.; Kjeldsen, T. R.

    2017-12-01

    South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. Traditionally, however, regional frequency analysis models did not consider this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites, combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events occurring only in some years (typhoon). The available annual maximum 24-hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum is from either (1) a typhoon event, or (2) a non-typhoon event. A three-parameter GEV distribution was then fitted to each sub-sample, along with a weighting parameter characterizing the proportion of historical events associated with typhoons. Spatial patterns of the model parameters showed that typhoon events are less commonly associated with annual maximum rainfall in the north-western part of the country (Seoul area), and more prevalent in the southern and eastern parts, leading to the formation of two distinct typhoon regions: (1) North-West; and (2) Southern and Eastern. Using a leave-one-out procedure, the new regional frequency model was tested and compared to a more traditional index flood method. The results showed that the impact of typhoons on design events might previously have been underestimated in the Seoul area. This suggests that the mixture model should be preferred where the typhoon phenomenon is less frequent and can thus have a significant effect on the rainfall-frequency curve. This research was supported by a grant (2017-MPSS31
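    A minimal sketch of such a two-component mixture inverts the mixed annual-maximum CDF for a design return level. The GEV parameters and typhoon-year weight below are hypothetical placeholders, not the study's fitted values:

```python
from scipy.stats import genextreme

# Hypothetical GEV parameters (scipy convention: c = -shape) for the two sub-populations.
gev_typhoon = dict(c=-0.2, loc=180.0, scale=60.0)   # heavier-tailed typhoon years
gev_monsoon = dict(c=-0.1, loc=120.0, scale=30.0)   # regular monsoon years
p_typhoon = 0.4  # fraction of years whose annual maximum comes from a typhoon

def mixture_cdf(x):
    """Annual-maximum non-exceedance probability under the two-component mixture."""
    return (p_typhoon * genextreme.cdf(x, **gev_typhoon)
            + (1 - p_typhoon) * genextreme.cdf(x, **gev_monsoon))

def return_level(T, lo=0.0, hi=5000.0, tol=1e-6):
    """Invert the (monotone) mixture CDF by bisection for the T-year return level."""
    target = 1 - 1 / T
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mixture_cdf(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"100-year design rainfall: {return_level(100):.1f} mm")
```

    The design choice here is that the mixture is over years, so the annual-maximum CDF is a probability-weighted sum of the two fitted GEV CDFs rather than a single distribution fitted to the pooled sample.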

  8. An Initiating-Event Analysis for PSA of Hanul Units 3 and 4: Results and Insights

    International Nuclear Information System (INIS)

    Kim, Dong-San; Park, Jin Hee

    2015-01-01

    As a part of the PSA for the Hanul units 3 and 4, an initiating-event (IE) analysis was newly performed considering the current state of knowledge and the requirements of the ASME/ANS probabilistic risk assessment (PRA) standard related to IE analysis. This paper describes the methods, results, and some insights from that analysis. In comparison with the previous IE analysis, this study performed a more systematic and detailed identification of potential initiating events, and calculated the IE frequencies using state-of-the-art methods and the latest data. As a result, quite a few IE frequencies differ markedly from the previous values, which can change the major accident sequences obtained from the quantification of the PSA model

  9. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Science.gov (United States)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
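    The inlier/outlier idea in this record, where the dominant structure of an affinity matrix captures the background process and leaves anomalous subsequences with small eigenvector components, can be sketched on toy feature vectors. Everything below is synthetic and schematic; the paper's statistical subsequence models and confidence measure are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy "feature time series": 40 subsequences of 16-dim features, mostly background.
segments = rng.normal(0.0, 1.0, size=(40, 16))
segments[[5, 21]] += 6.0  # two anomalous subsequences ("highlight" candidates)

# Affinity between subsequences via a Gaussian kernel on Euclidean distance.
d = np.linalg.norm(segments[:, None, :] - segments[None, :, :], axis=-1)
A = np.exp(-(d ** 2) / (2 * np.median(d) ** 2))

# Dominant eigenvector of the affinity matrix: members of the background
# process get large, similar components; outliers get near-zero components.
w, v = np.linalg.eigh(A)
score = np.abs(v[:, -1])
outliers = np.argsort(score)[:2]
print(sorted(outliers.tolist()))  # → [5, 21]
```

    Ranking all subsequences by ascending `score` gives the departure-from-background ordering that the framework uses to generate summaries of any desired length.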

  10. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated over a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we used the concept of the rain event. Indeed, the discrete and intermittent nature of rain processes makes some features inadequate when defined over a fixed duration: long integration times (hour, day) mix rainy and clear-air periods in the same sample, while short integration times (seconds, minutes) lead to noisy data with a great sensitivity to detector characteristics. Analysing whole rain events instead of individual samples of a fixed duration clarifies relationships between features, in particular between macrophysical and microphysical ones. This approach partly suppresses the intra-event variability due to measurement uncertainties and allows focusing on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) was developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of the SOM is justified by the fact that it maps a high-dimensional data space into a two-dimensional space, in an unsupervised way, while preserving as much as possible the topology of the initial space. The obtained SOM provides the dependencies between variables and consequently allows removing redundant variables, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the standard deviation of the event rain rate and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analyzed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature. This confirms
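    The event-based decomposition can be illustrated with a simple dry-gap segmentation that also computes three of the five features named in the abstract (duration, peak, depth). The wet/dry threshold, gap length, and rain series below are illustrative assumptions; the paper's GA/SOM feature selection is not reproduced:

```python
import numpy as np

def split_into_events(rain, min_gap=30):
    """Split a 1-minute rain-rate series (mm/h) into events separated by at
    least `min_gap` consecutive dry minutes; return per-event features."""
    wet = np.flatnonzero(rain > 0)
    if wet.size == 0:
        return []
    # Start a new event wherever the gap between wet minutes exceeds min_gap.
    breaks = np.flatnonzero(np.diff(wet) > min_gap)
    starts = np.r_[wet[0], wet[breaks + 1]]
    ends = np.r_[wet[breaks], wet[-1]]
    events = []
    for s, e in zip(starts, ends):
        r = rain[s:e + 1]
        events.append({
            "duration_min": int(e - s + 1),
            "peak_mm_h": float(r.max()),
            "depth_mm": float(r.sum() / 60.0),  # mm/h at 1-min steps -> mm
        })
    return events

rain = np.zeros(200)
rain[10:40] = 5.0      # a 30-min moderate event
rain[120:130] = 20.0   # a shorter but more intense event
print(split_into_events(rain))
```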

  11. Disruptive event uncertainties in a perturbation approach to nuclear waste repository risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, T.F.

    1980-09-01

    A methodology is developed for incorporating a full range of the principal forecasting uncertainties into a risk analysis of a nuclear waste repository. The result of this methodology is a set of risk curves similar to those used by Rasmussen in WASH-1400. The set of curves is partially derived from a perturbation approach to analyze potential disruptive event sequences. Such a scheme could be useful in truncating the number of disruptive event scenarios and providing guidance to those establishing data-base development priorities.

  12. Analysis of transverse momentum and event shape in νN scattering

    International Nuclear Information System (INIS)

    Bosetti, P.C.; Graessler, H.; Lanske, D.; Schulte, R.; Schultze, K.; Simopoulou, E.; Vayaki, A.; Barnham, K.W.J.; Hamisi, F.; Miller, D.B.; Mobayyen, M.M.; Wainstein, S.; Aderholz, M.; Hantke, D.; Hoffmann, E.; Katz, U.F.; Kern, J.; Schmitz, N.; Wittek, W.; Albajar, C.; Batley, J.R.; Myatt, G.; Perkins, D.H.; Radojicic, D.; Renton, P.; Saitta, S.; Bullock, F.W.; Burke, S.

    1990-01-01

    The transverse momentum distributions of hadrons produced in neutrino-nucleon charged current interactions and their dependence on W are analysed in detail. It is found that the components of the transverse momentum in the event plane and normal to it increase with W at about the same rate throughout the available W range. A comparison with e+e- data is made. Studies of the energy flow and angular distributions in the events classified as planar do not show clear evidence for high energy, wide angle gluon radiation, in contrast to the conclusion of a previous analysis of similar neutrino data. (orig.)

  13. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for the analysis of external events in the PSA of the Jose Cabrera NPP. In the fire analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained that are applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The flood analysis methodology is based on the construction of event trees to represent flood propagation dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood level in an area, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the fire and flood analyses a quantification methodology has been adopted which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the internal events models, the gates, basic events or headers to which failure due to damage (probability 1) is assigned. (Author)
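    The event-tree treatment of flood propagation through communication paths reduces to multiplying branch probabilities along each path. The two-barrier tree and all numbers below are hypothetical, purely to illustrate the quantification step:

```python
# Hypothetical flood-propagation event tree: a flood starting in area A spreads
# to area B if door D1 fails to hold, and on to area C if D2 also fails.
p_init = 1e-3     # frequency of the initiating flood in area A (per year)
p_d1_open = 0.1   # probability that barrier D1 is open/failed on demand
p_d2_open = 0.2   # probability that barrier D2 is open/failed on demand

sequences = {
    "confined_to_A": p_init * (1 - p_d1_open),
    "reaches_B":     p_init * p_d1_open * (1 - p_d2_open),
    "reaches_C":     p_init * p_d1_open * p_d2_open,
}
total = sum(sequences.values())
# The sequence frequencies partition the initiating-event frequency.
print(sequences, total)
```

    Each end state would then be mapped onto the internal-events model by setting the corresponding damaged equipment to failed (probability 1), as the abstract describes.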

  14. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability to compare log files.
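    The time-and-event schema described, executing events in time order until the queue is emptied, reduces to a small priority-queue loop. This is a generic sketch of that pattern, not the disclosed tool's implementation; the component names are invented:

```python
import heapq

class Simulator:
    """Minimal discrete-event loop: pop the earliest event, run it, repeat."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker for events scheduled at the same time

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:  # run until the event queue is emptied
            self.now, _, action = heapq.heappop(self._queue)
            action()

log = []
sim = Simulator()
sim.schedule(2.0, lambda: log.append(("valve_open", sim.now)))
sim.schedule(1.0, lambda: (log.append(("pump_on", sim.now)),
                           sim.schedule(3.0, lambda: log.append(("pump_off", sim.now)))))
sim.run()
print(log)  # events fire in time order: pump_on@1, valve_open@2, pump_off@4
```

    Actions may schedule further events (here, `pump_on` schedules `pump_off`), which is how discretely defined effect statements and time delays approximate continuous behavior.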

  15. Internal event analysis for Laguna Verde Unit 1 Nuclear Power Plant. Accident sequence quantification and results

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1994-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the 'Internal Event Analysis for Laguna Verde Unit 1 Nuclear Power Plant', CNSNS-TR 004, in five volumes. The reports are organized as follows: CNSNS-TR 004 Volume 1: Introduction and Methodology. CNSNS-TR 004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR 004 Volume 3: System Analysis. CNSNS-TR 004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR 004 Volume 5: Appendices A, B and C. This volume presents the development of the dependent failure analysis, the treatment of the support system dependencies, the identification of the shared-component dependencies, and the treatment of common cause failures. Also presented are the main human actions considered, along with the possible recovery actions included. The development of the data base, and the assumptions and limitations in the data base, are also described in this volume. The accident sequence quantification process and the resolution of the core vulnerable sequences are presented. The sources and treatment of uncertainties associated with failure rates, component unavailabilities, initiating event frequencies, and human error probabilities are also presented. Finally, the main results and conclusions of the Internal Event Analysis for the Laguna Verde Nuclear Power Plant are presented. The total core damage frequency calculated is 9.03x10^-5 per year for internal events. The most dominant accident sequences found are the transients involving the loss of offsite power, the station blackout accidents, and the anticipated transients without SCRAM (ATWS). (Author)

  16. Analysis methodology for the post-trip return to power steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-06-01

    An analysis of Steam Line Break (SLB) events that result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. The analysis methodology for a post-trip RTP SLB is quite different from that of a non-RTP SLB and is more difficult. Therefore, it is necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to a local heatup in the vicinity of the stuck CEA, and compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance: Departure from Nucleate Boiling Ratio (DNBR) and Linear Heat Generation Rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  17. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, de H.C.W.; Zwaan, L.; Timmermans, D.R.M.; Groenewegen, P.P.; Wagner, C.

    2009-01-01

    BACKGROUND: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  18. Analysis of unintended events in hospitals : inter-rater reliability of constructing causal trees and classifying root causes

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Groenewegen, P.P.; Timmermans, D.

    2009-01-01

    Background. Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  19. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes.

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Timmermans, D.; Groenewegen, P.; Wagner, C.

    2009-01-01

    Background: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  20. Antipsychotics, glycemic disorders, and life-threatening diabetic events: a Bayesian data-mining analysis of the FDA adverse event reporting system (1968-2004).

    Science.gov (United States)

    DuMouchel, William; Fram, David; Yang, Xionghu; Mahmoud, Ramy A; Grogg, Amy L; Engelhart, Luella; Ramaswamy, Krishnan

    2008-01-01

    This analysis compared diabetes-related adverse events associated with use of different antipsychotic agents. A disproportionality analysis of the US Food and Drug Administration (FDA) Adverse Event Reporting System (AERS) was performed. Data from the FDA postmarketing AERS database (1968 through first quarter 2004) were evaluated. Drugs studied included aripiprazole, clozapine, haloperidol, olanzapine, quetiapine, risperidone, and ziprasidone. Fourteen Medical Dictionary for Regulatory Activities (MedDRA) Primary Terms (MPTs) were chosen to identify diabetes-related adverse events; 3 groupings into higher-level descriptive categories were also studied. Three methods of measuring drug-event associations were used: proportional reporting ratio, the empirical Bayes data-mining algorithm known as the Multi-Item Gamma Poisson Shrinker, and logistic regression (LR) analysis. Quantitative measures of association strength, with corresponding confidence intervals, between drugs and specified adverse events were computed and graphed. Some of the LR analyses were repeated separately for reports from patients under and over 45 years of age. Differences in association strength were declared statistically significant if the corresponding 90% confidence intervals did not overlap. Association with various glycemic events differed for different drugs. On average, the rankings of association strength agreed with the following ordering: low association, ziprasidone, aripiprazole, haloperidol, and risperidone; medium association, quetiapine; and strong association, clozapine and olanzapine. The median rank correlation between the above ordering and the 17 sets of LR coefficients (1 set for each glycemic event) was 93%. Many of the disproportionality measures were significantly different across drugs, and ratios of disproportionality factors of 5 or more were frequently observed. There are consistent and substantial differences between atypical antipsychotic drugs in the
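    The proportional reporting ratio used above as one of the three association measures follows directly from a 2×2 table of spontaneous reports. A minimal sketch; the counts below are illustrative only, not figures from the AERS study:

```python
def proportional_reporting_ratio(a, b, c, d):
    """Proportional reporting ratio (PRR) for a drug-event pair.

    a: reports with the drug AND the event
    b: reports with the drug and any other event
    c: reports of the event with any other drug
    d: all remaining reports
    """
    rate_with_drug = a / (a + b)   # event share among the drug's reports
    rate_without = c / (c + d)     # event share among all other reports
    return rate_with_drug / rate_without

# Illustrative counts only -- not taken from the study:
prr = proportional_reporting_ratio(a=50, b=950, c=200, d=19800)
# prr ≈ 5: the event is reported about five times as often with this drug
```

    The empirical Bayes and logistic regression measures used in the study shrink or adjust this kind of raw disproportionality, which is why the paper compares all three.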

  1. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    Science.gov (United States)

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

    Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trial patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events, defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data were obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight different age groups to support cross-age group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years-old group was 31.41%, approximately 1.51 times higher (P=.04) than the young adult group (20-29 years) at 20.76%. The second-highest group is the 50-59 years-old group, with an incidence rate of 30.09%, significantly higher than the reference group. The adverse event diversity also increased with patient age. Clinical studies that recruited older patients (older than 40 years) were more likely to observe a diverse range of adverse events
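    The group-level comparison described above reduces to computing an incidence rate per age group and taking ratios against the reference group. A sketch with illustrative counts (the study's actual group totals are not reproduced here):

```python
def incidence_rate(affected, enrolled):
    """Adverse-event incidence rate for one age group, as a percentage."""
    return 100.0 * affected / enrolled

# Illustrative counts only, chosen to mirror the reported percentages:
groups = {"0-9": (3141, 10000), "20-29": (2076, 10000)}
rates = {g: incidence_rate(a, n) for g, (a, n) in groups.items()}

# Ratio of the child group to the reference (20-29) group:
ratio = rates["0-9"] / rates["20-29"]   # ≈ 1.51, matching the reported factor
```

    The study then applies t tests to these summarized rates; the sketch shows only the point estimates being compared.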

  2. Living with extreme weather events - perspectives from climatology, geomorphological analysis, chronicles and opinion polls

    Science.gov (United States)

    Auer, I.; Kirchengast, A.; Proske, H.

    2009-09-01

    The ongoing climate change debate focuses more and more on changing extreme events. Information on past events can be derived from a number of sources, such as instrumental data and residual impacts in the landscape, but also chronicles and people's memories. A project called "A Tale of Two Valleys" within the framework of the research program "proVision" allowed us to study past extreme events in two inner-alpine valleys from the sources mentioned before. Instrumental climate time series provided information for the past 200 years; however, great attention had to be given to the homogeneity of the series. To derive homogenized time series of selected climate change indices, methods such as HOCLIS and Vincent have been applied. Trend analyses of climate change indices inform about increases or decreases of extreme events. Traces of major geomorphodynamic processes of the past (e.g. rockfalls, landslides, debris flows) which were triggered or affected by extreme weather events are still apparent in the landscape and could be evaluated by geomorphological analysis using remote sensing and field data. Regional chronicles provided additional knowledge and covered longer periods back in time; however, compared to meteorological time series they contain a high degree of subjectivity, and intermittent recording cannot be ruled out. Finally, questionnaires and oral history complemented our picture of past extreme weather events. People were differently affected and have different memories of the events. The joint analysis of these four data sources showed agreement to some extent, but also some reasonable differences: meteorological data are point measurements only, sometimes with too coarse a temporal resolution. Due to land-use changes and improved constructional measures, the impact of an extreme meteorological event may be different today compared to earlier times.

  3. The logic of surveillance guidelines: an analysis of vaccine adverse event reports from an ontological perspective.

    Directory of Open Access Journals (Sweden)

    Mélanie Courtot

    Full Text Available BACKGROUND: When increased rates of adverse events following immunization are detected, regulatory action can be taken by public health agencies. However, to be interpreted, reports of adverse events must be encoded in a consistent way. Regulatory agencies rely on guidelines to help determine the diagnosis of the adverse events. Manual application of these guidelines is expensive, time consuming, and open to logical errors. Representing these guidelines in a format amenable to automated processing can make this process more efficient. METHODS AND FINDINGS: Using the Brighton anaphylaxis case definition, we show that existing clinical guidelines used as standards in pharmacovigilance can be logically encoded using a formal representation such as the Adverse Event Reporting Ontology we developed. We validated the classification of vaccine adverse event reports using the ontology against existing rule-based systems and a manually curated subset of the Vaccine Adverse Event Reporting System. However, we encountered a number of critical issues in the formulation and application of the clinical guidelines. We report these issues and the steps being taken to address them in current surveillance systems, and in the terminological standards in use. CONCLUSIONS: By standardizing and improving the reporting process, we were able to automate diagnosis confirmation. By allowing medical experts to prioritize reports, such a system can accelerate the identification of adverse reactions to vaccines and the response of regulatory agencies. This approach of combining ontology and semantic technologies can be used to improve other areas of vaccine adverse event report analysis and should inform both the design of clinical guidelines and how they are used in the future. AVAILABILITY: Sufficient material to reproduce our results is available, including documentation, ontology, code and datasets, at http://purl.obolibrary.org/obo/aero.

  4. Analysis of the Power oscillations event in Laguna Verde Nuclear Power Plant. Preliminary Report

    International Nuclear Information System (INIS)

    Gonzalez M, V.M.; Amador G, R.; Castillo, R.; Hernandez, J.L.

    1995-01-01

    The event that occurred at Unit 1 of Laguna Verde Nuclear Power Plant on January 24, 1995, is analyzed using the Ramona 3 B code. During this event, Unit 1 suffered power oscillations when operating just before the transfer to high-speed recirculation pumps. The phenomenon was detected promptly by the reactor operator, who shut the reactor down with a manual scram. The oscillations reached a maximum amplitude of 10.5% of nominal power, peak to peak, at a frequency of 0.5 Hz. Preliminary evaluations show that the event did not endanger fuel integrity. The results of simulating the reactor core with the Ramona 3 B code show that the code is capable of modeling reactor oscillations. Nevertheless, a more detailed simulation of the event will be necessary to prove that the code can predict the onset of the oscillations. Additional analysis will also be needed to identify the factors that influence reactor stability, in order to make recommendations that avoid the recurrence of this kind of event. (Author)

  5. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    Science.gov (United States)

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  6. A hydrological analysis of the 4 November 2011 event in Genoa

    Directory of Open Access Journals (Sweden)

    F. Silvestro

    2012-09-01

    Full Text Available On 4 November 2011 a flash flood event hit the area of Genoa with dramatic consequences. Such an event represents, from the meteorological and hydrological perspective, a paradigm of flash floods in the Mediterranean environment.

    The hydro-meteorological probabilistic forecasting system for small and medium size catchments in use at the Civil Protection Centre of Liguria region exhibited excellent performances for the event, by predicting, 24–48 h in advance, the potential level of risk associated with the forecast. It greatly helped the decision makers in issuing a timely and correct alert.

    In this work we present the operational outputs of the system provided during the Liguria events and the post-event hydrological modelling analysis that has been carried out, also accounting for crowdsourced information and data. We discuss the benefit of the implemented probabilistic systems for decision-making under uncertainty, highlighting how, in this case, the multi-catchment approach used for predicting floods in small basins has been crucial.

  7. Accuracy analysis of measurements on a stable power-law distributed series of events

    International Nuclear Information System (INIS)

    Matthews, J O; Hopcraft, K I; Jakeman, E; Siviour, G B

    2006-01-01

    We investigate how finite measurement time limits the accuracy with which the parameters of a stably distributed random series of events can be determined. The model process is generated by timing the emigration of individuals from a population that is subject to deaths and a particular choice of multiple immigration events. This leads to a scale-free discrete random process where customary measures, such as mean value and variance, do not exist. However, converting the number of events occurring in fixed time intervals to a 1-bit 'clipped' process allows the construction of well-behaved statistics that still retain vestiges of the original power-law and fluctuation properties. These statistics include the clipped mean and correlation function, from measurements of which both the power-law index of the distribution of events and the time constant of its fluctuations can be deduced. We report here a theoretical analysis of the accuracy of measurements of the mean of the clipped process. This indicates that, for a fixed experiment time, the error on measurements of the sample mean is minimized by an optimum choice of the number of samples. It is shown furthermore that this choice is sensitive to the power-law index and that the approach to Poisson statistics is dominated by rare events or 'outliers'. Our results are supported by numerical simulation
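    The clipping step described above is simple to state: the count of events in each fixed interval is replaced by a single bit, from which well-behaved statistics can be formed. A minimal sketch; the stand-in data below are ordinary bounded random counts, not the heavy-tailed (power-law) series the paper actually studies:

```python
import random

def clip(counts):
    """1-bit 'clipped' process: 1 if any events fell in the interval, else 0."""
    return [1 if c > 0 else 0 for c in counts]

def clipped_mean(bits):
    return sum(bits) / len(bits)

def clipped_autocorr(bits, lag):
    """Sample autocorrelation of the clipped series at a given lag."""
    m = clipped_mean(bits)
    num = sum((bits[i] - m) * (bits[i + lag] - m) for i in range(len(bits) - lag))
    den = sum((b - m) ** 2 for b in bits)
    return num / den

# Illustrative stand-in data only:
random.seed(0)
counts = [random.randint(0, 3) for _ in range(1000)]
bits = clip(counts)
m = clipped_mean(bits)          # well-defined even when the raw mean is not
r1 = clipped_autocorr(bits, 1)
```

    In the paper, the power-law index and the fluctuation time constant are then deduced from exactly these clipped statistics; the sketch shows only their construction.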

  8. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure event derivation and analysis for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derived from actual non-ABWR digital I and C software failure events, which were reported to the LER of the USNRC or the IRS of the IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  9. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan human reliability analysis (IJS-HRA) and standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance
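    The dependency treatment being compared can be made concrete with the THERP-style conditional-probability formulas that SPAR-H applies once a dependence level has been assigned. A sketch only: the formulas below are the standard level equations, and the subjectivity the paper discusses enters precisely in choosing which level applies:

```python
# THERP-style conditional-probability formulas for the five dependence
# levels, as applied in SPAR-H (assignment of the level is the subjective step).
DEPENDENCE = {
    "zero":     lambda p: p,
    "low":      lambda p: (1 + 19 * p) / 20,
    "moderate": lambda p: (1 + 6 * p) / 7,
    "high":     lambda p: (1 + p) / 2,
    "complete": lambda p: 1.0,
}

def conditional_hep(nominal_hep, level):
    """Conditional human error probability given failure of a preceding task."""
    return DEPENDENCE[level](nominal_hep)

# A nominal HEP of 1e-3 grows sharply once dependence is credited:
heps = {lvl: conditional_hep(1e-3, lvl) for lvl in DEPENDENCE}
# zero -> 0.001, low -> ~0.051, moderate -> ~0.144, high -> ~0.5005, complete -> 1.0
```

    Two analysts assigning "low" versus "high" to the same pair of events thus differ by an order of magnitude in the resulting probability, which is the kind of subjectivity-driven spread the comparison quantifies.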

  10. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepis, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between the Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and the Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  11. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others

    2003-07-01

    This report provides guidance on conducting a Level 1 PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). The purpose of a Level 1 PSA is to delineate the accident sequences leading to core damage and to estimate their frequencies. It has been used directly for assessing and modifying system safety and reliability, as a key and basic part of PSA. A Level 1 PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level 1 PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report covers six major procedural steps for a Level 1 PSA: familiarization with the plant, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level 1 PSAs for NPPs. A particular aim is to promote a standardized framework, terminology, and form of documentation for PSAs. The report would also be useful for managers or regulatory persons involved in risk-informed regulation, and for conducting PSAs for other industries.
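    The final step listed, accident sequence quantification, reduces in its simplest point-estimate form to multiplying each initiating-event frequency by the failure probabilities along the event-tree branches and summing the core-damage sequences. A sketch under the usual rare-event and independence approximations; all numbers are illustrative, not from any plant PSA:

```python
def sequence_frequency(initiator_freq, failure_probs):
    """Frequency of one accident sequence: initiating-event frequency times
    the product of the branch failure probabilities (rare-event
    approximation, independence assumed)."""
    f = initiator_freq
    for p in failure_probs:
        f *= p
    return f

def core_damage_frequency(sequences):
    """Point-estimate CDF: the sum over core-damage sequence frequencies."""
    return sum(sequence_frequency(f0, ps) for f0, ps in sequences)

# Illustrative numbers only:
sequences = [
    (1e-1, [1e-3, 5e-2]),   # transient with two mitigating-system failures
    (1e-2, [2e-3]),         # LOCA-type initiator with one system failure
]
cdf = core_damage_frequency(sequences)   # ≈ 2.5e-5 per year
```

    A real quantification propagates fault-tree minimal cut sets, common-cause terms, and uncertainty distributions through this same structure; the sketch shows only the arithmetic backbone.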

  12. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    Science.gov (United States)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events such as sea level rise and its contribution to storm surges, extreme heat events and droughts, or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal forecast prediction systems such as the UK MetOffice GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in a world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations

  13. Cryogenic dark matter search (CDMS II): Application of neural networks and wavelets to event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Attisha, Michael J. [Brown U.

    2006-01-01

    The Cryogenic Dark Matter Search (CDMS) experiment is designed to search for dark matter in the form of Weakly Interacting Massive Particles (WIMPs) via their elastic scattering interactions with nuclei. This dissertation presents the CDMS detector technology and the commissioning of two towers of detectors at the deep underground site in Soudan, Minnesota. CDMS detectors comprise crystals of Ge and Si at temperatures of 20 mK which provide ~keV energy resolution and the ability to perform particle identification on an event-by-event basis. Event identification is performed via a two-fold interaction signature: an ionization response and an athermal phonon response. Photons and charged particles result in electron recoils in the crystal, while neutrons and WIMPs result in nuclear recoils. Since the ionization response is quenched by a factor of ~3 (~2) in Ge (Si) for nuclear recoils compared to electron recoils, the relative amplitude of the two detector responses allows discrimination between recoil types. The primary source of background events in CDMS arises from electron recoils in the outer 50 µm of the detector surface, which have a reduced ionization response. We develop a quantitative model of this 'dead layer' effect and successfully apply the model to Monte Carlo simulation of CDMS calibration data. Analysis of data from the two-tower run of March-August 2004 is performed, resulting in the world's most sensitive limits on the spin-independent WIMP-nucleon cross-section, with a 90% C.L. upper limit of 1.6 × 10⁻⁴³ cm² on Ge for a 60 GeV WIMP. An approach to performing surface event discrimination using neural networks and wavelets is developed. A Bayesian methodology for classifying surface events using neural networks is found to provide an optimized method based on minimization of the expected dark matter limit. The discrete wavelet analysis of CDMS phonon pulses improves surface event discrimination in conjunction with the neural

  14. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Science.gov (United States)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

    Indonesia is very vulnerable to flood disasters because it has high rainfall throughout the year. Floods are categorized as the most important hazard because they cause social, economic and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets in order to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) that occurred before flood disasters in areas with monsoonal, equatorial and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall data at 0.05° spatial resolution and daily temporal resolution, the GSMap satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  15. Brief communication: Post-event analysis of loss of life due to hurricane Harvey

    OpenAIRE

    Jonkman, Sebastiaan N.; Godfroy, Maartje; Sebastian, Antonia; Kolen, Bas

    2018-01-01

    An analysis was made of the loss of life directly caused by hurricane Harvey. Information was collected for 70 fatalities that occurred directly due to the event. Most of the fatalities occurred in the greater Houston area, which was most severely affected by extreme rainfall and heavy flooding. The majority of fatalities in this area were recovered outside the designated 100 and 500 year flood zones. Most fatalities occurred due to drowning (81 %), particularly in and around vehicles...

  16. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    Science.gov (United States)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
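    The cross-spectral idea underlying the abstract can be sketched on binned event trains: transform both series and multiply one spectrum by the conjugate of the other, so shared periodicity shows up as cross-spectral power. This is only the generic construction; the paper's contribution is specifically to respect the event-like character of the heartbeat rather than rely on fixed-bin series like these:

```python
import cmath

def dft(s):
    """Plain discrete Fourier transform (fine for short illustrative series)."""
    n = len(s)
    return [sum(s[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def cross_spectrum(x, y):
    """Cross-spectrum of two equal-length binned event series: X[k] * conj(Y[k])."""
    fx, fy = dft(x), dft(y)
    return [a * b.conjugate() for a, b in zip(fx, fy)]

# Two in-phase 0/1 event trains with a common period of 2 bins share power
# at that frequency (index k = n/2 here), and none elsewhere:
x = [1, 0, 1, 0, 1, 0, 1, 0]
y = [1, 0, 1, 0, 1, 0, 1, 0]
spec = cross_spectrum(x, y)
```

    The phase of each cross-spectral value then carries the relative timing of the two trains, which is what lets modulation and synchronization be distinguished.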

  17. SHAREHOLDERS VALUE AND CATASTROPHE BONDS. AN EVENT STUDY ANALYSIS AT EUROPEAN LEVEL

    OpenAIRE

    Constantin, Laura-Gabriela; Cernat-Gruici, Bogdan; Lupu, Radu; Nadotti Loris, Lino Maria

    2015-01-01

    Considering that E.U.-based (re)insurance companies are increasingly active within the alternative risk transfer market, the aim of the present paper is to emphasize the impact of issuing cat bonds on shareholders' value, highlighting the competitive advantages of the analysed (re)insurance companies as they pursue the consolidation of their resilience in a turbulent economic environment. Eminently an applicative research, the analysis employs an event study methodology w...

  18. FINANCIAL MARKET REACTIONS TO INTERNATIONAL MERGERS & ACQUISITIONS IN THE BREWING INDUSTRY: AN EVENT STUDY ANALYSIS

    OpenAIRE

    Heyder, Matthias; Ebneth, Oliver; Theuvsen, Ludwig

    2008-01-01

    Cross-border acquisitions have been the growing trend in recent years in the world brewing industry, giving brewers the opportunity to enhance their degree of internationalization and market share remarkably. This study employs event study analysis to examine 31 mergers and acquisitions among leading European brewing groups. Differences regarding financial market reactions can be determined within the European peer group. Managerial implications as well as future research propositions conclud...
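    The event study analysis named in both abstracts above conventionally rests on a market model fitted over an estimation window, with the cumulative abnormal return (CAR) summed over the event window. A minimal sketch of that standard machinery; all return figures are illustrative, not data from either study:

```python
def market_model(stock, market):
    """OLS fit of stock returns on market returns: r = alpha + beta * r_m."""
    n = len(stock)
    mr, sr = sum(market) / n, sum(stock) / n
    beta = (sum((m - mr) * (s - sr) for m, s in zip(market, stock))
            / sum((m - mr) ** 2 for m in market))
    return sr - beta * mr, beta          # (alpha, beta)

def cumulative_abnormal_return(stock, market, alpha, beta):
    """CAR over a window: sum of actual minus model-predicted returns."""
    return sum(s - (alpha + beta * m) for s, m in zip(stock, market))

# Illustrative daily returns only (no real M&A data):
est_stock  = [0.010, -0.020, 0.015, 0.000, 0.005]   # estimation window
est_market = [0.008, -0.018, 0.012, 0.001, 0.004]
alpha, beta = market_model(est_stock, est_market)

# CAR over a two-day event window around the announcement:
car = cumulative_abnormal_return([0.030, -0.010], [0.005, -0.002], alpha, beta)
```

    A positive CAR around the announcement is then read as the financial market rewarding the deal; the studies' significance tests sit on top of exactly this quantity.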

  19. Neural network approach in multichannel auditory event-related potential analysis.

    Science.gov (United States)

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.

  20. Advanced reactor passive system reliability demonstration analysis for an external event

    International Nuclear Information System (INIS)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2017-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  1. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    Directory of Open Access Journals (Sweden)

    Matthew Bucknor

    2017-03-01

    Full Text Available Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  2. Advanced reactor passive system reliability demonstration analysis for an external event

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)

    2017-03-15

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  3. Analysis and Modelling of Taste and Odour Events in a Shallow Subtropical Reservoir

    Directory of Open Access Journals (Sweden)

    Edoardo Bertone

    2016-08-01

    Full Text Available Understanding and predicting Taste and Odour events is as difficult as it is critical for drinking water treatment plants. Following a number of events in recent years, a comprehensive statistical analysis of data from Lake Tingalpa (Queensland, Australia) was conducted. Historical manual sampling data, as well as data remotely collected by a vertical profiler, were collected; regression analysis and self-organising maps were then used to determine correlations between Taste and Odour compounds and potential input variables. Results showed that the predominant Taste and Odour compound was geosmin. Although one of the main predictors was the occurrence of cyanobacteria blooms, it was noticed that the cyanobacteria species was also critical. Additionally, water temperature, reservoir volume and oxidised nitrogen availability were key inputs determining the occurrence and magnitude of the geosmin peak events. Based on the results of the statistical analysis, a predictive regression model was developed to provide indications on the potential occurrence, and magnitude, of peaks in geosmin concentration. Additionally, it was found that the blue-green algae probe of the lake’s vertical profiler has the potential to be used as one of the inputs for an automated geosmin early warning system.
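    The kind of multi-predictor regression described above can be sketched as follows. The predictors match those named in the abstract (water temperature, reservoir volume, oxidised nitrogen), but the data and coefficients are synthetic placeholders, not the authors' fitted model:

    ```python
    import numpy as np

    # Synthetic illustration of a geosmin regression: peak concentration
    # modelled from water temperature, reservoir volume and oxidised
    # nitrogen. All numbers below are invented for the sketch.
    rng = np.random.default_rng(42)
    n = 200
    temp = rng.uniform(15, 30, n)      # water temperature (deg C)
    volume = rng.uniform(0.5, 1.0, n)  # relative reservoir volume
    nox = rng.uniform(0.0, 0.5, n)     # oxidised nitrogen (mg/L)

    true_beta = np.array([5.0, 1.2, -8.0, -6.0])  # intercept, temp, volume, NOx
    X = np.column_stack([np.ones(n), temp, volume, nox])
    geosmin = X @ true_beta + rng.normal(0, 0.5, n)

    # Ordinary least squares recovers the coefficients from noisy data
    beta_hat, *_ = np.linalg.lstsq(X, geosmin, rcond=None)
    print(beta_hat)
    ```

    With enough samples relative to the noise level, the fitted coefficients track the generating ones, which is the basis for using such a model as an early-warning predictor.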

  4. RELAP5/MOD 3.3 analysis of Reactor Coolant Pump Trip event at NPP Krsko

    International Nuclear Information System (INIS)

    Bencik, V.; Debrecin, N.; Foretic, D.

    2003-01-01

    In the paper the results of the RELAP5/MOD 3.3 analysis of the Reactor Coolant Pump (RCP) Trip event at NPP Krsko are presented. The event was initiated by an operator action aimed to prevent the RCP 2 bearing damage. The action consisted of a power reduction, that lasted for 50 minutes, followed by a reactor and a subsequent RCP 2 trip when the reactor power was reduced to 28 %. Two minutes after reactor trip, the Main Steam Isolation Valves (MSIV) were isolated and the steam dump flow was closed. On the secondary side the Steam Generator (SG) pressure rose until SG 1 Safety Valve (SV) 1 opened. The realistic RELAP5/MOD 3.3 analysis has been performed in order to model the particular plant behavior caused by operator actions. The comparison of the RELAP5/MOD 3.3 results with the measurement for the power reduction transient has shown small differences for the major parameters (nuclear power, average temperature, secondary pressure). The main trends and physical phenomena following the RCP Trip event were well reproduced in the analysis. The parameters that have the major influence on transient results have been identified. In the paper the influence of SG 1 relief and SV valves on transient results was investigated more closely. (author)

  5. The January 2001, El Salvador event: a multi-data analysis

    Science.gov (United States)

    Vallee, M.; Bouchon, M.; Schwartz, S. Y.

    2001-12-01

    On January 13, 2001, a large normal-faulting event (Mw=7.6) occurred 100 kilometers from the Salvadorian coast (Central America) with a centroid depth of about 50 km. The size of this event is surprising given the classical idea that such events should be much weaker than thrust events in subduction zones. We analysed this earthquake with different types of data: because teleseismic waves are the only data offering good azimuthal coverage, we first built a kinematic source model with P and SH waves provided by the IRIS-GEOSCOPE networks. The ambiguity between the 30° plane (dipping toward the Pacific Ocean) and the 60° plane (dipping toward Central America) led us to analyse the two possible planes in parallel. We used a simple point-source model to define the main characteristics of the event and then an extended source to retrieve the kinematic features of the rupture. For the two possible planes, this analysis reveals downdip and northwestward rupture propagation, but the difference in fit remains subtle even with the extended source. In a second part we confronted our models for the two planes with other seismological data: (1) regional data, (2) surface wave data through an Empirical Green Function given by a similar but much weaker earthquake that occurred in July 1996, and (3) near-field data provided by Universidad Centroamericana (UCA) and Centro de Investigationes Geotecnicas (CIG). Regional data do not discriminate between the two planes either, but surface waves and especially near-field data confirm that the fault plane is the steeper one, dipping toward Central America. Moreover, the slight directivity toward the north is confirmed by surface waves.

  6. Analysis on Outcome of 3537 Patients with Coronary Artery Disease: Integrative Medicine for Cardiovascular Events

    Directory of Open Access Journals (Sweden)

    Zhu-ye Gao

    2013-01-01

    Full Text Available Aims. To investigate the treatment of hospitalized patients with coronary artery disease (CAD) and the prognostic factors in Beijing, China. Materials and Methods. A multicenter prospective study was conducted through an integrative platform of clinical care and research at 12 hospitals in Beijing, China. The clinical information of 3537 hospitalized patients with CAD was collected from September 2009 to May 2011, and the efficacy of secondary prevention during one-year follow-up was evaluated. In addition, a logistic regression analysis was performed to identify factors with an independent impact on prognosis. Results. The average age of all patients was 64.88 ± 11.97 years; 65.42% were male. The medicines for patients were as follows: antiplatelet drugs accounting for 91.97%, statins accounting for 83.66%, β-receptor blockers accounting for 72.55%, ACEI/ARB accounting for 58.92%, and revascularization (including PCI and CABG) accounting for 40.29%. The overall incidence of cardiovascular events was 13.26% (469/3537). The logistic stepwise regression analysis showed that heart failure (OR, 3.707, 95% CI = 2.756–4.986), age ≥ 65 years (OR, 2.007, 95% CI = 1.587–2.53), and myocardial infarction (OR, 1.649, 95% CI = 1.322–2.057) were independent risk factors for cardiovascular events occurring during the one-year follow-up. Integrative medicine (IM) therapy showed a beneficial tendency toward decreasing the incidence of cardiovascular events, although no statistical significance was found (OR, 0.797, 95% CI = 0.613–1.036). Conclusions. Heart failure, age ≥ 65 years, and myocardial infarction were associated with an increased incidence of cardiovascular events, and treatment with IM showed a tendency toward decreasing the incidence of cardiovascular events.
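    Reported odds ratios and their Wald 95% confidence intervals are linked on the log scale by beta = ln(OR) and CI = exp(beta ± 1.96·SE). A generic back-calculation (not the authors' code) using the heart-failure figures from the abstract above shows the consistency:

    ```python
    import math

    # Reported values: heart failure OR 3.707, 95% CI 2.756-4.986
    or_hat, ci_low, ci_high = 3.707, 2.756, 4.986

    beta = math.log(or_hat)  # logistic regression coefficient
    # SE recovered from the CI width on the log scale
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

    # Rebuilding the interval from beta and se reproduces the reported CI
    low = math.exp(beta - 1.96 * se)
    high = math.exp(beta + 1.96 * se)
    print(round(low, 3), round(high, 3))  # → 2.756 4.986
    ```

    The same check applies to the other odds ratios in the abstract; it is a quick way to spot transcription errors in reported intervals.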

  7. Probabilistic Dynamics for Integrated Analysis of Accident Sequences considering Uncertain Events

    Directory of Open Access Journals (Sweden)

    Robertas Alzbutas

    2015-01-01

    Full Text Available Analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. In currently used probabilistic safety assessment this is a limitation: the lack of treatment of dynamic interactions between the physical processes on one hand and random events on the other leads to limited assessments. In general, many mathematical modelling theories can be used separately or in an integrated way to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach of non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test case simulation in view of various applications of the methods to severe accident scenario simulation and uncertainty analysis. For this, and for a wider analysis of accident sequences, the initial test case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and representation of their uncertainty. The developed approach of accident modelling and analysis can be efficiently used to estimate the reliability of hybrid systems and, at the same time, to analyze and possibly decrease the uncertainty of this estimate.

  8. Procedure proposed for performance of a probabilistic safety analysis for the event of ''Air plane crash''

    International Nuclear Information System (INIS)

    Hoffmann, H.H.

    1998-01-01

    A procedures guide for a probabilistic safety analysis of the external event 'Air plane crash' has been prepared. The method is based on analyses done within the framework of PSAs for German NPPs as well as on international documents. Both crashes of military air planes and crashes of commercial air planes contribute to the plant risk. For the determination of the plant-related crash rate, the air traffic is divided into 3 categories: - the landing and takeoff phase, - the air-lane and holding (waiting loop) traffic, - the free air traffic; and the air planes are divided into different types and weight classes. (orig./GL) [de

  9. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    Science.gov (United States)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents, both for extreme and ordinary flood events, is a fundamental step in understanding the process, but also in making proper decisions on protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with the associated sediment-graph, of a flood that on the 14th of October, 2014 caused the formation of a little debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to notice that a great part of the equations developed for the computation of the bedload transport capacity, like for example that of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events heavily affecting the river-bed armour. These formulas do not provide reliable results if used on events, like the one under analysis, not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations to assess the sediment transport capacity overestimate the total volume of the event by several orders of magnitude. As a consequence, the following experimental bedload transport equation has been applied (D'Agostino and Lenzi, 1999), which is valid for ordinary flood events (q: unit water discharge; qc: unit discharge of bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs = 0.04 (q - qc) S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above-cited equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
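    The D'Agostino-Lenzi relation can be transcribed directly from the variable definitions in the abstract. A minimal sketch, assuming SI units and zero transport below the initiation threshold:

    ```python
    def bedload_rate(q, qc, slope):
        """Unit bedload rate qs = 0.04 * (q - qc) * S**1.5
        (D'Agostino and Lenzi, 1999), valid for ordinary flood events.

        q, qc: unit water discharge and threshold discharge for bedload
               initiation (assumed m^2/s); slope: thalweg slope S
               (dimensionless). Returns 0 below the threshold (assumption).
        """
        if q <= qc:
            return 0.0
        return 0.04 * (q - qc) * slope ** 1.5

    # Illustrative (not observed) values: q = 2.0, qc = 0.5, S = 0.10
    print(bedload_rate(2.0, 0.5, 0.10))
    ```

    Integrating this rate over the hydrograph, as the study does, yields the event's total transported volume for comparison with the surveyed deposit.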

  10. Cost analysis of adverse events associated with non-small cell lung cancer management in France

    Directory of Open Access Journals (Sweden)

    Chouaid C

    2017-07-01

    , anemia (€5,752 per event), dehydration (€5,207 per event) and anorexia (€4,349 per event). Costs were mostly driven by hospitalization costs. Conclusion: Among the AEs identified, a majority appeared to have an important economic impact, with a management cost of at least €2,000 per event, mainly driven by hospitalization costs. This study may be of interest for economic evaluations of new interventions in NSCLC. Keywords: non-small cell lung cancer, adverse events, cost analysis, chemotherapy, immunotherapy

  11. Analysis and modeling of a hail event consequences on a building portfolio

    Science.gov (United States)

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

    North-West Switzerland was affected by a severe hailstorm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 cantons (out of 26) with a public insurance. The aim of this project is to benefit from the collected insurance data to better understand and estimate the risk of such an event. In a first step, a simple hail event simulator, which had been developed for a previous hail episode, is modified: the geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians instead of 1D Gaussians on profiles, as was the case in the previous version. The tool is then tested on this new event in order to establish its ability to give a fast damage estimation based on the radar image and on building values and locations. In a further step, the geometrical properties are used to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, it is expected that the orientation of the roof influences the aging of the material by controlling the frequency and amplitude of thaw-freeze cycles, thus changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures. This information is then compared with insurance claims. 
A last part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the
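    The idea of summarizing a radar intensity field with Gaussian parameters can be illustrated with a single 2D Gaussian whose centre and widths are recovered from image moments. The grid and parameters below are synthetic; this is not the simulator developed in the study:

    ```python
    import numpy as np

    # Synthetic "radar image": one 2D Gaussian cell on a 100x100 grid
    x, y = np.meshgrid(np.arange(100.0), np.arange(100.0))
    cx, cy, sx, sy = 60.0, 35.0, 8.0, 5.0
    image = np.exp(-((x - cx) ** 2 / (2 * sx ** 2)
                     + (y - cy) ** 2 / (2 * sy ** 2)))

    # Moment-based recovery of the geometric properties
    w = image / image.sum()                      # normalized weights
    cx_hat = (w * x).sum()                       # centroid
    cy_hat = (w * y).sum()
    sx_hat = np.sqrt((w * (x - cx_hat) ** 2).sum())  # spreads
    sy_hat = np.sqrt((w * (y - cy_hat) ** 2).sum())
    print(cx_hat, cy_hat, sx_hat, sy_hat)
    ```

    Fitting a *set* of Gaussians, as the simulator does, requires an iterative optimizer, but the moment estimates above are a common starting point.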

  12. Shock events and flood risk management: a media analysis of the institutional long-term effects of flood events in the Netherlands and Poland

    Directory of Open Access Journals (Sweden)

    Maria Kaufmann

    2016-12-01

    Full Text Available Flood events that have proven to create shock waves in society, which we will call shock events, can open windows of opportunity that allow different actor groups to introduce new ideas. Shock events, however, can also strengthen the status quo. We will take flood events as our object of study. Whereas others focus mainly on the immediate impact and disaster management, we will focus on the long-term impact on and resilience of flood risk governance arrangements. Over the last 25 years, both the Netherlands and Poland have suffered several flood-related events. These triggered strategic and institutional changes, but to different degrees. In a comparative analysis these endogenous processes, i.e., the importance of framing of the flood event, its exploitation by different actor groups, and the extent to which arrangements are actually changing, are examined. In line with previous research, our analysis revealed that shock events test the capacity to resist and bounce back and provide opportunities for adapting and learning. They "open up" institutional arrangements and make them more susceptible to change, increasing the opportunity for adaptation. In this way they can facilitate a shift toward different degrees of resilience, i.e., by adjusting the current strategic approach or by moving toward another strategic approach. The direction of change is influenced by the actors and the frames they introduce, and their ability to increase the resonance of the frame. The persistence of change seems to be influenced by the evolution of the initial management approach, the availability of resources, or the willingness to allocate resources.

  13. Prehospital Interventions During Mass-Casualty Events in Afghanistan: A Case Analysis.

    Science.gov (United States)

    Schauer, Steven G; April, Michael D; Simon, Erica; Maddry, Joseph K; Carter, Robert; Delorenzo, Robert A

    2017-08-01

    Mass-casualty (MASCAL) events are known to occur in the combat setting. There are very limited data at this time from the Joint Theater (Iraq and Afghanistan) wars specific to MASCAL events. The purpose of this report was to provide preliminary data for the development of prehospital planning and guidelines. Cases were identified using the Department of Defense (DoD; Virginia USA) Trauma Registry (DoDTR) and the Prehospital Trauma Registry (PHTR). These cases were identified as part of a research study evaluating Tactical Combat Casualty Care (TCCC) guidelines. Cases that were designated as or associated with denoted MASCAL events were included. Data: Fifty subjects were identified during the course of this project. Explosives were the most common cause of injuries. There was a wide range of vital signs. Tourniquet placement and pressure dressings were the most common interventions, followed by analgesia administration. Oral transmucosal fentanyl citrate (OTFC) was the most common parenteral analgesic drug administered. Most were evacuated as "routine." Follow-up data were available for 36 of the subjects and 97% were discharged alive. The most common prehospital interventions were tourniquet and pressure dressing hemorrhage control, along with pain medication administration. Larger data sets are needed to guide development of MASCAL in-theater clinical practice guidelines. Schauer SG , April MD , Simon E , Maddry JK , Carter R III , Delorenzo RA . Prehospital interventions during mass-casualty events in Afghanistan: a case analysis. Prehosp Disaster Med. 2017;32(4):465-468.

  14. Psychiatric adverse events during treatment with brodalumab: Analysis of psoriasis clinical trials.

    Science.gov (United States)

    Lebwohl, Mark G; Papp, Kim A; Marangell, Lauren B; Koo, John; Blauvelt, Andrew; Gooderham, Melinda; Wu, Jashin J; Rastogi, Shipra; Harris, Susan; Pillai, Radhakrishnan; Israel, Robert J

    2018-01-01

    Individuals with psoriasis are at increased risk for psychiatric comorbidities, including suicidal ideation and behavior (SIB). To distinguish between the underlying risk and potential for treatment-induced psychiatric adverse events in patients with psoriasis being treated with brodalumab, a fully human anti-interleukin 17 receptor A monoclonal antibody. Data were evaluated from a placebo-controlled, phase 2 clinical trial; the open-label, long-term extension of the phase 2 clinical trial; and three phase 3, randomized, double-blind, controlled clinical trials (AMAGINE-1, AMAGINE-2, and AMAGINE-3) and their open-label, long-term extensions of patients with moderate-to-severe psoriasis. The analysis included 4464 patients with 9161.8 patient-years of brodalumab exposure. The follow-up time-adjusted incidence rates of SIB events were comparable between the brodalumab and ustekinumab groups throughout the 52-week controlled phases (0.20 vs 0.60 per 100 patient-years). In the brodalumab group, 4 completed suicides were reported, 1 of which was later adjudicated as indeterminate; all patients had underlying psychiatric disorders or stressors. There was no comparator arm past week 52. Controlled study periods were not powered to detect differences in rare events such as suicide. Comparison with controls and the timing of events do not indicate a causal relationship between SIB and brodalumab treatment. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  15. Revisiting Slow Slip Events Occurrence in Boso Peninsula, Japan, Combining GPS Data and Repeating Earthquakes Analysis

    Science.gov (United States)

    Gardonio, B.; Marsan, D.; Socquet, A.; Bouchon, M.; Jara, J.; Sun, Q.; Cotte, N.; Campillo, M.

    2018-02-01

    Slow slip events (SSEs) regularly occur near the Boso Peninsula, central Japan. Their recurrence time decreased from 6.4 to 2.2 years between 1996 and 2014. It is important to better constrain the slip history of this area, especially as models show that the recurrence intervals could become shorter prior to the occurrence of a large interplate earthquake nearby. We analyze the seismic waveforms of more than 2,900 events (M≥1.0) taking place in the Boso Peninsula, Japan, from 1 April 2004 to 4 November 2015, calculating the correlation and the coherence between each pair of events in order to define groups of repeating earthquakes. The cumulative number of repeating earthquakes suggests the existence of two slow slip events that have escaped detection so far. Small transient displacements observed in the time series of nearby GPS stations confirm these results. The detection scheme coupling repeating-earthquake and GPS analysis allows the detection of small SSEs that were not seen before by classical methods. This work brings new information on the diversity of SSEs and demonstrates that the SSEs in the Boso area present a more complex history than previously considered.
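    The grouping principle, assigning events with nearly identical waveforms to the same repeating family, can be sketched with normalized cross-correlation. The threshold and the synthetic "waveforms" below are illustrative only; the study additionally uses spectral coherence on real seismograms:

    ```python
    import numpy as np

    def norm_cc(a, b):
        """Zero-lag normalized cross-correlation (Pearson r) of two traces."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float(np.dot(a, b) / len(a))

    rng = np.random.default_rng(0)
    t = np.linspace(0, 5, 500)
    template = np.sin(np.linspace(0, 20 * np.pi, 500)) * np.exp(-t)
    events = [
        template + 0.05 * rng.standard_normal(500),  # repeat of the template
        template + 0.05 * rng.standard_normal(500),  # another repeat
        rng.standard_normal(500),                    # unrelated event
    ]

    # Greedy grouping: join an event to the first family whose
    # representative it correlates with above the threshold
    threshold = 0.9  # illustrative value
    groups = [[0]]
    for i in range(1, len(events)):
        for g in groups:
            if norm_cc(events[i], events[g[0]]) >= threshold:
                g.append(i)
                break
        else:
            groups.append([i])
    print(groups)
    ```

    The two noisy copies of the template end up in one family and the unrelated trace in its own; counting family members over time gives the cumulative curve used to infer slow slip episodes.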

  16. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    Science.gov (United States)

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary: Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may mislead about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials including the recent RV144 trial exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  17. Urbanization and fertility: an event-history analysis of coastal Ghana.

    Science.gov (United States)

    White, Michael J; Muhidin, Salut; Andrzejewski, Catherine; Tagoe, Eva; Knight, Rodney; Reed, Holly

    2008-11-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field.

  18. Development of time dependent safety analysis code for plasma anomaly events in fusion reactors

    International Nuclear Information System (INIS)

    Honda, Takuro; Okazaki, Takashi; Bartels, H.W.; Uckan, N.A.; Seki, Yasushi.

    1997-01-01

    A safety analysis code, SAFALY, has been developed to analyze plasma anomaly events in fusion reactors, e.g., a loss of plasma control. The code is a hybrid comprising zero-dimensional plasma dynamics and a one-dimensional thermal analysis of in-vessel components. The code evaluates the time evolution of plasma parameters and temperature distributions of in-vessel components. As the plasma-safety interface model, we proposed a robust plasma physics model taking into account updated data for safety assessment. For example, physics safety guidelines for the beta limit, density limit, H-L mode confinement transition threshold power, etc. are provided in the model. The in-vessel components are divided into twenty temperature regions in the poloidal direction, taking account of radiative heat transfer between the surfaces of the regions. The code can also describe coolant behavior under hydraulic accidents, using results from a hydraulics code, and treat vaporization (sublimation) from plasma-facing components (PFCs). Furthermore, the code includes a model of impurity transport from PFCs, using a transport probability and a time delay. Quantitative analysis based on the model is possible for a scenario of plasma passive shutdown. We examined the suitability of the code for safety analysis of plasma anomaly events in fusion reactors and expect that it will contribute to the safety analysis of the International Thermonuclear Experimental Reactor (ITER). (author)

  19. On Event/Time Triggered and Distributed Analysis of a WSN System for Event Detection, Using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Sofia Maria Dima

    2016-01-01

    Full Text Available Event detection in realistic WSN environments is a critical research domain, and environmental monitoring is one of its most pronounced applications. Although efforts related to environmental applications have been presented in the current literature, there is a significant lack of investigation into the performance of such systems when applied in wireless environments. Aiming to address this shortage, in this paper an advanced multimodal approach is followed based on fuzzy logic. The proposed fuzzy inference system (FIS) is implemented on TelosB motes and evaluates the probability of fire detection while aiming towards power conservation. In addition to a straightforward centralized approach, a distributed implementation of the above FIS is also proposed, aiming towards network congestion reduction while optimally distributing the energy consumption among network nodes so as to maximize network lifetime. Moreover, this work proposes an event-based execution of the aforementioned FIS, aiming to further reduce the computational as well as the communication cost compared to a periodic, time-triggered FIS execution. As a final contribution, performance metrics acquired from all the proposed FIS implementation techniques are thoroughly compared and analyzed with respect to critical network conditions, aiming to offer a realistic evaluation and thus the extraction of objective conclusions.

  20. Characterization of a Flood Event through a Sediment Analysis: The Tescio River Case Study

    Directory of Open Access Journals (Sweden)

    Silvia Di Francesco

    2016-07-01

    Full Text Available This paper presents the hydrological analysis and grain size characteristics of fluvial sediments in a river basin and their combination to characterize a flood event. The overall objective of the research is the development of a practical methodology based on experimental surveys to reconstruct the hydraulic history of ungauged river reaches on the basis of the modifications detected on the riverbed during the dry season. The grain size analysis of fluvial deposits usually requires considerable technical and economic effort, and traditional sieving based on physical sampling cannot adequately represent the spatial distribution of sediments in a wide area of a riverbed with a reasonable number of samples. The use of photographic sampling techniques, on the other hand, allows for the quick and effective determination of the grain size distribution in large river stretches, through the use of a digital camera and specific graphical algorithms. A photographic sampling is employed to characterize the riverbed in a 3 km ungauged reach of the Tescio River, a tributary of the Chiascio River, located in central Italy and representative of many rivers in the same geographical area. To this end, the particle size distribution is reconstructed through the analysis of digital pictures of the sediments taken on the riverbed in dry conditions. The sampling was performed after a flood event of known duration, which allowed the removal of the armor layer to be identified in one section along the river reach under investigation. The volume and composition of the eroded sediments made it possible to calculate the average flow rate associated with the flood event which caused the erosion, by means of sediment transport laws and the hydrological analysis of the river basin. A hydraulic analysis of the river stretch under investigation was employed to verify the validity of the proposed procedure.
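
    The back-calculation chain described above (mobilized grain size → critical flow depth → discharge) can be sketched with a generic Shields-type incipient-motion criterion and Manning's equation. This is a textbook-style illustration, not the paper's actual procedure; all numerical values below are invented.

```python
# Hedged sketch: estimate the flow depth that just mobilizes the coarsest
# transported grain (Shields criterion), then the discharge via Manning's
# equation for a wide rectangular section. Parameter values are invented.
RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81
THETA_C = 0.047                 # assumed critical Shields parameter

def depth_to_mobilize(d, slope):
    """Depth h at which bed shear tau = rho*g*h*S reaches critical for d."""
    return THETA_C * (RHO_S - RHO_W) * d / (RHO_W * slope)

def discharge(h, slope, width, n=0.04):
    """Manning's equation, wide-channel approximation (hydraulic radius ~ h)."""
    u = h ** (2.0 / 3.0) * slope ** 0.5 / n
    return u * h * width

d90 = 0.05                       # coarsest mobilized grain size (m), invented
slope = 0.004                    # bed slope, invented
h = depth_to_mobilize(d90, slope)       # flow depth during the flood (m)
Q = discharge(h, slope, width=15.0)     # average flood discharge (m3/s)
```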

  1. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    Science.gov (United States)

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive or intensified passive surveillance from 2007 to 2012, a descriptive analysis was performed: national and regional rates of YFV-associated neurotropic and neurological autoimmune disease, and reporting rate ratios with their respective 95% confidence intervals, were calculated for first-time vaccinees, stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND analyzed to date. Rates are similar to other recent studies, but in this study the age group from 5 to 9 years had the highest risk. As neurological adverse events in general have a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of risk of infection by yellow fever virus. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Analysis of Loss-of-Offsite-Power Events 1997-2015

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Nancy Ellen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.

  3. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    Science.gov (United States)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica, for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e., VERY HIGH, HIGH, MODERATE, etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
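
    Mechanically, an ETA reduces to multiplying conditional probabilities along each branch of the tree and mapping the product onto a risk-classification scale. A minimal sketch, with made-up branch probabilities and a hypothetical classification scale (not the study's actual numbers):

```python
# Illustrative event tree mechanics: the branch probability is the product
# of the conditional probabilities along the path, then binned into
# linguistic risk classes. All numbers are invented, not Arenal data.

def branch_probability(path):
    p = 1.0
    for prob in path:
        p *= prob
    return p

def risk_class(p):
    # Hypothetical classification scale (annual probability of death).
    if p >= 1e-3: return "VERY HIGH"
    if p >= 1e-4: return "HIGH"
    if p >= 1e-5: return "MODERATE"
    if p >= 1e-6: return "LOW"
    return "VERY LOW"

# ATPF occurs -> flow reaches zone -> person present -> person killed
path = [0.1, 0.2, 0.05, 0.5]
p = branch_probability(path)
```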

  4. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    Science.gov (United States)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that their further development is necessary to continue improving society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. To segment moving objects, only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.

  5. Analysis of mutual events of Galilean satellites observed from VBO during 2014-2015

    Science.gov (United States)

    Vasundhara, R.; Selvakumar, G.; Anbazhagan, P.

    2017-06-01

    Results of analysis of 23 events of the 2014-2015 mutual event series from the Vainu Bappu Observatory are presented. Our intensity distribution model for the eclipsed/occulted satellite is based on the criterion that it simulates a rotational light curve that matches the ground-based light curve. Dichotomy in the scattering characteristics of the leading and trailing sides explains the basic shape of the rotational light curves of Europa, Ganymede and Callisto. In the case of Io, the albedo map (courtesy United States Geological Survey) along with global values of scattering parameters works well. Mean values of residuals in (O - C) along and perpendicular to the track are found to be -3.3 and -3.4 mas, respectively, compared to 'L2' theory for the seven 2E1/2O1 events. The corresponding rms values are 8.7 and 7.8 mas, respectively. For the five 1E3/1O3 events, the along and perpendicular to the track mean residuals are 5.6 and 3.2 mas, respectively. The corresponding rms residuals are 6.8 and 10.5 mas, respectively. We compare the results using the chosen model (Model 1) with a uniform but limb-darkened disc (Model 2). The residuals with Model 2 of the 2E1/2O1 and 1E3/1O3 events indicate a bias along the satellite track. The extent and direction of bias are consistent with the shift of the light centre from the geometric centre. Results using Model 1, which intrinsically takes into account the intensity distribution, show no such bias.

  6. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    Science.gov (United States)

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  7. Exploitation of a component event data bank for common cause failure analysis

    International Nuclear Information System (INIS)

    Games, A.M.; Amendola, A.; Martin, P.

    1985-01-01

    Investigations into using the European Reliability Data System Component Event Data Bank for common cause failure analysis have been carried out. Starting from early exercises in which data were analyzed without computer aid, different types of linked multiple failures were identified. A classification system is proposed based on this experience. It defines a multiple-failure event space wherein each category defines causal, modal, temporal and structural links between failures. It is shown that a search algorithm which incorporates the specific interrogative procedures of the data bank can be developed in conjunction with this classification system. It is concluded that the classification scheme and the search algorithm are useful organizational tools in the field of common cause failure studies. However, it is also suggested that the use of the term common cause failure should be avoided, since it embodies too many different types of linked multiple failures.

  8. Turning a Private Story into a Public Event. Frame Analysis of Scandals in Television Performance

    Directory of Open Access Journals (Sweden)

    Olga Galanova

    2012-07-01

    Full Text Available It does not suffice to treat scandals only as supra-individual discourses on the macro level of social communication. Rather, we have to develop concrete methodical principles for describing the practice of doing scandal in particular media. In this paper we look at these practices from a micro-sociological perspective and analyze how, and through which concrete actions, an event is staged as a scandal. Practices of scandal build a special frame of media communication, which allows television producers to solve certain "communicative problems." Based on the detailed analysis of a video recording of a television show, we exemplify how a private case turns into a public event by means of scandal-framing. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120398

  9. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

    The geometric process was first introduced by Lam [10,11]. A stochastic process {Xi, i = 1, 2,…} is called a geometric process (GP) if, for some a > 0, {a^(i-1)·Xi, i = 1, 2,…} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best of these four models for analyzing data from a series of events.
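
    The GP definition suggests a simple estimation route: since X_i = Y_i / a^(i-1) with {Y_i} i.i.d., log X_i is linear in (i-1) with slope -log a, so a can be estimated by least squares on the log data. The sketch below uses this regression idea for illustration; it is not necessarily Lam's exact nonparametric estimator.

```python
import math, random

def estimate_gp_ratio(x):
    """Least-squares slope of log X_i on (i-1); return a = exp(-slope)."""
    n = len(x)
    t = list(range(n))                     # i - 1
    y = [math.log(v) for v in x]
    tbar, ybar = sum(t) / n, sum(y) / n
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / \
            sum((ti - tbar) ** 2 for ti in t)
    return math.exp(-slope)

random.seed(0)
a_true = 1.05
# Simulated GP sample: X_i = Y_i / a**(i-1) with Y_i i.i.d. exponential(mean 10)
xs = [random.expovariate(1 / 10.0) / a_true ** i for i in range(200)]
a_hat = estimate_gp_ratio(xs)
```

    A ratio a > 1 corresponds to stochastically decreasing inter-event times (a deteriorating system), a < 1 to an improving one.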

  10. Benchmark analysis of three main circulation pump sequential trip event at Ignalina NPP

    International Nuclear Information System (INIS)

    Uspuras, E.; Kaliatka, A.; Urbonas, R.

    2001-01-01

    The Ignalina Nuclear Power Plant is a twin-unit with two RBMK-1500 reactors. The primary circuit consists of two symmetrical loops. Eight Main Circulation Pumps (MCPs) at the Ignalina NPP are employed for the coolant water forced circulation through the reactor core. The MCPs are joined in groups of four pumps for each loop (three for normal operation and one on standby). This paper presents the benchmark analysis of three main circulation pump sequential trip event at RBMK-1500 using RELAP5 code. During this event all three MCPs in one circulation loop at Unit 2 Ignalina NPP were tripped one after another, because of inadvertent activation of the fire protection system. The comparison of calculated and measured parameters led us to establish realistic thermal hydraulic characteristics of different main circulation circuit components and to verify the model of drum separators pressure and water level controllers.(author)

  11. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. Existence of nonsusceptible individuals is particularly relevant under challenge testing with specific pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study, survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection
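
    The data-generating mechanism described in the simulation study — a binary susceptibility trait plus a per-period endurance (hazard) trait producing sequential binary records — can be sketched as a simple mixture simulation. Parameter values below are illustrative, not those of the study:

```python
import random

def simulate_cure_data(n, p_susceptible, hazard, periods, seed=1):
    """Simulate time-to-event data under a mixture cure model:
    nonsusceptible individuals never die; susceptible ones die in each
    period with probability `hazard`. Returns the death period, or None
    if the individual survives the whole challenge test."""
    random.seed(seed)
    records = []
    for _ in range(n):
        susceptible = random.random() < p_susceptible
        death_period = None
        if susceptible:
            for t in range(1, periods + 1):
                if random.random() < hazard:
                    death_period = t
                    break
        records.append(death_period)
    return records

recs = simulate_cure_data(10000, 0.7, 0.3, 20)
# With a long test, nearly all susceptibles die, so the surviving
# fraction approaches the cured fraction (1 - p_susceptible = 0.3).
frac_survived = sum(r is None for r in recs) / len(recs)
```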

  12. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.
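
    The extraction pipeline summarized above (epoching around stimulus triggers, baseline correction, averaging across trials) can be illustrated in a few lines. The sampling rate, window lengths and the injected "ERP" bump are all invented for the demonstration:

```python
import numpy as np

def average_erp(eeg, triggers, pre, post):
    """Average fixed-length, baseline-corrected epochs around each trigger.
    eeg: 1-D array (one channel); triggers: sample indices of stimuli."""
    epochs = []
    for t in triggers:
        if t - pre < 0 or t + post > len(eeg):
            continue                        # skip epochs cut off at the edges
        epoch = eeg[t - pre : t + post].astype(float)
        epoch -= epoch[:pre].mean()         # baseline correction
        epochs.append(epoch)
    return np.mean(epochs, axis=0)

rng = np.random.default_rng(0)
fs = 250                                    # assumed sampling rate (Hz)
eeg = rng.normal(0, 5, 60 * fs)             # noisy background EEG
triggers = np.arange(fs, len(eeg) - fs, 2 * fs)
for t in triggers:                          # inject a fixed post-stimulus bump
    eeg[t + 75 : t + 100] += 10.0
erp = average_erp(eeg, triggers, pre=25, post=125)
```

    Averaging across trials suppresses the background EEG (which is uncorrelated with the stimulus) while the time-locked deflection survives.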

  13. Analysis of a potential meteorite-dropping event over the south of Spain in 2007

    Science.gov (United States)

    Madiedo, J. M.; Trigo-Rodríguez, J. M.

    2008-09-01

    the case of Puerto Lápice, there are no pictures or videos of the June 29, 2007 bolide and just some images of the distorted train taken several minutes later are available. A fourth potential meteorite-dropping bolide could be directly recorded by SPMN video cameras on March 25, 2007. We were lucky enough to have this event near the zenith of two SPMN stations, exhibiting all its magnificence (Fig. 2). We focus here on the preliminary analysis of this event, which was observed over an

  14. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data about the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point when a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
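
    The essence of the transient (edge) detection step is thresholding step changes in the aggregate power signal and labeling their sign as turn-on or turn-off events. A minimal sketch, not the paper's algorithm; the appliance, threshold and noise level below are invented:

```python
import numpy as np

def detect_edges(power, threshold):
    """Flag sample indices where the step change in aggregate power exceeds
    the threshold; positive steps are turn-on events, negative turn-off."""
    diff = np.diff(power)
    idx = np.flatnonzero(np.abs(diff) > threshold)
    return [(int(i) + 1, "on" if diff[i] > 0 else "off") for i in idx]

# Aggregate power: a 2000 W appliance switches on at t=50 and off at t=120.
power = np.zeros(200)
power[50:120] += 2000.0
power += np.random.default_rng(1).normal(0, 5, 200)   # measurement noise
events = detect_edges(power, threshold=500)
```

    A real NILM front end would additionally merge multi-sample transients and extract the waveform features passed to the classifier.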

  15. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Science.gov (United States)

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial, in which locally advanced breast cancer patients were randomised to either taxane-based or anthracycline-based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane-based versus anthracycline-based chemotherapies on progression-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  16. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    International Nuclear Information System (INIS)

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J.

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses, which include: both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transient without scram (ATWS). Steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment bypass nature. After being benchmarked with the APETs, in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs

  17. Statistical Analysis of Solar Events Associated with SSC over Year of Solar Maximum during Cycle 23: 1. Identification of Related Sun-Earth Events

    Science.gov (United States)

    Grison, B.; Bocchialini, K.; Menvielle, M.; Chambodut, A.; Cornilleau-Wehrlin, N.; Fontaine, D.; Marchaudon, A.; Pick, M.; Pitout, F.; Schmieder, B.; Regnier, S.; Zouganelis, Y.

    2017-12-01

    Taking the 32 sudden storm commencements (SSC) listed by the Observatori de l'Ebre / ISGI over the year 2002 (maximal solar activity) as a starting point, we performed a statistical analysis of the related solar sources, solar wind signatures, and terrestrial responses. For each event, we characterized and identified, as far as possible, (i) the sources on the Sun (coronal mass ejections, CMEs), with the help of a series of criteria detailed hereafter (velocities, drag coefficient, radio waves, polarity), as well as (ii) the structure and properties in the interplanetary medium, at L1, of the event associated with the SSC: magnetic clouds (MC), non-MC interplanetary coronal mass ejections (ICME), co-rotating/stream interaction regions (SIR/CIR), shocks only, and unclear events that we call "miscellaneous" events. The categorization of the events at L1 is based on published catalogues. For each potential CME/L1 event association we compare the velocity observed at L1 with the one observed at the Sun and the estimated ballistic velocity. Observations of radio emissions (Type II, Type IV, detected from the ground and/or by WIND) associated with the CMEs make the solar source more probable. We also compare the polarity of the magnetic clouds with the hemisphere of the solar source. The drag coefficient (estimated with the drag-based model) is calculated for each potential association and compared to the expected range of values. We identified a solar source for 26 SSC-related events. 12 of these 26 associations match all criteria. We finally discuss the difficulty of performing such associations.
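
    One of the consistency checks mentioned above — comparing the speed measured at L1 with the ballistic speed implied by the Sun-to-L1 travel time — amounts to a one-line calculation. The travel time, measured speed and tolerance below are invented for illustration, not taken from the 2002 event list:

```python
# Back-of-envelope CME/L1 association check (illustrative values only).
AU_KM = 1.496e8
L1_KM = AU_KM - 1.5e6            # L1 sits ~1.5 million km sunward of Earth

def ballistic_speed_kms(travel_hours):
    """Constant-speed (ballistic) estimate over the Sun-to-L1 distance."""
    return L1_KM / (travel_hours * 3600.0)

v_ballistic = ballistic_speed_kms(72.0)  # CME left the Sun 72 h before arrival
v_l1 = 550.0                             # speed measured at L1 (km/s), invented
consistent = abs(v_ballistic - v_l1) / v_l1 < 0.25   # assumed tolerance
```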

  18. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    Science.gov (United States)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days on which the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths were recorded for natural mortality, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at the highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had immediate and delayed effects on mortality.
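
    The two data-preparation steps described above — flagging haze days from PM10 and choosing time-stratified referents (the same weekday within the same calendar month as each case day) — can be sketched as follows. The dates and values are illustrative:

```python
from datetime import date, timedelta

HAZE_THRESHOLD = 100.0   # ug/m3, as in the study's haze-day definition

def is_haze_day(pm10):
    return pm10 > HAZE_THRESHOLD

def referent_days(case_day):
    """Time-stratified referents: all other days in the same month that
    fall on the same weekday as the case day."""
    refs = []
    d = case_day.replace(day=1)
    while d.month == case_day.month:
        if d.weekday() == case_day.weekday() and d != case_day:
            refs.append(d)
        d += timedelta(days=1)
    return refs

case = date(2005, 8, 10)     # an illustrative case day (a Wednesday)
refs = referent_days(case)
```

    Exposure on the case day is then contrasted with exposure on the referent days in a conditional logistic regression, which is what produces the odds ratios quoted above.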

  19. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    Science.gov (United States)

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
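
    The review's key quantity — the number of events per center/treatment-arm combination — is straightforward to compute from patient-level data, and is what would guide the choice between fixed-effects methods and random-effects models or GEEs. A sketch with invented records:

```python
# Count events per center/treatment-arm combination from patient-level
# trial data. Record layout (center, arm, event) is illustrative.
from collections import Counter
from statistics import median

patients = [
    ("A", 0, 1), ("A", 0, 0), ("A", 1, 1), ("A", 1, 1),
    ("B", 0, 0), ("B", 1, 1), ("B", 0, 1), ("B", 1, 0),
    ("C", 0, 0), ("C", 1, 0),
]

events = Counter()
for center, arm, event in patients:
    events[(center, arm)] += event     # also registers zero-event combinations

counts = [events[key] for key in sorted(events)]   # one count per combination
med = median(counts)
```

    A low median (such as the 3 events per combination reported above) argues for methods that tolerate sparse strata rather than fixed-effects adjustment for center.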

  20. An analysis on boron dilution events during SBLOCA for the KNGR

    International Nuclear Information System (INIS)

    Kim, Young In; Hwang, Young Dong; Park, Jong Kuen; Chung, Young Jong; Sim, Suk Gu

    1999-02-01

    An analysis of boron dilution events during a small break loss of coolant accident (LOCA) for the Korea Next Generation Reactor (KNGR) was performed using the Computational Fluid Dynamics (CFD) code FLUENT. The maximum size of the water slug was determined based on the source of unborated water and the possible flow paths. An axisymmetric CFD model was applied for a conservative scoping analysis of the unborated water slug mixing with the recirculating water of the reactor system following a small break LOCA, assuming restart of one Reactor Coolant Pump (RCP). The computational grid was determined through a sensitivity study on grid size, selecting the grid that yields the most conservative results, and a preliminary boron-mixing calculation was performed on that grid. (Author). 17 refs., 3 tabs., 26 figs

  1. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs

  2. Hydroacoustic monitoring of a salt cavity: an analysis of precursory events of the collapse

    Science.gov (United States)

    Lebert, F.; Bernardie, S.; Mainsant, G.

    2011-09-01

    One of the main features of "post mining" research relates to available methods for monitoring mine-degradation processes that could directly threaten surface infrastructures. In this respect, GISOS, a French scientific interest group, is investigating techniques for monitoring the possible collapse of underground cavities. One of the methods under investigation was monitoring the stability of a salt cavity by recording microseismic precursor signals that may indicate the onset of rock failure. The data were recorded in a salt mine in Lorraine (France) while monitoring the controlled collapse of 2 000 000 m3 of rock surrounding a cavity at 130 m depth. The monitoring in the 30 Hz to 3 kHz frequency range highlights the occurrence of high-energy events during periods of macroscopic movement, once the layers had ruptured; these appear to be the consequence of post-rupture rock movements related to the intense deformation of the cavity roof. Moreover, the analysis shows the presence of some interesting precursory signals before the cavity collapsed. They occurred a few hours before the failure phases, when the rocks were being weakened and damaged, and they originated from the damaging and breaking process, as micro-cracks appeared and then coalesced. From these results we expect that deeper signal analysis and statistical analysis of the complete event time distribution (several million files) will allow us to finalize a complete typology of the signal families and their relation to the evolution of the cavity over the five years of monitoring.
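
The precursory signals described above amount to an increase in event rate a few hours before failure. A minimal sketch of one way such an increase could be flagged automatically (the window length, baseline rate and threshold factor are illustrative assumptions, not GISOS parameters):

```python
from collections import deque

def precursor_alerts(event_times, window_s, baseline_rate, factor=3.0):
    """Flag event times at which the event rate in a trailing window
    exceeds `factor` times a baseline rate (events per second).
    event_times must be sorted ascending."""
    win = deque()
    alerts = []
    for t in event_times:
        win.append(t)
        # drop events that fell out of the trailing window
        while win and win[0] < t - window_s:
            win.popleft()
        if len(win) / window_s > factor * baseline_rate:
            alerts.append(t)
    return alerts

# quiet background (one event per ~10 s), then a burst at t = 30 s
events = [0, 10, 20, 30, 30.1, 30.2, 30.3, 30.4]
print(precursor_alerts(events, window_s=10, baseline_rate=0.1))
```

Only the tail of the burst trips the threshold, once four or more events fall inside the 10 s window.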

  3. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    Science.gov (United States)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when a particle strikes critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effect influence. AFTU takes the SPECTRE test-bench design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation, and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  4. Analysis of area events as part of probabilistic safety assessment for Romanian TRIGA SSR 14 MW reactor

    International Nuclear Information System (INIS)

    Mladin, D.; Stefan, I.

    2005-01-01

    International experience has shown that external events can be an important contributor to plant/reactor risk. For this reason such events have to be included in PSA studies. In the context of PSA for nuclear facilities, external events are defined as events originating from outside the plant, but with the potential to create an initiating event at the plant. To support plant safety assessment, PSA can be used to identify vulnerable features of the plant and to suggest modifications to mitigate the impact of external events or the occurrence of initiating events. For that purpose, a probabilistic assessment of area events covering fire and flooding risk and impact is necessary. Due to its relatively large power level amongst research reactors, the approach to safety analysis of the Romanian 14 MW TRIGA benefits from an ongoing PSA project, in whose context the treatment of external events should be considered. The specific tasks proposed for the complete evaluation of area event analysis are: identify the rooms important for facility safety; determine a relative area-event risk index for these rooms and a relative area-event impact index if the event occurs; evaluate each room's specific area-event frequency; determine each room's contribution to reactor hazard state frequencies; and analyze power supply and room dependencies of safety components (such as pumps and motor-operated valves). The fire risk analysis methodology is based on Berry's method [1]. This approach provides a systematic procedure to derive a relative index for different rooms. The factors which affect the fire probability are: personnel presence in the room, number and type of ignition sources, type and area of combustibles, fuel available in the room, fuel location, and ventilation. The flooding risk analysis is based on the amount of piping in the room. For accurate information on the piping, a facility walk-about is necessary. In case of flooding risk
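
A relative risk index of the kind described can be sketched as a weighted mean of graded factor scores. The factor names follow the list above, but the weights and the 0-10 grading are illustrative assumptions, not Berry's actual coefficients:

```python
def fire_risk_index(scores, weights=None):
    """Relative fire-risk index for a room: weighted mean of factor
    scores graded 0 (best) to 10 (worst).  Weights are hypothetical."""
    default = {"personnel_presence": 1.0, "ignition_sources": 2.0,
               "combustibles": 2.0, "fuel_load": 2.0,
               "fuel_location": 1.0, "ventilation": 1.0}
    weights = weights or default
    total = sum(weights[k] * scores[k] for k in weights)
    return total / sum(weights.values())

room_a = {k: 5 for k in ("personnel_presence", "ignition_sources",
                         "combustibles", "fuel_load", "fuel_location",
                         "ventilation")}
room_b = dict(room_a, ignition_sources=9)   # more ignition sources
print(fire_risk_index(room_a), fire_risk_index(room_b))
```

Such an index only ranks rooms against each other; it is not an absolute fire frequency, which is why the methodology pairs it with room-specific frequency estimates.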

  5. An analysis of potential costs of adverse events based on Drug Programs in Poland. Pulmonology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-06-01

    The project was performed within the Polish Society for Pharmacoeconomics (PTFE). The objective was to estimate the potential costs of treatment of side effects which may theoretically occur as a result of treatment of selected diseases. We analyzed the Drug Programs financed by the National Health Fund in Poland in 2012 and for the first analysis we selected those Programs in which the same medicinal products were used. We based the adverse event selection on the Summary of Product Characteristics of the chosen products. We extracted all the potential adverse events defined as frequent and very frequent, grouping them according to therapeutic areas. This paper presents the results for the pulmonology area. The events described as very common had an incidence of ≥ 1/10, and the common ones ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. On the basis of the collected data we allocated the direct costs incurred by the public payer. We used the costs valid in December 2013. The paper presents the estimated costs of treatment of side effects related to the pulmonology disease area. Taking into account the costs incurred by the NHF and by the patient separately, we calculated the total spending and the percentage of each component cost in detail. The treatment of adverse drug reactions generates a significant cost incurred by both the public payer and the patient.

  6. Preliminary analysis of beam trip and beam jump events in an ADS prototype

    International Nuclear Information System (INIS)

    D'Angelo, A.; Bianchini, G.; Carta, M.

    2001-01-01

    A core dynamics analysis of some typical current transient events has been carried out for an 80 MW energy amplifier prototype (EAP) fuelled by mixed oxides and cooled by lead-bismuth. Fuel and coolant temperature trends for recovered beam trip and beam jump events have been preliminarily investigated. Beam trip results show that the drop in temperature of the core outlet coolant would be reduced a fair amount if the beam intensity could be recovered within a few seconds. Due to the low power density in the EAP fuel, the beam jump transient from 50% of nominal power evolves benignly. The worst conceivable current transient, a beam jump with a cold reactor, mainly depends on the coolant flow conditions. In the EAP design, the primary loop coolant flow is assured by natural convection and is enhanced by a particular system of cover gas injection into the bottom part of the riser. If this system of coolant flow enhancement is assumed to be in operation, even the beam jump with a cold reactor evolves without severe consequences. (authors)

  7. Rare event computation in deterministic chaotic systems using genealogical particle analysis

    International Nuclear Information System (INIS)

    Wouters, J; Bouchet, F

    2016-01-01

    In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein–Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function. (paper)
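
A minimal sketch of the genealogical selection idea on an Ornstein-Uhlenbeck example: particles are periodically cloned or killed with weights favouring upward excursions, and the accumulated normalization undoes the bias so the estimator stays unbiased. The score function exp(C·x) and all parameter choices here are illustrative assumptions, not the paper's settings:

```python
import math, random

def ou_tail_genealogical(a=3.0, T=5.0, dt=0.02, n=1000, C=1.5,
                         resample_every=25, seed=7):
    """Genealogical particle estimate of P(X_T > a) for the OU process
    dX = -X dt + sqrt(2) dW, X_0 = 0 (stationary variance 1).
    Selection weights exp(C * (increase in X since last selection))
    tilt the ensemble toward large excursions; the running product of
    mean weights and a final exp(-C x) factor remove the tilt."""
    rng = random.Random(seed)
    xs = [0.0] * n
    xprev = [0.0] * n      # particle positions at the last selection
    norm = 1.0             # product of mean incremental weights
    for step in range(1, int(T / dt) + 1):
        # Euler-Maruyama step of the OU dynamics
        xs = [x - x * dt + math.sqrt(2 * dt) * rng.gauss(0.0, 1.0)
              for x in xs]
        if step % resample_every == 0:
            w = [math.exp(C * (x - xp)) for x, xp in zip(xs, xprev)]
            norm *= sum(w) / n
            idx = rng.choices(range(n), weights=w, k=n)  # clone/kill
            xs = [xs[i] for i in idx]
            xprev = list(xs)
    # reweight survivors to undo the exp(C x) tilt accumulated so far
    return norm * sum((x > a) * math.exp(-C * xp)
                      for x, xp in zip(xs, xprev)) / n

print(ou_tail_genealogical())
```

With these settings the target probability is roughly the standard-normal tail beyond 3, around 1.3e-3, which a naive Monte Carlo run of the same size would estimate with only a handful of hits.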

  8. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence that verifies head impact events recorded by wearable sensors. To utilize video analysis to verify head impact events recorded by wearable sensors and describe their respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664

  9. A systemic approach for managing extreme risk events-dynamic financial analysis

    Directory of Open Access Journals (Sweden)

    Ph.D.Student Rodica Ianole

    2011-12-01

    Following the Black Swan logic, it often happens that what we do not know becomes more relevant than what we (believe to) know. The management of extreme risks falls under this paradigm in the sense that it cannot be limited to a static approach based only on objective and easily quantifiable variables. Drawing on the operational tools developed primarily for the insurance industry, the present paper aims to investigate how dynamic financial analysis (DFA) can be used within the framework of extreme risk events.

  10. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    Science.gov (United States)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, as one possible originating cause of tsunamis, and propose to analyze twitter messages for capturing eyewitness information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for ascertaining the geographical origin of the tweets and thus being able to observe correlations of events across languages. One way to overcome this deficit is to identify geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the forecast points of the tsunami scenario. We also intend to use twitter analysis for situation picture

  11. Development and evaluation of a computer-based instrument supporting event analysis in NPP. Final report

    International Nuclear Information System (INIS)

    Szameitat, S.

    2002-11-01

    Information technologies (IT) in safety management are seen as an opportunity to control and reduce risks. The processing of safety-critical event information, a central function of safety management, could be made more efficient using computers. But organizational structures differ, the processes of experience transfer are complex, and the opportunities offered by IT are broad. Implementing a support system therefore raises the question of design criteria for computer support of an event-based safety management system (eSMS). Two studies were conducted. Study 1 compares the organizational, technical and legal conditions for safety management in the German nuclear industry and the Norwegian offshore industry; it identified design criteria that influence the eSMS independently of the operator. Study 2 compares the eSMS of different nuclear power plants, analyzing their organizational structures and processes. These internal and external criteria have a significant influence on the efficient design of system support for an eSMS. A third study discusses the options and impact of computer support on experience transfer in an eSMS. For this purpose a simulation environment was created, in which groups investigated typical event scenarios from a nuclear power plant for contributing factors while the communication medium was manipulated (face-to-face versus computer-mediated). Results from 15 groups indicate that the collection of event information, the foundation for identifying contributing factors, was promoted by computer-mediated communication. The depth of analysis was not influenced, but computer mediation hampered the learning of facts. IT can thus support safety management effectively. (orig.) [de

  12. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    International Nuclear Information System (INIS)

    Mercurio, D.

    2011-01-01

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly on the basis of logic trees. The evolution of these sequences is a result of the interactions between the crew and the plant; current PRA methodologies use simplified models of these complex interactions. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used for the simulation of crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' is made up of three operators whose behavior is guided by a set of rules-of-behavior (which represent the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups; they are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied to the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions or context of operator actions, which can be used in an HRA, in the analysis of the reliability of the actions. By providing information on the evolution of system indications, dynamics of cues, crew timing in performing procedure steps, situation

  13. Modeling time-to-event (survival) data using classification tree analysis.

    Science.gov (United States)

    Linden, Ariel; Yarnold, Paul R

    2017-12-01

    Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (ie, easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.

  14. Analysis of brand personality to involve event involvement and loyalty: A case study of Jakarta Fashion Week 2017

    Science.gov (United States)

    Nasution, A. H.; Rachmawan, Y. A.

    2018-04-01

    Fashion trends change extremely fast, and fashion has become part of people's lifestyle around the world. Fashion week events in several areas can serve as a measurement of current fashion trends. In Indonesia, a fashion week event called Jakarta Fashion Week (JFW) aims to show fashion trends to people who want to improve their fashion style. People join events that involve them, and will then return to those events again and again. An annual, continuous event is important for creating loyalty among the people involved in it, in order to support the organizer in organizing the next event, saving a large share of the marketing budget, and creating a higher-quality event. This study aims to determine the effect of the five brand personality dimensions on event involvement and loyalty for Jakarta Fashion Week (JFW). The study uses a quantitative confirmatory method with the Structural Equation Modeling (SEM) analysis technique. The sample comprises 150 respondents who participated in Jakarta Fashion Week 2017. Results show a significant effect of the five brand personality dimensions on the three dimensions of event involvement and on loyalty, while one dimension of event involvement, personal self-expression, had no effect on loyalty.

  15. The analysis of competing events like cause-specific mortality--beware of the Kaplan-Meier method

    NARCIS (Netherlands)

    Verduijn, Marion; Grootendorst, Diana C.; Dekker, Friedo W.; Jager, Kitty J.; le Cessie, Saskia

    2011-01-01

    Kaplan-Meier analysis is a popular method for analysing time-to-event data. In the case of competing-event analyses, however, such as that of cardiovascular and non-cardiovascular mortality, the Kaplan-Meier method profoundly overestimates the cumulative mortality probabilities for each of the
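
The overestimation can be made concrete with a toy comparison of 1 − KM (competing events treated as censoring) against the Aalen-Johansen cumulative incidence, which accounts for competing events correctly:

```python
def km_vs_cif(data):
    """data: list of (time, status); status 1 = event of interest,
    2 = competing event, 0 = censored.  Returns (1 - KM with competing
    events treated as censoring, Aalen-Johansen cumulative incidence)
    evaluated after the last event time."""
    times = sorted(set(t for t, s in data if s > 0))
    surv_all = 1.0   # event-free survival (any event type)
    surv_km = 1.0    # naive KM "survival" for the event of interest
    cif = 0.0
    for t in times:
        at_risk = sum(1 for u, s in data if u >= t)
        d1 = sum(1 for u, s in data if u == t and s == 1)
        d2 = sum(1 for u, s in data if u == t and s == 2)
        cif += surv_all * d1 / at_risk       # Aalen-Johansen increment
        surv_all *= 1 - (d1 + d2) / at_risk
        surv_km *= 1 - d1 / at_risk          # competing events "censored"
    return 1 - surv_km, cif

# toy data: events of interest at t=1,3,5; competing at t=2,6; censored at t=4
print(km_vs_cif([(1, 1), (2, 2), (3, 1), (4, 0), (5, 1), (6, 2)]))
```

On these six subjects, 1 − KM gives 11/16 ≈ 0.69 while the cumulative incidence is 7/12 ≈ 0.58: the naive KM estimate is strictly larger, exactly the bias the abstract warns about.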

  16. AN ANALYSIS OF RISK EVENTS IN THE OIL-TANKER MAINTENANCE BUSINESS

    Directory of Open Access Journals (Sweden)

    Roque Rabechini Junior

    2012-12-01

    This work presents the results of an investigation into risk events and their respective causes, carried out in ship maintenance undertakings in the logistical sector of the Brazilian oil industry. Its theoretical and conceptual positioning lies in aspects of project risk management as instruments to support decision making by executives in the tanker-maintenance business. The case-study method was used, with a qualitative approach of an exploratory nature, and a descriptive format was chosen for the presentation of data. Through the analysis of 75 risk events in tanker-docking projects it was possible to extract the eight of greatest relevance. The risk analysis facilitated the identification of actions aimed at their mitigation. In conclusion, it was possible to propose a risk-framework model in four categories, HSE (health, safety and the environment), technical, externalities and management, designed to provide tanker-docking business executives and administrators with evidence of actions to assist in their decision-making processes. Finally, the authors identify proposals for further study and note the principal limitations of the study.

  17. Spectral analysis of time series of events: effect of respiration on heart rate in neonates

    International Nuclear Information System (INIS)

    Van Drongelen, Wim; Williams, Amber L; Lasky, Robert E

    2009-01-01

    Certain types of biomedical processes, such as the heart rate generator, can be considered as signals that are sampled by the occurring events, i.e. QRS complexes. This sampling property creates problems for the evaluation of spectral parameters of such signals. First, the irregular occurrence of heart beats creates an unevenly sampled data set, which must either be pre-processed (e.g. by trace binning or interpolation) prior to spectral analysis, or analyzed with specialized methods (e.g. Lomb's algorithm). Second, the average occurrence of events determines the Nyquist limit for the sampled time series. Here we evaluate different types of spectral analysis of recordings of neonatal heart rate. Coupling between respiration and heart rate and the detection of heart rate itself are emphasized. We examine both standard and data-adaptive frequency bands of heart rate signals generated by models of coupled oscillators and recorded data sets from neonates. We find that an important spectral artifact occurs due to a mirror effect around the Nyquist limit of half the average heart rate. Furthermore, we conclude that the presence of respiratory coupling can only be detected under low-noise conditions and if a data-adaptive respiratory band is used.
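
For reference, Lomb's algorithm mentioned above evaluates a least-squares periodogram directly on unevenly spaced samples, avoiding interpolation. A compact sketch of the classic normalized Lomb periodogram (pure Python, illustrative):

```python
import math

def lomb_power(t, y, freq):
    """Normalized Lomb periodogram power at `freq` (cycles per unit of
    t) for unevenly sampled data (t, y), e.g. beat-to-beat series."""
    w = 2 * math.pi * freq
    ybar = sum(y) / len(y)
    yc = [v - ybar for v in y]
    # time offset tau that makes the sine/cosine terms orthogonal
    tau = math.atan2(sum(math.sin(2 * w * ti) for ti in t),
                     sum(math.cos(2 * w * ti) for ti in t)) / (2 * w)
    ct = [math.cos(w * (ti - tau)) for ti in t]
    st = [math.sin(w * (ti - tau)) for ti in t]
    num_c = sum(v * c for v, c in zip(yc, ct)) ** 2
    num_s = sum(v * s for v, s in zip(yc, st)) ** 2
    var = sum(v * v for v in yc) / (len(y) - 1)
    return (num_c / sum(c * c for c in ct) +
            num_s / sum(s * s for s in st)) / (2 * var)

# unevenly spaced samples of a 0.1 Hz sinusoid
t = [i + 0.3 * math.sin(1.7 * i) for i in range(60)]
y = [math.sin(2 * math.pi * 0.1 * ti) for ti in t]
print(lomb_power(t, y, 0.1), lomb_power(t, y, 0.317))
```

The power at the true 0.1 Hz line dominates the off-frequency value, with no resampling of the uneven time base required.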

  18. Soft error rate analysis methodology of multi-Pulse-single-event transients

    International Nuclear Information System (INIS)

    Zhou Bin; Huo Mingxue; Xiao Liyi

    2012-01-01

    As transistor feature size scales down, soft errors in combinational logic caused by high-energy particle radiation are attracting growing concern. In this paper, a combinational-logic soft error analysis methodology considering multi-pulse single-event transients (MPSETs) and re-convergence with multiple transient pulses is proposed. In the proposed approach, the voltage pulse produced at the standard cell output is approximated by a triangle waveform and characterized by three parameters: pulse width, the transition time of the first edge, and the transition time of the second edge. For pulses whose amplitude is smaller than the supply voltage, an edge extension technique is proposed. Moreover, an efficient electrical masking model that comprehensively considers transition time, delay, width and amplitude is proposed, together with an approach that uses the transition times of the two edges and the pulse width to compute the pulse amplitude. Finally, our proposed firstly-independently-propagating-secondly-mutually-interacting (FIP-SMI) scheme is used to deal with the more practical case of re-convergent gates with multiple transient pulses. For MPSETs, a random generation model is also proposed. Compared to estimates obtained from circuit-level simulations in HSpice, our soft error rate analysis algorithm shows 10% error in SER estimation with a speedup of 300 when single-pulse single-event transients (SPSETs) are considered. We have also demonstrated that the runtime and SER decrease as P0 increases, using designs from the ISCAS-85 benchmarks. (authors)
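
The electrical masking idea can be illustrated with a much cruder width-only model than the paper's (which also tracks transition times and amplitude): a transient narrower than a gate's delay is filtered out, and surviving pulses are attenuated stage by stage. The per-stage attenuation factor here is an arbitrary assumption:

```python
def propagate_set(width_ps, chain_delays_ps):
    """Propagate a single-event transient through a gate chain with a
    simple width-only masking model: a pulse no wider than a gate's
    delay cannot switch it (electrically masked); otherwise the gate
    narrows the pulse by an assumed half of its delay.
    Returns the surviving width in ps, or 0.0 if masked."""
    w = float(width_ps)
    for d in chain_delays_ps:
        if w <= d:          # too narrow to switch this gate: masked
            return 0.0
        w -= 0.5 * d        # assumed per-stage attenuation
    return w

print(propagate_set(100, [20, 20, 20]))  # wide pulse survives, narrowed
print(propagate_set(30, [20, 20, 20]))   # narrow pulse is masked
```

Even this toy version reproduces the qualitative behaviour the masking model captures: deep logic chains filter short transients, so only sufficiently wide SETs contribute to the soft error rate.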

  19. Replica analysis of overfitting in regression models for time-to-event data

    Science.gov (United States)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.
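
To make the object of the analysis concrete, Cox's partial log-likelihood for a single covariate (no tied event times) can be maximized with a short damped Newton iteration; this is a toy sketch on data of my own construction, not the paper's replica machinery:

```python
import math

def cox_fit_1d(times, events, x, iters=25):
    """Damped Newton maximization of the one-covariate Cox partial
    log-likelihood (no tied event times):
        l(b) = sum_{i: event} [ b*x_i - log sum_{j: t_j >= t_i} exp(b*x_j) ]
    Returns the fitted coefficient b."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    b = 0.0
    for _ in range(iters):
        grad = hess = 0.0
        for k, i in enumerate(order):
            if not events[i]:
                continue
            risk = order[k:]                 # subjects still at risk at t_i
            ws = [math.exp(b * x[j]) for j in risk]
            s0 = sum(ws)
            s1 = sum(w * x[j] for w, j in zip(ws, risk))
            s2 = sum(w * x[j] ** 2 for w, j in zip(ws, risk))
            mean = s1 / s0
            grad += x[i] - mean              # score contribution
            hess -= s2 / s0 - mean ** 2      # l''(b) <= 0 (concave)
        if hess == 0:
            break
        step = grad / -hess                  # Newton step, uphill
        b += max(-1.0, min(1.0, step))       # damping for stability
    return b

# 6 subjects, all with events; larger x tends to fail earlier => b > 0
print(cox_fit_1d([1, 2, 3, 4, 5, 6], [1] * 6, [2, 0, 1, 1, 0, 0]))
```

On a well-separated sample the iteration diverges (the hazard ratio estimate grows without bound), which is the small-sample pathology that the overfitting theory above quantifies more generally.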

  20. Potential Indoor Worker Exposure From Handling Area Leakage: Example Event Sequence Frequency Analysis

    International Nuclear Information System (INIS)

    Benke, Roland R.; Adams, George R.

    2008-01-01

    The U.S. Department of Energy (DOE) is currently considering design options for the facilities that will handle spent nuclear fuel and high-level radioactive waste at the potential nuclear waste repository at Yucca Mountain, Nevada. The license application must demonstrate compliance with the performance objectives of 10 CFR Part 63, which include occupational dose limits from 10 CFR Part 20. If DOE submits a license application under 10 CFR Part 63, the U.S. Nuclear Regulatory Commission (NRC) will conduct a risk-informed, performance-based review of the DOE license application and its preclosure safety analysis, in which in-depth technical evaluations are focused on technical areas that are significant to preclosure safety and risk. As part of pre-licensing activities, the Center for Nuclear Waste Regulatory Analyses (CNWRA) developed the Preclosure Safety Analysis Tool software to aid in the regulatory review of a DOE license application and support any independent confirmatory assessments that may be needed. Recent DOE information indicates a primarily canister-based handling approach that includes the wet transfer of individual assemblies where Heating, Ventilation, and Air Conditioning (HVAC) systems may be relied on to provide confinement and limit the spread of any airborne radioactive material from handling operations. Workers may be involved in manual and remote operations in handling transportation casks, canisters, waste packages, or bare spent nuclear fuel assemblies inside facility buildings. As part of routine operations within these facilities, radioactive material may potentially become airborne if canisters are opened or bare fuel assemblies are handled. Leakage of contaminated air from the handling area into adjacent occupied areas, therefore, represents a potential radiological exposure pathway for indoor workers. The objective of this paper is to demonstrate modeling capabilities that can be used by the regulator to estimate frequencies of

  1. Event rates, hospital utilization, and costs associated with major complications of diabetes: a multicountry comparative analysis.

    Directory of Open Access Journals (Sweden)

    Philip M Clarke

    2010-02-01

    Full Text Available Diabetes imposes a substantial burden globally in terms of premature mortality, morbidity, and health care costs. Estimates of economic outcomes associated with diabetes are essential inputs to policy analyses aimed at prevention and treatment of diabetes. Our objective was to estimate and compare event rates, hospital utilization, and costs associated with major diabetes-related complications in high-, middle-, and low-income countries. Incidence and history of diabetes-related complications, hospital admissions, and length of stay were recorded in 11,140 patients with type 2 diabetes participating in the Action in Diabetes and Vascular Disease (ADVANCE) study (mean age at entry 66 y). The probability of hospital utilization and number of days in hospital for major events associated with coronary disease, cerebrovascular disease, congestive heart failure, peripheral vascular disease, and nephropathy were estimated for three regions (Asia, Eastern Europe, and Established Market Economies) using multiple regression analysis. The resulting estimates of days spent in hospital were multiplied by regional estimates of the costs per hospital bed-day from the World Health Organization to compute annual acute and long-term costs associated with the different types of complications. To assist comparability, costs are reported in international dollars (Int$), which represent a hypothetical currency that allows for the same quantities of goods or services to be purchased regardless of country, standardized on purchasing power in the United States. A cost calculator accompanying this paper enables the estimation of costs for individual countries and translation of these costs into local currency units. The probability of attending a hospital following an event was highest for heart failure (93%-96%) across regions and lowest for nephropathy (15%-26%). The average numbers of days in hospital given at least one admission were greatest for stroke (17-32 d across
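The costing step this record describes (probability of admission times expected bed-days times the regional cost per bed-day) reduces to a one-line calculation; the sketch below uses invented placeholder figures, not the ADVANCE estimates:

```python
def expected_event_cost(p_admission, mean_days_if_admitted, cost_per_bed_day):
    """Expected cost of one complication event, in the bed-day costing style
    described above: admission probability times expected length of stay
    times the regional cost per hospital bed-day (Int$)."""
    return p_admission * mean_days_if_admitted * cost_per_bed_day

# Hypothetical region: 90% admission probability, 20 bed-days, Int$ 200/day.
cost = expected_event_cost(0.90, 20, 200)  # roughly Int$ 3,600 per event
```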

  2. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    Science.gov (United States)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

    Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and may aid in identifying hot-spot regions for potential climate-triggered violent social conflicts.
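A minimal sketch of the kind of coincidence analysis described here, counting how often one event series is followed within a tolerance window by another and testing that count against random surrogates, might look like this (the function names and the uniform-surrogate null are assumptions for illustration, not the authors' exact procedure):

```python
import numpy as np

def coincidence_rate(events_a, events_b, delta_t):
    """Fraction of events in series A that are followed within delta_t
    by at least one event in series B."""
    hits = sum(any(0 <= b - a <= delta_t for b in events_b) for a in events_a)
    return hits / len(events_a)

def coincidence_test(events_a, events_b, delta_t, t_max, n_surrogate=1000, seed=0):
    """Compare the observed coincidence rate against uniformly random
    surrogate series of the same size; returns (rate, empirical p-value)."""
    rng = np.random.default_rng(seed)
    observed = coincidence_rate(events_a, events_b, delta_t)
    exceed = 0
    for _ in range(n_surrogate):
        surrogate = np.sort(rng.uniform(0, t_max, size=len(events_b)))
        if coincidence_rate(events_a, surrogate, delta_t) >= observed:
            exceed += 1
    return observed, exceed / n_surrogate
```

A small empirical p-value indicates more synchronicity between the two series than random timing would produce.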

  3. Spousal communication and contraceptive use in rural Nepal: an event history analysis.

    Science.gov (United States)

    Link, Cynthia F

    2011-06-01

    This study analyzes longitudinal data from couples in rural Nepal to investigate the influence of spousal communication about family planning on their subsequent contraceptive use. The study expands current understanding of the communication-contraception link by (a) exploiting monthly panel data to conduct an event history analysis, (b) incorporating both wives' and husbands' perceptions of communication, and (c) distinguishing effects of spousal communication on the use of four contraceptive methods. The findings provide new evidence of a strong positive impact of spousal communication on contraceptive use, even when controlling for confounding variables. Wives' reports of communication are substantial explanatory factors in couples' initiation of all contraceptive methods examined. Husbands' reports of communication predict couples' subsequent use of male-controlled methods. This analysis advances our understanding of how marital dynamics--as well as husbands' perceptions of these dynamics--influence fertility behavior, and should encourage policies to promote greater integration of men into family planning programs.
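Event history analysis on monthly panel data of this kind is commonly implemented as a discrete-time hazard model, i.e. a logistic regression on person-month records. A minimal sketch (synthetic data, not the Nepal panel; the column layout is an assumption for illustration):

```python
import numpy as np

def fit_discrete_time_hazard(X, y, lr=0.1, n_iter=5000):
    """Fit a discrete-time hazard model (logistic regression) by gradient
    ascent. Each row of X is one couple-month (e.g. an intercept column and
    a spousal-communication indicator); y[i] = 1 if contraceptive use began
    in that month. Returns the coefficient vector."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted monthly hazard
        beta += lr * X.T @ (y - p) / len(y)   # average log-likelihood gradient
    return beta
```

A positive coefficient on the communication indicator corresponds to the paper's finding that reported communication raises the monthly odds of initiating contraception.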

  4. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    International Nuclear Information System (INIS)

    FORESTER, JOHN A.; BLEY, DENNIS C.; COOPER, SUSANE; KOLACZKOWSKI, ALAN M.; THOMPSON, CATHERINE; RAMEY-SMITH, ANN; WREATHALL, JOHN

    2000-01-01

    This paper describes the most recent version of a human reliability analysis (HRA) method called "A Technique for Human Event Analysis" (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided, and critical and unique aspects of the revised method are discussed.

  5. Event based neutron activation spectroscopy and analysis algorithm using MLE and meta-heuristics

    International Nuclear Information System (INIS)

    Wallace, B.

    2014-01-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes involved was used to create a statistical model. Maximum likelihood estimation was combined with meta-heuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research. (author)
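The MLE-plus-metaheuristic curve fitting described here can be sketched under simplifying assumptions: a single isotope plus constant background, Poisson counting statistics, and SciPy's differential evolution standing in for the author's metaheuristic (all assumptions of this sketch, not details from the record):

```python
import numpy as np
from scipy.optimize import differential_evolution

def neg_log_likelihood(params, t, counts):
    """Poisson negative log-likelihood for a one-isotope decay curve with a
    constant background: expected counts mu(t) = A * exp(-lam * t) + bg."""
    amplitude, lam, background = params
    mu = amplitude * np.exp(-lam * t) + background
    mu = np.clip(mu, 1e-12, None)  # keep log() well defined
    return float(np.sum(mu - counts * np.log(mu)))

def fit_decay(t, counts, seed=0):
    """Global metaheuristic search (differential evolution) for the MLE of
    (amplitude, decay constant, background)."""
    bounds = [(1.0, 1e5), (1e-4, 5.0), (0.0, 1e3)]
    result = differential_evolution(neg_log_likelihood, bounds,
                                    args=(t, counts), seed=seed)
    return result.x
```

The recovered decay constant gives the half-life (ln 2 / lam), which is the quantity used for isotope discrimination.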

  6. Clinical usefulness and feasibility of time-frequency analysis of chemosensory event-related potentials.

    Science.gov (United States)

    Huart, C; Rombaux, Ph; Hummel, T; Mouraux, A

    2013-09-01

    The clinical usefulness of olfactory event-related brain potentials (OERPs) to assess olfactory function is limited by the relatively low signal-to-noise ratio of the responses identified using conventional time-domain averaging. Recently, it was shown that time-frequency analysis of the obtained EEG signals can markedly improve the signal-to-noise ratio of OERPs in healthy controls, because it enhances both phase-locked and non phase-locked EEG responses. The aim of the present study was to investigate the clinical usefulness of this approach and evaluate its feasibility in a clinical setting. We retrospectively analysed EEG recordings obtained from 45 patients (15 anosmic, 15 hyposmic and 15 normosmic). The responses to olfactory stimulation were analysed using conventional time-domain analysis and joint time-frequency analysis. The ability of the two methods to discriminate between anosmic, hyposmic and normosmic patients was assessed using a Receiver Operating Characteristic analysis. The discrimination performance of OERPs identified using conventional time-domain averaging was poor. In contrast, the discrimination performance of the EEG response identified in the time-frequency domain was relatively high. Furthermore, we found a significant correlation between the magnitude of this response and the psychophysical olfactory score. Time-frequency analysis of the EEG responses to olfactory stimulation could be used as an effective and reliable diagnostic tool for the objective clinical evaluation of olfactory function in patients.
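The key point, computing time-frequency power on single trials before averaging so that non-phase-locked activity survives, can be illustrated with a hand-rolled Morlet-wavelet sketch (the parameters and function name are illustrative assumptions, not the authors' settings):

```python
import numpy as np

def single_trial_tf_power(trials, fs, freqs, n_cycles=5):
    """Time-frequency power via Morlet wavelet convolution, with power taken
    on each trial *before* averaging, so non-phase-locked activity is
    retained (averaging the raw trials first would cancel it out)."""
    trials = np.asarray(trials, dtype=float)
    n_trials, n_samples = trials.shape
    power = np.zeros((len(freqs), n_samples))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)          # temporal width
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.abs(wavelet).sum()
        for trial in trials:
            power[i] += np.abs(np.convolve(trial, wavelet, mode="same")) ** 2
    return power / n_trials
```

With a burst whose phase jitters across trials, the time-domain average is nearly flat, yet the trial-wise power at the burst frequency remains clearly visible.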

  7. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    Science.gov (United States)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research.

  8. Trend analysis of human error events and assessment of their proactive prevention measure at Rokkasho reprocessing plant

    International Nuclear Information System (INIS)

    Yamazaki, Satoru; Tanaka, Izumi; Wakabayashi, Toshio

    2012-01-01

    A trend analysis of human error events is important for preventing the recurrence of human error events. We propose a new method for identifying the common characteristics from results of trend analysis, such as the latent weakness of organization, and a management process for strategic error prevention. In this paper, we describe a trend analysis method for human error events that have been accumulated in the organization and the utilization of the results of trend analysis to prevent accidents proactively. Although the systematic analysis of human error events, the monitoring of their overall trend, and the utilization of the analyzed results have been examined for the plant operation, such information has never been utilized completely. Sharing information on human error events and analyzing their causes lead to the clarification of problems in the management and human factors. This new method was applied to the human error events that occurred in the Rokkasho reprocessing plant from 2010 October. Results revealed that the output of this method is effective in judging the error prevention plan and that the number of human error events is reduced to about 50% those observed in 2009 and 2010. (author)

  9. Analysis of the events on the operating of the wrong compartment of NPPs

    International Nuclear Information System (INIS)

    Zheng Lixin; Zhou Hong; Zhang Hao; Che Shuwei; Zhang Jiajun

    2013-01-01

    In this paper, an operational event in which a unit trip was caused by operating the wrong compartment due to personnel error is introduced. Through in-depth research on this kind of event, the causes of the events are identified and some suggestions are put forward. This can provide a reference for preventing similar events from recurring at other NPPs. (authors)

  10. Statistical methods for the time-to-event analysis of individual participant data from multiple epidemiological studies

    DEFF Research Database (Denmark)

    Thompson, Simon; Kaptoge, Stephen; White, Ian

    2010-01-01

    Meta-analysis of individual participant time-to-event data from multiple prospective epidemiological studies enables detailed investigation of exposure-risk relationships, but involves a number of analytical challenges....

  11. Analysis of the highest transverse energy events seen in the UA1 detector at the Sp̄pS collider

    International Nuclear Information System (INIS)

    1987-06-01

    The first full solid angle analysis of large transverse energy events in pp̄ collisions at the CERN collider is presented. Events with transverse energies in excess of 200 GeV at √s = 630 GeV are studied for any non-standard physics and quantitatively compared with expectations from perturbative QCD Monte Carlo models. A corrected differential cross section is presented. A detailed examination is made of jet profiles, event jet multiplicities and the fraction of the transverse energy carried by the two jets with the highest transverse jet energies. There is good agreement with standard theory for events with transverse energies up to the largest observed values (approx. √s/2) and the analysis shows no evidence for any non-QCD mechanism to account for the event characteristics. (author)

  12. Exploratory trend and pattern analysis of 1981 through 1983 Licensee Event Report data. Main report. Volume 1

    International Nuclear Information System (INIS)

    Hester, O.V.; Groh, M.R.; Farmer, F.G.

    1986-10-01

    This report presents an overview of the 1981 through 1983 Sequence Coding and Search System (SCSS) data base that contains nuclear power plant operational data derived from Licensee Event Reports (LERs) submitted to the United States Nuclear Regulatory Commission (USNRC). Both overall event reporting and events related to specific components, subsystems, systems, and personnel are discussed. At all of these levels of information, software is used to generate count data for contingency tables. Contingency table analysis is the main tool for the trend and pattern analysis. The tables focus primarily on faults associated with various components and other items of interest across different plants. The abstracts and other SCSS information on the LERs accounting for unusual counts in the tables were examined to gain insights from the events. Trends and patterns in LER reporting and reporting of events for various component groups were examined through log-linear modeling techniques
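The contingency-table step of such a trend analysis can be illustrated with SciPy's chi-square test of independence on a small fault-by-plant table (the counts are invented for illustration, not actual LER/SCSS data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical fault counts by component group (rows) and plant (columns).
table = np.array([
    [30, 14, 25],   # valve faults
    [12, 22, 10],   # pump faults
    [8,  9,  21],   # instrumentation faults
])

chi2, p, dof, expected = chi2_contingency(table)
# A small p-value suggests the fault pattern differs across plants,
# flagging cells worth a closer look in the underlying event reports.
```

Log-linear modeling, as used in the report, generalizes this test to multi-way tables and lets individual interaction terms be examined.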

  13. Analysis of the highest transverse energy events seen in the UA1 detector at the Sp̄pS collider

    International Nuclear Information System (INIS)

    Albajar, C.; Bezaguet, A.; Cennini, P.

    1987-01-01

    This is the first full solid angle analysis of large transverse energy events in pp̄ collisions at the CERN collider. Events with transverse energies in excess of 200 GeV at √s=630 GeV are studied for any non-standard physics and quantitatively compared with expectations from perturbative QCD Monte Carlo models. A corrected differential cross section is presented. A detailed examination is made of jet profiles, event jet multiplicities and the fraction of the transverse energy carried by the two jets with the highest transverse jet energies. There is good agreement with standard theory for events with transverse energies up to the largest observed values (≅ √s/2) and the analysis shows no evidence for any non-QCD mechanism to account for the event characteristics. (orig.)

  14. SENTINEL EVENTS

    Directory of Open Access Journals (Sweden)

    Andrej Robida

    2004-09-01

    Full Text Available Background. The objective of the article is a two-year statistics on sentinel events in hospitals. Results of a survey on sentinel events and the attitude of hospital leaders and staff are also included. Some recommendations regarding patient safety and the handling of sentinel events are given. Methods. In March 2002 the Ministry of Health introduced a voluntary reporting system on sentinel events in Slovenian hospitals. Sentinel events were analyzed according to the place of the event, its content, and root causes. To show results of the first year, a conference for hospital directors and medical directors was organized. A survey was conducted among the participants with the purpose of gathering information about their view on sentinel events. One hundred questionnaires were distributed. Results. Sentinel events. There were 14 reports of sentinel events in the first year and 7 in the second. In 4 cases reports were received only after written reminders were sent to the responsible persons; in one case no report was obtained. There were 14 deaths, 5 of these were in-hospital suicides, 6 were due to an adverse event, 3 were unexplained. Events not leading to death were a suicide attempt, a wrong-side surgery, a paraplegia after spinal anaesthesia, a fall with a femoral neck fracture, damage to the spleen during pleural space drainage, inadvertent embolization with absolute alcohol into a femoral artery, and a physical attack on a physician by a patient. Analysis of root causes of sentinel events showed that in most cases processes were inadequate. Survey. One quarter of those surveyed did not know about the sentinel event reporting system. 16% reported actual problems when reporting events, and 47% believed that there was an attempt to blame individuals. Obstacles to reporting events openly were fear of consequences, moral shame, fear of public disclosure of the names of participants in the event, and exposure in the mass media.
The majority of

  15. A framework for analysis of sentinel events in medical student education.

    Science.gov (United States)

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  16. Recent adaptive events in human brain revealed by meta-analysis of positively selected genes.

    Directory of Open Access Journals (Sweden)

    Yue Huang

    Full Text Available BACKGROUND AND OBJECTIVES: Analysis of positively-selected genes can help us understand how humans evolved, especially the evolution of highly developed cognitive functions. However, previous works have reached conflicting conclusions regarding whether human neuronal genes are over-represented among genes under positive selection. METHODS AND RESULTS: We divided positively-selected genes into four groups according to the identification approaches, compiling a comprehensive list from 27 previous studies. We showed that genes that are highly expressed in the central nervous system are enriched in recent positive selection events in human history identified by intra-species genomic scan, especially in brain regions related to cognitive functions. This pattern holds when different datasets, parameters and analysis pipelines were used. Functional category enrichment analysis supported these findings, showing that synapse-related functions are enriched in genes under recent positive selection. In contrast, immune-related functions, for instance, are enriched in genes under ancient positive selection revealed by inter-species coding region comparison. We further demonstrated that most of these patterns still hold even after controlling for genomic characteristics that might bias genome-wide identification of positively-selected genes including gene length, gene density, GC composition, and intensity of negative selection. CONCLUSION: Our rigorous analysis resolved previous conflicting conclusions and revealed recent adaptation of human brain functions.
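The functional category enrichment analysis mentioned here is typically a hypergeometric (Fisher-style) over-representation test; a sketch with invented counts (not figures from the study):

```python
from scipy.stats import hypergeom

# Invented counts for illustration: of N genes genome-wide, K are annotated
# to a category (e.g. synapse-related); a selection scan flags n genes,
# k of which fall in the category.
N, K, n, k = 20000, 1500, 300, 45

# One-sided over-representation p-value: P(X >= k) under random draws.
p_enrich = hypergeom.sf(k - 1, N, K, n)
```

Here the expected overlap by chance is n*K/N = 22.5 genes, so observing 45 yields a very small p-value, i.e. a strong enrichment signal.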

  17. Analysis of syntactic and semantic features for fine-grained event-spatial understanding in outbreak news reports

    Directory of Open Access Journals (Sweden)

    Chanlekha Hutchatai

    2010-03-01

    Full Text Available Background: Previous studies have suggested that epidemiological reasoning needs a fine-grained modelling of events, especially their spatial and temporal attributes. While the temporal analysis of events has been intensively studied, far less attention has been paid to their spatial analysis. This article aims at filling the gap concerning automatic event-spatial attribute analysis in order to support health surveillance and epidemiological reasoning. Results: In this work, we propose a methodology that provides a detailed analysis of each event reported in news articles to recover the most specific locations where it occurs. Various features for recognizing spatial attributes of the events were studied and incorporated into the models, which were trained by several machine learning techniques. The best performance for spatial attribute recognition is very promising: 85.9% F-score (86.75% precision/85.1% recall). Conclusions: We extended our work on event-spatial attribute recognition by focusing on machine learning techniques, namely CRF, SVM, and decision trees. Our approach avoided the costly development of an external knowledge base by employing feature sources that can be acquired locally from the analyzed document. The results showed that the CRF model performed the best. Our study indicated that the nearest location and previous event location are the most important features for the CRF and SVM models, while the location extracted from the verb's subject is the most important to the decision tree model.

  18. Multi dimensional analysis of Design Basis Events using MARS-LMR

    International Nuclear Information System (INIS)

    Woo, Seung Min; Chang, Soon Heung

    2012-01-01

    Highlights: ► The one-dimensionally analyzed sodium hot pool is modified to a three-dimensional node system, because a one-dimensional analysis cannot represent the phenomena inside a large pool with many internal components. ► The results of the multi-dimensional analysis were compared with the one-dimensional analysis results in normal operation, TOP (Transient of Over Power), LOF (Loss of Flow), and LOHS (Loss of Heat Sink) conditions. ► Differences in the sodium flow pattern due to structure effects in the hot pool and in core mass flow rates lead to different sodium temperatures and temperature histories under transient conditions. - Abstract: KALIMER-600 (Korea Advanced Liquid Metal Reactor), which is a pool-type SFR (Sodium-cooled Fast Reactor), was developed by KAERI (Korea Atomic Energy Research Institute). DBE (Design Basis Events) for KALIMER-600 have been analyzed in one dimension. In this study, the one-dimensionally analyzed sodium hot pool is modified to a three-dimensional node system, because a one-dimensional analysis cannot represent the phenomena inside a large pool with many internal components, such as the UIS (Upper Internal Structure), IHX (Intermediate Heat eXchanger), DHX (Decay Heat eXchanger), and pump. The results of the multi-dimensional analysis were compared with the one-dimensional analysis results in normal operation, TOP (Transient of Over Power), LOF (Loss of Flow), and LOHS (Loss of Heat Sink) conditions. First, the results in the normal operation condition show good agreement between the one- and multi-dimensional analyses. However, according to the sodium temperatures of the core inlet, outlet, the fuel centerline, cladding and PDRC (Passive Decay heat Removal Circuit), the temperatures of the one-dimensional analysis are generally higher than those of the multi-dimensional analysis in conditions other than the normal operation state, and the PDRC operation time in the one-dimensional analysis is generally longer than

  19. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    Science.gov (United States)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during the previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by distances of, on average, 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera version 9.1, which is set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5 - 5.6 G in the stationary sensors, and a lens model AF-S ED 24 mm - 1:1.4 in the mobile sensor). All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four RAMMER manual operation days in the campaigns of 2012 and 2013. On Feb. 18th the data set is composed of 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes was registered by two cameras and 1 flash was registered by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can generate some imprecision during the optical analysis; therefore this work aims to evaluate the effect of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  20. Monitoring As A Helpful Means In Forensic Analysis Of Dams Static Instability Events

    Science.gov (United States)

    Solimene, Pellegrino

    2013-04-01

    Monitoring is a means of controlling the behavior of a structure, which during its operational life is subject to external actions, both ordinary loading conditions and disturbing ones; these factors overlap in the random manner defined by the statistical parameter of the return period. The analysis of monitoring data is crucial to form a reasoned opinion on the reliability of the structure and its components, and also makes it possible to identify, in the overall operational scenario, the time for preparing interventions aimed at maintaining optimum levels of functionality and safety. The concept of monitoring in terms of prevention is coupled with the activity of the forensic engineer who, appointed by the judiciary after the occurrence of an accident, turns his experience (the "scientific knowledge") into an "inverse analysis" in which he sums up the results of a survey, drawing also on the data sets collected during the continuous monitoring of causes and effects, so as to determine the correlations between these factors. His activity aims to contribute to identifying the typicality of an event, which represents, together with the causal link between conduct and event and its contra-juridical character, the factors for judging whether a hypothesis of crime exists and who is liable according to law. In Italy there are about 10,000 dams of varying sizes, but only a small portion of them are considered "large dams" and subjected to a rigorous program of regular inspections and monitoring, in application of specific rules. The rest, "small" dams, conventionally defined as such by the standard but not by their impact on the area, receives a heterogeneous response from the local authorities entrusted with this task: there is therefore a high potential risk scenario, determined by the presence of not completely controlled structures that may affect even heavily populated areas. Risk can be traced back to acceptable levels if they were implemented with the

  1. Probabilistic safety analysis on an SBWR 72 hours after the initiating event

    International Nuclear Information System (INIS)

    Dominguez Bautista, M.T.; Peinador Veira, M.

    1996-01-01

    Passive plants, including SBWRs, are designed to carry out safety functions with passive systems during the first 72 hours after the initiating event with no need for manual actions or external support. After this period, some recovery actions are required to enable the passive systems to continue performing their safety functions. The study was carried out by the INITEC-Empresarios Agrupados Joint Venture within the framework of the international group collaborating with GE on this project. Its purpose has been to assess, by means of probabilistic criteria, the importance to safety of each of these support actions, in order to define possible requirements to be considered in the design in respect of said recovery actions. In brief, the methodology developed for this objective consists of (1) quantifying success event trees from the PSA up to 72 hours, (2) determining the actions required in each sequence to maintain steady state after 72 hours, (3) identifying available alternative core cooling methods in each sequence, (4) establishing the approximate (order of magnitude) realizability of each alternative method, (5) calculating the frequency of core damage as a function of the failure probability of post-72-hour actions and (6) analysing the importance of post-72-hour actions. The results of this analysis permit the establishment, right from the conceptual design phase, of the requirements that will arise to ensure these actions in the long term, enhancing their reliability and preventing the accident from continuing beyond this period. (Author)
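Step (5), expressing core damage frequency as a function of post-72-hour failure probabilities, reduces in a minimal cut-set view to summing each sequence's frequency times the joint failure probability of its recovery action and alternatives. A toy sketch (all numbers invented, not the INITEC/Empresarios Agrupados model):

```python
from math import prod

def post72_core_damage_freq(sequences):
    """Each entry is (sequence frequency per year, [failure probabilities of
    the post-72-hour recovery action and of each alternative cooling
    method]); core damage requires all of them to fail."""
    return sum(freq * prod(p_fail) for freq, p_fail in sequences)

# Two hypothetical sequences, one with an alternative cooling method.
sequences = [
    (1e-4, [1e-2, 1e-1]),   # recovery action fails AND the alternative fails
    (5e-6, [5e-2]),         # no alternative available
]
cdf = post72_core_damage_freq(sequences)   # about 3.5e-7 per year
```

Varying the action failure probabilities in such a sum is what reveals which post-72-hour actions are most important to reliability.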

  2. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    Science.gov (United States)

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
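The concordance index used above as the prediction-performance metric can be computed directly; this is a minimal sketch for right-censored data (a pair is comparable only when the earlier time is an observed event), not the evaluation code of the study.

```python
# Minimal concordance index (c-index) for right-censored time-to-event
# data: the fraction of comparable pairs in which the subject with the
# earlier observed event also has the higher predicted risk.

def concordance_index(times, events, risk_scores):
    """times: observed times; events: 1 if event observed, 0 if censored;
    risk_scores: higher score = higher predicted risk (earlier event)."""
    concordant, tied, comparable = 0.0, 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # pair (i, j) is comparable if subject i had the event first
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    tied += 0.5  # ties count as half-concordant
    return (concordant + tied) / comparable

times = [2, 4, 6, 8]
events = [1, 1, 0, 1]          # third subject is censored
scores = [0.9, 0.7, 0.4, 0.1]  # perfectly anti-ordered with time
print(concordance_index(times, events, scores))  # → 1.0
```

A value of 0.5 corresponds to random ranking and 1.0 to perfect concordance, which is why the metric is well suited for comparing Cox and ML-based models on the same simulated data.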

  3. ERPLAB: An Open-Source Toolbox for the Analysis of Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Javier eLopez-Calderon

    2014-04-01

    Full Text Available ERPLAB Toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.

  4. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    events dating back several decades is analysed. Precipitation thresholds varying in space and time are established using highly resolved INCA data of the Austrian weather service. Parameters possibly controlling the basic susceptibility of catchments are evaluated in a regional GIS analysis (vegetation, geology, topography, stream network, proxies for sediment availability). Similarity measures are then used to group catchments into sensitivity classes. Applying different climate scenarios, the spatiotemporal distribution of catchments sensitive towards heavier and more frequent precipitation can be determined giving valuable advice for planning and managing mountain protection zones.

  5. Time compression of soil erosion by the effect of largest daily event. A regional analysis of USLE database.

    Science.gov (United States)

    Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.

    2009-04-01

    When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedented erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focussed on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes: a large share of the total soil eroded is produced in very short temporal intervals, i.e. a few events mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, produce a high amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive permanent attention, heightened at present by scenarios of global change. Notwithstanding, the time compression of geomorphological processes can be approached not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by their rank, whatever their magnitude. In previous research on the time compression of soil erosion, using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total amount of daily erosive events recorded by plot and the percentage contribution to total soil erosion of the n-largest aggregated daily events. Now we offer a further refined analysis comparing different agricultural regions in the USA. To do that we have analyzed data from 594 erosion plots from USLE
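The rank-based measure described above - the share of total soil erosion contributed by the n largest daily events on a plot - can be sketched as follows; the daily erosion values are invented, not USLE data.

```python
# Sketch of the "largest events by rank" measure: the percentage of
# total soil erosion contributed by the n largest daily events recorded
# on a plot, whatever their absolute magnitude. Values are invented.

def largest_events_contribution(daily_erosion, n):
    """Share (%) of total erosion produced by the n largest daily events."""
    total = sum(daily_erosion)
    top_n = sorted(daily_erosion, reverse=True)[:n]
    return 100.0 * sum(top_n) / total

plot_record = [0.1, 0.3, 12.0, 0.2, 45.0, 1.4, 0.05, 8.0]  # t/ha per event
print(round(largest_events_contribution(plot_record, 3), 1))  # → 96.9
```

Here 3 of 8 events account for nearly all of the eroded soil, which is exactly the time-compression pattern the abstract describes.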

  6. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not address the internal data structure concerning extremes adequately (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading in ozone depleting substances led to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. Especially, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. Findings described above could be proven also for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle) showing that strong influence of atmospheric

  7. Investigation of Lab Fire Prevention Management System of Combining Root Cause Analysis and Analytic Hierarchy Process with Event Tree Analysis

    Directory of Open Access Journals (Sweden)

    Cheng-Chan Shih

    2016-01-01

    Full Text Available This paper proposes a new approach combining root cause analysis (RCA), analytic hierarchy process (AHP), and event tree analysis (ETA) in a loop to systematically evaluate various laboratory fire prevention strategies. First, 139 fire accidents were reviewed to identify the root causes and draw out prevention strategies. Most fires were caused by runaway reactions, operation error and equipment failure, and flammable material release, and mostly occurred in workplaces with no prompt fire protection. We also used AHP to evaluate the priority of these strategies and found that the chemical fire prevention strategy is the most important control element, and strengthening maintenance and safety inspection intensity is the most important action. Together with our survey results, we propose that equipment design is also critical for fire prevention. A technical improvement was therefore propounded: installing a fire detector, automatic sprinkler, and manual extinguisher in the lab hood as proactive fire protections. ETA was then used as a tool to evaluate laboratory fire risks. The results indicated that the total risk of a fire occurring decreases from 0.0351 to 0.0042 when these equipment actions are taken. Establishing such a system allows the Environment, Health and Safety (EH&S) office not only to analyze and prioritize fire prevention policies more practically, but also to demonstrate how much protective equipment improvements can achieve, and the probability of the initiating event developing into a serious accident or being controlled by the existing safety system.
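The ETA step can be illustrated with a minimal event-tree quantification: an initiating-event frequency propagated through independent barrier failure branches. The frequencies and branch probabilities below are hypothetical, not those derived in the study.

```python
# Minimal event-tree quantification sketch: the frequency of a serious
# fire is the initiating-event frequency multiplied by the failure
# probabilities of each protective barrier in sequence (detector,
# sprinkler, manual extinguisher). All numbers are hypothetical.

def serious_fire_risk(f_init, barrier_failure_probs):
    """Frequency of the initiating event escalating past every barrier,
    assuming independent barriers: f * p1 * p2 * ... * pn."""
    risk = f_init
    for p in barrier_failure_probs:
        risk *= p
    return risk

f_ignition = 0.5  # ignitions per lab-year (hypothetical)
without_upgrade = serious_fire_risk(f_ignition, [0.5, 0.4])
with_upgrade = serious_fire_risk(f_ignition, [0.1, 0.2, 0.3])
print(without_upgrade, with_upgrade)
```

Adding a barrier and lowering each failure probability multiplies down the end-branch frequency, which is the mechanism behind the risk reduction reported in the abstract.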

  8. Single-Event Effects in High-Frequency Linear Amplifiers: Experiment and Analysis

    Science.gov (United States)

    Zeinolabedinzadeh, Saeed; Ying, Hanbin; Fleetwood, Zachary E.; Roche, Nicolas J.-H.; Khachatrian, Ani; McMorrow, Dale; Buchner, Stephen P.; Warner, Jeffrey H.; Paki-Amouzou, Pauline; Cressler, John D.

    2017-01-01

    The single-event transient (SET) response of two different silicon-germanium (SiGe) X-band (8-12 GHz) low noise amplifier (LNA) topologies is fully investigated in this paper. The two LNAs were designed and implemented in a 130 nm SiGe HBT BiCMOS process technology. Two-photon absorption (TPA) laser pulses were utilized to induce transients within various devices in these LNAs. Impulse response theory is identified as a useful tool for predicting the settling behavior of the LNAs subjected to heavy ion strikes. Comprehensive device and circuit level modeling and simulations were performed to accurately simulate the behavior of the circuits under ion strikes. The simulations agree well with TPA measurements. The simulation, modeling and analysis presented in this paper can be applied to other circuit topologies for SET modeling and prediction.

  9. A Study on Degree of Conservatism of PZR Inventory during Event Analysis

    International Nuclear Information System (INIS)

    Lee, Sang Seob; Park, Min Soo; Huh, Jae Yong; Lee, Gyu Cheon

    2016-01-01

    The pressurizer safety valves (PSVs) are installed in OPR1000 plants. While the pressurizer pilot operated safety relief valve (POSRV) of the APR1400 is designed to discharge steam and/or water, the PSV is designed to discharge steam only. To check the degree of conservatism of the PZR water level during PSV operation, a study has been performed using the computer code RELAP5/MOD3.3. The degree of conservatism is evaluated with respect to the PZR inventory for the OPR1000 plant. It can be concluded that there is no possibility that liquid goes through the PSVs during a PLCS malfunction, because the expected maximum PZR inventory would remain below the PSV nozzle under the conservative assumptions. With the site-specific PSV characteristics, a degree of conservatism would be determined to guarantee the PSV integrity during the event. To guarantee the PSV integrity, an independent analysis is recommended.

  10. A Study on Degree of Conservatism of PZR Inventory during Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Seob; Park, Min Soo; Huh, Jae Yong; Lee, Gyu Cheon [KEPCO Engineering and Construction Co. Ltd., Deajeon (Korea, Republic of)

    2016-10-15

    The pressurizer safety valves (PSVs) are installed in OPR1000 plants. While the pressurizer pilot operated safety relief valve (POSRV) of the APR1400 is designed to discharge steam and/or water, the PSV is designed to discharge steam only. To check the degree of conservatism of the PZR water level during PSV operation, a study has been performed using the computer code RELAP5/MOD3.3. The degree of conservatism is evaluated with respect to the PZR inventory for the OPR1000 plant. It can be concluded that there is no possibility that liquid goes through the PSVs during a PLCS malfunction, because the expected maximum PZR inventory would remain below the PSV nozzle under the conservative assumptions. With the site-specific PSV characteristics, a degree of conservatism would be determined to guarantee the PSV integrity during the event. To guarantee the PSV integrity, an independent analysis is recommended.

  11. Emergency Load Shedding Strategy Based on Sensitivity Analysis of Relay Operation Margin against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun Sun

    2012-01-01

    In order to prevent long term voltage instability and induced cascading events, a load shedding strategy based on the sensitivity of relay operation margin to load powers is discussed and proposed in this paper. The operation margin of the critical impedance backup relay is defined to identify the runtime emergent states of the related system components. Based on sensitivity analysis between the relay operation margin and power system state variables, an optimal load shedding strategy is applied to adjust the emergent states timely before the unwanted relay operation. Load dynamics is also taken into account to compensate the load shedding amount calculation, and multi-agent technology is applied for the whole strategy implementation. A test system is built in the real time digital simulator (RTDS) and has demonstrated the effectiveness of the proposed strategy.

  12. Superposed ruptile deformational events revealed by field and VOM structural analysis

    Science.gov (United States)

    Kumaira, Sissa; Guadagnin, Felipe; Keller Lautert, Maiara

    2017-04-01

    Virtual outcrop models (VOM) are becoming an important tool in the analysis of geological structures due to the possibility of obtaining the geometry, and in some cases kinematic aspects, of the analyzed structures in a tridimensional photorealistic space. These data are used to gain quantitative information on deformational features which, coupled with numeric models, can assist in understanding deformational processes. Old basement units commonly register superposed deformational events, either ductile or ruptile, along their evolution. The Porongos Belt, located in southern Brazil, has a complex deformational history, registering at least five ductile and ruptile deformational events. In this study, we present a structural analysis of a quarry in the Porongos Belt, coupling field and VOM structural information to understand the processes involved in the last two deformational events. Field information was acquired using traditional structural methods for the analysis of ruptile structures, such as descriptions, drawings, acquisition of orientation vectors and kinematic analysis. The VOM was created from the image-based modeling method through photogrammetric data acquisition and orthorectification. Photogrammetric data were acquired using a Sony a3500 camera; a total of 128 photographs were taken from ca. 10-20 m from the outcrop in different orientations. Thirty-two control point coordinates were acquired using a combination of RTK dGPS surveying and total station work, providing a precision of a few millimeters for x, y and z. The photographs were imported into the PhotoScan software to create a 3D dense point cloud from the structure-from-motion algorithm, which was triangulated and textured to generate the VOM. The VOM was georeferenced (oriented and scaled) using the ground control points, and later analyzed in the OpenPlot software to extract structural information. Data were imported into the Wintensor software to obtain tensor orientations, and the Move software to process and

  13. Analysis of core-concrete interaction event with flooding for the Advanced Neutron Source reactor

    International Nuclear Information System (INIS)

    Kim, S.H.; Taleyarkhan, R.P.; Georgevich, V.; Navarro-Valenti, S.

    1993-01-01

    This paper discusses salient aspects of the methodology, assumptions, and modeling of various features related to estimation of source terms from an accident involving a molten core-concrete interaction event (with and without flooding) in the Advanced Neutron Source (ANS) reactor at the Oak Ridge National Laboratory. Various containment configurations are considered for this postulated severe accident. Several design features (such as rupture disks) are examined to study containment response during this severe accident. Also, thermal-hydraulic response of the containment and radionuclide transport and retention in the containment are studied. The results are described as transient variations of source terms, which are then used for studying off-site radiological consequences and health effects for the support of the Conceptual Safety Analysis Report for ANS. The results are also to be used to examine the effectiveness of subpile room flooding during this type of severe accident

  14. Retrospective Analysis of Communication Events - Understanding the Dynamics of Collaborative Multi-Party Discourse

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, Andrew J.; Haack, Jereme N.; McColgin, Dave W.

    2006-06-08

    This research is aimed at understanding the dynamics of collaborative multi-party discourse across multiple communication modalities. Before we can truly make significant strides in devising collaborative communication systems, there is a need to understand how typical users utilize computationally supported communications mechanisms such as email, instant messaging, video conferencing, chat rooms, etc., both singularly and in conjunction with traditional means of communication such as face-to-face meetings, telephone calls and postal mail. Attempting to understand an individual's communications profile with access to only a single modality is challenging at best and often futile. Here, we discuss the development of RACE - Retrospective Analysis of Communications Events - a test-bed prototype to investigate issues relating to multi-modal multi-party discourse.

  15. Transition-Region Ultraviolet Explosive Events in IRIS Si IV: A Statistical Analysis

    Science.gov (United States)

    Bartz, Allison

    2018-01-01

    Explosive events (EEs) in the solar transition region are characterized by broad, non-Gaussian line profiles with wings at Doppler velocities exceeding the speed of sound. We present a statistical analysis of 23 IRIS (Interface Region Imaging Spectrograph) sit-and-stare observations, observed between April 2014 and March 2017. Using the IRIS Si IV 1394 Å and 1403 Å spectral windows and the 1400Å Slit Jaw images we have identified 581 EEs. We found that most EEs last less than 20 min. and have a spatial scale on the slit less than 10”, agreeing with measurements in previous work. We observed most EEs in active regions, regardless of date of observation, but selection bias of IRIS observations cannot be ruled out. We also present preliminary findings of optical depth effects from our statistical study.

  16. An Entry Point for Formal Methods: Specification and Analysis of Event Logs

    Directory of Open Access Journals (Sweden)

    Howard Barringer

    2010-03-01

    Full Text Available Formal specification languages have long languished, due to the grave scalability problems faced by complete verification methods. Runtime verification promises to use formal specifications to automate part of the more scalable art of testing, but has not been widely applied to real systems, and often falters due to the cost and complexity of instrumentation for online monitoring. In this paper we discuss work in progress to apply an event-based specification system to the logging mechanism of the Mars Science Laboratory mission at JPL. By focusing on log analysis, we exploit the "instrumentation" already implemented and required for communicating with the spacecraft. We argue that this work both shows a practical method for using formal specifications in testing and opens interesting research avenues, including a challenging specification learning problem.
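A toy version of such event-log analysis - checking a simple ordering property over a recorded log rather than instrumenting the running system - might look as follows; the event names and log contents are invented for illustration, not MSL telemetry.

```python
# Toy runtime-verification check over an event log: every
# "command_complete" event must be preceded by a "command_sent" event
# for the same command id. Event names and the log are invented.

def check_log(log):
    """Return a list of violations (index, command id) of the property:
    no command completes before it was sent."""
    sent, violations = set(), []
    for i, (name, cmd_id) in enumerate(log):
        if name == "command_sent":
            sent.add(cmd_id)
        elif name == "command_complete" and cmd_id not in sent:
            violations.append((i, cmd_id))
    return violations

log = [("command_sent", 1), ("command_complete", 1), ("command_complete", 2)]
print(check_log(log))  # → [(2, 2)]
```

Because the check runs offline over the already-produced log, it needs no online instrumentation, which is exactly the cost the abstract argues log analysis avoids.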

  17. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    Full Text Available The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies for all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the values of precipitation registered between 28 November and 7 December were so remarkable that the episode of 1876 still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of that time, meteorological data recently digitised from several stations in Portugal and Spain and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper air fields in order to shed some light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over

  18. Analysis of factors associated with hiccups based on the Japanese Adverse Drug Event Report database.

    Science.gov (United States)

    Hosoya, Ryuichiro; Uesawa, Yoshihiro; Ishii-Nozawa, Reiko; Kagaya, Hajime

    2017-01-01

    Hiccups are occasionally experienced by most individuals. Although hiccups are not life-threatening, they may lead to a decline in quality of life. Previous studies showed that hiccups may occur as an adverse effect of certain medicines during chemotherapy. Furthermore, a male dominance in hiccups has been reported. However, due to the limited number of studies conducted on this phenomenon, debate still surrounds the few factors influencing hiccups. The present study aimed to investigate the influence of medicines and patient characteristics on hiccups using a large-sized adverse drug event report database and, specifically, the Japanese Adverse Drug Event Report (JADER) database. Cases of adverse effects associated with medications were extracted from JADER, and Fisher's exact test was performed to assess the presence or absence of hiccups for each medication. In a multivariate analysis, we conducted a multiple logistic regression analysis using medication and patient characteristic variables exhibiting significance. We also examined the role of dexamethasone in inducing hiccups during chemotherapy. Medicines associated with hiccups included dexamethasone, levofolinate, fluorouracil, oxaliplatin, carboplatin, and irinotecan. Patient characteristics associated with hiccups included a male gender and greater height. The combination of anti-cancer agent and dexamethasone use was noted in more than 95% of patients in the dexamethasone-use group. Hiccups also occurred in patients in the anti-cancer agent-use group who did not use dexamethasone. Most of the medications that induce hiccups are used in chemotherapy. The results of the present study suggest that it is possible to predict a high risk of hiccups using patient characteristics. We confirmed that dexamethasone was the drug that has the strongest influence on the induction of hiccups. However, the influence of anti-cancer agents on the induction of hiccups cannot be denied. We consider the results of the present
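The per-medication Fisher's exact test mentioned above can be sketched in pure Python for a 2x2 table (medication reported vs. not x hiccups vs. no hiccups); shown here as a one-sided test via the hypergeometric distribution, with invented counts rather than JADER data.

```python
from math import comb

# One-sided Fisher's exact test for a 2x2 contingency table
#   [[a, b],
#    [c, d]]
# e.g. rows: drug reported / not reported; columns: hiccups / no hiccups.
# p = P(X >= a) under the hypergeometric null. Counts are invented.

def fisher_exact_greater(a, b, c, d):
    n, row1, col1 = a + b + c + d, a + b, a + c
    k_max = min(row1, col1)
    # Sum hypergeometric probabilities for tables at least as extreme as a.
    return sum(
        comb(col1, k) * comb(n - col1, row1 - k)
        for k in range(a, k_max + 1)
    ) / comb(n, row1)

p = fisher_exact_greater(3, 1, 1, 3)
print(p)  # 17/70 ≈ 0.2429
```

In a real screen over a report database this test would be repeated per medication, typically followed by multiplicity correction before the logistic regression step.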

  19. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    Science.gov (United States)

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
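The intersection matrix at the heart of ASSET can be sketched as follows: entry (i, j) is the number of neurons active in both time bins i and j, and a repeated SSE shows up as a diagonal stripe of high overlap values. The binned activity below is a toy example, not the authors' implementation.

```python
# Sketch of an ASSET-style intersection matrix: entry (i, j) counts the
# neurons active in both time bin i and time bin j. A repeated sequence
# of synchronous events appears as an off-diagonal stripe of high
# overlaps. The toy binned activity below is invented.

def intersection_matrix(bins):
    """bins: list of sets of active neuron ids, one set per time bin."""
    return [[len(bi & bj) for bj in bins] for bi in bins]

# The 3-bin sequence {0,1}, {2,3}, {4,5} repeats after two noise bins.
bins = [{0, 1}, {2, 3}, {4, 5}, {9}, {8}, {0, 1}, {2, 3}, {4, 5}]
M = intersection_matrix(bins)
# The repetition appears as the diagonal stripe M[0][5], M[1][6], M[2][7].
print(M[0][5], M[1][6], M[2][7])  # → 2 2 2
```

ASSET's contribution is to replace visual inspection of such stripes with the automated statistical steps i)-iii) listed in the abstract.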

  20. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6 with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency, with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs

  1. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, based on past studies and using analysts that are experienced in these techniques, the analyses can be focused in certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights that are obtained by the more costly studies. 82 refs., 35 figs., 27 tabs

  2. Top-down and bottom-up definitions of human failure events in human reliability analysis

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2014-01-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down - defined as a subset of the PRA - whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up - derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  3. Top-down proteomics for the analysis of proteolytic events - Methods, applications and perspectives.

    Science.gov (United States)

    Tholey, Andreas; Becker, Alexander

    2017-11-01

    Mass spectrometry based proteomics is an indispensable tool for almost all research areas relevant to the understanding of proteolytic processing, ranging from the identification of substrates, products and cleavage sites up to the analysis of structural features influencing protease activity. The majority of methods for these studies are based on bottom-up proteomics, performing analysis at the peptide level. As this approach is characterized by a number of pitfalls, e.g. loss of molecular information, there is an ongoing effort to establish top-down proteomics, which performs both separation and MS analysis at the intact protein level. We briefly introduce the major approaches of bottom-up proteomics used in the field of protease research and highlight the shortcomings of these methods. We then discuss the present state of the art of top-down proteomics. Together with a discussion of known challenges, we show the potential of this approach and present a number of successful applications of top-down proteomics in protease research. This article is part of a Special Issue entitled: Proteolysis as a Regulatory Event in Pathophysiology, edited by Stefan Rose-John. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Cardiopulmonary resuscitation in the elderly: analysis of the events in the emergency department

    Directory of Open Access Journals (Sweden)

    Augusto Tricerri

    2013-10-01

    With the increasing number of old people in all western countries and increasing life expectancy at birth, many seniors spend the last period of their life with various afflictions that may lead to cardiac arrest. Bystander cardiopulmonary resuscitation (CPR) increases survival rates. Octogenarians are the fastest growing segment of the population, and despite empirical evidence that CPR is of questionable effectiveness in seniors with comorbidities, it is still the only treatment among life-sustaining ones. Cardiopulmonary resuscitation is frequently unsuccessful, but if survival is achieved, a fairly good quality of life can be expected. Various papers have analyzed the effect of CPR on hospitalized patients or on cardiac arrest occurring at home or in public places, while less is known about events occurring in the emergency room (ER). We performed a retrospective analysis of cardiac arrest events occurring in the ER over 54 months: we analyzed 415,001 records of ER visits (from 01/01/1999 to 30/06/2003) at San Giovanni Addolorata Hospital. Data were analyzed in terms of age and outcome. We identified 475 records with the outcome of death in the ER or death on arrival. Of these, we selected 290 medical records which had sufficient data to be analyzed. Of the 290 patients evaluated, 225 died in the ER, 18 were deemed to be dead on arrival, and 47 survived the cardiac arrest and were admitted to the intensive care unit (ICU). The overall mortality was 0.11%, while the incidence of the selected events was 0.072%. The mean age of the analyzed population was 71.3 years. The only possible diagnosis was often cardiac arrest, though most of the time we could specify and group the diagnosis even better. The analysis of the procedures showed that cardiac arrest treated by direct current (DC) shock was similarly distributed in different age groups, and no difference was detectable between the two groups. The mean age of the patients who underwent tracheal intubation (TI) was

  5. Climate change impacts on extreme events in the United States: an uncertainty analysis

    Science.gov (United States)

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  6. Coordination activities of human planners during rescheduling: Case analysis and event handling procedure

    OpenAIRE

    2010-01-01

    This paper addresses the process of event handling and rescheduling in manufacturing practice. Firms are confronted with many diverse events, like new or changed orders, machine breakdowns, and material shortages. These events influence the feasibility and optimality of schedules, and thus induce rescheduling. In many manufacturing firms, schedules are created by several human planners. Coordination between them is needed to respond to events adequately. In this paper,...

  7. 6C polarization analysis - seismic direction finding in coherent noise, automated event identification, and wavefield separation

    Science.gov (United States)

    Schmelzbach, C.; Sollberger, D.; Greenhalgh, S.; Van Renterghem, C.; Robertsson, J. O. A.

    2017-12-01

    Polarization analysis of standard three-component (3C) seismic data is an established tool to determine the propagation directions of seismic waves recorded by a single station. A major limitation of seismic direction finding methods using 3C recordings, however, is that a correct propagation-direction determination is only possible if the wave mode is known. Furthermore, 3C polarization analysis techniques break down in the presence of coherent noise (i.e., when more than one event is present in the analysis time window). Recent advances in sensor technology (e.g., fibre-optical, magnetohydrodynamic angular rate sensors, and ring laser gyroscopes) have made it possible to accurately measure all three components of rotational ground motion exhibited by seismic waves, in addition to the conventionally recorded three components of translational motion. Here, we present an extension of the theory of single station 3C polarization analysis to six-component (6C) recordings of collocated translational and rotational ground motions. We demonstrate that the information contained in rotation measurements can help to overcome some of the main limitations of standard 3C seismic direction finding, such as handling multiple arrivals simultaneously. We show that the 6C polarization of elastic waves measured at the Earth's free surface depends not only on the seismic wave type and propagation direction, but also on the local P- and S-wave velocities just beneath the recording station. Using an adaptation of the multiple signal classification (MUSIC) algorithm, we demonstrate how seismic events can be unambiguously identified and characterized in terms of their wave type. Furthermore, we show how the local velocities can be inferred from single-station 6C data, in addition to the direction angles (inclination and azimuth) of seismic arrivals.
A major benefit of our proposed 6C method is that it also allows the accurate recovery of the wave type, propagation directions, and phase
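
    The subspace idea behind MUSIC can be illustrated with a generic narrowband array example, not the authors' 6C formulation: eigendecompose the sample covariance, take the eigenvectors spanning the noise subspace, and scan for steering vectors nearly orthogonal to it. The 8-sensor uniform linear array, source angles, and snapshot count below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

M, d = 8, 0.5                 # sensors, spacing in wavelengths (hypothetical array)
true_deg = [-40.0, 20.0]      # assumed arrival directions
K, T = len(true_deg), 100     # number of sources, snapshots

def steering(theta_deg):
    """Narrowband plane-wave steering vector for the linear array."""
    theta = np.deg2rad(theta_deg)
    return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(theta))

# Synthetic noiseless snapshots: X = A S
A = np.column_stack([steering(t) for t in true_deg])
S = rng.standard_normal((K, T)) + 1j * rng.standard_normal((K, T))
X = A @ S

# Sample covariance and its eigendecomposition (eigh: ascending eigenvalues)
R = X @ X.conj().T / T
eigvals, eigvecs = np.linalg.eigh(R)
En = eigvecs[:, : M - K]      # noise subspace (smallest M-K eigenvalues)

# MUSIC pseudospectrum: large where a(theta) is orthogonal to the noise subspace
grid = np.arange(-90.0, 90.0, 0.5)
P = np.array([1.0 / (np.linalg.norm(En.conj().T @ steering(g)) ** 2 + 1e-12)
              for g in grid])

# Pick the K highest local maxima as direction estimates
peaks = [i for i in range(1, len(grid) - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
best = sorted(peaks, key=lambda i: P[i], reverse=True)[:K]
est = sorted(float(grid[i]) for i in best)
print(est)
```

The paper's 6C adaptation additionally parameterizes the steering vectors by wave type and local velocities, which is what allows those quantities to be estimated jointly with the arrival directions.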

  8. Economic impact and market analysis of a special event: The Great New England Air Show

    Science.gov (United States)

    Rodney B. Warnick; David C. Bojanic; Atul Sheel; Apurv Mather; Deepak Ninan

    2010-01-01

    We conducted a post-event evaluation for the Great New England Air Show to assess its general economic impact and to refine economic estimates where possible. In addition to the standard economic impact variables, we examined travel distance, purchase decision involvement, event satisfaction, and frequency of attendance. Graphic mapping of event visitors' home ZIP...

  9. Analysis of economic and social costs of adverse events associated with blood transfusions in Spain

    Directory of Open Access Journals (Sweden)

    Borja Ribed-Sánchez

    2018-05-01

    Objective: To calculate, for the first time, the direct and social costs of transfusion-related adverse events in order to include them in the National Healthcare System's budget, calculations and studies. In Spain more than 1,500 patients yearly are diagnosed with such adverse events. Method: Blood transfusion-related adverse events recorded yearly in Spanish haemovigilance reports were studied retrospectively (2010-2015). The adverse events were coded according to the classification of Diagnosis-Related Groups. The direct healthcare costs were obtained from public information sources. The productivity loss (social cost) associated with adverse events was calculated using the human capital and hedonic salary methodologies. Results: In 2015, 1,588 patients had adverse events that resulted in direct healthcare costs (4,568,914€) and social costs due to hospitalization (200,724€). Three adverse reactions resulted in patient death (at a social cost of 1,364,805€). In total, the cost of blood transfusion-related adverse events was 6,134,443€ in Spain. For the period 2010-2015, the trends show a reduction in the total number of transfusions (2 vs. 1.91 M; -4.4%). The number of adverse events increased (822 vs. 1,588; +93%), as did their related direct healthcare cost (3.22 vs. 4.57 M€; +42%) and the social cost of hospitalization (0.11 vs. 0.20 M€; +83%). Mortality costs decreased (2.65 vs. 1.36 M€; -48%). Discussion: This is the first time that the costs of post-transfusion adverse events have been calculated in Spain. These new figures and trends should be taken into consideration in any cost-effectiveness study or trial of new surgical techniques or sanitary policies that influence blood transfusion activities.

  10. The use of geoinformatic data and spatial analysis to predict faecal pollution during extreme precipitation events

    Science.gov (United States)

    Ward, Ray; Purnell, Sarah; Ebdon, James; Nnane, Daniel; Taylor, Huw

    2013-04-01

    be a major factor contributing to increased levels of FIO. This study identifies areas within the catchment that are likely to demonstrate elevated erosion rates during extreme precipitation events, which are likely to result in raised levels of FIO. The results also demonstrate that increases in the human faecal marker were associated with the discharge points of wastewater treatment works, and that levels of the marker increased whenever the works discharged untreated wastewaters during extreme precipitation. Spatial analysis also highlighted locations where human faecal pollution was present in areas away from wastewater treatment plants, highlighting the potential significance of inputs from septic tanks and other un-sewered domestic wastewater systems. Increases in the frequency of extreme precipitation events in many parts of Europe are likely to result in increased levels of water pollution from both point- and diffuse-sources, increasing the input of pathogens into surface waters, and elevating the health risks to downstream consumers of abstracted drinking water. This study suggests an approach that integrates water microbiology and geoinformatic data to support a 'prediction and prevention' approach, in place of the traditional focus on water quality monitoring. This work may therefore make a significant contribution to future European water resource management and health protection.

  11. The Association of Unfavorable Traffic Events and Cannabis Usage: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Sorin Hostiuc

    2018-02-01

    Background: In recent years, many epidemiological articles have been published aiming to link driving under the influence of cannabis (DUIC) with the risk of various unfavorable traffic events (UTEs), with sometimes contradictory results. Aim: The primary objective of this study was to analyze whether there is a significant association between DUIC and UTEs. Materials and Methods: We used two meta-analytical methods to assess the statistical significance of the effect size: a random-effects model and an inverse variance heterogeneity model. Results: Twenty-four studies were included in the meta-analysis. We obtained significant increases in the effect size for DUIC tested through blood analysis, with an odds ratio (OR) of 1.97 and a confidence interval (CI) between 1.35 and 2.87; for death as an outcome, with an OR of 1.56 and a CI between 1.16 and 2.09; and for case-control as the type of study, with an OR of 1.99 and a CI between 1.05 and 3.80. Publication bias was very high. Conclusion: Our analysis suggests that the overall effect size for DUIC on UTEs is not statistically significant, but there are significant differences obtained through subgroup analysis. This result might be caused by methodological flaws (which are often encountered in articles on this topic), the indiscriminate employment of the term "cannabis use," or an actual absence of an adverse effect. When a driver is found in traffic with a positive reaction suggesting cannabis use, the result should be corroborated by either objective data regarding marijuana usage (such as blood analyses with clear cut-off values) or a clinical assessment of impairment, before establishing his/her fitness to drive.
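
    The random-effects pooling such meta-analyses rely on can be sketched with the DerSimonian-Laird moment estimator: convert each study's OR and CI to a log odds ratio with a standard error, estimate the between-study variance, then compute an inverse-variance weighted average. The three (OR, 95% CI) entries below are invented illustration values, not the 24 studies analyzed above.

```python
import math

# Hedged sketch of DerSimonian-Laird random-effects pooling of odds ratios.
# Made-up study data: (OR, CI lower bound, CI upper bound)
studies = [(1.8, 1.1, 2.9), (2.4, 1.3, 4.4), (1.2, 0.7, 2.1)]

# Work on the log scale; SE recovered from the CI width (CI = logOR ± 1.96 SE)
y = [math.log(or_) for or_, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
w = [1 / s**2 for s in se]                       # fixed-effect weights

# Between-study variance tau^2 (DerSimonian-Laird moment estimator)
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (len(studies) - 1)) / c)

# Random-effects weights, pooled log OR, and its confidence interval
wr = [1 / (s**2 + tau2) for s in se]
pooled_log = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
pooled_se = 1 / math.sqrt(sum(wr))
or_pooled = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
print(round(or_pooled, 2), [round(x, 2) for x in ci])
```

The inverse variance heterogeneity model mentioned in the abstract weights studies differently and is not reproduced here.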

  13. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  14. Evaluation of Visual Field Progression in Glaucoma: Quasar Regression Program and Event Analysis.

    Science.gov (United States)

    Díaz-Alemán, Valentín T; González-Hernández, Marta; Perera-Sanz, Daniel; Armas-Domínguez, Karintia

    2016-01-01

    To determine the sensitivity, specificity and agreement between the Quasar program, glaucoma progression analysis (GPA II) event analysis, and expert opinion in the detection of glaucomatous progression. The Quasar program is based on linear regression analysis of both mean defect (MD) and pattern standard deviation (PSD). Each series of visual fields was evaluated by three methods: Quasar, GPA II, and four experts. The sensitivity, specificity and agreement (kappa) for each method were calculated, using expert opinion as the reference standard. The study included 439 SITA Standard visual fields from 56 eyes of 42 patients, with a mean of 7.8 ± 0.8 visual fields per eye. When suspected cases of progression were considered stable, the sensitivity and specificity of Quasar, GPA II and the experts were 86.6% and 70.7%, 26.6% and 95.1%, and 86.6% and 92.6%, respectively. When suspected cases of progression were considered as progressing, the sensitivity and specificity of Quasar, GPA II and the experts were 79.1% and 81.2%, 45.8% and 90.6%, and 85.4% and 90.6%, respectively. The agreement between Quasar and GPA II when suspected cases were considered stable or progressing was 0.03 and 0.28, respectively. The degree of agreement between Quasar and the experts when suspected cases were considered stable or progressing was 0.472 and 0.507. The degree of agreement between GPA II and the experts when suspected cases were considered stable or progressing was 0.262 and 0.342. The combination of MD and PSD regression analysis in the Quasar program showed better agreement with the experts and higher sensitivity than GPA II.
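
    The trend component of such analyses can be sketched as an ordinary least-squares fit of a visual-field index against exam time, flagging progression when the slope is significantly negative. The MD series and the critical t value below are invented illustration values; Quasar's actual criteria combine MD and PSD regression and are not reproduced here.

```python
import math

# Hedged sketch: linear-regression (trend) progression test on mean defect.
years = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]          # exam times (years)
md    = [-2.1, -2.4, -2.3, -2.9, -3.1, -3.0, -3.6, -3.8]  # mean defect (dB), invented

n = len(years)
xbar, ybar = sum(years) / n, sum(md) / n
sxx = sum((x - xbar) ** 2 for x in years)
slope = sum((x - xbar) * (y - ybar) for x, y in zip(years, md)) / sxx
intercept = ybar - slope * xbar

# Residual standard error and the slope's standard error
rss = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(years, md))
se_slope = math.sqrt(rss / (n - 2) / sxx)
t = slope / se_slope

# Crude decision rule: slope negative and |t| beyond ~two-sided 5% for 6 df
progressing = slope < 0 and abs(t) > 2.45
print(round(slope, 2), progressing)
```

Event analysis (as in GPA II) instead compares each follow-up field against baseline, so the two approaches can disagree, which is exactly what the agreement statistics above quantify.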

  15. Multiple daytime nucleation events in semi-clean savannah and industrial environments in South Africa: analysis based on observations

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2013-06-01

    Recent studies have shown very high frequencies of atmospheric new particle formation in different environments in South Africa. Our aim here was to investigate the causes of two or three consecutive daytime nucleation events, followed by subsequent particle growth, during the same day. We analysed 108 and 31 such days observed in a polluted industrial environment and a moderately polluted rural environment, respectively, in South Africa. The analysis was based on two years of measurements at each site. After rejecting the days having notable changes in the air mass origin or local wind direction, i.e. the two major reasons for observed multiple nucleation events, we were able to investigate other factors causing this phenomenon. Clouds were present during, or in between, most of the analysed multiple particle formation events. Therefore, some of these events may have been single events, interrupted somehow by the presence of clouds. From further analysis, we propose that the first nucleation and growth event of the day was often associated with the mixing of a residual air layer rich in SO2 (oxidized to sulphuric acid) into the shallow surface-coupled layer. The second nucleation and growth event of the day usually started before midday and was sometimes associated with renewed SO2 emissions of industrial origin. However, it was also evident that vapours other than sulphuric acid were required for the particle growth during both events. This was especially the case when two simultaneously growing particle modes were observed. Based on our analysis, we conclude that the relative contributions of estimated H2SO4 and other vapours to the first and second nucleation and growth events of the day varied from day to day, depending on anthropogenic and natural emissions, as well as atmospheric conditions.

  16. SYSTEMS SAFETY ANALYSIS FOR FIRE EVENTS ASSOCIATED WITH THE ECRB CROSS DRIFT

    International Nuclear Information System (INIS)

    R. J. Garrett

    2001-01-01

    The purpose of this analysis is to systematically identify and evaluate fire hazards related to the Yucca Mountain Site Characterization Project (YMP) Enhanced Characterization of the Repository Block (ECRB) East-West Cross Drift (commonly referred to as the ECRB Cross-Drift). This analysis builds upon prior Exploratory Studies Facility (ESF) System Safety Analyses and incorporates Topopah Springs (TS) Main Drift fire scenarios and ECRB Cross-Drift fire scenarios. Accident scenarios involving fires in the Main Drift and the ECRB Cross-Drift were previously evaluated in "Topopah Springs Main Drift System Safety Analysis" (CRWMS M&O 1995) and the "Yucca Mountain Site Characterization Project East-West Drift System Safety Analysis" (CRWMS M&O 1998). In addition to listing required mitigation/control features, this analysis identifies the potential need for procedures and training as part of defense-in-depth mitigation/control features. The inclusion of this information in the System Safety Analysis (SSA) is intended to assist the organization(s) (e.g., Construction, Environmental Safety and Health, Design) responsible for these aspects of the ECRB Cross-Drift in developing mitigation/control features for fire events, including Emergency Refuge Station(s). This SSA was prepared, in part, in response to Condition/Issue Identification and Reporting/Resolution System (CIRS) item 1966. The SSA is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach is used which incorporates operating experiences and recommendations from vendors, the constructor and the operating contractor. The risk assessment in this analysis characterizes the scenarios associated with fires in terms of relative risk and includes recommendations for mitigating all identified hazards.
The priority for recommending and implementing mitigation control features is: (1) Incorporate

  17. Analysis of core damage frequency due to external events at the DOE [Department of Energy] N-Reactor

    International Nuclear Information System (INIS)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L.; Baxter, J.T.; Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P.; Brosseau, D.A.

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-05 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  19. No rationale for 1 variable per 10 events criterion for binary logistic regression analysis

    Directory of Open Access Journals (Sweden)

    Maarten van Smeden

    2016-11-01

    Background: Ten events per variable (EPV) is a widely advocated minimal criterion for sample size considerations in logistic regression analysis. Of three previous simulation studies that examined this minimal EPV criterion, only one supports the use of a minimum of 10 EPV. In this paper, we examine the reasons for substantial differences between these extensive simulation studies. Methods: The current study uses Monte Carlo simulations to evaluate small-sample bias, coverage of confidence intervals, and mean square error of logit coefficients. Logistic regression models fitted by maximum likelihood and by a modified estimation procedure, known as Firth's correction, are compared. Results: The results show that besides EPV, the problems associated with low EPV depend on other factors such as the total sample size. It is also demonstrated that simulation results can be dominated by even a few simulated data sets for which the prediction of the outcome by the covariates is perfect ('separation'). We reveal that different approaches for identifying and handling separation lead to substantially different simulation results. We further show that Firth's correction can be used to improve the accuracy of regression coefficients and alleviate the problems associated with separation. Conclusions: The current evidence supporting EPV rules for binary logistic regression is weak. Given our findings, there is an urgent need for new research to provide guidance for supporting sample size considerations for binary logistic regression analysis.
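
    The two quantities at the heart of the abstract, EPV and separation, are easy to compute for a given data set. The sketch below uses a toy data set (all values invented) to show the usual EPV definition and a simple single-covariate check for complete separation, the pathology under which maximum-likelihood logit estimates do not exist; it does not reproduce the study's simulations or Firth's correction.

```python
# Hedged sketch: events per variable (EPV) and a complete-separation check.

def events_per_variable(outcomes, n_predictors):
    """EPV = (count of the rarer outcome class) / (candidate predictors)."""
    events = min(sum(outcomes), len(outcomes) - sum(outcomes))
    return events / n_predictors

def separates(x, y):
    """True if a threshold on a single covariate x perfectly splits y,
    i.e. after sorting by x the labels are all 0s then all 1s (or reverse)."""
    labels = [lab for _, lab in sorted(zip(x, y))]
    return labels == sorted(labels) or labels == sorted(labels, reverse=True)

# Toy binary outcomes and two candidate covariates (invented values)
y = [0, 0, 0, 1, 0, 1, 1, 1, 0, 1]
x_ok  = [1.2, 3.1, 2.0, 2.5, 0.7, 1.9, 3.3, 2.8, 2.2, 1.5]   # classes overlap
x_sep = [0.1, 0.4, 0.2, 1.9, 0.3, 1.7, 2.2, 2.0, 0.5, 1.8]   # perfectly separated

print(events_per_variable(y, 3))              # 5 events over 3 predictors
print(separates(x_sep, y), separates(x_ok, y))
```

As the abstract notes, simulations containing even a few separated data sets can dominate the results, so how such sets are detected and handled matters.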

  20. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-06. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and, to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.

  1. Analysis on ingress of coolant event in vacuum vessel using modified TRAC-BF1 code

    International Nuclear Information System (INIS)

    Ajima, Toshio; Kurihara, Ryoichi; Seki, Yasushi

    1999-08-01

    The Transient Reactor Analysis Code (TRAC-BF1) was modified on the basis of ICE experimental results so as to analyze the Ingress of Coolant Event (ICE) in the vacuum vessel of a nuclear fusion reactor. In a previous report, the TRAC-BF1 code, which was originally developed for the safety analysis of light water reactors, had been modified for the ICE of the fusion reactor: a flat structural plate model had been added to the VESSEL component, along with the ability to specify an arbitrary gravity direction. In the present work, the code was further modified. The flat structural plate model of the VESSEL component can now be divided into multiple layers of different materials, some of which can include a buried heater. Moreover, the TRAC-BF1 code was modified to analyze low-pressure conditions close to vacuum, within the range of the steam table. This paper describes the additional functions of the modified TRAC-BF1 code and presents an analytical evaluation using ICE experimental data and the ITER model with final design report (FDR) data.

  3. Geostationary Coastal and Air Pollution Events (GEO-CAPE) Sensitivity Analysis Experiment

    Science.gov (United States)

    Lee, Meemong; Bowman, Kevin

    2014-01-01

    Geostationary Coastal and Air Pollution Events (GEO-CAPE) is a NASA decadal survey mission designed to provide surface reflectance at the high spectral, spatial, and temporal resolutions from geostationary orbit that are necessary for studying regional-scale air quality issues and their impact on global atmospheric composition processes. GEO-CAPE's Atmospheric Science Questions explore the influence of both gases and particles on air quality, atmospheric composition, and climate. The objective of the GEO-CAPE Observing System Simulation Experiment (OSSE) is to analyze the sensitivity of ozone to global and regional NOx emissions and to improve the science impact of GEO-CAPE with respect to global air quality. The GEO-CAPE OSSE team at the Jet Propulsion Laboratory has developed a comprehensive OSSE framework that can perform adjoint-sensitivity analysis for a wide range of observation scenarios and measurement qualities. This report discusses the OSSE framework and presents the sensitivity analysis results obtained from it for seven observation scenarios and three instrument systems.

  4. Computer-Aided Analysis of Flow in Water Pipe Networks after a Seismic Event

    Directory of Open Access Journals (Sweden)

    Won-Hee Kang

    2017-01-01

    Full Text Available This paper proposes a framework for reliability-based flow analysis of a water pipe network after an earthquake. For the first part of the framework, we propose a modeling procedure for multiple leaks and breaks in the water pipe segments of a network that has been damaged by an earthquake. For the second part, we propose an efficient system-level probabilistic flow analysis process that integrates the matrix-based system reliability (MSR) formulation and the branch-and-bound method. This process probabilistically predicts flow quantities by considering system-level damage scenarios consisting of combinations of leaks and breaks in network pipes, and it significantly reduces the computational cost by sequentially prioritizing the system states according to their likelihoods and by using the branch-and-bound method to select partial sets of them. The proposed framework is illustrated and demonstrated by examining two example water pipe networks that have been subjected to a seismic event. These two examples consist of 11 and 20 pipe segments, respectively, and are computationally modeled considering their available topological, material, and mechanical properties. Considering different earthquake scenarios and the resulting multiple leaks and breaks in the water pipe segments, the water flows in the segments are estimated in a computationally efficient manner.
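The likelihood-based prioritization of system states described above can be illustrated with a small best-first enumeration: each pipe is in one of a few damage states, and joint states are visited in strictly decreasing probability until a coverage target is met. The state probabilities below are hypothetical, and the paper's MSR formulation and hydraulic flow solver are not reproduced:

```python
import heapq
from math import exp, log

def top_states(pipe_state_probs, coverage=0.99):
    """Enumerate joint pipe damage states in decreasing probability until
    the visited states cover `coverage` of the total probability mass.

    pipe_state_probs: one dict per pipe, {state_name: probability > 0},
                      with each dict's probabilities summing to 1.
    Returns a list of (joint_state_tuple, probability).
    """
    # sort each pipe's states by descending probability
    per_pipe = [sorted(p.items(), key=lambda kv: -kv[1])
                for p in pipe_state_probs]
    n = len(per_pipe)
    start = (0,) * n
    neg0 = -sum(log(per_pipe[i][0][1]) for i in range(n))
    heap, seen = [(neg0, start)], {start}
    out, mass = [], 0.0
    while heap and mass < coverage:
        neg, idx = heapq.heappop(heap)
        prob = exp(-neg)
        out.append((tuple(per_pipe[i][idx[i]][0] for i in range(n)), prob))
        mass += prob
        # successors: demote exactly one pipe to its next-likelier state
        for i in range(n):
            if idx[i] + 1 < len(per_pipe[i]):
                nxt = idx[:i] + (idx[i] + 1,) + idx[i + 1:]
                if nxt not in seen:
                    seen.add(nxt)
                    penalty = (log(per_pipe[i][idx[i]][1])
                               - log(per_pipe[i][idx[i] + 1][1]))
                    heapq.heappush(heap, (neg + penalty, nxt))
    return out

pipes = [{"intact": 0.90, "leak": 0.07, "break": 0.03}] * 3
states = top_states(pipes, coverage=0.999)
```

Because each pipe's states are pre-sorted, every successor is no more likely than its parent, so the heap pops states in globally non-increasing probability order.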

  5. Antigen-antibody biorecognition events as discriminated by noise analysis of force spectroscopy curves.

    Science.gov (United States)

    Bizzarri, Anna Rita; Cannistraro, Salvatore

    2014-08-22

    Atomic force spectroscopy is able to extract kinetic and thermodynamic parameters of biomolecular complexes, provided that the registered unbinding force curves can be reliably attributed to the rupture of the specific complex interactions. To this aim, a commonly used strategy is based on the analysis of the stretching features of polymeric linkers which are suitably introduced in the biomolecule-substrate immobilization procedure. Alternatively, we present a method to select force curves corresponding to specific biorecognition events, which relies on a careful analysis of the force fluctuations of the biomolecule-functionalized cantilever tip during its approach to the partner molecules immobilized on a substrate. In the low-frequency region, a characteristic 1/f^α noise with α equal to one (flicker noise) is found to replace white noise in the cantilever fluctuation power spectrum when, and only when, a specific biorecognition process between the partners occurs. The method, which has been validated on a well-characterized antigen-antibody complex, represents a fast yet reliable alternative to the use of linkers, which may involve additional surface chemistry and reproducibility concerns.
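The spectral criterion above (1/f^α with α ≈ 1 versus white noise, α ≈ 0) amounts to fitting a slope in log-log coordinates at low frequencies. A minimal sketch using a plain periodogram follows; it is illustrative only (the paper's exact estimator is not specified here), and Welch-averaged spectra would give a less noisy fit:

```python
import numpy as np

def spectral_exponent(x, fs=1.0, f_max=0.05):
    """Estimate alpha in S(f) ~ 1/f^alpha by a log-log linear fit of the
    periodogram over the low-frequency band (0, f_max]."""
    n = len(x)
    f = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x - x.mean())) ** 2 / (fs * n)
    keep = (f > 0) & (f <= f_max)            # drop DC, keep low frequencies
    slope, _ = np.polyfit(np.log(f[keep]), np.log(psd[keep]), 1)
    return -slope                            # S(f) ~ f^(-alpha)

# Sanity check: white noise has alpha ~ 0; its running sum (Brownian) ~ 2.
rng = np.random.default_rng(1)
white = rng.normal(size=4096)
a_white = spectral_exponent(white)
a_brown = spectral_exponent(np.cumsum(white))
```

A measured α near 1 (flicker noise), sitting between these two reference cases, would flag the biorecognition regime described in the abstract.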

  6. Rapid depressurization event analysis in BWR/6 using RELAP5 and CONTAIN

    Energy Technology Data Exchange (ETDEWEB)

    Mueftueoglu, A.K.; Feltus, M.A. [Pennsylvania State Univ., University Park, PA (United States)

    1995-09-01

    Noncondensable gases may become dissolved in Boiling Water Reactor (BWR) water level instrumentation lines during normal operation. Dissolved noncondensable gases inside these water columns may come out of solution during rapid depressurization events and displace water from the reference leg piping, resulting in a falsely high level indication. These water level errors may cause a delay or failure in actuation, or premature shutdown, of the Emergency Core Cooling System (ECCS). If a rapid depressurization causes an erroneously high water level that prevents automatic ECCS actuation, it becomes important to determine whether there would be other adequate indications for operator response and other signals for automatic actuation, such as high drywell pressure. It is also important to determine the effect of the level signal on ECCS operation after actuation. The objective of this study is to determine the detailed coupled containment/NSSS response during these rapid depressurization events in a BWR/6. The selected scenarios involve: (a) inadvertent opening of all ADS valves, (b) a design basis (DB) large break loss of coolant accident (LOCA), and (c) a main steam line break (MSLB). The transient behaviors are evaluated in terms of: (a) vessel pressure and collapsed water level response; (b) specific transient boundary conditions (e.g., scram, MSIV closure timing, feedwater flow, and break blowdown rates); (c) ECCS initiation timing; (d) the impact of operator actions; and (e) whether indications besides low-low water level were available. The results of the analysis showed that there would be signals to actuate the ECCS other than low reactor level, such as high drywell pressure, low vessel pressure, and high suppression pool temperature, and that the plant operators would have significant indications on which to actuate the ECCS.

  7. Competing events and costs of clinical trials: Analysis of a randomized trial in prostate cancer

    International Nuclear Information System (INIS)

    Zakeri, Kaveh; Rose, Brent S.; D’Amico, Anthony V.; Jeong, Jong-Hyeon; Mell, Loren K.

    2015-01-01

    Background: Clinical trial costs may be reduced by identifying enriched subpopulations of patients with favorable risk profiles for the events of interest. However, increased selectivity affects accrual rates, with an uncertain net impact on trial cost. Methods: We conducted a secondary analysis of the Southwest Oncology Group (SWOG) 8794 randomized trial of adjuvant radiotherapy for high-risk prostate cancer. The primary endpoint was metastasis-free survival (MFS), defined as time to metastasis or death from any cause (competing mortality). We used competing risks regression models to identify an enriched subgroup at high risk for metastasis and low risk for competing mortality. We applied a cost model to estimate the impact of enrichment on trial cost and duration. Results: The treatment effect on metastasis was similar in the enriched subgroup (HR, 0.42; 95% CI, 0.23–0.76) compared to the whole cohort (HR, 0.50; 95% CI, 0.30–0.81), while the effect on competing mortality was not significant in either the subgroup or the whole cohort (HR, 0.70; 95% CI, 0.39–1.23, vs. HR, 0.94; 95% CI, 0.68–1.31). Due to the higher incidence of metastasis relative to competing mortality in the enriched subgroup, the treatment effect on MFS was greater in the subgroup than in the whole cohort (HR, 0.55; 95% CI, 0.36–0.82, vs. HR, 0.77; 95% CI, 0.58–1.01). Trial cost was 75% less for the subgroup than for the whole cohort ($1.7 million vs. $6.8 million), and the trial duration was 30% shorter (8.4 vs. 12.0 years). Conclusion: Competing event enrichment can reduce clinical trial cost and duration without sacrificing generalizability.
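The link between the effect size on the composite endpoint and trial size can be illustrated with Schoenfeld's classical approximation for the number of events required by a two-arm log-rank test. This is a standard textbook formula, not the paper's cost model; the two hazard ratios below are the MFS effects quoted in the abstract:

```python
from math import ceil, log
from statistics import NormalDist

def required_events(hr, alpha=0.05, power=0.80):
    """Schoenfeld's approximation for a 1:1 two-arm log-rank test:
    d = 4 (z_{1-alpha/2} + z_{power})^2 / ln(hr)^2."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)
    zb = z.inv_cdf(power)
    return ceil(4 * (za + zb) ** 2 / log(hr) ** 2)

d_subgroup = required_events(0.55)   # enriched-subgroup effect on MFS
d_whole = required_events(0.77)      # whole-cohort effect on MFS
```

The stronger composite-endpoint effect in the enriched subgroup translates into far fewer required events, which is the mechanism behind the cost and duration reductions reported above.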

  8. Integrated survival analysis using an event-time approach in a Bayesian framework.

    Science.gov (United States)

    Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M

    2015-02-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model and its derivation, and we use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest for chicks with low birth weights and/or whose nests were within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for completely known fate data.
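One building block of the model above, the piecewise-constant (piecewise-exponential) hazard likelihood for known-fate, right-censored records, can be sketched as follows. This is illustrative only: the paper's integrated model additionally handles interval censoring and unknown fates through a detection process, which is omitted here:

```python
import numpy as np

def piecewise_exp_loglik(log_h, cuts, times, events):
    """Log-likelihood of a piecewise-constant hazard model for
    right-censored data with known fates.

    log_h  : log hazard per interval
    cuts   : interval start points, beginning with 0 (len == len(log_h))
    times  : observed failure/censoring times (> 0)
    events : 1 if the failure was observed, 0 if right-censored
    """
    h = np.exp(np.asarray(log_h, float))
    edges = np.append(np.asarray(cuts, float), np.inf)
    ll = 0.0
    for t, d in zip(times, events):
        k = np.searchsorted(edges, t, side="left") - 1   # interval holding t
        # cumulative hazard: full earlier intervals plus the partial last one
        H = np.sum(h[:k] * (edges[1:k + 1] - edges[:k])) + h[k] * (t - edges[k])
        ll += d * np.log(h[k]) - H
    return ll

# With a single interval (constant hazard h): ll = D log(h) - h * total_time
ll = piecewise_exp_loglik([np.log(0.5)], [0.0], [1.0, 2.0, 3.0], [1, 1, 0])
```

Maximizing this function over `log_h` (e.g., with a generic optimizer, or inside an MCMC sampler with priors on the interval hazards) recovers the hazard steps that the covariate effects in the abstract act upon.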

  9. Integrated survival analysis using an event-time approach in a Bayesian framework

    Science.gov (United States)

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times may be interval-censored, with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model and its derivation, and we use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on the mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest for chicks with low birth weights and/or whose nests were within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for completely known fate data.

  10. A heavy sea fog event over the Yellow Sea in March 2005: Analysis and numerical modeling

    Science.gov (United States)

    Gao, Shanhong; Lin, Hang; Shen, Biao; Fu, Gang

    2007-02-01

    In this paper, a heavy sea fog episode that occurred over the Yellow Sea on 9 March 2005 is investigated. The sea fog patch, with a spatial scale of several hundred kilometers at its mature stage, reduced visibility along the Shandong Peninsula coast to 100 m or much less at some sites. Satellite images, surface observations and soundings at islands and coasts, and analyses from the Japan Meteorological Agency (JMA) are used to describe and analyze this event. The analysis indicates that this sea fog can be categorized as advection cooling fog. The main features of this sea fog, including the fog area and its movement, are reasonably reproduced by the Fifth-generation Pennsylvania State University/National Center for Atmospheric Research Mesoscale Model (MM5). Model results suggest that the formation and evolution of this event can be outlined as follows: (1) southerly warm/moist advection of low-level air resulted in a strong sea-surface-based inversion with a thickness of about 600 m; (2) when the inversion moved from the warmer East Sea to the colder Yellow Sea, a thermal internal boundary layer (TIBL) gradually formed at the base of the inversion while the sea fog grew in response to cooling and moistening by turbulent mixing; (3) the sea fog developed as the TIBL moved northward; and (4) strong northerly cold and dry wind destroyed the TIBL and dissipated the sea fog. The principal findings of this study are that sea fog forms in response to relatively persistent southerly warm/moist wind over a cold sea surface, and that turbulent mixing by wind shear is the primary mechanism for cooling and moistening the marine layer. In addition, the sensitivity experiments indicate that deterministic numerical modeling offers a promising approach to the prediction of sea fog over the Yellow Sea, but it may be more efficient to consider ensemble numerical modeling because of the extreme sensitivity to model input.

  11. External flooding event analysis in a PWR-W with MAAP5

    International Nuclear Information System (INIS)

    Fernandez-Cosials, Mikel Kevin; Jimenez, Gonzalo; Barreira, Pilar; Queral, Cesar

    2015-01-01

    Highlights: • External flooding preceded by a SCRAM is simulated with MAAP5.01. • Sensitivities include AFW-TDP, SLOCA and operator preventive actions. • SLOCA flow is the dominant factor in the sequences. • Vessel failure is avoidable with operator preventive actions. - Abstract: The Fukushima accident has drawn attention even more to the importance of external events and loss of energy supply in safety analysis. Since 2011, several Station Blackout (SBO) analyses have been performed for all types of reactors. Most post-Fukushima studies analyze a pure SBO transient, but the Fukushima accident was more complex than a standard SBO: the SBO was a consequence of external flooding from the tsunami and occurred 40 min after an emergency shutdown (SCRAM) caused by the earthquake. The first objective of this paper is to assess the consequences of an external flooding accident at a PWR site caused by a river flood, a dam break or a tsunami, in which the whole plant is damaged, not only the diesel generators. The second objective is to analyze possible actions to be performed in the time between the earthquake (which causes a SCRAM) and the arrival of the external flooding, applicable to accidents such as dam failures or river flooding, in order to avoid more severe consequences, delay core damage and improve accident management. The results reveal how the actuation of the different systems and equipment affects the core damage time, and how some actions could delay core damage long enough to increase the possibility of AC power recovery.

  12. Transformation of elite white maize using the particle inflow gun and detailed analysis of a low-copy integration event

    CSIR Research Space (South Africa)

    O'Kennedy, MM

    2001-12-01

    Full Text Available of these transformation events was demonstrated by Southern blot analysis and by transgene expression. In this event, the transgenes bar and uidA were inserted in tandem. Keywords: Elite white maize transformation · Cereals · Immature embryos · Biolistics · Fertile... study, only embryogenic white, compact structured calli, designated type-I calli, were produced when immature zygotic embryos of the selected elite maize lines were cultured. The type-I calli regenerated to produce fertile plants. Although 40...

  13. Teleradiology system analysis using a discrete event-driven block-oriented network simulator

    Science.gov (United States)

    Stewart, Brent K.; Dwyer, Samuel J., III

    1992-07-01

    Performance evaluation and trade-off analysis are the central issues in the design of communication networks. Simulation plays an important role in computer-aided design and analysis of communication networks and related systems, allowing testing of numerous architectural configurations and fault scenarios. We are using the Block Oriented Network Simulator (BONeS, Comdisco, Foster City, CA) software package to perform discrete, event-driven Monte Carlo simulations for capacity planning, trade-off analysis and evaluation of alternative architectures for a high-speed, high-resolution teleradiology project. A queuing network model of the teleradiology system has been devised, simulations executed and the results analyzed. The wide area network link uses a switched, dial-up N X 56 kbps inverse multiplexer, where the number of digital voice-grade lines (N) can vary from one (DS-0) through 24 (DS-1). The proposed goal of such a system is 200 films (2048 X 2048 X 12-bit) transferred between a remote and a local site in an eight-hour period with a mean delay time of less than five minutes. It is found that: (1) the DS-1 service limit is around 100 films per eight-hour period with a mean delay time of 412 +/- 39 seconds, short of the goal stipulated above; (2) compressed video teleconferencing can be run simultaneously with image data transfer over the DS-1 wide area network link without impacting the performance of the described teleradiology system; (3) there is little sense in upgrading to a higher-bandwidth WAN link like DS-2 or DS-3 for the current system; and (4) the goal of transmitting 200 films in an eight-hour period with a mean delay time of less than five minutes can be achieved simply if the laser printer interface is upgraded from the current DR-11W interface to a much faster SCSI interface.
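For intuition about the N X 56 kbps link, the raw serialization time of a single film can be computed directly. This is only a back-of-envelope figure, not the BONeS result: the simulated system's throughput is much lower than the raw line rate implies because queueing at acquisition, transmission and printing stages dominates the mean delay:

```python
def film_transfer_seconds(n_lines, image_bits=2048 * 2048 * 12):
    """Raw serialization time of one uncompressed 2048 x 2048 x 12-bit film
    over an inverse-multiplexed N x 56 kbps link (no protocol overhead,
    no queueing)."""
    return image_bits / (n_lines * 56_000)

t_ds0 = film_transfer_seconds(1)    # single voice-grade channel (DS-0)
t_ds1 = film_transfer_seconds(24)   # full DS-1
```

A single 56 kbps channel needs roughly fifteen minutes per film, while the full 24-line DS-1 brings serialization under a minute, which is why the delay budget in the abstract is set by the other servers in the queuing network rather than by the link itself.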

  14. Procedures as a Contributing Factor to Events in the Swedish Nuclear Power Plants. Analysis of a Database with Licensee Event Reports 1995-1999

    International Nuclear Information System (INIS)

    Bento, Jean-Pierre

    2002-12-01

    The operating experience of the twelve Swedish nuclear power units has been reviewed for the years 1995-1999 with respect to events - both scrams and Licensee Event Reports, LERs - to which a deficient procedure was a contributing cause. In the present context, 'procedure' is defined as all written documentation used for the planning, performance and control of the tasks necessary for the operation and maintenance of the plants. The study used an MTO database (Man - Technology - Organisation) containing, for the five years studied, 42 MTO-related scrams out of the 87 scrams that occurred, and about 800 MTO-related LERs out of 2000 reported LERs. On average, deficient procedures contribute to approximately 0.2 scrams/unit/year and to slightly more than three LERs/unit/year. Presented differently, procedure-related scrams amount to 15% of the total number of scrams and to 31% of the MTO-related scrams. Similarly, procedure-related LERs amount to 10% of the total number of LERs and to 25% of the MTO-related LERs. For the most frequent work types performed at the plants, procedure-related LERs are - in decreasing order - associated with tasks performed during maintenance, modification, testing and operation. However, for the latest year studied, almost as many procedure-related LERs are associated with modification tasks as with the three other work types together. A further analysis indicates that 'deficient procedure content' is by far the dominating underlying cause contributing to procedure-related scrams and LERs. The study also discusses the coupling between procedure-related scrams/LERs, power operation and refuelling outages, and Common Cause Failures (CCF). An overall conclusion is that procedure-related events in the Swedish nuclear power plants do not, on a national scale, represent an alarming issue. Significant and sustained efforts have been and are being made at most units to improve the quality of procedures. However, a few units exhibit a noticeable

  15. Risk of neuropsychiatric adverse events associated with varenicline: systematic review and meta-analysis.

    Science.gov (United States)

    Thomas, Kyla H; Martin, Richard M; Knipe, Duleeka W; Higgins, Julian P T; Gunnell, David

    2015-03-12

    To determine the risk of neuropsychiatric adverse events associated with use of varenicline compared with placebo in randomised controlled trials. Systematic review and meta-analysis comparing study effects using two summary estimates in fixed effects models: risk differences and Peto odds ratios. Medline, Embase, PsycINFO, the Cochrane Central Register of Controlled Trials (CENTRAL), and clinicaltrials.gov were searched. Included were randomised controlled trials with a placebo comparison group that reported on neuropsychiatric adverse events (depression, suicidal ideation, suicide attempt, suicide, insomnia, sleep disorders, abnormal dreams, somnolence, fatigue, anxiety) and death. Studies that did not involve human participants, did not use the maximum recommended dose of varenicline (1 mg twice daily), or were crossover trials were excluded. In the 39 randomised controlled trials (10,761 participants), there was no evidence of an increased risk of suicide or attempted suicide (odds ratio 1.67, 95% confidence interval 0.33 to 8.57), suicidal ideation (0.58, 0.28 to 1.20), depression (0.96, 0.75 to 1.22), irritability (0.98, 0.81 to 1.17), aggression (0.91, 0.52 to 1.59), or death (1.05, 0.47 to 2.38) in varenicline users compared with placebo users. Varenicline was associated with an increased risk of sleep disorders (1.63, 1.29 to 2.07), insomnia (1.56, 1.36 to 1.78), abnormal dreams (2.38, 2.05 to 2.77), and fatigue (1.28, 1.06 to 1.55), but a reduced risk of anxiety (0.75, 0.61 to 0.93). Similar findings were observed when risk differences were reported. There was no evidence for a variation in depression and suicidal ideation by age group, sex, ethnicity, smoking status, presence or absence of psychiatric illness, or type of study sponsor (that is, pharmaceutical industry or other). This meta-analysis found no evidence of an increased risk of suicide or attempted suicide, suicidal ideation, depression, or death with varenicline. These findings provide some reassurance for users
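The Peto one-step odds ratio used for sparse outcomes such as suicide attempts can be computed per 2x2 table as follows. This is the standard textbook formula with made-up counts, not data from the review:

```python
from math import exp, log, sqrt

def peto_or(a, n1, c, n2):
    """Peto one-step odds ratio with a 95% CI for one 2x2 table.

    a, c   : event counts in the treatment and control arms
    n1, n2 : arm sizes
    """
    N, m = n1 + n2, a + c
    E = n1 * m / N                                  # expected events, treatment arm
    V = n1 * n2 * m * (N - m) / (N**2 * (N - 1))    # hypergeometric variance
    psi = exp((a - E) / V)                          # one-step odds ratio
    half = 1.96 / sqrt(V)                           # CI half-width on log scale
    return psi, exp(log(psi) - half), exp(log(psi) + half)

# hypothetical sparse outcome: 4/500 events on treatment vs 2/500 on control
psi, lo, hi = peto_or(a=4, n1=500, c=2, n2=500)
```

With so few events the point estimate is around 2 but the confidence interval spans 1 comfortably, mirroring why the wide suicide-attempt interval in the abstract is uninformative on its own.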

  16. Cardiovascular safety of linagliptin in type 2 diabetes: a comprehensive patient-level pooled analysis of prospectively adjudicated cardiovascular events.

    Science.gov (United States)

    Rosenstock, Julio; Marx, Nikolaus; Neubacher, Dietmar; Seck, Thomas; Patel, Sanjay; Woerle, Hans-Juergen; Johansen, Odd Erik

    2015-05-21

    The cardiovascular (CV) safety of linagliptin was evaluated in subjects with type 2 diabetes (T2DM). This was a pre-specified patient-level pooled analysis of all available double-blind, randomized, controlled trials of ≥ 12 weeks' duration (19 trials, 9459 subjects) of linagliptin versus placebo/active treatment. The primary end point was a composite of prospectively adjudicated CV death, non-fatal myocardial infarction, non-fatal stroke, and hospitalization for unstable angina (4P-MACE). Hospitalization for congestive heart failure (CHF) was also evaluated; adjudication of CHF was introduced during the phase 3 program (8 trials; 3314 subjects). 4P-MACE was assessed in placebo-controlled trials (subgroup of 18 trials; 7746 subjects). Investigator-reported events suggestive of CHF were drawn from 24 placebo-controlled trials (including trials …). 4P-MACE incidence rates were 13.4 per 1000 patient-years with linagliptin (60 events) versus 18.9 with total comparators (62 events); overall hazard ratio (HR), 0.78 (95% confidence interval [CI], 0.55-1.12). The HR for adjudicated hospitalization for CHF (n = 21) was 1.04 (0.43-2.47). For placebo-controlled trials, 4P-MACE incidence rates were 14.9 per 1000 patient-years with linagliptin (43 events) versus 16.4 with total comparators (29 events); overall HR, 1.09 (95% CI, 0.68-1.75). Occurrence of investigator-reported events suggestive of CHF was low for linagliptin- (26 events, 0.5%; serious: 16 events, 0.3%) and placebo-treated (8 events, 0.2%; serious: 6 events, 0.2%) patients. Linagliptin is not associated with increased CV risk versus pooled active comparators or placebo in patients with T2DM.

  17. Analysis of arrhythmic events is useful to detect lead failure earlier in patients followed by remote monitoring.

    Science.gov (United States)

    Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi

    2018-03-01

    Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure was often identified only by arrhythmic events, not by impedance abnormalities. The aim of this study was to compare the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center at Okayama University Hospital, and all transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients were followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, and not by conventional impedance abnormalities, was significantly higher than the proportion detected by impedance abnormalities (arrhythmic events 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of the 15 patients with ICD lead failure, none had experienced inappropriate therapy. RM can detect lead failure early, before clinical adverse events. However, CIEDs often diagnose lead failure as just arrhythmic events, without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.

  18. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    Science.gov (United States)

    Ajayakumar, J.; Shook, E.; Turner, V. K.

    2017-10-01

    With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data can be highly skewed by variations in population density from place to place. To capture the spatio-temporal variations in public response during extreme events, we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis on four major extreme events in the United States: the "North American storm complex" in December 2015, the "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and the "Hurricane Matthew" in October 2016. Analysis is conducted on geo-tagged social media data from Twitter and on warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, when developing software solutions to support analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources and to adapt techniques and enhance capabilities to mitigate them. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with social media data and will be useful
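A minimal form of the population-based normalization argued for above is a per-capita tweet rate. The sketch below is illustrative only; the region ids, counts, and populations are hypothetical, and SEDE's actual pipeline is more involved:

```python
def tweets_per_10k(counts, population):
    """Population-normalized tweet rate per region (tweets per 10,000
    residents). Regions with no recorded population are dropped rather
    than divided by zero."""
    return {r: 10_000 * c / population[r]
            for r, c in counts.items()
            if population.get(r, 0) > 0}

# raw counts make region A look dominant; normalization reverses the ranking
rates = tweets_per_10k({"A": 120, "B": 30, "C": 5},
                       {"A": 800_000, "B": 40_000, "C": 0})
```

Region A has four times region B's raw count, but after normalization B's response rate is five times higher, which is exactly the density skew the abstract warns about.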

  19. Incidence of cardiovascular events and associated risk factors in kidney transplant patients: a competing risks survival analysis.

    Science.gov (United States)

    Seoane-Pillado, María Teresa; Pita-Fernández, Salvador; Valdés-Cañedo, Francisco; Seijo-Bestilleiro, Rocio; Pértega-Díaz, Sonia; Fernández-Rivera, Constantino; Alonso-Hernández, Ángel; González-Martín, Cristina; Balboa-Barreiro, Vanesa

    2017-03-07

    The high prevalence of cardiovascular risk factors among the renal transplant population accounts for increased mortality. The aim of this study is to determine the incidence of cardiovascular events and the factors associated with those events in these patients. An observational ambispective follow-up study of renal transplant recipients (n = 2029) in the health district of A Coruña (Spain) during the period 1981-2011 was completed. Competing-risks survival analysis methods were applied to estimate the cumulative incidence of developing cardiovascular events over time and to identify which characteristics were associated with the risk of these events. Post-transplant cardiovascular events are defined as the presence of myocardial infarction, invasive coronary artery therapy, cerebral vascular events, new-onset angina, congestive heart failure, rhythm disturbances, peripheral vascular disease and cardiovascular disease and death. The cause of death was identified through the medical history and death certificate using ICD9 (390-459, except: 427.5, 435, 446, 459.0). The mean age of patients at the time of transplantation was 47.0 ± 14.2 years; 62% were male. 16.5% had suffered some cardiovascular disease prior to transplantation and 9.7% had suffered a cardiovascular event. The mean follow-up period for the patients with a cardiovascular event was 3.5 ± 4.3 years. Applying competing-risks methodology, it was observed that the cumulative incidence of the event was 5.0% one year after transplantation, 8.1% after five years, and 11.9% after ten years. After applying multivariate models, the variables with an independent effect for predicting cardiovascular events are: male sex, age of recipient, previous cardiovascular disorders, pre-transplant smoking and post-transplant diabetes. This study makes it possible to determine in kidney transplant patients, taking into account competing events, the incidence of post-transplant cardiovascular events and
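The cumulative incidence estimation referred to above can be sketched with a small Aalen-Johansen-style estimator for one cause in the presence of competing causes. This illustrates the general competing-risks method, not the study's code; the toy data below are hypothetical:

```python
import numpy as np

def cumulative_incidence(times, causes, cause=1):
    """Nonparametric cumulative incidence function for one cause.

    times  : observed times
    causes : 0 = censored, 1, 2, ... = failure cause
    Returns event times and the CIF evaluated just after each one.
    """
    order = np.argsort(times)
    t, c = np.asarray(times)[order], np.asarray(causes)[order]
    n, at_risk, surv, cif = len(t), len(t), 1.0, 0.0
    out_t, out_f = [], []
    i = 0
    while i < n:
        j = i
        d_cause = d_any = 0
        while j < n and t[j] == t[i]:          # group ties at the same time
            d_cause += c[j] == cause
            d_any += c[j] != 0
            j += 1
        if d_any > 0:
            cif += surv * d_cause / at_risk    # mass assigned to this cause
            surv *= 1 - d_any / at_risk        # overall survival update
            out_t.append(t[i])
            out_f.append(cif)
        at_risk -= j - i
        i = j
    return np.array(out_t), np.array(out_f)

# two competing causes, no censoring: the two CIFs partition 1 - KM survival
t_ev, cif1 = cumulative_incidence([1, 2, 3, 4], [1, 2, 1, 2], cause=1)
```

Unlike 1 minus Kaplan-Meier per cause, the cause-specific CIFs computed this way never sum to more than the overall failure probability, which is the point of using competing-risks methodology for the post-transplant event rates quoted above.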

  20. An Improved EMD-Based Dissimilarity Metric for Unsupervised Linear Subspace Learning

    Directory of Open Access Journals (Sweden)

    Xiangchun Yu

    2018-01-01

    Full Text Available We investigate a novel way of robust face image feature extraction by adopting methods based on Unsupervised Linear Subspace Learning to extract a small number of good features. Firstly, the face image is divided into blocks of a specified size, and we propose and extract pooled Histograms of Oriented Gradients (pHOG) over each block. Secondly, an improved Earth Mover's Distance (EMD) metric is adopted to measure the dissimilarity between the blocks of one face image and the corresponding blocks of the remaining face images. Thirdly, considering the limitations of the original Locality Preserving Projections (LPP), we propose Block Structure LPP (BSLPP), which effectively preserves the structural information of face images. Finally, an adjacency graph is constructed and a small number of good features of a face image are obtained by methods based on Unsupervised Linear Subspace Learning. A series of experiments has been conducted on several well-known face databases to evaluate the effectiveness of the proposed algorithm. In addition, we construct versions of the AR and Extended Yale B face databases with added noise, geometric distortion, slight translation, and slight rotation, and verify the robustness of the proposed algorithm when faced with a certain degree of these disturbances.
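The paper's improved EMD metric operates on block features and is more elaborate than the classic distance. As a baseline illustration only (not the authors' improved metric), the Earth Mover's Distance between two normalized 1-D histograms defined on the same bins reduces to the summed absolute difference of their cumulative sums:

```python
def emd_1d(p, q):
    """Earth Mover's Distance between two normalized 1-D histograms.

    Assumes p and q share the same bins (unit width) and both sum to 1.
    For 1-D distributions, EMD equals sum_i |CDF_p(i) - CDF_q(i)|.
    """
    total, cum = 0.0, 0.0
    for pi, qi in zip(p, q):
        cum += pi - qi        # running difference of the two CDFs
        total += abs(cum)     # mass that must still be moved past this bin
    return total
```

Moving all mass one bin over therefore costs exactly 1, while identical histograms cost 0.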

  1. Fishery landing forecasting using EMD-based least square support vector machine models

    Science.gov (United States)

    Shabri, Ani

    2015-05-01

    In this paper, a novel hybrid ensemble learning paradigm integrating ensemble empirical mode decomposition (EMD) and least squares support vector machines (LSSVM) is proposed to improve the accuracy of fishery landing forecasting. The hybrid is formulated specifically for modeling fishery landings, a highly nonlinear, non-stationary and seasonal time series that can hardly be properly modelled and accurately forecasted by traditional statistical models. In the hybrid model, EMD is used to decompose the original data into a finite and often small number of sub-series. Each sub-series is then modeled and forecasted by an LSSVM model. Finally, the forecast of fishery landings is obtained by aggregating the forecasting results of all sub-series. To assess the effectiveness and predictability of EMD-LSSVM, monthly fishery landing records from East Johor, Peninsular Malaysia, were used as a case study. The results show that the proposed model yields better forecasts than Autoregressive Integrated Moving Average (ARIMA), LSSVM and EMD-ARIMA models on several criteria.
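The decompose-forecast-aggregate structure of such hybrids can be sketched in a few lines. Both components below are deliberate stand-ins: a moving-average trend split replaces EMD, and a naive last-value rule replaces the per-sub-series LSSVM, so this shows only the pipeline shape, not the authors' models:

```python
def decompose(series, window=3):
    """Stand-in for EMD: split a series into a moving-average trend
    component and a residual component (the two sum back to the series)."""
    trend = []
    for i in range(len(series)):
        lo, hi = max(0, i - window + 1), i + 1
        trend.append(sum(series[lo:hi]) / (hi - lo))
    residual = [s - t for s, t in zip(series, trend)]
    return [trend, residual]

def forecast_component(component):
    """Stand-in for a per-component LSSVM: naive last-value forecast."""
    return component[-1]

def hybrid_forecast(series):
    """Decompose, forecast each sub-series, aggregate the forecasts."""
    components = decompose(series)
    return sum(forecast_component(c) for c in components)
```

In the real method each component is easier to model than the raw series, which is where the accuracy gain over a single global model comes from.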

  2. Analysis of the effects of corrosion probe on riser 241-AN-102-WST-16 during seismic event

    International Nuclear Information System (INIS)

    ZIADA, H.H.

    1998-01-01

    This analysis supports the installation activity of the corrosion probe in Tank 241-AN-102. The probe is scheduled to be installed in Riser 241-AN-102-WST-16 (formerly known as Riser 15B). The purpose of this analysis is to evaluate the potential effect of the corrosion probe on the riser during a credible seismic event. The previous analysis (HNF 1997a) considered only pump jet impingement loading

  3. Combination of various data analysis techniques for efficient track reconstruction in very high multiplicity events

    Science.gov (United States)

    Siklér, Ferenc

    2017-08-01

    A novel combination of established data analysis techniques for reconstructing charged particles in high energy collisions is proposed. It uses all information available in a collision event while keeping competing choices open as long as possible. Suitable track candidates are selected by transforming measured hits into a binned, three- or four-dimensional track parameter space. This is accomplished by the use of templates, taking advantage of the translational and rotational symmetries of the detectors. Track candidates and their corresponding hits (the nodes) form a usually highly connected network, a bipartite graph, in which multiple hit-to-track assignments (the edges) are allowed. In order to obtain a manageable problem, the graph is cut into very many minigraphs by removing a few of its vulnerable components, edges and nodes. Finally, the hits are distributed among the track candidates by exploring a deterministic decision tree. A depth-limited search is performed, maximizing the number of hits on tracks and minimizing the sum of track-fit χ2. Simplified but realistic models of LHC silicon trackers, including the relevant physics processes, are used to test and study the performance (efficiency, purity, timing) of the proposed method in the case of single or many simultaneous proton-proton collisions (high pileup), and for single heavy-ion collisions at the highest available energies.
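The binned track-parameter transform described above is, in essence, a Hough transform. A toy 2-D version for straight-line tracks y = m·x + b, with hypothetical bin grids rather than the paper's detector templates, can be sketched as:

```python
from collections import Counter

def hough_vote(hits, m_bins, b_bins):
    """Vote each hit into a binned (slope, intercept) track-parameter space."""
    acc = Counter()
    for x, y in hits:
        for m in m_bins:            # each slope hypothesis...
            b = y - m * x           # ...fixes an intercept for this hit
            bb = min(b_bins, key=lambda c: abs(c - b))  # snap to nearest bin
            acc[(m, bb)] += 1
    return acc

def track_candidates(acc, min_hits):
    """Bins collecting enough votes become track candidates."""
    return [params for params, votes in acc.items() if votes >= min_hits]
```

Hits lying on a common line pile up in one parameter bin, while random hits scatter their votes; the candidate/hit assignments then form the bipartite graph the paper resolves.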

  4. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Selin Aviyente

    2010-01-01

    Full Text Available Joint time-frequency representations offer a rich representation of event related potentials (ERPs that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.
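Matching pursuit with a Gabor dictionary, as used above, greedily picks the dictionary atom most correlated with the residual and subtracts its projection. A generic textbook sketch on a tiny hand-built dictionary (the atom parameters are illustrative, not those of the PCA-Gabor method):

```python
import math

def gabor_atom(n, center, scale, freq):
    """A unit-norm real Gabor atom: Gaussian envelope times a cosine."""
    a = [math.exp(-((t - center) ** 2) / (2 * scale ** 2))
         * math.cos(freq * (t - center)) for t in range(n)]
    nrm = math.sqrt(sum(v * v for v in a))
    return [v / nrm for v in a]

def matching_pursuit(signal, atoms, n_iter):
    """Greedy MP: repeatedly pick the atom with the largest inner product
    with the residual, record (atom index, coefficient), subtract it."""
    residual = list(signal)
    picks = []
    for _ in range(n_iter):
        best, best_c = None, 0.0
        for i, g in enumerate(atoms):
            c = sum(r * v for r, v in zip(residual, g))
            if abs(c) > abs(best_c):
                best, best_c = i, c
        picks.append((best, best_c))
        g = atoms[best]
        residual = [r - best_c * v for r, v in zip(residual, g)]
    return picks, residual
```

If the signal happens to equal one atom (up to scale), a single iteration recovers it exactly; real ERP components need several atoms plus the PCA step described in the paper.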

  5. Cardiovascular events in patients with mild autonomous cortisol secretion: analysis with artificial neural networks.

    Science.gov (United States)

    Morelli, Valentina; Palmieri, Serena; Lania, Andrea; Tresoldi, Alberto; Corbetta, Sabrina; Cairoli, Elisa; Eller-Vainicher, Cristina; Arosio, Maura; Copetti, Massimiliano; Grossi, Enzo; Chiodini, Iacopo

    2017-07-01

    The independent role of mild autonomous cortisol secretion (ACS) in influencing the cardiovascular event (CVE) occurrence is a topic of interest. We investigated the role of mild ACS in the CVE occurrence in patients with adrenal incidentaloma (AI) by standard statistics and artificial neural networks (ANNs). We analyzed a retrospective record of 518 AI patients. Data regarding cortisol levels after 1 mg dexamethasone suppression (1 mg DST) and the presence of obesity (OB), hypertension (AH), type-2 diabetes (T2DM), dyslipidemia (DL), familial CVE history, smoking habit and CVE were collected. The receiver-operating characteristic curve analysis suggested that 1 mg DST, at a cut-off of 1.8 µg/dL, had the best accuracy for detecting patients with increased CVE risk. In patients with 1 mg-DST ≥1.8 µg/dL (DST+, n  = 223), age and prevalence of AH, T2DM, DL and CVE (66 years, 74.5, 25.9, 41.4 and 26.8% respectively) were higher than that of patients with 1 mg-DST ≤1.8 µg/dL (61.9 years, 60.7, 18.5, 32.9 and 10%, respectively, P  Cortisol after 1 mg-DST is independently associated with the CVE occurrence. The ANNs might help for assessing the CVE risk in AI patients. © 2017 European Society of Endocrinology.
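A cut-off with the "best accuracy" on an ROC curve, like the 1.8 µg/dL threshold above, is commonly chosen by maximizing the Youden index J = sensitivity + specificity − 1. A minimal sketch with toy scores and labels (not the study's cohort or its exact criterion):

```python
def best_cutoff(scores, labels):
    """Return the cutoff maximizing the Youden index.

    labels: 1 = event, 0 = no event; a case is called positive if
    its score is >= the cutoff.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    best, best_j = None, -1.0
    for c in sorted(set(scores)):
        tp = sum(1 for s, l in zip(scores, labels) if s >= c and l == 1)
        tn = sum(1 for s, l in zip(scores, labels) if s < c and l == 0)
        j = tp / pos + tn / neg - 1.0     # sensitivity + specificity - 1
        if j > best_j:
            best, best_j = c, j
    return best, best_j
```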

  6. Central Italy magnetotelluric investigation. Structures and relations to seismic events: analysis of initial data

    Directory of Open Access Journals (Sweden)

    J. Marianiuk

    1996-06-01

    Full Text Available A scientific collaboration between the Warsaw Academy of Science (Poland) and the National Institute of Geophysics (Italy) gave rise to the installation of a few stations for long-term measurement of magnetotelluric fields in central Italy. The selection of investigation sites was determined by the seismic interest of each location. The project began in the summer of 1991 with the installation of 2 magnetotelluric stations in the province of Isernia (Collemeluccio and Montedimezzo). In 1992, 2 more stations became operative, one in the province of Rieti (Fassinoro), the other in the province of L'Aquila (S. Vittoria). For the purposes of this project, the magnetic observatory in L'Aquila was also equipped with electric lines for measurement of the telluric field. The aim of the analysis presented here is to show that it is possible to follow the temporal evolution of characteristic magnetotelluric parameters. At Collemeluccio this evolution was compared with the seismic energy released by events recorded within the study area.

  7. Analysis of events significant with regard to safety of Bohunice V-1 nuclear power plant

    International Nuclear Information System (INIS)

    Suchomel, J.; Maron, V.; Kmosena, J.

    1986-01-01

    An analysis was made of the operating safety of the V-1 nuclear power plant in Jaslovske Bohunice for the years 1980 - 1983. Of the total number of 676 reported failures, only three were events of special safety significance, namely a complete loss of power supply for the plant's own consumption from the power grid, a failure of pins on the collectors of steam generators, and a failure of the heads of heat technology inspection channels. The failures were categorized according to the systems used in the USSR and in the USA and compared with data on failures in nuclear power plants in those two countries. The conclusions show that the operation of the V-1 nuclear power plant achieves results fully comparable with those recorded in 9 WWER-440 power plants operating in various countries. The average coefficient of availability is 0.72, ranking the power plant fourth among the said 9 plants. A comparison of the individual power plant units showed that, of the total number of 22, the first unit of the V-1 plant ranks fifth with a coefficient of 0.78 and the second unit ranks 15th with a coefficient of 0.69. (Z.M.)

  8. The july effect: an analysis of never events in the nationwide inpatient sample.

    Science.gov (United States)

    Wen, Timothy; Attenello, Frank J; Wu, Brian; Ng, Alvin; Cen, Steven Y; Mack, William J

    2015-07-01

    Prior studies examining the impact of the "July effect" on in-hospital mortality rates have generated variable results. In 2008, the Centers for Medicare & Medicaid Services published a series of high-cost, high-volume, nonreimbursable hospital-acquired complications (HACs). These events were believed to be preventable and indicate deficiencies in healthcare delivery. The present study aims to investigate the impact of July admissions on patient safety in a national sample using the HACs as a metric. Discharge data were collected from all admissions recorded in the Nationwide Inpatient Sample database from 2008 to 2011. HAC incidence was evaluated as a function of admission month, adjusting for demographic and hospital factors in multivariable analysis. The outcome measures were HAC occurrence, prolonged length of stay (LOS), and higher inpatient costs. A total of 143,019,381 inpatient admissions were recorded, with an overall HAC occurrence of 4.7%. July admissions accounted for 7.6% of the total number of inpatient admissions. July admissions experienced a 6% increase in likelihood of HAC occurrence (odds ratio = 1.06, 95% confidence interval: 1.06-1.07, P organization structure distinct from traditional quality measures, requiring novel transition protocols dedicated to improving HACs. © 2015 Society of Hospital Medicine.

  9. Analysis of core damage frequency, Surry, Unit 1 internal events appendices

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the appendices for the accident sequence analyses of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 documents the risk of a selected group of nuclear power plants. The work performed is an extensive reanalysis of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The context and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.0E-5 per year, with a 95% upper bound of 1.3E-4 and 5% lower bound of 6.8E-6 per year. Station blackout type accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next type of dominant contributors were Loss of Coolant Accidents (LOCAs). These sequences account for 15% of core damage frequency. No other type of sequence accounts for more than 10% of core damage frequency

  10. Big Data Mining and Adverse Event Pattern Analysis in Clinical Drug Trials.

    Science.gov (United States)

    Federer, Callie; Yoo, Minjae; Tan, Aik Choon

    2016-12-01

    Drug adverse events (AEs) are a major health threat to patients seeking medical treatment and a significant barrier in drug discovery and development. AEs are now required to be submitted during clinical trials and can be extracted from ClinicalTrials.gov (https://clinicaltrials.gov/), a database of clinical studies from around the world. By extracting drug and AE information from ClinicalTrials.gov and structuring it into a database, drug-AE relationships can be established for future drug development and repositioning. To our knowledge, current AE databases contain mainly U.S. Food and Drug Administration (FDA)-approved drugs. However, our database contains both FDA-approved and experimental compounds extracted from ClinicalTrials.gov. It covers 8,161 clinical trials with 3,102,675 patients and 713,103 reported AEs. We extracted the information from ClinicalTrials.gov using a set of Python scripts, and then used regular expressions and a drug dictionary to process and structure the relevant information into a relational database. We performed data mining and pattern analysis of drug-AEs in our database. Our database can serve as a tool to assist researchers in discovering drug-AE relationships for developing, repositioning, and repurposing drugs.
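The regex-then-relational-database step described above can be sketched with the standard library alone. The pipe-separated input format and single-table layout here are invented for illustration; the actual ClinicalTrials.gov records are richer structured XML:

```python
import re
import sqlite3

def load_adverse_events(lines):
    """Parse hypothetical 'trial | drug | event' lines into an
    in-memory SQLite table, skipping malformed lines."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE ae (trial TEXT, drug TEXT, event TEXT)")
    pat = re.compile(r"(NCT\d+)\s*\|\s*(\S+)\s*\|\s*(.+)")
    for line in lines:
        m = pat.match(line)
        if m:
            con.execute("INSERT INTO ae VALUES (?, ?, ?)", m.groups())
    con.commit()
    return con
```

Once the records are in SQL, drug-AE pattern queries become simple GROUP BY aggregations.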

  11. Association between earthquake events and cholera outbreaks: a cross-country 15-year longitudinal analysis.

    Science.gov (United States)

    Sumner, Steven A; Turner, Elizabeth L; Thielman, Nathan M

    2013-12-01

    Large earthquakes can cause population displacement, critical sanitation infrastructure damage, and increased threats to water resources, potentially predisposing populations to waterborne disease epidemics such as cholera. The risk of cholera outbreaks after earthquake disasters remains uncertain, and a cross-country analysis of World Health Organization (WHO) cholera data that would contribute to this discussion has yet to be published. A cross-country longitudinal analysis was conducted among 63 low- and middle-income countries from 1995-2009. The association between earthquake disasters of various effect sizes and a relative spike in cholera rates for a given country was assessed utilizing fixed-effects logistic regression, adjusting for gross domestic product per capita, water and sanitation level, flooding events, percent urbanization, and under-five child mortality. The association between large earthquakes and cholera rate increases of various degrees was also assessed. Forty-eight of the 63 countries had at least one year with reported cholera infections during the 15-year study period, and 36 of these 48 countries had at least one earthquake disaster. In adjusted analyses, country-years with ≥10,000 persons affected by an earthquake had 2.26 times increased odds (95% CI, 0.89-5.72, P = .08) of having a greater-than-average cholera rate that year compared to country-years without an earthquake. The association between large earthquake disasters and cholera infections appeared to weaken as higher levels of cholera rate increase were tested. A trend of increased risk of greater-than-average cholera rates when more people were affected by an earthquake in a country-year was noted; however, these findings did not reach statistical significance at traditional levels and may be due to chance. Frequent large-scale cholera outbreaks after earthquake disasters appear to be relatively uncommon.
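The odds ratio quoted above comes from an adjusted regression model; for a single 2×2 exposure-outcome table the unadjusted analogue, with a Woolf (log-scale) confidence interval, is simple to compute. A sketch with made-up counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Woolf 95% CI from a 2x2 table.

    a: exposed with outcome,   b: exposed without outcome,
    c: unexposed with outcome, d: unexposed without outcome.
    """
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(odds_ratio) - z * se)
    hi = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, lo, hi
```

A CI spanning 1 (as in the study's 0.89-5.72) is exactly what "did not reach statistical significance at traditional levels" refers to.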

  12. Discussion of comments from a peer review of a technique for human event analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Forester, J.A.; Ramey-Smith, A.; Bley, D.C.; Kolaczkowski, A.M.; Cooper, S.E.; Wreathall, J.

    1998-01-01

    In May of 1998, a technical basis and implementation guidelines document for A Technique for Human Event Analysis (ATHEANA) was issued as a draft report for public comment (NUREG-1624). In conjunction with the release of the draft NUREG, a peer review of the method, its documentation, and the results of an initial test of the method was held over a two-day period in Seattle, Washington, in June of 1998. Four internationally known and respected experts in human reliability analysis (HRA) were selected to serve as the peer reviewers and were paid for their services. In addition, approximately 20 other individuals with an interest in HRA and ATHEANA also attended the peer review meeting and were invited to provide comments. The peer review team was asked to comment on any aspect of the method or the report in which improvements could be made and to discuss its strengths and weaknesses. All of the reviewers thought the ATHEANA method had made significant contributions to the field of PRA/HRA, in particular by addressing the most important open questions and issues in HRA, by attempting to develop an integrated approach, and by developing a framework capable of identifying types of unsafe actions that generally have not been considered using existing methods. The reviewers had many concerns about specific aspects of the methodology and made many recommendations for ways to improve and extend the method, and to make its application more cost effective and useful to PRA in general. Details of the reviewers' comments and the ATHEANA team's responses to specific criticisms will be discussed.

  13. Analysis of post-blasting source mechanisms of mining-induced seismic events in Rudna copper mine, Poland

    Directory of Open Access Journals (Sweden)

    Caputa Alicja

    2015-10-01

    Full Text Available The exploitation of georesources by underground mining can be responsible for seismic activity in areas considered aseismic. Since strong seismic events are connected with rockburst hazard, there is a continuous requirement to reduce seismic risk. One of the most effective methods of doing so is blasting in potentially hazardous mining panels: in this way, small to moderate tremors are provoked and stress accumulation is substantially reduced. In this paper we present an analysis of post-blasting events using full moment tensor (MT) inversion of data from the underground seismic network of the Rudna mine, Poland. In addition, we describe the problems we faced when analyzing the seismic signals. Our studies show that focal mechanisms for events that occurred after blasts exhibit common features in the MT solution. The strong isotropic and small double-couple (DC) components of the MT indicate that these events were provoked by detonations. On the other hand, the post-blasting MT is considerably different from the MT obtained for strong mining events. We believe that seismological analysis of provoked and unprovoked events can be a very useful tool for confirming the effectiveness of blasting in seismic hazard reduction in mining areas.
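The isotropic component discussed above is the first step of a standard moment-tensor decomposition: remove tr(M)/3 from the diagonal, leaving a trace-free deviatoric part (which is further split into DC and CLVD terms, omitted here). A minimal sketch with a toy 3×3 tensor:

```python
def split_moment_tensor(m):
    """Split a 3x3 moment tensor (nested lists) into its isotropic
    scalar and its trace-free deviatoric part."""
    iso = sum(m[i][i] for i in range(3)) / 3.0
    dev = [[m[i][j] - (iso if i == j else 0.0) for j in range(3)]
           for i in range(3)]
    return iso, dev
```

A purely explosive source (as a blast approximates) has a large isotropic scalar and a near-zero deviatoric part, which is the signature the authors use to distinguish provoked events.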

  14. Assessment of Adverse Events in Protocols, Clinical Study Reports, and Published Papers of Trials of Orlistat: A Document Analysis.

    Directory of Open Access Journals (Sweden)

    Jeppe Bennekou Schroll

    2016-08-01

    seven papers stated that "all adverse events were recorded." For one trial, we identified an additional 1,318 adverse events that were not listed or mentioned in the CSR itself but could be identified through manually counting individual adverse events reported in an appendix. We discovered that the majority of patients had multiple episodes of the same adverse event that were only counted once, though this was not described in the CSRs. We also discovered that participants treated with orlistat experienced twice as many days with adverse events as participants treated with placebo (22.7 d versus 14.9 d, p-value < 0.0001, Student's t test). Furthermore, compared with the placebo group, adverse events in the orlistat group were more severe. None of this was stated in the CSR or in the published paper. Our analysis was restricted to one drug tested in the mid-1990s; our results might therefore not be applicable to newer drugs. In the orlistat trials, we identified important disparities in the reporting of adverse events between protocols, clinical study reports, and published papers. Reports of these trials seemed to have systematically understated adverse events. Based on these findings, systematic reviews of drugs might be improved by including protocols and CSRs in addition to published articles.

  15. Making systems with mutually exclusive events analysable by standard fault tree analysis tools

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    2001-01-01

    Methods are developed for analysing systems that comprise mutually exclusive events by fault tree techniques that accept only statistically independent basic events. Techniques based on equivalent models and numerical transformations are presented for phased missions and for systems with component-caused system-level common cause failures. Numerical examples illustrate the methods
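The core difficulty the paper addresses can be seen in one number: a standard fault tree tool evaluates an OR gate over basic events with the independence formula, which understates the probability when the events are actually mutually exclusive. A minimal numerical illustration with generic probabilities (not from the paper, which develops exact equivalent models rather than this naive comparison):

```python
def or_gate_independent(ps):
    """OR-gate probability as a standard tool computes it,
    assuming statistically independent basic events."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def or_gate_exclusive(ps):
    """Exact OR-gate probability when the basic events are
    mutually exclusive: probabilities simply add."""
    return sum(ps)
```

The gap between the two (0.28 versus 0.30 for probabilities 0.1 and 0.2) is what the equivalent-model transformations in the paper are designed to eliminate.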

  16. A trend analysis of human error events for proactive prevention of accidents. Methodology development and effective utilization

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Ebisu, Mitsuhiro; Aikawa, Takeshi; Matsubara, Katsuyuki

    2006-01-01

    This paper describes methods for analyzing human error events that have been accumulated at an individual plant and for utilizing the results to prevent accidents proactively. Firstly, a categorization framework for trigger actions and causal factors of human error events was reexamined, and the procedure to analyze human error events was reviewed based on the framework. Secondly, a method for identifying the common characteristics of the trigger action data and the causal factor data accumulated by analyzing human error events was clarified. In addition, to utilize the results of the trend analysis effectively, methods to develop teaching material for safety education, to develop checkpoints for error prevention, and to introduce an error management process for strategic error prevention were proposed. (author)

  17. Local-scale analysis of temperature patterns over Poland during heatwave events

    Science.gov (United States)

    Krzyżewska, Agnieszka; Dyer, Jamie

    2018-01-01

    Heatwaves are predicted to increase in frequency, duration, and severity in the future, including over Central Europe, where populations are sensitive to extreme temperature. This paper studies six recent major heatwave events over Poland from 2006 through 2015 using regional-scale simulations (10-km grid spacing, hourly frequency) from the Weather Research and Forecasting (WRF) model to define local-scale 2-m temperature patterns. For this purpose, a heatwave is defined as at least three consecutive days with maximum 2-m air temperature exceeding 30 °C. The WRF simulations were validated using maximum daily 2-m temperature observations from 12 meteorological stations in selected Polish cities, chosen to give even spatial coverage across the study area. Synoptic analysis of the six study events shows that the inflow of tropical air masses from the south is the primary driver of heatwave onset and maintenance. The highest temperatures (and most vulnerable areas) occur over arable land and artificial surfaces in central and western Poland, while coastal areas in the north, mountain areas in the south, and the forested and mosaic areas of smaller fields and pastures in the northwest, northeast, and southeast are less affected by prolonged periods of elevated temperatures. In general, regional differences in 2-m temperature between the hottest and coolest areas are about 2-4 °C. Large urban areas like Warsaw, or the large complex of artificial surfaces in the conurbation of Silesian cities, are also generally warmer than their surroundings by roughly 2-4 °C, and even up to 6 °C, especially during the night. Additionally, hot air from the south of Poland flows through a low-lying area between two mountain ranges (the Sudetes and the Carpathians), the so-called Moravian Gate, hitting densely populated urban areas (the Silesian cities) and Cracow. These patterns occur only during high-pressure synoptic conditions with low cloudiness and wind and without any active fronts
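The heatwave definition used above (at least three consecutive days with maximum 2-m air temperature exceeding 30 °C) translates directly into a run-length scan over a daily Tmax series. A sketch with toy data, not WRF output:

```python
def heatwaves(tmax, threshold=30.0, min_days=3):
    """Return (start_day, end_day) index pairs of runs of at least
    min_days consecutive days with tmax strictly above threshold."""
    events, run = [], 0
    for i, t in enumerate(tmax):
        if t > threshold:
            run += 1
        else:
            if run >= min_days:
                events.append((i - run, i - 1))
            run = 0
    if run >= min_days:                      # run reaching the series end
        events.append((len(tmax) - run, len(tmax) - 1))
    return events
```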

  18. Using Event-history Analysis: Lessons from Fifteen Years of Practice

    Directory of Open Access Journals (Sweden)

    Le Bourdais, Céline

    2001-01-01

    Full Text Available Innovative statistical methods and new longitudinal surveys paved the way for the widespread use of event-history analysis in social science during the last two decades. This paper does not attempt to provide a comprehensive review of these innovative methods. More modestly, it aims at identifying and describing the problems encountered by two privileged users. Two types of problems are discussed here. The first arises from the design of the surveys, or the way data are collected, and the difficulty of testing specific hypotheses with the existing databases; this is the kind of problem that Le Bourdais has faced in analyzing family dynamics. The second has to do with the limitations of survival regression models when the longitudinal phenomena studied can no longer properly be thought of as a small number of unique events; this is the type of problem encountered by Renaud in his ten-year Quebec panel survey of new immigrants.

  19. Cause analysis and preventives for human error events in Daya Bay NPP

    International Nuclear Information System (INIS)

    Huang Weigang; Zhang Li

    1998-01-01

    The Daya Bay Nuclear Power Plant was put into commercial operation in 1994. Up to 1996, there were 368 human error events in the operating and maintenance areas, accounting for 39% of all events. These events occurred mainly in the processes of maintenance, testing, equipment isolation and returning systems on-line, in particular during refuelling and maintenance. The authors analyse the root causes of human error events, which are mainly: operator omission or error; procedure deficiency; procedure not followed; lack of training; communication failures; and inadequate work management. Protective measures and treatment principles for human error events are also discussed, and several examples of their application are given. Finally, it is pointed out that the key to preventing human error events lies in coordination and management, the person in charge of the work, and the good work habits of staff.

  20. The economic burden of nurse-sensitive adverse events in 22 medical-surgical units: retrospective and matching analysis.

    Science.gov (United States)

    Tchouaket, Eric; Dubois, Carl-Ardy; D'Amour, Danielle

    2017-07-01

    The aim of this study was to assess the economic burden of nurse-sensitive adverse events in 22 acute-care units in Quebec by estimating excess hospital-related costs and calculating resulting additional hospital days. Recent changes in the worldwide economic and financial contexts have made the cost of patient safety a topical issue. Yet, our knowledge about the economic burden of safety of nursing care is quite limited in Canada in general and Quebec in particular. Retrospective analysis of charts of 2699 patients hospitalized between July 2008 - August 2009 for at least 2 days of 30-day periods in 22 medical-surgical units in 11 hospitals in Quebec. Data were collected from September 2009 to August 2010. Nurse-sensitive adverse events analysed were pressure ulcers, falls, medication administration errors, pneumonia and urinary tract infections. Descriptive statistics identified numbers of cases for each nurse-sensitive adverse event. A literature analysis was used to estimate excess median hospital-related costs of treatments with these nurse-sensitive adverse events. Costs were calculated in 2014 Canadian dollars. Additional hospital days were estimated by comparing lengths of stay of patients with nurse-sensitive adverse events with those of similar patients without nurse-sensitive adverse events. This study found that five adverse events considered nurse-sensitive caused nearly 1300 additional hospital days for 166 patients and generated more than Canadian dollars 600,000 in excess treatment costs. The results present the financial consequences of the nurse-sensitive adverse events. Government should invest in prevention and in improvements to care quality and patient safety. Managers need to strengthen safety processes in their facilities and nurses should take greater precautions. © 2017 John Wiley & Sons Ltd.

  1. Using Web Crawler Technology for Geo-Events Analysis: A Case Study of the Huangyan Island Incident

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2014-04-01

    Full Text Available Social networking and network socialization provide abundant text information and social relationships in our daily lives. Making full use of these data in the big data era is of great significance for better understanding the changing world and the information-based society. Although politics has been integrally involved in hyperlinked world issues since the 1990s, the text analysis and data visualization of geo-events long faced the bottleneck of traditional manual analysis. Although the automatic assembly of different geospatial web services and distributed geospatial information systems utilizing service chaining has recently been explored and built, data mining and information collection are not comprehensive enough because of the sensitivity, complexity, relativity, timeliness, and unexpected characteristics of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency, and dissemination path of the Huangyan Island incident were studied using web crawler technology and text analysis. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios, and dissemination flow graph based on the crawled information and data processing not only highlight the characteristics of the geo-event itself, but also reveal many interesting phenomena and deep-seated problems behind it, such as related topics, theme vocabularies, subject contents, hot countries, event bodies, opinion leaders, high-frequency vocabularies, information sources, semantic structure, propagation paths, distribution of different attitudes, and regional differences in net citizens' responses to the Huangyan Island incident. Furthermore, the text analysis of network information with the help of a focused web crawler is able to express the time-space relationship of the crawled information and the semantic-network characteristics of the geo-events. Therefore, it is a useful tool to
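The word-frequency part of the text analysis described above is, at its core, tokenisation plus counting. A minimal sketch in which the stop-word list is an arbitrary placeholder and the inputs are assumed to be already-extracted plain text (real crawled pages would need HTML stripping first):

```python
import re
from collections import Counter

def word_frequencies(docs, stopwords=frozenset({"the", "of", "and", "a", "in"})):
    """Count word occurrences across documents, skipping stop words."""
    counts = Counter()
    for doc in docs:
        for word in re.findall(r"[a-z']+", doc.lower()):
            if word not in stopwords:
                counts[word] += 1
    return counts
```

The resulting counts are exactly what a tag cloud or frequency map visualizes, with font size or bar height proportional to each word's count.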

  2. Assessment of realistic nowcasting lead-times based on predictability analysis of Mediterranean Heavy Precipitation Events

    Science.gov (United States)

    Bech, Joan; Berenguer, Marc

    2014-05-01

    ' precipitation forecasts showed some skill (improvement over persistence) for lead times up to 60' for moderate intensities (up to 1 mm in 30') and up to 2.5 h for lower rates (above 0.1 mm). However, an important event-to-event variability has been found, as illustrated by the fact that hit rates of rain-no-rain forecasts achieved the 60% value at 90' in the 7 September 2005 case and only at 40' in the 2 November 2008 case. The discussion of these results provides useful information on the potential application of nowcasting systems and realistic values to be contrasted with specific end-user requirements. This work has been done in the framework of the Hymex research programme and has been partly funded by the ProFEWS project (CGL2010-15892). References Bech J, N Pineda, T Rigo, M Aran, J Amaro, M Gayà, J Arús, J Montanyà, O van der Velde, 2011: A Mediterranean nocturnal heavy rainfall and tornadic event. Part I: Overview, damage survey and radar analysis. Atmospheric Research 100:621-637 http://dx.doi.org/10.1016/j.atmosres.2010.12.024 Bech J, R Pascual, T Rigo, N Pineda, JM López, J Arús, and M Gayà, 2007: An observational study of the 7 September 2005 Barcelona tornado outbreak. Natural Hazards and Earth System Science 7:129-139 http://dx.doi.org/10.5194/nhess-7-129-2007 Berenguer M, C Corral, R Sánchez-Diezma, D Sempere-Torres, 2005: Hydrological validation of a radar-based nowcasting technique. Journal of Hydrometeorology 6: 532-549 http://dx.doi.org/10.1175/JHM433.1 Berenguer M, D Sempere, G Pegram, 2011: SBMcast - An ensemble nowcasting technique to assess the uncertainty in rainfall forecasts by Lagrangian extrapolation. Journal of Hydrology 404: 226-240 http://dx.doi.org/10.1016/j.jhydrol.2011.04.033 Pierce C, A Seed, S Ballard, D Simonin, Z Li, 2012: Nowcasting. In Doppler Radar Observations (J Bech, JL Chau, ed.) Ch. 13, 98-142. InTech, Rijeka, Croatia http://dx.doi.org/10.5772/39054
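The hit-rate and skill-over-persistence comparison described above can be sketched with a toy computation. The rain/no-rain sequences below are invented for illustration, not data from the study:

```python
# Sketch: hit rate of a binary rain/no-rain nowcast versus a persistence
# forecast (previous observation carried forward). All numbers are
# illustrative placeholders, not values from the study.

def hit_rate(forecast, observed):
    """Fraction of time steps where forecast and observation agree."""
    hits = sum(f == o for f, o in zip(forecast, observed))
    return hits / len(observed)

# Illustrative 30-minute rain (1) / no-rain (0) sequence.
observed = [0, 0, 1, 1, 1, 0, 1, 1, 0, 0]
nowcast  = [0, 1, 1, 1, 0, 0, 1, 1, 0, 0]

# Persistence: the forecast for step t is the observation at step t-1.
persistence = [observed[0]] + observed[:-1]

hr_nowcast = hit_rate(nowcast, observed)
hr_persist = hit_rate(persistence, observed)
skill = hr_nowcast - hr_persist   # positive => improvement over persistence
print(hr_nowcast, hr_persist, skill)
```

A nowcast "shows skill" in the sense used above exactly when `skill` is positive, i.e. when it beats carrying the last observation forward.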

  3. Using high complexity analysis to probe the evolution of organic aerosol during pollution events in Beijing

    Science.gov (United States)

    Hamilton, J.; Dixon, W.; Dunmore, R.; Squires, F. A.; Swift, S.; Lee, J. D.; Rickard, A. R.; Sun, Y.; Xu, W.

    2017-12-01

    There is increasing evidence that exposure to air pollution results in significant impacts on human health. In Beijing, home to over 20 million inhabitants, particulate matter levels are very high by international standards, with official estimates of an annual mean PM2.5 concentration in 2014 of 86 μg m-3, nearly 9 times higher than the WHO guideline. Changes in particle composition during pollution events will provide key information on sources and can be used to inform strategies for pollution mitigation and health benefits. The organic fraction of PM is an extremely complex mixture reflecting the diversity of sources to the atmosphere. In this study we attempt to harness the chemical complexity of OA by developing an extensive database of over 700 mass spectra, built using literature data and source-specific tracers (e.g. diesel emission characterisation experiments and SOA generated in chamber simulations). Using a high-throughput analysis method (15 min), involving UHPLC coupled to Orbitrap mass spectrometry, chromatograms are integrated, compared to the library, and a list of identified compounds produced. Purpose-built software based on R is used to automatically produce time series, alongside common aerosol metrics and data visualisation techniques, dramatically reducing analysis times. Offline measurements of organic aerosol composition were made as part of the Sources and Emissions of Air Pollutants in Beijing project, a collaborative program between leading UK and Chinese research groups. Rather than studying only a small number of 24 hr PM samples, we collected 250 filter samples at a range of different time resolutions, from 30 minutes to 12 hours, depending on the time of day and PM loadings. In total 643 species were identified based on their elemental formula and retention time, with species ranging from C2-C22 and between 1 and 13 oxygens. A large fraction of the OA species observed were organosulfates and/or nitrates. Here we will present
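The library-matching step described above (compare each observed peak's accurate mass and retention time against database entries) can be sketched as follows. The library entries, masses, retention times, and tolerances here are invented placeholders, not values from the study:

```python
# Sketch of spectral library matching: assign each observed chromatographic
# peak (accurate m/z, retention time) to the first library entry that falls
# within both tolerances. Entries and tolerances are hypothetical.

LIBRARY = [
    {"name": "cpd_A", "mz": 187.0976, "rt": 3.42},
    {"name": "cpd_B", "mz": 212.0503, "rt": 7.10},
    {"name": "cpd_C", "mz": 187.0976, "rt": 9.85},  # isomer: same mass, later RT
]

def match_peak(mz, rt, mz_tol=0.005, rt_tol=0.2):
    """Return the name of the first library entry within both tolerances."""
    for entry in LIBRARY:
        if abs(entry["mz"] - mz) <= mz_tol and abs(entry["rt"] - rt) <= rt_tol:
            return entry["name"]
    return None

print(match_peak(187.0978, 9.80))  # isomers are separated by retention time
print(match_peak(150.0000, 5.00))  # no library hit
```

Retention time is what disambiguates isomers sharing an elemental formula, which is why the study identifies species "based on their elemental formula and retention time".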

  4. Robust non-parametric one-sample tests for the analysis of recurrent events.

    Science.gov (United States)

    Rebora, Paola; Galimberti, Stefania; Valsecchi, Maria Grazia

    2010-12-30

    One-sample non-parametric tests are proposed here for inference on recurring events. The focus is on the marginal mean function of events and the basis for inference is the standardized distance between the observed and the expected number of events under a specified reference rate. Different weights are considered in order to account for various types of alternative hypotheses on the mean function of the recurrent events process. A robust version and a stratified version of the test are also proposed. The performance of these tests was investigated through simulation studies under various underlying event generation processes, such as homogeneous and nonhomogeneous Poisson processes, autoregressive and renewal processes, with and without frailty effects. The robust versions of the test have been shown to be suitable in a wide variety of event generating processes. The motivating context is a study on gene therapy in a very rare immunodeficiency in children, where a major end-point is the recurrence of severe infections. Robust non-parametric one-sample tests for recurrent events can be useful to assess efficacy and especially safety in non-randomized studies or in epidemiological studies for comparison with a standard population. Copyright © 2010 John Wiley & Sons, Ltd.
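The core idea of the statistic, the standardized distance between the observed and expected number of events under a reference rate, can be sketched in its simplest unweighted form. This is a simplification of the proposed weighted tests, with made-up data:

```python
# Minimal sketch of a one-sample test for recurrent events: compare the
# observed event count with the count expected under a reference rate,
# standardized by its Poisson variance. The data below are illustrative.
import math

def one_sample_z(events_per_subject, followup_years, reference_rate):
    observed = sum(events_per_subject)
    expected = reference_rate * sum(followup_years)  # E[N] under the null
    # Under a Poisson null, Var(N) = E[N]; the statistic is approx N(0,1).
    return (observed - expected) / math.sqrt(expected)

events   = [3, 0, 2, 5, 1, 4]               # e.g. severe infections per child
followup = [2.0, 1.5, 2.0, 3.0, 1.0, 2.5]   # person-years at risk
z = one_sample_z(events, followup, reference_rate=0.8)  # assumed 0.8 events/yr
print(round(z, 2))
```

The robust and weighted versions in the paper modify the variance estimate and weight the contributions over time; the comparison of observed versus expected counts stays the same.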

  5. Analysis of economic and social costs of adverse events associated with blood transfusions in Spain.

    Science.gov (United States)

    Ribed-Sánchez, Borja; González-Gaya, Cristina; Varea-Díaz, Sara; Corbacho-Fabregat, Carlos; Bule-Farto, Isabel; Pérez de-Oteyza, Jaime

    2018-02-16

    To calculate, for the first time, the direct and social costs of transfusion-related adverse events in order to include them in the National Healthcare System's budget calculations and studies. In Spain more than 1,500 patients yearly are diagnosed with such adverse events. Blood transfusion-related adverse events recorded yearly in Spanish haemovigilance reports were studied retrospectively (2010-2015). The adverse events were coded according to the classification of Diagnosis-Related Groups. The direct healthcare costs were obtained from public information sources. The productivity loss (social cost) associated with adverse events was calculated using the human capital and hedonic salary methodologies. In 2015, 1,588 patients had adverse events that resulted in direct health care costs (4,568,914€) and social costs due to hospitalization (200,724€). Three adverse reactions resulted in patient death (at a social cost of 1,364,805€). In total, the cost of blood transfusion-related adverse events was 6,134,443€ in Spain. For the period 2010-2015, the trends show a reduction in the total number of transfusions (2 vs. 1.91M; -4.4%). The number of adverse events increased (822 vs. 1,588; +93%), as well as their related direct healthcare cost (3.22 vs. 4.57M€; +42%) and the social cost of hospitalization (0.11 vs. 0.20M€; +83%). Mortality costs decreased (2.65 vs. 1.36M€; -48%). This is the first time that the costs of post-transfusion adverse events have been calculated in Spain. These new figures and trends should be taken into consideration in any cost-effectiveness study or trial of new surgical techniques or sanitary policies that influence blood transfusion activities. Copyright © 2018 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
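The 2015 total reported in the abstract is the straightforward sum of its three cost components, which can be checked directly:

```python
# Reproducing the 2015 total cost of transfusion-related adverse events in
# Spain from the three components reported in the abstract.

direct_healthcare = 4_568_914   # EUR, direct healthcare costs
social_hospital   = 200_724     # EUR, productivity loss from hospitalization
social_mortality  = 1_364_805   # EUR, productivity loss from the 3 deaths

total = direct_healthcare + social_hospital + social_mortality
print(total)  # 6134443, matching the reported 6,134,443 EUR
```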

  6. Application of a new technique for human event analysis (ATHEANA) at a pressurized-water reactor

    International Nuclear Information System (INIS)

    Forester, J.A.; Kiper, K.; Ramey-Smith, A.

    1998-04-01

    Over the past several years, the US Nuclear Regulatory Commission (NRC) has sponsored the development of a new method for performing human reliability analyses (HRAs). A major impetus for the program was the recognized need for a method that would not only address errors of omission (EOOs), but also errors of commission (EOCs). Although several documents have been issued describing the basis and development of the new method referred to as ''A Technique for Human Event Analysis'' (ATHEANA), two documents were drafted to initially provide the necessary documentation for applying the method: the frame of reference (FOR) manual, which served as the technical basis document for the method, and the implementation guideline (IG), which provided step-by-step guidance for applying the method. Upon the completion of the draft FOR manual and the draft IG in April 1997, along with several step-throughs of the process by the development team, the method was ready for a third-party test. The method was demonstrated at Seabrook Station in July 1997. The main goals of the demonstration were to (1) test the ATHEANA process as described in the FOR manual and the IG, (2) test a training package developed for the method, (3) test the hypothesis that plant operators and trainers have significant insight into the error-forcing contexts (EFCs) that can make unsafe actions (UAs) more likely, and (4) identify ways to improve the method and its documentation. The results of the Seabrook demonstration are evaluated against the success criteria, and important findings and recommendations regarding ATHEANA that were obtained from the demonstration are presented here

  7. On risk analysis for repositories in northern Switzerland: extent and probability of geological processes and events

    International Nuclear Information System (INIS)

    Buergisser, H.M.; Herrnberger, V.

    1981-01-01

    The literature study assesses, in the form of expert analysis, geological processes and events for a 1200 km² area of northern Switzerland, with regard to repositories for medium- and high-active waste (depth 100 to 600 m and 600 to 2500 m, respectively) over the next 10⁶ years. The area, which comprises parts of the Tabular Jura, the folded Jura and the Molasse Basin, the latter two being parts of the Alpine Orogene, has undergone a non-uniform geologic development since the Oligocene. Within the next 10⁴ to 10⁵ years a maximum earthquake intensity of VIII-IX (MSK scale) has been predicted. After this period, particularly in the southern and eastern parts of the area, glaciations will probably occur, with associated erosion of possibly 200 to 300 m. Fluvial erosion as a response to an uplift could reach similar values after 10⁵ to 10⁶ years; however, there are no data on the recent relative vertical crustal movements of the area. The risk of a meteorite impact is considered small as compared to that of these factors. Seismic activity and the position and extent of faults are so poorly known within the area that the faulting probability cannot be derived at present. Flooding by the sea, intrusion of magma, diapirism, metamorphism and volcanic eruptions are not considered to be risk factors for final repositories in northern Switzerland. For the shallow-type repositories, the risks of denudation and landslides have to be judged when locality-bound projects have been proposed. (Auth.)

  8. A Mediterranean nocturnal heavy rainfall and tornadic event. Part I: Overview, damage survey and radar analysis

    Science.gov (United States)

    Bech, Joan; Pineda, Nicolau; Rigo, Tomeu; Aran, Montserrat; Amaro, Jéssica; Gayà, Miquel; Arús, Joan; Montanyà, Joan; van der Velde, Oscar

    2011-06-01

    This study presents an analysis of a severe weather case that took place during the early morning of the 2nd of November 2008, when intense convective activity associated with a rapidly evolving low pressure system affected the southern coast of Catalonia (NE Spain). The synoptic framework was dominated by an upper-level trough and an associated cold front extending from Gibraltar along the Mediterranean coast of the Iberian Peninsula to SE France, which moved north-eastward. South-easterly winds in the north of the Balearic Islands and the coast of Catalonia favoured high values of 0-3 km storm-relative helicity which, combined with moderate MLCAPE values and high shear, provided the conditions for organized convection. A number of multicell storms and others exhibiting supercell features, as indicated by Doppler radar observations, clustered later in a mesoscale convective system, and moved north-eastwards across Catalonia. They produced strong damaging wind gusts at ground level, an F2 tornado, hail and heavy rainfall. Total lightning activity (intra-cloud and cloud-to-ground flashes) was also relevant, exhibiting several classical features such as a sudden increase in rate before ground-level severe damage, as discussed in a companion study. Remarkable surface observations of this event include 24 h precipitation accumulations exceeding 100 mm in four different observatories and 30-minute rainfall amounts of up to 40 mm which caused local flash floods. As the convective system evolved northward later that day it also affected SE France, causing large hail, damaging wind gusts at ground level and heavy rainfall.

  9. Myasthenia gravis: descriptive analysis of life-threatening events in a recent nationwide registry.

    Science.gov (United States)

    Ramos-Fransi, A; Rojas-García, R; Segovia, S; Márquez-Infante, C; Pardo, J; Coll-Cantí, J; Jericó, I; Illa, I

    2015-07-01

    Myasthenia gravis (MG) may become life-threatening if patients have respiratory insufficiency or dysphagia. This study aimed to determine the incidence, demographic characteristics, risk factors, response to treatment and outcome of these life-threatening events (LTEs) in a recent, population-based sample of MG patients. A retrospective analysis of MG patients who presented with an LTE between 2000 and 2013 was performed. Participants were identified from a neuromuscular diseases registry in Spain that includes 648 patients with MG (NMD-ES). Sixty-two (9.56%) patients had an LTE. Thirty-two were classified as class V according to the MG Foundation of America, and 30 as class IVB. Fifty per cent were previously diagnosed with MG and median duration of the disease before the LTE was 24 months (3-406). The most common related factor was infection (n = 18). All patients received intravenous human immunoglobulin; 11 had a second infusion and six had plasma exchange. Median time to feeding tube removal was 13 days (1-434). Median time to weaning from ventilation was 12 days (3-176), and it was significantly shorter in late-onset MG (≥50 years) (P = 0.019). LTEs improved within 2 weeks in 55.8% of patients but did not improve until after 1 month in 20%. Four patients died. No other factors influenced mortality or duration of LTEs. The percentage of LTEs in MG patients was low, particularly amongst those previously diagnosed and treated for the disease. The significant percentage of treatment-resistant LTEs indicates that more effective treatment approaches are needed for this vulnerable sub-population. © 2015 EAN.

  10. Risk analysis for repositories in north Switzerland. Extent and probability of geologic processes and events

    Energy Technology Data Exchange (ETDEWEB)

    Buergisser, H M; Herrnberger, V

    1981-07-01

    The literature study assesses, in the form of expert analysis, geological processes and events for a 1200 km² area of northern Switzerland, with regard to repositories for medium- and high-active waste (depth 100 to 600 m and 600 to 2500 m, respectively) over the next 10⁶ years. The area, which comprises parts of the Tabular Jura, the folded Jura and the Molasse Basin, the latter two being parts of the Alpine Orogene, has undergone a non-uniform geologic development since the Oligocene. Within the next 10⁴ to 10⁵ years a maximum earthquake intensity of VIII-IX (MSK scale) has been predicted. After this period, particularly in the southern and eastern parts of the area, glaciations will probably occur, with associated erosion of possibly 200 to 300 m. Fluvial erosion as a response to an uplift could reach similar values after 10⁵ to 10⁶ years; however, there are no data on the recent relative vertical crustal movements of the area. The risk of a meteorite impact is considered small as compared to that of these factors. Seismic activity and the position and extent of faults are so poorly known within the area that the faulting probability cannot be derived at present. Flooding by the sea, intrusion of magma, diapirism, metamorphism and volcanic eruptions are not considered to be risk factors for final repositories in northern Switzerland. For the shallow-type repositories, the risks of denudation and landslides have to be judged when locality-bound projects have been proposed.

  11. Tracing footprints of environmental events in tree ring chemistry using neutron activation analysis

    Science.gov (United States)

    Sahin, Dagistan

    The aim of this study is to identify environmental effects on tree-ring chemistry. It is known that industrial pollution, volcanic eruptions, dust storms, acid rain and similar events can cause substantial changes in soil chemistry. Establishing whether a particular group of trees is sensitive to these changes in the soil environment and registers them in the elemental chemistry of contemporary growth rings is the overriding goal of any Dendrochemistry research. In this study, elemental concentrations were measured in tree-ring samples of eleven absolutely dated modern forest trees, grown in the Mediterranean region, Turkey, collected and dated by the Malcolm and Carolyn Wiener Laboratory for Aegean and Near Eastern Dendrochronology at Cornell University. Correlations between measured elemental concentrations in the tree-ring samples were analyzed using statistical tests to answer two questions. Does the current concentration of a particular element depend on any other element within the tree? And, are there any elements showing correlated abnormal concentration changes across the majority of the trees? Based on the detailed analysis results, the low mobility of sodium and bromine, positive correlations between calcium, zinc and manganese, and positive correlations between the trace elements lanthanum, samarium, antimony, and gold within tree-rings were recognized. Moreover, zinc, lanthanum, samarium and bromine showed strong, positive correlations among the trees and were identified as possible environmental signature elements. New Dendrochemistry information found in this study would also be useful in explaining tree physiology and elemental chemistry in the Pinus nigra species grown in Turkey. Elemental concentrations in tree-ring samples were measured using Neutron Activation Analysis (NAA) at the Pennsylvania State University Radiation Science and Engineering Center (RSEC). Through this study, advanced methodologies for methodological, computational and
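The correlation analysis between element series can be sketched with a Pearson correlation over ring-by-ring concentrations. The concentration values below are invented for illustration; the study used NAA measurements:

```python
# Sketch of the correlation step: Pearson correlation between ring-by-ring
# concentration series of two elements in one tree. Values are illustrative.
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

zinc      = [12.1, 13.4, 11.8, 15.2, 14.7, 16.0]  # ppm per ring (made up)
manganese = [30.5, 33.0, 29.9, 37.1, 36.2, 38.8]

r = pearson(zinc, manganese)
print(round(r, 3))  # strongly positive for these made-up series
```

In the study, such pairwise correlations within a tree answer the first question (does one element's concentration depend on another's), while correlated abnormal changes across trees answer the second.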

  12. Interval estimation of the overall treatment effect in a meta-analysis of a few small studies with zero events

    NARCIS (Netherlands)

    Pateras, Konstantinos; Nikolakopoulos, Stavros; Mavridis, Dimitris; Roes, Kit C.B.

    2018-01-01

    When a meta-analysis consists of a few small trials that report zero events, accounting for heterogeneity in the (interval) estimation of the overall effect is challenging. Typically, we predefine meta-analytical methods to be employed. In practice, data poses restrictions that lead to deviations

  13. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    Science.gov (United States)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking and network socialisation have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study the phenomena and disciplines of the natural sciences has prompted many specialists and scholars to innovate their research. Though politics have been integrally involved in the hyperlinked world since the 1990s, and the automatic assembly of different geospatial web services and distributed geospatial information systems utilizing service chaining has been explored and built recently, the information collection and data visualisation of geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpected characteristics of political events. Based on the framework of Heritrix and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology and the text analysis method. The results indicate that the tag cloud, frequency map, attitudes pie, individual mention ratios and dissemination flow graph based on the data collection and processing not only highlight the subject and theme vocabularies of related topics but also certain issues and problems behind them. Being able to express the time-space relationship of text information and to disseminate the information regarding geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinions in political events.
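The word-frequency step that feeds the tag cloud and frequency map can be sketched as simple tokenization and counting of crawled page text. The sample text and stopword list below are placeholders, not actual crawled content:

```python
# Minimal sketch of word-frequency analysis over crawled text: tokenize,
# drop stopwords, count terms. Sample text and stopwords are illustrative.
import re
from collections import Counter

def word_frequencies(text, stopwords=frozenset({"the", "of", "in", "and", "a", "on"})):
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(t for t in tokens if t not in stopwords)

crawled = ("The incident in the South China Sea drew wide attention. "
           "Commentary on the incident spread across the web.")
freq = word_frequencies(crawled)
print(freq.most_common(3))
```

The resulting counts map directly to tag-cloud font sizes and frequency-map values; sentiment tendency and dissemination paths require additional per-document labels and link structure on top of this.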

  14. Analysis of early events in the interaction between Fusarium graminearum and the susceptible barley (Hordeum vulgare) cultivar Scarlett

    DEFF Research Database (Denmark)

    Yang, Fen; Jensen, J.D.; Svensson, Birte

    2010-01-01

    A proteomic analysis was conducted to map the events during the initial stages of the interaction between the fungal pathogen Fusarium graminearum and the susceptible barley cultivar Scarlett. Quantification of fungal DNA demonstrated a sharp increase in fungal biomass in barley spikelets at 3 da...

  15. Analysis on typical illegal events for nuclear safety class 1 valve

    International Nuclear Information System (INIS)

    Tian Dongqing; Gao Runsheng; Jiao Dianhui; Yang Lili; Chen Peng

    2014-01-01

    An illegal welding event involving a nuclear safety class 1 valve forging occurred at the manufacturer when the valve was returned for repair. An illegal nondestructive testing event involving a nuclear safety class valve also occurred at the manufacturer during the manufacturing process. The two events resulted in incipient quality faults in the installed valves and in the valves still being manufactured. They reflect that the manufacturer's quality assurance system was not operating effectively and that the nuclear power engineering and operating companies exercised insufficient supervision. The parties involved should strengthen quality management and process control, eliminate the incipient quality faults, and carry out experience feedback well to guarantee the quality of equipment in nuclear power plants. (authors)

  16. Multivariate analysis methods to tag b quark events at LEP/SLC

    International Nuclear Information System (INIS)

    Brandl, B.; Falvard, A.; Guicheney, C.; Henrard, P.; Jousset, J.; Proriol, J.

    1992-01-01

    Multivariate analyses are applied to tag Z → bb-bar events at LEP/SLC. They are based on the specific b-event shape caused by the large b-quark mass. Discriminant analyses, classification trees and neural networks are presented and their performances are compared. It is shown that the neural network approach, due to its non-linearity, copes best with the complexity of the problem. As an example for an application of the developed methods the measurement of Γ(Z → bb-bar) is discussed. The usefulness of methods based on the global event shape is limited by the uncertainties introduced by the necessity of event simulation. As solution a double tag method is presented which can be applied to many tasks of LEP/SLC heavy flavour physics. (author) 29 refs.; 6 figs.; 1 tab
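The idea behind the tagging methods above, combining event-shape variables sensitive to the large b mass into a single discriminant, can be sketched with a hand-weighted logistic score. This is a stand-in for the trained discriminant analyses and neural networks in the paper; the variables and weights are purely illustrative:

```python
# Sketch of a b-tagging discriminant: a logistic combination of event-shape
# variables. A trained neural network replaces the hand-picked weights in
# practice; everything numeric here is a made-up placeholder.
import math

def b_tag_score(shape_vars, weights, bias):
    """Logistic combination of event-shape variables -> score in (0, 1)."""
    s = bias + sum(w * v for w, v in zip(weights, shape_vars))
    return 1.0 / (1.0 + math.exp(-s))

weights = [2.5, 1.8, -0.7]   # invented weights for 3 shape variables
bias = -1.2

b_like = b_tag_score([0.8, 0.9, 0.1], weights, bias)  # b-like event shape
udsc   = b_tag_score([0.2, 0.3, 0.7], weights, bias)  # light-quark-like shape
print(round(b_like, 3), round(udsc, 3))
```

A neural network generalizes this by learning non-linear combinations of the same inputs, which is why the abstract notes it "copes best with the complexity of the problem".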

  17. Posttraumatic stress disorder in bosnian war veterans: Analysis of stress events and risk factors

    Directory of Open Access Journals (Sweden)

    Kuljić Blagoje

    2004-01-01

    Full Text Available The aim of this study was to determine the incidence of Post-Traumatic Stress Disorder (PTSD), the characteristics of stress-related events, and the risk factors for the development of PTSD. The total patient sample consisted of 100 Bosnian war veterans. Watson’s PTSD module was used in establishing the PTSD diagnosis. Patients completed the following questionnaires: personal data form, Posttraumatic Symptom Scale PTSS-10 (Holen), Impact of Event Scale (Horowitz), Life Event Scale, and Eysenck Personality Inventory. PTSD was diagnosed in 30% of the examined patients. A larger number of stress-related events, particularly of those regarded as life-threatening, wounding/death of a close person, and material losses, was more frequent in persons with PTSD. The risk factors for the development of PTSD in this study were: age (30-40), marital status (married), lower level of education, front-line combat exposure, neurotic manifestations, family problems in childhood, and neuroticism.

  18. Mean occurrence frequency and temporal risk analysis of solar particle events

    International Nuclear Information System (INIS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.; Wilson, John W.

    2006-01-01

    The protection of astronauts from space radiation is required on future exploratory-class and long-duration missions. For accurate projections of radiation doses, a solar cycle statistical model, which quantifies the progression level within the cycle, has been developed. The resultant future cycle projection is then applied to estimate the mean frequency of solar particle events (SPEs) in the near future using a power-law function of sunspot number. Detailed temporal behaviors of a recent large event and two historically large events, the August 1972 SPE and the November 1960 SPE, are analyzed for dose rate and cumulative dose equivalent at sensitive organs. Polyethylene-shielded 'storm shelters' inside spacecraft are studied to limit astronauts' total exposure at a sensitive site to within 10 cSv from a large event, as a potential goal that fulfills the ALARA (as low as reasonably achievable) requirement
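The power-law relation between mean SPE frequency and sunspot number can be sketched as follows. The coefficients are placeholders for illustration, not the fitted values from the study:

```python
# Sketch of the power-law projection: mean SPE frequency modeled as
# f(S) = a * S**b for sunspot number S. Coefficients a, b are hypothetical.

def spe_frequency(sunspot_number, a=0.05, b=1.2):
    """Expected SPE count per unit time at a given sunspot number (sketch)."""
    return a * sunspot_number ** b

quiet  = spe_frequency(20)    # near solar minimum (illustrative S = 20)
active = spe_frequency(150)   # near solar maximum (illustrative S = 150)
print(round(quiet, 2), round(active, 2))
```

The statistical solar-cycle model supplies the projected sunspot number; the power law then converts it into an expected event frequency for mission dose planning.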

  19. Analysis of adverse events as a contribution to safety culture in the context of practice development

    Science.gov (United States)

    Hoffmann, Susanne; Frei, Irena Anna

    2017-01-01

    Background: Analysing adverse events is an effective patient safety measure. Aim: We show, how clinical nurse specialists have been enabled to analyse adverse events with the „Learning from Defects-Tool“ (LFD-Tool). Method: Our multi-component implementation strategy addressed both, the safety knowledge of clinical nurse specialists and their attitude towards patient safety. The culture of practice development was taken into account. Results: Clinical nurse specialists relate competency building on patient safety due to the application of the LFD-tool. Applying the tool, fosters the reflection of adverse events in care teams. Conclusion: Applying the „Learning from Defects-Tool“ promotes work-based learning. Analysing adverse events with the „Learning from Defects-Tool“ contributes to the safety culture in a hospital.

  20. Error Analysis in the Joint Event Location/Seismic Calibration Inverse Problem

    National Research Council Canada - National Science Library

    Rodi, William L

    2006-01-01

    This project is developing new mathematical and computational techniques for analyzing the uncertainty in seismic event locations, as induced by observational errors and errors in travel-time models...

  1. Contrasting safety assessments of a runway incursion scenario: Event sequence analysis versus multi-agent dynamic risk modelling

    International Nuclear Information System (INIS)

    Stroeve, Sybert H.; Blom, Henk A.P.; Bakker, G.J.

    2013-01-01

    In the safety literature it has been argued that in a complex socio-technical system, safety cannot be analysed well by event-sequence-based approaches but requires capturing the complex interactions and performance variability of the socio-technical system. In order to evaluate the quantitative and practical consequences of these arguments, this study compares two approaches to assess the accident risk of an example safety-critical socio-technical system. It contrasts an event-sequence-based assessment with a multi-agent dynamic risk model (MA-DRM) based assessment, both of which are performed for a particular runway incursion scenario. The event sequence analysis uses the well-known event tree modelling formalism, and the MA-DRM based approach combines agent-based modelling, hybrid Petri nets and rare-event Monte Carlo simulation. The comparison addresses qualitative and quantitative differences in the methods, the attained risk levels, and the prime factors influencing the safety of the operation. The assessments show considerable differences in the accident risk implications of the performance of human operators and technical systems in the runway incursion scenario. In contrast with the event-sequence-based results, the MA-DRM based results show that the accident risk is not manifest from the performance of and relations between individual human operators and technical systems. Instead, the safety risk emerges from the totality of the performance and interactions in the agent-based model of the safety-critical operation considered, which coincides very well with the argumentation in the safety literature.
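The contrast between the two assessment styles can be illustrated on a toy runway-incursion model: an event-tree product of fixed branch probabilities versus Monte Carlo simulation in which detection performance varies with a random delay, mimicking performance variability. All probabilities are invented, and they are inflated so that plain (rather than rare-event) Monte Carlo is feasible in a short run:

```python
# Toy contrast: event-tree risk (product of fixed branch probabilities) vs.
# Monte Carlo risk with variable detection performance. Numbers are invented
# and inflated for illustration; the study used rare-event MC on a far
# richer agent-based model.
import random

P_INCURSION = 0.05   # aircraft enters occupied runway (toy value)
P_NO_DETECT = 0.05   # failure to detect, fixed event-tree branch
P_NO_AVOID  = 0.2    # avoidance maneuver fails

# Event-tree estimate: simple product of branch probabilities.
risk_tree = P_INCURSION * P_NO_DETECT * P_NO_AVOID

def simulate_once(rng):
    """One Monte Carlo trial; detection degrades with a random reaction delay."""
    if rng.random() >= P_INCURSION:
        return False
    delay = rng.uniform(0.0, 10.0)               # reaction delay, seconds
    p_no_detect = min(1.0, 0.01 + 0.01 * delay)  # variability, mean 0.06
    if rng.random() >= p_no_detect:
        return False
    return rng.random() < P_NO_AVOID

rng = random.Random(42)
n = 200_000
risk_mc = sum(simulate_once(rng) for _ in range(n)) / n
print(risk_tree, risk_mc)
```

Even in this toy, the Monte Carlo risk differs from the event-tree product because the effective detection-failure probability is an average over variable performance rather than a fixed branch value, which is the qualitative point the comparison in the paper makes at much greater depth.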

  2. Discrete Event Simulation for the Analysis of Artillery Fired Projectiles from Shore

    Science.gov (United States)

    2017-06-01

    model. 2.1 Discrete Event Simulation with Simkit. Simkit is a library of classes and interfaces, written in Java, that supports ease of implementation... Simkit allows simulation modelers to break complex systems into components through a framework of Listener Event Graph Objects (LEGOs), described in... Classes. A disadvantage of using Java Enum Types is the inability to change the values of Enum Type parameters while conducting a designed experiment
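Simkit itself is a Java library; the discrete-event core it implements, a time-ordered event list processed in order, with each event able to schedule future events, can be sketched minimally as follows. The artillery timings are illustrative, not parameters from the thesis:

```python
# Minimal sketch of the discrete-event idea behind Simkit: a min-heap event
# list processed in time order, where handling one event may schedule others.
# Fire interval and flight time are invented illustrative values.
import heapq

def run_fire_mission(n_rounds, fire_interval=6.0, flight_time=20.0):
    events = []   # (time, name) min-heap, playing the role of the event list
    log = []
    for i in range(n_rounds):
        heapq.heappush(events, (i * fire_interval, "fire"))
    while events:
        t, name = heapq.heappop(events)
        log.append((t, name))
        if name == "fire":
            # firing a round schedules that round's future impact event
            heapq.heappush(events, (t + flight_time, "impact"))
    return log

log = run_fire_mission(3)
print(log)
```

Simkit's LEGO framework layers component composition and listener wiring on top of this core loop, so that one component's events can drive another's without hard-coded coupling.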

  3. Analysis of geohazards events along Swiss roads from autumn 2011 to present

    Science.gov (United States)

    Voumard, Jérémie; Jaboyedoff, Michel; Derron, Marc-Henri

    2014-05-01

    In Switzerland, roads and railways are threatened throughout the year by several natural hazards. Some of these events reach transport infrastructure many times per year, leading to the closing of transportation corridors, loss of access, travel deviations and sometimes infrastructure damage and loss of human lives (3 fatalities during the period considered). The aim of this inventory of events is to investigate the number of natural events affecting roads and railways in Switzerland from autumn 2011 until now. Natural hazards affecting roads and railways can be classified in five categories: rockfalls, landslides, debris flows, snow avalanches and floods. They potentially cause several important direct damages to transportation infrastructure (roads, railways), vehicles (slightly or heavily damaged) or human life (slightly or seriously injured persons, death). These direct damages can be easily evaluated from press articles or from Swiss police press releases. Indirect damages such as deviation costs are not taken into account in this work. During the last two and a half years, about 50 events affecting Swiss road and railway infrastructure were inventoried. The proportion of events due to rockfalls is 45%, to landslides 25%, to debris flows 15%, to snow avalanches 10% and to floods 5%. During this period, there were three fatalities and two persons injured, while 23 vehicles (cars, trains and a coach) and 24 roads and railways were damaged. We can see that floods occur mainly on the Swiss Plateau whereas rockfalls, debris flows, snow avalanches and landslides are mostly located in the Alpine area. Most events occur on secondary mountain roads and railways. The events are well distributed over the whole Alpine area except for the Gotthard hotspot, where an important European North-South motorway (hit in 2003 with two fatalities) and railway (hit three times in 2012 with one fatality) are more frequently affected. According to the observed events in border regions of

  4. Maximum Credible Event Analysis Methods-Tools and Applications in Biosecurity Programs

    International Nuclear Information System (INIS)

    Rao, V.

    2007-01-01

    Maximum Credible Event (MCE) analyses are analogous to worst-case scenarios: they posit a likely mishap scenario in biotechnology bioprocessing operations, biological products testing laboratories, or biological specimen repository facilities that leads to the release of particulate/aerosolized etiologic agents into the environment. The purpose of MCE analyses is to estimate the effectiveness of existing safeguards such as engineering controls, administrative procedures and the attributes of facility design that, in combination, reduce the probability of release of potentially pathogenic or toxic material from the test facility to the external environment. As part of our support to the United States Chemical Biological Defense Program, we have developed a unique set of realistic MCE worst-case scenarios for all laboratory and industrial aspects of a biological product development process. Although MCE analysis is part of an overall facility biosafety assessment, our approach also considered biosecurity-related issues such as facility vulnerability, employment procedures and workers' background investigations, exercises and drills involving local law enforcement and the emergency response community, records and audit processes, and facility biosafety and biosecurity oversight and governance. Our standard operating procedure for tracking biological material transfer agreements and operating procedures for materials transfer, together with an integrated checklist for biosafety/biosecurity facility inspection and evaluation, was designed to ensure compliance with all biosafety and biosecurity guidelines. The result of MCE analysis, described in terms of the potential hazard of exposure of workers and the immediate environment to etiologic agents from the manufacturing process, is a quasi-quantitative estimate of the nature and extent of adverse impact on health and the environment in the vicinity. Etiologic agent exposure concentrations are estimated based on a Gaussian air dispersion model.
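The dispersion estimate mentioned in the closing sentence can be illustrated with the standard Gaussian plume equation for a continuous point source. A minimal sketch, in which all numbers (release rate, wind speed, effective release height, dispersion coefficients) are hypothetical placeholders rather than values from the study:

```python
import math

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Gaussian plume concentration (g/m^3) at crosswind offset y (m) and
    height z (m), for a point source of strength Q (g/s), wind speed u (m/s),
    effective release height H (m), and dispersion coefficients sigma_y,
    sigma_z (m). Ground reflection is included via the (z + H) image term."""
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
    return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical example: 1 g/s release, 3 m/s wind, receptor at ground
# level on the plume centerline.
c = plume_concentration(Q=1.0, u=3.0, y=0.0, z=0.0, H=10.0,
                        sigma_y=20.0, sigma_z=10.0)
```

Concentration falls off as the receptor moves off the centerline, which is the behaviour an MCE exposure estimate exploits when mapping impact zones.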

  5. Radionuclide data analysis in connection of DPRK event in May 2009

    Science.gov (United States)

    Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei

    2010-05-01

    The seismic event detected in the DPRK on 25 May 2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, traces linked to the DPRK event were not found. After three weeks of high alert the PTS returned to normal operational routine. This case illustrates the importance of objectivity and a procedural approach in data evaluation. All data coming from particulate and noble gas stations were evaluated daily, some of the samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds of the key (CTBT-relevant) radionuclides achieved across the DPRK event area and for the assessment of radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for their sites owing to legitimate activities unrelated to nuclear testing. Therefore, a set of hypotheses was used to test whether a detection is consistent with the event time and location through atmospheric transport modelling. The consistency of event timing and isotopic ratios was also used in the evaluation work. As a result it was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS system would have had no difficulty detecting it. This case also showed the importance of on-site inspections to verify the nuclear traces of possible tests.

  6. Tourist event "Days of plum" at Blace: Demographic and geographic analysis of visitors

    Directory of Open Access Journals (Sweden)

    Lović Suzana

    2012-01-01

    Full Text Available The event "Days of Plum - My Plum" at Blace is one of 42 events in Serbia dedicated to fruits and vegetables, and one of three dedicated to the plum. It has been held for nine consecutive years in the town situated in the wide Toplica valley at the foot of Jastrebac, where relatively favourable climate conditions support the cultivation of plum, so the event has become traditional. This paper analyzes the results of a survey conducted during the last event, in August 2011. A survey was used as the methodical procedure because it yields a relatively large amount of information and data in a short time. The survey included 304 randomly selected respondents of different gender, age and educational structures. It was performed to examine the tourism market, the attitudes and behaviour of visitors, and tourism promotion. In addition to the survey, a tourist valorisation of the event was carried out, in which the elements of the geographic and economic groups of criteria were analyzed in order to investigate the tourism potential, in terms of developing tourism as an economic sector that can contribute to the development of Blace as an underdeveloped area. [Projekat Ministarstva nauke Republike Srbije, br. 47007

  7. The four faces of rumination to stressful events: A psychometric analysis.

    Science.gov (United States)

    García, Felipe E; Duque, Almudena; Cova, Félix

    2017-11-01

    To increase the knowledge of rumination and its associations with stressful events, we explored the relationships between 4 types of rumination (brooding, reflection, intrusive, and deliberate rumination) in a sample of 750 adult participants who experienced a highly stressful event. We also explored the predictive value of the different types of rumination on posttraumatic stress symptoms and posttraumatic growth 6 months after the highly stressful event occurred. Participants completed the Ruminative Response Scale and the Event-Related Rumination Inventory. Brooding and reflection rumination were obtained from the Ruminative Response Scale, whereas deliberate and intrusive rumination were obtained from the Event-Related Rumination Inventory. Confirmatory factorial analyses were conducted using the 4 types of rumination to test 3 different models: (a) 4-factor model (brooding, reflection, intrusive, and deliberate rumination), (b) 2-factor model: adaptive rumination (reflection and deliberate) and maladaptive rumination (brooding and intrusive), and (c) 2-factor model: depressive rumination (brooding and reflection) and posttraumatic rumination (intrusive and deliberate). It was observed that the 4-factor model showed the best fit to the data. Moreover, 6 months later it was observed that the most significant predictor of posttraumatic symptoms was intrusive rumination, whereas deliberate rumination was the most significant predictor of posttraumatic growth. Results indicate that the 4 types of rumination are differentiated constructs. Ruminative thoughts experienced after a stressful event predicted posttraumatic consequences 6 months later. Implications of these findings are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Analysis of failure events for expansion joints in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Masahiro [Institute of Nuclear Safety System Inc., Mihama, Fukui (Japan)

    2001-09-01

    Although a large number of expansion joints are used in nuclear power plants with light water reactors, their failure events have received less attention than those of vessels and pipes. However, as the operation period of nuclear power plants becomes longer, it is necessary to pay attention to their failure events as well, because aging problems and latent troubles originating in the design or fabrication of expansion joints may appear during long-term operation. In this work, we investigated failure event reports of expansion joints in nuclear power plants both in Japan and in the U.S.A. and analyzed (1) the influence on output power level, (2) the position and (3) the cause of each failure. It is revealed that failure events of expansion joints have continuously occurred, some of which have affected power level and caused fatal or injury accidents of personnel; hence the importance of corrective actions to prevent the recurrence of such events is pointed out. The importance of countermeasures against the following individual events is also pointed out: (1) corrosion of expansion joints in service water systems, (2) degradation of rubber expansion joints in main condensers, (3) vibration and fatigue of expansion joints in extraction steam lines and (4) transgranular stress corrosion cracking of penetration bellows of containments. (author)

  9. Analysis and verification of a prediction model of solar energetic proton events

    Science.gov (United States)

    Wang, J.; Zhong, Q.

    2017-12-01

    Solar energetic particle events can cause severe radiation damage near Earth. Alerts and summary products for solar energetic proton events are provided by the Space Environment Prediction Center (SEPC) according to the flux of greater-than-10 MeV protons measured by the GOES satellite in geosynchronous orbit. The start of a solar energetic proton event is defined as the time when the flux of greater-than-10 MeV protons equals or exceeds 10 proton flux units (pfu). In this study, a model was developed to predict solar energetic proton events and provide warnings for them minutes in advance, based on both the soft X-ray flux and the integral proton flux measured by GOES. The quality of the forecast model was measured through verification of accuracy, reliability, discrimination capability, and forecast skill. The peak flux and rise time of solar energetic proton events in six channels (>1 MeV, >5 MeV, >10 MeV, >30 MeV, >50 MeV, >100 MeV) were also simulated and analyzed.
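The 10 pfu start criterion described above amounts to a threshold scan over the integral proton flux. A minimal sketch, noting that operational definitions usually also require persistence (several consecutive samples above threshold), which is omitted here for brevity; the flux values are hypothetical:

```python
def proton_event_start(flux_series, threshold_pfu=10.0):
    """Return the index of the first sample at which the >10 MeV integral
    proton flux equals or exceeds the event threshold (10 pfu), or None
    if the series never crosses it. A single-sample check is a
    simplification of operational persistence rules."""
    for i, flux in enumerate(flux_series):
        if flux >= threshold_pfu:
            return i
    return None

# Hypothetical 5-minute flux samples (pfu): quiet background, then onset.
flux = [0.2, 0.3, 0.5, 2.0, 8.0, 12.0, 45.0, 30.0]
start = proton_event_start(flux)   # index 5, the first sample >= 10 pfu
```

A prediction model such as the one in this record aims to issue its warning before this crossing, using precursors like the soft X-ray flux.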

  10. The fuzzy set theory application to the analysis of accident progression event trees with phenomenological uncertainty issues

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Ahn, Kwang-Il

    1991-01-01

    Fuzzy set theory provides a formal framework for dealing with the imprecision and vagueness inherent in expert judgement, and therefore it can be used for more effective analysis of accident progression in PRA, where expert opinion is a major means of quantifying some event probabilities and uncertainties. In this paper, an example application of fuzzy set theory is first made to a simple portion of a given accident progression event tree with typical qualitative fuzzy input data; thereby, computational algorithms suitable for applying fuzzy set theory to accident progression event tree analysis are identified and illustrated with example applications. The procedure used in the simple example is then extended to extremely complex accident progression event trees with a number of phenomenological uncertainty issues, i.e., a typical plant damage state 'SEC' of the Zion Nuclear Power Plant risk assessment. The results show that the fuzzy averages of the fuzzy outcomes are very close to the mean values obtained by current methods. The main purpose of this paper is to provide a formal procedure for the application of fuzzy set theory to accident progression event trees with imprecise and qualitative branch probabilities and/or with a number of phenomenological uncertainty issues. (author)
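The fuzzy arithmetic underlying such an analysis can be sketched with triangular fuzzy numbers propagated along one event-tree sequence. This is a generic illustration of the technique, not the paper's algorithm or the Zion data; the branch probabilities are hypothetical:

```python
def tfn_multiply(a, b):
    """Multiply two triangular fuzzy numbers (low, mode, high) with
    positive supports. The exact product of TFNs is not triangular; the
    common approximation keeps the componentwise vertex products."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def sequence_probability(branches):
    """Fuzzy probability of one accident-progression sequence: the
    product of the fuzzy branch probabilities along the path."""
    result = (1.0, 1.0, 1.0)
    for p in branches:
        result = tfn_multiply(result, p)
    return result

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number - a simple crisp estimate,
    analogous to the 'fuzzy average' compared to mean values in the paper."""
    return sum(tfn) / 3.0

# Hypothetical qualitative judgements ("likely", "unlikely") mapped to
# triangular fuzzy branch probabilities for a two-branch sequence.
seq = sequence_probability([(0.6, 0.7, 0.8), (0.05, 0.1, 0.2)])
crisp = defuzzify(seq)
```

The spread of `seq` carries the experts' imprecision through the tree, while `crisp` provides a point value for comparison with conventional results.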

  11. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    Science.gov (United States)

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can serve as a reliable input for controlling drop-foot correction devices and lower-limb prostheses. Among the different sensors used to acquire signals associated with walking for gait event detection, the accelerometer is considered preferable due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on acceleration signals, different algorithms have been proposed in previous studies to detect toe-off (TO) and heel-strike (HS) gait events. While these algorithms achieve reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability when walking up or down stairs. In this study, a new algorithm is proposed to detect gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, followed by determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the algorithm is robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in applications such as drop-foot correction devices and leg prostheses.
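The jerk-and-peak idea can be sketched in a few lines: jerk is the time derivative of acceleration, and sharp gait transitions appear as jerk peaks. This is a simplified stand-in for the paper's algorithm (which additionally uses time-frequency analysis to derive gait parameters); the signal, sampling rate and threshold are hypothetical:

```python
def jerk(accel, fs):
    """Discrete jerk estimate: first difference of the acceleration
    signal scaled by the sampling rate fs (Hz)."""
    return [(a2 - a1) * fs for a1, a2 in zip(accel, accel[1:])]

def find_peaks(signal, threshold):
    """Indices of local maxima exceeding threshold - a minimal stand-in
    for the paper's peak-heuristics step, which also uses gait
    parameters to label peaks as HS or TO events."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i - 1] < signal[i] >= signal[i + 1]:
            peaks.append(i)
    return peaks

# Hypothetical 100 Hz acceleration trace with two sharp transients.
accel = [0.0, 0.0, 0.1, 1.2, 0.2, 0.0, 0.0, 0.1, 1.5, 0.1, 0.0]
j = jerk(accel, fs=100.0)
events = find_peaks(j, threshold=50.0)
```

Each detected index marks a candidate gait event; a real-time system would run this incrementally over a sliding window.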

  12. Two damaging hydrogeological events in Calabria, September 2000 and November 2015. Comparative analysis of causes and effects

    Science.gov (United States)

    Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela

    2016-04-01

    Each year, especially during the winter season, episodes of intense rain affect Calabria, the southernmost Italian peninsular region, triggering flash floods and mass movements that cause damage and fatalities. This work presents a comparative analysis of two events that affected the southeast sector of the region, in 2000 and 2015, respectively. The event that occurred between 9th and 10th September 2000 is known in Italy as the Soverato event, after the name of the municipality where it reached the highest damage severity. In the Soverato area, more than 200 mm of rain that fell in 24 hours caused a disastrous flood that swept away a campsite at about 4 a.m., killing 13 people and injuring 45. Besides, the rain affected a larger area, causing damage in 89 (out of 409) municipalities of the region. Flooding was the most common process, damaging housing and trade. Landslides mostly affected the road network, housing and cultivations. The more recent event affected the same regional sector between 30th October and 2nd November 2015. The daily rain recorded at some of the rain gauges of the area almost reached 400 mm. Out of the 409 municipalities of Calabria, 109 suffered damage. The most frequent types of processes were flash floods and landslides. The most heavily damaged element was the road network: the representative picture of the event is a railway bridge destroyed by the river flow. Housing was damaged too, and 486 people were temporarily evacuated from their homes. The event also caused one fatality, a person killed by a flood. The event-centred study approach aims to highlight differences and similarities in both the causes and the effects of the two events, which occurred at a temporal distance of 15 years. The comparative analysis focuses on three main aspects: the intensity of the triggering rain, the modifications of urbanised areas, and the evolution of emergency management. The comparative analysis of rain is made by comparing the return period of both daily and

  13. An analysis of risk factors and adverse events in ambulatory surgery

    Directory of Open Access Journals (Sweden)

    Kent C

    2014-06-01

    Full Text Available Christopher Kent, Julia Metzner, Laurent BollagDepartment of Anesthesiology and Pain Medicine, University of Washington Medical Center, Seattle, WA, USAAbstract: Care for patients undergoing ambulatory procedures is a broad and expanding area of anesthetic and surgical practice. There were over 35 million ambulatory surgical procedures performed in the US in 2006. Ambulatory procedures are diverse in both type and setting, as they span the range from biopsies performed under local anesthesia to intra-abdominal laparoscopic procedures, and are performed in offices, freestanding ambulatory surgery centers, and ambulatory units of hospitals. The information on adverse events from these varied settings comes largely from retrospective reviews of sources, such as quality-assurance databases and closed malpractice claims. Very few if any ambulatory procedures are emergent, and in comparison to the inpatient population, ambulatory surgical patients are generally healthier. They are still however subject to most of the same types of adverse events as patients undergoing inpatient surgery, albeit at a lower frequency. The only adverse events that could be considered to be unique to ambulatory surgery are those that arise out of the circumstance of discharging a postoperative patient to an environment lacking skilled nursing care. There is limited information on these types of discharge-related adverse events, but the data that are available are reviewed in an attempt to assist the practitioner in patient selection and discharge decision making. Among ambulatory surgical patients, particularly those undergoing screening or cosmetic procedures, expectations from all parties involved are high, and a definition of adverse events can be expanded to include any occurrence that interrupts the rapid throughput of patients or interferes with early discharge and optimal patient satisfaction. 
This review covers all types of adverse events, but focuses on the more

  14. Global patterns and impacts of El Niño events on coral reefs: A meta-analysis

    OpenAIRE

    Claar, Danielle C.; Szostek, Lisa; McDevitt-Irwin, Jamie M.; Schanze, Julian J.; Baum, Julia K.

    2018-01-01

    Impacts of global climate change on coral reefs are being amplified by pulse heat stress events, including El Niño, the warm phase of the El Niño Southern Oscillation (ENSO). Despite reports of extensive coral bleaching and up to 97% coral mortality induced by El Niño events, a quantitative synthesis of the nature, intensity, and drivers of El Niño and La Niña impacts on corals is lacking. Herein, we first present a global meta-analysis of studies quantifying the effects of El Niño/La Niña-wa...

  15. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Sezen, Halil [The Ohio State Univ., Columbus, OH (United States). Dept. of Civil, Environmental and Geodetic Engineering; Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States). College of Engineering, Nuclear Engineering Program, Dept. of Mechanical and Aerospace Engineering; Denning, R. [The Ohio State Univ., Columbus, OH (United States); Vaidya, N. [Rizzo Associates, Pittsburgh, PA (United States)

    2017-12-29

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  16. Modeling the recurrent failure to thrive in less than two-year children: recurrent events survival analysis.

    Science.gov (United States)

    Saki Malehi, Amal; Hajizadeh, Ebrahim; Ahmadi, Kambiz; Kholdi, Nahid

    2014-01-01

    This study aims to evaluate recurrent failure to thrive (FTT) events over time. This longitudinal study was conducted from February 2007 to July 2009. The primary outcome was growth failure. The analysis was based on recurrent-events methods applied to 1283 children who had experienced FTT several times. Fifty-nine percent of the children had experienced FTT at least once and 5.3% of them had experienced it up to four times. The Prentice-Williams-Peterson (PWP) model revealed significant relationships between diarrhea (HR=1.26), respiratory infections (HR=1.25), urinary tract infections (HR=1.51), discontinuation of breast-feeding (HR=1.96), teething (HR=1.18), initiation age of complementary feeding (HR=1.11) and the hazard rate of the first FTT event. The recurrent nature of FTT is a key issue; taking it into account increases the accuracy of the analysis of the FTT event process and makes it possible to identify different risk factors for each FTT recurrence.
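The data restructuring required by a PWP gap-time analysis can be sketched as follows: one row per at-risk interval, stratified by event number, with the clock reset after each event. The event times are hypothetical, and the actual model fitting (a stratified Cox regression) would be done with a survival analysis package:

```python
def pwp_gap_time_rows(subject_id, event_times, followup_end):
    """Convert a subject's ordered recurrent-event times into gap-time
    rows for the Prentice-Williams-Peterson model: one row per at-risk
    interval, stratified by event number, with time reset to zero after
    each event. The final row is censored at the end of follow-up."""
    rows = []
    prev = 0.0
    for k, t in enumerate(event_times, start=1):
        rows.append({"id": subject_id, "stratum": k,
                     "gap_time": t - prev, "event": 1})
        prev = t
    if followup_end > prev:
        rows.append({"id": subject_id, "stratum": len(event_times) + 1,
                     "gap_time": followup_end - prev, "event": 0})
    return rows

# Hypothetical child with FTT episodes at months 4 and 9, followed to month 24.
rows = pwp_gap_time_rows("child-01", [4.0, 9.0], followup_end=24.0)
```

The stratification is what lets the model estimate separate baseline hazards (and hence different risk factors) for the first, second and later recurrences.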

  17. Event and fault tree model for reliability analysis of the greek research reactor

    International Nuclear Information System (INIS)

    Albuquerque, Tob R.; Guimaraes, Antonio C.F.; Moreira, Maria de Lourdes

    2013-01-01

    Fault trees and event trees are widely used in industry to model and to evaluate the reliability of safety systems. Detailed analyses in nuclear installations require the combination of these two techniques. This work uses the fault tree (FT) and event tree (ET) methods to perform a Probabilistic Safety Assessment (PSA) of research reactors. According to the IAEA (International Atomic Energy Agency), PSA is divided into Level 1, Level 2 and Level 3. At Level 1, safety systems act to prevent the accident; Level 2 assumes the accident has occurred and seeks to minimize its consequences (the accident management stage); and Level 3 determines the consequences. This paper focuses on Level 1 studies and, through the acquisition of knowledge, seeks to consolidate methodologies for future reliability studies. The Greek Research Reactor, GRR-1, was used as a case example. A LOCA (Loss of Coolant Accident) was chosen as the initiating event, and from there the possible accident sequences that could lead to core damage were developed using an event tree. Furthermore, for each of the systems involved in the accident sequences, a fault tree was built and the probability of its top event was evaluated. The studies were conducted using the commercial computational tool SAPHIRE. The results thus obtained for the performance or failure of the systems analyzed were considered satisfactory. This work is directed to the Greek Research Reactor due to data availability. (author)
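The combination of the two techniques can be sketched numerically: fault-tree top-event probabilities feed the event-tree branch points, and each sequence frequency is the product of the initiator frequency and the branch probabilities along the path. The system names and numbers below are hypothetical placeholders, not GRR-1 values:

```python
def sequence_frequency(initiator_freq, top_event_probs, path):
    """Frequency of one event-tree sequence. top_event_probs maps each
    system heading to its fault-tree top-event (failure) probability;
    path maps the same headings to True (system fails) or False
    (system succeeds, contributing the complement 1 - p)."""
    freq = initiator_freq
    for system, fails in path.items():
        p = top_event_probs[system]
        freq *= p if fails else (1.0 - p)
    return freq

# Hypothetical LOCA event tree with two mitigating systems whose
# failure probabilities come from their fault trees.
tops = {"ECCS": 1e-3, "ContainmentSpray": 5e-3}
core_damage = sequence_frequency(
    initiator_freq=1e-4,                      # LOCA frequency per year
    top_event_probs=tops,
    path={"ECCS": True, "ContainmentSpray": True})
```

Summing the frequencies of all sequences that end in core damage gives the Level 1 core damage frequency, which tools like SAPHIRE compute from the full FT/ET model.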

  18. Identification of unusual events in multichannel bridge monitoring data using wavelet transform and outlier analysis

    Science.gov (United States)

    Omenzetter, Piotr; Brownjohn, James M. W.; Moyo, Pilate

    2003-08-01

    Continuously operating instrumented structural health monitoring (SHM) systems are becoming a practical alternative to visual inspection for assessing the condition and soundness of civil infrastructure. However, converting large amounts of data from an SHM system into usable information is a great challenge to which special signal processing techniques must be applied. This study is devoted to the identification of abrupt, anomalous and potentially onerous events in the time histories of static, hourly sampled strains recorded by a multi-sensor SHM system installed in a major bridge structure in Singapore and operating continuously for a long time. Such events may result, among other causes, from sudden settlement of foundations, ground movement, excessive traffic load or failure of post-tensioning cables. A method of outlier detection in multivariate data has been applied to the problem of finding and localizing sudden events in the strain data. For sharp discrimination of abrupt strain changes from slowly varying ones, the wavelet transform has been used. The proposed method has been successfully tested using known events recorded during construction of the bridge, and later effectively used for detection of anomalous post-construction events.
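The two-step idea (wavelet sharpening followed by outlier detection) can be sketched as follows. For brevity this uses a single level of the Haar wavelet and a univariate z-score rule in place of the paper's multivariate outlier statistic; the strain record is hypothetical:

```python
import math

def haar_detail(signal):
    """Level-1 Haar wavelet detail coefficients: scaled differences of
    adjacent sample pairs, which respond to abrupt changes while
    suppressing slow trends."""
    return [(signal[2 * i] - signal[2 * i + 1]) / math.sqrt(2.0)
            for i in range(len(signal) // 2)]

def outlier_indices(values, z_threshold):
    """Indices whose z-score magnitude exceeds z_threshold - a
    univariate stand-in for a multivariate outlier statistic."""
    n = len(values)
    mean = sum(values) / n
    std = math.sqrt(sum((v - mean) ** 2 for v in values) / n)
    if std == 0.0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / std > z_threshold]

# Hypothetical hourly strain record: slow drift plus one abrupt jump.
strain = [100.0 + 0.1 * t for t in range(20)]
strain[12] += 25.0   # sudden anomalous event
details = haar_detail(strain)
anomalies = outlier_indices(details, z_threshold=2.5)
```

The slow drift produces near-constant detail coefficients, so the abrupt jump stands out clearly in the wavelet domain even though it is modest relative to the raw strain level.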

  19. Towards a unified study of extreme events using universality concepts and transdisciplinary analysis methods

    Science.gov (United States)

    Balasis, George; Donner, Reik V.; Donges, Jonathan F.; Radebach, Alexander; Eftaxias, Konstantinos; Kurths, Jürgen

    2013-04-01

    The dynamics of many complex systems is characterized by the same universal principles. In particular, systems which are otherwise quite different in nature show striking similarities in their behavior near tipping points (bifurcations, phase transitions, sudden regime shifts) and associated extreme events. Such critical phenomena are frequently found in diverse fields such as climate, seismology, or financial markets. Notably, the observed similarities include a high degree of organization, persistent behavior, and accelerated energy release, which are common to (among others) phenomena related to geomagnetic variability of the terrestrial magnetosphere (intense magnetic storms), seismic activity (electromagnetic emissions prior to earthquakes), solar-terrestrial physics (solar flares), neurophysiology (epileptic seizures), and socioeconomic systems (stock market crashes). It is an open question whether the spatial and temporal complexity associated with extreme events arises from the system's structural organization (geometry) or from the chaotic behavior inherent to the nonlinear equations governing the dynamics of these phenomena. On the one hand, the presence of scaling laws associated with earthquakes and geomagnetic disturbances suggests understanding these events as generalized phase transitions similar to nucleation and critical phenomena in thermal and magnetic systems. On the other hand, because of the structural organization of the systems (e.g., as complex networks) the associated spatial geometry and/or topology of interactions plays a fundamental role in the emergence of extreme events. Here, a few aspects of the interplay between geometry and dynamics (critical phase transitions) that could result in the emergence of extreme events, which is an open problem, will be discussed.

  20. Trend analysis of breaker events at United States nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Hiroki

    2006-01-01

    From events at overseas nuclear power plants recorded in the nuclear information database of the Institute of Nuclear Safety System, Inc. (INSS), the events involving electrical systems during the four years from 2002 to 2005 were extracted and their trend was analyzed. The results showed that breaker events were the most numerous in all years, and almost all of them occurred in the US. The breaker events that occurred in US nuclear power plants in 2005 were analyzed by classifying them by cause of failure and effect on the plant, and by comparing the number of occurrences with that in Japan. As a result, the main cause of many of the breaker events was improper maintenance due to poorly organized maintenance manuals and human error, as well as aging degradation; they can be estimated to have been caused by insufficient maintenance control and inspection. The number of breaker failures per plant per year in Japan was lower than that in the US by an order of magnitude, and there were no failures that led to a plant trip or power reduction. These facts suggest that breaker maintenance practices in Japan are effective. (author)

  2. Application of Continuous and Structural ARMA modeling for noise analysis of a BWR coupled core and plant instability event

    International Nuclear Information System (INIS)

    Demeshko, M.; Dokhane, A.; Washio, T.; Ferroukhi, H.; Kawahara, Y.; Aguirre, C.

    2015-01-01

    Highlights: • We demonstrate the first application of a novel CSARMA method. • We analyze an instability that occurred in a Swiss BWR plant during power ascension. • The results are benchmarked against STP analysis. • The CSARMA results are consistent with the underlying physics and the STP results. • The instability was caused by disturbances in the pressure control system. - Abstract: This paper presents a first application of a novel Continuous and Structural Autoregressive Moving Average (CSARMA) modeling approach to BWR noise analysis. The CSARMA approach derives a unique representation of the system dynamics through more robust and reliable canonical models as a basis for signal analysis in general and for reactor diagnostics in particular. In this paper, a stability event that occurred in a Swiss BWR plant during the power ascension phase is analyzed, as well as the time periods that preceded and followed the event. Focusing only on qualitative trends at this stage, the obtained results clearly indicate a different dynamical state during the unstable event compared to the two stable periods. They can also be interpreted as pointing to a disturbance in the pressure control system as the primary cause of the event. To benchmark these findings, the frequency-domain signal transmission-path (STP) method is also applied, and similar relationships are obtained with it. This consistency between both methods can be considered a confirmation that the event was caused by a pressure control system disturbance and was not induced by the core. It is also worth noting that the STP analysis failed to capture the relations among the processes during the stable periods that were clearly indicated by the CSARMA method, since the latter uses more precise models as its basis.

  3. Event tree analysis of accidents during transport of radioactive materials in Japan

    International Nuclear Information System (INIS)

    Watabe, N.; Shirai, K.; Noguchi, K.; Suzuki, H.; Kinehara, Y.

    1993-01-01

    The Event Tree Method is one of the Probabilistic Safety Assessment methods. It traces accident scenarios and the outcomes of countermeasures, and is therefore effective in identifying latent accident scenarios during transport. In this report, the Event Tree Method is used to study a tunnel fire and evaluate its effects; this is the first trial of our Probabilistic Safety Assessment. An event tree is constructed for the initial conditions arising when a car engine catches fire in a tunnel. The barriers considered are fire extinguishers, tunnel fire-fighting equipment, fire stations, and the heat-resisting property of the container protecting the contents from the fire. The protection level against a fire accident exceeding 800 °C for 30 min is 88.3%. (J.P.N.)
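
    A protection level such as the 88.3% quoted above is obtained by summing the protected branch paths of the event tree. As a sketch of how such a figure is composed, the snippet below multiplies out branch probabilities for the four barriers named in the abstract; the numeric values are illustrative assumptions, not the ones used in the report.

    ```python
    # Hypothetical success probabilities for each barrier in the tunnel-fire
    # event tree (illustrative values, not taken from the report).
    p_extinguisher = 0.6       # driver's fire extinguisher puts out the fire
    p_tunnel_equipment = 0.5   # fixed tunnel fire-fighting equipment works
    p_fire_brigade = 0.7       # fire station responds before escalation
    p_container = 0.9          # container withstands an 800 °C / 30 min fire

    # A severe fire reaches the package only if every suppression barrier fails.
    p_severe_fire = (1 - p_extinguisher) * (1 - p_tunnel_equipment) * (1 - p_fire_brigade)

    # The package is protected if the fire is suppressed, or if it is not
    # suppressed but the container itself survives.
    p_protected = 1 - p_severe_fire * (1 - p_container)

    print(f"protection level: {p_protected:.1%}")  # → protection level: 99.4%
    ```

    Each end state of the tree is one product of branch probabilities; the design choice in event tree analysis is which barriers to model as branches and in what order they can intervene.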

  4. Analysis of aluminum protective effect for female astronauts in solar particle events

    Directory of Open Access Journals (Sweden)

    Xu Feng

    2017-01-01

    Full Text Available In order to ensure the health and safety of female astronauts in space, the risks of space radiation should be evaluated and effective methods for protecting against space radiation should be investigated. In this paper, a dose calculation model is established for Chinese female astronauts. The absorbed doses of selected organs in two historical solar particle events are calculated using Monte Carlo methods, under shielding conditions of 0 g/cm2 and 5 g/cm2 of aluminum, respectively. The calculated results are analyzed, compared, and discussed. The results show that 5 g/cm2 of aluminum cannot provide sufficient protection during solar particle events. Hence, when a solar particle event is encountered during a manned spaceflight mission, female astronauts should not remain in the pressure vessel but should move to a more heavily shielded location, such as the food and water storage cabin, in order to ensure their health and safety.
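
    The paper's dose calculations rely on full Monte Carlo particle transport codes; the toy sketch below only illustrates the Monte Carlo idea of comparing the two shielding conditions, using a single made-up exponential attenuation coefficient rather than real proton transport physics in aluminum.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy model: each particle penetrates a shield of areal density d (g/cm^2)
    # with probability exp(-mu * d). mu is a made-up effective attenuation
    # coefficient, NOT a validated value for solar protons in aluminum.
    mu = 0.2          # cm^2/g, illustrative only
    n_particles = 100_000

    for d in (0.0, 5.0):
        transmitted = rng.random(n_particles) < np.exp(-mu * d)
        frac = transmitted.mean()
        print(f"{d:.0f} g/cm2 Al: transmitted fraction = {frac:.3f}")
    ```

    Real codes track energy loss, secondary particles, and organ geometry per particle history; the comparison of shielded versus unshielded transmitted fractions is the same in structure.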

  5. Event recognition by detrended fluctuation analysis: An application to Teide-Pico Viejo volcanic complex, Tenerife, Spain

    International Nuclear Information System (INIS)

    Del Pin, Enrico; Carniel, Roberto; Tarraga, Marta

    2008-01-01

    In this work we investigate the application of the detrended fluctuation analysis (DFA) to seismic data recorded in the island of Tenerife (Canary Islands, Spain) during the month of July 2004, in a phase of possible unrest of the Teide-Pico Viejo volcanic complex. Tectonic events recorded in the area are recognized and located by the Spanish national agency Instituto Geografico Nacional (IGN) and their catalogue is the only currently available dataset, whose completeness unfortunately suffers from the strong presence of anthropogenic noise. In this paper we propose the use of DFA to help to aut
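
    DFA itself is a standard algorithm: integrate the demeaned signal, detrend it in windows of size s, and track how the residual fluctuation F(s) scales with s. A minimal sketch, assuming the usual first-order, non-overlapping-window variant, is:

    ```python
    import numpy as np

    def dfa(x, scales):
        """Detrended fluctuation analysis: return F(s) for each window size s."""
        y = np.cumsum(x - np.mean(x))  # integrated profile of the signal
        F = []
        for s in scales:
            n_win = len(y) // s
            sq_resid = []
            for i in range(n_win):
                seg = y[i * s : (i + 1) * s]
                t = np.arange(s)
                coef = np.polyfit(t, seg, 1)          # linear (first-order) trend
                sq_resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            F.append(np.sqrt(np.mean(sq_resid)))      # RMS fluctuation at scale s
        return np.array(F)

    # Sanity check: for white noise, F(s) ~ s**alpha with alpha close to 0.5
    rng = np.random.default_rng(2)
    x = rng.normal(size=4096)
    scales = np.array([16, 32, 64, 128, 256])
    F = dfa(x, scales)
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    ```

    The scaling exponent alpha is what distinguishes signal classes (about 0.5 for uncorrelated noise, higher for persistent signals), which is what makes DFA usable for separating event signatures from anthropogenic noise in a seismic record.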