WorldWideScience

Sample records for reusable event analysis

  1. Differences in alarm events between disposable and reusable electrocardiography lead wires.

    Science.gov (United States)

    Albert, Nancy M; Murray, Terri; Bena, James F; Slifcak, Ellen; Roach, Joel D; Spence, Jackie; Burkle, Alicia

    2015-01-01

Disposable electrocardiographic lead wires (ECG-LWs) may not be as durable as reusable ones. To examine differences in alarm events between disposable and reusable ECG-LWs. Two cardiac telemetry units were randomized to reusable ECG-LWs, and 2 units alternated between disposable and reusable ECG-LWs for 4 months. A remote monitoring team, blinded to ECG-LW type, assessed frequency and type of alarm events by using total counts and rates per 100 patient days. Event rates were compared by using generalized linear mixed-effect models for differences and noninferiority between wire types. In 1611 patients and 9385.5 patient days of ECG monitoring, patient characteristics were similar between groups. Rates of alarms for no telemetry, leads fail, or leads off were lower in disposable ECG-LWs (adjusted relative risk [95% CI], 0.71 [0.53-0.96]; noninferiority P < .001; superiority P = .03) and monitoring (artifact) alarms were significantly noninferior (adjusted relative risk [95% CI]: 0.88, [0.62-1.24], P = .02; superiority P = .44). No between-group differences existed in false or true crisis alarms. Disposable ECG-LWs were noninferior to reusable ECG-LWs for all false-alarm events (N [rate per 100 patient days], disposable 2029 [79.1] vs reusable 6673 [97.9]; adjusted relative risk [95% CI]: 0.81 [0.63-1.06], P = .002; superiority P = .12). Disposable ECG-LWs with patented push-button design had superior performance in reducing alarms created by no telemetry, leads fail, or leads off and significant noninferiority in all false-alarm rates compared with reusable ECG-LWs. Fewer ECG alarms may save nurses time, decrease alarm fatigue, and improve patient safety. ©2015 American Association of Critical-Care Nurses.

  2. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

A reusable launch vehicle (RLV) has the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and its flight environment is highly complicated and intensely changeable. Its model therefore carries large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The factors that introduce uncertainty during modeling are then analyzed and summarized. After that, the model uncertainties are expressed according to an additive uncertainty model: the maximum singular values of the uncertainty matrix are chosen as the boundary model, and the norm of the uncertainty matrix is used to show how strongly the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller of this kind of aircraft (like the RLV).

  3. Weight Analysis of Two-Stage-To-Orbit Reusable Launch Vehicles for Military Applications

    National Research Council Canada - National Science Library

    Caldwell, Richard A

    2005-01-01

    In response to Department of Defense (DoD) requirements for responsive and low-cost space access, this design study provides an objective empty weight analysis of potential reusable launch vehicle (RLV) configurations...

  4. Design, Analysis and Qualification of Elevon for Reusable Launch Vehicle

    Science.gov (United States)

    Tiwari, S. B.; Suresh, R.; Krishnadasan, C. K.

    2017-12-01

    Reusable launch vehicle technology demonstrator is configured as a winged body vehicle, designed to fly in hypersonic, supersonic and subsonic regimes. The vehicle will be boosted to hypersonic speeds after which the winged body separates and descends using aerodynamic control. The aerodynamic control is achieved using the control surfaces mainly the rudder and the elevon. Elevons are deflected for pitch and roll control of the vehicle at various flight conditions. Elevons are subjected to aerodynamic, thermal and inertial loads during the flight. This paper gives details about the configuration, design, qualification and flight validation of elevon for Reusable Launch Vehicle.

  5. Air Force Reusable Booster System: A Quick-look, Design Focused Modeling and Cost Analysis Study

    Science.gov (United States)

    Zapata, Edgar

    2011-01-01

This paper presents a method and an initial analysis of the costs of a reusable booster system (RBS) as envisioned by the US Department of Defense (DoD) and numerous initiatives that form the concept of Operationally Responsive Space (ORS). This paper leverages the knowledge gained from decades of experience with the semi-reusable NASA Space Shuttle to understand how the costs of a military next-generation semi-reusable space transport might behave in the real world - and how it might be made as affordable as desired. The NASA Space Shuttle had a semi-expendable booster, consisting of the reusable Solid Rocket Motors/Boosters (SRM/SRB) and the expendable cryogenic External Tank (ET), with a reusable cargo- and crew-capable orbiter. This paper will explore DoD concepts that invert this architectural arrangement, using a reusable booster plane that flies back to base soon after launch, with the in-space elements of the launch system being the expendable portions. Cost estimating in the earliest stages of any potential, large-scale program has limited usefulness. As a result, the emphasis here is on developing an approach, a structure, and the basic concepts that could continue to be matured as the program gains knowledge. Where cost estimates are provided, these results by necessity carry many caveats and assumptions, and this analysis becomes more about ways in which drivers of costs for diverse scenarios can be better understood. The paper is informed throughout with a design-for-cost philosophy whereby the design and technology features of the proposed RBS (who and what, the "architecture") are taken as linked at the hip to a desire to perform a certain mission (where and when), and together these inform the cost, responsiveness, performance and sustainability (how) of the system. Concepts for developing, acquiring, producing or operating the system will be shown for their inextricable relationship to the "architecture" of the system, and how these too relate to costs.

  6. Cost analysis of single-use (Ambu® aScope™) and reusable bronchoscopes in the ICU.

    Science.gov (United States)

    Perbet, S; Blanquet, M; Mourgues, C; Delmas, J; Bertran, S; Longère, B; Boïko-Alaux, V; Chennell, P; Bazin, J-E; Constantin, J-M

    2017-12-01

Flexible optical bronchoscopes are essential for management of airways in the ICU, but conventional reusable flexible scopes have three major drawbacks: high cost of repairs, need for decontamination, and possible transmission of infectious agents. The main objective of this study was to measure the cost of bronchoalveolar lavage (BAL) and percutaneous tracheostomy (PT) using reusable bronchoscopes and single-use bronchoscopes in the ICU of a university hospital. The secondary objective was to compare the satisfaction of healthcare professionals with reusable and single-use bronchoscopes. The study was performed between August 2009 and July 2014 in a 16-bed ICU. All BAL and PT procedures were performed by experienced healthcare professionals. Cost analysis was performed considering ICU and hospital organization. Healthcare professional satisfaction with single-use and reusable scopes was determined based on eight factors. Sensitivity analysis was performed by applying discount rates (0, 3, and 5%) and by simulation of six situations based on different assumptions. At a discount rate of 3%, the costs per BAL for the two reusable scopes were 188.86€ (scope 1) and 185.94€ (scope 2), and the costs per PT for the reusable scope 1 and scope 2 and single-use scopes were 1613.84€, 410.24€, and 204.49€, respectively. The cost per procedure for the reusable scopes depended on the number of procedures performed, maintenance costs, and decontamination costs. Healthcare professionals were more satisfied with the third-generation single-use Ambu® aScope™. The cost per procedure for the single-use scope was not superior to that for reusable scopes. The choice of single-use or reusable bronchoscopes in an ICU should consider the frequency of procedures and the number of bronchoscopes needed.

  7. STATISTICS. The reusable holdout: Preserving validity in adaptive data analysis.

    Science.gov (United States)

    Dwork, Cynthia; Feldman, Vitaly; Hardt, Moritz; Pitassi, Toniann; Reingold, Omer; Roth, Aaron

    2015-08-07

    Misapplication of statistical data analysis is a common cause of spurious discoveries in scientific research. Existing approaches to ensuring the validity of inferences drawn from data assume a fixed procedure to be performed, selected before the data are examined. In common practice, however, data analysis is an intrinsically adaptive process, with new analyses generated on the basis of data exploration, as well as the results of previous analyses on the same data. We demonstrate a new approach for addressing the challenges of adaptivity based on insights from privacy-preserving data analysis. As an application, we show how to safely reuse a holdout data set many times to validate the results of adaptively chosen analyses. Copyright © 2015, American Association for the Advancement of Science.
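The holdout-reuse procedure this abstract describes can be sketched as a small Thresholdout-style check; the following is a hedged illustration based on the abstract's description only (the function name and the particular threshold and noise parameters are invented for the example, not the authors' settings):

```python
import numpy as np

def thresholdout(train_vals, holdout_vals, threshold=0.04, sigma=0.01, rng=None):
    """One Thresholdout-style query (sketch): answer an adaptively chosen
    statistic while protecting the holdout set from overfitting.

    train_vals / holdout_vals: per-sample values of the statistic being
    estimated on the training and holdout sets.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    t_mean = float(np.mean(train_vals))
    h_mean = float(np.mean(holdout_vals))
    # Compare against a noisy threshold; the added noise is what lets the
    # holdout be reused many times without leaking too much information.
    if abs(t_mean - h_mean) > threshold + rng.normal(0.0, sigma):
        return h_mean + rng.normal(0.0, sigma)  # release a noisy holdout answer
    # Training and holdout agree: answer from training, holdout stays fresh.
    return t_mean
```

When the training and holdout estimates agree, the analyst only ever sees the training value, which is the mechanism that preserves validity under adaptive reuse.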

  8. An Entry Flight Controls Analysis for a Reusable Launch Vehicle

    Science.gov (United States)

    Calhoun, Philip

    2000-01-01

The NASA Langley Research Center has been performing studies to address the feasibility of various single-stage-to-orbit concepts for use by NASA and the commercial launch industry to provide lower cost access to space. Some work on the conceptual design of a typical lifting body concept vehicle, designated VentureStar(TM), has been conducted in cooperation with the Lockheed Martin Skunk Works. This paper will address the results of a preliminary flight controls assessment of this vehicle concept during the atmospheric entry phase of flight. The work includes control analysis from hypersonic flight at the atmospheric entry through supersonic speeds to final approach and landing at subsonic conditions. The requirements of the flight control effectors are determined over the full range of entry vehicle Mach number conditions. The analysis was performed for a typical maximum crossrange entry trajectory utilizing angle of attack to limit entry heating and provide for energy management, and bank angle modulation of the lift vector to provide the downrange and crossrange capability to fly the vehicle to a specified landing site. Sensitivity of the vehicle open- and closed-loop characteristics to CG location, control surface mixing strategy and wind gusts is included in the results. An alternative control surface mixing strategy utilizing a reverse aileron technique demonstrated a significant reduction in RCS torque and fuel required to perform bank maneuvers during entry. The results of the control analysis revealed challenges for an early vehicle configuration in the areas of hypersonic pitch trim and subsonic longitudinal controllability.

  9. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

Full Text Available … (ii) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...

  10. eXframe: reusable framework for storage, analysis and visualization of genomics experiments

    Directory of Open Access Journals (Sweden)

    Sinha Amit U

    2011-11-01

Full Text Available Abstract Background Genome-wide experiments are routinely conducted to measure gene expression, DNA-protein interactions and epigenetic status. Structured metadata for these experiments is imperative for a complete understanding of experimental conditions, to enable consistent data processing and to allow retrieval, comparison, and integration of experimental results. Even though several repositories have been developed for genomics data, only a few provide annotation of samples and assays using controlled vocabularies. Moreover, many of them are tailored for a single type of technology or measurement and do not support the integration of multiple data types. Results We have developed eXframe, a reusable web-based framework for genomics experiments that provides (1) the ability to publish structured data compliant with accepted standards, (2) support for multiple data types including microarrays and next-generation sequencing, and (3) query, analysis and visualization integration tools (enabled by consistent processing of the raw data and annotation of samples), and that is available as open-source software. We present two case studies where this software is currently being used to build repositories of genomics experiments: one contains data from hematopoietic stem cells and another from Parkinson's disease patients. Conclusion The web-based framework eXframe offers structured annotation of experiments as well as uniform processing and storage of molecular data from microarray and next-generation sequencing platforms. The framework allows users to query and integrate information across species, technologies, measurement types and experimental conditions. Our framework is reusable and freely modifiable: other groups or institutions can deploy their own custom web-based repositories based on this software. It is interoperable with the most important data formats in this domain. We hope that other groups will not only use eXframe, but also contribute their own...

  11. Economics of reusable facilities

    International Nuclear Information System (INIS)

    Antia, D.D.J.

    1992-01-01

    In this paper some of the different economic development strategies that can be used for reusable facilities in the UK, Norway, Netherlands and in some production sharing contracts are outlined. These strategies focus on an integrated decision analysis approach which considers development phasing, reservoir management, tax planning and where appropriate facility purchase, leasing, or sale and leaseback decisions

  12. An Analysis of the Advantages of Reusable Plastic Containers in Strawberry Logistics : A Case Study of the Japan Agricultural Cooperative YOICHI

    OpenAIRE

    尾碕, 亨; 樋元, 淳一

    2014-01-01

This article undertook a comparative analysis of cardboard boxes and reusable plastic containers and their impact on production logistics costs and the receipt prices of producers. The results showed that reusable plastic containers shortened logistical working hours, reduced production logistics costs and increased the receipt prices of the producer. However, exchange-value cannot be realized if the quality of the farm product is not maintained, even if it is transported in superior packing con...

  13. EVENT PLANNING USING FUNCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  14. A Discrete-Event Simulation Model for Evaluating Air Force Reusable Military Launch Vehicle Post-Landing Operations

    National Research Council Canada - National Science Library

    Martindale, Michael

    2006-01-01

The purpose of this research was to develop a discrete-event computer simulation model of the post-landing vehicle recovery operations to allow the Air Force Research Laboratory, Air Vehicles Directorate...

  15. MGR External Events Hazards Analysis

    International Nuclear Information System (INIS)

    Booth, L.

    1999-01-01

The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  16. Reliable clinical serum analysis with reusable electrochemical sensor: Toward point-of-care measurement of the antipsychotic medication clozapine.

    Science.gov (United States)

    Kang, Mijeong; Kim, Eunkyoung; Winkler, Thomas E; Banis, George; Liu, Yi; Kitchen, Christopher A; Kelly, Deanna L; Ghodssi, Reza; Payne, Gregory F

    2017-09-15

Clozapine is one of the most promising medications for managing schizophrenia, but it is under-utilized because of the challenges of maintaining serum levels in a safe therapeutic range (1-3 μM). Timely measurement of serum clozapine levels has been identified as a barrier to the broader use of clozapine, but such measurement is challenging due to the complexity of serum samples. We demonstrate a robust and reusable electrochemical sensor with a graphene-chitosan composite for rapidly measuring serum levels of clozapine. Our electrochemical measurements in clinical serum from clozapine-treated and clozapine-untreated schizophrenia groups are well correlated to centralized laboratory analysis for the readily detected uric acid and for clozapine, which is present at 100-fold lower concentration. The benefits of our electrochemical measurement approach for serum clozapine monitoring are: (i) rapid measurement (≈20 min) without serum pretreatment; (ii) appropriate selectivity and sensitivity (limit of detection 0.7 μM); (iii) reusability of an electrode over several weeks; and (iv) rapid reliability testing to detect common error-causing problems. This simple and rapid electrochemical approach for serum clozapine measurements should provide clinicians with the timely point-of-care information required to adjust dosages and personalize the management of schizophrenia. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. TEMAC, Top Event Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem - Maxima of: 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement.
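To illustrate what evaluating a top event "in a Monte Carlo fashion" over a Boolean cut-set representation involves, here is a minimal sketch; the cut sets, probabilities and function name are invented for the example (TEMAC itself consumes SETS output and uses a matrix approach):

```python
import numpy as np

def top_event_mc(cut_sets, probs, n=100_000, seed=1):
    """Monte Carlo estimate of P(top event), where the top event occurs
    when every basic event in at least one minimal cut set occurs."""
    rng = np.random.default_rng(seed)
    names = sorted(probs)
    idx = {name: i for i, name in enumerate(names)}
    p = np.array([probs[name] for name in names])
    draws = rng.random((n, len(names))) < p        # sampled event occurrences
    top = np.zeros(n, dtype=bool)
    for cs in cut_sets:
        # OR over cut sets of the AND over each cut set's basic events
        top |= draws[:, [idx[e] for e in cs]].all(axis=1)
    return top.mean()

# Hypothetical top event with two cut sets: {A AND B} OR {C}
est = top_event_mc([["A", "B"], ["C"]], {"A": 0.1, "B": 0.2, "C": 0.01})
```

For independent basic events the exact answer here is 1 - (1 - 0.02)(1 - 0.01) ≈ 0.0298, so the estimate should land close to that.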

  18. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
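The BUS reinterpretation of rejection sampling can be sketched in a few lines. This is a hedged toy version assuming a known likelihood bound c, with an invented conjugate-normal example rather than anything from the paper:

```python
import numpy as np

def bus_rejection(prior_sample, likelihood, c, n=50_000, seed=2):
    """BUS-style rejection sampling (sketch): Bayesian updating is recast
    as observing the 'rare' acceptance event {u <= L(theta)/c}, where L is
    the likelihood and c is an upper bound on L over the prior support."""
    rng = np.random.default_rng(seed)
    theta = prior_sample(rng, n)             # draw candidates from the prior
    u = rng.random(n)                        # auxiliary uniform variable
    accept = u <= likelihood(theta) / c      # the acceptance event
    return theta[accept]                     # accepted samples follow the posterior

# Toy case: standard normal prior, one observation x = 1 with unit noise.
post = bus_rejection(
    prior_sample=lambda rng, n: rng.normal(0.0, 1.0, n),
    likelihood=lambda t: np.exp(-0.5 * (1.0 - t) ** 2),
    c=1.0,
)
```

With this prior and observation the analytic posterior is N(0.5, 1/2), so the accepted samples should concentrate around a mean of 0.5; when acceptance is genuinely rare, the abstract's point is that FORM, importance sampling, or Subset Simulation can estimate the same event far more efficiently than raw rejection.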

  19. Trending analysis of precursor events

    International Nuclear Information System (INIS)

    Watanabe, Norio

    1998-01-01

The Accident Sequence Precursor (ASP) Program of the United States Nuclear Regulatory Commission (U.S. NRC) identifies and categorizes operational events at nuclear power plants in terms of the potential for core damage. The ASP analysis has been performed on a yearly basis and the results have been published in the annual reports. This paper describes the trends in initiating events and dominant sequences for 459 precursors identified in the ASP Program during the 1969-94 period and also discusses a comparison with dominant sequences predicted in past Probabilistic Risk Assessment (PRA) studies. These trends were examined for three time periods: 1969-81, 1984-87 and 1988-94. Although different models were used in the ASP analyses for these three periods, the distributions of precursors by dominant sequences show similar trends to each other. For example, sequences involving loss of both main and auxiliary feedwater were identified in many PWR events, and those involving loss of both high- and low-pressure coolant injection were found in many BWR events. Also, it was found that these dominant sequences were comparable to those determined to be dominant in the predictions of the past PRAs. As well, a list of the 459 precursors identified is provided in the Appendix, indicating initiating event types, unavailable systems, dominant sequences, conditional core damage probabilities, and so on. (author)

  20. Teaching tools in Evidence Based Practice: evaluation of reusable learning objects (RLOs) for learning about Meta-analysis

    Directory of Open Access Journals (Sweden)

    Wharrad Heather

    2011-05-01

Full Text Available Abstract Background All healthcare students are taught the principles of evidence based practice on their courses. The ability to understand the procedures used in systematically reviewing evidence reported in studies, such as meta-analysis, is an important element of evidence based practice. Meta-analysis is a difficult statistical concept for healthcare students to understand, yet it is an important technique used in systematic reviews to pool data from studies to look at the combined effectiveness of treatments. In other areas of the healthcare curricula, by supplementing lectures, workbooks and workshops with pedagogically designed, multimedia learning objects (known as reusable learning objects or RLOs), we have shown an improvement in students' perceived understanding of subjects they found difficult. In this study we describe the development and evaluation of two RLOs on meta-analysis. The RLOs supplement associated lectures and aim to improve students' understanding of meta-analysis. Methods Following a quality-controlled design process, two RLOs were developed and delivered to two cohorts of students, a Master in Public Health course and a Postgraduate diploma in nursing course. Students' understanding of five key concepts of meta-analysis was measured before and after a lecture and again after RLO use. RLOs were also evaluated for their educational value, learning support, media attributes and usability using closed and open questions. Results Students rated their understanding of meta-analysis as improved after a lecture and further improved after completing the RLOs (Wilcoxon paired test, p …). Conclusions Meta-analysis RLOs that are openly accessible and unrestricted by usernames and passwords provide flexible support for students who find the process of meta-analysis difficult.
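The pooling step that meta-analysis performs, and that such learning objects typically teach, can be illustrated with a minimal fixed-effect inverse-variance sketch; the effect sizes and variances below are made up for illustration and are not study data:

```python
import math

def fixed_effect_meta(effects, variances):
    """Fixed-effect meta-analysis (sketch): pool per-study effect
    estimates by weighting each study by the inverse of its variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))            # standard error of the pooled effect
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # approximate 95% CI
    return pooled, ci

# Three hypothetical studies: more precise studies pull the pooled value harder.
pooled, ci = fixed_effect_meta([0.4, 0.6, 0.5], [0.04, 0.09, 0.01])
```

Because the third study has the smallest variance, the pooled estimate (≈0.49) sits closest to its effect of 0.5, which is the intuition the RLOs aim to convey.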

  1. Event Shape Analysis in ALICE

    CERN Document Server

    AUTHOR|(CDS)2073367; Paic, Guy

    2009-01-01

Jets are the final-state manifestation of hard parton scattering. Since at LHC energies the production of hard processes in proton-proton collisions will be copious and varied, it is important to develop methods to identify them through the study of their final states. In the present work we describe a method based on the use of some shape variables to discriminate events according to their topologies. A very attractive feature of this analysis is the possibility of using the tracking information of the TPC+ITS in order to identify specific events like jets. Through the correlation between the quantities thrust and recoil, calculated in minimum-bias simulations of proton-proton collisions at 10 TeV, we show the sensitivity of the method to select specific topologies and high multiplicity. The presented results were obtained both at generator level and after reconstruction. It remains that with any kind of jet reconstruction algorithm one will be confronted in general with overlapping jets. The present meth...
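For a concrete sense of one such shape variable, here is a hedged sketch of a transverse thrust calculation over track momenta; it uses a brute-force scan over candidate axis directions, invented for illustration and not the ALICE analysis code:

```python
import numpy as np

def transverse_thrust(px, py, n_axes=3600):
    """Transverse thrust (sketch): T = max over axes n of
    sum_i |p_i . n| / sum_i |p_i|, maximised by scanning axis angles."""
    px, py = np.asarray(px, float), np.asarray(py, float)
    phis = np.linspace(0.0, np.pi, n_axes)               # candidate thrust-axis angles
    axes = np.stack([np.cos(phis), np.sin(phis)])
    proj = np.abs(px[:, None] * axes[0] + py[:, None] * axes[1])
    return proj.sum(axis=0).max() / np.hypot(px, py).sum()

# A perfectly back-to-back (jet-like) two-track event in the transverse plane.
t = transverse_thrust([1.0, -1.0], [0.0, 0.0])
```

A back-to-back event gives thrust 1, while an isotropic event pushes the value down toward 2/π, which is what makes the variable useful for separating jet-like from isotropic topologies.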

  2. Reusable Component Services

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Reusable Component Services (RCS) is a super-catalog of components, services, solutions and technologies that facilitates search, discovery and collaboration in...

  3. Joint Attributes and Event Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED), with promising results. The motivation is that multimedia events generally consist of lower-level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower-level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm established on a correlation vector that correlates them to a target event. Consequently, we could incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  4. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis
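As a toy illustration of the kind of event-count statistic such data collection supports, an occurrence rate per unit exposure time with a rough confidence interval can be sketched as follows (the count, exposure time, and the normal approximation to the Poisson interval are illustrative assumptions, not from the report):

```python
import math

def poisson_rate(events, exposure_time):
    """Point estimate and approximate 95% CI for an event occurrence rate,
    assuming the count is Poisson so Var(count) = mean (normal approximation)."""
    rate = events / exposure_time
    se = math.sqrt(events) / exposure_time
    lower = max(0.0, rate - 1.96 * se)   # rates cannot be negative
    upper = rate + 1.96 * se
    return rate, (lower, upper)

# Hypothetical data: 12 events observed over 30 reactor-years of exposure.
rate, ci = poisson_rate(events=12, exposure_time=30.0)
```

This is why the report stresses defining the events to be counted and the demand or exposure time up front: both the numerator and the denominator of the rate depend on those definitions.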

  5. Risk analysis of brachytherapy events

    International Nuclear Information System (INIS)

    Buricova, P.; Zackova, H.; Hobzova, L.; Novotny, J.; Kindlova, A.

    2005-01-01

To prevent radiological events it is necessary to identify hazardous situations and to analyse the nature of committed errors. Although a recommendation on the classification and prevention of radiological events (radiological accidents) was prepared in the framework of the Czech Society of Radiation Oncology, Biology and Physics and approved by the Czech regulatory body (SONS) in 1999, only a few reports have been submitted so far from brachytherapy practice. At radiotherapy departments, attention has been paid mainly to the problems of the dominant teletherapy treatments. In the last two decades, however, the usage of brachytherapy methods has gradually increased, because the nature of this treatment as well as the capabilities of the operating facilities have completely changed: new radionuclides of high activity have been introduced and sophisticated afterloading systems controlled by computers are used. Consequently, the nature of the errors that can occur in clinical practice has also been changing. To determine the potentially hazardous parts of a procedure, a so-called 'process tree', which follows the flow of the entire treatment process, has been created for the most frequent types of applications. Marking the location of errors on the process tree indicates where failures occurred, and accumulation of marks along branches shows weak points in the process. The analysed data provide useful information for preventing medical events in brachytherapy. The results strengthen the requirements given in the Recommendations of SONS and revealed the need for its amendment. They call especially for systematic registration of the events. (authors)

  6. Surface Management System Departure Event Data Analysis

    Science.gov (United States)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance for push-back events and a significantly high overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%; the overall detection performance for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events, as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  7. External events analysis for experimental fusion facilities

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1990-01-01

    External events are those off-normal events that threaten facilities either from outside or inside the building. These events, such as floods, fires, and earthquakes, are among the leading risk contributors for fission power plants, and the nature of fusion facilities indicates that they may also be leading contributors to fusion risk. This paper gives overviews of analysis methods, references good analysis guidance documents, and gives design tips for mitigating the effects of floods and fires, seismic events, and aircraft impacts. Implications for future fusion facility siting are also discussed. Sites similar to fission plant sites are recommended. 46 refs

  8. Event analysis in primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Paulasaari, H. [Tampere Univ. of Technology (Finland)

    1996-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: the first is the communication with the SRIO and the SCADA system, the second is the internal clock, which is used to produce time stamped event data, and the third is the capability to register some values during the fault. For example, during a short circuit fault the relay registers the value of the short circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage.

  9. Event analysis in primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Paulasaari, H [Tampere Univ. of Technology (Finland)

    1997-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: the first is the communication with the SRIO and the SCADA system, the second is the internal clock, which is used to produce time stamped event data, and the third is the capability to register some values during the fault. For example, during a short circuit fault the relay registers the value of the short circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage.

  10. Reusability of coordination programs

    NARCIS (Netherlands)

    F. Arbab (Farhad); C.L. Blom (Kees); F.J. Burger (Freek); C.T.H. Everaars (Kees)

    1996-01-01

    Isolating computation and communication concerns into separate pure computation and pure coordination modules enhances modularity, understandability, and reusability of parallel and/or distributed software. This can be achieved by moving communication primitives (such as SendMessage and

  11. External event analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1989-01-01

    The US Nuclear Regulatory Commission is sponsoring probabilistic risk assessments of six operating commercial nuclear power plants as part of a major update of the understanding of risk as provided by the original WASH-1400 risk assessments. In contrast to the WASH-1400 studies, at least two of the NUREG-1150 risk assessments will include an analysis of risks due to earthquakes, fires, floods, etc., which are collectively known as external events. This paper summarizes the methods to be used in the external event analysis for NUREG-1150 and the results obtained to date. The two plants for which external events are being considered are Surry and Peach Bottom, a PWR and a BWR, respectively. The external event analyses (through core damage frequency calculations) were completed in June 1989, with final documentation available in September. In contrast to most past external event analyses, wherein rudimentary systems models were developed reflecting each external event under consideration, the simplified NUREG-1150 analyses are based on the availability of the full internal event PRA systems models (event trees and fault trees) and make use of extensive computer-aided screening to reduce them to the sequence cut sets important to each external event. This provides two major advantages: consistency and scrutability with respect to the internal event analysis are achieved, and the full gamut of random and test/maintenance unavailabilities is automatically included, while only those that are probabilistically important survive the screening process. Thus, full benefit of the internal event analysis is obtained by performing the internal and external event analyses sequentially.
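
    The computer-aided screening of cut sets mentioned above can be illustrated with a small sketch. This is an illustrative toy, not the NUREG-1150 codes: the basic-event names, probabilities, and truncation threshold are invented for the example.

    ```python
    # Toy probabilistic screening of minimal cut sets: keep only those whose
    # probability exceeds a truncation threshold, discarding the rest.
    def cutset_prob(cutset, basic_event_probs):
        """Probability of a cut set, assuming independent basic events."""
        p = 1.0
        for ev in cutset:
            p *= basic_event_probs[ev]
        return p

    def screen(cutsets, basic_event_probs, threshold):
        """Return the cut sets that survive probabilistic truncation."""
        return [cs for cs in cutsets
                if cutset_prob(cs, basic_event_probs) >= threshold]

    # Illustrative probabilities only (seismic failures plus a random DG failure)
    probs = {"pump_seismic": 1e-2, "valve_seismic": 5e-3, "dg_random": 3e-2}
    cutsets = [("pump_seismic", "dg_random"),
               ("valve_seismic", "dg_random"),
               ("pump_seismic", "valve_seismic", "dg_random")]
    survivors = screen(cutsets, probs, 1e-4)
    ```

    With these numbers, the two-event cut sets (3e-4 and 1.5e-4) survive while the three-event cut set (1.5e-6) is truncated, which is the point of screening: only probabilistically important sequences are carried into the external event analysis.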

  12. NPP unusual events: data, analysis and application

    International Nuclear Information System (INIS)

    Tolstykh, V.

    1990-01-01

    The subject of the paper is the IAEA cooperative patterns of unusual-events data treatment and the utilization of operating safety experience feedback. The Incident Reporting System (IRS) and the Assessment of Safety Significant Events Team (ASSET) are discussed. The IRS methodology in the collection, handling, assessment and dissemination of data on NPP unusual events (deviations, incidents and accidents) occurring during operation, surveillance and maintenance is outlined through the report gathering and issuing practice, the expert assessment procedures and the parameters of the system. After 7 years of existence the IAEA-IRS contains over 1000 reports and receives 1.5-4% of the total information on unusual events. The author considers the reports only as detailed technical 'records' of events requiring assessment. The ASSET approach, implying an in-depth analysis of occurrences directed towards level-1 PSA utilization, is commented on. The experts evaluated root causes for the reported events, and some trends are presented. Generally, internal events due to unexpected paths of water in the nuclear installations, occurrences related to the integrity of the primary heat transport systems, events associated with the engineered safety systems, and events involving the human factor represent the large groups deserving close attention. Personal recommendations on how to use event-related information for NPP safety improvement are given. 2 tabs (R.Ts)

  13. Data analysis of event tape and connection

    International Nuclear Information System (INIS)

    Gong Huili

    1995-01-01

    The data analysis on the VAX-11/780 computer is briefly described; the data come from the recorded event tapes of the JUHU data acquisition system on the PDP-11/44 computer. The connection of the recorded event tapes of the XSYS data acquisition system on the VAX computer is also introduced.

  14. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specifically designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  15. Interpretation Analysis as a Competitive Event.

    Science.gov (United States)

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  16. Human reliability analysis using event trees

    International Nuclear Information System (INIS)

    Heslinga, G.

    1983-01-01

    The shut-down procedure of a technologically complex installation such as a nuclear power plant consists of many human actions, some of which have to be performed several times. The procedure is regarded as a chain of modules of specific actions, some of which are analyzed separately. The analysis is carried out by making a Human Reliability Analysis event tree (HRA event tree) of each action, breaking each action down into small elementary steps. The application of event trees in human reliability analysis involves more difficulties than in the case of technical systems, where event trees have mainly been used until now. The most important reason is that the operator is able to recover from a wrong performance; memory influences play a significant role. In this study these difficulties are dealt with theoretically. The following conclusions can be drawn: (1) in principle, event trees may be used in human reliability analysis; (2) although in practice the operator will only partly recover from his fault, theoretically this can be described as restarting the whole event tree; (3) compact formulas have been derived by which the probability of reaching a specific failure consequence on passing through the HRA event tree after several recoveries can be calculated. (orig.)
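
    A compact formula of the kind mentioned in conclusion (3) can be sketched as follows. This is an illustrative reconstruction, not the author's derivation: it assumes each failed pass through the tree is either recovered (restarting the whole tree, per conclusion (2)) with probability r, or ends in terminal failure, giving F = f(1 - r)/(1 - f r) for single-pass failure probability f.

    ```python
    from math import prod

    def tree_failure_prob(step_fail_probs):
        """P(at least one elementary step fails) in one pass through the tree,
        assuming independent elementary steps."""
        return 1.0 - prod(1.0 - p for p in step_fail_probs)

    def failure_with_recovery(step_fail_probs, p_recover):
        """Terminal-failure probability when every failed pass is recovered
        (restarting the tree) with probability p_recover:
        F = f(1 - r) + f r F  =>  F = f(1 - r) / (1 - f r)."""
        f = tree_failure_prob(step_fail_probs)
        return f * (1.0 - p_recover) / (1.0 - f * p_recover)

    # Three elementary steps with 1% error each; operator recovers 90% of
    # failed passes (all numbers are illustrative)
    steps = [0.01, 0.01, 0.01]
    print(tree_failure_prob(steps))          # single-pass failure probability
    print(failure_with_recovery(steps, 0.9)) # terminal failure after recoveries
    ```

    The geometric-series structure (each recovery restarts the same tree) is what makes the closed form compact: summing f r, (f r)^2, ... passes collapses into the denominator 1 - f r.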

  17. Negated bio-events: analysis and identification

    Science.gov (United States)

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The

  18. Reusable platform concepts

    International Nuclear Information System (INIS)

    Gudmestad, O.T.; Sparby, B.K.; Stead, B.L.

    1993-01-01

    There is an increasing need to reduce the costs of offshore production facilities in order to make the development of offshore fields profitable. For small fields with a short production time there is a particular need to investigate ways to reduce costs, and for such fields the idea of platform reuse is especially attractive. This paper will review reusable platform concepts and discuss their range of application. Particular emphasis will be placed on technical limitations. Traditional concepts such as jackups and floating production facilities will be discussed, but major attention will be given to newly developed ideas for the reuse of steel jackets and concrete structures. It will be shown how an operator with several fields can obtain considerable savings by applying such reusable platform concepts.

  19. Reusable radiation monitor

    International Nuclear Information System (INIS)

    Fanselow, D.L.; Ersfeld, D.A.

    1978-01-01

    An integrating, reusable device for monitoring exposure to actinic radiation is disclosed. The device comprises a substrate having deposited thereon at least one photochromic aziridine compound which is sealed in an oxygen barrier to stabilize the color developed by the aziridine compound in response to actinic radiation. The device includes a spectral response shaping filter to transmit only actinic radiation of the type being monitored. A color standard is also provided with which to compare the color developed by the aziridine compound

  20. DEPONTO: A Reusable Dependability Domain Ontology

    Directory of Open Access Journals (Sweden)

    Teodora Sanislav

    2015-08-01

    Full Text Available This paper proposes a reusable dependability ontology for knowledge representation. The fundamental knowledge related to dependability follows its taxonomy. Thus, this paper gives an analysis of what the dependability domain ontology is and of its components. The dependability domain ontology plays an important role in ensuring the dependability of information systems by providing support for their diagnosis in case of faults, errors and failures. The proposed ontology is used as a dependability framework in two case-study Cyber-Physical Systems, which demonstrate its reusability within this category of systems.

  1. Statistical analysis of solar proton events

    Directory of Open Access Journals (Sweden)

    V. Kurt

    2004-06-01

    Full Text Available A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm2 s sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. It is outlined that 231 of these proton events are flare-related and only 22 of them are not associated with Hα flares. It is also noteworthy that 42 of these events are registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with western flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  2. Sentiment analysis on tweets for social events

    DEFF Research Database (Denmark)

    Zhou, Xujuan; Tao, Xiaohui; Yong, Jianming

    2013-01-01

    Sentiment analysis or opinion mining is an important type of text analysis that aims to support decision making by extracting and analyzing opinion-oriented text, identifying positive and negative opinions, and measuring how positively or negatively an entity (i.e., people, organization, event, location, product, topic, etc.) is regarded. As more and more users express their political and religious views on Twitter, tweets become valuable sources of people's opinions. Tweets data can be efficiently used to infer people's opinions for marketing or social studies. This paper proposes a Tweets Sentiment Analysis Model (TSAM) that can spot the societal interest and general people's opinions in regard to a social event. In this paper, the Australian federal election 2010 event was taken as an example for sentiment analysis experiments. We are primarily interested in the sentiment of the specific...

  3. Event analysis in a primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Jaerventausta, P; Paulasaari, H [Tampere Univ. of Technology (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    The target of the project was to develop applications which observe the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: communication with the SCADA, the internal clock to produce time stamped event data, and the capability to register certain values during the fault. Using the above features, some new functions for event analysis were developed in the project.

  4. Attack Graph Construction for Security Events Analysis

    Directory of Open Access Journals (Sweden)

    Andrey Alexeevich Chechulin

    2014-09-01

    Full Text Available The paper is devoted to the investigation of attack graph construction and analysis for network security evaluation and real-time security event processing. The main object of this research is the attack modeling process. The paper contains a description of techniques for building, modifying and analysing attack graphs, as well as an overview of an implemented prototype for network security analysis based on the attack graph approach.

  5. Advanced event reweighting using multivariate analysis

    International Nuclear Information System (INIS)

    Martschei, D; Feindt, M; Honc, S; Wagner-Kuhr, J

    2012-01-01

    Multivariate analysis (MVA) methods, especially discrimination techniques such as neural networks, are key ingredients in modern data analysis and play an important role in high energy physics. They are usually trained on simulated Monte Carlo (MC) samples to discriminate so-called 'signal' from 'background' events and are then applied to data to select real events of signal type. We here address procedures that improve this workflow. The first is the enhancement of data/MC agreement by reweighting MC samples on a per-event basis. Then training MVAs on real data using the sPlot technique will be discussed. Finally we will address the construction of MVAs whose discriminator is independent of a certain control variable, i.e. cuts on this variable will not change the discriminator shape.
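
    The per-event reweighting idea can be sketched in a few lines. This is a minimal illustration, not the authors' procedure: it estimates the density ratio w(x) = p_data(x)/p_MC(x) from binned histograms of a single control variable and attaches the resulting weight to each MC event, so that the weighted MC distribution matches the data. The samples and binning are invented for the example.

    ```python
    import random

    def hist(values, edges):
        """Counts per bin for bin edges edges[0] < ... < edges[-1]."""
        counts = [0] * (len(edges) - 1)
        for v in values:
            for i in range(len(edges) - 1):
                if edges[i] <= v < edges[i + 1]:
                    counts[i] += 1
                    break
        return counts

    def per_event_weights(mc, data, edges):
        """Bin-by-bin density-ratio weights; events in MC-empty or
        out-of-range bins keep weight 1."""
        h_mc, h_data = hist(mc, edges), hist(data, edges)
        n_mc, n_data = sum(h_mc), sum(h_data)
        ratios = [(hd / n_data) / (hm / n_mc) if hm else 1.0
                  for hm, hd in zip(h_mc, h_data)]
        weights = []
        for v in mc:
            for i in range(len(edges) - 1):
                if edges[i] <= v < edges[i + 1]:
                    weights.append(ratios[i])
                    break
            else:
                weights.append(1.0)
        return weights

    random.seed(0)
    mc   = [random.gauss(0.0, 1.0) for _ in range(5000)]  # simulated sample
    data = [random.gauss(0.2, 1.0) for _ in range(5000)]  # "real" sample
    edges = [-4 + 0.5 * i for i in range(17)]             # bins on [-4, 4)
    w = per_event_weights(mc, data, edges)
    ```

    In practice a multivariate classifier output is often used instead of a one-dimensional histogram, but the principle is the same: each MC event carries a weight proportional to the local data/MC density ratio.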

  6. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  7. Disruptive event analysis: volcanism and igneous intrusion

    International Nuclear Information System (INIS)

    Crowe, B.M.

    1979-01-01

    Three basic topics are addressed for the disruptive event analysis: first, the range of disruptive consequences of a radioactive waste repository by volcanic activity; second, the possible reduction of the risk of disruption by volcanic activity through selective siting of a repository; and third, the quantification of the probability of repository disruption by volcanic activity

  8. Parallel processor for fast event analysis

    International Nuclear Information System (INIS)

    Hensley, D.C.

    1983-01-01

    Current maximum data rates from the Spin Spectrometer of approx. 5000 events/s (up to 1.3 MBytes/s) and minimum analysis requiring at least 3000 operations/event require a CPU cycle time near 70 ns. In order to achieve an effective cycle time of 70 ns, a parallel processing device is proposed where up to 4 independent processors will be implemented in parallel. The individual processors are designed around the Am2910 Microsequencer, the AM29116 μP, and the Am29517 Multiplier. Satellite histogramming in a mass memory system will be managed by a commercial 16-bit μP system

  9. Dynamic Event Tree Analysis Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that the timing/sequencing of events and the system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte-Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) control logic driver for the new thermo-hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which the set of (user-defined) control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, the Dynamic PRA analysis, using a Dynamic Event Tree, of a simplified pressurized water reactor in a Station Black-Out scenario is presented.
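
    The core bookkeeping of a Dynamic Event Tree can be sketched independently of any simulator. This is an illustrative toy, not RAVEN code: each branching point splits the simulation into outcomes with given probabilities, and each leaf scenario's probability is the product along its path. The branch names and probabilities are invented.

    ```python
    # Expand a Dynamic Event Tree: branching points are visited in order and
    # every outcome spawns a new simulation branch.
    def expand_det(branch_points, path=(), p=1.0):
        """branch_points: list of dicts {outcome_name: probability}.
        Returns a list of (path, probability) leaves."""
        if not branch_points:
            return [(path, p)]
        first, rest = branch_points[0], branch_points[1:]
        leaves = []
        for outcome, prob in first.items():
            leaves.extend(expand_det(rest, path + (outcome,), p * prob))
        return leaves

    # Station-blackout-style toy example (probabilities are illustrative only)
    branchings = [
        {"diesel_starts": 0.95, "diesel_fails": 0.05},
        {"battery_ok": 0.99, "battery_depleted": 0.01},
    ]
    scenarios = expand_det(branchings)
    ```

    In a real DET tool the branches are driven by the system simulation (here RELAP-7) rather than enumerated up front, but the leaf probabilities are still accumulated multiplicatively along each branch in this way.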

  10. Reusable Surface Insulation

    Science.gov (United States)

    1997-01-01

    Advanced Flexible Reusable Surface Insulation, developed by Ames Research Center, protects the Space Shuttle from the searing heat that engulfs it on reentry into the Earth's atmosphere. Initially integrated into the Space Shuttle by Rockwell International, production was transferred to Hi-Temp Insulation Inc. in 1974. Over the years, Hi-Temp has created many new technologies to meet the requirements of the Space Shuttle program. This expertise is also used commercially, including insulation blankets to cover aircraft parts, fire barrier material to protect aircraft engine cowlings, and aircraft rescue fire fighter suits. A Fire Protection Division has also been established, offering the first suit designed exclusively by and for aircraft rescue fire fighters. Hi-Temp is a supplier to the Los Angeles City Fire Department as well as other major U.S. civil and military fire departments.

  11. Integration of reusable systems

    CERN Document Server

    Rubin, Stuart

    2014-01-01

    Software reuse and integration has been described as the process of creating software systems from existing software rather than building software systems from scratch. Whereas reuse solely deals with the artifacts creation, integration focuses on how reusable artifacts interact with the already existing parts of the specified transformation. Currently, most reuse research focuses on creating and integrating adaptable components at development or at compile time. However, with the emergence of ubiquitous computing, reuse technologies that can support adaptation and reconfiguration of architectures and components at runtime are in demand. This edited book includes 15 high quality research papers written by experts in information reuse and integration to cover the most recent advances in the field. These papers are extended versions of the best papers which were presented at IEEE International Conference on Information Reuse and Integration and IEEE International Workshop on Formal Methods Integration, which wa...

  12. Multistate event history analysis with frailty

    Directory of Open Access Journals (Sweden)

    Govert Bijwaard

    2014-05-01

    Full Text Available Background: In survival analysis, a large literature using frailty models, or models with unobserved heterogeneity, exists. In the growing literature and modelling of multistate models, this issue is only in its infancy. Ignoring frailty can, however, produce incorrect results. Objective: This paper presents how frailties can be incorporated into multistate models, with an emphasis on semi-Markov multistate models with a mixed proportional hazard structure. Methods: First, the aspects of frailty modeling in univariate (proportional hazard, Cox) and multivariate event history models are addressed. The implications of choosing shared or correlated frailty are highlighted. The relevant differences with recurrent events data are covered next. Multistate models are event history models that can have both multivariate and recurrent events. Incorporating frailty in multistate models therefore brings all the previously addressed issues together. Assuming a discrete frailty distribution allows for a very general correlation structure among the transition hazards in a multistate model. Although some estimation procedures are covered, the emphasis is on conceptual issues. Results: The importance of multistate frailty modeling is illustrated with data on the labour market and migration dynamics of recent immigrants to the Netherlands.

  13. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network from the highest conceptual level down to the lowest level of local control, and complex robotic systems are modelled and controlled using extended Petri nets. Such a system is structured, controlled and analysed in this paper by using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event system model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of a subsystem of the global robotic system. Since Petri models are simple enough to apply on general computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete event systems are a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels and its sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the robot timing. From the transport and transmission times obtained by spot measurements, graphics are produced showing the average time for the transport activity for individual sets of finished products.
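
    The primitive underlying such discrete-event models is the place/transition Petri net with token markings. The sketch below is a minimal illustration (the place and transition names are invented, and it is untimed, unlike the timed nets used in the paper): a transition is enabled when its input places hold enough tokens, and firing it moves tokens from inputs to outputs.

    ```python
    # Minimal place/transition Petri net with token marking and firing.
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)   # place -> token count
            self.transitions = {}          # name -> (inputs, outputs)

        def add_transition(self, name, inputs, outputs):
            """inputs/outputs: dicts mapping place -> token count."""
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

        def fire(self, name):
            if not self.enabled(name):
                raise RuntimeError(f"transition {name!r} is not enabled")
            inputs, outputs = self.transitions[name]
            for p, n in inputs.items():
                self.marking[p] -= n
            for p, n in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + n

    # Toy robot cell: a part moves from a buffer via a robot to a machine
    net = PetriNet({"buffer": 2, "robot_free": 1, "machine": 0})
    net.add_transition("pick", {"buffer": 1, "robot_free": 1}, {"robot_busy": 1})
    net.add_transition("place", {"robot_busy": 1}, {"machine": 1, "robot_free": 1})
    net.fire("pick")
    net.fire("place")
    ```

    Hierarchical modelling as described in the abstract amounts to treating an entire subnet like this as a single transition at the next conceptual level up.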

  14. Using variable transformations to perform common event analysis

    International Nuclear Information System (INIS)

    Worrell, R.B.

    1977-01-01

    Any analytical method for studying the effect of common events on the behavior of a system is considered a form of common event analysis. The particular common events that are involved often represent quite different phenomena, and this has led to the development of different kinds of common event analysis. For example, common mode failure analysis, common cause analysis, critical location analysis, etc., are all different kinds of common event analysis for which the common events involved represent different phenomena. However, the problem that must be solved for each of these different kinds of common event analysis is essentially the same: determine the effect of common events on the behavior of a system. Thus, a technique that is useful in achieving one kind of common event analysis is often useful in achieving other kinds of common event analysis.

  15. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Full Text Available Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC Catalog Service for the Web (CSW, then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased
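
    The "skill metrics" step of such a workflow can be sketched without any catalog or data services. This is an illustrative toy, not the IOOS notebooks: it computes a Murphy-style skill score of a model forecast against observations, relative to a persistence reference forecast, with all numbers invented.

    ```python
    # Model-skill assessment: RMSE-based skill score against a reference.
    def rmse(forecast, observed):
        pairs = list(zip(forecast, observed))
        return (sum((f - o) ** 2 for f, o in pairs) / len(pairs)) ** 0.5

    def skill_score(forecast, observed, reference):
        """1 is a perfect forecast, 0 matches the reference forecast,
        negative is worse than the reference."""
        return 1.0 - (rmse(forecast, observed) / rmse(reference, observed)) ** 2

    obs     = [14.2, 14.5, 15.1, 15.8, 16.0]  # observed water temperature, degC
    model   = [14.0, 14.6, 15.3, 15.6, 16.2]  # model forecast
    persist = [obs[0]] * len(obs)             # persistence reference forecast
    print(skill_score(model, obs, persist))
    ```

    In the actual workflow the observed series would come from an OGC SOS endpoint and the forecasts from OPeNDAP services, but the comparison step reduces to a calculation of this shape for each model.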

  16. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic

  17. Probabilistic analysis of extreme wind events

    Energy Technology Data Exchange (ETDEWEB)

    Chaviaropoulos, P.K. [Center for Renewable Energy Sources (CRES), Pikermi Attikis (Greece)

    1997-12-31

    A vital task in wind engineering and meteorology is to understand, measure, analyse and forecast extreme wind conditions, due to their significant effects on human activities and installations like buildings, bridges or wind turbines. The latest version of the IEC standard (1996) pays particular attention to the extreme wind events that have to be taken into account when designing or certifying a wind generator. Actually, the extreme wind events within a 50 year period are those which determine the "static" design of most of the wind turbine components. The extremes which are important for the safety of wind generators are those associated with the so-called "survival wind speed", the extreme operating gusts and the extreme wind direction changes. A probabilistic approach for the analysis of these events is proposed in this paper. Emphasis is put on establishing the relation between extreme values and physically meaningful "site calibration" parameters, like the probability distribution of the annual wind speed, turbulence intensity and power spectral properties. (Author)
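
    The relation between extreme values and site parameters is commonly handled with extreme-value statistics. A minimal stdlib sketch of a Gumbel (Type I) fit to hypothetical annual maxima, yielding an IEC-style 50-year return level, might look like this (the method-of-moments fit and the sample data are illustrative assumptions, not the paper's method):

```python
import math
import random

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_fit(annual_maxima):
    """Method-of-moments fit of a Gumbel (Type I) distribution."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi  # scale parameter
    mu = mean - EULER_GAMMA * beta         # location parameter
    return mu, beta

def return_level(mu, beta, T):
    """Wind speed expected to be exceeded once every T years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

random.seed(1)
# Hypothetical 30 years of annual-maximum 10-min mean wind speeds (m/s)
maxima = [random.gauss(28.0, 3.0) for _ in range(30)]
mu, beta = gumbel_fit(maxima)
v50 = return_level(mu, beta, 50)  # 50-year "survival" wind speed
```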

  18. Contingency Analysis of Cascading Line Outage Events

    Energy Technology Data Exchange (ETDEWEB)

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
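
    The tiered-cascade idea from the abstract can be caricatured in a few lines. The equal-redistribution rule below is a toy stand-in for a real DC power-flow redispatch, and the line flows and limits are invented:

```python
def cascade(flows, limits, initiating_line):
    """Toy cascading-outage simulation: after each trip, the tripped
    lines' flow is split equally over the surviving lines (a crude
    stand-in for an actual power-flow recalculation).  Returns the
    cascading tiers, starting with the initiating event."""
    flows = dict(flows)
    tripped, tiers = {initiating_line}, [[initiating_line]]
    while True:
        share = sum(flows[l] for l in tiers[-1]) / max(1, len(flows) - len(tripped))
        for line in flows:
            if line not in tripped:
                flows[line] += share
        overloads = [line for line in flows
                     if line not in tripped and flows[line] > limits[line]]
        if not overloads:
            return tiers
        tripped.update(overloads)
        tiers.append(overloads)

# Hypothetical line flows (MW) and thermal limits (MW)
flows = {"A": 80.0, "B": 60.0, "C": 40.0, "D": 30.0}
limits = {"A": 100.0, "B": 70.0, "C": 90.0, "D": 95.0}
tiers = cascade(flows, limits, "A")  # tier 0 is the initiating event
```

    Ranking initiating events by the depth and width of the resulting tiers gives a rough analogue of the visualization described in the paper.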

  19. Reliable, Reusable Cryotank, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcracking issues have significantly limited the reusability of state-of-the-art (SOA) composite cryotanks. While developers have made some progress addressing...

  20. Reusability Framework for Cloud Computing

    OpenAIRE

    Singh, Sukhpal; Singh, Rishideep

    2012-01-01

    Cloud-based development is a challenging task for several software engineering projects, especially for those which need development with reusability. The present era of cloud computing enables new professional models for software development. Cloud computing is expected to be the upcoming trend of computing because of its speed of application deployment, shorter time to market, and lower cost of operation. Until Cloud Computing Reusability Model is considered a fundamen...

  1. Reusable Launch Vehicle Technology Program

    Science.gov (United States)

    Freeman, Delma C., Jr.; Talay, Theodore A.; Austin, R. Eugene

    1997-01-01

    Industry/NASA reusable launch vehicle (RLV) technology program efforts are underway to design, test, and develop technologies and concepts for viable commercial launch systems that also satisfy national needs at acceptable recurring costs. Significant progress has been made in understanding the technical challenges of fully reusable launch systems and the accompanying management and operational approaches for achieving a low cost program. This paper reviews the current status of the RLV technology program including the DC-XA, X-33 and X-34 flight systems and associated technology programs. It addresses the specific technologies being tested that address the technical and operability challenges of reusable launch systems including reusable cryogenic propellant tanks, composite structures, thermal protection systems, improved propulsion and subsystem operability enhancements. The recently concluded DC-XA test program demonstrated some of these technologies in ground and flight test. Contracts were awarded recently for both the X-33 and X-34 flight demonstrator systems. The Orbital Sciences Corporation X-34 flight test vehicle will demonstrate an air-launched reusable vehicle capable of flight to speeds of Mach 8. The Lockheed-Martin X-33 flight test vehicle will expand the test envelope for critical technologies to flight speeds of Mach 15. A propulsion program to test the X-33 linear aerospike rocket engine using a NASA SR-71 high speed aircraft as a test bed is also discussed. The paper also describes the management and operational approaches that address the challenge of new cost effective, reusable launch vehicle systems.

  2. DISRUPTIVE EVENT BIOSPHERE DOSE CONVERSION FACTOR ANALYSIS

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1). The objective of this analysis was to develop the BDCFs for the volcanic

  3. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  4. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  5. Human reliability analysis of dependent events

    International Nuclear Information System (INIS)

    Swain, A.D.; Guttmann, H.E.

    1977-01-01

    In the human reliability analysis in WASH-1400, the continuous variable of degree of interaction among human events was approximated by selecting four points on this continuum to represent the entire continuum. The four points selected were identified as zero coupling (i.e., zero dependence), complete coupling (i.e., complete dependence), and two intermediate points--loose coupling (a moderate level of dependence) and tight coupling (a high level of dependence). The paper expands the WASH-1400 treatment of common mode failure due to the interaction of human activities. Mathematical expressions for the above four levels of dependence are derived for parallel and series systems. The psychological meaning of each level of dependence is illustrated by examples, with probability tree diagrams to illustrate the use of conditional probabilities resulting from the interaction of human actions in nuclear power plant tasks
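
    The four coupling levels correspond closely to the dependence equations later codified in the THERP handbook (Swain and Guttmann, NUREG/CR-1278). The sketch below uses those conditional-probability expressions as an assumed concrete form of the paper's four levels, plus the handbook's intermediate "moderate" level; the basic error probability is hypothetical:

```python
def conditional_hep(p, level):
    """Conditional human-error probability for the second of two
    dependent tasks, given failure of the first (THERP dependence
    equations; "low" ~ loose coupling, "high" ~ tight coupling)."""
    return {
        "zero":     p,                  # zero coupling (independence)
        "low":      (1 + 19 * p) / 20,  # loose coupling
        "moderate": (1 + 6 * p) / 7,
        "high":     (1 + p) / 2,        # tight coupling
        "complete": 1.0,                # complete coupling
    }[level]

p = 0.01  # hypothetical basic human-error probability
# Series system of two tasks: system fails only if both tasks fail
both_fail_complete = p * conditional_hep(p, "complete")
```

    Walking a probability tree with these conditional values reproduces the kind of common-mode interaction treatment described for WASH-1400.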

  6. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis". The objective of this

  7. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis". The objective of this analysis was to develop the BDCFs for the volcanic ash

  8. Disruptive Event Biosphere Doser Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, the development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, the volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of the contributions of the different exposure pathways to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by the eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, as is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.
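
    The probabilistic-realization approach can be sketched without GENII-S: propagate hypothetical pathway-parameter distributions through a toy BDCF sum and summarize the output. All distributions and magnitudes below are invented for illustration and have no connection to the report's actual parameter values:

```python
import random
import statistics

random.seed(42)

def bdcf_realization():
    """One probabilistic realization of a purely illustrative
    biosphere dose conversion factor: dose per unit surface
    contamination, summed over ingestion, inhalation and external
    exposure pathways.  All distributions are hypothetical."""
    ingestion = random.lognormvariate(-18.0, 0.5)   # Sv per Bq/m^2
    inhalation = random.lognormvariate(-19.0, 0.8)
    external = random.lognormvariate(-17.5, 0.3)
    return ingestion + inhalation + external

# Propagate input-parameter uncertainty into the output distribution
realizations = [bdcf_realization() for _ in range(5000)]
bdcf_mean = statistics.mean(realizations)
bdcf_p95 = sorted(realizations)[int(0.95 * len(realizations))]
```

    Multiplying such a BDCF by a deposited surface concentration (computed elsewhere, as the abstract notes) would yield a dose to the receptor.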

  9. Analysis of external events - Nuclear Power Plant Dukovany

    International Nuclear Information System (INIS)

    Hladky, Milan

    2000-01-01

    The Level 1 PSA of external events covers internal events, floods, and fires; other external events are not yet included. The shutdown PSA takes into account internal events, floods, fires, and heavy load drop; other external events are not yet included. A final safety analysis report was produced after 10 years of operation for all Dukovany units. A probabilistic approach was used for the analysis of aircraft crash and external man-induced events. The risk caused by man-induced events was found to be negligible and was accepted by the State Office for Nuclear Safety (SONS)

  10. Event shape analysis in ultrarelativistic nuclear collisions

    OpenAIRE

    Kopecna, Renata; Tomasik, Boris

    2016-01-01

    We present a novel method for sorting events. So far, single variables like the flow vector magnitude have been used for sorting events. Our approach takes into account the whole azimuthal angle distribution rather than a single variable. This method allows us to determine a good measure of the event shape, providing a multiplicity-independent insight. We discuss the advantages and disadvantages of this approach, its possible usage in femtoscopy, and other more exclusive experimental studies.

  11. Economic Multipliers and Mega-Event Analysis

    OpenAIRE

    Victor Matheson

    2004-01-01

    Critics of economic impact studies that purport to show that mega-events such as the Olympics bring large benefits to the communities “lucky” enough to host them frequently cite the use of inappropriate multipliers as a primary reason why these impact studies overstate the true economic gains to the hosts of these events. This brief paper shows in a numerical example how mega-events may lead to inflated multipliers and exaggerated claims of economic benefits.

  12. Root cause analysis of relevant events

    International Nuclear Information System (INIS)

    Perez, Silvia S.; Vidal, Patricia G.

    2000-01-01

    During 1998 the research work followed more specific guidelines, which entailed focusing exclusively on the two selected methods (ASSET and HPIP) and incorporating some additional human behaviour elements based on the documents of reference. Once resident inspectors were incorporated in the project (and trained accordingly), events occurring in Argentine nuclear power plants were analysed. Several events were analysed (all of them from the Atucha I and Embalse nuclear power plants), leading to the conclusion that the systematic methodology used also allows the investigation of minor events that were precursors of the selected events. (author)

  13. Analysis for Human-related Events during the Overhaul

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Tae; Kim, Min Chull; Choi, Dong Won; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-10-15

    Since 2008, the frequency of events due to human error has been decreasing among the 20 operating Nuclear Power Plants (NPPs), excluding the NPP in the commissioning stage (Shin-Kori unit 1). However, events due to human error during an overhaul (O/H) occur annually (see Table I). An analysis of human-related events during the O/H was performed. From the analysis, similar problems were identified across the events, and organizational and safety-culture factors were also identified.

  14. Glaciological parameters of disruptive event analysis

    International Nuclear Information System (INIS)

    Bull, C.

    1979-01-01

    The following disruptive events caused by ice sheets are considered: continental glaciation, erosion, loading and subsidence, deep ground water recharge, flood erosion, isostatic rebound rates, melting, and periodicity of ice ages

  15. A Fourier analysis of extreme events

    DEFF Research Database (Denmark)

    Mikosch, Thomas Valentin; Zhao, Yuwei

    2014-01-01

    The extremogram is an asymptotic correlogram for extreme events constructed from a regularly varying stationary sequence. In this paper, we define a frequency domain analog of the correlogram: a periodogram generated from a suitable sequence of indicator functions of rare events. We derive basic properties of the periodogram such as the asymptotic independence at the Fourier frequencies and use this property to show that weighted versions of the periodogram are consistent estimators of a spectral density derived from the extremogram.
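
    A periodogram of rare-event indicators, as described in the abstract, can be computed directly from its definition. The Gaussian test series and the 95% empirical threshold below are illustrative assumptions, not the paper's data:

```python
import cmath
import random

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(512)]
threshold = sorted(x)[int(0.95 * len(x))]            # empirical 95% quantile
indicators = [1.0 if v > threshold else 0.0 for v in x]
pbar = sum(indicators) / len(indicators)
centered = [v - pbar for v in indicators]            # mean-centered indicators

def periodogram(series, f):
    """Periodogram ordinate at Fourier frequency 2*pi*f/n of the
    centered rare-event indicator series."""
    n = len(series)
    s = sum(v * cmath.exp(-2j * cmath.pi * f * k / n)
            for k, v in enumerate(series))
    return abs(s) ** 2 / n

# Ordinates at the first ten Fourier frequencies
ordinates = [periodogram(centered, f) for f in range(1, 11)]
```

    Smoothing (weighting) such ordinates over neighboring frequencies is what the paper shows to be a consistent estimator of the extremogram's spectral density.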

  16. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. Then the definitions of cycle event paths, sequence event paths, and key event paths are given. Afterwards, based on the statistical results of the CPN model simulations, key event paths are identified using a sensitivity analysis approach. This approach focuses on the logic structures of CPN models, which makes it reliable and a possible basis for the structured analysis of discrete event systems. An example of a radar model is given to illustrate the application of this approach, and the results are trustworthy.

  17. External events analysis of the Ignalina Nuclear Power Plant

    International Nuclear Information System (INIS)

    Liaukonis, Mindaugas; Augutis, Juozas

    1999-01-01

    This paper presents an analysis of the impact of external events on the safe operation of the Ignalina Nuclear Power Plant (INPP) safety systems. The analysis was based on probabilistic estimation and modelling of the external hazards. Screening criteria were applied to a number of external hazards. The following external events requiring further bounding study were analysed: aircraft crash on the INPP, external flooding, fire, and extreme winds. Mathematical models were developed and event probabilities were calculated. The external events analysis showed rather limited external event danger to the Ignalina NPP. The results of the analysis were compared with analogous analyses at western NPPs, and no significant differences were found. The calculations performed show that external events cannot significantly influence the safety level of Ignalina NPP operation. (author)

  18. Statistical analysis of hydrodynamic cavitation events

    Science.gov (United States)

    Gimenez, G.; Sommer, R.

    1980-10-01

    The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency follows a normal distribution whose parameters do not evolve in time.
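
    A quick stdlib check of the paper's two conclusions (normal law, non-evolving parameters) could be run on windowed pulse counts; the data below are synthetic stand-ins for the cavitation measurements:

```python
import random
import statistics

random.seed(7)
# Hypothetical counts of cavitation pressure pulses per 1-s window
counts = [random.gauss(200.0, 15.0) for _ in range(400)]

def skewness(xs):
    """Adjusted Fisher-Pearson sample skewness; near 0 for normal data."""
    m = statistics.mean(xs)
    s = statistics.stdev(xs)
    n = len(xs)
    return sum(((x - m) / s) ** 3 for x in xs) * n / ((n - 1) * (n - 2))

skew = skewness(counts)  # normality check: should be close to 0
# Stationarity check: the mean should not drift between halves
drift = abs(statistics.mean(counts[:200]) - statistics.mean(counts[200:]))
```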

  19. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Given that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of a government's ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results demonstrate the validity of the methods.

  20. Risk and sensitivity analysis in relation to external events

    International Nuclear Information System (INIS)

    Alzbutas, R.; Urbonas, R.; Augutis, J.

    2001-01-01

    This paper presents a risk and sensitivity analysis of the impact of external events on safe operation in general, and on the Ignalina Nuclear Power Plant safety systems in particular. The analysis is based on deterministic and probabilistic assumptions and on assessment of the external hazards. Real statistical data are used, as well as initial external event simulation. Preliminary screening criteria are applied. For the investigated external hazards, the analysis of external event impact on safe NPP operation, the assessment of event occurrence, a sensitivity analysis, and recommendations for safety improvements are performed. Events such as aircraft crash, extreme rains and winds, forest fire, and flying parts of the turbine are analysed. Models are developed and probabilities are calculated. As an example of the sensitivity analysis, the model of aircraft impact is presented. The sensitivity analysis takes into account the uncertainty introduced by the external event and its model. Even when the external events analysis shows rather limited danger, the sensitivity analysis can identify the causes with the highest influence; possible future variations in these causes can be significant for the safety level and for risk-based decisions. Calculations show that external events cannot significantly influence the safety level of Ignalina NPP operation; however, event occurrence and propagation can be considerably uncertain. (author)

  1. Probabilistic analysis of external events with focus on the Fukushima event

    International Nuclear Information System (INIS)

    Kollasko, Heiko; Jockenhoevel-Barttfeld, Mariana; Klapp, Ulrich

    2014-01-01

    External hazards are those natural or man-made hazards to a site and facilities that originate externally to both the site and its processes, i.e., the duty holder may have very little or no control over the hazard. External hazards have the potential to cause initiating events at the plant, typically transients such as loss of offsite power. Simultaneously, external events may affect safety systems required to control the initiating event and, where applicable, also back-up systems implemented for risk-reduction. The plant safety may especially be threatened when loads from external hazards exceed the load assumptions considered in the design of safety-related systems, structures and components. Another potential threat is given by hazards inducing initiating events not otherwise considered in the safety demonstration. An example is loss of offsite power combined with prolonged plant isolation. Offsite support, e.g., delivery of diesel fuel oil, usually credited in the deterministic safety analysis, may not be possible in this case. As the Fukushima events have shown, the biggest threat likely comes from hazards inducing both effects. Such hazards may well be dominant risk contributors even if their return period is very high. In order to identify relevant external hazards for a certain Nuclear Power Plant (NPP) location, a site-specific screening analysis is performed, both for single events and for combinations of external events. As a result of the screening analysis, risk-significant and therefore relevant (screened-in) single external events and combinations of them are identified for a site. The screened-in events are further considered in a detailed event tree analysis in the frame of the Probabilistic Safety Analysis (PSA) to calculate the core damage/large release frequency resulting from each relevant external event or from each relevant combination.
Screening analyses of external events performed at AREVA are based on the approach provided
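
    The screening step can be sketched as a simple filter on event frequency and damage potential. The cutoff values and hazard entries below are illustrative and are not taken from the AREVA approach or any PSA standard:

```python
def screen_hazards(hazards, freq_cutoff=1e-7, damage_cutoff=0.0):
    """Toy screening of external hazards: a hazard is screened in
    when its annual frequency exceeds a cutoff and it can plausibly
    damage safety-related equipment.  Cutoff values are illustrative."""
    return [name for name, (freq, damage_potential) in hazards.items()
            if freq > freq_cutoff and damage_potential > damage_cutoff]

# Hypothetical hazards: (annual frequency, damage potential 0..1)
hazards = {
    "seismic":           (1e-4, 1.0),
    "external_flooding": (1e-5, 1.0),
    "aircraft_crash":    (1e-8, 1.0),  # below the frequency cutoff
    "extreme_wind":      (1e-3, 0.0),  # assumed no safety impact
}
screened = screen_hazards(hazards)  # events kept for event tree analysis
```

    In a real PSA the screened-in hazards (and their combinations) would then feed the event tree analysis described in the abstract.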

  2. Reusable Military Launch Systems (RMLS)

    Science.gov (United States)

    2008-02-01

    shown in Figure 11. The second configuration is an axisymmetric, rocket-based combined cycle (RBCC) powered, SSTO vehicle, similar to the GTX... McCormick, D., and Sorensen, K., "Hyperion: An SSTO Vision Vehicle Concept Utilizing Rocket-Based Combined Cycle Propulsion", AIAA paper 99-4944... there have been several failed attempts at the development of reusable rocket or air-breathing launch vehicle systems. Single-stage-to-orbit (SSTO

  3. ANALYSIS OF EVENT TOURISM IN RUSSIA, ITS FUNCTIONS, WAYS TO IMPROVE THE EFFICIENCY OF EVENT

    Directory of Open Access Journals (Sweden)

    Mikhail Yur'evich Grushin

    2016-01-01

    Full Text Available This article considers one of the important directions of development of the national economy in the area of tourist services: the development of event tourism in the Russian Federation. Today the market of event management in Russia is still in the process of formation; its impact on the socio-economic development of the regions and of Russia as a whole is therefore minimal, and no analysis of this influence is performed. This problem comes to the fore in the regions of Russia that specialize in creating an event-oriented tourist-recreational cluster. The article provides an analysis of the existing event management market and of the functions of event tourism, suggests ways to improve the efficiency of event management, and offers recommendations for event organizers in the regions. The article shows the specific role of event tourism in national tourism and provides directions for developing organizational and methodical recommendations on its formation in the regions of Russia and on the creation of an effective management system at the regional level. The purpose of this article is to analyze the event tourism market emerging in Russia and its specifics. On the basis of these studies, the patterns of the newly forming market are considered and its impact on the modern national tourism industry is assessed. Methodology. Comparative and economic-statistical analysis methods are used in this article. Conclusions/significance. The practical importance of this article lies in resolving a contradiction existing in the national tourism industry: on the one hand, a large number of events, including world-class ones, are held annually in all regions of the Russian Federation, and tourist trips to events are commonly spoken of; yet event tourism as such does not yet exist. In all regions there is domestic and inbound tourism, but it has nothing to do with event tourism. The article's practical conclusions demonstrate the need to adapt the

  4. Second-order analysis of semiparametric recurrent event processes.

    Science.gov (United States)

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
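A minimal sketch of one way to check the Poisson assumption (a hypothetical variance-to-mean dispersion index on simulated counts, far simpler than the second-order procedures proposed in the article):

```python
import random
import statistics

def count_events(rate, follow_up):
    """Event count in [0, follow_up] for a homogeneous Poisson process."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > follow_up:
            return n
        n += 1

def dispersion_index(counts):
    """Variance-to-mean ratio: ~1 for Poisson counts, >1 if overdispersed."""
    return statistics.pvariance(counts) / statistics.mean(counts)

random.seed(0)
# Every individual shares one rate -> counts are Poisson
poisson = [count_events(2.0, 10.0) for _ in range(2000)]
# Rates differ across individuals (frailty) -> overdispersed counts
mixed = [count_events(random.choice([1.0, 3.0]), 10.0) for _ in range(2000)]
print(round(dispersion_index(poisson), 1), round(dispersion_index(mixed), 1))
```

Counts from a common-rate process give an index near 1, while between-individual rate heterogeneity, a common reason recurrent event data fail the Poisson assumption, inflates it well above 1.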

  5. Analysis of catchments response to severe drought event for ...

    African Journals Online (AJOL)

    Nafiisah

    The run sum analysis method was a sound method which indicates in ... intensity and duration of stream flow depletion between nearby catchments. ... threshold level analysis method, and allows drought events to be described in more.

  6. Preliminary safety analysis of unscrammed events for KLFR

    International Nuclear Information System (INIS)

    Kim, S.J.; Ha, G.S.

    2005-01-01

    The report presents the design features of the KLFR, the safety analysis code, steady-state calculation results, and analysis results for unscrammed events. Steady-state and unscrammed-event calculations have been performed for the conceptual design of the KLFR using the SSC-K code. A UTOP event results in no fuel damage and no centre-line melting, and the inherent safety features are demonstrated through the analysis of a ULOHS event. Although the ULOF analysis carries considerable uncertainty in the pump design, its results likewise show inherent safety characteristics: natural circulation of about 6% of rated flow is established during ULOF. In the metallic fuel rod, the cladding temperature is somewhat high owing to the low heat transfer coefficient of lead. The ULOHS event should be considered in the design of the RVACS for long-term cooling

  7. Top event prevention analysis: A deterministic use of PRA

    International Nuclear Information System (INIS)

    Worrell, R.B.; Blanchard, D.P.

    1996-01-01

    This paper describes the application of Top Event Prevention Analysis. The analysis finds prevention sets, which are combinations of basic events that can prevent the occurrence of a fault tree top event such as core damage. The problem analyzed in this application is that of choosing a subset of Motor-Operated Valves (MOVs) for testing under the Generic Letter 89-10 program such that the desired level of safety is achieved while providing economic relief from the burden of testing all safety-related valves. A brief summary of the method is given, and the process used to produce a core damage expression from Level 1 PRA models for a PWR is described. The analysis provides an alternative to the use of importance measures for finding the important combinations of events in a core damage expression. This application of Top Event Prevention Analysis to the MOV problem was achieved with currently available software
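On invented cut sets (not the paper's PWR model), a prevention set can be found as a minimum hitting set of the minimal cut sets; a brute-force sketch:

```python
from itertools import combinations

# Hypothetical minimal cut sets of a core damage expression
cut_sets = [{"MOV1", "PUMP_A"}, {"MOV2", "PUMP_A"}, {"MOV1", "MOV3"}]

def prevention_sets(cut_sets, max_size=4):
    """Smallest combinations of basic events that intersect every minimal
    cut set; preventing all events in one such set prevents the top event."""
    events = sorted(set().union(*cut_sets))
    for size in range(1, max_size + 1):
        found = [set(combo) for combo in combinations(events, size)
                 if all(cs & set(combo) for cs in cut_sets)]
        if found:
            return found  # all prevention sets of minimum size
    return []

print(prevention_sets(cut_sets))
```

Every candidate of minimum size is returned; here, for example, testing MOV1 and PUMP_A alone covers all three cut sets.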

  8. Resonant experience in emergent events of analysis

    DEFF Research Database (Denmark)

    Revsbæk, Line

    2018-01-01

    Theory, and the traditions of thought available and known to us, give shape to what we are able to notice of our field of inquiry, and so also of our practice of research. Building on G. H. Mead’s Philosophy of the Present (1932), this paper draws attention to ‘emergent events’ of analysis when...... in responsive relating to (case study) others is made generative as a dynamic in and of case study analysis. Using a case of being a newcomer (to research communities) researching newcomer innovation (of others), ‘resonant experience’ is illustrated as a heuristic in interview analysis to simultaneously...

  9. External events analysis for the Savannah River Site K reactor

    International Nuclear Information System (INIS)

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be ''external'' to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 x 10^-4 per year, of which seismic events are the major contributor (1.2 x 10^-4 per year). Fire-initiated events contribute 1.4 x 10^-7 per year, tornados 5.8 x 10^-7 per year, dam failures 1.5 x 10^-6 per year, and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs

  10. A Fourier analysis of extremal events

    DEFF Research Database (Denmark)

    Zhao, Yuwei

    is the extremal periodogram. The extremal periodogram shares numerous asymptotic properties with the periodogram of a linear process in classical time series analysis: the asymptotic distribution of the periodogram ordinates at the Fourier frequencies have a similar form and smoothed versions of the periodogram...

  11. Event analysis using a massively parallel processor

    International Nuclear Information System (INIS)

    Bale, A.; Gerelle, E.; Messersmith, J.; Warren, R.; Hoek, J.

    1990-01-01

    This paper describes a system for performing histogramming of n-tuple data at interactive rates using a commercial SIMD processor array connected to a work-station running the well-known Physics Analysis Workstation software (PAW). Results indicate that an order of magnitude performance improvement over current RISC technology is easily achievable

  12. Reusable launch vehicle development research

    Science.gov (United States)

    1995-01-01

    NASA has generated a program approach for single-stage-to-orbit (SSTO) reusable launch vehicle (RLV) technology development which includes a follow-on to the Ballistic Missile Defense Organization's (BMDO) successful DC-X program, the DC-XA (Advanced). Also, a separate sub-scale flight demonstrator, designated the X-33, will be built and flight tested along with numerous ground-based technology programs. For this to be a successful effort, a balance between technical, schedule, and budgetary risks must be attained. The adoption of BMDO's 'fast track' management practices will be a key element in the eventual success of NASA's effort.

  13. Multi-Unit Initiating Event Analysis for a Single-Unit Internal Events Level 1 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong San; Park, Jin Hee; Lim, Ho Gon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The Fukushima nuclear accident in 2011 highlighted the importance of considering the risks from multi-unit accidents at a site. The ASME/ANS probabilistic risk assessment (PRA) standard also includes some requirements related to multi-unit aspects, one of which (IE-B5) is as follows: 'For multi-unit sites with shared systems, DO NOT SUBSUME multi-unit initiating events if they impact mitigation capability [1].' However, the existing single-unit PSA models do not explicitly consider multi-unit initiating events, and hence systems shared by multiple units (e.g., the alternate AC diesel generator) are fully credited to the single unit, ignoring the needs of the other units at the same site for the shared systems [2]. This paper describes the results of the multi-unit initiating event (IE) analysis performed as a part of the at-power internal events Level 1 probabilistic safety assessment (PSA) for an OPR1000 single unit ('reference unit'). In this study, a multi-unit initiating event analysis for a single-unit PSA was performed, and using the results, a dual-unit LOOP initiating event was added to the existing PSA model for the reference unit (OPR1000 type). Event trees were developed for dual-unit LOOP and for dual-unit SBO, which can be transferred from dual-unit LOOP. Moreover, CCF basic events for the 5 diesel generators were modelled. For simultaneous SBO occurrences in both units, this study compared two different assumptions on the availability of the AAC D/G. As a result, when the dual-unit LOOP initiating event was added to the existing single-unit PSA model, the total CDF increased by 1-2% depending on the probability that the AAC D/G is available to a specific unit in case of simultaneous SBO in both units.

  14. Analysis of event-mode data with Interactive Data Language

    International Nuclear Information System (INIS)

    De Young, P.A.; Hilldore, B.B.; Kiessel, L.M.; Peaslee, G.F.

    2003-01-01

    We have developed an analysis package for event-mode data based on Interactive Data Language (IDL) from Research Systems Inc. This high-level language is fast, array oriented, object oriented, and has extensive visual (multi-dimensional plotting) and mathematical functions. We have developed a general framework, written in IDL, for the analysis of a variety of experimental data that does not require significant customization for each analysis. Unlike many traditional analysis packages, spectra and gates are applied after the data are read and are easily changed as the analysis proceeds without rereading the data. The events are not sequentially processed into predetermined arrays subject to predetermined gates

  15. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  16. Analysis of unprotected overcooling events in the Integral Fast Reactor

    International Nuclear Information System (INIS)

    Vilim, R.B.

    1989-01-01

    Simple analytic models are developed for predicting the response of a metal-fueled, liquid-metal-cooled reactor to unprotected overcooling events in the balance of plant. All overcooling initiators are shown to fall into two categories. The first category contains those events for which there is no final equilibrium state of constant overcooling, as is the case for a large steam leak. These events are analyzed using a non-flow control mass approach. The second category contains those events which will eventually equilibrate, such as a loss of feedwater heaters. A steady-flow control volume analysis shows that these latter events ultimately affect the plant through the feedwater inlet to the steam generator. The models developed for analyzing these two categories provide upper bounds for the reactor's passive response to overcooling accident initiators. Calculation of these bounds for a prototypic plant indicates that failure limits -- eutectic melting, sodium boiling, fuel pin failure -- are not exceeded in any overcooling event. 2 refs

  17. Repeated Time-to-event Analysis of Consecutive Analgesic Events in Postoperative Pain

    DEFF Research Database (Denmark)

    Juul, Rasmus Vestergaard; Rasmussen, Sten; Kreilgaard, Mads

    2015-01-01

    BACKGROUND: Reduction in consumption of opioid rescue medication is often used as an endpoint when investigating analgesic efficacy of drugs by adjunct treatment, but appropriate methods are needed to analyze analgesic consumption in time. Repeated time-to-event (RTTE) modeling is proposed as a way...... to describe analgesic consumption by analyzing the timing of consecutive analgesic events. METHODS: Retrospective data were obtained from 63 patients receiving standard analgesic treatment including morphine on request after surgery following hip fracture. Times of analgesic events up to 96 h after surgery...... were extracted from hospital medical records. Parametric RTTE analysis was performed with exponential, Weibull, or Gompertz distribution of analgesic events using NONMEM®, version 7.2 (ICON Development Solutions, USA). The potential influences of night versus day, sex, and age were investigated...
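A toy version of the data-generating process behind such an RTTE model (hypothetical scale and shape parameters, not the study's estimates) draws consecutive analgesic request times with Weibull-distributed gaps:

```python
import random

def simulate_requests(scale_h, shape, t_end=96.0):
    """One patient's analgesic request times (hours after surgery):
    Weibull-distributed gaps between events, observed up to t_end."""
    t, times = 0.0, []
    while True:
        t += random.weibullvariate(scale_h, shape)
        if t > t_end:
            return times
        times.append(t)

random.seed(1)
patients = [simulate_requests(scale_h=8.0, shape=1.2) for _ in range(63)]
gaps = [b - a for p in patients for a, b in zip([0.0] + p, p)]
print(round(sum(gaps) / len(gaps), 1))  # mean observed gap between requests (h)
```

Fitting such a model means maximizing the likelihood of exactly these event times, including the censored gap after each patient's last observed request.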

  18. Reusable Rocket Engine Turbopump Health Management System

    Science.gov (United States)

    Surko, Pamela

    1994-01-01

    A health monitoring expert system software architecture has been developed to support condition-based health monitoring of rocket engines. Its first application is in diagnosis decisions relating to the health of the high pressure oxidizer turbopump (HPOTP) of the Space Shuttle Main Engine (SSME). The post-test diagnostic system runs off-line, using as input the data recorded from hundreds of sensors, each running typically at rates of 25, 50, or 0.1 Hz. The system is invoked after a test has been completed, and produces an analysis and an organized graphical presentation of the data with important effects highlighted. The overall expert system architecture has been developed and documented so that expert modules analyzing other line replaceable units may easily be added. The architecture emphasizes modularity, reusability, and open system interfaces so that it may be used to analyze other engines as well.

  19. Sovereign Default Analysis through Extreme Events Identification

    Directory of Open Access Journals (Sweden)

    Vasile George MARICA

    2015-06-01

    Full Text Available This paper investigates contagion in international credit markets through the use of a novel jump detection technique proposed by Chan and Maheu (2002). This econometric methodology is preferred because it is non-linear by definition and not subject to volatility bias. Also, the identified jumps in CDS premiums are considered as outliers positioned beyond any stochastic movement that can be, and already is, modelled through well-known linear analysis. Though contagion is hard to define, we show that extreme discrete movements in default probabilities inferred from CDS premiums can lead to sound economic conclusions about the risk profile of sovereign nations in international bond markets. We find evidence of investor sentiment clustering for countries with unstable political regimes or that are engaged in armed conflict. Countries that have in their recent history faced currency or financial crises are less vulnerable to external unexpected shocks. First, we present a brief history of sovereign defaults with an emphasis on their increased frequency and geographical reach, as financial markets become more and more integrated. We then pass to a literature review of the most important definitions for contagion, and discuss what quantitative methods are available to detect the presence of contagion. The paper continues with the details of the methodology of jump detection through non-linear modelling and its use in the field of contagion identification. In the last sections we present the estimation results for simultaneous jumps between emerging markets CDS and draw conclusions on the difference of behavior in times of extreme movement versus tranquil periods.
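As a caricature of jump identification (a rolling-volatility threshold on a synthetic series, not the conditional jump model of Chan and Maheu), one can flag premium moves far outside recent volatility:

```python
import statistics

def detect_jumps(series, window=20, k=4.0):
    """Flag changes larger than k rolling standard deviations -- a crude
    stand-in for model-based jump identification in CDS premiums."""
    changes = [b - a for a, b in zip(series, series[1:])]
    jumps = []
    for i in range(window, len(changes)):
        sd = statistics.pstdev(changes[i - window:i])
        if sd > 0 and abs(changes[i]) > k * sd:
            jumps.append(i + 1)  # index into the original series
    return jumps

# Synthetic CDS premium path (basis points) with one sudden repricing
cds = [100 + 0.5 * (i % 3) for i in range(40)]
cds[30] += 25
print(detect_jumps(cds))  # flags the move into (and out of) the shock
```

A threshold like this is volatility-biased in exactly the way the paper criticizes, which is why the authors prefer a model with an explicit jump component.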

  20. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...
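For a flavor of what a DES reliability model computes (a minimal, hypothetical repairable-component sketch, not one of the author's models), alternating exponential up and down times yield an availability estimate close to the analytic MTBF/(MTBF+MTTR):

```python
import random

def simulate_availability(mtbf, mttr, horizon, runs=500):
    """Discrete event simulation of one repairable component:
    alternate exponential up and down times, return mean fraction up."""
    fractions = []
    for _ in range(runs):
        t, up_time = 0.0, 0.0
        while t < horizon:
            up = min(random.expovariate(1 / mtbf), horizon - t)
            up_time += up
            t += up
            t += random.expovariate(1 / mttr)  # repair (down) time
        fractions.append(up_time / horizon)
    return sum(fractions) / runs

random.seed(3)
est = simulate_availability(mtbf=100.0, mttr=5.0, horizon=1000.0)
print(round(est, 2))  # analytic steady-state availability: 100/105 ~ 0.95
```

The appeal of DES over fault/event trees is that time-dependent behavior like this (repair queues, operator response, shared resources) is modelled directly rather than approximated.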

  1. Glaciological parameters of disruptive event analysis

    International Nuclear Information System (INIS)

    Bull, C.

    1980-04-01

    The possibility of complete glaciation of the earth is small and probably need not be considered in the consequence analysis by the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program. However, within a few thousand years an ice sheet may well cover proposed waste disposal sites in Michigan. Those in the Gulf Coast region and New Mexico are unlikely to be ice covered. The probability of ice cover at Hanford in the next million years is finite, perhaps about 0.5. Sea level will fluctuate as a result of climatic changes. As ice sheets grow, sea level will fall. Melting of ice sheets will be accompanied by a rise in sea level. Within the present interglacial period there is a definite chance that the West Antarctic ice sheet will melt. Ice sheets are agents of erosion, and some estimates of the amount of material they erode have been made. As an average over the area glaciated by late Quaternary ice sheets, only a few tens of meters of erosion is indicated. There were perhaps 3 meters of erosion per glaciation cycle. Under glacial conditions the surface boundary conditions for ground water recharge will be appreciably changed. In future glaciations melt-water rivers generally will follow pre-existing river courses. Some salt dome sites in the Gulf Coast region could be susceptible to changes in the course of the Mississippi River. The New Mexico site, which is on a high plateau, seems to be immune from this type of problem. The Hanford Site is only a few miles from the Columbia River, and in the future, lateral erosion by the Columbia River could cause changes in its course. A prudent assumption in the AEGIS study is that the present interglacial will continue for only a limited period and that subsequently an ice sheet will form over North America. Other factors being equal, it seems unwise to site a nuclear waste repository (even at great depth) in an area likely to be glaciated

  2. Human performance analysis of industrial radiography radiation exposure events

    International Nuclear Information System (INIS)

    Reece, W.J.; Hill, S.G.

    1995-01-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures

  3. Formaldehyde in reusable protective gloves.

    Science.gov (United States)

    Pontén, Ann

    2006-05-01

    Due to the clinical findings in a single patient's case, formaldehyde was suspected to be present in clinically relevant levels in reusable protective gloves. Therefore, 9 types of gloves were investigated with the semi-quantitative chromotropic acid method. It was found that 6/9 gloves emitted some formaldehyde and that 4/9 gloves emitted > or =40 microg of formaldehyde. Most of the formaldehyde was found on the inside of the gloves. To get an indication of the clinical relevance, a comparison with a protective cream declared to contain the formaldehyde-releasing agent diazolidinyl urea was performed by comparing areas of gloves with areas of cream layers with thickness 1-2 mg/cm(2). It was found that the amounts of formaldehyde emitted from the gloves might be in the same range as emitted from a layer of cream.

  4. Serious adverse events with infliximab: analysis of spontaneously reported adverse events.

    Science.gov (United States)

    Hansen, Richard A; Gartlehner, Gerald; Powell, Gregory E; Sandler, Robert S

    2007-06-01

    Serious adverse events such as bowel obstruction, heart failure, infection, lymphoma, and neuropathy have been reported with infliximab. The aims of this study were to explore adverse event signals with infliximab by using a long period of post-marketing experience, stratifying by indication. The relative reporting of infliximab adverse events to the U.S. Food and Drug Administration (FDA) was assessed with the public release version of the adverse event reporting system (AERS) database from 1968 to third quarter 2005. On the basis of a systematic review of adverse events, Medical Dictionary for Regulatory Activities (MedDRA) terms were mapped to predefined categories of adverse events, including death, heart failure, hepatitis, infection, infusion reaction, lymphoma, myelosuppression, neuropathy, and obstruction. Disproportionality analysis was used to calculate the empiric Bayes geometric mean (EBGM) and corresponding 90% confidence intervals (EB05, EB95) for adverse event categories. Infliximab was identified as the suspect medication in 18,220 reports in the FDA AERS database. We identified a signal for lymphoma (EB05 = 6.9), neuropathy (EB05 = 3.8), infection (EB05 = 2.9), and bowel obstruction (EB05 = 2.8). The signal for granulomatous infections was stronger than the signal for non-granulomatous infections (EB05 = 12.6 and 2.4, respectively). The signals for bowel obstruction and infusion reaction were specific to patients with IBD; this suggests potential confounding by indication, especially for bowel obstruction. In light of this additional evidence of risk of lymphoma, neuropathy, and granulomatous infections, clinicians should stress this risk in the shared decision-making process.
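Full EBGM scores require an empirical Bayes shrinkage fit, but the underlying disproportionality idea can be sketched with the simpler proportional reporting ratio (PRR) on made-up counts (not AERS data):

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR from a 2x2 spontaneous-report table:
    a = reports of the event with the drug, b = other events with the drug,
    c = the event with all other drugs,   d = other events with other drugs."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts for one drug-event pair (not AERS data)
prr = proportional_reporting_ratio(120, 18100, 4000, 3_000_000)
print(round(prr, 1))  # a PRR well above 1 flags disproportionate reporting
```

EBGM shrinks such ratios toward 1 for sparse cells, which is why the study reports lower confidence bounds (EB05) rather than raw ratios.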

  5. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    Science.gov (United States)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to
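The fault-tree step of such an initiating-event analysis can be sketched on a toy tree (gate names invented for illustration, not Flibe Energy's design): minimal cut sets are found by expanding OR gates into alternatives and AND gates into combinations:

```python
def cut_sets(gate, tree):
    """Minimal cut sets of a simple fault tree by recursive expansion:
    tree maps a gate to ("AND" | "OR", children); anything not in the
    tree is a basic (initiating) event."""
    if gate not in tree:
        return [{gate}]
    op, children = tree[gate]
    child_sets = [cut_sets(child, tree) for child in children]
    if op == "OR":                       # any child alone suffices
        combined = [s for cs in child_sets for s in cs]
    else:                                # AND: combine one choice per child
        combined = [set()]
        for cs in child_sets:
            combined = [a | b for a in combined for b in cs]
    return [s for s in combined if not any(o < s for o in combined)]

# Invented gate structure for illustration only
tree = {
    "CORE_DAMAGE": ("AND", ["FREEZE_VALVE_FAIL", "COOLING_LOSS"]),
    "COOLING_LOSS": ("OR", ["OFFSITE_POWER_LOSS", "PUMPS_FAIL"]),
    "PUMPS_FAIL": ("AND", ["PUMP_A_FAIL", "PUMP_B_FAIL"]),
}
for cs in cut_sets("CORE_DAMAGE", tree):
    print(sorted(cs))
```

Each resulting cut set is a minimal combination of basic events sufficient for the top event, which is the logical output the thesis's qualitative analysis aims for.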

  6. Reusable tamper-indicating security seal

    International Nuclear Information System (INIS)

    Ryan, M.J.

    1981-01-01

    A reusable tamper-indicating mechanical security seal for use in safeguarding nuclear material has been developed. The high-security seal displays an unpredictable, randomly selected, five-digit code each time it is used. This five digit code serves the same purpose that the serial number does for conventional non-reusable seals - a unique identifier for each use or application. The newly developed reusable seal is completely enclosed within a seamless, tamper-indicating, plastic jacket. The jacket is designed to reveal any attempts to penetrate, section or to chemically remove and replace with a counterfeit for surreptitious purposes

  7. Microprocessor event analysis in parallel with Camac data acquisition

    International Nuclear Information System (INIS)

    Cords, D.; Eichler, R.; Riege, H.

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a Camac System (GEC-ELLIOTT System Crate) and shares the Camac access with a Nord-10S computer. Interfaces have been designed and tested for execution of Camac cycles, communication with the Nord-10S computer and DMA-transfer from Camac to the MIPROC-16 memory. The system is used in the JADE data-acquisition-system at PETRA where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events and the result of various checks is appended to the event. In case of spurious triggers or clear beam gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (orig.)

  8. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    Energy Technology Data Exchange (ETDEWEB)

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K. (and others)

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. (c) 1999 The American Astronomical Society.

  9. Microprocessor event analysis in parallel with CAMAC data acquisition

    CERN Document Server

    Cords, D; Riege, H

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a CAMAC System (GEC-ELLIOTT System Crate) and shares the CAMAC access with a Nord-10S computer. Interfaces have been designed and tested for execution of CAMAC cycles, communication with the Nord-10S computer and DMA-transfer from CAMAC to the MIPROC-16 memory. The system is used in the JADE data-acquisition-system at PETRA where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events and the results of various checks are appended to the event. In case of spurious triggers or clear beam gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (5 refs).

  10. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accidents' probabilities and then result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; herein, the quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like the "demarcative points" of safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are of great value in ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering an icing region

  11. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
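The nonhomogeneous Poisson model described, with rate λ(t) increasing exponentially, can be simulated with Lewis-Shedler thinning (parameter values here are illustrative, not the fitted values from the study):

```python
import math
import random

def nhpp_thinning(lam0, growth, t_end):
    """Event times of a nonhomogeneous Poisson process with rate
    lam(t) = lam0 * exp(growth * t), via Lewis-Shedler thinning."""
    lam_max = lam0 * math.exp(growth * t_end)  # upper bound on the rate
    t, times = 0.0, []
    while True:
        t += random.expovariate(lam_max)       # candidate event
        if t > t_end:
            return times
        if random.random() < lam0 * math.exp(growth * t) / lam_max:
            times.append(t)                    # accept with prob lam(t)/lam_max

random.seed(2)
times = nhpp_thinning(lam0=1.0, growth=0.1, t_end=48.0)
gaps = [b - a for a, b in zip(times, times[1:])]
mid = len(gaps) // 2
# With an increasing rate, later inter-event gaps are shorter on average
print(sum(gaps[:mid]) / mid > sum(gaps[mid:]) / (len(gaps) - mid))
```

Comparing observed interevent-time statistics against such simulated series is one way to test conformance to the nonhomogeneous Poisson hypothesis.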

  12. Reusable Xerogel Containing Quantum Dots with High Fluorescence Retention

    Directory of Open Access Journals (Sweden)

    Xiang-Yong Liang

    2018-03-01

    Although various analytical methods have been established based on quantum dots (QDs), most were conducted in solution, which is inadequate for storage/transportation and rapid analysis. Moreover, the potential environmental problems caused by abandoned QDs cannot be ignored. In this paper, a reusable xerogel containing CdTe with strong emission is established by introducing host–guest interactions between QDs and the polymer matrix. This xerogel shows high QD loading capacity without decrease or redshift in fluorescence (the maximum loading is 50 wt % of the final xerogel), which benefits from the steric hindrance of β-cyclodextrin (βCD) molecules. Host–guest interactions immobilize QDs firmly, resulting in the excellent fluorescence retention of the xerogel. The good detecting performance and reusability mean this xerogel could be employed as a versatile analysis platform (for quantitative and qualitative analyses). In addition, the xerogel can be self-healed by the aid of water.

  13. Reusable coordinator modules for massively concurrent applications

    NARCIS (Netherlands)

    F. Arbab (Farhad); C.L. Blom (Kees); F.J. Burger (Freek); C.T.H. Everaars (Kees)

    1998-01-01

    Isolating computation and communication concerns into separate pure computation and pure coordination modules enhances modularity, understandability and reusability of parallel and/or distributed software. MANIFOLD is a pure coordination language that encourages this separation. We use

  14. Event history analysis and the cross-section

    DEFF Research Database (Denmark)

    Keiding, Niels

    2006-01-01

    Examples are given of problems in event history analysis, where several time origins (generating calendar time, age, disease duration, time on study, etc.) are considered simultaneously. The focus is on complex sampling patterns generated around a cross-section. A basic tool is the Lexis diagram....

  15. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    International Nuclear Information System (INIS)

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-01-01

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility

  16. Root Cause Analysis: Learning from Adverse Safety Events.

    Science.gov (United States)

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. © RSNA, 2015.

  17. Physics analysis of the gang partial rod drive event

    International Nuclear Information System (INIS)

    Boman, C.; Frost, R.L.

    1992-08-01

    During the routine positioning of partial-length control rods in Gang 3 on the afternoon of Monday, July 27, 1992, the partial-length rods continued to drive into the reactor even after the operator released the controlling toggle switch. In response to this occurrence, the Safety Analysis and Engineering Services Group (SAEG) requested that the Applied Physics Group (APG) analyze the gang partial rod drive event. Although similar accident scenarios were considered in the analysis for Chapter 15 of the Safety Analysis Report (SAR), APG and SAEG conferred and agreed that this particular type of gang partial-length rod motion event was not included in the SAR. This report details that analysis.

  18. Reusable, tamper-indicating seal

    International Nuclear Information System (INIS)

    Ryan, M.J.

    1978-01-01

    A reusable, tamper-indicating seal is comprised of a drum confined within a fixed body and rotatable in one direction therewithin, the top of the drum constituting a tray carrying a large number of small balls of several different colors. The fixed body contains parallel holes for looping a seal wire therethrough. The base of the drum carries cams adapted to coact with cam followers to lock the wire within the seal at one angular position of the drum. A channel in the fixed body, visible from outside the seal, adjacent the tray constitutes a segregated location for a small plurality of the colored balls. A spring in the tray forces colored balls into the segregated location at one angular position of the drum, further rotation securing the balls in position and the wire in the seal. A wedge-shaped plough removes the balls from the segregated location, at a different angular position of the drum, the wire being unlocked at the same position. A new pattern of colored balls will appear in the segregated location when the seal is relocked

  19. LOSP-initiated event tree analysis for BWR

    International Nuclear Information System (INIS)

    Watanabe, Norio; Kondo, Masaaki; Uno, Kiyotaka; Chigusa, Takeshi; Harami, Taikan

    1989-03-01

    As a preliminary study of 'Japanese Model Plant PSA', a LOSP (loss of off-site power)-initiated Event Tree Analysis for a typical Japanese BWR was carried out solely based on open documents such as the 'Safety Analysis Report'. The objectives of this analysis are as follows; - to delineate core-melt accident sequences initiated by LOSP, - to evaluate the importance of core-melt accident sequences in terms of occurrence frequency, and - to develop a foundation of plant information and analytical procedures for efficiently performing further 'Japanese Model Plant PSA'. This report describes the procedure and results of the LOSP-initiated Event Tree Analysis. In this analysis, two types of event trees, Functional Event Tree and Systemic Event Tree, were developed to delineate core-melt accident sequences and to quantify their frequencies. A Front-line System Event Tree was prepared as well to provide core-melt sequence delineation for the accident progression analysis of Level 2 PSA, which will be performed in the future. Applying U.S. operational experience data such as component failure rates and a LOSP frequency, we obtained the following results; - The total frequency of core-melt accident sequences initiated by LOSP is estimated at 5 x 10^-4 per reactor-year. - The dominant sequences are 'Loss of Decay Heat Removal' and 'Loss of Emergency Electric Power Supply', which account for more than 90% of the total core-melt frequency. In this analysis, a higher value of 0.13/R·Y was used for the LOSP frequency than experienced in Japan, and recovery actions were not considered. In fact, however, there has been no experience of a LOSP event in Japanese nuclear power plants so far and it is also expected that off-site power and/or PCS would be recovered before core melt. Considering Japanese operating experience and recovery factors will reduce the total core-melt frequency to less than 10^-6 per reactor-year. (J.P.N.)
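The quantification described above amounts to multiplying the initiating-event frequency by the split fractions along each branch path and summing the core-melt sequences. A minimal sketch with hypothetical headings and failure probabilities (only the 0.13/R·Y LOSP frequency is taken from the record; the heading probabilities are assumptions for illustration):

```python
# LOSP initiating-event frequency per reactor-year (from the record)
losp_freq = 0.13

# hypothetical event-tree headings: probability of failure on demand
headings = {"emergency_power": 1e-3, "decay_heat_removal": 4e-3}

def sequence_frequency(init_freq, branches, headings):
    """branches: dict heading -> True if the heading failed on this path."""
    f = init_freq
    for name, failed in branches.items():
        p = headings[name]
        f *= p if failed else (1.0 - p)
    return f

# core-melt sequences: loss of emergency power, or power OK but DHR fails
seq1 = sequence_frequency(losp_freq, {"emergency_power": True}, headings)
seq2 = sequence_frequency(losp_freq,
                          {"emergency_power": False, "decay_heat_removal": True},
                          headings)
core_melt = seq1 + seq2
```

Recovery actions, when modeled, would enter as additional headings that multiply each sequence by a non-recovery probability.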

  20. Evaluation of Fourier integral. Spectral analysis of seismic events

    International Nuclear Information System (INIS)

    Chitaru, Cristian; Enescu, Dumitru

    2003-01-01

    Spectral analysis of seismic events represents a method for the prediction of great earthquakes. The seismic signal is not a sinusoidal signal; for this reason, it is necessary to find a method for the best approximation of the real signal with a sinusoidal signal. The 'Quanterra' broadband station allows data access in numerical and/or graphical forms. With the numerical form we can easily make a computer program (MSOFFICE-EXCEL) for spectral analysis. (authors)
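The sinusoidal-approximation step described above is, in practice, a discrete Fourier transform. A minimal numpy sketch on a synthetic record (the 5 Hz component and sampling rate are illustrative assumptions, not values from the record):

```python
import numpy as np

fs = 100.0                      # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)    # 10 s record
# synthetic "seismic" signal: a 5 Hz component plus noise
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 5.0 * t) + 0.2 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(x))          # magnitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1 / fs)  # frequency of each bin
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
```

The same computation is what a spreadsheet FFT produces; the dominant bin identifies the best-fitting sinusoid.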

  1. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  2. Top event prevention analysis - a deterministic use of PRA

    International Nuclear Information System (INIS)

    Blanchard, D.P.; Worrell, R.B.

    1995-01-01

    Risk importance measures are popular for many applications of probabilistic analysis. Inherent in the derivation of risk importance measures are implicit assumptions that those using these numerical results should be aware of in their decision making. These assumptions and potential limitations include the following: (1) The risk importance measures are derived for a single event at a time and are therefore valid only if all other event probabilities are unchanged at their current values. (2) The results for which risk importance measures are derived may not be complete for reasons such as truncation
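The single-event caveat above is easiest to see by computing the measures explicitly: each importance measure perturbs one basic-event probability while holding all others fixed. A minimal cut-set sketch with a hypothetical top event and basic-event probabilities (rare-event approximation):

```python
# Minimal cut sets of a hypothetical top event: {A, B} and {C}
p = {"A": 0.1, "B": 0.2, "C": 0.01}

def top_probability(p):
    # rare-event approximation: sum of minimal-cut-set probabilities
    return p["A"] * p["B"] + p["C"]

base = top_probability(p)

def fussell_vesely(event):
    """Fraction of top-event probability from cut sets containing `event`."""
    reduced = dict(p, **{event: 0.0})
    return (base - top_probability(reduced)) / base

def risk_achievement_worth(event):
    """Factor increase in risk if `event` is assumed failed (p = 1)."""
    increased = dict(p, **{event: 1.0})
    return top_probability(increased) / base

fv_c = fussell_vesely("C")           # valid only with other probabilities fixed
raw_c = risk_achievement_worth("C")
```

Changing two events at once (e.g. A and C together) is not captured by either measure, which is exactly the limitation the record describes.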

  3. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach operates in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
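A minimal illustration of the event-based model the paper targets: the handler receives start/end events in document order and never holds the whole tree in memory. (This uses Python's stdlib xml.sax rather than the Java SAX framework analyzed in the paper.)

```python
import xml.sax

class CountingHandler(xml.sax.ContentHandler):
    """Streams through the document, keeping only running counts."""
    def __init__(self):
        super().__init__()
        self.counts = {}
        self.depth = 0

    def startElement(self, name, attrs):
        # called once per opening tag, in document order
        self.counts[name] = self.counts.get(name, 0) + 1
        self.depth += 1

    def endElement(self, name):
        self.depth -= 1

doc = b"<log><event id='1'/><event id='2'/><note>ok</note></log>"
handler = CountingHandler()
xml.sax.parseString(doc, handler)
```

Statically checking that the *output* such a handler emits is always well-formed (every startElement matched by an endElement) is the problem the paper's analysis addresses.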

  4. Conceptual Design of an APT Reusable Spaceplane

    Science.gov (United States)

    Corpino, S.; Viola, N.

    This paper concerns the conceptual design of an Aerial Propellant Transfer (APT) reusable spaceplane carried out during our PhD course under the supervision of Prof. Chiesa. The new conceptual design methodology employed in order to develop the APT concept and the main characteristics of the spaceplane itself will be presented and discussed. The methodology for conceptual design has been worked out during the last three years. It was originally conceived for atmospheric vehicle design but, thanks to its modular structure which makes it very flexible, it has been possible to convert it to space transportation systems design by adding and/or modifying a few modules. One of the major improvements has been, for example, the conception and development of the mission simulation and trajectory optimisation module. The methodology includes as main characteristics and innovations the latest techniques of geometric modelling and logistic, operational and cost aspects since the first stages of the project. Computer aided design techniques are used to obtain a better definition of the product at the end of the conceptual design phase and virtual reality concepts are employed to visualise three-dimensional installation and operational aspects, at least in part replacing full-scale mock-ups. The introduction of parametric three-dimensional CAD software integrated into the conceptual design methodology represents a great improvement because it allows different layouts to be carried out and assessed immediately. It is also possible to link the CAD system to a digital prototyping software which combines 3D visualisation and assembly analysis, useful to define the so-called Digital Mock-Up at Conceptual Level (DMUCL), which studies the integration between the on-board systems, sized with simulation algorithms, and the airframe. DMUCL represents a very good means to integrate the conceptual design with a methodology turned towards dealing with Reliability, Availability, Maintainability and

  5. Using discriminant analysis as a nucleation event classification method

    Directory of Open Access Journals (Sweden)

    S. Mikkonen

    2006-01-01

    More than three years of measurements of aerosol size-distribution and different gas and meteorological parameters made in the Po Valley, Italy were analysed for this study to examine which of the meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, a non-parametric density estimation method. The best classification result in our data was reached with the combination of relative humidity, ozone concentration and a third degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events but, because of the great number of missing observations, we had to exclude them from the final analysis.
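A toy numpy sketch of the classification idea above: estimate each class's density with an Epanechnikov kernel and assign a day to the class (event / non-event) with the larger estimated density. One feature and synthetic data, purely illustrative of the method rather than the paper's multivariate model:

```python
import numpy as np

def epanechnikov_kde(train, x, bandwidth):
    """Non-parametric density estimate with the Epanechnikov kernel."""
    u = (x[:, None] - train[None, :]) / bandwidth
    k = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u ** 2), 0.0)
    return k.mean(axis=1) / bandwidth

rng = np.random.default_rng(2)
# synthetic 1-D feature: event days tend to have higher values
event_days = rng.normal(5.0, 1.0, 200)
nonevent_days = rng.normal(2.0, 1.0, 200)

def classify(x, bandwidth=0.8):
    # discriminant rule: pick the class with the larger kernel density
    d_event = epanechnikov_kde(event_days, x, bandwidth)
    d_non = epanechnikov_kde(nonevent_days, x, bandwidth)
    return d_event > d_non

labels = classify(np.array([0.5, 5.5]))
```

With equal class priors this is exactly the kernel discriminant rule; unequal priors would weight each density before comparison.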

  6. External events analysis in PSA studies for Czech NPPs

    International Nuclear Information System (INIS)

    Holy, J.; Hustak, S.; Kolar, L.; Jaros, M.; Hladky, M.; Mlady, O.

    2014-01-01

    The purpose of the paper is to summarize the current status of natural external hazards analysis in the PSA projects maintained in the Czech Republic for both Czech NPPs - Dukovany and Temelin. The focus of the presentation is put upon the basic milestones in the external event analysis effort - identification of external hazards important for Czech NPP sites, screening out of the irrelevant hazards, modeling of plant response to the initiating events, including the basic activities regarding vulnerability and fragility analysis (supported with on-site analysis), quantification of accident sequences, interpretation of results and development of measures decreasing external events risk. The following external hazards are discussed in the paper, which have been addressed during the last several years in PSA projects for Czech NPPs: 1) seismicity, 2) extremely low temperature, 3) extremely high temperature, 4) extreme wind, 5) extreme precipitation (water, snow), 6) transport of dangerous substances (as an example of a man-made hazard with some differences identified in comparison with natural hazards), 7) other hazards, which are not considered very important for Czech NPPs and were screened out in the initial phase of the analysis, but are known as potential problem areas abroad. The paper is a result of a coordinated effort with participation of experts and staff from the engineering support organization UJV Rez, a.s. and the NPPs located in the Czech Republic - Dukovany and Temelin. (authors)

  7. Analysis of system and of course of events

    International Nuclear Information System (INIS)

    Hoertner, H.; Kersting, E.J.; Puetter, B.M.

    1986-01-01

    The analysis of the system and of the course of events is used to determine the frequency of core melt-out accidents and to describe the safety-related boundary conditions of appropriate accidents. The lecture is concerned with the effect of system changes in the reference plant and the effect of triggering events not assessed in detail or not sufficiently assessed in detail in phase A of the German Risk Study on the frequency of core melt-out accidents, the minimum requirements for system functions for controlling triggering events, i.e. to prevent core melt-out accidents, the reliability data important for reliability investigations and frequency assessments. (orig./DG) [de

  8. EVNTRE, Code System for Event Progression Analysis for PRA

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions for the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases of time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. 
Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
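The traversal described above can be sketched in a few lines: walk the tree of questions, multiply split fractions along each path, prune pathways whose frequency falls below the cutoff tolerance, and collect the surviving leaves. Question names and split fractions here are hypothetical, not from an actual EVNTRE input deck:

```python
# Each question has branches of (label, split fraction); fractions sum to 1.
questions = [
    [("power_ok", 0.999), ("power_fail", 0.001)],
    [("cooling_ok", 0.995), ("cooling_fail", 0.005)],
]

def enumerate_leaves(questions, init_freq=1.0, cutoff=1e-7):
    """Depth-first enumeration of event-tree paths, pruning below cutoff."""
    leaves = []
    def walk(level, path, freq):
        if freq < cutoff:
            return                      # truncate low-frequency pathways
        if level == len(questions):
            leaves.append((tuple(path), freq))
            return
        for label, frac in questions[level]:
            walk(level + 1, path + [label], freq * frac)
    walk(0, [], init_freq)
    return leaves

leaves = enumerate_leaves(questions)             # all four pathways survive
leaves_trunc = enumerate_leaves(questions, cutoff=1e-5)  # rarest path pruned
total = sum(f for _, f in leaves)                # ≈ 1 when nothing is truncated
```

Path-dependent split fractions and the binning of leaves by Boolean expressions, as EVNTRE supports, would replace the fixed fractions and the plain path tuples here.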

  9. Incident sequence analysis; event trees, methods and graphical symbols

    International Nuclear Information System (INIS)

    1980-11-01

    When analyzing incident sequences, unwanted events resulting from a certain cause are looked for. Graphical symbols and explanations of graphical representations are presented. The method applies to the analysis of incident sequences in all types of facilities. By means of the incident sequence diagram, incident sequences, i.e. the logical and chronological course of repercussions initiated by the failure of a component or by an operating error, can be presented and analyzed simply and clearly

  10. Analysis of operation events for HFETR emergency diesel generator set

    International Nuclear Information System (INIS)

    Li Zhiqiang; Ji Xifang; Deng Hong

    2015-01-01

    By statistical analysis of the historical failure data of the emergency diesel generator set, the specific mode, the attribute, and the direct and root origin for each failure are reviewed and summarized. Considering the current status of the emergency diesel generator set, the preventive measures and solutions in terms of operation, handling and maintenance are proposed, and the potential events for the emergency diesel generator set are analyzed. (authors)

  11. Practical guidance for statistical analysis of operational event data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies

  12. Reusable Boosters in a European-Russian Perspective

    Science.gov (United States)

    Deneu, François; Ramiandrasoa, Fabienne

    2002-01-01

    In 2001, EADS and Khrunichev SRPSC initiated and carried out a working group devoted to the analysis of potential common studies and developments in the field of space activities. This working group came up with several propositions of interest, among which the use of reusable boosters issued from a previous Khrunichev design appeared to be promising when applied to heavy-type launchers. Although the results required confirmation by detailed studies prior to final conclusions, preliminary studies have shown the interest of Ariane 5 configurations using such a reusable booster in view of reducing the specific and launch cost as well as potentially increasing the performance. In November 2001, EADS and Khrunichev SRPSC started a study on an Ariane 5 plus reusable boosters configuration. This study aims at obtaining a better understanding of the advantages and drawbacks attached to such a use. Technical feasibility is analysed in more depth, with all recurring and non-recurring aspects (including launch infrastructure modifications). Programmatic aspects are also addressed in order to better assess potential economic advantages and unavoidable drawbacks. Beyond that, the identification of what could be, for western European and Russian players, an efficient and pay-off industrial organisation is also a study theme of importance. This paper intends to present the main results achieved within this study and the propositions for the future which are likely to provide western Europe and Russia with stronger positions in the competitive field of launch business.

  13. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  14. Delayed reactions to reusable protective gloves.

    Science.gov (United States)

    Pontén, Ann; Dubnika, Inese

    2009-04-01

    The materials in plastic protective gloves are thought to cause less contact allergy than rubber gloves. Our aim was to estimate the frequency of delayed reactions to different types of reusable protective gloves among dermatitis patients. 2 x 2 cm pieces of polyvinyl chloride (PVC) gloves, nitrile gloves, and natural rubber latex (NRL) gloves were tested as is in consecutive dermatitis patients tested with the baseline series. Among 658 patients, 6 patients reacted to PVC gloves and 6 patients to the NRL gloves. None reacted to both these types of gloves. Five of six patients with reactions to rubber gloves reacted to thiuram mix in the baseline series. Delayed reactions to reusable PVC gloves may be as common as to reusable NRL gloves. In contrast to most reactions to the NRL glove, the reactions to the PVC glove had no obvious association with reactions to any allergen(s) in the baseline series.

  15. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis due to the fact that those events generally exert strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales from fine-to-coarse and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)
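The core of the empirical mode decomposition (EMD) step described above is the sifting operation: interpolate envelopes through the local extrema and subtract their mean. A simplified single-IMF sketch (real EMD iterates over all modes with stopping criteria and boundary handling; the test signal is synthetic):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(t, x, n_iter=8):
    """Extract one intrinsic mode function by repeated sifting."""
    h = x.copy()
    for _ in range(n_iter):
        d = np.diff(h)
        maxima = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1
        minima = np.where((d[:-1] < 0) & (d[1:] >= 0))[0] + 1
        if maxima.size < 4 or minima.size < 4:
            break                        # too few extrema to build envelopes
        upper = CubicSpline(t[maxima], h[maxima])(t)
        lower = CubicSpline(t[minima], h[minima])(t)
        h = h - 0.5 * (upper + lower)    # remove the local mean trend
    return h

t = np.linspace(0.0, 1.0, 1000)
# fast oscillation riding on a slow trend (stand-ins for fine vs coarse modes)
x = np.sin(2 * np.pi * 12 * t) + 0.5 * np.sin(2 * np.pi * 2 * t)
imf = sift(t, x)          # fine-scale mode
residual = x - imf        # coarser content left for further decomposition
```

Repeating the sift on the residual yields the full fine-to-coarse mode set plus the average trend that the event analysis inspects.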

  16. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis due to the fact that those events generally exert strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales from fine-to-coarse and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)

  17. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

    Interactions of human error factors in NPP operation events are introduced, and 645 WANO operation event reports from 1999 to 2008 were analyzed, among which 432 were found to be related to human errors. After classifying these errors by their root causes or causal factors, and then applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors. Forming good personnel work practices is systematic work which needs support in many aspects. (2) Verbal communications, personnel work practices, man-machine interface and written procedures and documents play great roles. They are four interacting factors which often come in a bundle. If some improvements need to be made on one of them, synchronous measures are also necessary for the others. (3) Management direction and decision process, which are related to management, have a significant interaction with personnel factors. (authors)

  18. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differences in abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.
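A toy sketch of the descriptor above: bin optical-flow orientations, weighted by flow magnitude, into one histogram per frame, then score a new frame against the histograms of normal frames. For brevity a nearest-neighbor distance stands in for the paper's one-class SVM / kernel PCA classifier, and the flow fields are synthetic:

```python
import numpy as np

def hof(flow, n_bins=8):
    """Histogram of optical-flow orientation, weighted by flow magnitude."""
    angles = np.arctan2(flow[..., 1], flow[..., 0])   # in [-pi, pi]
    mags = np.hypot(flow[..., 0], flow[..., 1])
    hist, _ = np.histogram(angles, bins=n_bins, range=(-np.pi, np.pi),
                           weights=mags)
    return hist / (hist.sum() + 1e-12)

rng = np.random.default_rng(3)

def frame(direction, n=500):
    # synthetic flow field: unit vectors scattered around a dominant direction
    a = direction + 0.2 * rng.standard_normal(n)
    return np.stack([np.cos(a), np.sin(a)], axis=-1)

# learning period: histograms of frames with normal (rightward) motion
normal_hists = np.array([hof(frame(0.0)) for _ in range(20)])

def novelty(flow):
    """Distance to the closest normal-frame histogram."""
    h = hof(flow)
    return np.min(np.linalg.norm(normal_hists - h, axis=1))

score_normal = novelty(frame(0.0))
score_abnormal = novelty(frame(np.pi / 2))   # sudden upward motion
```

Thresholding the novelty score (or feeding the histograms to a one-class SVM, as the paper does) flags the abnormal frames.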

  19. Detection of Abnormal Events via Optical Flow Feature Analysis

    Science.gov (United States)

    Wang, Tian; Snoussi, Hichem

    2015-01-01

In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of optical flow orientation descriptor and a classification method. The details of the histogram of optical flow orientation descriptor, which describes the movement information of the global video frame or the foreground frame, are illustrated. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227

  20. Vulnerability analysis of a PWR to an external event

    International Nuclear Information System (INIS)

    Aruety, S.; Ilberg, D.; Hertz, Y.

    1980-01-01

The vulnerability of a nuclear power plant (NPP) to external events is affected by several factors, such as: the degree of redundancy of the reactor systems, subsystems and components; the separation of systems provided in the general layout; the extent of the vulnerable area, i.e., the area which, if affected by an external event, will result in system failure; and the time required to repair or replace the systems, when allowed. The present study offers a methodology, using Probabilistic Safety Analysis, to evaluate the relative importance of the above parameters in reducing the vulnerability of reactor safety systems. Several safety systems of typical PWRs are analyzed as examples. It was found that the degree of redundancy and physical separation of the systems has the most prominent effect on the vulnerability of the NPP.
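The effect of redundancy combined with physical separation can be illustrated with a toy probabilistic model (not from the paper; the single-parameter damage model is an assumption made for illustration):

```python
def system_failure_prob(p_area_hit, n_trains, separated):
    """Toy model: probability an external event disables all redundant
    trains of a safety system.
    p_area_hit -- chance the event damages any single plant area
    separated  -- True if each train sits in its own area (so every
                  area must be hit), False if all trains share one area."""
    if separated:
        return p_area_hit ** n_trains   # independent hits required
    return p_area_hit                   # one hit disables everything

# Redundancy alone buys nothing if the trains are co-located:
co_located = system_failure_prob(0.1, 3, separated=False)
apart = system_failure_prob(0.1, 3, separated=True)
```

Under these assumed numbers, three separated trains reduce the conditional failure probability from 0.1 to 0.001, which is the qualitative point the study makes about redundancy plus separation.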

  1. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Unlike other tools for analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages with business graphics and statistical functions is convenient in the result presentation phase.
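SIMMEK's own event kernel is not described in detail in the abstract; the following is a compact, hypothetical flow-line sketch of the same idea in Python: jobs arrive stochastically, queue FIFO at each machine on their route, and the model reports how a change (here, a heavier arrival rate) affects flow time before any change is made to the real system.

```python
import random

def simulate(n_jobs=100, route=(0, 1), mean_proc=(1.0, 1.5),
             mean_arrival=2.0, seed=1):
    """Jobs arrive at random and visit the machines in `route` in order
    (FIFO at each machine). Returns (makespan, average flow time)."""
    random.seed(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_jobs):
        t += random.expovariate(1.0 / mean_arrival)
        arrivals.append(t)
    free_at = [0.0] * (max(route) + 1)   # when each machine next falls idle
    finish = []
    for a in arrivals:
        t = a
        for m in route:
            start = max(t, free_at[m])               # wait if machine busy
            t = start + random.expovariate(1.0 / mean_proc[m])
            free_at[m] = t
        finish.append(t)
    makespan = max(finish)
    avg_flow = sum(f - a for f, a in zip(finish, arrivals)) / n_jobs
    return makespan, avg_flow

# Try out a change in loading "before the changes are made":
_, flow_light = simulate(mean_arrival=4.0)   # lightly loaded shop
_, flow_heavy = simulate(mean_arrival=1.0)   # overloaded shop
```

The lateness interaction the abstract mentions is visible here: a job delayed on machine 0 pushes back its start on machine 1, inflating flow time shop-wide under heavy load.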

  2. Reusable Agena study. Volume 2: Technical

    Science.gov (United States)

    Carter, W. K.; Piper, J. E.; Douglass, D. A.; Waller, E. W.; Hopkins, C. V.; Fitzgerald, E. T.; Sagawa, S. S.; Carter, S. A.; Jensen, H. L.

    1974-01-01

    The application of the existing Agena vehicle as a reusable upper stage for the space shuttle is discussed. The primary objective of the study is to define those changes to the Agena required for it to function in the reusable mode in the 100 percent capture of the NASA-DOD mission model. This 100 percent capture is achieved without use of kick motors or stages by simply increasing the Agena propellant load by using optional strap-on-tanks. The required shuttle support equipment, launch and flight operations techniques, development program, and cost package are also defined.

  3. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
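A minimal sketch of the two-part idea (stochastic demand against a constrained resource pool) follows; the distributions, pool sizes, and parameters are all assumptions for illustration, not the authors' model:

```python
import heapq, random

def simulate_pool(n_servers, n_requests=500, mean_arrival=0.1,
                  mean_service=0.5, seed=7):
    """Mean response time of a shared FCFS queue in front of a pool of
    identical servers (a stand-in for a provisioned resource pool)."""
    random.seed(seed)
    free_at = [0.0] * n_servers          # earliest idle time per server
    heapq.heapify(free_at)
    t, total_resp = 0.0, 0.0
    for _ in range(n_requests):
        t += random.expovariate(1.0 / mean_arrival)   # next request arrives
        earliest = heapq.heappop(free_at)             # first server to free up
        start = max(t, earliest)
        done = start + random.expovariate(1.0 / mean_service)
        heapq.heappush(free_at, done)
        total_resp += done - t
    return total_resp / n_requests

# Offered load here is roughly five servers' worth, so a 2-server pool
# saturates while an 8-server pool keeps response times near the bare
# service time -- the provisioning trade-off the simulation quantifies.
lat_under = simulate_pool(2)
lat_ample = simulate_pool(8)
```

In a fuller model, each request type would draw from its own service-time distribution, matching the per-request-type demand characterization described above.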

  4. PRISM reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.

    1991-08-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristic and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. 6 refs., 4 figs

  5. PRISM reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.

    1991-01-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristic and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. (author)

  6. PRISM reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.; Rosztoczy, Z.; Lane, J.

    1991-01-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristics and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. 6 refs., 4 figs

  7. Bisphosphonates and risk of cardiovascular events: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Dae Hyun Kim

Full Text Available Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Absolute risks over 25-36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; OR [95% CI]: 0.98 [0.84-1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92-1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69-1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82-1.19]; I2 = 5.8%), or CV death (14 trials; 0.88 [0.72-1.07]; I2 = 0.0%), with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96-1.61]; I2 = 0.0%), but not for oral bisphosphonates (26 trials; 1.02 [0.83-1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large reduction in fractures with bisphosphonates, changes in
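The Mantel-Haenszel fixed-effect pooled odds ratio used in this meta-analysis has a simple closed form; a minimal sketch (the 2x2 tables below are invented for illustration, not the trial data):

```python
def mantel_haenszel_or(tables):
    """Fixed-effect Mantel-Haenszel pooled odds ratio.
    Each table is (a, b, c, d): events/non-events in the treated arm,
    then events/non-events in the control arm, for one trial.
    OR_MH = sum_i(a_i*d_i/n_i) / sum_i(b_i*c_i/n_i)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two invented trials with identical arms pool to OR = 1 (no effect).
or_null = mantel_haenszel_or([(10, 90, 10, 90), (12, 88, 12, 88)])
```

For a single table the MH estimate reduces to the ordinary odds ratio (ad)/(bc), which makes the weighting easy to sanity-check.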

  8. Learning Objects, Repositories, Sharing and Reusability

    Science.gov (United States)

    Koppi, Tony; Bogle, Lisa; Bogle, Mike

    2005-01-01

    The online Learning Resource Catalogue (LRC) Project has been part of an international consortium for several years and currently includes 25 institutions worldwide. The LRC Project has evolved for several pragmatic reasons into an academic network whereby members can identify and share reusable learning objects as well as collaborate in a number…

  9. Transforming existing content into reusable Learning Objects

    NARCIS (Netherlands)

    Doorten, Monique; Giesbers, Bas; Janssen, José; Daniels, Jan; Koper, Rob

    2003-01-01

    Please cite as: Doorten, M., Giesbers, B., Janssen, J., Daniëls, J, & Koper, E.J.R., (2004). Transforming existing content into reusable learning objects. In R. McGreal, Online Education using Learning Objects (pp. 116-127). London: RoutledgeFalmer.

  10. 14 CFR 437.67 - Tracking a reusable suborbital rocket.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Tracking a reusable suborbital rocket. 437... a reusable suborbital rocket. A permittee must— (a) During permitted flight, measure in real time the position and velocity of its reusable suborbital rocket; and (b) Provide position and velocity...

  11. Analysis of warm convective rain events in Catalonia

    Science.gov (United States)

    Ballart, D.; Figuerola, F.; Aran, M.; Rigo, T.

    2009-09-01

Between the end of September and November, events with high amounts of rainfall are quite common in Catalonia. The high sea surface temperature of the Mediterranean Sea near the Catalan coast is one of the most important factors favouring the development of this type of storm. Some of these events have particular characteristics: elevated rain rates during short time periods, not very deep convection, and low lightning activity. Consequently, remote sensing tools are of quite limited use for surveillance. The high rain efficiency is caused by internal mechanisms of the clouds, and also by the air mass in which the precipitation structure develops. As mentioned above, the contribution of the sea to the air mass is very relevant, not only through the increase of large condensation nuclei, but also through the high temperature of the low layers of the atmosphere, which allows clouds with 5 or 6 km of particles in the liquid phase; in fact, liquid-phase particles can be found in these clouds up to about the -15ºC level. Owing to these characteristics, this type of rain structure can produce high quantities of rainfall in a relatively brief period of time and, if quasi-stationary, precipitation values at the surface can be very important. From the point of view of remote sensing tools, the cloud nature implies that the different tools and methodologies commonly used for the analysis of heavy rain events are not useful. This is caused by the following features: lightning is rarely observed, the cloud top temperatures are not cold enough to be enhanced in the satellite imagery, and, finally, radar reflectivity values are lower than in other heavy rain cases. The third point to take into account is the vulnerability of the affected areas. A high percentage of the Catalan population lives in the coastal region. On the central coast of Catalonia, the urban areas are surrounded by a not very high mountain range with small basins and

  12. Making the Case for Reusable Booster Systems: The Operations Perspective

    Science.gov (United States)

    Zapata, Edgar

    2012-01-01

Presentation to the National Research Council Aeronautics and Space Engineering Board's Reusable Booster System: Review and Assessment Committee. Addresses: the criteria and assumptions used in the formulation of current RBS plans; the methodologies used in the current cost estimates for RBS; the modeling methodology used to frame the business case for an RBS capability, including the data used in the analysis, the models' robustness if new data become available, and the impact of unclassified government data that was previously unavailable and which will be supplied by the USAF; and the technical maturity of key elements critical to RBS implementation and the ability of current technology development plans to meet technical readiness milestones.

  13. Conceptual design of a crewed reusable space transportation system aimed at parabolic flights: stakeholder analysis, mission concept selection, and spacecraft architecture definition

    Science.gov (United States)

    Fusaro, Roberta; Viola, Nicole; Fenoglio, Franco; Santoro, Francesco

    2017-03-01

This paper proposes a methodology to derive architectures and operational concepts for future earth-to-orbit and sub-orbital transportation systems. In particular, at first, it describes the activity flow, methods, and tools leading to the generation of a wide range of alternative solutions to meet the established goal. Subsequently, the methodology allows selecting a small number of feasible options among which the optimal solution can be found. For the sake of clarity, the first part of the paper describes the methodology from a theoretical point of view, while the second part proposes the selection of mission concepts and of a proper transportation system aimed at sub-orbital parabolic flights. Starting from a detailed analysis of the stakeholders and their needs, the major objectives of the mission have been derived. Then, following a system engineering approach, functional analysis tools as well as concept of operations techniques allowed generating a very high number of possible ways to accomplish the envisaged goals. After a preliminary pruning activity, aimed at assessing the feasibility of these concepts, more detailed analyses have been carried out. Going on through the procedure, the designer should move from qualitative to quantitative evaluations, and for this reason, to support the trade-off analysis, an ad-hoc mission simulation software tool has been exploited. This support tool aims at estimating major mission drivers (mass, heat loads, manoeuvrability, earth visibility, and volumetric efficiency) as well as proving the feasibility of the concepts. Other crucial and multi-domain mission drivers, such as complexity, innovation level, and safety, have been evaluated through other appropriate analyses. Finally, a single mission concept has been selected and detailed in terms of layout, systems, and sub-systems, highlighting also logistic, safety, and maintainability aspects.

  14. Analysis of the stability of events occurred in Laguna Verde

    International Nuclear Information System (INIS)

    Castillo D, R.; Ortiz V, J.; Calleros M, G.

    2005-01-01

The new fuel designs for longer operating cycles have larger uncertainty regions than the older fuels, and therefore they can exhibit power oscillations when an event occurs that leaves the reactor operating at high power and low coolant flow. During reactor startup, procedures are followed that prevent oscillations, thereby ensuring stable behavior of the reactor. However, when the reactor is operating at nominal conditions and the recirculation pumps trip or transfer to low speed, it cannot be guaranteed that the reactor will not exhibit power oscillations on entering the restricted operating regions. Stability analysis methods commonly use neutronic noise signals that are required to be stationary, but after a transient in which the recirculation pumps are lost, the signals do not have the required characteristics, so they are used with a certain level of uncertainty owing to the limited validity of the models. In this work the Prony method is used to determine reactor stability from transient signals, and it is compared with autoregressive models. Four events that occurred at the Laguna Verde plant, in which the reactor was in the high-power, low-coolant-flow region, are analyzed, with satisfactory results. (Author)
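The Prony approach amounts to fitting a linear-prediction model to the transient signal, extracting the dominant oscillatory pole, and converting it into a decay ratio. The following numpy sketch is illustrative (not the plant analysis code); the model order and the synthetic test signal are assumptions:

```python
import numpy as np

def decay_ratio(x, order=3):
    """Prony-style stability estimate from a transient signal: fit a
    linear-prediction (AR) model of the given order by least squares,
    take the dominant oscillatory pole z = r*exp(i*theta), and return
    DR = r**(2*pi/theta), the amplitude ratio of successive peaks."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    # Row for sample k predicts x[k] from the `order` preceding samples.
    A = np.column_stack([x[order - j - 1:n - j - 1] for j in range(order)])
    coef, *_ = np.linalg.lstsq(A, x[order:], rcond=None)
    poles = np.roots(np.r_[1.0, -coef])
    osc = [p for p in poles if p.imag > 1e-6]     # keep oscillatory poles
    z = max(osc, key=abs)                         # dominant mode
    r, theta = abs(z), abs(np.angle(z))
    return r ** (2 * np.pi / theta)

# Noiseless damped oscillation built so the true decay ratio is 0.5
# (order 3 leaves one real pole free to absorb any residual DC offset).
theta0 = np.pi / 6                       # 12 samples per oscillation
r0 = 0.5 ** (theta0 / (2 * np.pi))
t = np.arange(120)
dr = decay_ratio(r0 ** t * np.cos(theta0 * t))
```

A decay ratio below 1 indicates damped (stable) oscillations; real transient signals would also carry noise, which is where the comparison with autoregressive models becomes relevant.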

  15. Formal Analysis of BPMN Models Using Event-B

    Science.gov (United States)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.

  16. Root cause analysis for fire events at nuclear power plants

    International Nuclear Information System (INIS)

    1999-09-01

Fire hazard has been identified as a major contributor to a plant's operational safety risk. The international nuclear power community (regulators, operators, designers) has been studying and developing tools for defending against this hazard. Considerable advances have been achieved during the past two decades in design and regulatory requirements for fire safety, fire protection technology, and related analytical techniques. The IAEA endeavours to provide assistance to Member States in improving fire safety in nuclear power plants. A task was launched by the IAEA in 1993 with the purpose of developing guidelines and good practices, promoting advanced fire safety assessment techniques, exchanging state-of-the-art information, and providing engineering safety advisory services and training in the implementation of internationally accepted practices. This TECDOC addresses the systematic assessment of fire events using the root cause analysis methodology, which is recognized as an important element of fire safety assessment.

  17. The Run 2 ATLAS Analysis Event Data Model

    CERN Document Server

    SNYDER, S; The ATLAS collaboration; NOWAK, M; EIFERT, T; BUCKLEY, A; ELSING, M; GILLBERG, D; MOYSE, E; KOENEKE, K; KRASZNAHORKAY, A

    2014-01-01

    During the LHC's first Long Shutdown (LS1) ATLAS set out to establish a new analysis model, based on the experience gained during Run 1. A key component of this is a new Event Data Model (EDM), called the xAOD. This format, which is now in production, provides the following features: A separation of the EDM into interface classes that the user code directly interacts with, and data storage classes that hold the payload data. The user sees an Array of Structs (AoS) interface, while the data is stored in a Struct of Arrays (SoA) format in memory, thus making it possible to efficiently auto-vectorise reconstruction code. A simple way of augmenting and reducing the information saved for different data objects. This makes it possible to easily decorate objects with new properties during data analysis, and to remove properties that the analysis does not need. A persistent file format that can be explored directly with ROOT, either with or without loading any additional libraries. This allows fast interactive naviga...
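The AoS-interface-over-SoA-storage idea can be illustrated outside of C++. The following Python sketch is a toy analogue, not xAOD's actual API; the class and property names are invented. Columnar storage keeps each property contiguous (the layout that lets reconstruction code auto-vectorise), while a lightweight proxy presents the familiar object view, and `decorate` adds a new per-object property at analysis time.

```python
import numpy as np

class JetContainer:
    """Struct-of-Arrays storage: one contiguous array per property,
    which columnar (auto-vectorisable) code can operate on directly."""
    def __init__(self, pt, eta, phi):
        self.pt = np.asarray(pt, float)
        self.eta = np.asarray(eta, float)
        self.phi = np.asarray(phi, float)

    def __len__(self):
        return len(self.pt)

    def __getitem__(self, i):
        return JetProxy(self, i)          # AoS-style view, no copying

    def decorate(self, name, values):
        """Attach a new per-object property during analysis."""
        setattr(self, name, np.asarray(values))

class JetProxy:
    """Array-of-Structs interface: one 'object' backed by column index i."""
    def __init__(self, store, i):
        object.__setattr__(self, "_store", store)
        object.__setattr__(self, "_i", i)

    def __getattr__(self, name):
        return getattr(self._store, name)[self._i]

jets = JetContainer(pt=[50.0, 30.0, 20.0], eta=[0.5, -1.2, 2.1],
                    phi=[0.1, 2.0, -2.5])
leading = jets[0]                           # object-like (AoS) access ...
jets.decorate("is_signal", jets.pt > 25.0)  # ... columnar augmentation
```

Dropping a decoration (or any column) before writing out would correspond to the reduction side: removing properties the analysis does not need.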

  18. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    Science.gov (United States)

    2016-03-24

ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION. AFIT-ENV-MS-16-M-166. Erich W

  19. Authoring Systems Delivering Reusable Learning Objects

    Directory of Open Access Journals (Sweden)

    George Nicola Sammour

    2009-10-01

Full Text Available A three-layer e-learning course development model has been defined, based on a conceptual model of the learning content object. It starts by decomposing the learning content into small chunks, which are initially placed in a hierarchic structure of units and blocks. The raw content components, being the atomic learning objects (ALOs), were linked to the blocks and are structured in the database. We set forward a dynamic generation of LOs using reusable e-learning raw materials, or ALOs. In that view we need an LO authoring/assembling system fitting the requirements of interoperability and reusability and starting from selecting the raw learning content from the learning materials content database. In practice, authoring systems are used to develop e-learning courses. The company EDUWEST has developed an authoring system that is database based and will be SCORM compliant in the near future.

  20. The reusable launch vehicle technology program

    Science.gov (United States)

    Cook, S.

    1995-01-01

Today's launch systems have major shortcomings that will increase in significance in the future, and thus are principal drivers for seeking major improvements in space transportation. They are too costly; insufficiently reliable, safe, and operable; and increasingly losing market share to international competition. For the United States to continue its leadership in the human exploration and wide-ranging utilization of space, the first order of business must be to achieve low cost, reliable transportation to Earth orbit. NASA's Access to Space Study, in 1993, recommended the development of a fully reusable single-stage-to-orbit (SSTO) rocket vehicle as an Agency goal. The goal of the Reusable Launch Vehicle (RLV) technology program is to mature the technologies essential for a next-generation reusable launch system capable of reliably serving National space transportation needs at substantially reduced costs. The primary objectives of the RLV technology program are to (1) mature the technologies required for the next-generation system, (2) demonstrate the capability to achieve low development and operational cost, and rapid launch turnaround times, and (3) reduce business and technical risks to encourage significant private investment in the commercial development and operation of the next-generation system. Developing and demonstrating the technologies required for a Single Stage to Orbit (SSTO) rocket is a focus of the program because past studies indicate that it has the best potential for achieving the lowest space access cost while acting as an RLV technology driver (since it also encompasses the technology requirements of reusable rocket vehicles in general).

  1. The reusable launch vehicle technology program

    Science.gov (United States)

    Cook, S.

Today's launch systems have major shortcomings that will increase in significance in the future, and thus are principal drivers for seeking major improvements in space transportation. They are too costly; insufficiently reliable, safe, and operable; and increasingly losing market share to international competition. For the United States to continue its leadership in the human exploration and wide-ranging utilization of space, the first order of business must be to achieve low cost, reliable transportation to Earth orbit. NASA's Access to Space Study, in 1993, recommended the development of a fully reusable single-stage-to-orbit (SSTO) rocket vehicle as an Agency goal. The goal of the Reusable Launch Vehicle (RLV) technology program is to mature the technologies essential for a next-generation reusable launch system capable of reliably serving National space transportation needs at substantially reduced costs. The primary objectives of the RLV technology program are to (1) mature the technologies required for the next-generation system, (2) demonstrate the capability to achieve low development and operational cost, and rapid launch turnaround times, and (3) reduce business and technical risks to encourage significant private investment in the commercial development and operation of the next-generation system. Developing and demonstrating the technologies required for a Single Stage to Orbit (SSTO) rocket is a focus of the program because past studies indicate that it has the best potential for achieving the lowest space access cost while acting as an RLV technology driver (since it also encompasses the technology requirements of reusable rocket vehicles in general).

  2. Analysis of core damage frequency: Surry, Unit 1 internal events

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

This document contains the accident sequence analysis of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 documents the risk of a selected group of nuclear power plants. The work performed and described here is an extension of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The content and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.05E-5 per year, with a 95% upper bound of 1.34E-4 and a 5% lower bound of 6.8E-6 per year. Station blackout type accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next dominant type of contributor was Loss of Coolant Accidents (LOCAs); these sequences account for 15% of core damage frequency. No other type of sequence accounts for more than 10% of core damage frequency. 49 refs., 52 figs., 70 tabs

  3. Trend analysis of explosion events at overseas nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Hiroki

    2008-01-01

We surveyed failures caused by disasters (e.g., severe storms, heavy rainfall, earthquakes, explosions and fires) which occurred during the 13 years from 1995 to 2007 at overseas nuclear power plants (NPPs), using the nuclear information database of the Institute of Nuclear Safety System, Incorporated (INSS). The results revealed that explosions were the second most frequent type of failure after fires. We conducted a trend analysis of these explosion events. The analysis by equipment, cause, and effect on the plant showed that the explosions occurred mainly at electrical facilities, and thus maintenance management of electrical facilities is essential for preventing explosions. In addition, explosions at transformers and batteries, which have never occurred at Japan's NPPs, accounted for as much as 55% of all explosions. This suggests that the difference is attributable to differences in the maintenance methods for transformers (condition-based maintenance adopted by NPPs) and in the workforce organization for batteries (inspections performed by utilities' own maintenance workers at NPPs). (author)

  4. Reusable fuel test assembly for the FFTF

    International Nuclear Information System (INIS)

    Pitner, A.L.; Dittmer, J.O.

    1992-01-01

    A fuel test assembly that provides re-irradiation capability after interim discharge and reconstitution of the test pin bundle has been developed for use in the Fast Flux Test Facility (FFTF). This test vehicle permits irradiation test data to be obtained at multiple exposures on a few select test pins without the substantial expense of fabricating individual test assemblies as would otherwise be required. A variety of test pin types can be loaded in the reusable test assembly. A reusable test vehicle for irradiation testing in the FFTF has long been desired, but a number of obstacles previously prevented the implementation of such an experimental rig. The MFF-8A test assembly employs a 169-pin bundle using HT-9 alloy for duct and cladding material. The standard driver pins in the fuel bundle are sodium-bonded metal fuel (U-10 wt% Zr). Thirty-seven positions in the bundle are replaceable pin positions. Standard MFF-8A driver pins can be loaded in any test pin location to fill the bundle if necessary. Application of the MFF-8A reusable test assembly in the FFTF constitutes a considerable cost-saving measure with regard to irradiation testing. Only a few well-characterized test pins need be fabricated to conduct a test program rather than constructing entire test assemblies

  5. Autocommander: A Supervisory Controller for Integrated Guidance and Control for the 2nd Generation Reusable Launch Vehicle

    Science.gov (United States)

    Fisher, J. E.; Lawrence, D. A.; Zhu, J. J.; Jackson, Scott (Technical Monitor)

    2002-01-01

    This paper presents a hierarchical architecture for integrated guidance and control that achieves risk and cost reduction for NASA's 2nd generation reusable launch vehicle (RLV). Guidance, attitude control, and control allocation subsystems that heretofore operated independently will now work cooperatively under the coordination of a top-level autocommander. In addition to delivering improved performance from a flight mechanics perspective, the autocommander is intended to provide an autonomous supervisory control capability for traditional mission management under nominal conditions, G&C reconfiguration in response to effector saturation, and abort mode decision-making upon vehicle malfunction. This high-level functionality is to be implemented through the development of a relational database that is populated with the broad range of vehicle- and mission-specific data and translated into a discrete event system model for analysis, simulation, and onboard implementation. A Stateflow Autocoder software tool that translates the database into the Stateflow component of a Matlab/Simulink simulation is also presented.
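The supervisory behavior described above (nominal mission management, G&C reconfiguration on effector saturation, abort on malfunction) is the kind of logic a discrete event system model captures. The following is only a minimal illustrative sketch of such a mode supervisor; the state and event names are invented, not the actual autocommander design:

```python
# Hypothetical supervisory mode machine; states and events are illustrative,
# not taken from the 2nd-generation RLV autocommander paper.
TRANSITIONS = {
    ("NOMINAL", "effector_saturation"): "RECONFIGURE",
    ("NOMINAL", "vehicle_malfunction"): "ABORT",
    ("RECONFIGURE", "saturation_cleared"): "NOMINAL",
    ("RECONFIGURE", "vehicle_malfunction"): "ABORT",
}

def supervise(events, mode="NOMINAL"):
    """Replay a discrete event sequence through the mode machine."""
    trace = [mode]
    for ev in events:
        # Unhandled (mode, event) pairs leave the current mode unchanged.
        mode = TRANSITIONS.get((mode, ev), mode)
        trace.append(mode)
    return trace

print(supervise(["effector_saturation", "saturation_cleared", "vehicle_malfunction"]))
# → ['NOMINAL', 'RECONFIGURE', 'NOMINAL', 'ABORT']
```

In a real design the transition table would be generated from the relational database the paper describes, rather than written by hand.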

  6. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    Science.gov (United States)

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and length of free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication related patient safety events and the models were compared. Well performing NLP models were generated to categorize medication related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
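The model quality above is reported as areas under the receiver operating characteristic curve (ROC AUC). As a reminder of what that metric computes, here is a minimal rank-based AUC (the Mann-Whitney formulation: the probability that a positive example is scored above a negative one); the labels and scores below are made up, not the study's data:

```python
def roc_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic: the fraction of
    positive/negative pairs in which the positive example scores higher
    (ties count as half a win)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Illustrative labels (1 = medication event of a given type) and model scores.
print(roc_auc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.2]))  # → 0.75
```

An AUC of 0.96, as reported for the pharmacy-delay and Pyxis-discrepancy categories, means the model ranks a true case above a non-case 96% of the time.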

  7. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    Directory of Open Access Journals (Sweden)

    Maria Cleofe Giorgino

    2017-11-01

    Full Text Available Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality issue referred to the information being disclosed. This article aims to address this gap, exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, the Integrated Report (IR). The features of this tool are that it aims to represent the multidimensional impact of the organization’s activity and assumes materiality as a guiding principle of the report drafting. Adopting the event study methodology associated with a statistical significance test for categorical data, our results verify that an organization’s release of IR is able to produce a statistically significant impact on the related share prices. Moreover, the term “integrated” assigned to the reports plays a significant role in the impact on capital markets. Our findings have beneficial implications for both researchers and practitioners, adding new evidence for the IR usefulness as a corporate disclosure tool and the effect of an organization’s decision to disclose material information.
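The event study methodology referenced above follows a standard recipe: fit a market model on an estimation window before the event, then cumulate abnormal returns (actual minus model-expected returns) over the event window. A minimal sketch, with invented return series rather than the study's data:

```python
def market_model(stock, market):
    """OLS fit of stock returns on market returns: r_s = a + b * r_m."""
    n = len(market)
    mm, ms = sum(market) / n, sum(stock) / n
    b = (sum((m - mm) * (s - ms) for m, s in zip(market, stock))
         / sum((m - mm) ** 2 for m in market))
    return ms - b * mm, b

def cumulative_abnormal_return(stock, market, a, b):
    """CAR over the event window: sum of (actual - expected) returns."""
    return sum(s - (a + b * m) for s, m in zip(stock, market))

# Estimation window: the stock tracks the market with beta = 0.5 exactly.
est_market = [0.01, -0.02, 0.03, 0.00, -0.01]
est_stock = [0.5 * m for m in est_market]
a, b = market_model(est_stock, est_market)  # a ≈ 0, b ≈ 0.5

# Event window (one day): a +1% abnormal move on the report's release day.
car = cumulative_abnormal_return([0.5 * 0.02 + 0.01], [0.02], a, b)
print(round(car, 4))  # → 0.01
```

A significance test on the cross-section of CARs (the study uses a categorical test) then decides whether the release moved prices beyond market co-movement.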

  8. Pertussis outbreak in Polish shooters with adverse event analysis

    Directory of Open Access Journals (Sweden)

    Monika Skrzypiec-Spring

    2017-04-01

    Full Text Available In addition to different injuries, infections are the most common reason for giving up training altogether or reducing its volume and intensity, as well as a lack of opportunities to participate in sports competitions. Nowadays, a slow but constant re‑emergence of pertussis, especially among teenagers and young adults, including athletes, can be observed. This paper describes an outbreak of pertussis among professional Polish shooters, focusing on the transmission of Bordetella pertussis infection between members of the national team, its influence on performance capacity and adverse event analysis. From 9 June, 2015 to 31 July, 2015, a total of 4 confirmed and suspected cases of pertussis were reported among members of the Polish Sport Shooting National Team, their relatives and acquaintances. Pertussis significantly decreased the exercise performance of the first athlete, a 35-year-old woman, interrupted her training, and finally resulted in failure to win a medal or quota place. Pertussis also significantly decreased the performance of the second athlete, a 25-year-old shooter. The other cases emerged in their families. Whooping cough is a real threat to athletes and should be prevented. Preventive measures include appropriate immunization, constant medical supervision, as well as early isolation, diagnostic tests and treatment of all infected sport team members. Regular administration of booster doses of the acellular pertussis vaccine (Tdap every 5 years seems reasonable.

  9. Analysis of Multi Muon Events in the L3 Detector

    CERN Document Server

    Schmitt, Volker

    2000-01-01

    The muon density distribution in air showers initiated by cosmic particles is sensitive to the chemical composition of cosmic rays. The density can be measured via the multiplicity distribution in a finite size detector, such as L3. With a shallow depth of 30 meters under ground, the detector provides an excellent facility to measure a high muon rate, while being shielded from the hadronic and electronic shower components. Subject of this thesis is the description of the L3 Cosmics experiment (L3+C), which is taking data since May 1999, and the analysis of muon bundles in the large magnetic spectrometer of L3. The new cosmic trigger and readout system is briefly described. The influence of different primaries on the multiplicity distribution has been investigated using Monte Carlo event samples, generated with the CORSIKA program. The simulation results showed that L3+C measures in the region of the "knee" of the primary spectrum of cosmic rays. A new pattern recognition has been developed and added to the reconstruction code, which ...

  10. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    The vaccine is one of the greatest inventions of modern medicine that has contributed most to the relief of human misery and the exciting increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating mankind with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Based on the vaccination worldwide, we finally succeeded in the eradication of smallpox in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999).Although vaccine development and administration are tremendously successful and cost-effective practices to human health, no vaccine is 100% safe for everyone because each person reacts to vaccinations differently given different genetic background and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern.To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  11. Civil protection and Damaging Hydrogeological Events: comparative analysis of the 2000 and 2015 events in Calabria (southern Italy

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2017-11-01

    Full Text Available Calabria (southern Italy is a flood prone region, due to both its rough orography and fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach to perform the comparative analysis of two events affecting the same area at a distance of 15 years, by collecting all the qualitative and quantitative features useful to describe both rain and damage. The aim is to understand if similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, which occurred between 8 and 10 September 2000, damaged 109 out of 409 municipalities of the region and killed 13 people in a campsite due to a flood. The second event, which occurred between 30 October and 1 November 2015, damaged 79 municipalities, and killed a man due to a flood. The comparative analysis highlights that, although the exceptionality of the triggering daily rain was higher in the 2015 event, the damage caused by the 2000 event to both infrastructures and belongings was higher, and was strongly increased by the 13 flood victims. We concluded that, in the 2015 event, the management of pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people in hazardous situations due to landslides or floods, contributed to reducing the number of victims.

  12. Distributed Health Monitoring System for Reusable Liquid Rocket Engines

    Science.gov (United States)

    Lin, C. F.; Figueroa, F.; Politopoulos, T.; Oonk, S.

    2009-01-01

    The ability to correctly detect and identify any possible failure in the systems, subsystems, or sensors within a reusable liquid rocket engine is a major goal at NASA John C. Stennis Space Center (SSC). A health management (HM) system is required to provide an on-ground operation crew with an integrated awareness of the condition of every element of interest by determining anomalies, examining their causes, and making predictive statements. However, the complexity associated with the relevant systems and the large amount of data typically necessary for proper interpretation and analysis present difficulties in implementing complete failure detection, identification, and prognostics (FDI&P). As such, this paper presents a Distributed Health Monitoring System for Reusable Liquid Rocket Engines as a solution to these problems through the use of highly intelligent algorithms for real-time FDI&P, and efficient and embedded processing at multiple levels. The end result is the ability to successfully incorporate a comprehensive HM platform despite the complexity of the systems under consideration.
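At the sensor level, the anomaly detection such a health-monitoring system performs often reduces to testing new readings against a learned baseline. The sketch below is only a generic illustration of that idea (a rolling-window z-score threshold); the window size, threshold, and data are invented and are not SSC's algorithms:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=5, k=3.0):
    """Flag readings more than k sample standard deviations away from the
    mean of the previous `window` readings (a simple residual threshold)."""
    baseline = deque(maxlen=window)
    flags = []
    for x in readings:
        if len(baseline) == window and stdev(baseline) > 0:
            z = abs(x - mean(baseline)) / stdev(baseline)
            flags.append(z > k)
        else:
            flags.append(False)  # not enough history yet
        baseline.append(x)
    return flags

# Steady sensor values, then an injected fault on the last sample.
readings = [10.0, 10.1, 9.9, 10.0, 10.1, 10.0, 25.0]
print(detect_anomalies(readings))  # → [False, False, False, False, False, False, True]
```

Real FDI&P layers model-based residuals and prognostics on top of such elementary checks; this only shows the detection primitive.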

  13. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Dale [Los Alamos National Laboratory; Selby, Neil [AWE Blacknest

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The commonly used standard error in the Ms:mb event screening hypothesis test is not fully consistent with its physical basis. The improved standard error agrees better with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
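Fisher's and Tippett's methods, named above, combine the p-values of independent single-phenomenology tests into one joint screening test. A minimal pure-Python sketch of both (the two p-values below are invented placeholders, not the DPRK analysis):

```python
import math

def fisher_combine(pvals):
    """Fisher's method: X = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under the joint H0."""
    x = -2.0 * sum(math.log(p) for p in pvals)
    k = len(pvals)
    # Chi-square survival function for even df = 2k has a closed form:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = x / 2.0
    sf = math.exp(-half) * sum(half**i / math.factorial(i) for i in range(k))
    return x, sf

def tippett_combine(pvals):
    """Tippett's method: base the joint test on the smallest p-value;
    its combined p-value is 1 - (1 - min_p)^k."""
    pmin = min(pvals)
    return pmin, 1.0 - (1.0 - pmin) ** len(pvals)

# Hypothetical single-phenomenology p-values, e.g. Ms:mb and event depth.
pvals = [0.04, 0.20]
print(fisher_combine(pvals))   # combined p ≈ 0.047
print(tippett_combine(pvals))  # combined p ≈ 0.078
```

Note how Fisher's method lets two individually marginal results reinforce each other, which is exactly the appeal of joint Ms:mb-and-depth screening.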

  14. Analysis of early initiating event(s) in radiation-induced thymic lymphomagenesis

    International Nuclear Information System (INIS)

    Muto, Masahiro; Ying Chen; Kubo, Eiko; Mita, Kazuei

    1996-01-01

    Since the T cell receptor rearrangement is a sequential process and unique to the progeny of each clone, we investigated the early initiating events in radiation-induced thymic lymphomagenesis by comparing the oncogenic alterations with the pattern of γ T cell receptor (TCR) rearrangements. We reported previously that after leukemogenic irradiation, preneoplastic cells developed, albeit infrequently, from thymic leukemia antigen-2+ (TL-2+) thymocytes. Limited numbers of TL-2+ cells from individual irradiated B10.Thy-1.1 mice were injected into B10.Thy-1.2 mice intrathymically, and the common genetic changes among the donor-type T cell lymphomas were investigated with regard to the p53 gene and chromosome aberrations. The results indicated that some mutations in the p53 gene had taken place in these lymphomas, but there was no common mutation among the donor-type lymphomas from individual irradiated mice, suggesting that these mutations were late-occurring events in the process of oncogenesis. On the other hand, there were common chromosome aberrations or translocations such as trisomy 15, t(7F; 10C), t(1A; 13D) or t(6A; XB) among the donor-type lymphomas derived from half of the individual irradiated mice. This indicated that the aberrations/translocations, which occurred in single progenitor cells at the early T cell differentiation either just before or after γ TCR rearrangements, might be important candidates for initiating events. In the donor-type lymphomas from the other half of the individual irradiated mice, microgenetic changes were suggested to be initial events and also might take place in single progenitor cells just before or right after γ TCR rearrangements. (author)

  15. Analysis of thermal fatigue events in light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Okuda, Yasunori [Institute of Nuclear Safety System Inc., Seika, Kyoto (Japan)

    2000-09-01

    Thermal fatigue events, which may cause shutdown of nuclear power stations by wall-through-crack of pipes of RCRB (Reactor Coolant Pressure Boundary), are reported by licensees in foreign countries as well as in Japan. In this paper, thermal fatigue events reported in anomalies reports of light water reactors inside and outside of Japan are investigated. As a result, it is clarified that the thermal fatigue events can be classified in seven patterns by their characteristics, and the trend of the occurrence of the events in PWRs (Pressurized Water Reactors) has stronger co-relation to operation hours than that in BWRs (Boiling Water Reactors). Also, it is concluded that precise identification of locations where thermal fatigue occurs and its monitoring are important to prevent the thermal fatigue events by aging or miss modification. (author)

  16. Reusable single-port access device shortens operative time and reduces operative costs.

    Science.gov (United States)

    Shussman, Noam; Kedar, Asaf; Elazary, Ram; Abu Gazala, Mahmoud; Rivkind, Avraham I; Mintz, Yoav

    2014-06-01

    In recent years, single-port laparoscopy (SPL) has become an attractive approach for performing surgical procedures. The pitfalls of this approach are technical and financial. Financial concerns are due to the increased cost of dedicated devices and prolonged operating room time. Our aim was to calculate the cost of SPL using a reusable port and instruments in order to evaluate the cost difference between this approach, SPL using the available disposable ports, and standard laparoscopy. We performed 22 laparoscopic procedures via the SPL approach using a reusable single-port access system and reusable laparoscopic instruments. These included 17 cholecystectomies and five other procedures. Operative time, postoperative length of stay (LOS) and complications were prospectively recorded and were compared with similar data from our SPL database. Student's t test was used for statistical analysis. SPL was successfully performed in all cases. Mean operative time for cholecystectomy was 72 min (range 40-116). Postoperative LOS was not changed from our standard protocols and was 1.1 days for cholecystectomy. The postoperative course was within normal limits for all patients and perioperative morbidity was recorded. Both operative time and length of hospital stay were shorter for the 17 patients who underwent cholecystectomy using a reusable port than for the matched previous 17 SPL cholecystectomies we performed, with a corresponding cost difference. Operating with a reusable port yielded an average cost savings of US$388 compared with using disposable ports, and US$240 compared with standard laparoscopy. Single-port laparoscopic surgery is a technically challenging and expensive surgical approach. Financial concerns among others have been advocated against this approach; however, we demonstrate herein that using a reusable port and instruments reduces operative time and overall operative costs, even beyond the cost of standard laparoscopy.

  17. Internal event analysis of Laguna Verde Unit 1 Nuclear Power Plant. System Analysis

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the "Internal Event Analysis of Laguna Verde Unit 1 Nuclear Power Plant", CNSNS-TR-004, in five volumes. The reports are organized as follows: CNSNS-TR-004 Volume 1: Introduction and Methodology. CNSNS-TR-004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR-004 Volume 3: System Analysis. CNSNS-TR-004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR-004 Volume 5: Appendices A, B and C. This volume presents the results of the system analysis for the Laguna Verde Unit 1 Nuclear Power Plant. The system analysis involved the development of logical models for all the systems included in the accident sequence event tree headings, and for all the support systems required to operate the front line systems. For the Internal Event Analysis for Laguna Verde, 16 front line systems and 5 support systems were included. Detailed fault trees were developed for most of the important systems. Simplified fault trees focusing on major faults were constructed for those systems that can be adequately represented using this kind of modeling. For those systems where fault tree models were not constructed, actual data were used to represent the dominant failures of the systems. The main failures included in the fault trees are hardware failures, test and maintenance unavailabilities, common cause failures, and human errors. The SETS and TEMAC codes were used to perform the qualitative and quantitative fault tree analyses. (Author)
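The fault trees described above combine basic events (hardware failures, maintenance unavailabilities, human errors) through AND/OR gates down from a top event. A minimal evaluator for such trees, assuming independent basic events; the tree structure and probabilities below are purely illustrative, not the Laguna Verde models:

```python
def prob(node, basic):
    """Evaluate a fault tree given independent basic-event probabilities.
    A node is either a basic-event name or a tuple ("AND"|"OR", [children])."""
    if isinstance(node, str):
        return basic[node]
    gate, children = node
    ps = [prob(c, basic) for c in children]
    if gate == "AND":
        out = 1.0
        for p in ps:
            out *= p          # all children must occur
        return out
    out = 1.0
    for p in ps:
        out *= (1.0 - p)      # OR: complement of "no child occurs"
    return 1.0 - out

# Illustrative top event: the system fails if the pump fails,
# or if both (redundant) power trains fail.
tree = ("OR", ["pump", ("AND", ["train_a", "train_b"])])
basic = {"pump": 1e-3, "train_a": 1e-2, "train_b": 1e-2}
print(prob(tree, basic))  # ≈ 1.1e-3
```

Production codes such as SETS and TEMAC additionally extract minimal cut sets and handle common-cause dependence, which this independence-based sketch deliberately omits.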

  18. Yucca Mountain Feature, Event, and Process (FEP) Analysis

    International Nuclear Information System (INIS)

    Freeze, G.

    2005-01-01

    A Total System Performance Assessment (TSPA) model was developed for the U.S. Department of Energy (DOE) Yucca Mountain Project (YMP) to help demonstrate compliance with applicable postclosure regulatory standards and support the License Application (LA). Two important precursors to the development of the TSPA model were (1) the identification and screening of features, events, and processes (FEPs) that might affect the Yucca Mountain disposal system (i.e., FEP analysis), and (2) the formation of scenarios from screened-in (included) FEPs to be evaluated in the TSPA model (i.e., scenario development). YMP FEP analysis and scenario development followed a five-step process: (1) Identify a comprehensive list of FEPs potentially relevant to the long-term performance of the disposal system. (2) Screen the FEPs using specified criteria to identify those FEPs that should be included in the TSPA analysis and those that can be excluded from the analysis. (3) Form scenarios from the screened-in (included) FEPs. (4) Screen the scenarios using the same criteria applied to the FEPs to identify any scenarios that can be excluded from the TSPA, as appropriate. (5) Specify the implementation of the scenarios in the computational modeling for the TSPA, and document the treatment of included FEPs. This paper describes the FEP analysis approach (Steps 1 and 2) for YMP, with a brief discussion of scenario formation (Step 3). Details of YMP scenario development (Steps 3 and 4) and TSPA modeling (Step 5) are beyond the scope of this paper. The identification and screening of the YMP FEPs was an iterative process based on site-specific information, design, and regulations. The process was iterative in the sense that there were multiple evaluation and feedback steps (e.g., separate preliminary, interim, and final analyses).
The initial YMP FEP list was compiled from an existing international list of FEPs from other radioactive waste disposal programs and was augmented by YMP site- and design

  19. Event-shape analysis: Sequential versus simultaneous multifragment emission

    International Nuclear Information System (INIS)

    Cebra, D.A.; Howden, S.; Karn, J.; Nadasen, A.; Ogilvie, C.A.; Vander Molen, A.; Westfall, G.D.; Wilson, W.K.; Winfield, J.S.; Norbeck, E.

    1990-01-01

    The Michigan State University 4π array has been used to select central-impact-parameter events from the reaction 40Ar + 51V at incident energies from 35 to 85 MeV/nucleon. The event shape in momentum space is an observable which is shown to be sensitive to the dynamics of the fragmentation process. A comparison of the experimental event-shape distribution to sequential- and simultaneous-decay predictions suggests that a transition in the breakup process may have occurred. At 35 MeV/nucleon, a sequential-decay simulation reproduces the data. For the higher energies, the experimental distributions fall between the two contrasting predictions.

  20. Event tree analysis for the system of hybrid reactor

    International Nuclear Information System (INIS)

    Yang Yongwei; Qiu Lijian

    1993-01-01

    The application of probabilistic risk assessment to a fusion-fission hybrid reactor is introduced. A hybrid reactor system has been analysed using event trees. According to the character of the conceptual design of the Hefei Fusion-fission Experimental Hybrid Breeding Reactor, the probabilities of the event tree sequences induced by 4 typical initiating events were calculated. The results showed that the conceptual design is safe and reasonable. Through this paper, the safety character of the hybrid reactor system has been understood more deeply. Some suggestions valuable to the safety design of hybrid reactors have been proposed.
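An event tree of the kind used here enumerates, for each initiating event, every sequence of mitigating-system successes and failures and its probability. A minimal sketch of that enumeration; the system names and failure probabilities are invented, not the Hefei design data:

```python
from itertools import product

def event_tree(init_freq, systems):
    """Enumerate event-tree sequences for one initiating event.
    systems is a list of (name, failure_probability); in each sequence a
    system either succeeds (factor 1 - p) or fails (factor p, marked '*')."""
    seqs = {}
    for outcome in product([True, False], repeat=len(systems)):
        p = init_freq
        labels = []
        for (name, pfail), ok in zip(systems, outcome):
            p *= (1 - pfail) if ok else pfail
            labels.append(name if ok else name + "*")
        seqs["-".join(labels)] = p
    return seqs

# Two hypothetical mitigating systems after an initiating event of frequency 1.
seqs = event_tree(1.0, [("A", 0.1), ("B", 0.2)])
for name, p in seqs.items():
    print(name, round(p, 4))
# A-B 0.72, A-B* 0.18, A*-B 0.08, A*-B* 0.02 (sequence probabilities sum to 1)
```

In a real PRA, branch probabilities come from the fault trees of each heading system, and only the damage-leading sequences are retained for quantification.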

  1. Time to Tenure in Spanish Universities: An Event History Analysis

    Science.gov (United States)

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199
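Event history analysis of the kind used in this study estimates the distribution of time to an event (here, tenure) from durations that may be censored (faculty not yet tenured when observed). A minimal Kaplan-Meier estimator illustrates the core idea; the durations below are made up, not the survey data:

```python
def kaplan_meier(times, observed):
    """Kaplan-Meier estimate of the survival curve S(t).
    observed[i] is 1 if the event (e.g. tenure) occurred at times[i],
    0 if the case was censored (still untenured at last observation)."""
    surv, curve = 1.0, []
    for t in sorted(set(t for t, e in zip(times, observed) if e)):
        at_risk = sum(1 for u in times if u >= t)           # still in the risk set
        events = sum(1 for u, e in zip(times, observed) if u == t and e)
        surv *= 1.0 - events / at_risk
        curve.append((t, surv))
    return curve

# Hypothetical years from PhD to tenure; 0 marks a censored (untenured) case.
times = [2, 3, 3, 5, 8]
observed = [1, 1, 0, 1, 0]
print(kaplan_meier(times, observed))
```

Covariate effects such as productivity, mobility, or field, as analyzed in the paper, would then be modeled with a regression approach (e.g. a Cox proportional hazards model) rather than this nonparametric curve alone.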

  2. Time to tenure in Spanish universities: an event history analysis.

    Science.gov (United States)

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility.

  3. Time to tenure in Spanish universities: an event history analysis.

    Directory of Open Access Journals (Sweden)

    Luis Sanz-Menéndez

    Full Text Available Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility.

  4. Nonstochastic Analysis of Manufacturing Systems Using Timed-Event Graphs

    DEFF Research Database (Denmark)

    Hulgaard, Henrik; Amon, Tod

    1996-01-01

    Using automated methods to analyze the temporal behavior of manufacturing systems has proven to be essential and quite beneficial. Popular methodologies include queueing networks, Markov chains, simulation techniques, and discrete event systems (such as Petri nets). These methodologies are primarily...
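A timed-event graph is a Petri net in which every place has exactly one input and one output transition, which gives it a simple deterministic firing-time semantics: the k-th firing of a transition happens as soon as each incoming arc has delivered its token. A small sketch under that standard semantics; the three-machine production loop below is invented for illustration:

```python
def firing_times(arcs, transitions, n_firings):
    """Timed-event-graph semantics: the k-th firing time of transition t is
    max over incoming arcs (src -> t) of x[src][k - tokens] + delay, where
    `tokens` is the arc's initial marking. arcs maps (src, dst) to
    (delay, initial_tokens). Transitions must be listed so that zero-token
    arcs point forward in the list."""
    x = {t: [] for t in transitions}
    for k in range(n_firings):
        for t in transitions:
            times = [0.0]  # a transition never fires before time 0
            for (src, dst), (delay, tokens) in arcs.items():
                if dst == t and k - tokens >= 0:
                    times.append(x[src][k - tokens] + delay)
            x[t].append(max(times))
    return x

# Hypothetical three-machine loop: one pallet circulates (the token on the
# M3 -> M1 arc); processing delays are 2, 3 and 1 time units.
arcs = {
    ("M1", "M2"): (2.0, 0),
    ("M2", "M3"): (3.0, 0),
    ("M3", "M1"): (1.0, 1),
}
print(firing_times(arcs, ["M1", "M2", "M3"], 2))
```

Successive firings of M1 occur at times 0 and 6: the steady-state cycle time equals the total delay around the loop (2 + 3 + 1) divided by its token count (1), which is the kind of nonstochastic throughput bound such analyses compute.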

  5. Analysis of the Steam Generator Tubes Rupture Initiating Event

    International Nuclear Information System (INIS)

    Trillo, A.; Minguez, E.; Munoz, R.; Melendez, E.; Sanchez-Perea, M.; Izquierd, J.M.

    1998-01-01

    In PSA studies, Event Tree-Fault Tree techniques are used to analyse the consequences associated with the evolution of an initiating event. The Event Tree is built in the sequence identification stage, following the expected behaviour of the plant in a qualitative way. Computer simulation of the sequences is performed mainly to determine the time allowed for operator actions, and does not play a central role in ET validation. The simulation of the sequence evolution can instead be performed using standard tools, helping the analyst obtain a more realistic ET. Long-existing methods and tools can be used to automate the construction of the event tree associated with a given initiator. These methods automatically construct the ET by simulating the plant behaviour following the initiator, allowing some of the systems to fail during the sequence evolution. Then, the sequences with and without the failure are followed. The outcome of all this is a Dynamic Event Tree. The work described here is the application of one such method to the particular case of the SGTR initiating event. The DYLAM scheduler, designed at the Ispra (Italy) JRC of the European Communities, is used to automatically drive the simulation of all the sequences constituting the Event Tree. As in the static Event Tree, each time a system is demanded, two branches are opened: one corresponding to the success and the other to the failure of the system. Both branches are followed by the plant simulator until a new system is demanded, and the process repeats. The plant simulation modelling allows the treatment of degraded sequences that enter the severe accident domain as well as of success sequences in which long-term cooling is started. (Author)

  6. Detecting failure events in buildings: a numerical and experimental analysis

    OpenAIRE

    Heckman, V. M.; Kohler, M. D.; Heaton, T. H.

    2010-01-01

    A numerical method is used to investigate an approach for detecting the brittle fracture of welds associated with beam-column connections in instrumented buildings in real time through the use of time-reversed Green’s functions and wave propagation reciprocity. The approach makes use of a prerecorded catalog of Green’s functions for an instrumented building to detect failure events in the building during a later seismic event by screening continuous data for the presence of wavef...

  7. The analysis of a complex fire event using multispaceborne observations

    Directory of Open Access Journals (Sweden)

    Andrei Simona

    2018-01-01

    Full Text Available This study documents a complex fire event that occurred in October 2016 in a conflict area of the Middle East. Two fire outbreaks were detected by different spaceborne monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using the ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  8. The analysis of a complex fire event using multispaceborne observations

    Science.gov (United States)

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred in October 2016 in a conflict area of the Middle East. Two fire outbreaks were detected by different spaceborne monitoring instruments on board the TERRA, CALIPSO and AURA Earth Observation missions. The link with local weather conditions was examined using the ERA-Interim reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of the fires and a comparison with different observational data.

  9. Trend analysis of cables failure events at nuclear power plants

    International Nuclear Information System (INIS)

    Fushimi, Yasuyuki

    2007-01-01

    In this study, 152 failure events related to cables at overseas nuclear power plants were selected from the Nuclear Information Database, owned by the Institute of Nuclear Safety System, and analyzed with respect to occurrence, causal factor, and so on. In addition, 15 cable failure events at domestic nuclear power plants were selected from the Nuclear Information Archives, owned by JANTI, and analyzed in the same manner. Comparing both trends revealed the following: 1) The cable insulator failure rate is lower at domestic nuclear power plants than at overseas ones; it is thought that deterioration diagnosis is performed broadly in Japan. 2) Buried cable failures have accounted for a significant portion of cable failure events during work activity at overseas plants, whereas none has occurred at domestic plants; it is thought that a sufficient survey is conducted before excavation activity in Japan. 3) The domestic age-related in-service cable failure rate is lower than the overseas one, while the domestic improper-maintenance rate is higher; improvement of maintenance workers' skills is expected in order to reduce improper maintenance. (author)

  10. Preliminary Analysis of the Common Cause Failure Events for Domestic Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Daeil; Han, Sanghoon

    2007-01-01

    It is known that common cause failure (CCF) events have a great effect on the safety and probabilistic safety assessment (PSA) results of nuclear power plants (NPPs). However, domestic studies have mainly focused on the analysis method and modeling of CCF events. Thus, an analysis of CCF events for domestic NPPs was performed to establish a domestic database of CCF events and to deliver them to the operating office of the International Common Cause Failure Data Exchange (ICDE) project. This paper presents the analysis results of the CCF events for domestic nuclear power plants.

  11. Reusable launch vehicle facts and fantasies

    Science.gov (United States)

    Kaplan, Marshall H.

    2002-01-01

    Many people refuse to address many of the realities of reusable launch vehicle systems, technologies, operations and economics. Basic principles of physics, space flight operations, and business limitations are applied to the creation of a practical vision of future expectations. While reusable launcher concepts have been proposed for several decades, serious review of potential designs began in the mid-1990s, when NASA decided that a Space Shuttle replacement had to be pursued. A great deal of excitement and interest was quickly generated by the prospect of "orders-of-magnitude" reduction in launch costs. The potential for a vastly expanded space program motivated the entire space community. By the late 1990s, after over one billion dollars had been spent on technology development and privately funded concepts, it had become clear that there would be no new, near-term operational reusable vehicle. Many factors contributed to a very expensive and disappointing effort to create a new generation of launch vehicles. It began with overly optimistic projections of technology advancements and the belief that a greatly increased demand for satellite launches would be realized early in the 21st century. Contractors contributed to the perception of quickly reachable technology and business goals, thus accelerating the enthusiasm and helping to create a "gold rush" euphoria. Cost, schedule and performance margins were all highly optimistic. Several entrepreneurs launched start-up companies to take advantage of the excitement and the availability of investor capital. Millions were raised from private investors and venture capitalists, based on little more than flashy presentations and animations. Well over $500 million were raised by little-known start-up groups to create reusable systems, which might compete for the coming market in launch services. By 1999, it was clear that market projections, made just two years earlier, were not going to be realized.

  12. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    The analysis and evaluation of initiating events is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of nuclear safety analysis. Currently, initiating event analysis methods and experience are focused on water reactors; there are no established methods or theories for the thorium-based molten salt reactor (TMSR). With TMSR research and development underway in China, initiating event analysis and evaluation is increasingly important. The research can build on PWR analysis theories and methods: based on the TMSR design, theories and methods for its initiating event analysis can be developed. The initiating event lists and analysis methods of Generation II and III PWRs, the high-temperature gas-cooled reactor and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are discussed and developed through logical analysis, and the analysis of TMSR initiating events is studied and described in a preliminary way. This research is important for clarifying event analysis rules, and useful for TMSR design and nuclear safety analysis. (authors)

  13. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Science.gov (United States)

    Fedosimova, Anastasiya; Gaitinov, Adigam; Grushevskaya, Ekaterina; Lebedev, Igor

    2017-06-01

    In this work, the peculiarities of multiparticle production in interactions of asymmetric nuclei are studied in a search for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of an analysis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC) and mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method according to the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: events of LC type are processes of full destruction of the projectile nucleus, in which multicharge fragments are absent, whereas in events of mixed type several multicharge fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions: the mean multiplicity of LC-type events is significantly higher than that of mixed-type events. From the dependence of multiplicity on the number of target-nucleus fragments for events of various types, it is revealed that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e. nuclei strongly asymmetric in volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multicharged fragments.
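
    The Hurst method mentioned above rests on rescaled-range (R/S) analysis. The following is a minimal sketch under stated assumptions: numpy is used, the window lengths are an arbitrary illustrative choice, and the input is synthetic noise rather than pseudorapidity data from the experiment.

    ```python
    import numpy as np

    def hurst_rs(series, window_sizes=(8, 16, 32, 64, 128)):
        """Estimate the Hurst exponent via rescaled-range (R/S) analysis:
        for window length n, E[R/S] ~ c * n**H, so H is the slope of
        log(R/S) versus log(n)."""
        series = np.asarray(series, dtype=float)
        mean_rs = []
        for n in window_sizes:
            # split the series into as many full windows of length n as fit
            chunks = series[: len(series) // n * n].reshape(-1, n)
            # cumulative deviation from each window's mean
            dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
            r = dev.max(axis=1) - dev.min(axis=1)   # range of the deviation
            s = chunks.std(axis=1)                  # standard deviation
            mean_rs.append(np.mean(r / s))
        slope, _ = np.polyfit(np.log(window_sizes), np.log(mean_rs), 1)
        return slope

    rng = np.random.default_rng(0)
    h_noise = hurst_rs(rng.normal(size=4096))  # uncorrelated noise: H near 0.5
    ```

    H ≈ 0.5 indicates uncorrelated fluctuations, H > 0.5 long-range (persistent) correlations, and H < 0.5 short-range (anti-persistent) behaviour, which is the basis on which the event classes above are separated.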

  14. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Directory of Open Access Journals (Sweden)

    Fedosimova Anastasiya

    2017-01-01

    Full Text Available In this work, the peculiarities of multiparticle production in interactions of asymmetric nuclei are studied in a search for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of an analysis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC) and mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method according to the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: events of LC type are processes of full destruction of the projectile nucleus, in which multicharge fragments are absent, whereas in events of mixed type several multicharge fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions: the mean multiplicity of LC-type events is significantly higher than that of mixed-type events. From the dependence of multiplicity on the number of target-nucleus fragments for events of various types, it is revealed that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e. nuclei strongly asymmetric in volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multicharged fragments.

  15. Analysis of water hammer events in nuclear power plants

    International Nuclear Information System (INIS)

    Sato, Masahiro; Yanagi, Chihiro

    1999-01-01

    The water hammer issue in nuclear power plants was one of the unresolved safety issues listed by the United States Nuclear Regulatory Commission and was later regarded as resolved. Nevertheless, water hammer events are still experienced intermittently, although their number is decreasing. We collected water hammer events at PWRs in Japan and the United States together with relevant documents, analyzed them, and studied corrective actions taken by Japanese plants. As a result, it is confirmed that preventive measures in design, operation, etc. have already been taken and that mitigation mechanisms against water hammer have also been considered. However, attention should still be paid to the operation of valves and/or pumps, as the prevention of water hammer continues to rely on operation. (author)

  16. Initiating events in the safety probabilistic analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Stasiulevicius, R.

    1989-01-01

    The importance of initiating events in the probabilistic safety analysis of nuclear power plants is discussed, and the basic procedures for report preparation, quantification and grouping of the events are described. Examples of initiating events with their mean frequencies of occurrence, including those calculated for the OCONEE and Angra-1 reactors, are presented. (E.G.)

  17. Event Sequence Analysis of the Air Intelligence Agency Information Operations Center Flight Operations

    National Research Council Canada - National Science Library

    Larsen, Glen

    1998-01-01

    This report applies Event Sequence Analysis, methodology adapted from aircraft mishap investigation, to an investigation of the performance of the Air Intelligence Agency's Information Operations Center (IOC...

  18. Identification and analysis of external event combinations for Hanhikivi 1 PRA

    Energy Technology Data Exchange (ETDEWEB)

    Helander, Juho [Fennovoima Oy, Helsinki (Finland)

    2017-03-15

    Fennovoima's nuclear power plant, Hanhikivi 1, in Pyhäjoki, Finland, is currently in the design phase; its construction is scheduled to begin in 2018 and electricity production in 2024. The objective of this paper is to produce a preliminary list of safety-significant external event combinations, including preliminary probability estimates, to be used in the probabilistic risk assessment of the Hanhikivi 1 plant. Starting from the list of relevant single events, the relevant event combinations are identified based on seasonal variation, preconditions related to different events, and dependencies (fundamental and cascade type) between events. Using this method yields 30 relevant event combinations of two events for the Hanhikivi site. The preliminary probability of each combination is evaluated, and event combinations with extremely low probability are excluded from further analysis. Event combinations of three or more events are identified by adding possible events to the remaining combinations of two events. Finally, 10 relevant combinations of two events and three relevant combinations of three events remain. The results shall be considered preliminary and will be updated after evaluating the more detailed effects of the different events on plant safety.
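
    A screening estimate for the frequency of a two-event combination can be sketched as below. Everything in it is an illustrative assumption rather than Fennovoima's method: the events are treated as independent, only a simple seasonal-overlap correction is applied, and the event names and numbers are invented. Fundamental or cascade dependencies between events would need explicit conditional probabilities instead.

    ```python
    HOURS_PER_YEAR = 8760.0

    def combination_frequency(f1, f2, duration2_h, seasonal_overlap=1.0):
        """Annual frequency of external event 1 striking while event 2
        (annual frequency f2, each occurrence lasting duration2_h hours)
        is in progress. Assumes independence; if both events can only
        occur in the same fraction `seasonal_overlap` of the year, the
        coincidence frequency rises accordingly."""
        if seasonal_overlap <= 0.0:
            return 0.0  # no common season: combination screened out
        return f1 * f2 * (duration2_h / HOURS_PER_YEAR) / seasonal_overlap

    # Hypothetical pair: high wind (0.1/yr) during a 24 h storm surge
    # (0.05/yr), both confined to the same half of the year
    f_combo = combination_frequency(0.1, 0.05, 24.0, seasonal_overlap=0.5)
    ```

    Combinations whose frequency falls below a screening threshold (e.g. an extremely low value per year) would be excluded from further analysis, mirroring the screening step described in the abstract.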

  19. Adverse events with use of antiepileptic drugs: a prescription and event symmetry analysis

    DEFF Research Database (Denmark)

    Tsiropoulos, Ioannis; Andersen, Morten; Hallas, Jesper

    2009-01-01

    Database (OPED) for the period of 1 August 1990-31 December 2006, and diagnoses from the County Hospital register for the period of 1994-2006 to perform sequence symmetry analysis. The method assesses the distribution of disease entities and prescription of other drugs (ODs), before and after initiation...

  20. Methodology for Evaluating Quality and Reusability of Learning Objects

    Science.gov (United States)

    Kurilovas, Eugenijus; Bireniene, Virginija; Serikoviene, Silvija

    2011-01-01

    The aim of the paper is to present the scientific model and several methods for the expert evaluation of the quality of learning objects (LOs), paying special attention to their level of reusability. The activities of the eQNet Quality Network for a European Learning Resource Exchange (LRE) aimed to improve the reusability of LOs in European Schoolnet's LRE…

  1. Analysis of Paks NPP Personnel Activity during Safety Related Event Sequences

    International Nuclear Information System (INIS)

    Bareith, A.; Hollo, Elod; Karsa, Z.; Nagy, S.

    1998-01-01

    Within the AGNES Project (Advanced Generic and New Evaluation of Safety), the Level-1 PSA model of Paks NPP Unit 3 was developed in the form of a detailed event tree/fault tree structure (53 initiating events, 580 event sequences and 6300 basic events are involved). This model gives a good basis for quantitative evaluation of the potential consequences of actually occurring safety-related events, i.e. for precursor event studies. To make these studies possible and efficient, the current qualitative event analysis practice should be reviewed, and a new, additional quantitative analysis procedure and system should be developed and applied. The present paper gives an overview of the method outlined for both qualitative and quantitative analyses of operator crew activity during off-normal situations. First, the operator performance experienced during past operational events is discussed: sources of raw information, the qualitative evaluation process, the follow-up actions, and the documentation requirements are described. Second, the general concept of the proposed precursor event analysis is described: the types of modeled interactions and the performance influences considered are presented. The quantification of the potential consequences of the identified precursor events is based on the task-oriented Level-1 PSA model of the plant unit. A precursor analysis system covering the evaluation of operator activities is now under development. Preliminary results gained during a case-study evaluation of a past historical event are presented. (authors)

  2. Events in time: Basic analysis of Poisson data

    International Nuclear Information System (INIS)

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.
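
    The point estimate and confidence interval the report describes can be illustrated with a normal-approximation interval for the Poisson rate. This is a sketch only: the report also covers exact and Bayesian intervals, and the event count and exposure below are made-up numbers.

    ```python
    import math

    def poisson_rate_estimate(n_events, exposure, z=1.645):
        """Point estimate and normal-approximation confidence interval
        for a Poisson occurrence rate (n_events observed over `exposure`
        units of time). z = 1.645 gives roughly a 90% two-sided interval."""
        rate = n_events / exposure
        # the variance of a Poisson count equals its mean, so the
        # standard error of the rate is sqrt(n) / exposure
        half_width = z * math.sqrt(n_events) / exposure
        return rate, (max(rate - half_width, 0.0), rate + half_width)

    # Hypothetical: 5 events observed in 1000 component-years
    rate, (lo, hi) = poisson_rate_estimate(5, 1000.0)
    ```

    For small counts the normal approximation is rough; the exact chi-square-based interval the report presents would be preferred there.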

  3. Applications of heavy ion microprobe for single event effects analysis

    International Nuclear Information System (INIS)

    Reed, Robert A.; Vizkelethy, Gyorgy; Pellish, Jonathan A.; Sierawski, Brian; Warren, Kevin M.; Porter, Mark; Wilkinson, Jeff; Marshall, Paul W.; Niu, Guofu; Cressler, John D.; Schrimpf, Ronald D.; Tipton, Alan; Weller, Robert A.

    2007-01-01

    The motion of ionizing-radiation-induced rogue charge carriers in a semiconductor can create unwanted voltage and current conditions within a microelectronic circuit. If sufficient unwanted charge or current occurs on a sensitive node, a variety of single event effects (SEEs) can occur with consequences ranging from trivial to catastrophic. This paper describes the application of heavy ion microprobes to assist with calibration and validation of SEE modeling approaches

  4. Events in time: Basic analysis of Poisson data

    Energy Technology Data Exchange (ETDEWEB)

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the number of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model for when the rate of occurrence varies randomly. Examples and SAS programs are given.

  5. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
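
    The distribution fitting described above can be sketched with simple estimators: mean and standard deviation for the normally distributed metric, and the median with half the interquartile range for the Cauchy-distributed metrics (the IQR of a Cauchy equals exactly twice its scale, and moment estimators diverge for it). The synthetic samples below are illustrative stand-ins, not NREL frequency data.

    ```python
    import numpy as np

    def fit_normal(x):
        """Moment fit of a normal distribution: (mean, std)."""
        x = np.asarray(x, dtype=float)
        return x.mean(), x.std()

    def fit_cauchy(x):
        """Robust Cauchy fit: location = median, scale = IQR / 2.
        (Sample mean/variance are useless for Cauchy data: they
        do not converge.)"""
        q25, q50, q75 = np.percentile(np.asarray(x, dtype=float), [25, 50, 75])
        return q50, (q75 - q25) / 2.0

    rng = np.random.default_rng(1)
    # stand-ins for the metrics: a well-behaved one (like f_(B-C))
    # and a heavy-tailed one (like f_(C-A) or f_B)
    mu, sigma = fit_normal(rng.normal(0.02, 0.005, size=2000))
    loc, scale = fit_cauchy(rng.standard_cauchy(size=2000))
    ```

    Comparing the empirical quantiles of each metric against the fitted distribution (e.g. with a Q-Q plot) is the natural check on which family fits, which is how a metric like the frequency nadir could be flagged as needing a more complex distribution.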

  6. Clinical outcomes and costs of reusable and single-use flexible ureterorenoscopes: a prospective cohort study.

    Science.gov (United States)

    Mager, R; Kurosch, M; Höfner, T; Frees, S; Haferkamp, A; Neisius, A

    2018-01-22

    The purpose of this study is to analyze clinical outcomes and costs of single-use flexible ureterorenoscopes in comparison with reusable flexible ureterorenoscopes in a tertiary referral center. Prospectively, 68 flexible ureterorenoscopies utilizing reusable scopes (Flex-X2S, Flex-XC; Karl Storz) and 68 applying single-use flexible ureterorenoscopes (LithoVue, Boston Scientific) were collected. Clinical outcome parameters such as overall success rate, complication rates according to Clavien-Dindo, operation time and radiation exposure time were measured. Cost analysis was based on purchase costs and recurrent costs for repair and reprocessing divided by the number of procedures. In each group 68 procedures were available for evaluation. In 91% of reusable and 88% of single-use ureterorenoscopies stone disease was treated, with a mean stone burden of 101 ± 226 and 90 ± 244 mm², respectively, and lower pole involvement in 47 and 41% (p > 0.05). Comparing clinical outcomes of reusable vs. single-use instruments revealed no significant difference for overall success rates (81 vs. 87%), stone-free rates (82 vs. 85%), operation time (76.2 ± 46.8 vs. 76.8 ± 40.2 min), radiation exposure time (3.83 ± 3.15 vs. 3.93 ± 4.43 min) and complication rates (7 vs. 17%) (p > 0.05). A wide range of repair and purchase costs resulted in a total of $1212-$1743 per procedure for reusable ureterorenoscopy, whereas the price of single-use ureterorenoscopy was $1300-$3180 per procedure. The current work provided evidence for equal clinical effectiveness of reusable and single-use flexible ureterorenoscopes. Partially overlapping ranges of costs for single-use and reusable scopes stress the importance of precisely knowing the expenses and caseload when negotiating purchase prices, repair prices and warranty conditions.
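
    The cost analysis described (purchase and recurrent repair costs divided by the number of procedures, plus per-case reprocessing) amounts to a simple amortization. All dollar figures below are hypothetical, chosen only so the result lands inside the per-procedure range reported for reusable scopes.

    ```python
    def cost_per_procedure(purchase, repairs_total, reprocessing_per_case, n_cases):
        """Amortized per-procedure cost of a reusable scope: capital and
        cumulative repair costs spread over the caseload, plus the
        per-case reprocessing cost."""
        return (purchase + repairs_total) / n_cases + reprocessing_per_case

    # Hypothetical figures: $50,000 scope, $30,000 in repairs over a
    # caseload of 68 procedures, $150 reprocessing per case
    c_reusable = cost_per_procedure(50_000, 30_000, 150.0, 68)
    ```

    The formula makes the study's negotiating point explicit: per-procedure cost is dominated by caseload, so the same scope can be cheaper or dearer than a single-use device depending on how many procedures share its fixed costs.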

  7. Common-Cause Failure Analysis in Event Assessment

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Kelly, D.L.

    2008-01-01

    This paper reviews the basic concepts of modeling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group
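
    The simplest parameterization consistent with the CCF concepts reviewed here is the beta-factor model, which splits a component's failure probability into an independent part and a common-cause part that fails the whole group at once. The numbers below are illustrative, not values from the paper.

    ```python
    def beta_factor_split(q_total, beta):
        """Beta-factor CCF model: split a component's total failure
        probability q_total into an independent contribution and a
        common-cause contribution that fails every component of the
        common-cause component group simultaneously."""
        q_independent = (1.0 - beta) * q_total
        q_common_cause = beta * q_total
        return q_independent, q_common_cause

    # Hypothetical: demand failure probability 1e-3, beta = 0.05
    q_independent, q_ccf = beta_factor_split(1.0e-3, 0.05)
    ```

    In event assessment, observing one component failed changes these numbers: the analyst must then ask how likely it is that the observed failure is the common-cause part, which is the conditional reasoning this paper formalizes.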

  8. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    Science.gov (United States)

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

    This work analyzes the thermal and mechanical effects arising in a power Diffused Metal Oxide Semiconductor (DMOS) device during a Single Event Burnout (SEB) process. To study these effects we propose a more detailed simulation structure than those previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat generation region with 5 W, 10 W, 50 W and 100 W to emulate the thermal phenomena occurring during SEB processes, avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  9. Fault trees based on past accidents. Factorial analysis of events

    International Nuclear Information System (INIS)

    Vaillant, M.

    1977-01-01

    The fault tree method is already useful as a qualitative step before any reliability calculation. The construction of the tree becomes even simpler when we just want to describe how the events happened. Unlike scenarios, which introduce several possibilities by means of the conjunction OR, here one has only the conjunction AND, which will not be written at all. This method is presented by INRS (1) for the study of industrial injuries; it may also be applied to material damage. (orig.) [de]

  10. Analysis of loss of offsite power events reported in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Volkanovski, Andrija, E-mail: Andrija.VOLKANOVSKI@ec.europa.eu [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Ballesteros Avila, Antonio; Peinador Veira, Miguel [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Kančev, Duško [Kernkraftwerk Goesgen-Daeniken AG, CH-4658 Daeniken (Switzerland); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) gGmbH, Schwertnergasse 1, 50667 Köln (Germany); Stephan, Jean-Luc [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 – 92262 Fontenay-aux-Roses Cedex (France)

    2016-10-15

    Highlights: • Loss of offsite power events were identified in four databases. • Engineering analysis of relevant events was done. • The dominant root cause of LOOP is human failure. • Improved maintenance procedures can decrease the number of LOOP events. - Abstract: This paper presents the results of an analysis of loss of offsite power (LOOP) events in four databases of operational events. The screened databases include the Gesellschaft für Anlagen- und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, the IAEA International Reporting System for Operating Experience (IRS), and the U.S. Licensee Event Reports (LER). In total, 228 relevant loss of offsite power events were identified in the IRSN database, 190 in the GRS database, 120 in the U.S. LER and 52 in the IRS database. Identified events were classified into predefined categories. The results show that the largest percentage of LOOP events was registered during the On Power operational mode and lasted for two minutes or more. Plant-centered events are the main contributor to the LOOP events identified in the IRSN, GRS and IAEA IRS databases; switchyard-centered events are the main contributor among events registered in the NRC LER database. The main types of failed equipment are switchyard failures in IRSN and IAEA IRS, main or secondary lines in NRC LER, and busbar failures in the GRS database. The dominant root causes of the LOOP events are human failures during test, inspection and maintenance, followed by human failures due to insufficient or wrong procedures. The largest number of LOOP events resulted in a reactor trip followed by EDG start. Actions that can reduce the number of LOOP events and minimize their consequences for plant safety are identified and presented.

  11. Analysis of internal events for the Unit 1 of the Laguna Verde nuclear power station

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    This volume presents the results of the initiating event analysis and the event tree analysis for Unit 1 of the Laguna Verde nuclear power station. The initiating event analysis includes the identification of all those internal events which cause a disturbance to the normal operation of the power station and require mitigation; so-called external events remain beyond the scope of this study. For the analysis of the Laguna Verde power station, eight transient categories were identified, together with three categories of loss of coolant accidents (LOCA) inside the containment, a LOCA outside the primary containment, and vessel rupture. The event tree analysis involves the development of the possible accident sequences for each category of initiating events. Systemic event trees were constructed for the different types of LOCA and for all the transients. The event tree for total loss of alternating current, which represents an extension of the event tree for the loss of offsite power transient, was constructed, and the systemic event tree for anticipated transients without scram (ATWS) was also developed. The event trees for the accident sequences include the evaluation of sequences with a vulnerable core, that is, sequences in which adequate core cooling is maintained but the residual heat removal systems have failed. In order to model this adequately, headings were added to the event tree to develop the sequences up to the point where the core state is resolved. This process includes: determination of the failure pressure of the primary containment, evaluation of the environment generated in the reactor building as a result of containment failure or leakage, determination of the location of components in the reactor building, and construction of Boolean expressions to estimate the failure of components subjected to a severe environment. (Author)

  12. Trend analysis of fire events at nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Hiroki

    2007-01-01

    We performed trend analyses to compare fire events occurring overseas (1995-2005) and in Japan (1966-2006). We decided to do this after extracting data on incidents (storms, heavy rain, tsunamis, fires, etc.) occurring at overseas nuclear power plants from the Events Occurred at Overseas Nuclear Power Plants records in the Nuclear Information Database at the Institute of Nuclear Safety System (INSS) and finding that fires were the most common of these incidents. The analyses compared the number of fires occurring domestically and overseas, their causes, and the effects of the fires on the power plants. As a result, we found that electrical fires, caused by such things as overheating and electric arcing, account for over half of the domestic and overseas fires, which indicates that maintenance management of electrical facilities is the most important aspect of fire prevention. Roughly the same number of operational fires occurred at domestic and overseas plants, judging from the figures for annual occurrences per unit. However, the overall number of fires per unit at domestic facilities is one fourth that of overseas facilities. We surmise that, while the management of operations involving fire is comparable at overseas and domestic plants, this disparity results from differences in how facility maintenance is carried out. (author)

  13. Schoolgirls' experience and appraisal of menstrual absorbents in rural Uganda: a cross-sectional evaluation of reusable sanitary pads.

    Science.gov (United States)

    Hennegan, Julie; Dolan, Catherine; Wu, Maryalice; Scott, Linda; Montgomery, Paul

    2016-12-07

    Governments, multinational organisations, and charities have commenced the distribution of sanitary products to address current deficits in girls' menstrual management. The few effectiveness studies conducted have focused on health and education outcomes but have failed to provide quantitative assessment of girls' preferences, experiences of absorbents, and comfort. The objectives of the study were, first, to quantitatively describe girls' experiences with, and ratings of the reliability and acceptability of, different menstrual absorbents. Second, to compare ratings of freely-provided reusable pads (AFRIpads) to other existing methods of menstrual management. Finally, to assess differences in self-reported freedom of activity during menses according to menstrual absorbent. A cross-sectional, secondary analysis of data from the final survey of a controlled trial of reusable sanitary pad and puberty education provision was undertaken. Participants were 205 menstruating schoolgirls from eight schools in rural Uganda. 72 girls who reported using the intervention-provided reusable pads were compared to those using existing improvised methods (predominantly new or old cloth). Schoolgirls using reusable pads provided significantly higher ratings of perceived absorbent reliability across activities, fewer difficulties changing absorbents, and less disgust with cleaning absorbents. There were no significant differences in reports of outside garment soiling (OR 1.00 95%CI 0.51-1.99), or odour (0.84 95%CI 0.40-1.74) during the last menstrual period. When girls were asked if menstruation caused them to miss daily activities there were no differences between those using reusable pads and those using other existing methods. However, when asked about activities avoided during menstruation, those using reusable pads participated less in physical sports, working in the field, fetching water, and cooking. Reusable pads were rated favourably. This translated into some benefits for self

  14. Multivariate Volatility Impulse Response Analysis of GFC News Events

    NARCIS (Netherlands)

    D.E. Allen (David); M.J. McAleer (Michael); R.J. Powell (Robert); A.K. Singh (Abhay)

    2015-01-01

    This paper applies the Hafner and Herwartz (2006) (hereafter HH) approach to the analysis of multivariate GARCH models using volatility impulse response analysis. The data set features ten years of daily returns series for the New York Stock Exchange Index and the FTSE 100 index from the

  15. Multivariate Volatility Impulse Response Analysis of GFC News Events

    NARCIS (Netherlands)

    D.E. Allen (David); M.J. McAleer (Michael); R.J. Powell (Robert)

    2015-01-01

    This paper applies the Hafner and Herwartz (2006) (hereafter HH) approach to the analysis of multivariate GARCH models using volatility impulse response analysis. The data set features ten years of daily returns series for the New York Stock Exchange Index and the

  16. Marginal regression analysis of recurrent events with coarsened censoring times.

    Science.gov (United States)

    Hu, X Joan; Rosychuk, Rhonda J

    2016-12-01

    Motivated by an ongoing pediatric mental health care (PMHC) study, this article presents weakly structured methods for analyzing doubly censored recurrent event data where only coarsened information on censoring is available. The study extracted administrative records of emergency department visits from provincial health administrative databases. The available information of each individual subject is limited to a subject-specific time window determined up to concealed data. To evaluate time-dependent effect of exposures, we adapt the local linear estimation with right censored survival times under the Cox regression model with time-varying coefficients (cf. Cai and Sun, Scandinavian Journal of Statistics 2003, 30, 93-111). We establish the pointwise consistency and asymptotic normality of the regression parameter estimator, and examine its performance by simulation. The PMHC study illustrates the proposed approach throughout the article. © 2016, The International Biometric Society.

  17. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    Science.gov (United States)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
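The core idea in this record, representing a parameter trend compactly by its wavelet coefficients, can be sketched in a few lines of Python. The Haar transform below and every name in it are illustrative only, not the paper's actual schema:

```python
# Illustrative sketch: describe a parameter trend by coarse Haar wavelet
# coefficients so trend queries can match a compact descriptor instead of
# the raw series. Names and data are invented, not from the paper.

def haar_level(signal):
    """One level of the Haar transform: (approximation, detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def haar_descriptor(trend, levels=3):
    """Compact descriptor: coarse approximation plus per-level detail energy."""
    energies = []
    approx = list(trend)
    for _ in range(levels):
        approx, detail = haar_level(approx)
        energies.append(sum(d * d for d in detail))
    return approx, energies

# A step-like blood-pressure trend: the detail energies localize the change.
trend = [80.0] * 7 + [120.0] * 9
coarse, energies = haar_descriptor(trend)
print(coarse)    # [85.0, 120.0] -- 2 coarse samples summarize 16 points
print(energies)  # [400.0, 100.0, 25.0]
```

Storing only the coarse samples and per-level detail energies gives a fixed-size row that a relational query can filter on without touching the raw series.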

  18. Use of PSA for the analysis of operational events in nuclear power plants

    International Nuclear Information System (INIS)

    Hulsmans, M.

    2006-01-01

    An operational event is a safety-relevant incident that occurred in an industrial installation like a nuclear power plant (NPP). The probabilistic approach to event analysis focuses on the potential consequences of an operational event. Within its scope of application, it provides a quantitative assessment of the risk significance of this event (and similar events): it calculates the risk increase induced by the event. Such analyses may result in a more objective and a more accurate event severity measure than those provided by commonly used qualitative methods. Probabilistic event analysis complements the traditional event analysis approaches that are oriented towards the understanding of the (root) causes of an event. In practice, risk-based precursor analysis consists of the mapping of an operational event on a risk model of the installation, such as a probabilistic safety analysis (PSA) model. Precursor analyses result in an objective risk ranking of safety-significant events, called accident precursors. An unexpectedly high (or low) risk increase value is in itself already an important finding. This assessment also yields a lot of information on the structure of the risk, since the underlying dominant factors can easily be determined. Relevant 'what if' studies on similar events and conditions can be identified and performed (which is generally not considered in conventional event analysis), with the potential to yield even broader findings. The findings of such a structured assessment can be used for other purposes than merely risk ranking. The operational experience feedback process can be improved by helping to identify design measures and operational practices in order to prevent re-occurrence or in order to mitigate future consequences, and even to evaluate their expected effectiveness, contributing to the validation and prioritization of corrective measures. 
Confirmed and re-occurring precursors with correlated characteristics may point out opportunities
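As a back-of-envelope illustration of the risk-significance computation described above, mapping an event onto the PSA model and reporting the induced risk increase (all probability values invented):

```python
# Hypothetical numbers only: precursor analysis re-evaluates the plant PSA
# model with the event's degraded conditions imposed, and reports the
# resulting increase in core damage probability (CDP).
def risk_increase(baseline_cdp, conditional_cdp):
    """Risk significance of an operational event."""
    return conditional_cdp - baseline_cdp

baseline = 1.0e-6    # nominal core damage probability from the PSA model
with_event = 4.0e-5  # same model with the affected safety train unavailable
print(risk_increase(baseline, with_event))
```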

  19. Investigation and analysis of hydrogen ignition and explosion events in foreign nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okuda, Yasunori [Institute of Nuclear Safety System, Inc., Mihama, Fukui (Japan)

    2002-09-01

    Reports about hydrogen ignition and explosion events in foreign nuclear power plants from 1980 to 2001 were investigated, and 31 events were identified. Analysis showed that they were categorized in (1) outer leakage ignition events and (2) inner accumulation ignition events. The dominant event for PWR (pressurized water reactor) was outer leakage ignition in the main generator, and in BWR (boiling water reactor) it was inner accumulation ignition in the off-gas system. The outer leakage ignition was a result of work process failure with the ignition source, operator error, or main generator hydrogen leakage. The inner accumulation ignition events were caused by equipment failure or insufficient monitoring. With careful preventive measures, the factors leading to these events could be eliminated. (author)

  20. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (www) has become a prominent and huge source of user-generated content and opinionated data. Among the various social media, Twitter gained popularity as it offers a fast and effective way of sharing users' perspectives on critical and other issues in different domains, such as the 'Political', 'Entertainment' and 'Business' domains. As this data is generated at huge scale in the cloud, it has opened doors for researchers in the field of data science and analysis. Twitter also provides various APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collecting user profiles, friends and followers; 3) the Streaming API, which collects details like tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamic ongoing event. For this we focus on the 'Entertainment' domain, especially 'Sports', as the IPL T20 was the trending ongoing event at the time. We collect this large volume of tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term-frequency analysis using techniques such as filtering and information extraction for text mining, fulfilling our objectives of finding interesting moments in the temporal data of the event and ranking players or teams by popularity, which helps people understand key influencers on the social media platform.
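The term-frequency step the authors describe can be sketched with a plain counter over tweet documents of the kind stored in MongoDB; the sample tweets, stop-word list, and function name below are invented:

```python
# Illustrative sketch: term-frequency analysis over JSON tweet documents,
# with light filtering of stop words and @-mentions before counting.
import json
from collections import Counter

STOP = {"the", "a", "is", "in", "rt", "to"}

tweets = [
    json.loads(t) for t in (
        '{"text": "RT what a catch in the IPL final"}',
        '{"text": "the IPL final is a thriller"}',
    )
]

def term_frequencies(docs):
    counts = Counter()
    for doc in docs:
        for token in doc["text"].lower().split():
            if token not in STOP and not token.startswith("@"):
                counts[token] += 1
    return counts

freqs = term_frequencies(tweets)
print(freqs.most_common(2))  # [('ipl', 2), ('final', 2)]
```

The same counter, run per time bucket, gives the time-series view: a spike in a term's frequency marks an interesting moment in the event.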

  1. [Analysis on the adverse events of cupping therapy in the application].

    Science.gov (United States)

    Zhou, Xin; Ruan, Jing-wen; Xing, Bing-feng

    2014-10-01

    A deep analysis was performed on cases of adverse events and common injuries of cupping therapy encountered in recent years, in terms of manipulation and the patient's constitution. Adverse events of cupping therapy are commonly caused by improper manipulation by medical practitioners and by ignoring contraindications and the patient's constitution. Clinical practitioners should use cupping therapy cautiously, strictly follow the rules of standard manipulation and the core medical system, pay attention to contraindications, and take strict precautions against the occurrence of adverse events.

  2. Analysis of hypoglycemic events using negative binomial models.

    Science.gov (United States)

    Luo, Junxiang; Qu, Yongming

    2013-01-01

    Negative binomial regression is a standard model to analyze hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, in which the negative binomial regression is not robust because of its requirement for strong model assumptions. In some literature, it was suggested to correct the standard error of the maximum likelihood estimator through introducing overdispersion, which can be estimated by the Deviance or Pearson Chi-square. We proposed to conduct the negative binomial regression using Sandwich estimation to calculate the covariance matrix of the parameter estimates together with Pearson overdispersion correction (denoted by NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification, and the estimation efficiency will be improved by adjusting for baseline hypoglycemia. Copyright © 2013 John Wiley & Sons, Ltd.
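A minimal sketch of why the negative binomial model suits hypoglycemic event counts: gamma heterogeneity in patient-level rates produces counts whose variance exceeds their mean, the overdispersion the paper corrects for. All parameter values are illustrative:

```python
# Simulate negative binomial counts as a gamma-mixed Poisson and show
# overdispersion (variance > mean), which a plain Poisson cannot capture.
# Parameter values are invented for illustration.
import math
import random
import statistics

random.seed(7)

def poisson(lam):
    """Knuth's inversion sampler for one Poisson draw."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

def nb_counts(n, mean_rate=4.0, shape=1.5):
    """Negative binomial counts: each subject gets a gamma-distributed rate."""
    return [poisson(random.gammavariate(shape, mean_rate / shape))
            for _ in range(n)]

counts = nb_counts(2000)
m, v = statistics.mean(counts), statistics.variance(counts)
print(round(m, 2), round(v, 2))  # variance far exceeds the mean
```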

  3. Analysis and design of VEK for extreme events - a challenge

    International Nuclear Information System (INIS)

    Woelfel, H.P.; Technische Univ. Darmstadt

    2006-01-01

    For the analysis and design of the VEK building, especially for design against earthquake and airplane crash, a 3D integral model was developed, capable of yielding all global response quantities (displacements, accelerations, sectional forces, response spectra, global reinforcement) for any load action from a single mathematical model. For airplane crash in particular, a so-called dynamic design yields reinforcement quantities at every time step and thus leads to a realistic and economical design. The advantages of the integral model were carried over to the design of the processing installation, where the structural analysis of the steel structures, vessels and piping was handled in one integral mathematical model. (orig.)

  4. An analysis of fog events at Belgrade International Airport

    Science.gov (United States)

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.
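The complex fog criteria quoted at the end of the abstract can be encoded directly as a predicate. The thresholds for humidity, dew-point depression, and cloud base come from the abstract; the function name and the numeric cutoff for "calm or weak" wind are assumptions:

```python
# Illustrative encoding of the abstract's fog criteria: the combination
# below was associated with a 51.2% probability of fog occurrence.
def meets_fog_criteria(rh_pct, dewpoint_depression_c, cloud_base_m, wind_ms):
    return (rh_pct > 97.0
            and dewpoint_depression_c == 0.0
            and cloud_base_m < 50.0
            and wind_ms <= 2.0)  # "calm or weak" wind, assumed <= 2 m/s

print(meets_fog_criteria(98.0, 0.0, 30.0, 1.0))   # True
print(meets_fog_criteria(95.0, 1.0, 200.0, 5.0))  # False
```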

  5. Survival analysis using S analysis of time-to-event data

    CERN Document Server

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...

  6. Organization of pulse-height analysis programs for high event rates

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, C E [Argonne National Lab., Ill. (USA)

    1976-09-01

    The ability of a pulse-height analysis program to handle high event rates can be enhanced by organizing it so as to minimize the time spent in interrupt housekeeping. Specifically, the routine that services the data-ready interrupt from the ADC should test whether another event is ready before performing the interrupt return.
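A toy simulation of the optimization the abstract describes, with invented cost figures, makes the saving concrete: draining all ready events before the interrupt return pays the interrupt housekeeping cost once instead of once per event.

```python
# Hypothetical cost model (numbers invented): compare servicing one ADC
# event per interrupt against testing for further ready events before the
# interrupt return and draining them in the same entry.
from collections import deque

HOUSEKEEPING_COST = 10  # cost units per interrupt entry/exit
SERVICE_COST = 1        # cost units to read and histogram one event

def run(pending, drain):
    """Return total cost to service the queued ADC events."""
    queue = deque(pending)
    cost = 0
    while queue:
        cost += HOUSEKEEPING_COST      # interrupt entry + return
        queue.popleft()
        cost += SERVICE_COST
        if drain:
            # Test whether another event is ready before returning.
            while queue:
                queue.popleft()
                cost += SERVICE_COST
    return cost

events = list(range(8))
print(run(events, drain=False))  # 8 * (10 + 1) = 88
print(run(events, drain=True))   # 10 + 8 * 1  = 18
```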

  7. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes...

  8. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses \\emph{compositionality} and \\emph{dependency analysis} to significantly improve the efficiency of symbolic model checking of state/event models...

  9. Silica sulfuric acid: a versatile and reusable heterogeneous catalyst ...

    African Journals Online (AJOL)

    ... and reusable heterogeneous catalyst for the synthesis of N-acyl carbamates and ... All the reactions were done at room temperature and the N-acyl carbamates ... This method is attractive and is in a close agreement with green chemistry.

  10. Developing Reusable and Reconfigurable Real-Time Software using Aspects and Components

    OpenAIRE

    Tešanović, Aleksandra

    2006-01-01

    Our main focus in this thesis is on providing guidelines, methods, and tools for design, configuration, and analysis of configurable and reusable real-time software, developed using a combination of aspect-oriented and component-based software development. Specifically, we define a reconfigurable real-time component model (RTCOM) that describes how a real-time component, supporting aspects and enforcing information hiding, could efficiently be designed and implemented. In this context, we out...

  11. Event Reconstruction and Analysis in the R3BRoot Framework

    International Nuclear Information System (INIS)

    Kresan, Dmytro; Al-Turany, Mohammad; Bertini, Denis; Karabowicz, Radoslaw; Manafov, Anar; Rybalchenko, Alexey; Uhlig, Florian

    2014-01-01

    The R3B experiment (Reaction studies with Relativistic Radioactive Beams) will be built within the future FAIR/GSI (Facility for Antiproton and Ion Research) in Darmstadt, Germany. The international R3B collaboration has a scientific program devoted to the physics of stable and radioactive beams at energies between 150 MeV and 1.5 GeV per nucleon. In preparation for the experiment, the R3BRoot software framework is under development; it delivers detector simulation, reconstruction and data analysis. The basic functionalities of the framework are handled by the FairRoot framework, which is also used by the other FAIR experiments (CBM, PANDA, ASYEOS, etc.), while the R3B detector specifics and reconstruction code are implemented inside R3BRoot. In this contribution, first results of data analysis from the detector prototype test in November 2012 will be reported; moreover, a comparison of the tracker performance against experimental data will be presented

  12. Offline analysis of HEP events by ''dynamic perceptron'' neural network

    International Nuclear Information System (INIS)

    Perrone, A.L.; Basti, G.; Messi, R.; Pasqualucci, E.; Paoluzi, L.

    1997-01-01

    In this paper we start from a critical analysis of the fundamental problems of parallel calculus on linear structures and of their extension to the partial solutions obtained with non-linear architectures. We then briefly present a new dynamic architecture able to overcome the limitations of the previous architectures through an automatic re-definition of the topology. This architecture is applied to the real-time recognition of particle tracks in high-energy accelerators. (orig.)

  13. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    OpenAIRE

    Maria Cleofe Giorgino; Enrico Supino; Federico Barnabè

    2017-01-01

    Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality of the information being disclosed. This article aims to address this gap, exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, the Integrated Report (IR). The features of this tool are that it aims to represent the multidimensional imp...

  14. Analysis of spectral data with rare events statistics

    International Nuclear Information System (INIS)

    Ilyushchenko, V.I.; Chernov, N.I.

    1990-01-01

    We consider the case of analyzing experimental data when the results of individual experimental runs cannot be summed due to large systematic errors. A statistical analysis of the hypothesis of persistent peaks in the spectra has been performed by means of the Neyman-Pearson test. The computations demonstrate that the confidence level for the hypothesis of a persistent peak in the spectrum is proportional to the square root of the number of independent experimental runs, K. 5 refs
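The square-root-of-K scaling stated in the abstract is what Stouffer-style combination of independent runs yields; a minimal illustration (z-scores invented):

```python
# If K independent runs each see the same small peak with per-run z-score
# z0, the combined (Stouffer) z-score is z0 * sqrt(K), so the confidence
# level grows with the square root of the number of runs.
import math

def stouffer_z(z_scores):
    """Combine independent z-scores into one overall z-score."""
    return sum(z_scores) / math.sqrt(len(z_scores))

z0 = 0.8  # per-run significance of a persistent peak (illustrative)
for k in (1, 4, 16):
    print(k, stouffer_z([z0] * k))  # grows like z0 * sqrt(k)
```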

  15. Analysis of 16 plasma vortex events in the geomagnetic tail

    International Nuclear Information System (INIS)

    Birn, J.; Hones, E.W. Jr.; Bame, S.J.; Russel, C.T.

    1985-01-01

    The analysis of 16 plasma vortex occurrences in the magnetotail plasma sheet of Hones et al. (1983) is extended. We used two- and three-dimensional plasma measurements and three-dimensional magnetic field measurements to study phase relations, energy propagation, and polarization properties. The results point toward an interpretation as a slow strongly damped MHD eigenmode which is generated by tailward traveling perturbations at the low-latitude interface between plasma sheet and magnetosheath

  16. Ultimate design load analysis of planetary gearbox bearings under extreme events

    DEFF Research Database (Denmark)

    Gallego Calderon, Juan Felipe; Natarajan, Anand; Cutululis, Nicolaos Antonio

    2017-01-01

    This paper investigates the impact of extreme events on the planet bearings of a 5 MW gearbox. The system is simulated using an aeroelastic tool, where the turbine structure is modeled, and MATLAB/Simulink, where the drivetrain (gearbox and generator) is modeled using a lumped-parameter approach. Three extreme events are assessed: low-voltage ride through, emergency stop and normal stop. The analysis is focused on finding which event has the most negative impact on the bearing extreme radial loads. The two latter events are carried out following the guidelines of the International

  17. Utilizing Provenance in Reusable Research Objects

    Directory of Open Access Journals (Sweden)

    Zhihao Yuan

    2018-03-01

    Science is conducted collaboratively, often requiring the sharing of knowledge about computational experiments. When experiments include only datasets, they can be shared using Uniform Resource Identifiers (URIs) or Digital Object Identifiers (DOIs). An experiment, however, seldom includes only datasets, but more often includes software, its past execution, provenance, and associated documentation. The Research Object has recently emerged as a comprehensive and systematic method for aggregation and identification of diverse elements of computational experiments. While a necessary method, mere aggregation is not sufficient for the sharing of computational experiments. Other users must be able to easily recompute on these shared research objects. Computational provenance is often the key to enable such reuse. In this paper, we show how reusable research objects can utilize provenance to correctly repeat a previous reference execution, to construct a subset of a research object for partial reuse, and to reuse existing contents of a research object for modified reuse. We describe two methods to summarize provenance that aid in understanding the contents and past executions of a research object. The first method obtains a process-view by collapsing low-level system information, and the second method obtains a summary graph by grouping related nodes and edges with the goal to obtain a graph view similar to application workflow. Through detailed experiments, we show the efficacy and efficiency of our algorithms.
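The first summarization method, collapsing low-level system information into a process view, can be caricatured in a few lines; the event format and labels here are invented, not the paper's provenance model:

```python
# Toy process-view summarization: from low-level file read/write provenance
# events, keep only process-to-process dataflow edges (a process that reads
# what another wrote depends on it). Events and names are illustrative.
events = [
    ("wget",   "write", "/tmp/data.csv"),
    ("python", "read",  "/tmp/data.csv"),
    ("python", "write", "result.txt"),
]

def process_view(events):
    writers = {}      # artifact path -> first process that wrote it
    edges = set()     # collapsed process-to-process dependencies
    for proc, op, path in events:
        if op == "write":
            writers.setdefault(path, proc)
        elif op == "read" and path in writers and writers[path] != proc:
            edges.add((writers[path], proc))
    return sorted(edges)

print(process_view(events))  # [('wget', 'python')]
```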

  18. ALFA detector, Background removal and analysis for elastic events

    CERN Document Server

    Belaloui, Nazim

    2017-01-01

    I worked on the ALFA project, which aims to measure the total cross section in pp collisions as a function of t, the momentum transfer, by measuring the scattering angle of the protons. This measurement is done for all available energies, so far 7, 8 and 13 TeV. There are many analysis steps, and we have focused on enhancing the signal-to-noise ratio. First of all I became more familiar with ROOT and worked on understanding the code used to access the data and on plotting histograms, and then on cutting off background.

  19. Viability of a Reusable In-Space Transportation System

    Science.gov (United States)

    Jefferies, Sharon A.; McCleskey, Carey M.; Nufer, Brian M.; Lepsch, Roger A.; Merrill, Raymond G.; North, David D.; Martin, John G.; Komar, David R.

    2015-01-01

    The National Aeronautics and Space Administration (NASA) is currently developing options for an Evolvable Mars Campaign (EMC) that expands human presence from Low Earth Orbit (LEO) into the solar system and to the surface of Mars. The Hybrid in-space transportation architecture is one option being investigated within the EMC. The architecture enables return of the entire in-space propulsion stage and habitat to cis-lunar space after a round trip to Mars. This concept of operations opens the door for a fully reusable Mars transportation system from cis-lunar space to a Mars parking orbit and back. This paper explores the reuse of in-space transportation systems, with a focus on the propulsion systems. It begins by examining why reusability should be pursued and defines reusability in space-flight context. A range of functions and enablers associated with preparing a system for reuse are identified and a vision for reusability is proposed that can be advanced and implemented as new capabilities are developed. Following this, past reusable spacecraft and servicing capabilities, as well as those currently in development are discussed. Using the Hybrid transportation architecture as an example, an assessment of the degree of reusability that can be incorporated into the architecture with current capabilities is provided and areas for development are identified that will enable greater levels of reuse in the future. Implications and implementation challenges specific to the architecture are also presented.

  20. Integration of risk matrix and event tree analysis: a natural stone ...

    Indian Academy of Sciences (India)

    M Kemal Özfirat

    2017-09-27

    Sep 27, 2017 ... Different types of accidents may occur in natural stone facilities during movement, dimensioning, cutting ... are numerous risk analysis methods such as preliminary ... machine type and maintenance (MM) event, block control.

  1. Political Shocks and Abnormal Returns During the Taiwan Crisis: An Event Study Analysis

    National Research Council Canada - National Science Library

    Steeves, Geoffrey

    2002-01-01

    .... Focusing on the 1996 Taiwan Crisis, by means of event study analysis, this paper attempts to determine the extent to which this political shock affected the Taiwanese, and surrounding Japanese stock markets...

  2. Mixed Methods Analysis of Medical Error Event Reports: A Report from the ASIPS Collaborative

    National Research Council Canada - National Science Library

    Harris, Daniel M; Westfall, John M; Fernald, Douglas H; Duclos, Christine W; West, David R; Niebauer, Linda; Marr, Linda; Quintela, Javan; Main, Deborah S

    2005-01-01

    .... This paper presents a mixed methods approach to analyzing narrative error event reports. Mixed methods studies integrate one or more qualitative and quantitative techniques for data collection and analysis...

  3. JINR supercomputer of the module type for event parallel analysis

    International Nuclear Information System (INIS)

    Kolpakov, I.F.; Senner, A.E.; Smirnov, V.A.

    1987-01-01

    A model of a supercomputer performing 50 million operations per second is suggested. Its realization would allow one to solve JINR data analysis problems for large spectrometers (in particular, for the DELPHI collaboration). The suggested modular supercomputer is based on a commercially available 32-bit microprocessor with a processing rate of about 1 MFLOPS. The processors are interconnected via the VME standard bus. A MicroVAX-II host computer organizes the operation of the system. Data input and output are realized via the MicroVAX-II peripherals. User software is based on FORTRAN-77. The supercomputer is connected to a JINR network port, and all JINR users get access to the suggested system

  4. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    CERN Document Server

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs Boson and super-symmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate with the ATLAS detector, the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and it is one of several analysis schemes for ATLAS physics user analysis. At the core of the EventView is the representative "view" of an event, which defines the contents of event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  5. A Reusable Framework for Regional Climate Model Evaluation

    Science.gov (United States)

    Hart, A. F.; Goodale, C. E.; Mattmann, C. A.; Lean, P.; Kim, J.; Zimdars, P.; Waliser, D. E.; Crichton, D. J.

    2011-12-01

    Climate observations are currently obtained through a diverse network of sensors and platforms that include space-based observatories, airborne and seaborne platforms, and distributed, networked, ground-based instruments. These global observational measurements are critical inputs to the efforts of the climate modeling community and can provide a corpus of data for use in analysis and validation of climate models. The Regional Climate Model Evaluation System (RCMES) is an effort currently being undertaken to address the challenges of integrating this vast array of observational climate data into a coherent resource suitable for performing model analysis at the regional level. Developed through a collaboration between the NASA Jet Propulsion Laboratory (JPL) and the UCLA Joint Institute for Regional Earth System Science and Engineering (JIFRESSE), the RCMES uses existing open source technologies (MySQL, Apache Hadoop, and Apache OODT), to construct a scalable, parametric, geospatial data store that incorporates decades of observational data from a variety of NASA Earth science missions, as well as other sources into a consistently annotated, highly available scientific resource. By eliminating arbitrary partitions in the data (individual file boundaries, differing file formats, etc), and instead treating each individual observational measurement as a unique, geospatially referenced data point, the RCMES is capable of transforming large, heterogeneous collections of disparate observational data into a unified resource suitable for comparison to climate model output. This facility is further enhanced by the availability of a model evaluation toolkit which consists of a set of Python libraries, a RESTful web service layer, and a browser-based graphical user interface that allows for orchestration of model-to-data comparisons by composing them visually through web forms. 
This combination of tools and interfaces dramatically simplifies the process of interacting with and
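The "each measurement is a geospatially referenced data point" idea behind RCMES can be illustrated with a minimal sketch. The field names and records below are hypothetical illustrations, not the actual RCMES schema or API:

```python
# Flat point store: no file boundaries, just uniformly annotated records
# queried by parameter and bounding box. All names here are invented.
points = [
    {"lat": 34.1, "lon": -118.2, "time": "2005-07-01", "param": "tas", "value": 295.3},
    {"lat": 35.0, "lon": -117.0, "time": "2005-07-01", "param": "tas", "value": 301.1},
    {"lat": 48.9, "lon": 2.3,    "time": "2005-07-01", "param": "tas", "value": 291.8},
]

def query(points, param, lat_range, lon_range):
    # Select one parameter inside a lat/lon bounding box.
    return [p for p in points
            if p["param"] == param
            and lat_range[0] <= p["lat"] <= lat_range[1]
            and lon_range[0] <= p["lon"] <= lon_range[1]]

socal = query(points, "tas", (32.0, 36.0), (-120.0, -114.0))
```

Because every record carries its own coordinates, the same query works regardless of which mission or file the observation originally came from.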

  6. Reusable Electronics and Adaptable Communication as Implemented in the Odin Modular Robot

    DEFF Research Database (Denmark)

    Garcia, Ricardo Franco Mendoza; Lyder, Andreas; Christensen, David Johan

    2009-01-01

This paper describes the electronics and communication system of Odin, a novel heterogeneous modular robot made of links and joints. The electronics is divided into two printed circuit boards: a General board with reusable components and a Specific board with non-reusable components. [...] electrical signals. The implementations of actuator and power links show that splitting the electronics into General and Specific boards allows rapid development of different types of modules, and an analysis of performance indicates that the communication system is simple, fast and flexible. [...] As the electronic design reuses approx. 50% of components between two different types of modules, we find it convenient for heterogeneous modular robots where production costs demand a small set of parts. In addition, as the features of the communication system are desirable in modular robots, we think [...]

  7. Enhanced Flexibility and Reusability through State Machine-Based Architectures for Multisensor Intelligent Robotics

    Directory of Open Access Journals (Sweden)

    Héctor Herrero

    2017-05-01

    Full Text Available This paper presents a state machine-based architecture, which enhances the flexibility and reusability of industrial robots, more concretely dual-arm multisensor robots. The proposed architecture, in addition to allowing absolute control of the execution, eases the programming of new applications by increasing the reusability of the developed modules. Through an easy-to-use graphical user interface, operators are able to create, modify, reuse and maintain industrial processes, increasing the flexibility of the cell. Moreover, the proposed approach is applied in a real use case in order to demonstrate its capabilities and feasibility in industrial environments. A comparative analysis is presented for evaluating the presented approach versus traditional robot programming techniques.
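A minimal sketch of the state machine-based sequencing idea described above, with hypothetical states and transitions (the paper's actual modules and interfaces are not reproduced here):

```python
# Table-driven state machine: a reusable module is just a transition
# table that can be composed into larger applications.
class StateMachine:
    def __init__(self, transitions, start, terminal):
        self.transitions = transitions  # {(state, event): next_state}
        self.state = start
        self.terminal = terminal
        self.history = [start]

    def fire(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"no transition for {key}")
        self.state = self.transitions[key]
        self.history.append(self.state)
        return self.state

    def done(self):
        return self.state in self.terminal

# A hypothetical "pick and place" module for one arm.
transitions = {
    ("idle", "start"): "approach",
    ("approach", "at_target"): "grasp",
    ("grasp", "grasped"): "transfer",
    ("transfer", "at_goal"): "release",
    ("release", "released"): "idle",
}

sm = StateMachine(transitions, start="idle", terminal={"idle"})
for ev in ["start", "at_target", "grasped", "at_goal", "released"]:
    sm.fire(ev)
```

The reusability argument is that the transition table, not the control loop, carries the application logic, so new processes are built by composing and editing tables rather than rewriting code.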

  8. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    Science.gov (United States)

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.
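The additive rate model referred to above can be written, in one standard form (this is our notation for the general model class, not necessarily the paper's exact specification):

```latex
% Additive rate model: covariates Z shift the baseline rate additively.
% N(t): recurrent-event counting process; \mu_0(t): baseline mean function.
E\{\mathrm{d}N(t)\mid Z\} = \mathrm{d}\mu_0(t) + \beta^{\mathsf{T}} Z\,\mathrm{d}t
```

Estimating equations for $\beta$ are then built from the observed increments of $N(t)$, using the continuous records where subjects provide recurrent-event data and the interval counts where they provide panel-count data.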

  9. Is It Worth It? - the Economics of Reusable Space Transportation

    Science.gov (United States)

    Webb, Richard

    2016-01-01

Over the past several decades, billions of dollars have been invested by governments and private companies in the pursuit of lower cost access to space through earth-to-orbit (ETO) space transportation systems. Much of that investment has been focused on the development and operation of various forms of reusable transportation systems. From the Space Shuttle to current efforts by private commercial companies, the overarching belief of those making such investments has been that reusing system elements will be cheaper than utilizing expendable systems that involve throwing away costly engines, avionics, and other hardware with each flight. However, the view that reusable systems are ultimately a "better" approach to providing ETO transportation is not held universally by major stakeholders within the space transportation industry. While the technical feasibility of at least some degree of reusability has been demonstrated, there continues to be a sometimes lively debate over the merits and drawbacks of reusable versus expendable systems from an economic perspective. In summary, is it worth it? Based on our many years of direct involvement with the business aspects of several expendable and reusable transportation systems, it appears to us that much of the discussion surrounding reusability is hindered by a failure to clearly define and understand the financial and other metrics by which the financial "goodness" of a reusable or expendable approach is measured. As stakeholders, the different users and suppliers of space transportation have a varied set of criteria for determining the relative economic viability of alternative strategies, including reusability. Many different metrics have been used to measure the affordability of space transportation, such as dollars per payload pound (kilogram) to orbit, cost per flight, life cycle cost, net present value/internal rate of return, and many others. This paper will examine the key considerations that influence
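Two of the affordability metrics named above (cost per flight and life cycle cost) can be sketched with purely hypothetical numbers; none of the figures below describe any real launch system:

```python
def cost_per_kg(cost_per_flight, payload_kg):
    # Dollars per kilogram to orbit.
    return cost_per_flight / payload_kg

def lcc_expendable(dev, unit, flights):
    # Life cycle cost: a new vehicle is expended every flight.
    return dev + unit * flights

def lcc_reusable(dev, unit, vehicles, refurb, flights):
    # Life cycle cost: a small fleet is amortized; each flight pays refurbishment.
    return dev + unit * vehicles + refurb * flights

# Per-flight cost at two hypothetical flight rates.
per_flight = {f: (lcc_expendable(2e9, 60e6, f) / f,
                  lcc_reusable(5e9, 300e6, 5, 10e6, f) / f)
              for f in (50, 500)}
```

With these invented inputs the expendable system is cheaper per flight at 50 total flights, while the reusable one wins decisively at 500, which is exactly the kind of metric-and-assumption sensitivity the paper argues the debate turns on.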

  10. Preterm Versus Term Children: Analysis of Sedation/Anesthesia Adverse Events and Longitudinal Risk.

    Science.gov (United States)

    Havidich, Jeana E; Beach, Michael; Dierdorf, Stephen F; Onega, Tracy; Suresh, Gautham; Cravero, Joseph P

    2016-03-01

Preterm and former preterm children frequently require sedation/anesthesia for diagnostic and therapeutic procedures. Our objective was to determine the age at which children who are born preterm remain at increased risk for sedation/anesthesia adverse events. Our secondary objective was to describe the nature and incidence of adverse events. This is a prospective observational study of children receiving sedation/anesthesia for diagnostic and/or therapeutic procedures outside of the operating room by the Pediatric Sedation Research Consortium. A total of 57,227 patients 0 to 22 years of age were eligible for this study. All adverse events and descriptive terms were predefined. Logistic regression and locally weighted scatterplot regression were used for analysis. Preterm and former preterm children had higher adverse event rates (14.7% vs 8.5%) compared with children born at term. Our analysis revealed a biphasic pattern for the development of adverse sedation/anesthesia events. Airway and respiratory adverse events were most commonly reported. MRI scans were the most commonly performed procedures in both categories of patients. Patients born preterm are nearly twice as likely to develop sedation/anesthesia adverse events, and this risk continues up to 23 years of age. We recommend obtaining a birth history during the formulation of an anesthetic/sedation plan, with heightened awareness that preterm and former preterm children may be at increased risk. Further prospective studies focusing on the etiology and prevention of adverse events in former preterm patients are warranted. Copyright © 2016 by the American Academy of Pediatrics.

  11. An analysis of post-event processing in social anxiety disorder.

    Science.gov (United States)

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construct of repetitive self-focused thought pertaining to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  12. Increasing the Operational Value of Event Messages

    Science.gov (United States)

    Li, Zhenping; Savkli, Cetin; Smith, Dan

    2003-01-01

Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one at a time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create a component-based, middleware-based, standards-based, general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analysis and data mining are being developed by Lockheed Martin, and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in the areas of problem detection and analysis, general status reporting, and real-time situational awareness.
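The aggregate log analysis described above, as opposed to reading messages one at a time, can be sketched as follows. The message format, subsystem names, and threshold are invented for illustration and are not GREAT's actual data model:

```python
# Count event messages per source subsystem and flag sources whose
# error rate is anomalously high across a whole log, rather than
# reacting to single messages during a pass.
from collections import Counter

log = [
    ("2003-01-01T10:00", "ACS",  "INFO",  "wheel speed nominal"),
    ("2003-01-01T10:01", "ACS",  "ERROR", "wheel speed limit"),
    ("2003-01-01T10:02", "COMM", "INFO",  "pass started"),
    ("2003-01-01T10:03", "ACS",  "ERROR", "wheel speed limit"),
    ("2003-01-01T10:04", "COMM", "INFO",  "pass ended"),
]

errors = Counter(src for _, src, sev, _ in log if sev == "ERROR")
totals = Counter(src for _, src, _, _ in log)
error_rate = {src: errors[src] / totals[src] for src in totals}
flagged = [src for src, rate in error_rate.items() if rate > 0.5]
```

A repeated fault that looks unremarkable message-by-message stands out immediately once rates are computed per source.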

  13. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    International Nuclear Information System (INIS)

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

One Main Circulation Pump (MCP) trip event is an anticipated transient with an expected frequency of approximately one event per year. There have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of one MCP trip event at the Ignalina NPP. According to international practice, the transient analysis should consist of deterministic analysis employing best-estimate codes and uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. Uncertainty analysis of flow energy loss in different parts of the Main Circulation Circuit, initial conditions and code-selected models was performed. Such analysis allows estimation of the influence of individual parameters on calculation results and identification of the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for further improvement of the model have been developed. (author)
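The SUSA-style workflow above (sample uncertain inputs, run the model, rank inputs by their influence on the output) can be sketched with a toy stand-in for the thermal-hydraulic code. The "model", parameter ranges, and sample size are all invented for illustration:

```python
# Monte Carlo propagation of input uncertainty through a toy model of
# affected-loop flow after one pump trip (NOT plant physics).
import random

random.seed(42)

def model(friction, initial_flow):
    # Toy response: higher flow energy loss -> lower post-trip loop flow.
    return initial_flow * 0.75 / (1.0 + friction)

runs = []
for _ in range(200):
    friction = random.uniform(0.05, 0.15)      # flow energy loss factor
    initial_flow = random.uniform(0.95, 1.05)  # normalized loop flow
    runs.append((friction, initial_flow, model(friction, initial_flow)))

def pearson(xs, ys):
    # Simple correlation as a sensitivity measure.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

flows = [r[2] for r in runs]
sens_friction = pearson([r[0] for r in runs], flows)  # expected negative
band = (min(flows), max(flows))                       # uncertainty band
```

The sign and magnitude of the sensitivity measure identify which modelling parameters most deserve refinement, which is the role SUSA plays in the analysis above.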

  14. Self-Healing Nanocomposites for Reusable Composite Cryotanks

    Science.gov (United States)

    Eberly, Daniel; Ou, Runqing; Karcz, Adam; Skandan, Ganesh

    2013-01-01

    Composite cryotanks, or composite overwrapped pressure vessels (COPVs), offer advantages over currently used aluminum-lithium cryotanks, particularly with respect to weight savings. Future NASA missions are expected to use COPVs in spaceflight propellant tanks to store fuels, oxidizers, and other liquids for launch and space exploration vehicles. However, reliability, reparability, and reusability of the COPVs are still being addressed, especially in cryogenic temperature applications; this has limited the adoption of COPVs in reusable vehicle designs. The major problem with composites is the inherent brittleness of the epoxy matrix, which is prone to microcrack formation, either from exposure to cryogenic conditions or from impact from different sources. If not prevented, the microcracks increase gas permeation and leakage. Accordingly, materials innovations are needed to mitigate microcrack damage, and prevent damage in the first place, in composite cryotanks. The self-healing technology being developed is capable of healing the microcracks through the use of a novel engineered nanocomposite, where a uniquely designed nanoparticle additive is incorporated into the epoxy matrix. In particular, this results in an enhancement in the burst pressure after cryogenic cycling of the nanocomposite COPVs, relative to the control COPVs. Incorporating a novel, self-healing, epoxy-based resin into the manufacture of COPVs allows repeatable self-healing of microcracks to be performed through the simple application of a low-temperature heat source. This permits COPVs to be reparable and reusable with a high degree of reliability, as microcracks will be remediated. The unique phase-separated morphology that was imparted during COPV manufacture allows for multiple self-healing cycles. Unlike single-target approaches where one material property is often improved at the expense of another, robustness has been introduced to a COPV by a combination of a modified resin and

  15. Life Cycle Assessment and Costing Methods for Device Procurement: Comparing Reusable and Single-Use Disposable Laryngoscopes.

    Science.gov (United States)

    Sherman, Jodi D; Raibley, Lewis A; Eckelman, Matthew J

    2018-01-09

    between $495,000 and $604,000 for SUD handles and between $180,000 and $265,000 for SUD blades, compared to reusables, depending on cleaning scenario and assuming 4000 (rated) uses. Considering device attrition, reusable handles would be more economical than SUDs if they last through 4-5 uses, and reusable blades 5-7 uses, before loss. LCA and LCC are feasible methods to ease interpretation of environmental impacts and facility costs when weighing device procurement options. While management practices vary between institutions, all standard methods of cleaning were evaluated and sensitivity analyses performed so that results are widely applicable. For YNHH, the reusable options presented a considerable cost advantage, in addition to offering a better option environmentally. Avoiding overcleaning reusable laryngoscope handles and blades is desirable from an environmental perspective. Costs may vary between facilities, and LCC methodology demonstrates the importance of time-motion labor analysis when comparing reusable and disposable device options.
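The attrition break-even logic reported above (a reusable device pays off once it survives enough uses) can be sketched as follows. The prices are hypothetical placeholders, not the YNHH figures:

```python
def breakeven_uses(reusable_price, cleaning_cost_per_use, sud_price):
    # Smallest n with (reusable_price + n * cleaning) / n <= sud_price;
    # only defined when per-use reprocessing is cheaper than a new SUD.
    assert cleaning_cost_per_use < sud_price
    n = 1
    while (reusable_price + n * cleaning_cost_per_use) / n > sud_price:
        n += 1
    return n

# Hypothetical prices: a $50 reusable blade, $3 reprocessing per use,
# and a $15 single-use disposable blade.
uses_needed = breakeven_uses(50.0, 3.0, 15.0)
```

With these invented inputs the reusable blade must survive 5 uses before loss, the same order of magnitude as the 5-7 uses reported in the study.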

  16. Corrective action program at the Krsko NPP. Trending and analysis of minor events

    International Nuclear Information System (INIS)

    Bach, B.; Kavsek, D.

    2007-01-01

Industry and On-site Operating Experience has shown that significant events, minor events and near misses all share something in common: latent weaknesses that result in failed barriers, and the same or similar (root) causes for those failures. These types of events differ only in their resulting consequences: minor events and near misses have no immediate or significant impact on plant safety or reliability. However, significant events are usually preceded by a number of such events and could be prevented from occurring if the root cause(s) of these precursor events could be identified and eliminated. It would therefore be poor management to leave minor events and near misses unreported and unanalysed. Reporting and analysing minor events allows detection of latent weaknesses that may indicate the need for improvement. The benefit of low-level event analysis is that deficiencies can be found in barriers that normally go unchallenged and might not otherwise be known to be ineffective in stopping a significant event. In addition, large numbers of minor events and near misses may increase the probability of occurrence of a significant event, which in itself is sufficient reason for addressing these types of events. However, as it is often neither practical nor feasible to perform a detailed root cause determination for every minor event, trending and trend analysis are used to identify and correct the causes before they result in a significant event. Trending is monitoring a change in the frequency of occurrence of similar minor events. An adverse trend is an increase in the frequency of minor events sorted by commonality, such as common equipment failure, human factors, common or similar causal factors, activity, etc., or worsening performance of processes that are being trended.
The primary goal of any trending programme should be to identify an adverse trend early enough that the operating organization can initiate an investigation to help
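A minimal sketch of the adverse-trend test described above: compare the recent rate of similar minor events against a baseline window and flag a sustained increase. The counts, window sizes, and 1.5x threshold are illustrative choices, not the Krsko programme's actual criteria:

```python
# Minor events per month, grouped by some commonality (e.g. same
# equipment type). Values are invented for illustration.
monthly_counts = [2, 1, 3, 2, 2, 1, 2, 4, 5, 6]

baseline = monthly_counts[:6]   # earlier months
recent = monthly_counts[6:]     # most recent months
baseline_rate = sum(baseline) / len(baseline)
recent_rate = sum(recent) / len(recent)

# Flag an adverse trend when the recent rate clearly exceeds baseline.
adverse_trend = recent_rate > 1.5 * baseline_rate
```

Flagging the rising frequency early is what lets the organization investigate precursor causes before they combine into a significant event.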

  17. Computer-aided event tree analysis by the impact vector method

    International Nuclear Information System (INIS)

    Lima, J.E.P.

    1984-01-01

In the development of the Probabilistic Risk Analysis of Angra I, the 'large event tree/small fault tree' approach was adopted for the analysis of plant behavior in an emergency situation. In this work, the event tree methodology is presented along with the adaptations which had to be made in order to attain a correct description of the safety system performance according to the selected analysis method. The problems appearing in the application of the methodology and their respective solutions are presented and discussed, with special emphasis on the impact vector technique. A description of the ETAP code ('Event Tree Analysis Program') developed for constructing and quantifying event trees is also given in this work. A preliminary version of the small-break LOCA analysis for Angra 1 is presented as an example of application of the methodology and of the code. It is shown that the use of the ETAP code significantly contributes to decreasing the time spent in event tree analyses, making the practical application of the analysis approach referred to above viable. (author) [pt
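The core of event tree quantification, as automated by codes like ETAP, can be sketched in a few lines: each heading branches success/failure, and a sequence frequency is the product of branch probabilities along its path. The headings and probabilities below are invented for illustration, not Angra 1 values:

```python
# Enumerate all success/failure sequences of an event tree and
# quantify each path's frequency.
from itertools import product

init_freq = 1.0e-3  # initiating event frequency, per year (hypothetical)
fail_prob = {"scram": 1e-5, "injection": 1e-3, "cooling": 1e-2}
headings = list(fail_prob)

sequences = {}
for outcome in product([True, False], repeat=len(headings)):  # True = success
    p = init_freq
    for heading, ok in zip(headings, outcome):
        p *= (1 - fail_prob[heading]) if ok else fail_prob[heading]
    sequences[outcome] = p

total = sum(sequences.values())  # must equal the initiating frequency
```

An impact-vector extension would additionally record, per branch, which support-system failures force dependent headings to fail; the enumeration and multiplication shown here stay the same.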

  18. Analysis of events with isolated leptons and missing transverse momentum in ep collisions at HERA

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, G.

    2007-02-07

A study of events with isolated leptons and missing transverse momentum in ep collisions is presented. Within the Standard Model (SM) such topologies are expected mainly from production of real W bosons with subsequent leptonic decay. This thesis continues the analysis of such events done in the HERA-1 period, where an excess over the SM prediction was observed for events with high hadronic transverse momentum P{sup X}{sub T}>25 GeV. New data of the HERA-2 period are added. The analysed data sample recorded in e{sup +}p collisions corresponds to an integrated luminosity of 220 pb{sup -1}, which is a factor of two more with respect to the HERA-1 analysis. The e{sup -}p data correspond to 186 pb{sup -1}, which is a factor of 13 more with respect to HERA-1. All three lepton generations (electrons, muons and tau leptons) are analysed. In the electron and muon channels a total of 53 events are observed in 406 pb{sup -1}. This compares well to the SM expectation of 53.7{+-}6.5 events, dominated by W production. However, a difference in the event rate is observed for different electron beam charges. In e{sup +}p data the excess of events with P{sup X}{sub T}>25 GeV is sustained, while the e{sup -}p data agree with the SM. In the tau channel 18 events are observed in all HERA data, with 20{+-}3 expected from the SM. The events are dominated by irreducible background from charged currents. The contribution from W production amounts to about 22%. One event with P{sup X}{sub T}>25 GeV is observed, where 1.4{+-}0.3 are expected from the SM. (orig.)

  19. Reusable self-healing hydrogels realized via in situ polymerization.

    Science.gov (United States)

    Vivek, Balachandran; Prasad, Edamana

    2015-04-09

    In this work, a self-healing hydrogel has been prepared using in situ polymerization of acrylic acid and acrylamide in the presence of glycogen. The hydrogel was characterized using NMR, SEM, FT-IR, rheology, and dynamic light scattering (DLS) studies. The developed hydrogel exhibits self-healing properties at neutral pH, high swelling ability, high elasticity, and excellent mechanical strength. The hydrogel exhibits modulus values (G', G″) as high as 10(6) Pa and shows an exceptionally high degree of swelling ratio (∼3.5 × 10(3)). Further, the polymer based hydrogel adsorbs toxic metal ions (Cd(2+), Pb(2+), and Hg(2+)) and organic dyes (methylene blue and methyl orange) from contaminated water with remarkable efficiency (90-98%). The mechanistic analysis indicated the presence of pseudo-second-order reaction kinetics. The reusability of the hydrogel has been demonstrated by repeating the adsorption-desorption process over five cycles with identical results in the adsorption efficiency.
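The pseudo-second-order kinetics mentioned in the mechanistic analysis above has a standard linearized form, t/q_t = 1/(k·qe²) + t/qe, so plotting t/q_t against t gives qe from the slope. The sketch below fits synthetic data generated from the model itself; the parameter values are illustrative, not the paper's:

```python
# Pseudo-second-order adsorption kinetics: generate uptake data, then
# recover the equilibrium capacity qe from the linearized fit.
def q_t(t, qe, k):
    # Uptake at time t under pseudo-second-order kinetics.
    return (k * qe**2 * t) / (1 + k * qe * t)

qe_true, k_true = 2.5, 0.04            # hypothetical parameters
times = [5, 10, 20, 40, 80, 160]
y = [t / q_t(t, qe_true, k_true) for t in times]   # t / q_t

# Least-squares slope of t/q_t versus t equals 1/qe.
n = len(times)
mx, my = sum(times) / n, sum(y) / n
slope = (sum((t - mx) * (v - my) for t, v in zip(times, y))
         / sum((t - mx) ** 2 for t in times))
qe_est = 1 / slope
```

With real adsorption data, a high linear correlation in this plot is what supports the pseudo-second-order claim; here the recovery is exact because the data are synthetic.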

  20. Sustaining Human Presence on Mars Using ISRU and a Reusable Lander

    Science.gov (United States)

    Arney, Dale C.; Jones, Christopher A.; Klovstad, Jordan J.; Komar, D.R.; Earle, Kevin; Moses, Robert; Shyface, Hilary R.

    2015-01-01

    This paper presents an analysis of the impact of ISRU (In-Site Resource Utilization), reusability, and automation on sustaining a human presence on Mars, requiring a transition from Earth dependence to Earth independence. The study analyzes the surface and transportation architectures and compared campaigns that revealed the importance of ISRU and reusability. A reusable Mars lander, Hercules, eliminates the need to deliver a new descent and ascent stage with each cargo and crew delivery to Mars, reducing the mass delivered from Earth. As part of an evolvable transportation architecture, this investment is key to enabling continuous human presence on Mars. The extensive use of ISRU reduces the logistics supply chain from Earth in order to support population growth at Mars. Reliable and autonomous systems, in conjunction with robotics, are required to enable ISRU architectures as systems must operate and maintain themselves while the crew is not present. A comparison of Mars campaigns is presented to show the impact of adding these investments and their ability to contribute to sustaining a human presence on Mars.

  1. Analysis of event tree with imprecise inputs by fuzzy set theory

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Chun, Moon Hyun

    1990-01-01

    Fuzzy set theory approach is proposed as a method to analyze event trees with imprecise or linguistic input variables such as 'likely' or 'improbable' instead of the numerical probability. In this paper, it is shown how the fuzzy set theory can be applied to the event tree analysis. The result of this study shows that the fuzzy set theory approach can be applied as an acceptable and effective tool for analysis of the event tree with fuzzy type of inputs. Comparisons of the fuzzy theory approach with the probabilistic approach of computing probabilities of final states of the event tree through subjective weighting factors and LHS technique show that the two approaches have common factors and give reasonable results
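One simple way to realize the idea above is to encode linguistic probabilities as triangular fuzzy numbers (low, mode, high) and propagate them along event tree branches. The membership shapes, numbers, and the approximate product rule below are illustrative choices, not the paper's exact formulation:

```python
# Triangular fuzzy numbers (TFNs) for linguistic branch probabilities,
# propagated through an event tree sequence by multiplication.
def tfn_mul(a, b):
    # Componentwise product; note the true product of two TFNs is not
    # exactly triangular, so this is a common approximation.
    return tuple(x * y for x, y in zip(a, b))

likely = (0.7, 0.85, 0.95)      # linguistic "likely" as a TFN
improbable = (0.0, 0.05, 0.15)  # linguistic "improbable" as a TFN

# Sequence: first barrier succeeds ("likely"), second barrier fails
# ("improbable") -> fuzzy probability of that end state.
end_state = tfn_mul(likely, improbable)
```

The result is itself a fuzzy number, so the final-state probability carries the imprecision of the linguistic inputs instead of collapsing it to a single value.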

  2. Markov chains and semi-Markov models in time-to-event analysis.

    Science.gov (United States)

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
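A three-state illness-death chain is the classic example of the models described above, accommodating a competing risk (death) alongside the outcome of interest. The transition probabilities below are invented for illustration:

```python
# Discrete-time Markov chain: healthy -> ill -> dead, with death
# competing from both transient states. "dead" is absorbing.
P = {
    "healthy": {"healthy": 0.90, "ill": 0.07, "dead": 0.03},
    "ill":     {"healthy": 0.05, "ill": 0.80, "dead": 0.15},
    "dead":    {"dead": 1.0},
}

def step(dist):
    # One cycle: push probability mass through the transition matrix.
    out = {s: 0.0 for s in P}
    for state, mass in dist.items():
        for nxt, p in P[state].items():
            out[nxt] += mass * p
    return out

dist = {"healthy": 1.0, "ill": 0.0, "dead": 0.0}
for _ in range(10):  # ten cycles, e.g. ten years
    dist = step(dist)

ten_year_mortality = dist["dead"]
```

Because recovery (ill to healthy) and recurrence are just additional arrows in the matrix, the same machinery handles recurrent outcomes and informative censoring that Kaplan-Meier estimation cannot.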

  3. Analysis of human error and organizational deficiency in events considering risk significance

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

    In this study, we analyzed human and organizational deficiencies in the trip events of Korean nuclear power plants. K-HPES items were used in human error analysis, and the organizational factors by Jacobs and Haber were used for organizational deficiency analysis. We proposed the use of CCDP as a risk measure to consider risk information in prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the process of analysis is necessary for effective enhancement of nuclear power plant safety by focusing on causes of human error and organizational deficiencies that are associated with significant risk

  4. Future Launch Vehicle Structures - Expendable and Reusable Elements

    Science.gov (United States)

    Obersteiner, M. H.; Borriello, G.

    2002-01-01

Further evolution of existing expendable launch vehicles will be an obvious element influencing the future of space transportation. Besides this, reusability might be the change with the highest potential for essential improvement. The expected cost reduction and, finally contributing to this, the improvement of reliability including safe mission abort capability are driving this idea. Although there are ideas for semi-reusable launch vehicles, typically two-stage vehicles with a reusable first stage or booster(s) and an expendable second or upper stage, it should be kept in mind that the benefit of reusability will only prevail if a big enough share of the vehicle influences the cost calculation. Today there is the understanding that additional technology preparation and verification will be necessary to master reusability and obtain enough benefits compared with existing launch vehicles. This understanding is based on several technology and system concept preparation and verification programmes, mainly done in the US but partially also in Europe and Japan. The major areas of necessary further activities are: system concepts including business plan considerations; sub-system or component technology refinement; system design and operation know-how and capabilities; and verification and demonstration oriented towards future mission mastering. One of the most important aspects for the creation of those coming programmes and activities will be the iterative process of requirements definition derived from concept analyses, including economical considerations, and the results achieved and verified within technology and verification programmes. It is the intention of this paper to provide major trends for those requirements focused on future launch vehicle structures. This will include the aspects of requirements valid only for reusable launch vehicles and those common to expendable, semi-reusable and reusable launch vehicles. Structures and materials is and will be one of the

  5. Regression analysis of mixed recurrent-event and panel-count data.

    Science.gov (United States)

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L

    2014-07-01

In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied on the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and the rain flood events with a high runoff coefficient. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
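The systematic/random error decomposition used in the evaluation above can be sketched for matched event pairs: the mean bias across events is the systematic component, and the scatter around that bias is the random component. The paired depth values below are synthetic illustrations:

```python
# Decompose satellite-vs-reference event errors into systematic (mean
# bias) and random (standard deviation about the bias) components.
reference = [12.0, 30.0, 8.0, 45.0, 20.0]   # e.g. event rainfall depth, mm
satellite = [10.0, 24.0, 7.0, 36.0, 18.0]   # mostly underestimated

n = len(reference)
errors = [s - r for s, r in zip(satellite, reference)]
systematic = sum(errors) / n                                  # mean bias
random_err = (sum((e - systematic) ** 2 for e in errors) / n) ** 0.5
```

Applying the same decomposition to the hyetograph errors and the resulting hydrograph errors is what exposes the dampening effect reported in the study.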

  7. Making sense of root cause analysis investigations of surgery-related adverse events.

    Science.gov (United States)

    Cassin, Bryce R; Barach, Paul R

    2012-02-01

    This article discusses the limitations of root cause analysis (RCA) for surgical adverse events. Making sense of adverse events involves an appreciation of the unique features in a problematic situation, which resist generalization to other contexts. The top priority of adverse event investigations must be to inform the design of systems that help clinicians to adapt and respond effectively in real time to undesirable combinations of design, performance, and circumstance. RCAs can create opportunities in the clinical workplace for clinicians to reflect on local barriers and identify enablers of safe and reliable outcomes. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. Putting Reusability First: A Paradigm Switch in Remote Laboratories Engineering

    Directory of Open Access Journals (Sweden)

    Romain Vérot

    2009-02-01

    Full Text Available In this paper, we present new devices brought online thanks to our Collaborative Remote Laboratories framework. Whereas the devices previously integrated in our remote laboratory belong to the domain of electronics, such as Vector Network Analyzers, the devices of concern in this paper are, on one hand, an antenna workbench and, on the other, a homemade switching device that embeds several electronic components. Because the middleware and framework for our environment were designed to be reusable, we wanted to put them to the test by integrating new and different devices into our Online Engineering catalog. After presenting the devices to be put online, we describe the software development effort required in relation to the reusability of the solution. The reported work and results should encourage Online Engineering software architects to think about reusability first, breaking with the current trend of implementing Remote Labs one after the other, with little reuse beyond the capitalized experience. In this, we defend a paradigm switch in current engineering approaches to Remote Laboratory implementations: reusability should be thought of first.

  9. On the Concepts of Usability and Reusability of Learning Objects

    Directory of Open Access Journals (Sweden)

    Miguel-Angel Sicilia

    2003-10-01

    Full Text Available “Reusable learning objects” oriented towards increasing their potential reusability are required to satisfy concerns about their granularity and their independence of concrete contexts of use. Such requirements also entail that the definition of learning object “usability” and the techniques required to carry out its “usability evaluation” must be substantially different from those commonly used to characterize and evaluate the usability of conventional educational applications. In this article, a specific characterization of the concept of learning object usability is discussed, which places emphasis on “reusability,” the key property of learning objects residing in repositories. The concept of learning object reusability is described as the possibility of, and adequacy for, the object being usable in prospective educational settings, so that usability and reusability are considered two interrelated – and in many cases conflicting – properties of learning objects. Following this characterization of the two properties, a method to evaluate the usability of specific learning objects is presented.

  10. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and use of the PSAEA guidelines (PSA-based Event Analysis). In 2000, risk-based precursor analysis increasingly became a part of the AVN process of feedback of operating experience, and constituted in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions made it possible to identify, and in some cases also to quantify, several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events) and resulted in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding, for instance, event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  11. Analysis of external flooding events occurred in foreign nuclear power plant sites

    International Nuclear Information System (INIS)

    Li Dan; Cai Hankun; Xiao Zhi; An Hongzhen; Mao Huan

    2013-01-01

    This paper screens and studies 17 external flooding events that occurred at foreign NPP sites, and analyses the characteristics of external flooding events based on the source of the flooding, the impact on buildings, systems and equipment, as well as the threat to nuclear safety. Furthermore, based on the experience and lessons learned from the Fukushima nuclear accident relating to external flooding, and on the countermeasures carried out around the world, some suggestions are proposed to improve the external flooding response capacity of Chinese NPPs. (authors)

  12. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPPs known as 'SOL', safety based on organisational learning. After a discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained, as well as the required general setting. The SOL approach has been tested both in scientific experiments and, from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  13. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step in transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used for identification of the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the event was reported, the details of the incident were elaborated through record review and interviews with the personnel involved. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems, and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also pointed to systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and introduce preventive measures to find areas where resources need to be allocated to improve patient safety.

  14. Relation of air mass history to nucleation events in Po Valley, Italy, using back trajectories analysis

    Directory of Open Access Journals (Sweden)

    L. Sogacheva

    2007-01-01

    Full Text Available In this paper, we study the transport of air masses to San Pietro Capofiume (SPC) in Po Valley, Italy, by means of back-trajectory analysis. Our main aim is to investigate whether air masses originate over different regions on nucleation event days and on nonevent days, during three years when nucleation events have been continuously recorded at SPC. The results indicate that nucleation events occur frequently in air masses arriving from Central Europe, whereas event frequency is much lower in air transported from southern directions and from the Atlantic Ocean. We also analyzed the behaviour of meteorological parameters during the 96 h transport to SPC, and found that, on average, event trajectories undergo stronger subsidence during the last 12 h before arrival at SPC than nonevent trajectories. This causes a reversal in the temperature and relative humidity (RH) differences between event and nonevent trajectories: between 96 and 12 h back time, temperature is lower and RH is higher for event than nonevent trajectories, and between 12 and 0 h vice versa. Boundary layer mixing is stronger along the event trajectories compared to nonevent trajectories. The absolute humidity (AH) is similar for the event and nonevent trajectories between about 96 h and about 60 h back time, but after that, the AH of the event trajectories becomes lower due to stronger rain. We also studied the transport of SO2 to SPC, and conclude that although sources in Po Valley most probably dominate the measured concentrations, certain Central and Eastern European sources also make a substantial contribution.

  15. Design, Fabrication, and Initial Operation of a Reusable Irradiation Facility

    International Nuclear Information System (INIS)

    Heatherly, D.W.; Thoms, K.R.; Siman-Tov, I.I.; Hurst, M.T.

    1999-01-01

    A Heavy-Section Steel Irradiation (HSSI) Program project, funded by the US Nuclear Regulatory Commission, was initiated at Oak Ridge National Laboratory to develop reusable materials irradiation facilities in which metallurgical specimens of reactor pressure vessel steels could be irradiated. As a consequence, two new, identical, reusable materials irradiation facilities have been designed, fabricated, installed, and are now operating at the Ford Nuclear Reactor at the University of Michigan. The facilities are referred to as the HSSI-IAR facilities with the individual facilities being designated as IAR-1 and IAR-2. This new and unique facility design requires no cutting or grinding operations to retrieve irradiated specimens, all capsule hardware is totally reusable, and materials transported from site to site are limited to specimens only. At the time of this letter report, the facilities have operated successfully for approximately 2500 effective full-power hours

  16. Benefits of Government Incentives for Reusable Launch Vehicle Development

    Science.gov (United States)

    Shaw, Eric J.; Hamaker, Joseph W.; Prince, Frank A.

    1998-01-01

    Many exciting new opportunities in space, both government missions and business ventures, could be realized by a reduction in launch prices. Reusable launch vehicle (RLV) designs have the potential to lower launch costs dramatically from those of today's expendable and partially-expendable vehicles. Unfortunately, governments must budget to support existing launch capability, and so lack the resources necessary to completely fund development of new reusable systems. In addition, the new commercial space markets are too immature and uncertain to motivate the launch industry to undertake a project of this magnitude and risk. Low-cost launch vehicles will not be developed without a mature market to service; however, launch prices must be reduced in order for a commercial launch market to mature. This paper estimates and discusses the various benefits that may be reaped from government incentives for a commercial reusable launch vehicle program.

  17. Risk Perception and Communication in Commercial Reusable Launch Vehicle Operations

    Science.gov (United States)

    Hardy, Terry L.

    2005-12-01

    A number of inventors and entrepreneurs are currently attempting to develop and commercially operate reusable launch vehicles (RLVs) to carry voluntary participants into space. The operation of these launch vehicles, however, produces safety risks to the crew, to the space flight participants, and to the uninvolved public. Risk communication therefore becomes increasingly important to assure that those involved in the flight understand the risk and that those who are not directly involved understand the personal impact of RLV operations on their lives. Those involved in the launch vehicle flight may perceive risk differently from non-participants, and these differences in perception must be understood to communicate this risk effectively. This paper summarizes existing research in risk perception and communication and applies that research to commercial reusable launch vehicle operations. Risk communication is discussed in the context of the requirements of United States law for informed consent from any space flight participants on reusable suborbital launch vehicles.

  18. Analysis of internal events for the Unit 1 of the Laguna Verde Nuclear Power Station. Appendixes

    International Nuclear Information System (INIS)

    Huerta B, A.; Lopez M, R.

    1995-01-01

    This volume contains the appendices for the accident sequences analysis for the internally initiated events for the Laguna Verde Unit 1 Nuclear Power Plant. Appendix A presents the comments raised by the Sandia National Laboratories technical staff as a result of the review of the Internal Event Analysis for Laguna Verde Unit 1 Nuclear Power Plant. This review was performed during a joint Sandia/CNSNS multi-day meeting by the end of 1992. Also included is a brief evaluation of the applicability of these comments to the present study. Appendix B presents the fault tree models printed for each of the systems included and analyzed in the Internal Event Analysis for LVNPP. Appendix C presents the outputs of the TEMAC code, used for the quantification of the dominant accident sequences as well as for the final core damage evaluation. (Author)

  20. Sources of Error and the Statistical Formulation of M_S:m_b Seismic Event Screening Analysis

    Science.gov (United States)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H_0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh-wave magnitude (denoted M_S) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the announced 2009 Democratic People's Republic of Korea nuclear weapon test fails to reject the null hypothesis H_0: explosion characteristics.
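
A minimal sketch of such a screening test is shown below. The slope, intercept, error terms, and magnitude values are illustrative assumptions, not the IDC's calibrated values; the point mirrors the abstract's proposal that the standard error is inflated by a model-inadequacy term before the one-sided test is applied.

```python
from math import sqrt
from statistics import NormalDist

def ms_mb_screen(ms, mb, slope=1.25, intercept=-2.20,
                 sigma_meas=0.20, sigma_model=0.15, alpha=0.05):
    """One-sided test of H0: explosion characteristics. A large M_S
    relative to m_b is earthquake-like; the standard error includes a
    separate model-inadequacy term, as the abstract proposes.
    All coefficients here are illustrative, not calibrated IDC values."""
    se = sqrt(sigma_meas ** 2 + sigma_model ** 2)
    t = (ms - (slope * mb + intercept)) / se      # standardized statistic
    p = 1.0 - NormalDist().cdf(t)                 # one-sided p-value
    return t, p, p < alpha                        # True -> screened out

# Roughly DPRK-2009-like magnitudes (illustrative): H0 is not rejected,
# i.e. the event is not screened out as earthquake-like.
t, p, rejected = ms_mb_screen(ms=3.6, mb=4.5)
```

Inflating `se` with `sigma_model` widens the acceptance region, so borderline events are less likely to be screened out purely because of correction-model error.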

  1. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    Science.gov (United States)

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD events risk (0.91, 0.85 to 0.97 per 0.1 mEq/L) (P_nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P_nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD events risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
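
A pooled estimate like the relative risk of 0.85 (0.78 to 0.92) reported above is typically produced by random-effects inverse-variance pooling of log relative risks. A minimal DerSimonian-Laird sketch, using made-up study inputs rather than the 19 studies analysed here:

```python
import math

# (RR, CI lower, CI upper) for three hypothetical studies
studies = [(0.80, 0.65, 0.98), (0.90, 0.78, 1.04), (0.84, 0.70, 1.01)]

y = [math.log(rr) for rr, lo, hi in studies]              # log relative risks
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
w = [1.0 / s ** 2 for s in se]                            # fixed-effect weights

# DerSimonian-Laird moment estimator of the between-study variance tau^2
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

w_re = [1.0 / (s ** 2 + tau2) for s in se]                # random-effects weights
mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_mu = math.sqrt(1.0 / sum(w_re))
pooled_rr = math.exp(mu)
ci = (math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu))
```

When the heterogeneity statistic `q` is below its degrees of freedom, `tau2` truncates to zero and the random-effects pool coincides with the fixed-effect pool.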

  2. Superposed epoch analysis of O+ auroral outflow during sawtooth events and substorms

    Science.gov (United States)

    Nowrouzi, N.; Kistler, L. M.; Lund, E. J.; Cai, X.

    2017-12-01

    Sawtooth events are repeated injections of energetic particles at geosynchronous orbit. Studies have shown that 94% of sawtooth events occurred during magnetic storm times. The main factor that causes a sawtooth event is still an open question. Simulations have suggested that heavy ions like O+ may play a role in triggering the injections. One of the sources of O+ in the Earth's magnetosphere is the nightside aurora. O+ ions coming from the nightside auroral region have direct access to the near-Earth magnetotail. A model (Brambles et al. 2013) for interplanetary coronal mass ejection driven sawtooth events found that nightside O+ outflow caused the subsequent teeth of the sawtooth event through a feedback mechanism. This work is a superposed epoch analysis to test whether the observed auroral outflow supports this model. Using FAST spacecraft data from 1997-2007, we examine the auroral O+ outflow as a function of time relative to an injection onset. We then determine whether the profile of the O+ outflow flux during sawtooth events differs from the outflow observed during isolated substorms. The auroral region boundaries are estimated using the method of Andersson et al. (2004). Subsequently, the O+ outflow flux inside these boundaries is calculated and binned as a function of superposed epoch time for substorms and sawtooth "teeth". In this way, we will determine whether sawtooth events do in fact have greater O+ outflow, and whether that outflow is predominantly from the nightside, as suggested by the model results.
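
The core of a superposed epoch analysis, independent of the FAST-specific processing, is to align a time series on a list of onset times and average it in bins of epoch time. A generic sketch on synthetic data (the onset times, window, and signal are made up):

```python
import numpy as np

def superposed_epoch(t, y, onsets, window=(-2.0, 6.0), nbins=8):
    """Average y in bins of time relative to each onset (superposed epochs)."""
    edges = np.linspace(window[0], window[1], nbins + 1)
    sums, counts = np.zeros(nbins), np.zeros(nbins)
    for t0 in onsets:
        idx = np.digitize(t - t0, edges) - 1      # bin index of epoch time
        ok = (idx >= 0) & (idx < nbins)           # keep points inside window
        np.add.at(sums, idx[ok], y[ok])
        np.add.at(counts, idx[ok], 1)
    return edges, np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)

# Synthetic "outflow" that peaks 2 time units after each injection onset:
t = np.arange(0.0, 100.0, 0.5)
onsets = [20.0, 50.0, 80.0]
y = np.zeros_like(t)
for t0 in onsets:
    y += np.exp(-0.5 * (t - t0 - 2.0) ** 2)
edges, mean_profile = superposed_epoch(t, y, onsets)
# mean_profile peaks in the bin covering epoch times [2, 3).
```

Averaging over many onsets suppresses event-to-event variability, which is what lets the mean outflow response around sawtooth injections be compared with that around isolated substorm onsets.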

  3. Regression analysis of mixed panel count data with dependent terminal events.

    Science.gov (United States)

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types described earlier, may occur; furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrence of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for the estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to a childhood cancer study that motivated this work. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Analysis of adverse events occurred at overseas nuclear power plants in 2003

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa; Sato, Masahiro; Takagawa, Kenichi; Fushimi, Yasuyuki; Shimada, Hiroki; Shimada, Yoshio

    2004-01-01

    The adverse events that have occurred at overseas nuclear power plants can be studied to provide an indication of how to improve the safety and reliability of nuclear power plants in Japan. The Institute of Nuclear Safety Systems (INSS) obtains information related to overseas adverse events and incidents and, by evaluating it, proposes improvements to prevent similar occurrences at Japanese PWR plants. In 2003, INSS obtained approximately 2800 pieces of information and, based on their evaluation, proposed nine recommendations to Japanese utilities. This report summarizes the evaluation activity and the tendency analysis based on the individual events analyzed in 2003. The tendency analysis covered about 1600 analyzed events, from the viewpoints of mechanics, electrics, instrumentation and control, and operations, regarding the causes, countermeasures, affected equipment and the possible lessons learnt from overseas events. The report shows the overall tendency of overseas events and incidents for the improvement of the safety and reliability of domestic PWR plants. (author)

  5. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information”, requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not-reportable events) identified at INL during the period of October 2012 through September 2013.

  6. Analysis of events occurred at overseas nuclear power plants in 2004

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa; Nishioka, Hiromasa; Sato, Masahiro; Chiba, Gorou; Takagawa, Kenichi; Shimada, Hiroki

    2005-01-01

    The Institute of Nuclear Safety Systems (INSS) investigates information related to events and incidents that occurred at overseas nuclear power plants, and by evaluating it proposes recommendations for the improvement of the safety and reliability of domestic PWR plants. Following the 2003 report, this report summarizes the evaluation activity and the tendency analysis based on about 2800 pieces of information obtained in 2004. The tendency analysis covered about 1700 analyzed events, from the viewpoints of mechanics, electrics and operations, regarding the causes, affected equipment and so on. (author)

  7. Wound dressing with reusable electronics for wireless monitoring

    KAUST Repository

    Shamim, Atif

    2016-10-20

    A wound dressing device with reusable electronics for wireless monitoring and a method of making the same are provided. The device can be a smart device. In an embodiment, the device has a disposable portion including one or more sensors and a reusable portion including wireless electronics. The one or more sensors can be secured to a flexible substrate and can be printed by non-contact printing on the substrate. The disposable portion can be removably coupled to the one or more sensors. The device can include one or more sensors for wireless monitoring of a wound, a wound dressing, a body fluid exuded by the wound and/or wearer health.

  8. Logistic Organization of Mass Events in the Light of SWOT Analysis - Case Study

    Directory of Open Access Journals (Sweden)

    Joanna Woźniak

    2018-02-01

    Full Text Available Rzeszow Juwenalia is the largest free-entry student event in Subcarpathia and, at the same time, one of the best in Poland. On average, more than 25,000 people stay on the campus of Rzeszow University of Technology on every single day of the event. Such an enormous undertaking requires a strategy that makes it possible to design and coordinate the event effectively. In connection with that, the principal objective of this paper is to present the strengths and weaknesses of Rzeszow Juwenalia, and also to attempt to verify the opportunities and threats related to the event. SWOT analysis was used in order to attain this objective, yielding results that allow a detailed assessment of the undertaking. The publication also presents proposals for improvement activities that may be implemented in the future.

  9. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    International Nuclear Information System (INIS)

    Attrill, Gemma D. R.

    2010-01-01

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of a purely MHD wave interpretation for the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions and result from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  10. Analysis of events related to cracks and leaks in the reactor coolant pressure boundary

    Energy Technology Data Exchange (ETDEWEB)

    Ballesteros, Antonio, E-mail: Antonio.Ballesteros-Avila@ec.europa.eu [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Sanda, Radian; Peinador, Miguel; Zerger, Benoit [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Negri, Patrice [IRSN: Institut de Radioprotection et de Sûreté Nucléaire (France); Wenke, Rainer [GRS: Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH (Germany)

    2014-08-15

    Highlights: • The important role of Operating Experience Feedback is emphasised. • Events relating to cracks and leaks in the reactor coolant pressure boundary are analysed. • A methodology for event investigation is described. • Some illustrative results of the analysis of events for specific components are presented. - Abstract: The presence of cracks and leaks in the reactor coolant pressure boundary may jeopardise the safe operation of nuclear power plants. Analysis of crack- and leak-related events is an important task for the prevention of their recurrence, and should be performed in the context of activities on Operating Experience Feedback. In response to this concern, the EU Clearinghouse operated by the JRC-IET supports and develops technical and scientific work to disseminate the lessons learned from past operating experience. In particular, concerning cracks and leaks, the studies carried out in collaboration with IRSN and GRS have made it possible to identify the areas of the plant primary system most sensitive to degradation and to elaborate recommendations for upgrading the maintenance, ageing management and inspection programmes. An overview of the methodology used in the analysis of crack- and leak-related events is presented in this paper, together with the relevant results obtained in the study.

  11. Erectile dysfunction and cardiovascular events in diabetic men: a meta-analysis of observational studies.

    Directory of Open Access Journals (Sweden)

    Tomohide Yamada

    Full Text Available BACKGROUND: Several studies have shown that erectile dysfunction (ED) influences the risk of cardiovascular events (CV events). However, a meta-analysis of the overall risk of CV events associated with ED in patients with diabetes has not been performed. METHODOLOGY/PRINCIPAL FINDINGS: We searched MEDLINE and the Cochrane Library for pertinent articles (including references) published between 1951 and April 22, 2012. English language reports of original observational cohort studies and cross-sectional studies were included. Pooled effect estimates were obtained by random effects meta-analysis. A total of 3,791 CV events were reported in 3 cohort studies and 9 cross-sectional studies (covering 22,586 subjects). Across the cohort studies, the overall odds ratio (OR) of diabetic men with ED versus those without ED was 1.74 (95% confidence interval [CI]: 1.34-2.27; P<0.05). Moreover, meta-regression analysis found no relationship between the method used to assess ED (questionnaire or interview), mean age, mean hemoglobin A(1c), mean body mass index, or mean duration of diabetes and the risk of CV events or CHD. In the cross-sectional studies, the OR of diabetic men with ED versus those without ED was 3.39 (95% CI: 2.58-4.44; P<0.001) for CV events (N = 9), 3.43 (95% CI: 2.46-4.77; P<0.001) for CHD (N = 7), and 2.63 (95% CI: 1.41-4.91; P = 0.002) for peripheral vascular disease (N = 5). CONCLUSION/SIGNIFICANCE: ED was associated with an increased risk of CV events in diabetic patients. Prevention and early detection of cardiovascular disease are important in the management of diabetes, especially in view of the rapid increase in its prevalence.

  12. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self-Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  13. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual. Companion CD

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self-Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  14. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    Science.gov (United States)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events as captured by so-called event synchronization have been proposed as powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme flooding. Rooted in the analysis of spike train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically - or even dynamically - interrelated) or not, without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in the case of the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in the case of climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctively different type of distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two
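    The two similarity measures contrasted above can be sketched as follows. This is a minimal stdlib-only illustration of the general idea (an adaptive, parameter-free coincidence window for event synchronization versus a fixed tolerable delay for event coincidence rates), not the authors' implementation; all names are hypothetical, and for simplicity only interior events of each sequence are scored.

```python
import math

def _tau(a, b, i, j):
    # Adaptive coincidence window: half the minimum of the four
    # inter-event intervals adjacent to events a[i] and b[j].
    return min(a[i + 1] - a[i], a[i] - a[i - 1],
               b[j + 1] - b[j], b[j] - b[j - 1]) / 2.0

def event_synchronization(t1, t2):
    """Parameter-free event synchronization (Quiroga-style sketch).

    t1, t2: sorted event times. Returns a symmetric score Q in [0, 1];
    identical sequences give Q = 1.
    """
    def count(a, b):
        c = 0.0
        for i in range(1, len(a) - 1):        # interior events only
            for j in range(1, len(b) - 1):
                d = a[i] - b[j]
                if 0 < d <= _tau(a, b, i, j):
                    c += 1.0                  # event in b just precedes event in a
                elif d == 0:
                    c += 0.5                  # simultaneous: half in each direction
        return c
    m1, m2 = len(t1) - 2, len(t2) - 2
    return (count(t1, t2) + count(t2, t1)) / math.sqrt(m1 * m2)

def event_coincidence_rate(t1, t2, delta):
    """Fixed-window alternative: fraction of events in t1 with at least
    one event in t2 within the tolerable maximum delay delta."""
    hits = sum(1 for a in t1 if any(abs(a - b) <= delta for b in t2))
    return hits / len(t1)
```

The contrast made in the abstract is visible in the signatures alone: `event_synchronization` derives its window from the local inter-event intervals, while `event_coincidence_rate` requires `delta` to be chosen by hand.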

  15. Project of Ariane 5 LV family advancement by use of reusable fly-back boosters (named “Bargouzine”)

    Science.gov (United States)

    Sumin, Yu.; Bonnal, Ch.; Kostromin, S.; Panichkin, N.

    2007-12-01

    The paper concerns possible concept variants of a partially reusable Heavy-Lift Launch Vehicle derived from the advanced basic launcher (Ariane-2010) by replacing the EAP Solid Rocket Boosters with a Reusable Starting Stage consisting of two Liquid-propellant Reusable Fly-Back Boosters (RFBBs) called "Bargouzin". This paper describes the status of the RFBB concepts studied over three project phases. The first phase was dedicated to a feasibility assessment of using liquid-propellant reusable fly-back boosters ("Baikal" type) for a heavy-lift space launch vehicle; its design features and main conclusions are presented. The second phase was performed with the purpose of selecting the preferable concept among the alternatives for the future Ariane LV modernization, using RFBBs instead of the EAP boosters. The main requirements, logic of work, possible configurations and conclusions are presented. Initial aerodynamic, ballistic, thermal-loading, dynamic-loading, trade-off and comparison analyses were performed on these concepts. The third phase consists in a more detailed assessment of the chosen LV concept. This part summarizes some of the more detailed results related to flight performance, system mass, the thermal protection system, aspects of technologies, ground complex modification, and comparison analyses, and gives conclusions.

  16. The limiting events transient analysis by RETRAN02 and VIPRE01 for an ABWR

    International Nuclear Information System (INIS)

    Tsai Chiungwen; Shih Chunkuan; Wang Jongrong; Lin Haotzu; Jin Jiunan; Cheng Suchin

    2009-01-01

    This paper describes the transient analysis of the generator load rejection (LR) and One Turbine Control Valve Closure (OTCVC) events for the Lungmen nuclear power plant (LMNPP). According to the Critical Power Ratio (CPR) criterion, the Preliminary Safety Analysis Report (PSAR) concluded that LR and OTCVC are the first and second limiting events, respectively. In addition, the fuel type has been changed from GE12 to GE14, so re-analysis of these two events is necessary for safety considerations. In this study, to quantify the impact on the reactor, the difference between the initial critical power ratio (ICPR) and the minimum critical power ratio (MCPR), i.e. ΔCPR, is calculated. The ΔCPRs of the LR and OTCVC events are calculated with a combination of the RETRAN02 and VIPRE01 codes. In the RETRAN02 calculation, a thermal-hydraulic model was prepared for the transient analysis. The resulting data, including upper plenum pressure, core inlet flow, normalized power, and axial power shapes during the transient, are then passed to VIPRE01 for the ΔCPR calculation. In the VIPRE01 calculation, a hot-channel model was built to simulate the hottest fuel bundle. Based on the thermal-hydraulic data from RETRAN02, the ΔCPRs are calculated by the VIPRE01 hot-channel model. Additionally, different TCV control modes are considered to study the influence of different TCV closure curves on the transient behavior. Sensitivity studies covering different initial system pressures and different initial power/flow conditions are also included. Based on this analysis, the maximum ΔCPRs for LR and OTCVC are 0.162 and 0.191, respectively. According to the CPR criterion, this result shows that the impact of the OTCVC event is larger than that of the LR event. (author)
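    The figure of merit used above reduces to simple arithmetic once the hot-channel CPR history is available: ΔCPR is the drop from the initial CPR to the minimum CPR reached during the transient, and the limiting event is the one with the largest ΔCPR. A minimal sketch with hypothetical function names and an illustrative CPR trace (only the 0.162 and 0.191 values come from the abstract):

```python
def delta_cpr(icpr, cpr_trace):
    """ΔCPR = ICPR minus the minimum CPR reached during the transient.

    cpr_trace: hypothetical CPR history for the hottest bundle, as a
    hot-channel code such as VIPRE01 would produce at each time step.
    """
    return icpr - min(cpr_trace)

def limiting_event(dcprs):
    """Return the event name with the largest ΔCPR (the limiting event)."""
    return max(dcprs, key=dcprs.get)

# Maximum ΔCPRs reported in the abstract: OTCVC bounds LR for the GE14 core.
reported = {"LR": 0.162, "OTCVC": 0.191}
```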

  17. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH, Schwertnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)

    2014-07-01

    Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four comprehensive operating experience databases screened. • Important insights and conclusions based on the operating experience are delineated. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events related to EDG failures and/or unavailability, as well as to all the supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures, the attributes that contributed to the failures, and the failure modes (potential or real), to discuss risk relevance, to summarize important lessons learned, and to provide recommendations. This particular paper centres on the statistical analysis of the operating experience. For the purpose of this study, an EDG failure is defined as a failure of the EDG to function on demand (i.e., failure to start or failure to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen- und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied to each of the four databases is presented. Analyses aimed at delineating the causes, root causes, contributing factors and consequences were then performed. A statistical analysis was performed on the chronology of events, the types of failures, the operational circumstances of detection of the failures, and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  18. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    International Nuclear Information System (INIS)

    Kančev, Duško; Duchac, Alexander; Zerger, Benoit; Maqua, Michael; Wattrelos, Didier

    2014-01-01

    Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four comprehensive operating experience databases screened. • Important insights and conclusions based on the operating experience are delineated. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events related to EDG failures and/or unavailability, as well as to all the supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures, the attributes that contributed to the failures, and the failure modes (potential or real), to discuss risk relevance, to summarize important lessons learned, and to provide recommendations. This particular paper centres on the statistical analysis of the operating experience. For the purpose of this study, an EDG failure is defined as a failure of the EDG to function on demand (i.e., failure to start or failure to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen- und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied to each of the four databases is presented. Analyses aimed at delineating the causes, root causes, contributing factors and consequences were then performed. A statistical analysis was performed on the chronology of events, the types of failures, the operational circumstances of detection of the failures, and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  19. Time-to-event analysis of mastitis at first-lactation in Valle del Belice ewes

    NARCIS (Netherlands)

    Portolano, B.; Firlocchiaro, R.; Kaam, van J.B.C.H.M.; Riggio, V.; Maizon, D.O.

    2007-01-01

    A time-to-event study for mastitis at first-lactation in Valle del Belice ewes was conducted, using survival analysis with an animal model. The goals were to evaluate the effect of lambing season and level of milk production on the time from lambing to the day when a ewe experienced a test-day with
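    Time-to-event data of this kind, where some ewes never show mastitis before recording ends and are therefore censored, is commonly summarized with a survivor function. The paper fits a survival animal model; as a simpler stdlib-only illustration of how such censored data is handled, here is a Kaplan-Meier estimator sketch, with hypothetical names and data.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survivor function.

    times:  time from lambing to mastitis, or to censoring.
    events: 1 if mastitis was observed at that time, 0 if censored.
    Returns [(t, S(t))] at each time where at least one event occurred.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        m = sum(1 for tt, _ in data if tt == t)   # all leaving the risk set at t
        if d > 0:
            s *= 1 - d / n_at_risk                # KM product-limit step
            curve.append((t, s))
        n_at_risk -= m
        i += m                                    # skip past ties at time t
    return curve
```

Censored animals still contribute to the risk set up to their censoring time, which is exactly what a naive "fraction affected" summary would get wrong.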

  20. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Science.gov (United States)

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...

  1. Analysis of operational events by ATHEANA framework for human factor modelling

    International Nuclear Information System (INIS)

    Bedreaga, Luminita; Constantinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that current methods do not correctly represent the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods for modelling the human factor do not take into account human performance and reliability as observed in operational events. ATHEANA - A Technique for Human Error ANAlysis - is a new methodology for human-factor analysis that incorporates data from operational events as well as psychological models of human behaviour. The method introduces new elements such as unsafe actions and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred in different nuclear power plants during 1979-2002. The analysis of operational events consisted of: - identification of the unsafe actions; - classification of each unsafe action as an omission or a commission; - establishing the type of error corresponding to the unsafe action: slip, lapse, mistake or circumvention; - establishing the influence of performance shaping factors and some corrective actions. (authors)

  2. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    Science.gov (United States)

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  3. The Analysis of the Properties of Super Solar Proton Events and the Associated Phenomena

    Science.gov (United States)

    Cheng, L. B.; Le, G. M.; Lu, Y. P.; Chen, M. H.; Li, P.; Yin, Z. Q.

    2014-05-01

    The solar flares, the propagation speeds of the shocks driven by coronal mass ejections (CMEs) from the Sun to the Earth, the source longitudes and Carrington longitudes, and the geomagnetic storms associated with each super solar proton event (peak flux equal to or exceeding 10000 pfu) have been investigated. The analysis shows that the source longitudes of the super solar proton events ranged from E30° to W75°. The Carrington longitudes of the source regions were distributed in two longitude bands, 130°-220° and 260°-320°. All super solar proton events were accompanied by major solar flares and fast CMEs. The average speeds of the shocks propagating from the Sun to the Earth were greater than 1200 km/s. Eight super solar proton events were followed by major geomagnetic storms (Dst ≤ -100 nT); one was followed by a geomagnetic storm with Dst = -96 nT.

  4. Probabilistic safety analysis for fire events for the NPP Isar 2

    International Nuclear Information System (INIS)

    Schmaltz, H.; Hristodulidis, A.

    2007-01-01

    The 'Probabilistic Safety Analysis for Fire Events' (Fire-PSA KKI2) for NPP Isar 2 was performed in addition to the PSA for full-power operation and considers all possible events that can be initiated by a fire. The aim of the plant-specific Fire-PSA was to perform a quantitative assessment of fire events during full-power operation in line with the current state of the art. Based on simplified assumptions regarding fire-induced failures, the influence of system and component failures on the frequency of core damage states was analysed. The Fire-PSA considers, on the one hand, events in which fire-induced equipment failures result in a SCRAM and, on the other hand, events that have no direct operational effects but in which the plant is shut down as a precautionary measure because of the fire-induced failure of safety-related installations. The latter events are considered because they may have a non-negligible influence on the frequency of core damage states if failures occur during plant shutdown, when the redundancy of safety-related systems is reduced. (orig.)

  5. Fuel element thermo-mechanical analysis during transient events using the FMS and FETMA codes

    International Nuclear Information System (INIS)

    Hernandez Lopez Hector; Hernandez Martinez Jose Luis; Ortiz Villafuerte Javier

    2005-01-01

    In the Instituto Nacional de Investigaciones Nucleares of Mexico, the Fuel Management System (FMS) software package has long been used to simulate the operation of a BWR nuclear power plant, both at steady state and during transient events. To evaluate the thermo-mechanical performance of fuel elements during transient events, an interface between the FMS codes and our own Fuel Element Thermo-Mechanical Analysis (FETMA) code is currently being developed and implemented. In this work, the thermo-mechanical behavior of fuel rods in the hot channel is presented for simulated transient events of a BWR nuclear power plant. The transient events considered are a load rejection and a feedwater control failure, which are among the most important events that can occur in a BWR. The results showed that conditions leading to fuel rod failure did not appear at any time in either event. It is also shown that the load rejection transient is more demanding in terms of safety than the feedwater controller failure. (authors)

  6. Root cause analysis of critical events in neurosurgery, New South Wales.

    Science.gov (United States)

    Perotti, Vanessa; Sheridan, Mark M P

    2015-09-01

    Adverse events reportedly occur in 5% to 10% of health care episodes. Not all adverse events are the result of error; they may arise from systemic faults in the delivery of health care. Catastrophic events are not only physically devastating to patients, but they also attract medical liability and increase health care costs. Root cause analysis (RCA) has become a key tool for health care services to understand those adverse events. This study is a review of all the RCA case reports involving neurosurgical patients in New South Wales between 2008 and 2013. The case reports and data were obtained from the Clinical Excellence Commission database. The data was then categorized by the root causes identified and the recommendations suggested by the RCA committees. Thirty-two case reports were identified in the RCA database. Breaches in policy account for the majority of root causes identified, for example, delays in transfer of patients or wrong-site surgery, which always involved poor adherence to correct patient and site identification procedures. The RCA committees' recommendations included education for staff, and improvements in rostering and procedural guidelines. RCAs have improved the patient safety profile; however, the RCA committees have no power to enforce any recommendation or ensure compliance. A single RCA may provide little learning beyond the unit and staff involved. However, through aggregation of RCA data and dissemination strategies, health care workers can learn from adverse events and prevent future events from occurring. © 2015 Royal Australasian College of Surgeons.

  7. Hazard analysis of typhoon-related external events using extreme value theory

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Chan; Jang, Seung Cheol [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lim, Tae Jin [Dept. of Industrial Information Systems Engineering, Soongsil University, Seoul (Korea, Republic of)

    2015-02-15

    After the Fukushima accident, the importance of hazard analysis for extreme external events was recognized. To analyze typhoon-induced hazards, which are among the significant natural disasters of East Asian countries, a statistical analysis using extreme value theory, a method for estimating the annual exceedance frequency of a rare event, was conducted to estimate occurrence intervals and hazard levels. For four meteorological variables, maximum wind speed, instantaneous wind speed, hourly precipitation, and daily precipitation, the parameters of predictive extreme value theory models were estimated. The 100-year return levels for each variable were predicted using the developed models and compared with previously reported values. Significant long-term climate trends in wind speed and precipitation were also found. A fragility analysis should be conducted to confirm the safety of a nuclear power plant at the wind speeds and precipitation levels that exceed those of the previous analysis.
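    The return-level computation described above can be illustrated with the simplest block-maxima model: a Gumbel (extreme value type I) distribution fitted by the method of moments. This is a generic stdlib-only sketch of the technique, not the authors' model (they fit predictive models for four variables); the names and sample data are hypothetical.

```python
import math
import statistics

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def fit_gumbel(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to block maxima.

    For a Gumbel, mean = mu + gamma*beta and sd = beta*pi/sqrt(6),
    so both parameters follow from the sample mean and sample sd.
    """
    mean = statistics.fmean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi    # scale
    mu = mean - EULER_GAMMA * beta         # location
    return mu, beta

def return_level(mu, beta, T):
    """T-year return level: solves F(x) = 1 - 1/T for the Gumbel CDF
    F(x) = exp(-exp(-(x - mu)/beta)), i.e. the level exceeded on
    average once every T years."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual maxima of hourly precipitation (mm):
mu, beta = fit_gumbel([30.2, 28.1, 45.0, 33.3, 38.7, 29.9, 41.2])
r100 = return_level(mu, beta, 100)  # 100-year return level
```

In practice the paper's comparison step would repeat this for each variable and set the standardized 100-year level against previously reported values; a GEV fit with a shape parameter is the usual refinement when the Gumbel assumption is too restrictive.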

  8. Screening Analysis of Criticality Features, Events, and Processes for License Application

    International Nuclear Information System (INIS)

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in ''Configuration Generator Model'', a component of the methodology from ''Disposal Criticality Analysis Methodology Topical Report''. The total probability per package of criticality is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be

  9. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
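    A minimal sketch of the clustering-based idea: learn cluster centroids from the bulk of the data, then score each observation by its distance to the nearest centroid, so points far from every "normal" mode surface as anomalies. This is an illustration of the general technique, not the project's algorithm; all names and data are hypothetical, and a plain k-means stands in for whatever clustering the framework actually uses.

```python
import math
import random

def dist(a, b):
    """Euclidean distance between two points given as tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def mean(cluster):
    """Component-wise mean of a non-empty list of points."""
    n = len(cluster)
    return tuple(sum(p[d] for p in cluster) / n for d in range(len(cluster[0])))

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means; the centroids summarize the dominant 'normal' modes."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: dist(p, centroids[c]))
            clusters[i].append(p)
        # Keep the old centroid if a cluster empties out.
        centroids = [mean(cl) if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids

def anomaly_scores(points, centroids):
    """Unsupervised anomaly score: distance to the nearest centroid.
    Points far from every learned mode rank as the most anomalous."""
    return [min(dist(p, c) for c in centroids) for p in points]
```

Ranking events by "interestingness" then amounts to sorting grouped outliers by aggregates of these scores (e.g., rareness or outlier count), which is the step the framework exposes to users.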

  10. Identification of fire modeling issues based on an analysis of real events from the OECD FIRE database

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, Dominik [Swiss Federal Nuclear Safety Inspectorate ENSI, Brugg (Switzerland)

    2017-03-15

    Precursor analysis is widely used in the nuclear industry to judge the safety significance of events. However, in the case of events that may damage equipment through effects that are not ordinary functional dependencies, the analysis may not always fully capture the potential for further evolution of the event. For fires, which are one class of such events, this paper discusses modelling challenges that need to be overcome when performing a probabilistic precursor analysis. The events analyzed are selected from the Organisation for Economic Co-operation and Development (OECD) Fire Incidents Records Exchange (FIRE) Database.

  11. Is the efficacy of antidepressants in panic disorder mediated by adverse events? A mediational analysis.

    Directory of Open Access Journals (Sweden)

    Irene Bighelli

    It has been hypothesised that the perception of adverse events in placebo-controlled antidepressant clinical trials may lead patients to conclude that they have been randomized to the active arm of the trial, breaking the blind. This may enhance expectancies for improvement and the therapeutic response. The main objective of this study is to test the hypothesis that the efficacy of antidepressants in panic disorder is mediated by the perception of adverse events. The present analysis is based on a systematic review of published and unpublished randomised trials comparing antidepressants with placebo for panic disorder. The Baron and Kenny approach was applied to investigate the mediational role of adverse events in the relationship between antidepressant treatment and efficacy. Fourteen placebo-controlled antidepressant trials were included in the analysis. We found that: (a) antidepressant treatment was significantly associated with better treatment response (β = 0.127, 95% CI 0.04 to 0.21, p = 0.003); (b) antidepressant treatment was not associated with adverse events (β = 0.094, 95% CI -0.05 to 0.24, p = 0.221); (c) adverse events were negatively associated with treatment response (β = -0.035, 95% CI -0.06 to -0.005, p = 0.022). Finally, after adjustment for adverse events, the relationship between antidepressant treatment and treatment response remained statistically significant (β = 0.122, 95% CI 0.01 to 0.23, p = 0.039). These findings do not support the hypothesis that the perception of adverse events in placebo-controlled antidepressant clinical trials leads to the breaking of blind and an artificial inflation of the efficacy measures. Based on these results, we argue that the moderate therapeutic effect of antidepressants in individuals with panic disorder is not an artefact and therefore reflects a genuine effect that doctors can expect to replicate under real-world conditions.
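    The Baron and Kenny steps referenced above (total effect of treatment, treatment-to-mediator path, and direct effect after adjusting for the mediator) can be sketched with plain least squares. This stdlib-only sketch is illustrative only: the actual analysis operates on trial-level data with proper inference, and all names and numbers here are hypothetical.

```python
def slope(x, y):
    """OLS slope of y on x (simple one-predictor regression)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def residuals(x, y):
    """Residuals of y after regressing it on x."""
    b = slope(x, y)
    n = len(x)
    a = sum(y) / n - b * sum(x) / n
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

def slope_adjusted(x, m, y):
    """Coefficient of x in the two-predictor regression y ~ x + m,
    obtained via the Frisch-Waugh partialling-out identity."""
    rx = residuals(m, x)   # x with the mediator partialled out
    ry = residuals(m, y)   # y with the mediator partialled out
    return slope(rx, ry)

def baron_kenny(treat, mediator, outcome):
    """The three Baron-Kenny paths: c (total effect of treatment),
    a (treatment -> mediator), and c' (direct effect of treatment
    adjusting for the mediator). Full mediation drives c' toward 0."""
    return {
        "total": slope(treat, outcome),
        "treat_to_mediator": slope(treat, mediator),
        "direct": slope_adjusted(treat, mediator, outcome),
    }
```

In the study's terms, the mediation hypothesis would predict a large treatment-to-adverse-events path and a direct effect shrinking toward zero after adjustment; the reported coefficients show neither.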

  12. A multiprocessor system for the analysis of pictures of nuclear events

    CERN Document Server

    Bacilieri, P; Matteuzzi, P; Sini, G P; Zanotti, U

    1979-01-01

    The pictures of nuclear events obtained from bubble chambers such as Gargamelle and BEBC at CERN, and others from Serpukhov, are geometrically processed at CNAF (Centro Nazionale Analisi Fotogrammi) in Bologna. The analysis system includes an Erasme table and a CRT flying-spot digitizer. The difficulties posed by the pictures of the four stereoscopic views of the bubble chambers are overcome by the choice of a strongly interactive system. (0 refs).

  13. Working group of experts on rare events in human error analysis and quantification

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1977-01-01

    In dealing with the reference problem of rare events in nuclear power plants, the group has concerned itself with the man-machine system and, in particular, with human error analysis and quantification. The Group was requested to review methods of human reliability prediction, to evaluate the extent to which such analyses can be formalized, and to establish criteria to be met by task conditions and system design which would permit a systematic, formal analysis. Recommendations are given on the Fessenheim safety system.

  14. Bacterial contamination of re-usable laryngoscope blades during the ...

    African Journals Online (AJOL)

    We aimed to assess the level of microbial contamination of re-usable laryngoscope blades at a public hospital in South Africa. Setting. The theatre complex of a secondary-level public hospital in Johannesburg. Methods. Blades from two different theatres were sampled twice daily, using a standardised technique, over a ...

  15. A reusable multi-agent architecture for active intelligent websites

    NARCIS (Netherlands)

    Jonker, C.M.; Lam, R.A.; Treur, J.

    In this paper a reusable multi-agent architecture for intelligent Websites is presented and illustrated for an electronic department store. The architecture has been designed and implemented using the compositional design method for multi-agent systems DESIRE. The agents within this architecture are

  16. Research Data Reusability: Conceptual Foundations, Barriers and Enabling Technologies

    Directory of Open Access Journals (Sweden)

    Costantino Thanos

    2017-01-01

    Full Text Available High-throughput scientific instruments are generating massive amounts of data. Today, one of the main challenges faced by researchers is to make the best use of the world's growing wealth of data. Data (re)usability is becoming a distinct characteristic of modern scientific practice. By data (re)usability, we mean the ease with which data produced by one community of research (the producer community) can be used for legitimate scientific research by one or more other communities of research (consumer communities). Data (re)usability allows the reanalysis of evidence, reproduction and verification of results, minimizing duplication of effort, and building on the work of others. It has four main dimensions: policy, legal, economic and technological. The paper addresses the technological dimension of data reusability. The conceptual foundations of data reuse as well as the barriers that hamper data reuse are presented and discussed. The data publication process is proposed as a bridge between the data author and user, and the relevant technologies enabling this process are presented.

  17. Wound dressing with reusable electronics for wireless monitoring

    KAUST Repository

    Shamim, Atif; Farooqui, Muhammad Fahad

    2016-01-01

    A wound dressing device with reusable electronics for wireless monitoring and a method of making the same are provided. The device can be a smart device. In an embodiment, the device has a disposable portion including one or more sensors and a

  18. Hospital information system: reusability, designing, modelling, recommendations for implementing.

    Science.gov (United States)

    Huet, B

    1998-01-01

    The aims of this paper are to specify some essential conditions for building reuse models for hospital information systems (HIS) and to present an application for hospital clinical laboratories. Reusability is a general trend in software; however, reuse can involve a greater or lesser part of the design, classes and programs, so a project involving reusability must be precisely defined. The introduction covers trends in software, the stakes of reuse models for HIS, and the special use case constituted by a HIS. The three main parts of this paper are: 1) designing a reuse model (which objects are common to several information systems?); 2) a reuse model for hospital clinical laboratories (a gen-spec object model is presented for all laboratories: biochemistry, bacteriology, parasitology, pharmacology, ...); 3) recommendations for generating plug-compatible software components (a reuse model can be implemented as a framework; concrete factors that increase reusability are presented). In conclusion, reusability is a subtle exercise for which the project must be previously and carefully defined.

  19. Towards a reusable architecture for message exchange in pervasive healthcare

    NARCIS (Netherlands)

    Cardoso de Moraes, J.L.; Lopes de Souza, Wanderley; Ferreira Pires, Luis; do Prado, Antonio Francisco; Hammoudi, S.; Maciaszek, L.A.; Cordeiro, J.; Dietz, J.L.G.

    The main objective of this paper is to present a reusable architecture for message exchange in pervasive healthcare environments meant to be generally applicable to different applications in the healthcare domain. This architecture has been designed by integrating different concepts and technologies

  20. Identification of homogeneous regions for rainfall regional frequency analysis considering typhoon event in South Korea

    Science.gov (United States)

    Heo, J. H.; Ahn, H.; Kjeldsen, T. R.

    2017-12-01

    South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. Traditionally, however, regional frequency analysis models did not consider this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events occurring only in some years (typhoon). The available annual maximum 24-hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum came from either (1) a typhoon event or (2) a non-typhoon event. A three-parameter GEV distribution was then fitted to each sub-sample, along with a weighting parameter characterizing the proportion of historical annual maxima associated with typhoon events. Spatial patterns of the model parameters were analyzed and showed that typhoon events are less commonly associated with annual maximum rainfall in the north-west part of the country (Seoul area) and more prevalent in the southern and eastern parts, leading to the formation of two distinct typhoon regions: (1) north-west and (2) southern and eastern. Using a leave-one-out procedure, the new regional frequency model was tested and compared to a more traditional index-flood method. The results showed that the impact of typhoons on design events might previously have been underestimated in the Seoul area. This suggests that the mixture model should be preferred where the typhoon phenomenon is less frequent and can thus have a significant effect on the rainfall-frequency curve. This research was supported by a grant (2017-MPSS31
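    The two-component mixture described above can be sketched as follows, assuming SciPy's genextreme for the GEV fits and synthetic sub-samples in place of the KMA records:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)

# Hypothetical annual-maximum 24 h rainfall (mm), split by whether the
# annual maximum came from a typhoon; values are synthetic, not KMA data.
typhoon_max = genextreme.rvs(-0.2, loc=180, scale=60, size=40, random_state=rng)
monsoon_max = genextreme.rvs(0.1, loc=120, scale=30, size=60, random_state=rng)

# Weighting parameter: proportion of years whose maximum is typhoon-driven.
p_ty = len(typhoon_max) / (len(typhoon_max) + len(monsoon_max))

# Fit a separate three-parameter GEV to each sub-sample.
fit_ty = genextreme.fit(typhoon_max)
fit_mon = genextreme.fit(monsoon_max)

def mixture_cdf(x):
    """Annual non-exceedance probability under the two-component mixture."""
    return (p_ty * genextreme.cdf(x, *fit_ty)
            + (1 - p_ty) * genextreme.cdf(x, *fit_mon))

# 100-year design rainfall: solve F(x) = 1 - 1/100 by bisection.
lo, hi = 0.0, 2000.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mixture_cdf(mid) < 0.99:
        lo = mid
    else:
        hi = mid
print(f"100-year design rainfall ~ {mid:.0f} mm")
```

    Because the heavier-tailed typhoon component dominates the upper tail, the mixture quantile exceeds what a single distribution fitted to the pooled sample would suggest in typhoon-prone regions.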

  1. An Initiating-Event Analysis for PSA of Hanul Units 3 and 4: Results and Insights

    International Nuclear Information System (INIS)

    Kim, Dong-San; Park, Jin Hee

    2015-01-01

    As part of the PSA for Hanul units 3 and 4, an initiating-event (IE) analysis was newly performed considering the current state of knowledge and the requirements of the ASME/ANS probabilistic risk assessment (PRA) standard related to IE analysis. This paper describes the methods, results and some insights from that analysis. In comparison with the previous IE analysis, this study performed a more systematic and detailed analysis to identify potential initiating events, and calculated the IE frequencies using state-of-the-art methods and the latest data. As a result, several IE frequencies differ considerably from the previous values, which can change the major accident sequences obtained from the quantification of the PSA model.

  2. Bantam: A Systematic Approach to Reusable Launch Vehicle Technology Development

    Science.gov (United States)

    Griner, Carolyn; Lyles, Garry

    1999-01-01

    The Bantam technology project is focused on providing a low cost launch capability for very small (100 kilogram) NASA and University science payloads. The cost goal has been set at one million dollars per launch. The Bantam project, however, represents much more than a small payload launch capability. Bantam represents a unique, systematic approach to reusable launch vehicle technology development. This technology maturation approach will enable future highly reusable launch concepts in any payload class. These launch vehicle concepts of the future could deliver payloads for hundreds of dollars per pound, enabling dramatic growth in civil and commercial space enterprise. The National Aeronautics and Space Administration (NASA) has demonstrated a better, faster, and cheaper approach to science discovery in recent years. This approach is exemplified by the successful Mars Exploration Program led by the Jet Propulsion Laboratory (JPL) for the NASA Space Science Enterprise. The Bantam project represents an approach to space transportation technology maturation that is very similar to the Mars Exploration Program. The NASA Advanced Space Transportation Program (ASTP) and Future X Pathfinder Program will combine to systematically mature reusable space transportation technology from low technology readiness to system level flight demonstration. New reusable space transportation capability will be demonstrated at a small (Bantam) scale approximately every two years. Each flight demonstration will build on the knowledge derived from the previous flight tests. The Bantam scale flight demonstrations will begin with the flights of the X-34. The X-34 will demonstrate reusable launch vehicle technologies including: flight regimes up to Mach 8 and 250,000 feet, autonomous flight operations, all weather operations, twenty-five flights in one year with a surge capability of two flights in less than twenty-four hours, and safe abort. The Bantam project will build on this initial

  3. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Science.gov (United States)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
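    The inlier/outlier segmentation described above can be sketched end to end on a toy feature sequence; the window length, kernel scale, subsequence statistics, and the injected "highlight" are all invented stand-ins for the paper's low/mid-level audio features:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 1-D audio feature sequence: a stationary background with a
# short "highlight" burst inserted; windows stand in for subsequences.
series = rng.normal(0.0, 1.0, 600)
series[300:330] += 6.0                     # the sparse "interesting" event

win = 30
windows = series.reshape(-1, win)          # 20 non-overlapping subsequences

# Model each subsequence by (mean, std); affinity = Gaussian kernel on
# the distance between those statistics.
stats = np.column_stack([windows.mean(1), windows.std(1)])
d2 = ((stats[:, None, :] - stats[None, :, :]) ** 2).sum(-1)
affinity = np.exp(-d2 / d2.mean())

# The dominant eigenvector of the affinity matrix weights subsequences by
# how well they fit the background cluster; outliers get near-zero weight.
vals, vecs = np.linalg.eigh(affinity)
dominant = np.abs(vecs[:, -1])
outlier = int(np.argmin(dominant))
print(outlier)  # index of the most outlying subsequence
```

    Ranking subsequences by their (low) eigenvector weight gives the departure-from-background ordering that the framework uses to build summaries of any desired length.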

  4. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time-series records is mainly carried out using rainfall rate or rain accumulation estimated over a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we used the concept of the rain event. Indeed, the discrete and intermittent nature of rain processes makes some features inadequate when defined over a fixed duration: long integration times (hour, day) mix rainy and clear-air periods in the same sample, while short integration times (seconds, minutes) lead to noisy data that are highly sensitive to detector characteristics. Analysing whole rain events instead of individual short samples of fixed duration clarifies the relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses the intra-event variability partly due to measurement uncertainties and allows focusing on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of a self-organising map is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space, in an unsupervised way, while preserving the initial space topology as much as possible. The obtained SOM reveals the dependencies between variables and consequently allows removing redundant ones, leading to a minimal subset of only five features (the event duration, the rain-rate peak, the rain-event depth, the event rain-rate standard deviation and the absolute rain-rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analysed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature.
    This confirms
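    A minimal SOM of the kind used above can be sketched on synthetic event features; the grid size, learning schedule, and the three-feature model below are assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic rain-event features (duration, rain-rate peak, event depth):
# depth is roughly duration x rate, so the features are strongly dependent.
n = 200
duration = rng.gamma(2.0, 30.0, n)             # minutes
peak = rng.gamma(2.0, 5.0, n)                  # mm/h
depth = duration * peak / 60 * rng.uniform(0.4, 0.6, n)
X = np.column_stack([duration, peak, depth])
X = (X - X.mean(0)) / X.std(0)                 # standardize features

# Minimal 2-D SOM: assumed 6x6 grid, Gaussian neighbourhood, fixed rate.
grid = np.array([(i, j) for i in range(6) for j in range(6)], float)
W = rng.normal(size=(36, 3))
for t in range(2000):
    x = X[rng.integers(n)]
    bmu = np.argmin(((W - x) ** 2).sum(1))         # best-matching unit
    sigma = 3.0 * np.exp(-t / 1000)                # shrinking neighbourhood
    h = np.exp(-((grid - grid[bmu]) ** 2).sum(1) / (2 * sigma ** 2))
    W += 0.1 * h[:, None] * (x - W)                # pull units toward x

# Dependent features vary together across the trained map: their component
# planes are correlated, which is how redundant variables are spotted.
corr = np.corrcoef(W[:, 0], W[:, 2])[0, 1]
print(round(corr, 2))
```

    A highly correlated pair of component planes (here duration vs depth) flags a redundant variable that can be dropped from the minimal feature set.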

  5. Disruptive event uncertainties in a perturbation approach to nuclear waste repository risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, T.F.

    1980-09-01

    A methodology is developed for incorporating a full range of the principal forecasting uncertainties into a risk analysis of a nuclear waste repository. The result of this methodology is a set of risk curves similar to those used by Rasmussen in WASH-1400. The set of curves is partially derived from a perturbation approach to analyze potential disruptive event sequences. Such a scheme could be useful in truncating the number of disruptive event scenarios and providing guidance to those establishing data-base development priorities.

  6. Analysis of transverse momentum and event shape in νN scattering

    International Nuclear Information System (INIS)

    Bosetti, P.C.; Graessler, H.; Lanske, D.; Schulte, R.; Schultze, K.; Simopoulou, E.; Vayaki, A.; Barnham, K.W.J.; Hamisi, F.; Miller, D.B.; Mobayyen, M.M.; Wainstein, S.; Aderholz, M.; Hantke, D.; Hoffmann, E.; Katz, U.F.; Kern, J.; Schmitz, N.; Wittek, W.; Albajar, C.; Batley, J.R.; Myatt, G.; Perkins, D.H.; Radojicic, D.; Renton, P.; Saitta, S.; Bullock, F.W.; Burke, S.

    1990-01-01

    The transverse momentum distributions of hadrons produced in neutrino-nucleon charged-current interactions, and their dependence on W, are analysed in detail. It is found that the components of the transverse momentum in the event plane and normal to it increase with W at about the same rate throughout the available W range. A comparison with e+e- data is made. Studies of the energy flow and angular distributions in the events classified as planar do not show clear evidence for high-energy, wide-angle gluon radiation, in contrast to the conclusion of a previous analysis of similar neutrino data. (orig.)

  7. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for the analysis of external events in the PSA of the Jose Cabrera NPP. In the fire analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, was used. Generic cases were modelled and general conclusions obtained that are applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The flood analysis methodology is based on the construction of event trees to represent flood propagation, dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood-area level, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the fire and flood analyses a quantification methodology has been adopted which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the internal-events models, the gates, basic events or headers to which failure (probability 1) due to damage is assigned. (Author)

  8. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using discrete-event techniques. Conveniently, the tool is organized in four modules: library design, model construction, simulation, and experimentation and analysis. The library design module supports the building of library knowledge, including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode-transition processes and mode-dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and then executes events, with selective inheritance of characteristics through a time-and-event schema, until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics and includes the ability to compare log files.
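    The event-queue behavior described above (invocation statements with time delays, executed until the queue empties) can be sketched as a minimal kernel; the component and mode names are invented:

```python
import heapq

class Simulator:
    """Minimal discrete-event kernel: a clock plus a priority queue of
    (time, sequence, action) entries, run until the queue is empty."""
    def __init__(self):
        self.clock = 0.0
        self.queue = []
        self.seq = 0          # tie-breaker for events at the same time
        self.log = []

    def schedule(self, delay, action):
        heapq.heappush(self.queue, (self.clock + delay, self.seq, action))
        self.seq += 1

    def run(self):
        while self.queue:
            self.clock, _, action = heapq.heappop(self.queue)
            action(self)

# A two-mode "valve" whose continuous opening behavior is discretized
# into mode-transition events separated by a time delay.
def open_valve(sim):
    sim.log.append((sim.clock, "valve open"))
    sim.schedule(5.0, close_valve)        # effect statement with delay

def close_valve(sim):
    sim.log.append((sim.clock, "valve closed"))

sim = Simulator()
sim.schedule(2.0, open_valve)
sim.run()
print(sim.log)   # [(2.0, 'valve open'), (7.0, 'valve closed')]
```

    Each mode-transition process is just an action that logs its effect and schedules the next transition, which is the essence of defining continuous behavior discretely.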

  9. Cost comparison of re-usable and single-use fibrescopes in a large English teaching hospital.

    Science.gov (United States)

    McCahon, R A; Whynes, D K

    2015-06-01

    A number of studies in the U.S.A. and mainland Europe have described the costs of fibreoptic tracheal intubation. However, no such data from the UK appear to be available. We performed a cost assessment of fibreoptic intubation, using re-usable (various devices from Olympus, Acutronic and Karl Storz) and single-use (Ambu aScope) fibrescopes, at the Queen's Medical Centre, Nottingham, U.K., between 1 January 2009 and 31 March 2014. The total annual cost of fibreoptic intubation with re-usable fibrescopes was £46,385. Based on 141 fibreoptic intubations per year, this equated to £329 per use, an average dominated by repair/maintenance costs (43%) and capital depreciation costs (42%). In comparison, the cost of using single-use fibrescopes for the same workload would have been around £200 per use. The analysis enabled us to develop a generic model describing the relationship between total cost of use and number of uses for a fibrescope. An 'isopleth' was identified for this relationship: a line joining all the points where the costs of re-usable and single-use fibrescopes are equal. It appears cheaper to use single-use fibrescopes at up to 200 fibreoptic intubations per year (a range commensurate with normal practice), even when the repair rate for re-usable fibrescopes is low. Any centre, knowing its fibrescope use and repair rate, can plot its data similarly to help ascertain whether the re-usable or the single-use fibrescope represents better value. © 2015 The Association of Anaesthetists of Great Britain and Ireland.
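    The break-even reasoning above can be sketched with the paper's headline figures; treating the re-usable fleet as a pure fixed annual cost (no per-use consumables) is a simplifying assumption:

```python
# Headline figures from the abstract; the fixed-cost model is an assumption.
REUSABLE_ANNUAL_COST = 46385.0   # GBP/year: repairs, depreciation, etc.
SINGLE_USE_COST = 200.0          # approx. GBP per single-use fibrescope

def reusable_cost_per_use(n_uses_per_year):
    """Average cost per intubation when annual costs are essentially fixed."""
    return REUSABLE_ANNUAL_COST / n_uses_per_year

print(round(reusable_cost_per_use(141)))   # → 329, as reported

# Workload at which the two options cost the same (one isopleth point):
break_even = REUSABLE_ANNUAL_COST / SINGLE_USE_COST
print(round(break_even))                   # → 232 uses/year
```

    This naive model puts break-even near 230 uses per year; the paper's fuller model, which also varies the repair rate, places the practical threshold nearer 200.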

  10. Carbon Footprint in Flexible Ureteroscopy: A Comparative Study on the Environmental Impact of Reusable and Single-Use Ureteroscopes.

    Science.gov (United States)

    Davis, Niall F; McGrath, Shannon; Quinlan, Mark; Jack, Gregory; Lawrentschuk, Nathan; Bolton, Damien M

    2018-03-01

    There are no comparative assessments of the environmental impact of endourologic instruments. We evaluated and compared the environmental impact of single-use flexible ureteroscopes with that of reusable flexible ureteroscopes. An analysis of the typical life cycle of the LithoVue™ (Boston Scientific) single-use digital flexible ureteroscope and the Olympus Flexible Video Ureteroscope (URV-F) was performed. To measure the carbon footprint, data were obtained on the manufacturing of single-use and reusable flexible ureteroscopes and on typical uses of a reusable scope, including repairs, replacement instruments, and ultimate disposal of both ureteroscopes. The solid waste generated (kg) and energy consumed (kWh) during each case were quantified and converted into their equivalent mass of carbon dioxide (kg of CO2) released. The raw materials of a flexible ureteroscope comprise plastic (90%), steel (4%), electronics (4%), and rubber (2%). The manufacturing carbon footprint of a flexible ureteroscope was 11.49 kg of CO2 per 1 kg of ureteroscope. The weights of the single-use LithoVue and the URV-F flexible ureteroscope were 0.3 and 1 kg, respectively. The total carbon footprint of the life-cycle assessment of the LithoVue was 4.43 kg of CO2 per endourologic case; that of the reusable ureteroscope was 4.47 kg of CO2 per case. The environmental impacts of the reusable flexible ureteroscope and the single-use flexible ureteroscope are therefore comparable. Urologists should be aware that the typical life cycle of urologic instruments is a concerning source of environmental emissions.
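    The manufacturing share of those per-case footprints can be recomputed from the quoted figures; the 100-case amortisation lifetime below is an assumed number, and reprocessing energy and disposal (which the study's totals include) are not modelled:

```python
# Manufacturing component only, from the abstract's figures.
CO2_PER_KG_SCOPE = 11.49                       # kg CO2 per kg of ureteroscope

single_use_mfg = 0.3 * CO2_PER_KG_SCOPE        # one 0.3 kg scope per case
reusable_mfg_per_case = 1.0 * CO2_PER_KG_SCOPE / 100   # 1 kg over ~100 cases (assumed)

print(round(single_use_mfg, 2))        # → 3.45 kg CO2 per case
print(round(reusable_mfg_per_case, 3)) # → 0.115 kg CO2 per case
```

    Manufacturing dominates the single-use footprint (3.45 of 4.43 kg CO2), while for the reusable scope almost all of the 4.47 kg per case must come from reprocessing, repairs and disposal rather than manufacture.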

  11. Internal event analysis for Laguna Verde Unit 1 Nuclear Power Plant. Accident sequence quantification and results

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1994-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the Internal Event Analysis for Laguna Verde Unit 1 Nuclear Power Plant, CNSNS-TR 004, in five volumes. The reports are organized as follows: CNSNS-TR 004 Volume 1: Introduction and Methodology. CNSNS-TR 004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR 004 Volume 3: System Analysis. CNSNS-TR 004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR 005 Volume 5: Appendices A, B and C. This volume presents the development of the dependent-failure analysis, the treatment of the support-system dependencies, the identification of the shared-component dependencies, and the treatment of common cause failure. Also presented are the main human actions considered, along with the possible recovery actions included, as well as the development of the data base and its assumptions and limitations. The accident sequence quantification process and the resolution of the core-vulnerable sequences are presented, together with the sources and treatment of uncertainties associated with failure rates, component unavailabilities, initiating event frequencies, and human error probabilities. Finally, the main results and conclusions of the Internal Event Analysis for the Laguna Verde Nuclear Power Plant are presented. The total core damage frequency calculated is 9.03x10-5 per year for internal events. The most dominant accident sequences found are the transients involving the loss of offsite power, the station blackout accidents, and the anticipated transients without scram (ATWS). (Author)
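    The sequence-quantification arithmetic behind such a result can be illustrated with a toy model: each sequence contributes (initiating-event frequency per year) x (conditional failure probabilities along the sequence). The sequences and all numbers below are invented and vastly simpler than a real PSA:

```python
# Toy point-estimate quantification; every number here is invented.
sequences = {
    "LOOP -> diesel generators fail -> batteries depleted":
        0.05 * 1e-2 * 1e-1,      # IE freq/yr x P(EDG fail) x P(battery depleted)
    "ATWS -> boron injection fails":
        1e-1 * 1e-4,
    "Transient -> feedwater lost -> bleed-and-feed fails":
        1.0 * 1e-2 * 2e-3,
}

# Core damage frequency = sum of sequence frequencies (rare-event approx.).
cdf = sum(sequences.values())
dominant = max(sequences, key=sequences.get)
print(f"point-estimate CDF = {cdf:.2e}/yr, dominant sequence: {dominant}")
```

    Updating any initiating-event frequency rescales every sequence that starts from it, which is why a revised IE analysis can reorder the dominant accident sequences.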

  12. Analysis methodology for the post-trip return to power steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-06-01

    An analysis of Steam Line Break (SLB) events which result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. The analysis methodology for post-trip RTP SLB is quite different from that for non-RTP SLB and is more difficult. It is therefore necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to a local heatup in the vicinity of the stuck CEA, and were compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance: Departure from Nucleate Boiling Ratio (DNBR) and Linear Heat Generation Rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  13. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, de H.C.W.; Zwaan, L.; Timmermans, D.R.M.; Groenewegen, P.P.; Wagner, C.

    2009-01-01

    BACKGROUND: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  14. Analysis of unintended events in hospitals : inter-rater reliability of constructing causal trees and classifying root causes

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Groenewegen, P.P.; Timmermans, D.

    2009-01-01

    Background. Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  15. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes.

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Timmermans, D.; Groenewegen, P.; Wagner, C.

    2009-01-01

    Background: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  16. Antipsychotics, glycemic disorders, and life-threatening diabetic events: a Bayesian data-mining analysis of the FDA adverse event reporting system (1968-2004).

    Science.gov (United States)

    DuMouchel, William; Fram, David; Yang, Xionghu; Mahmoud, Ramy A; Grogg, Amy L; Engelhart, Luella; Ramaswamy, Krishnan

    2008-01-01

    This analysis compared diabetes-related adverse events associated with use of different antipsychotic agents. A disproportionality analysis of the US Food and Drug Administration (FDA) Adverse Event Reporting System (AERS) was performed. Data from the FDA postmarketing AERS database (1968 through first quarter 2004) were evaluated. Drugs studied included aripiprazole, clozapine, haloperidol, olanzapine, quetiapine, risperidone, and ziprasidone. Fourteen Medical Dictionary for Regulatory Activities (MedDRA) Primary Terms (MPTs) were chosen to identify diabetes-related adverse events; 3 groupings into higher-level descriptive categories were also studied. Three methods of measuring drug-event associations were used: proportional reporting ratio, the empirical Bayes data-mining algorithm known as the Multi-Item Gamma Poisson Shrinker, and logistic regression (LR) analysis. Quantitative measures of association strength, with corresponding confidence intervals, between drugs and specified adverse events were computed and graphed. Some of the LR analyses were repeated separately for reports from patients under and over 45 years of age. Differences in association strength were declared statistically significant if the corresponding 90% confidence intervals did not overlap. Association with various glycemic events differed for different drugs. On average, the rankings of association strength agreed with the following ordering: low association, ziprasidone, aripiprazole, haloperidol, and risperidone; medium association, quetiapine; and strong association, clozapine and olanzapine. The median rank correlation between the above ordering and the 17 sets of LR coefficients (1 set for each glycemic event) was 93%. Many of the disproportionality measures were significantly different across drugs, and ratios of disproportionality factors of 5 or more were frequently observed. There are consistent and substantial differences between atypical antipsychotic drugs in the
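    The proportional reporting ratio, one of the three association measures named above, reduces to a ratio of report proportions; a minimal sketch with invented contingency counts:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio.
    a: reports of the event for the drug of interest,
    b: all other reports for that drug,
    c: reports of the event for all other drugs,
    d: all other reports for all other drugs."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts for one drug / glycemic-event pair:
print(round(prr(120, 4880, 800, 194200), 2))   # → 5.85
```

    A PRR of 5 or more corresponds to the "ratios of disproportionality factors of 5 or more" that the abstract reports as frequently observed between drugs.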

  17. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    Science.gov (United States)

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven, population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trial patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events, defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data were obtained from ClinicalTrials.gov; 186,339 clinical studies were analyzed, and data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight age groups to support cross-age-group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest-incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years group was 31.41%, approximately 1.51 times higher (P=.04) than that of the young adult group (20-29 years) at 20.76%. The second-highest group was the 50-59 years group, with an incidence rate of 30.09%, significantly higher than the reference group. Adverse event diversity also increased with patient age.
Clinical studies that recruited older patients (older than 40 years) were more likely to observe a diverse range of adverse events.
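
    The cross-age-group comparison described above reduces to computing per-group incidence rates and their ratio against the reference group. A minimal sketch follows; the participant counts are invented for illustration and only the aggregate rates (31.41% for ages 0-9, 20.76% for ages 20-29) come from the abstract.

```python
# Sketch of the incidence-rate comparison in the abstract above.
# Counts are illustrative placeholders chosen to reproduce the
# reported aggregate rates; they are not the study's raw data.

def incidence_rate(affected, enrolled):
    """Fraction of enrolled participants with at least one adverse event."""
    return affected / enrolled

rate_children = incidence_rate(3141, 10000)   # ages 0-9
rate_reference = incidence_rate(2076, 10000)  # ages 20-29 (reference group)

# Rate ratio relative to the lowest-incidence (reference) group.
rate_ratio = rate_children / rate_reference
print(round(rate_ratio, 2))  # ≈ 1.51, matching the abstract's "1.51 times higher"
```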

  18. Living with extreme weather events - perspectives from climatology, geomorphological analysis, chronicles and opinion polls

    Science.gov (United States)

    Auer, I.; Kirchengast, A.; Proske, H.

    2009-09-01

The ongoing climate change debate focuses more and more on changing extreme events. Information on past events can be derived from a number of sources, such as instrumental data and residual impacts in the landscape, but also chronicles and people's memories. A project called "A Tale of Two Valleys" within the framework of the research program "proVision" made it possible to study past extreme events in two inner-alpine valleys using the sources mentioned above. Instrumental climate time series provided information for the past 200 years; however, great attention had to be paid to the homogeneity of the series. To derive homogenized time series of selected climate change indices, methods such as HOCLIS and the Vincent method were applied. Trend analyses of these indices indicate whether extreme events have increased or decreased. Traces of major geomorphodynamic processes of the past (e.g. rockfalls, landslides, debris flows) which were triggered or affected by extreme weather events are still apparent in the landscape and could be evaluated by geomorphological analysis using remote sensing and field data. Regional chronicles provided additional knowledge and covered longer periods back in time; however, compared with meteorological time series they carry a high degree of subjectivity, and intermittent recording cannot be ruled out. Finally, questionnaires and oral history complemented our picture of past extreme weather events: people were affected differently and remember events differently. The joint analysis of these four data sources showed agreement to some extent, but also some understandable differences: meteorological data are point measurements only, sometimes with too coarse a temporal resolution, and due to land-use changes and improved constructional measures the impact of an extreme meteorological event may be different today than in earlier times.

  19. The logic of surveillance guidelines: an analysis of vaccine adverse event reports from an ontological perspective.

    Directory of Open Access Journals (Sweden)

    Mélanie Courtot

Full Text Available BACKGROUND: When increased rates of adverse events following immunization are detected, regulatory action can be taken by public health agencies. However, to be interpretable, reports of adverse events must be encoded in a consistent way. Regulatory agencies rely on guidelines to help determine the diagnosis of the adverse events. Manual application of these guidelines is expensive, time-consuming, and open to logical errors. Representing these guidelines in a format amenable to automated processing can make this process more efficient. METHODS AND FINDINGS: Using the Brighton anaphylaxis case definition, we show that existing clinical guidelines used as standards in pharmacovigilance can be logically encoded using a formal representation such as the Adverse Event Reporting Ontology we developed. We validated the ontology-based classification of vaccine adverse event reports against existing rule-based systems and a manually curated subset of the Vaccine Adverse Event Reporting System. In doing so, we encountered a number of critical issues in the formulation and application of the clinical guidelines; we report these issues and the steps being taken to address them in current surveillance systems and in the terminological standards in use. CONCLUSIONS: By standardizing and improving the reporting process, we were able to automate diagnosis confirmation. By allowing medical experts to prioritize reports, such a system can accelerate the identification of adverse reactions to vaccines and the response of regulatory agencies. This approach of combining ontology and semantic technologies can be used to improve other areas of vaccine adverse event report analysis and should inform both the design of clinical guidelines and how they are used in the future. AVAILABILITY: Sufficient material to reproduce our results is available, including documentation, ontology, code and datasets, at http://purl.obolibrary.org/obo/aero.

  20. Analysis of the Power oscillations event in Laguna Verde Nuclear Power Plant. Preliminary Report

    International Nuclear Information System (INIS)

    Gonzalez M, V.M.; Amador G, R.; Castillo, R.; Hernandez, J.L.

    1995-01-01

The event that occurred at Unit 1 of the Laguna Verde Nuclear Power Plant on January 24, 1995, is analyzed using the Ramona 3B code. During this event, Unit 1 experienced power oscillations while operating prior to the transfer of the recirculation pumps to high speed. The phenomenon was promptly detected by the reactor operator, who shut the reactor down with a manual scram. The oscillations reached a maximum amplitude of 10.5% of nominal power, peak to peak, at a frequency of 0.5 Hz. Preliminary evaluations show that the event did not endanger fuel integrity. The results of simulating the reactor core with the Ramona 3B code show that the code is capable of modelling the reactor oscillations. Nevertheless, a more detailed simulation of the event will be necessary to prove that the code can predict the onset of the oscillations. An additional analysis will also be needed to identify the factors that influence reactor stability, so that recommendations can be made to prevent the recurrence of this kind of event. (Author)

  1. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    Science.gov (United States)

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  2. A hydrological analysis of the 4 November 2011 event in Genoa

    Directory of Open Access Journals (Sweden)

    F. Silvestro

    2012-09-01

Full Text Available On 4 November 2011 a flash flood event hit the area of Genoa with dramatic consequences. Such an event represents, from the meteorological and hydrological perspective, a paradigm of flash floods in the Mediterranean environment.

The hydro-meteorological probabilistic forecasting system for small and medium-sized catchments in use at the Civil Protection Centre of the Liguria region exhibited excellent performance during the event, predicting, 24–48 h in advance, the potential level of risk associated with the forecast. It greatly helped the decision makers in issuing a timely and correct alert.

In this work we present the operational outputs of the system provided during the Liguria events and the post-event hydrological modelling analysis that was carried out, also accounting for crowd-sourced information and data. We discuss the benefit of the implemented probabilistic system for decision-making under uncertainty, highlighting how, in this case, the multi-catchment approach used for predicting floods in small basins was crucial.

  3. Accuracy analysis of measurements on a stable power-law distributed series of events

    International Nuclear Information System (INIS)

    Matthews, J O; Hopcraft, K I; Jakeman, E; Siviour, G B

    2006-01-01

We investigate how finite measurement time limits the accuracy with which the parameters of a stably distributed random series of events can be determined. The model process is generated by timing the emigration of individuals from a population that is subject to deaths and a particular choice of multiple immigration events. This leads to a scale-free discrete random process for which customary measures, such as the mean value and variance, do not exist. However, converting the number of events occurring in fixed time intervals to a 1-bit 'clipped' process allows the construction of well-behaved statistics that still retain vestiges of the original power-law and fluctuation properties. These statistics include the clipped mean and correlation function, from measurements of which both the power-law index of the distribution of events and the time constant of its fluctuations can be deduced. We report here a theoretical analysis of the accuracy of measurements of the mean of the clipped process. This indicates that, for a fixed experiment time, the error on measurements of the sample mean is minimized by an optimum choice of the number of samples. It is shown furthermore that this choice is sensitive to the power-law index and that the approach to Poisson statistics is dominated by rare events or 'outliers'. Our results are supported by numerical simulation.
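
    The 1-bit "clipping" step described above is simple to illustrate: the count of events in each fixed interval is reduced to 0 (no events) or 1 (one or more events), giving a bounded statistic even when the raw counts are heavy-tailed. The sketch below uses a crude Pareto-like count generator as a stand-in; it is not the paper's death/multiple-immigration population model.

```python
import random

# Minimal sketch of 1-bit clipping: interval counts become 0/1, so the
# clipped mean (fraction of non-empty intervals) is always well behaved
# even when the raw counts are power-law distributed and the ordinary
# mean does not exist. The generator below is illustrative only.

def clipped_mean(counts):
    """Fraction of intervals containing at least one event."""
    return sum(1 for n in counts if n > 0) / len(counts)

random.seed(0)
# Crude heavy-tailed interval counts: Pareto-like draws shifted to start at 0.
counts = [int(random.paretovariate(0.8)) - 1 for _ in range(10_000)]

m = clipped_mean(counts)
assert 0.0 < m < 1.0  # the clipped statistic is always bounded
print(m)
```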

  4. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

A frame-based technique, comprising a physical frame, a logical frame, and a cognitive frame, was adopted to derive and analyze digital I and C failure events for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) derivation of postulated ABWR digital I and C software failure events from actual non-ABWR digital I and C software failure events reported to the LER system of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status were successfully identified; the operator might not easily recognize the abnormal condition, because the computer status seems to progress normally.

  5. Assessment of the Feasibility of Innovative Reusable Launchers

    Science.gov (United States)

    Chiesa, S.; Corpino, S.; Viola, N.

The demand for access to space, in particular to Low Earth Orbit, is increasing, and fully reusable launch vehicles (RLVs) are likely to play a key role in the development of future space activities. Up until now, no space system of this kind has been successfully carried out: today only the Space Shuttle, which belongs to the old generation of launchers, is operative, and it is not a fully reusable system. In the nineties many studies of advanced transatmospheric planes were started, but none was completed, because of the technological problems encountered and the high financial resources required, with the corresponding industrial risk. One of the most promising projects was the Lockheed Venture Star, which seemed to have a serious chance of being built. Even so, if it is ever built, it will take quite a long time, so the operative life of the Space Shuttle will have to be extended to support the International Space Station. The purpose of the present work is to assess the feasibility of different kinds of advanced reusable launch vehicles for gaining access to space and meeting the requirements of today's space flight needs, which are mainly safety and affordability. Single stage to orbit (SSTO), two stage to orbit (TSTO) and so-called "one and a half" stage to orbit vehicles are considered here to highlight their advantages and disadvantages. The "one and a half" stage to orbit vehicle takes off and climbs to meet a tanker aircraft for aerial refuelling and then, after disconnecting from the tanker, flies on to reach orbit. In this case, apart from the space vehicle, the tanker aircraft also needs a dedicated study to examine the problems related to refuelling at high subsonic speeds and at a height near the tropopause. Only winged vehicles that take off and land horizontally are considered, but different architectural layouts and propulsive configurations are hypothesised.

  6. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

The human reliability analysis (HRA) is a highly subjective evaluation of human performance and an input to probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that this subjectivism can have a large impact on human reliability results, and consequently on probabilistic safety assessment results and applications, and to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on the determination of dependency, contrasting the Institute Jozef Stefan human reliability analysis (IJS-HRA) and the standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by developing more detailed guidelines for human reliability analysis, with many practical examples for all steps of the process of evaluating human performance.
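
    The dependency determination the abstract refers to is, in both SPAR-H and THERP-derived methods, a formulaic adjustment of the conditional human error probability (HEP) once a dependence level has been judged. The standard THERP conditional-HEP formulas are sketched below; choosing the level is exactly the subjective step the paper criticizes.

```python
# Standard THERP dependency equations for the conditional probability of
# a second human failure event, given the assessed dependence level.
# The nominal HEP value is an illustrative placeholder.

THERP_DEPENDENCY = {
    "zero":     lambda p: p,
    "low":      lambda p: (1 + 19 * p) / 20,
    "moderate": lambda p: (1 + 6 * p) / 7,
    "high":     lambda p: (1 + p) / 2,
    "complete": lambda p: 1.0,
}

def conditional_hep(nominal_hep, level):
    """Conditional HEP of the second event under the given dependence level."""
    return THERP_DEPENDENCY[level](nominal_hep)

p = 1e-3  # nominal HEP for the second event (illustrative)
for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s} -> {conditional_hep(p, level):.3g}")
```

    Note how two analysts who disagree only on the level ("low" versus "high") already differ by about a factor of ten in the conditional HEP, which is the kind of subjectivity-driven spread the paper reports.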

  7. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

Cepin, M.

    2007-01-01

The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance and an input to probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that this subjectivism can have a large impact on human reliability results, and consequently on probabilistic safety assessment results and applications, and to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on the determination of dependency, contrasting the Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and the Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by developing more detailed guidelines for human reliability analysis, with many practical examples for all steps of the process of evaluating human performance. (author)

  8. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others

    2003-07-01

This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). The purpose of a Level I PSA is to delineate the accident sequences leading to core damage and to estimate their frequencies. It has been used directly for assessing and modifying system safety and reliability, as a key and basic part of PSA. A Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report covers six major procedural steps of a Level I PSA: familiarization with the plant, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical staff performing Level I PSAs for NPPs; a particular aim is to promote a standardized framework, terminology, and form of documentation for PSAs. It would also be useful for managers or regulators involved in risk-informed regulation, and for conducting PSAs in other industries.
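
    The last procedural step above, accident sequence quantification, can be sketched in a few lines: each event tree sequence frequency is the initiating event frequency multiplied by the branch probabilities along the sequence, and core damage frequency (CDF) is the sum over core damage sequences. All numbers below are illustrative placeholders, not values from the KSNP study.

```python
# Toy accident sequence quantification: initiating event frequency times
# branch probabilities, summed over core damage sequences.

IE_FREQ = 1e-2  # initiating event frequency per reactor-year (assumed)

# Branch probabilities along each postulated core damage sequence.
sequences = {
    "injection_fails":            [1e-3, 5e-2],   # both injection trains fail
    "injection_ok_cooling_fails": [0.999, 1e-4],  # injection works, cooling fails
}

def sequence_frequency(ie_freq, branch_probs):
    """Frequency of one accident sequence."""
    freq = ie_freq
    for p in branch_probs:
        freq *= p
    return freq

cdf = sum(sequence_frequency(IE_FREQ, probs) for probs in sequences.values())
print(f"core damage frequency ~ {cdf:.2e} per reactor-year")
```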

  9. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    Science.gov (United States)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events, such as sea level rise and its contribution to storm surges, extreme heat events and droughts, or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in a world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations.
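
    Once the two large ensembles exist, the attribution arithmetic itself is simple: count how often the event threshold is exceeded in the "as observed" ensemble versus the counterfactual one, then form the risk ratio and the fraction of attributable risk (FAR). A minimal sketch, with Gaussian toy ensembles standing in for the regional climate model output:

```python
import random

# Toy attribution calculation: exceedance probabilities in a factual and
# a counterfactual ensemble, then risk ratio and FAR. The ensembles are
# synthetic placeholders, not model output.

def exceedance_prob(ensemble, threshold):
    """Fraction of ensemble members at or above the event threshold."""
    return sum(1 for x in ensemble if x >= threshold) / len(ensemble)

random.seed(42)
factual        = [random.gauss(1.0, 1.0) for _ in range(10_000)]  # warmed climate
counterfactual = [random.gauss(0.0, 1.0) for _ in range(10_000)]  # no anthropogenic forcing

threshold = 2.0  # observed event magnitude (illustrative)
p1 = exceedance_prob(factual, threshold)
p0 = exceedance_prob(counterfactual, threshold)

risk_ratio = p1 / p0
far = 1.0 - p0 / p1  # fraction of attributable risk
print(f"risk ratio ~ {risk_ratio:.1f}, FAR ~ {far:.2f}")
```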

  10. Cryogenic dark matter search (CDMS II): Application of neural networks and wavelets to event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Attisha, Michael J. [Brown U.

    2006-01-01

The Cryogenic Dark Matter Search (CDMS) experiment is designed to search for dark matter in the form of Weakly Interacting Massive Particles (WIMPs) via their elastic scattering interactions with nuclei. This dissertation presents the CDMS detector technology and the commissioning of two towers of detectors at the deep underground site in Soudan, Minnesota. CDMS detectors comprise crystals of Ge and Si at temperatures of 20 mK which provide ~keV energy resolution and the ability to perform particle identification on an event-by-event basis. Event identification is performed via a two-fold interaction signature: an ionization response and an athermal phonon response. Photons and charged particles result in electron recoils in the crystal, while neutrons and WIMPs result in nuclear recoils. Since the ionization response is quenched by a factor of ~3 (~2) in Ge (Si) for nuclear recoils compared to electron recoils, the relative amplitude of the two detector responses allows discrimination between recoil types. The primary source of background events in CDMS arises from electron recoils in the outer 50 µm of the detector surface, which have a reduced ionization response. We develop a quantitative model of this 'dead layer' effect and successfully apply the model to Monte Carlo simulation of CDMS calibration data. Analysis of data from the two-tower run of March-August 2004 is performed, resulting in the world's most sensitive limits on the spin-independent WIMP-nucleon cross-section, with a 90% C.L. upper limit of 1.6 × 10^-43 cm^2 on Ge for a 60 GeV WIMP. An approach to performing surface event discrimination using neural networks and wavelets is developed. A Bayesian methodology for classifying surface events using neural networks is found to provide an optimized method based on minimization of the expected dark matter limit. The discrete wavelet analysis of CDMS phonon pulses improves surface event discrimination in conjunction with the neural network analysis.
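
    The two-fold signature described above can be sketched as a yield cut: because the ionization response of nuclear recoils is quenched (by roughly a factor of 3 in Ge, per the abstract), the ratio of ionization energy to phonon (recoil) energy separates recoil types. The 0.5 cut below is an illustrative threshold, not a CDMS calibration value.

```python
# Crude sketch of recoil-type discrimination on the ionization yield
# parameter. Energies and cut are illustrative placeholders.

def ionization_yield(ionization_keV, recoil_keV):
    """Ionization energy per unit recoil energy; electron recoils sit near 1."""
    return ionization_keV / recoil_keV

def classify(y, cut=0.5):
    """Above the cut: electron recoil; below: nuclear recoil candidate."""
    return "electron recoil" if y > cut else "nuclear recoil candidate"

print(classify(ionization_yield(20.0, 20.0)))  # yield 1.0 -> electron recoil
print(classify(ionization_yield(7.0, 20.0)))   # yield 0.35 -> nuclear recoil candidate
```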

  11. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Science.gov (United States)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

Indonesia is very vulnerable to flood disasters because it experiences high-rainfall events throughout the year. Flooding is categorized as the most significant hazard because it causes social, economic, and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets, to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) that occurred before flood disasters in areas with monsoonal, equatorial, and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall data at 0.05° spatial resolution and daily temporal resolution, the GSMaP satellite rainfall dataset operated by JAXA at 1 h temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding, together with the recharge capacity analysis, are expected to provide important information for flood mitigation in Indonesia.

  12. Brief communication: Post-event analysis of loss of life due to hurricane Harvey

    OpenAIRE

    Jonkman, Sebastiaan N.; Godfroy, Maartje; Sebastian, Antonia; Kolen, Bas

    2018-01-01

An analysis was made of the loss of life directly caused by hurricane Harvey. Information was collected for 70 fatalities that occurred directly due to the event. Most of the fatalities occurred in the greater Houston area, which was most severely affected by extreme rainfall and heavy flooding. The majority of fatalities in this area were recovered outside the designated 100- and 500-year flood zones. Most fatalities occurred due to drowning (81 %), particularly in and around vehicles...

  13. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    Science.gov (United States)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.

  14. SHAREHOLDERS VALUE AND CATASTROPHE BONDS. AN EVENT STUDY ANALYSIS AT EUROPEAN LEVEL

    OpenAIRE

    Constantin, Laura-Gabriela; Cernat-Gruici, Bogdan; Lupu, Radu; Nadotti Loris, Lino Maria

    2015-01-01

Considering that EU-based (re)insurance companies are increasingly active in the alternative risk transfer market, the aim of the present paper is to assess the impact of issuing cat bonds on shareholder value, highlighting the competitive advantages of the analysed (re)insurance companies as they consolidate their resilience in a turbulent economic environment. An eminently applied piece of research, the analysis employs an event study methodology...
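
    The event study machinery this and the following record rely on can be sketched briefly: fit a market model over an estimation window, subtract the expected return in the event window, and cumulate the abnormal returns (CAR). All return series below are invented for illustration.

```python
# Minimal event study sketch: market model fit, then cumulative abnormal
# return over the event window. Data are hypothetical daily returns.

def market_model(stock, market):
    """Least-squares fit stock_t = alpha + beta * market_t; returns (alpha, beta)."""
    n = len(stock)
    mx = sum(market) / n
    my = sum(stock) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(market, stock))
            / sum((x - mx) ** 2 for x in market))
    return my - beta * mx, beta

def cumulative_abnormal_return(stock, market, alpha, beta):
    """Sum of (actual - expected) returns over the event window."""
    return sum(y - (alpha + beta * x) for x, y in zip(market, stock))

# Pre-event estimation window, then a 3-day window around the announcement.
est_stock  = [0.010, -0.004, 0.006, 0.002, -0.008, 0.012]
est_market = [0.008, -0.002, 0.005, 0.001, -0.006, 0.010]
alpha, beta = market_model(est_stock, est_market)

car = cumulative_abnormal_return([0.002, 0.015, 0.004],
                                 [0.001, 0.003, 0.002], alpha, beta)
print(f"beta = {beta:.2f}, CAR = {car:.4f}")
```

    A significantly positive CAR around the issue date is what the studies read as the market rewarding the announcement.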

  15. FINANCIAL MARKET REACTIONS TO INTERNATIONAL MERGERS & ACQUISITIONS IN THE BREWING INDUSTRY: AN EVENT STUDY ANALYSIS

    OpenAIRE

    Heyder, Matthias; Ebneth, Oliver; Theuvsen, Ludwig

    2008-01-01

Cross-border acquisitions have been a growing trend in recent years in the world brewing industry, giving brewers the opportunity to enhance their degree of internationalization and market share remarkably. This study employs event study analysis to examine 31 mergers and acquisitions among leading European brewing groups. Differences in financial market reactions can be determined within the European peer group. Managerial implications as well as future research propositions conclude...

  16. Neural network approach in multichannel auditory event-related potential analysis.

    Science.gov (United States)

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, statistical analysis strongly indicates that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment, such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitates computer-aided analysis of neuropsychological functions.

  17. Advanced reactor passive system reliability demonstration analysis for an external event

    International Nuclear Information System (INIS)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2017-01-01

Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  18. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    Directory of Open Access Journals (Sweden)

    Matthew Bucknor

    2017-03-01

Full Text Available Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  19. Advanced reactor passive system reliability demonstration analysis for an external event

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)

    2017-03-15

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  20. Analysis and Modelling of Taste and Odour Events in a Shallow Subtropical Reservoir

    Directory of Open Access Journals (Sweden)

    Edoardo Bertone

    2016-08-01

Understanding and predicting Taste and Odour events is as difficult as it is critical for drinking water treatment plants. Following a number of events in recent years, a comprehensive statistical analysis of data from Lake Tingalpa (Queensland, Australia) was conducted. Historical manual sampling data, as well as data remotely collected by a vertical profiler, were collected; regression analysis and self-organising maps were then used to determine correlations between Taste and Odour compounds and potential input variables. Results showed that the predominant Taste and Odour compound was geosmin. Although one of the main predictors was the occurrence of cyanobacteria blooms, it was noticed that the cyanobacteria species was also critical. Additionally, water temperature, reservoir volume and oxidised nitrogen availability were key inputs determining the occurrence and magnitude of the geosmin peak events. Based on the results of the statistical analysis, a predictive regression model was developed to provide indications of the potential occurrence, and magnitude, of peaks in geosmin concentration. Additionally, it was found that the blue green algae probe of the lake's vertical profiler has the potential to be used as one of the inputs for an automated geosmin early warning system.

  1. RELAP5/MOD 3.3 analysis of Reactor Coolant Pump Trip event at NPP Krsko

    International Nuclear Information System (INIS)

    Bencik, V.; Debrecin, N.; Foretic, D.

    2003-01-01

In the paper the results of the RELAP5/MOD 3.3 analysis of the Reactor Coolant Pump (RCP) Trip event at NPP Krsko are presented. The event was initiated by an operator action aimed at preventing damage to the RCP 2 bearing. The action consisted of a power reduction that lasted 50 minutes, followed by a reactor trip and a subsequent RCP 2 trip when the reactor power had been reduced to 28%. Two minutes after the reactor trip, the Main Steam Isolation Valves (MSIV) were isolated and the steam dump flow was closed. On the secondary side the Steam Generator (SG) pressure rose until SG 1 Safety Valve (SV) 1 opened. The realistic RELAP5/MOD 3.3 analysis was performed in order to model the particular plant behavior caused by the operator actions. The comparison of the RELAP5/MOD 3.3 results with the measurements for the power reduction transient showed small differences for the major parameters (nuclear power, average temperature, secondary pressure). The main trends and physical phenomena following the RCP Trip event were well reproduced in the analysis. The parameters that have the major influence on the transient results were identified. In the paper the influence of the SG 1 relief and safety valves on the transient results is investigated more closely. (author)

  2. The January 2001, El Salvador event: a multi-data analysis

    Science.gov (United States)

    Vallee, M.; Bouchon, M.; Schwartz, S. Y.

    2001-12-01

On January 13, 2001, a large normal-faulting event (Mw=7.6) occurred 100 kilometers from the Salvadorian coast (Central America) at a centroid depth of about 50 km. The size of this event is surprising given the classical idea that such events should be much weaker than thrust events in subduction zones. We analysed this earthquake with different types of data: because teleseismic waves are the only data which offer good azimuthal coverage, we first built a kinematic source model with P and SH waves provided by the IRIS-GEOSCOPE networks. The ambiguity between the 30° plane (dipping toward the Pacific Ocean) and the 60° plane (dipping toward Central America) led us to carry out a parallel analysis of the two possible planes. We used a simple point-source model to define the main characteristics of the event and then used an extended source to retrieve the kinematic features of the rupture. For the two possible planes, this analysis reveals a downdip and northwest rupture propagation, but the difference in fit remains subtle even when using the extended source. In a second part we confronted our models for the two planes with other seismological data: (1) regional data, (2) surface wave data through an Empirical Green Function given by a similar but much weaker earthquake which occurred in July 1996, and (3) near-field data provided by Universidad Centroamericana (UCA) and Centro de Investigationes Geotecnicas (CIG). Regional data do not allow us to discriminate between the two planes either, but surface waves and especially near-field data confirm that the fault plane is the steeper one, dipping toward Central America. Moreover, the slight directivity toward the north is confirmed by surface waves.

  3. Analysis on Outcome of 3537 Patients with Coronary Artery Disease: Integrative Medicine for Cardiovascular Events

    Directory of Open Access Journals (Sweden)

    Zhu-ye Gao

    2013-01-01

Aims. To investigate the treatment of hospitalized patients with coronary artery disease (CAD) and the prognostic factors in Beijing, China. Materials and Methods. A multicenter prospective study was conducted through an integrative platform for clinical care and research at 12 hospitals in Beijing, China. The clinical information of 3537 hospitalized patients with CAD was collected from September 2009 to May 2011, and the efficacy of secondary prevention during one-year follow-up was evaluated. In addition, a logistic regression analysis was performed to identify factors with an independent impact on prognosis. Results. The average age of all patients was 64.88 ± 11.97 years, and 65.42% were male. The medications given to patients were as follows: antiplatelet drugs, 91.97%; statins, 83.66%; β-receptor blockers, 72.55%; ACEI/ARB, 58.92%; and revascularization (including PCI and CABG), 40.29%. The overall incidence of cardiovascular events was 13.26% (469/3537). The logistic stepwise regression analysis showed that heart failure (OR 3.707, 95% CI = 2.756–4.986), age ≥ 65 years (OR 2.007, 95% CI = 1.587–2.53), and myocardial infarction (OR 1.649, 95% CI = 1.322–2.057) were independent risk factors for cardiovascular events occurring during the one-year follow-up. Integrative medicine (IM) therapy showed a beneficial tendency toward decreasing the incidence of cardiovascular events, although no statistical significance was found (OR 0.797, 95% CI = 0.613–1.036). Conclusions. Heart failure, age ≥ 65 years, and myocardial infarction were associated with an increased incidence of cardiovascular events, and treatment with IM showed a tendency toward decreasing the incidence of cardiovascular events.

  4. HETEROPOLYACIDES AS GREEN AND REUSABLE CATALYSTS ...

    African Journals Online (AJOL)

    Preferred Customer

phenacyl bromide in the presence of a catalytic amount of various .... Compound 3 was characterised by NMR, IR spectroscopy and elemental analysis [40]. ... The present method does not involve any hazardous organic solvent.

  5. Probabilistic Dynamics for Integrated Analysis of Accident Sequences considering Uncertain Events

    Directory of Open Access Journals (Sweden)

    Robertas Alzbutas

    2015-01-01

Analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. In the currently used probabilistic safety assessment, however, this separation is an issue: the lack of treatment of dynamic interactions between the physical processes on one hand and random events on the other limits the assessment. In general, there are many mathematical modelling theories which can be used separately or in an integrated manner to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach of non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test case simulation in view of various method applications for severe accident scenario simulation and uncertainty analysis. For this, and for wider analysis of accident sequences, the initial test case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and representation of their uncertainty. The developed approach of accident modelling and analysis can be efficiently used to estimate the reliability of hybrid systems and, at the same time, to analyse and possibly decrease the uncertainty of this estimate.

  6. New Approaches in Reusable Booster System Life Cycle Cost Modeling

    Science.gov (United States)

    Zapata, Edgar

    2013-01-01

This paper presents the results of a 2012 life cycle cost (LCC) study of hybrid Reusable Booster Systems (RBS) conducted by NASA Kennedy Space Center (KSC) and the Air Force Research Laboratory (AFRL). The work included the creation of a new cost estimating model and an LCC analysis, building on past work where applicable, but emphasizing the integration of new approaches in life cycle cost estimation. Specifically, the inclusion of industry processes/practices and indirect costs was a new and significant part of the analysis. The focus of LCC estimation has traditionally been from the perspective of technology, design characteristics, and related factors such as reliability. Technology has informed the cost-related support to decision makers interested in risk and budget insight. This traditional emphasis on technology occurs even though it is well established that complex aerospace systems costs are mostly about indirect costs, with likely only partial influence in these indirect costs being due to the more visible technology products. Organizational considerations, processes/practices, and indirect costs are traditionally derived ("wrapped") only by relationship to tangible product characteristics. This traditional approach works well as long as it is understood that no significant changes, and by relation no significant improvements, are being pursued in the area of either the government acquisition or industry's indirect costs. In this sense then, most launch systems cost models ignore most costs. The alternative was implemented in this LCC study, whereby the approach considered technology and process/practices in balance, with as much detail for one as the other. This RBS LCC study has avoided point-designs, for now, instead emphasizing exploring the trade-space of potential technology advances joined with potential process/practice advances. Given the range of decisions, and all their combinations, it was necessary to create a model of the original model

  7. On the economics of staging for reusable launch vehicles

    Science.gov (United States)

    Griffin, Michael D.; Claybaugh, William R.

    1996-03-01

    There has been much recent discussion concerning possible replacement systems for the current U.S. fleet of launch vehicles, including both the shuttle and expendable vehicles. Attention has been focused upon the feasibility and potential benefits of reusable single-stage-to-orbit (SSTO) launch systems for future access to low Earth orbit (LEO). In this paper we assume the technical feasibility of such vehicles, as well as the benefits to be derived from system reusability. We then consider the benefits of launch vehicle staging from the perspective of economic advantage rather than performance necessity. Conditions are derived under which two-stage-to-orbit (TSTO) launch systems, utilizing SSTO-class vehicle technology, offer a relative economic advantage for access to LEO.

  8. Airframe Integration Trade Studies for a Reusable Launch Vehicle

    Science.gov (United States)

    Dorsey, John T.; Wu, Chauncey; Rivers, Kevin; Martin, Carl; Smith, Russell

    1999-01-01

    Future launch vehicles must be lightweight, fully reusable and easily maintained if low-cost access to space is to be achieved. The goal of achieving an economically viable Single-Stage-to-Orbit (SSTO) Reusable Launch Vehicle (RLV) is not easily achieved and success will depend to a large extent on having an integrated and optimized total system. A series of trade studies were performed to meet three objectives. First, to provide structural weights and parametric weight equations as inputs to configuration-level trade studies. Second, to identify, assess and quantify major weight drivers for the RLV airframe. Third, using information on major weight drivers, and considering the RLV as an integrated thermal structure (composed of thrust structures, tanks, thermal protection system, insulation and control surfaces), identify and assess new and innovative approaches or concepts that have the potential for either reducing airframe weight, improving operability, and/or reducing cost.

  9. Decomposition of business process models into reusable sub-diagrams

    Directory of Open Access Journals (Sweden)

    Wiśniewski Piotr

    2017-01-01

In this paper, an approach to automatic decomposition of business process models is proposed. According to our method, an existing BPMN diagram is disassembled into reusable parts containing the desired number of elements. Such elements and structures can work as design patterns and be validated by a user in terms of correctness. In the next step, these component models are categorised considering parameters such as the resources used, as well as input and output data. The classified components may be considered a repository of reusable parts that can be further applied in the design of new models. The proposed technique may play a significant role in facilitating the business process redesign procedure, which is of great importance in engineering and industrial applications.

  10. Procedure proposed for performance of a probabilistic safety analysis for the event of ''Air plane crash''

    International Nuclear Information System (INIS)

    Hoffmann, H.H.

    1998-01-01

A procedures guide for a probabilistic safety analysis for the external event 'Air plane crash' has been prepared. The method is based on analyses done within the framework of PSAs for German NPPs as well as on international documents. Both crashes of military air planes and crashes of commercial air planes contribute to the plant risk. For the determination of the plant-related crash rate, the air traffic is divided into 3 different categories: - the landing and takeoff phase, - the air lane traffic and waiting loop traffic, - the free air traffic; and the air planes are divided into different types and weight classes. (orig./GL) [de

  11. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    Science.gov (United States)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

The correct simulation of the sediment-transport response of mountain torrents, both for extreme and ordinary flood events, is a fundamental step to understand the process, but also to drive proper decisions on protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with the associated sediment-graph, of a flood that on the 14th of October, 2014 caused the formation of a little debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to notice that a great part of the equations developed for the computation of the bedload transport capacity, like for example that of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events heavily affecting the river-bed armour. These formulas do not provide reliable results if used on events, like the one under analysis, not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations to assess the sediment transport capacity overestimate the total volume of the event by several orders of magnitude. As a consequence, the following experimental bedload transport equation has been applied (D'Agostino and Lenzi, 1999), which is valid for ordinary flood events (q: unit water discharge; qc: unit discharge of bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs = 0.04 (q - qc) S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above-cited equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
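The D'Agostino and Lenzi (1999) relation quoted in the abstract is simple enough to sketch directly. The short Python function below is an illustrative implementation; the variable names and the clamping of sub-threshold discharges to zero are our assumptions, not part of the original paper:

```python
def bedload_rate(q, qc, slope):
    """Unit bedload rate qs = 0.04 * (q - qc) * S^(3/2).

    q     -- unit water discharge (m^2/s)
    qc    -- unit discharge at bedload transport initiation (m^2/s)
    slope -- thalweg slope S (dimensionless)

    Returns 0 when q <= qc, i.e. no transport below the initiation
    threshold (an assumption made here for illustration).
    """
    if q <= qc:
        return 0.0
    return 0.04 * (q - qc) * slope ** 1.5
```

Integrating this rate over the reconstructed hydrograph (and multiplying by channel width) would yield an event volume of the kind the abstract compares against the DoD-based estimate.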

  12. Cost analysis of adverse events associated with non-small cell lung cancer management in France

    Directory of Open Access Journals (Sweden)

    Chouaid C

    2017-07-01

, anemia (€5,752 per event), dehydration (€5,207 per event) and anorexia (€4,349 per event). Costs were mostly driven by hospitalization costs. Conclusion: Among the AEs identified, a majority appeared to have an important economic impact, with a management cost of at least €2,000 per event, mainly driven by hospitalization costs. This study may be of interest for economic evaluations of new interventions in NSCLC. Keywords: non-small cell lung cancer, adverse events, cost analysis, chemotherapy, immunotherapy

  13. Analysis and modeling of a hail event consequences on a building portfolio

    Science.gov (United States)

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

North-West Switzerland was affected by a severe hailstorm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 Cantons (out of 26) with a public insurance. The aim of this project is to benefit from the collected insurance data to better understand and estimate the risk of such an event. In a first step, a simple hail event simulator, which had been developed for a previous hail episode, is modified. The geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians instead of using 1D Gaussians on profiles, as was the case in the previous version. The tool is then tested on this new event in order to establish its ability to give a fast damage estimation based on the radar image and on building values and locations. The geometrical properties are used in a further step to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, it is expected that the orientation of the roof influences the aging of the material by controlling the frequency and amplitude of thaw-freeze cycles, changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures. This information is then compared with insurance claims.
A last part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the

  14. Shock events and flood risk management: a media analysis of the institutional long-term effects of flood events in the Netherlands and Poland

    Directory of Open Access Journals (Sweden)

    Maria Kaufmann

    2016-12-01

Flood events that have proven to create shock waves in society, which we will call shock events, can open windows of opportunity that allow different actor groups to introduce new ideas. Shock events, however, can also strengthen the status quo. We will take flood events as our object of study. Whereas others focus mainly on the immediate impact and disaster management, we will focus on the long-term impact on and resilience of flood risk governance arrangements. Over the last 25 years, both the Netherlands and Poland have suffered several flood-related events. These triggered strategic and institutional changes, but to different degrees. In a comparative analysis these endogenous processes, i.e., the importance of framing of the flood event, its exploitation by different actor groups, and the extent to which arrangements are actually changing, are examined. In line with previous research, our analysis revealed that shock events test the capacity to resist and bounce back and provide opportunities for adapting and learning. They "open up" institutional arrangements and make them more susceptible to change, increasing the opportunity for adaptation. In this way they can facilitate a shift toward different degrees of resilience, i.e., by adjusting the current strategic approach or by moving toward another strategic approach. The direction of change is influenced by the actors and the frames they introduce, and their ability to increase the resonance of the frame. The persistence of change seems to be influenced by the evolution of the initial management approach, the availability of resources, or the willingness to allocate resources.

  15. Prehospital Interventions During Mass-Casualty Events in Afghanistan: A Case Analysis.

    Science.gov (United States)

    Schauer, Steven G; April, Michael D; Simon, Erica; Maddry, Joseph K; Carter, Robert; Delorenzo, Robert A

    2017-08-01

Mass-casualty (MASCAL) events are known to occur in the combat setting. There are very limited data at this time from the Joint Theater (Iraq and Afghanistan) wars specific to MASCAL events. The purpose of this report was to provide preliminary data for the development of prehospital planning and guidelines. Cases were identified using the Department of Defense (DoD; Virginia USA) Trauma Registry (DoDTR) and the Prehospital Trauma Registry (PHTR). These cases were identified as part of a research study evaluating Tactical Combat Casualty Care (TCCC) guidelines. Cases that were designated as or associated with denoted MASCAL events were included. Fifty subjects were identified during the course of this project. Explosives were the most common cause of injuries. There was a wide range of vital signs. Tourniquet placement and pressure dressings were the most common interventions, followed by analgesia administration. Oral transmucosal fentanyl citrate (OTFC) was the most common parenteral analgesic drug administered. Most casualties were evacuated as "routine." Follow-up data were available for 36 of the subjects, and 97% were discharged alive. The most common prehospital interventions were tourniquet and pressure dressing hemorrhage control, along with pain medication administration. Larger data sets are needed to guide development of in-theater MASCAL clinical practice guidelines. Schauer SG, April MD, Simon E, Maddry JK, Carter R III, Delorenzo RA. Prehospital interventions during mass-casualty events in Afghanistan: a case analysis. Prehosp Disaster Med. 2017;32(4):465-468.

  16. Psychiatric adverse events during treatment with brodalumab: Analysis of psoriasis clinical trials.

    Science.gov (United States)

    Lebwohl, Mark G; Papp, Kim A; Marangell, Lauren B; Koo, John; Blauvelt, Andrew; Gooderham, Melinda; Wu, Jashin J; Rastogi, Shipra; Harris, Susan; Pillai, Radhakrishnan; Israel, Robert J

    2018-01-01

    Individuals with psoriasis are at increased risk for psychiatric comorbidities, including suicidal ideation and behavior (SIB). To distinguish between the underlying risk and potential for treatment-induced psychiatric adverse events in patients with psoriasis being treated with brodalumab, a fully human anti-interleukin 17 receptor A monoclonal antibody. Data were evaluated from a placebo-controlled, phase 2 clinical trial; the open-label, long-term extension of the phase 2 clinical trial; and three phase 3, randomized, double-blind, controlled clinical trials (AMAGINE-1, AMAGINE-2, and AMAGINE-3) and their open-label, long-term extensions of patients with moderate-to-severe psoriasis. The analysis included 4464 patients with 9161.8 patient-years of brodalumab exposure. The follow-up time-adjusted incidence rates of SIB events were comparable between the brodalumab and ustekinumab groups throughout the 52-week controlled phases (0.20 vs 0.60 per 100 patient-years). In the brodalumab group, 4 completed suicides were reported, 1 of which was later adjudicated as indeterminate; all patients had underlying psychiatric disorders or stressors. There was no comparator arm past week 52. Controlled study periods were not powered to detect differences in rare events such as suicide. Comparison with controls and the timing of events do not indicate a causal relationship between SIB and brodalumab treatment. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  17. Revisiting Slow Slip Events Occurrence in Boso Peninsula, Japan, Combining GPS Data and Repeating Earthquakes Analysis

    Science.gov (United States)

    Gardonio, B.; Marsan, D.; Socquet, A.; Bouchon, M.; Jara, J.; Sun, Q.; Cotte, N.; Campillo, M.

    2018-02-01

Slow slip events (SSEs) regularly occur near the Boso Peninsula, central Japan. Their recurrence time decreased from 6.4 to 2.2 years between 1996 and 2014. It is important to better constrain the slip history of this area, especially as models show that the recurrence intervals could become shorter prior to the occurrence of a large interplate earthquake nearby. We analyze the seismic waveforms of more than 2,900 events (M≥1.0) that took place in the Boso Peninsula, Japan, from 1 April 2004 to 4 November 2015, calculating the correlation and the coherence between each pair of events in order to define groups of repeating earthquakes. The cumulative number of repeating earthquakes suggests the existence of two slow slip events that have escaped detection so far. Small transient displacements observed in the time series of nearby GPS stations confirm these results. The detection scheme coupling repeating-earthquake and GPS analysis allows the detection of small SSEs that were not seen before by classical methods. This work brings new information on the diversity of SSEs and demonstrates that the SSEs in the Boso area have a more complex history than previously considered.
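The grouping of repeating earthquakes described above rests on waveform similarity between event pairs. A minimal sketch of that idea, assuming z-normalised cross-correlation as the similarity measure and an illustrative threshold of 0.95 (the study's actual criteria, window lengths, and thresholds are not reproduced here):

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Peak normalised cross-correlation between two equal-length waveforms.

    Both traces are z-normalised, so the zero-lag value for identical
    signals is exactly 1; the maximum over all lags is returned.
    """
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.correlate(a, b, mode="full").max())

def repeating_pairs(waveforms, threshold=0.95):
    """Return index pairs whose waveforms exceed the similarity threshold."""
    n = len(waveforms)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if max_norm_xcorr(waveforms[i], waveforms[j]) >= threshold]
```

Counting such pairs cumulatively through time, as the abstract describes, turns the repeater catalog into a proxy for aseismic slip on the plate interface.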

  18. Permeability Testing of Impacted Composite Laminates for Use on Reusable Launch Vehicles

    Science.gov (United States)

    Nettles, A. T.

    2001-01-01

    Since composite laminates are beginning to be identified for use in reusable launch vehicle propulsion systems, an understanding of their permeance is needed. A foreign object impact event can cause a localized area of permeability (leakage) in a polymer matrix composite, and it is the aim of this study to assess a method of quantifying permeability-after-impact results. A simple test apparatus is presented, and variables that could affect the measured values of permeability-after-impact were assessed. Once it was determined that valid numbers were being measured, a fiber/resin system was impacted at various impact levels and the resulting permeability measured, first with a leak check solution (qualitative) then using the new apparatus (quantitative). The results showed that as the impact level increased, so did the measured leakage. As the pressure to the specimen was increased, the leak rate was seen to increase in a nonlinear fashion for almost all the specimens tested.

  19. The Cost-Optimal Size of Future Reusable Launch Vehicles

    Science.gov (United States)

    Koelle, D. E.

    2000-07-01

The paper answers the question: what is the optimum vehicle size — in terms of LEO payload capability — for a future reusable launch vehicle? It is shown that there exists an optimum vehicle size that results in minimum specific transportation cost. The optimum vehicle size depends on the total annual cargo mass (LEO equivalent) envisaged, which defines at the same time the optimum number of launches per year (LpA). Based on the TRANSCOST-Model algorithms, a wide range of vehicle sizes — from 20 to 100 Mg payload in LEO — as well as launch rates — from 2 to 100 per year — have been investigated. It is shown in a design chart how much the vehicle size as well as the launch rate influence the specific transportation cost (in MYr/Mg and US$/kg). The comparison with actual ELVs (Expendable Launch Vehicles) and Semi-Reusable Vehicles (a combination of a reusable first stage with an expendable second stage) shows that there exists only one economic solution for an essential reduction of space transportation cost: the Fully Reusable Vehicle Concept, with rocket propulsion and vertical take-off. The Single-Stage Configuration (SSTO) has the best economic potential; its feasibility is not only a matter of technology level but also of the vehicle size as such. Increasing the vehicle size (launch mass) reduces the technology requirements because the law of scale provides a better mass fraction and payload fraction — practically at no cost. The optimum vehicle design (after specification of the payload capability) requires a trade-off between lightweight (and more expensive) technology vs. more conventional (and cheaper) technology. It is shown that the use of more conventional technology and accepting a somewhat larger vehicle is the more cost-effective and less risky approach.
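The existence of a cost-optimal vehicle size can be illustrated with a deliberately simple toy model (not the TRANSCOST algorithms; every coefficient and exponent below is invented for illustration): once the annual cargo mass is fixed, the amortised development cost per kilogram rises with vehicle size while the per-flight operations cost per kilogram falls, so their sum has an interior minimum.

```python
def specific_cost(payload, annual_cargo=1000.0, dev_total=400.0,
                  ops_coeff=2.0, years=10):
    """Toy specific transportation cost (arbitrary units per Mg).

    payload      -- vehicle payload capability, Mg to LEO
    annual_cargo -- total annual cargo mass to be flown, Mg
    All coefficients and scaling exponents are illustrative assumptions.
    """
    lpa = annual_cargo / payload                       # launches per year
    # development cost grows sub-linearly with size (law of scale),
    # amortised over every flight of the programme
    dev_share = dev_total * payload ** 0.55 / (years * lpa)
    # per-flight operations cost also grows sub-linearly with size
    ops_share = ops_coeff * payload ** 0.7
    return (dev_share + ops_share) / payload

# scan candidate sizes across the paper's 20-100 Mg trade space
best = min(range(20, 101, 5), key=specific_cost)
```

With these invented numbers the minimum falls in the interior of the range, mirroring the paper's qualitative conclusion; the actual optimum of course depends on the TRANSCOST cost-estimating relationships.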

  20. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    Science.gov (United States)

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary: Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may be misleading about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where (a) the primary objective compares survival curves, (b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and (c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials, including the recent RV144 trial, exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  1. Urbanization and fertility: an event-history analysis of coastal Ghana.

    Science.gov (United States)

    White, Michael J; Muhidin, Salut; Andrzejewski, Catherine; Tagoe, Eva; Knight, Rodney; Reed, Holly

    2008-11-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field.

  2. Development of time dependent safety analysis code for plasma anomaly events in fusion reactors

    International Nuclear Information System (INIS)

    Honda, Takuro; Okazaki, Takashi; Bartels, H.W.; Uckan, N.A.; Seki, Yasushi.

    1997-01-01

    A safety analysis code, SAFALY, has been developed to analyze plasma anomaly events in fusion reactors, e.g., a loss of plasma control. The code is a hybrid code comprising zero-dimensional plasma dynamics and a one-dimensional thermal analysis of in-vessel components. The code evaluates the time evolution of plasma parameters and the temperature distributions of in-vessel components. As the plasma-safety interface model, we proposed a robust plasma physics model taking into account updated data for safety assessment. For example, physics safety guidelines for the beta limit, density limit, and H-L mode confinement transition threshold power, etc., are provided in the model. The model of the in-vessel components is divided into twenty temperature regions in the poloidal direction, taking into account radiative heat transfer between the surfaces of each region. The code can also describe coolant behavior under hydraulic accidents, using results from a hydraulics code, and can treat vaporization (sublimation) from plasma-facing components (PFCs). Furthermore, the code includes a model of impurity transport from PFCs using a transport probability and a time delay. Quantitative analysis based on the model is possible for a plasma passive shutdown scenario. We examined the suitability of the code as a safety analysis code for plasma anomaly events in fusion reactors and concluded that it is a promising contribution to the safety analysis of the International Thermonuclear Experimental Reactor (ITER). (author)

  3. On Event/Time Triggered and Distributed Analysis of a WSN System for Event Detection, Using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Sofia Maria Dima

    2016-01-01

    Full Text Available Event detection in realistic WSN environments is a critical research domain, and environmental monitoring comprises one of its most pronounced applications. Although efforts related to environmental applications have been presented in the current literature, there is a significant lack of investigation of the performance of such systems when applied in wireless environments. Aiming to address this shortage, in this paper an advanced multimodal approach is followed, based on fuzzy logic. The proposed fuzzy inference system (FIS) is implemented on TelosB motes and evaluates the probability of fire detection while aiming towards power conservation. In addition to a straightforward centralized approach, a distributed implementation of the above FIS is also proposed, aiming towards network congestion reduction while optimally distributing the energy consumption among network nodes so as to maximize network lifetime. Moreover, this work proposes an event-based execution of the aforementioned FIS, aiming to further reduce the computational as well as the communication cost compared to a periodic, time-triggered FIS execution. As a final contribution, performance metrics acquired from all the proposed FIS implementation techniques are thoroughly compared and analyzed with respect to critical network conditions, aiming to offer a realistic evaluation and thus the extraction of objective conclusions.
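
The fuzzy-evaluation step described above can be sketched compactly. The membership functions, variable names, thresholds, and the two rules below are invented for illustration; the abstract does not specify the actual rule base deployed on the TelosB motes:

```python
# Minimal Mamdani-style fuzzy inference sketch (hypothetical rule base).

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fire_probability(temp_c, smoke_ppm):
    # Hypothetical fuzzy sets for the two sensor inputs.
    temp_high = tri(temp_c, 40.0, 70.0, 100.0)
    temp_low = tri(temp_c, -10.0, 15.0, 45.0)
    smoke_high = tri(smoke_ppm, 100.0, 300.0, 500.0)
    # Rule 1: fire IF temperature high AND smoke high (min models AND).
    fire = min(temp_high, smoke_high)
    # Rule 2: no fire IF temperature low.
    no_fire = temp_low
    # Crisp output by weighted-average defuzzification over the two rules.
    total = fire + no_fire
    return fire / total if total > 0 else 0.0
```

For instance, a hot, smoky reading yields a fire score of 1.0, while a cool, clear reading yields 0.0; a distributed variant would evaluate the rules on the nodes owning the respective sensors.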

  4. Characterization of a Flood Event through a Sediment Analysis: The Tescio River Case Study

    Directory of Open Access Journals (Sweden)

    Silvia Di Francesco

    2016-07-01

    Full Text Available This paper presents the hydrological analysis and grain size characteristics of fluvial sediments in a river basin and their combination to characterize a flood event. The overall objective of the research is the development of a practical methodology based on experimental surveys to reconstruct the hydraulic history of ungauged river reaches on the basis of the modifications detected on the riverbed during the dry season. The grain size analysis of fluvial deposits usually requires great technical and economical efforts and traditional sieving based on physical sampling is not appropriate to adequately represent the spatial distribution of sediments in a wide area of a riverbed with a reasonable number of samples. The use of photographic sampling techniques, on the other hand, allows for the quick and effective determination of the grain size distribution, through the use of a digital camera and specific graphical algorithms in large river stretches. A photographic sampling is employed to characterize the riverbed in a 3 km ungauged reach of the Tescio River, a tributary of the Chiascio River, located in central Italy, representative of many rivers in the same geographical area. To this end, the particle size distribution is reconstructed through the analysis of digital pictures of the sediments taken on the riverbed in dry conditions. The sampling has been performed after a flood event of known duration, which allows for the identification of the removal of the armor in one section along the river reach under investigation. The volume and composition of the eroded sediments made it possible to calculate the average flow rate associated with the flood event which caused the erosion, by means of the sediment transport laws and the hydrological analysis of the river basin. A hydraulic analysis of the river stretch under investigation was employed to verify the validity of the proposed procedure.

  5. New reusable elastomer electrodes for assessing body composition

    International Nuclear Information System (INIS)

    Moreno, M-V; Chaset, L; Bittner, P A; Barthod, C; Passard, M

    2013-01-01

    The development of telemedicine requires finding solutions of reusable electrodes for use in patients' homes. The objective of this study is to evaluate the relevance of reusable elastomer electrodes for measuring body composition. We measured a population of healthy Caucasians (n = 17). One measurement was made with a reference device, the Xitron®, associated with AgCl gel electrodes (Gel), and another measurement with a multifrequency impedancemeter, the Z-Metrix®, associated with reusable elastomer electrodes (Elast). We obtained low variability, with an average repeatability error of 0.39% for Re and 0.32% for Rinf. There is no significant difference (t-test, P > 0.1), of about 200 ml, between extracellular water Ve measured with Gel and Elast in the supine and standing positions. For total body water Vt, we note no significant difference (t-test, P > 0.1) of about 100 ml and 2.2 l in the supine and standing positions, respectively. The results give low dispersion, with R² greater than 0.90 and a maximal error of 1.5% between Gel and Elast on Ve in the standing position. It therefore appears possible, with a few precautions, to use elastomer electrodes for assessing body composition.

  6. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    Science.gov (United States)

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive or intensified passive surveillance from 2007 to 2012, a descriptive analysis was performed, and national and regional rates of YFV-associated neurotropic and neurological autoimmune disease, together with reporting rate ratios and their respective 95% confidence intervals, were calculated for first-time vaccinees stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND analyzed to date. Rates are similar to those of other recent studies, but in this study the age group from 5 to 9 years had the highest risk. As neurological adverse events in general have a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of risk of infection by yellow fever virus. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Analysis of Loss-of-Offsite-Power Events 1997-2015

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Nancy Ellen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run for more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.
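
As a rough illustration of the parameter estimation such a study involves, a LOOP frequency is an event count divided by the exposure time, with a Poisson uncertainty. The counts and exposure below are made up for the sketch; they are not the report's data:

```python
import math

def loop_rate(n_events, reactor_years):
    """Point estimate (events per reactor-year) with a crude
    one-sigma Poisson uncertainty, sqrt(n)/T."""
    rate = n_events / reactor_years
    sigma = math.sqrt(n_events) / reactor_years
    return rate, sigma

# Hypothetical category counts over a shared exposure period (invented numbers).
exposure = 1900.0  # reactor-critical-years, illustrative only
counts = {"plant-centered": 4, "switchyard-centered": 30,
          "grid-related": 20, "weather-related": 10}
rates = {category: loop_rate(n, exposure) for category, n in counts.items()}
```

A production PRA update would instead use a Bayesian update (e.g., a Jeffreys prior on the Poisson rate), but the point estimate above conveys the frequency-per-exposure idea.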

  8. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    Science.gov (United States)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
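
The mechanics of an event tree analysis reduce to multiplying conditional probabilities along each branch path and mapping the result onto a linguistic risk scale. A minimal sketch; every number and class boundary here is invented for illustration, not taken from the Arenal study:

```python
# Event tree sketch: scenario probability = product of conditional branch
# probabilities; raw numbers are then converted to linguistic expressions.

def path_probability(branches):
    """Multiply conditional probabilities along one branch path."""
    p = 1.0
    for prob in branches:
        p *= prob
    return p

def risk_class(p_annual):
    """Map a raw annual probability to a linguistic risk expression
    (order-of-magnitude boundaries chosen arbitrarily here)."""
    if p_annual >= 1e-2:
        return "VERY HIGH"
    if p_annual >= 1e-3:
        return "HIGH"
    if p_annual >= 1e-4:
        return "MODERATE"
    return "LOW"

# Hypothetical path: P(crater-wall collapse), P(ATPF | collapse),
# P(flow reaches the zone where the individual is exposed).
p_scenario = path_probability([0.1, 0.5, 0.2])
```

The per-individual trees in the study add exposure-dependent branches (resident, worker, tourist), but each path is evaluated the same way.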

  9. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    Science.gov (United States)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that their further development is necessary to continue improving society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. Only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement, in order to segment moving objects. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
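
The dynamic background-subtraction idea can be sketched for a single row of pixel intensities with a running-average background model. The adaptation rate and threshold below are arbitrary choices; the paper's actual update rule is not given in the abstract:

```python
# Running-average background model: slowly blend each new frame into the
# background estimate so gradual illumination changes are absorbed, then
# flag pixels that deviate sharply as motion.

def update_background(bg, frame, alpha=0.05):
    """Blend the new frame into the background to track slow illumination changes."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def motion_mask(bg, frame, threshold=20.0):
    """Flag pixels that deviate from the background by more than the threshold."""
    return [abs(f - b) > threshold for b, f in zip(bg, frame)]
```

A sudden bright object trips the mask immediately, while a slow dusk-to-night intensity drift is folded into the background over many frames.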

  10. Analysis of mutual events of Galilean satellites observed from VBO during 2014-2015

    Science.gov (United States)

    Vasundhara, R.; Selvakumar, G.; Anbazhagan, P.

    2017-06-01

    Results of analysis of 23 events of the 2014-2015 mutual event series from the Vainu Bappu Observatory are presented. Our intensity distribution model for the eclipsed/occulted satellite is based on the criterion that it simulates a rotational light curve that matches the ground-based light curve. Dichotomy in the scattering characteristics of the leading and trailing sides explains the basic shape of the rotational light curves of Europa, Ganymede and Callisto. In the case of Io, the albedo map (courtesy United States Geological Survey) along with global values of scattering parameters works well. Mean values of residuals in (O - C) along and perpendicular to the track are found to be -3.3 and -3.4 mas, respectively, compared to 'L2' theory for the seven 2E1/2O1 events. The corresponding rms values are 8.7 and 7.8 mas, respectively. For the five 1E3/1O3 events, the along and perpendicular to the track mean residuals are 5.6 and 3.2 mas, respectively. The corresponding rms residuals are 6.8 and 10.5 mas, respectively. We compare the results using the chosen model (Model 1) with a uniform but limb-darkened disc (Model 2). The residuals with Model 2 of the 2E1/2O1 and 1E3/1O3 events indicate a bias along the satellite track. The extent and direction of bias are consistent with the shift of the light centre from the geometric centre. Results using Model 1, which intrinsically takes into account the intensity distribution, show no such bias.

  11. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    Science.gov (United States)

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  12. Exploitation of a component event data bank for common cause failure analysis

    International Nuclear Information System (INIS)

    Games, A.M.; Amendola, A.; Martin, P.

    1985-01-01

    Investigations into using the European Reliability Data System Component Event Data Bank for common cause failure analysis have been carried out. Starting from early exercises where data were analyzed without computer aid, different types of linked multiple failures have been identified. A classification system is proposed based on this experience. It defines a multiple failure event space wherein each category defines causal, modal, temporal, and structural links between failures. It is shown that a search algorithm which incorporates the specific interrogative procedures of the data bank can be developed in conjunction with this classification system. It is concluded that the classification scheme and the search algorithm are useful organizational tools in the field of common cause failure studies. However, it is also suggested that use of the term common cause failure should be avoided, since it embodies too many different types of linked multiple failures.

  13. Turning a Private Story into a Public Event. Frame Analysis of Scandals in Television Performance

    Directory of Open Access Journals (Sweden)

    Olga Galanova

    2012-07-01

    Full Text Available It does not suffice to treat scandals only as supra-individual discourses on the macro level of social communication. Rather, we have to develop concrete methodical principles for describing the practice of doing scandal in particular media. In this paper we look at these practices from a micro-sociological perspective and analyze how, and through which concrete actions, an event is staged as a scandal. Practices of scandal build a special frame of media communication which allows television producers to solve certain "communicative problems." Based on the detailed analysis of a video recording of a television show, we exemplify how a private case turns into a public event by means of scandal-framing. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120398

  14. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

    Geometric process was first introduced by Lam [10, 11]. A stochastic process {Xi, i = 1, 2, …} is called a geometric process (GP) if, for some a > 0, {a^(i-1)·Xi, i = 1, 2, …} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best of these four models for analyzing data from a series of events.
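
A minimal sketch of this setup: since X_i = Y_i / a^(i-1) for a renewal sequence {Y_i}, log X_i is linear in i with slope -log a, so an ordinary least-squares fit recovers the ratio. This regression-based estimator is in the spirit of the nonparametric method mentioned above, though the paper's exact estimators for all three parameters are not reproduced here:

```python
import math
import random

# Sketch (our construction): simulate a GP from exponential renewals and
# estimate the ratio a from the OLS slope of log X_i on the index i.

def simulate_gp(a, n, mean=1.0, rng=random):
    """Generate X_1..X_n with i.i.d. exponential renewals Y_i of the given mean."""
    return [rng.expovariate(1.0 / mean) / a ** i for i in range(n)]

def estimate_ratio(xs):
    """Estimate a via exp(-slope) from the OLS fit of log X_i against i."""
    n = len(xs)
    ys = [math.log(x) for x in xs]
    i_bar = (n - 1) / 2.0
    y_bar = sum(ys) / n
    num = sum((i - i_bar) * (y - y_bar) for i, y in enumerate(ys))
    den = sum((i - i_bar) ** 2 for i in range(n))
    return math.exp(-num / den)
```

With a > 1 successive inter-event times shrink stochastically (a deteriorating system); a < 1 models improvement; a = 1 recovers an ordinary renewal process.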

  15. Benchmark analysis of three main circulation pump sequential trip event at Ignalina NPP

    International Nuclear Information System (INIS)

    Uspuras, E.; Kaliatka, A.; Urbonas, R.

    2001-01-01

    The Ignalina Nuclear Power Plant is a twin-unit plant with two RBMK-1500 reactors. The primary circuit consists of two symmetrical loops. Eight Main Circulation Pumps (MCPs) at the Ignalina NPP are employed for forced circulation of coolant water through the reactor core. The MCPs are joined in groups of four pumps for each loop (three for normal operation and one on standby). This paper presents the benchmark analysis of a three main circulation pump sequential trip event at RBMK-1500 using the RELAP5 code. During this event all three MCPs in one circulation loop at Ignalina NPP Unit 2 were tripped one after another because of inadvertent activation of the fire protection system. The comparison of calculated and measured parameters allowed us to establish realistic thermal-hydraulic characteristics of different main circulation circuit components and to verify the model of the drum separator pressure and water level controllers. (author)

  16. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. Existence of nonsusceptible individuals is particularly relevant under challenge testing with specific pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study, survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection...

  17. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.

  18. Analysis of a potential meteorite-dropping event over the south of Spain in 2007

    Science.gov (United States)

    Madiedo, J. M.; Trigo-Rodríguez, J. M.

    2008-09-01

    the case of Puerto Lápice, there are no pictures or videos of the June 29, 2007 bolide, and only some images of the distorted train, taken several minutes later, are available. A fourth potential meteorite-dropping bolide could be directly recorded by SPMN video cameras on March 25, 2007. We were lucky enough to have this event near the zenith of two SPMN stations, exhibiting all its magnificence (Fig. 2). We focus here on the preliminary analysis of this event, which was observed over an

  19. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point when a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicate that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM reveals more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
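
The transient (edge) detection step can be illustrated with a simple threshold on successive power differences; the paper's edge-symbol analysis and SVM classifier are more elaborate, so this shows only the basic idea:

```python
# Threshold-based edge detection on a sampled power signal: a jump larger
# than +threshold is read as a turn-on event, a drop below -threshold as a
# turn-off event.

def detect_edges(power, threshold=50.0):
    """Return (sample index, 'on'/'off', step size) for each detected edge."""
    events = []
    for i in range(1, len(power)):
        delta = power[i] - power[i - 1]
        if delta > threshold:
            events.append((i, "on", delta))
        elif delta < -threshold:
            events.append((i, "off", delta))
    return events
```

In a full NILM pipeline the step sizes and the transient waveform around each detected edge would feed the feature extraction and classifier stages.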

  20. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Science.gov (United States)

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial, in which locally advanced breast cancer patients were randomised to either taxane- or anthracycline-based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker for evaluating the effect of taxane- versus anthracycline-based chemotherapies on progression-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply, and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
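
Numerically, the decomposition above is the identity "total effect = indirect effect + direct effect", with the proportion mediated given by indirect/total. A toy sketch on an additive effect scale (the paper's estimands are for time-to-event endpoints, so this is only the arithmetic skeleton, with illustrative numbers):

```python
# Proportion mediated = indirect effect / (indirect + direct), on an
# additive effect scale. Numbers used in any example are illustrative only.

def proportion_mediated(indirect_effect, direct_effect):
    """Share of the total treatment effect transmitted through the mediator."""
    total = indirect_effect + direct_effect
    return indirect_effect / total
```

For example, an indirect effect of 0.042 against a direct effect of 0.958 gives a proportion mediated of 4.2%, matching the scale of the figure quoted in the abstract.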

  1. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    International Nuclear Information System (INIS)

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J.

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses: both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transient without scram (ATWS). Steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment bypass nature. After being benchmarked against the APETs in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.

  2. Statistical Analysis of Solar Events Associated with SSC over Year of Solar Maximum during Cycle 23: 1. Identification of Related Sun-Earth Events

    Science.gov (United States)

    Grison, B.; Bocchialini, K.; Menvielle, M.; Chambodut, A.; Cornilleau-Wehrlin, N.; Fontaine, D.; Marchaudon, A.; Pick, M.; Pitout, F.; Schmieder, B.; Regnier, S.; Zouganelis, Y.

    2017-12-01

    Taking the 32 sudden storm commencements (SSCs) listed by the Observatori de l'Ebre / ISGI over the year 2002 (maximal solar activity) as a starting point, we performed a statistical analysis of the related solar sources, solar wind signatures, and terrestrial responses. For each event, we characterized and identified, as far as possible, (i) the sources on the Sun (coronal mass ejections, CMEs), with the help of a series of hereafter detailed criteria (velocities, drag coefficient, radio waves, polarity), as well as (ii) the structure and properties in the interplanetary medium, at L1, of the event associated with the SSC: magnetic clouds (MCs), non-MC interplanetary coronal mass ejections (ICMEs), co-rotating/stream interaction regions (SIRs/CIRs), shocks only, and unclear events that we call "miscellaneous" events. The categorization of the events at L1 is based on published catalogues. For each potential CME/L1 event association we compare the velocity observed at L1 with the one observed at the Sun and the estimated ballistic velocity. Observations of radio emissions (Type II, Type IV, detected from the ground and/or by WIND) associated with the CMEs make the solar source more probable. We also compare the polarity of the magnetic clouds with the hemisphere of the solar source. The drag coefficient (estimated with the drag-based model) is calculated for each potential association and compared to the expected range of values. We identified a solar source for 26 SSC-related events; 12 of these 26 associations match all criteria. We finally discuss the difficulty of performing such associations.

  3. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    Science.gov (United States)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m³. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths from natural causes were recorded, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (odds ratio (OR) = 1.41; 95% confidence interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and among elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had immediate and delayed effects on mortality.
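The odds ratios quoted in this record come from the matched case-crossover analysis; purely as an illustrative sketch (not the authors' method or data), the Woolf log-scale confidence interval for an odds ratio from a simple unmatched 2×2 table can be computed as follows, with hypothetical counts:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-scale) confidence interval from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: 10/100 case periods vs 5/100 control periods on haze days
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
print(or_, lo, hi)
```

The matched design used in the paper conditions on each subject serving as their own control, so the real analysis uses conditional logistic regression rather than this unmatched table.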

  4. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    Science.gov (United States)

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and to determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had fewer than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials that adjusted for center using a method of analysis which requires a large number of events per center, 6% had fewer than 1 event per center/treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
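The distinction between model-based and cluster-robust (sandwich) standard errors that drives the authors' recommendation can be sketched numerically. The toy simulation below is entirely our own construction (not the review's data): it fits a simple linear model to multicenter data with a random center effect and compares the two standard-error estimates for the treatment coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated multicenter trial: 40 centers, only 6 patients each,
# a true treatment effect of 1.0 plus a random center effect.
n_centers, n_per = 40, 6
center = np.repeat(np.arange(n_centers), n_per)
treat = rng.integers(0, 2, size=n_centers * n_per)
center_effect = rng.normal(0.0, 0.5, size=n_centers)[center]
y = 1.0 * treat + center_effect + rng.normal(0.0, 1.0, size=center.size)

# Ordinary least squares fit of y on [intercept, treat]
X = np.column_stack([np.ones_like(treat, dtype=float), treat])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)

# Model-based SE: assumes all observations are independent
sigma2 = resid @ resid / (len(y) - X.shape[1])
se_model = np.sqrt(np.diag(sigma2 * XtX_inv))

# Cluster-robust (sandwich) SE: aggregates score contributions per center
meat = np.zeros((2, 2))
for c in range(n_centers):
    idx = center == c
    g = X[idx].T @ resid[idx]
    meat += np.outer(g, g)
se_robust = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

print(beta[1], se_model[1], se_robust[1])
```

With only a handful of clusters, the sandwich estimator itself becomes unstable, which is the scenario in which the review suggests model-based standard errors from a random-effects model or GEE instead.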

  5. An analysis on boron dilution events during SBLOCA for the KNGR

    International Nuclear Information System (INIS)

    Kim, Young In; Hwang, Young Dong; Park, Jong Kuen; Chung, Young Jong; Sim, Suk Gu

    1999-02-01

    An analysis of boron dilution events during a small break loss of coolant accident (LOCA) for the Korea Next Generation Reactor (KNGR) was performed using the computational fluid dynamics (CFD) code FLUENT. The maximum size of the water slug was determined based on the source of the unborated water slug and the possible flow paths. An axisymmetric CFD model was applied for a conservative scoping analysis of unborated water slug mixing with recirculation water of the reactor system following a small break LOCA, assuming restart of one reactor coolant pump (RCP). The computational grid was determined through a sensitivity study on grid size, selecting the grid that yields the most conservative results, and a preliminary calculation of boron mixing was performed using that grid. (Author). 17 refs., 3 tabs., 26 figs.

  6. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs.

  7. Reusable launch vehicles, enabling technology for the development of advanced upper stages and payloads

    International Nuclear Information System (INIS)

    Metzger, John D.

    1998-01-01

    In the near future there will be classes of upper stages and payloads that will require initial operation at a high Earth orbit to reduce the probability of an inadvertent reentry that could result in a detrimental impact on humans and the biosphere. A nuclear propulsion system, such as was being developed under the Space Nuclear Thermal Propulsion (SNTP) Program, is an example of such a potential payload. This paper uses the results of a reusable launch vehicle (RLV) study to demonstrate the potential importance of an RLV for testing and implementing an advanced upper stage (AUS) or payload in a safe orbit, in a cost-effective and reliable manner. The RLV is a horizontal takeoff and horizontal landing (HTHL), two-stage-to-orbit (TSTO) vehicle. The results of the study show that an HTHL vehicle is cost-effective because it implements airplane-like operation, infrastructure, and flight operations. The first stage of the TSTO is powered by rocket-based combined-cycle (RBCC) engines; the second stage is powered by a LOX/LH rocket engine. The TSTO configuration is used since it most effectively utilizes the capability of the RBCC engine. The analysis uses the NASA code POST (Program to Optimize Simulated Trajectories) to determine trajectories and weight in high Earth orbit for AUS/advanced payloads. Cost and reliability of an RLV versus current-generation expendable launch vehicles are presented.

  8. Reusable Solid Rocket Motor - Accomplishment, Lessons, and a Culture of Success

    Science.gov (United States)

    Moore, D. R.; Phelps, W. J.

    2011-01-01

    The Reusable Solid Rocket Motor (RSRM) represents the largest solid rocket motor (SRM) ever flown and the only human-rated solid motor. The high reliability of the RSRM has been the result of challenges addressed and lessons learned. Advancements have resulted from applying attention to process control, testing, and postflight assessment, with timely and thorough communication in dealing with all issues. A structured and disciplined approach was taken to identify and disposition all concerns. Careful consideration and application of alternate opinions was embraced. Focus was placed on process control, ground test programs, and postflight assessment. Process control is mandatory for an SRM, because an acceptance test of the delivered product is not feasible. The RSRM program maintained both full-scale and subscale test articles, which enabled continuous improvement of the design and evaluation of process control and material behavior. Additionally, RSRM reliability was achieved through attention to detail in postflight assessment to observe any shift in performance. The postflight analyses and inspections provided invaluable reliability data, as they enable observation of actual flight performance, most of which would not be available if the motors were not recovered. RSRM reusability offered unique opportunities to learn about the hardware. NASA is moving forward with the Space Launch System, which incorporates propulsion systems that take advantage of the heritage Shuttle and Ares solid motor programs. These unique challenges, features of the RSRM, materials and manufacturing issues, and design improvements are discussed in the paper.

  9. Hydroacoustic monitoring of a salt cavity: an analysis of precursory events of the collapse

    Science.gov (United States)

    Lebert, F.; Bernardie, S.; Mainsant, G.

    2011-09-01

    One of the main features of "post-mining" research relates to available methods for monitoring mine-degradation processes that could directly threaten surface infrastructures. In this respect, GISOS, a French scientific interest group, is investigating techniques for monitoring the possible collapse of underground cavities. One of the methods under investigation was monitoring the stability of a salt cavity by recording microseismic precursor signals that may indicate the onset of rock failure. The data were recorded in a salt mine in Lorraine (France) while monitoring the controlled collapse of 2,000,000 m³ of rocks surrounding a cavity at 130 m depth. The monitoring in the 30 Hz to 3 kHz frequency range highlights the occurrence of high-energy events during periods of macroscopic movement, once the layers had ruptured; they appear to be the consequence of post-rupture rock movements related to the intense deformation of the cavity roof. Moreover, the analysis shows the presence of some interesting precursory signals before the cavity collapsed. They occurred a few hours before the failure phases, when the rocks were being weakened and damaged. They originated from the damaging and breaking process, when micro-cracks appear and then coalesce. From these results we expect that deeper signal analysis and statistical analysis of the complete event time distribution (several million files) will allow us to finalize a complete typology of the signal families and their relation to the evolution steps of the cavity over the five years of monitoring.

  10. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    Science.gov (United States)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when a particle strikes critical areas, designers need a software tool that allows an automatic and exhaustive analysis of the influence of Single-Event Effects. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  11. Analysis of area events as part of probabilistic safety assessment for Romanian TRIGA SSR 14 MW reactor

    International Nuclear Information System (INIS)

    Mladin, D.; Stefan, I.

    2005-01-01

    International experience has shown that external events can be an important contributor to plant/reactor risk. For this reason such events have to be included in PSA studies. In the context of PSA for nuclear facilities, external events are defined as events originating from outside the plant, but with the potential to create an initiating event at the plant. To support plant safety assessment, PSA can be used to identify vulnerable features of the plant and to suggest modifications in order to mitigate the impact of external events or the occurrence of initiating events. For that purpose, a probabilistic assessment of area events concerning fire and flooding risk and impact is necessary. Due to its relatively large power level amongst research reactors, the approach to safety analysis of the Romanian 14 MW TRIGA benefits from an ongoing PSA project. In this context, the treatment of external events should be considered. The specific tasks proposed for the complete evaluation of area event analysis are: identify the rooms important for facility safety; determine a relative area event risk index for these rooms and a relative area event impact index if the event occurs; evaluate the rooms' specific area event frequency; determine the rooms' contribution to reactor hazard state frequencies; and analyze power supply and room dependencies of safety components (such as pumps and motor-operated valves). The fire risk analysis methodology is based on Berry's method [1]. This approach provides a systematic procedure to derive a relative index for different rooms. The factors that affect the fire probability are: personnel presence in the room, number and type of ignition sources, type and area of combustibles, fuel available in the room, fuel location, and ventilation. The flooding risk analysis is based on the amount of piping in the room. For accuracy of the information regarding piping, a facility walk-about is necessary. In case of flooding risk

  12. An analysis of potential costs of adverse events based on Drug Programs in Poland. Pulmonology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-06-01

    The project was performed within the Polish Society for Pharmacoeconomics (PTFE). The objective was to estimate the potential costs of treating side effects which may theoretically occur as a result of treatment of selected diseases. We analyzed the Drug Programs financed by the National Health Fund in Poland in 2012 and for the first analysis we selected those Programs in which the same medicinal products were used. We based the adverse event selection on the Summary of Product Characteristics of the chosen products. We extracted all the potential adverse events defined as frequent and very frequent, grouping them according to therapeutic areas. This paper relates to the results in the pulmonology area. Events described as very common had an incidence of ≥ 1/10, and common ones ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. On the basis of the collected data we allocated the direct costs incurred by the public payer. We used the costs valid in December 2013. The paper presents the estimated costs of treating the side effects related to the pulmonology disease area. Taking into account the costs incurred by the NHF and by the patient separately, we calculated the total spending and the percentage of each component cost in detail. The treatment of adverse drug reactions generates a significant cost, incurred by both the public payer and the patient.

  13. Preliminary analysis of beam trip and beam jump events in an ADS prototype

    International Nuclear Information System (INIS)

    D'Angelo, A.; Bianchini, G.; Carta, M.

    2001-01-01

    A core dynamics analysis of some typical beam current transient events has been carried out for an 80 MW energy amplifier prototype (EAP) fuelled by mixed oxides and cooled by lead-bismuth. Fuel and coolant temperature trends for recovered beam trip and beam jump events have been investigated in a preliminary way. Beam trip results show that the drop in temperature of the core outlet coolant would be reduced a fair amount if the beam intensity could be recovered within a few seconds. Due to the low power density in the EAP fuel, the beam jump transient from 50% of nominal power evolves benignly. The worst conceivable current transient, a beam jump with a cold reactor, mainly depends on the coolant flow conditions. In the EAP design, the primary loop coolant flow is assured by natural convection and is enhanced by a particular system of cover gas injection into the bottom part of the riser. If this system of coolant flow enhancement is assumed to be in operation, even the beam jump with a cold reactor evolves without severe consequences. (authors)

  14. Rare event computation in deterministic chaotic systems using genealogical particle analysis

    International Nuclear Information System (INIS)

    Wouters, J; Bouchet, F

    2016-01-01

    In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein–Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function. (paper)
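A minimal genealogical particle estimator for an over-threshold probability of an Ornstein–Uhlenbeck process can be sketched as follows. The discretisation, parameters, score function, and selection rule below are our own illustrative choices, not the authors' implementation; the idea is that particles are periodically reweighted toward high values of the observable and cloned, and the accumulated normalising constants undo the bias at the end:

```python
import numpy as np

rng = np.random.default_rng(42)

# Goal: estimate p = P(X_1 > a) for dX = -X dt + dW, X_0 = 0,
# a small probability, via genealogical particle cloning.
N, K, dt, a, C = 5000, 20, 0.05, 2.0, 3.0

x = np.zeros(N)      # all particles start at X_0 = 0
log_norm = 0.0       # accumulates log of the mean incremental weights

for _ in range(K):
    x_old = x
    # Euler-Maruyama step of the OU dynamics
    x = x_old - x_old * dt + np.sqrt(dt) * rng.standard_normal(N)
    # incremental weights favour upward moves of the score xi(x) = x
    w = np.exp(C * (x - x_old))
    log_norm += np.log(w.mean())
    # multinomial resampling: high-score particles are cloned
    idx = rng.choice(N, size=N, p=w / w.sum())
    x = x[idx]

# undo the exponential tilt exp(C * x) accumulated along each lineage
p_hat = np.exp(log_norm) * np.mean((x > a) * np.exp(-C * x))
print(p_hat)
```

For this OU process X_1 is Gaussian with variance (1 - e^(-2))/2, so the target probability is on the order of 10^-3; a naive Monte Carlo estimate with the same N would see only a handful of hits, while the cloning concentrates particles near the threshold.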

  15. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports. There is a paucity of evidence that verifies head impact events recorded by wearable sensors. To utilize video analysis to verify head impact events recorded by wearable sensors and describe the respective frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time-synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) for all verified head impacts ≥20 g were calculated. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664
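The four validity criteria listed in this record can be expressed as a small filter. The type and field names below are our own hypothetical constructions, not taken from the study's software:

```python
from dataclasses import dataclass

@dataclass
class Impact:
    linear_acc_g: float      # peak linear acceleration, in g
    player_identified: bool  # player located on the field
    in_camera_view: bool     # player visible in the video frame
    mechanism_clear: bool    # impact mechanism identifiable on video

def is_verified(impact: Impact, threshold_g: float = 20.0) -> bool:
    """Apply the study's four verification criteria to one recorded impact."""
    return (impact.linear_acc_g >= threshold_g
            and impact.player_identified
            and impact.in_camera_view
            and impact.mechanism_clear)

# e.g. an impact of 25 g with all video criteria met counts as verified
print(is_verified(Impact(25.0, True, True, True)))
```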

  16. A systemic approach for managing extreme risk events-dynamic financial analysis

    Directory of Open Access Journals (Sweden)

    Ph.D.Student Rodica Ianole

    2011-12-01

    Following the Black Swan logic, it often happens that what we do not know becomes more relevant than what we (believe to) know. The management of extreme risks falls under this paradigm in the sense that it cannot be limited to a static approach based only on objective and easily quantifiable variables. Drawing on the operational tools developed primarily for the insurance industry, the present paper aims to investigate how dynamic financial analysis (DFA) can be used within the framework of extreme risk events.

  17. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    Science.gov (United States)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual Twitter feeds for emergency events. Specifically, we consider tsunamis and earthquakes, as one possible originating cause of tsunamis, and propose to analyze Twitter messages for capturing testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunamis, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for ascertaining the geographical origin of the tweets and thus being able to observe correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the forecast points of the tsunami scenario.
    We also intend to use Twitter analysis for situation picture

  18. Development and evaluation of a computerbased instrument supporting event analysis in NPP. Final report

    International Nuclear Information System (INIS)

    Szameitat, S.

    2002-11-01

    Information technologies (IT) in safety management are seen as an opportunity to control and reduce risks. The processing of safety-critical event information, a central function of safety management, could be made more efficient using computers. But organization structures differ, the processes of experience transfer are complex, and the opportunities offered by IT are broad. Implementing a support system raises the question of the design criteria for computer support of an event-based safety management system (eSMS). Two studies were conducted. Study 1 compares the organizational, technical and legal conditions for safety management in the German nuclear industry and the Norwegian offshore industry. It identified design criteria which, independent of the operator, influence the eSMS. Study 2 compares the eSMS of different nuclear power plants, analyzing their organization structures and processes. Such internal and external criteria have a significant influence on an efficient design of system support for an eSMS. The third study discusses the options and impact of computer support on experience transfer in an eSMS. For this purpose a simulation environment was created. Groups investigated typical event scenarios from a nuclear power plant for contributing factors. The communication medium was manipulated (face-to-face versus computer-mediated). Results from 15 groups indicate that the collection of event information was promoted by computer-mediated communication. This is the foundation of the identification of contributing factors. The depth of analysis was not influenced. Computer mediation hampers the learning of facts. IT can support safety management effectively. (orig.)

  19. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    International Nuclear Information System (INIS)

    Mercurio, D.

    2011-01-01

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly based on Logic Trees. The evolution of these sequences is a result of the interactions between the crew and plant; in current PRA methodologies, simplified models of these complex interactions are used. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used for the simulation of crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' is made up of three operators whose behavior is guided by a set of rules-of-behavior (which represents the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing the crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups. These techniques are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied for the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions or context of operator actions, which can be used in an HRA, in the analysis of the reliability of the actions. By providing information on the evolution of system indications, dynamic of cues, crew timing in performing procedure steps, situation

  20. Modeling time-to-event (survival) data using classification tree analysis.

    Science.gov (United States)

    Linden, Ariel; Yarnold, Paul R

    2017-12-01

    Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (ie, easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.

  1. Analysis of brand personality to involve event involvement and loyalty: A case study of Jakarta Fashion Week 2017

    Science.gov (United States)

    Nasution, A. H.; Rachmawan, Y. A.

    2018-04-01

    Fashion trends around the world change extremely fast, and fashion has become part of people's lifestyle worldwide. Fashion week events held in various regions can serve as a measure of current fashion trends. In Indonesia, a fashion week event called Jakarta Fashion Week (JFW) aims to present fashion trends to people who want to improve their fashion style. People join events that create involvement for them, and they then return to those events again and again. For an annual, continuously held event, loyalty among the people involved is essential for the positive development of the organizer's subsequent events, saving a large share of the marketing budget and producing a higher-quality event. This study aims to determine the effect of the five brand personality dimensions on event involvement and loyalty at Jakarta Fashion Week (JFW). The study uses a quantitative confirmatory method with a Structural Equation Model (SEM) analysis technique, with a sample of 150 respondents who participated in Jakarta Fashion Week 2017. Results show that the five brand personality dimensions had significant effects on the three dimensions of event involvement and on loyalty, whereas one dimension of event involvement, personal self-expression, had no effect on loyalty.

  2. The analysis of competing events like cause-specific mortality--beware of the Kaplan-Meier method

    NARCIS (Netherlands)

    Verduijn, Marion; Grootendorst, Diana C.; Dekker, Friedo W.; Jager, Kitty J.; le Cessie, Saskia

    2011-01-01

    Kaplan-Meier analysis is a popular method used for analysing time-to-event data. In case of competing event analyses such as that of cardiovascular and non-cardiovascular mortality, however, the Kaplan-Meier method profoundly overestimates the cumulative mortality probabilities for each of the
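
    The overestimation the authors warn about can be shown numerically: treating the competing event as censoring and reporting 1 - KM inflates each cause-specific cumulative mortality, while the cumulative incidence function (CIF, Aalen-Johansen form) partitions total mortality correctly. A toy sketch with made-up data, not the paper's analysis:

```python
def naive_km_failure(times, causes, cause):
    """1 - Kaplan-Meier for one cause, treating the competing cause as
    censoring (the approach the paper cautions against).
    causes: 0 = censored, 1/2 = event types. Assumes unique event times."""
    data = sorted(zip(times, causes))
    n, surv = len(data), 1.0
    for _, c in data:
        if c == cause:
            surv *= 1 - 1 / n
        n -= 1
    return 1 - surv

def cif(times, causes, cause):
    """Cumulative incidence: cause-specific hazard weighted by overall
    (all-cause) survival just before each event time."""
    data = sorted(zip(times, causes))
    n, overall, inc = len(data), 1.0, 0.0
    for _, c in data:
        if c != 0:                     # any event, either cause
            if c == cause:
                inc += overall * (1 / n)
            overall *= 1 - 1 / n
        n -= 1
    return inc

# Everyone dies of cause 1 or cause 2: the CIFs sum to 1,
# but the naive 1 - KM estimates sum to more than 1.
times, causes = [1, 2, 3, 4], [1, 2, 1, 2]
```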

  3. AN ANALYSIS OF RISK EVENTS IN THE OIL-TANKER MAINTENANCE BUSINESS

    Directory of Open Access Journals (Sweden)

    Roque Rabechini Junior

    2012-12-01

    Full Text Available This work presents the results of an investigation into risk events and their respective causes, carried out in ship maintenance undertakings in the logistical sector of the Brazilian oil industry. Its theoretical and conceptual positioning lies in aspects of risk management of such undertakings as instruments of support for decision making by executives in the tanker-maintenance business. The case-study method was used with a qualitative approach of an exploratory nature, and a descriptive format was chosen for the presentation of data. Through the analysis of 75 risk events in tanker-docking projects it was possible to extract the eight of greatest relevance. The risk analysis facilitated the identification of actions aimed at their mitigation. In conclusion, it was possible to propose a risk-framework model with four categories: HSE (health, safety and the environment), technical, externalities and management, designed to provide tanker-docking business executives and administrators with evidence of actions to assist in their decision-making processes. Finally, the authors identify proposals for further study and note the principal limitations of the study.

  4. Spectral analysis of time series of events: effect of respiration on heart rate in neonates

    International Nuclear Information System (INIS)

    Van Drongelen, Wim; Williams, Amber L; Lasky, Robert E

    2009-01-01

    Certain types of biomedical processes such as the heart rate generator can be considered as signals that are sampled by the occurring events, i.e. QRS complexes. This sampling property generates problems for the evaluation of spectral parameters of such signals. First, the irregular occurrence of heart beats creates an unevenly sampled data set which must either be pre-processed (e.g. by using trace binning or interpolation) prior to spectral analysis, or analyzed with specialized methods (e.g. Lomb's algorithm). Second, the average occurrence of events determines the Nyquist limit for the sampled time series. Here we evaluate different types of spectral analysis of recordings of neonatal heart rate. Coupling between respiration and heart rate and the detection of heart rate itself are emphasized. We examine both standard and data adaptive frequency bands of heart rate signals generated by models of coupled oscillators and recorded data sets from neonates. We find that an important spectral artifact occurs due to a mirror effect around the Nyquist limit of half the average heart rate. Further we conclude that the presence of respiratory coupling can only be detected under low noise conditions and if a data-adaptive respiratory band is used
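
    The mirror artifact around the Nyquist limit reported here can be reproduced with a toy signal: a modulation above half the sampling rate reappears reflected below it. For simplicity this sketch uses an evenly sampled series and a plain FFT; the rates and frequencies are illustrative, not the study's neonatal data.

```python
import numpy as np

fs = 2.0      # average event (beat) rate in Hz -> Nyquist limit = 1.0 Hz
f_true = 1.3  # modulation frequency above the Nyquist limit

t = np.arange(0, 200, 1 / fs)            # 400 samples over 200 s
x = np.sin(2 * np.pi * f_true * t)       # "respiratory" modulation

freqs = np.fft.rfftfreq(t.size, 1 / fs)
spec = np.abs(np.fft.rfft(x))
f_peak = freqs[np.argmax(spec)]
# The spectral peak lands at the mirrored frequency fs - f_true = 0.7 Hz,
# not at the true 1.3 Hz.
```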

  5. Soft error rate analysis methodology of multi-Pulse-single-event transients

    International Nuclear Information System (INIS)

    Zhou Bin; Huo Mingxue; Xiao Liyi

    2012-01-01

    As transistor feature sizes scale down, soft errors in combinational logic caused by high-energy particle radiation are drawing more and more concern. In this paper, a combinational-logic soft error analysis methodology is proposed that considers multi-pulse single-event transients (MPSETs) and re-convergence with multiple transient pulses. In the proposed approach, the voltage pulse produced at a standard cell output is approximated by a triangular waveform characterized by three parameters: pulse width, the transition time of the first edge, and the transition time of the second edge. For pulses whose amplitude is smaller than the supply voltage, an edge extension technique is proposed. Moreover, an efficient electrical masking model that comprehensively considers transition time, delay, width and amplitude is proposed, together with an approach that uses the transition times of the two edges and the pulse width to compute the pulse amplitude. Finally, the proposed firstly-independently-propagating-secondly-mutually-interacting (FIP-SMI) scheme is used to handle the more practical case of re-convergence gates with multiple transient pulses, and a random generation model of MPSETs is proposed. Compared to estimates obtained from circuit-level simulations in HSpice, the proposed soft error rate analysis algorithm shows 10% error in SER estimation with a speedup of 300 when single-pulse single-event transients (SPSETs) are considered. We also demonstrate that runtime and SER decrease as P0 increases, using designs from the ISCAS-85 benchmarks. (authors)

  6. Replica analysis of overfitting in regression models for time-to-event data

    Science.gov (United States)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  7. Potential Indoor Worker Exposure From Handling Area Leakage: Example Event Sequence Frequency Analysis

    International Nuclear Information System (INIS)

    Benke, Roland R.; Adams, George R.

    2008-01-01

    The U.S. Department of Energy (DOE) is currently considering design options for the facilities that will handle spent nuclear fuel and high-level radioactive waste at the potential nuclear waste repository at Yucca Mountain, Nevada. The license application must demonstrate compliance with the performance objectives of 10 CFR Part 63, which include occupational dose limits from 10 CFR Part 20. If DOE submits a license application under 10 CFR Part 63, the U.S. Nuclear Regulatory Commission (NRC) will conduct a risk-informed, performance-based review of the DOE license application and its preclosure safety analysis, in which in-depth technical evaluations are focused on technical areas that are significant to preclosure safety and risk. As part of pre-licensing activities, the Center for Nuclear Waste Regulatory Analyses (CNWRA) developed the Preclosure Safety Analysis Tool software to aid in the regulatory review of a DOE license application and support any independent confirmatory assessments that may be needed. Recent DOE information indicates a primarily canister-based handling approach that includes the wet transfer of individual assemblies where Heating, Ventilation, and Air Conditioning (HVAC) systems may be relied on to provide confinement and limit the spread of any airborne radioactive material from handling operations. Workers may be involved in manual and remote operations in handling transportation casks, canisters, waste packages, or bare spent nuclear fuel assemblies inside facility buildings. As part of routine operations within these facilities, radioactive material may potentially become airborne if canisters are opened or bare fuel assemblies are handled. Leakage of contaminated air from the handling area into adjacent occupied areas, therefore, represents a potential radiological exposure pathway for indoor workers. The objective of this paper is to demonstrate modeling capabilities that can be used by the regulator to estimate frequencies of

  8. Event rates, hospital utilization, and costs associated with major complications of diabetes: a multicountry comparative analysis.

    Directory of Open Access Journals (Sweden)

    Philip M Clarke

    2010-02-01

    Full Text Available Diabetes imposes a substantial burden globally in terms of premature mortality, morbidity, and health care costs. Estimates of economic outcomes associated with diabetes are essential inputs to policy analyses aimed at prevention and treatment of diabetes. Our objective was to estimate and compare event rates, hospital utilization, and costs associated with major diabetes-related complications in high-, middle-, and low-income countries. Incidence and history of diabetes-related complications, hospital admissions, and length of stay were recorded in 11,140 patients with type 2 diabetes participating in the Action in Diabetes and Vascular Disease (ADVANCE) study (mean age at entry 66 y). The probability of hospital utilization and number of days in hospital for major events associated with coronary disease, cerebrovascular disease, congestive heart failure, peripheral vascular disease, and nephropathy were estimated for three regions (Asia, Eastern Europe, and Established Market Economies) using multiple regression analysis. The resulting estimates of days spent in hospital were multiplied by regional estimates of the costs per hospital bed-day from the World Health Organization to compute annual acute and long-term costs associated with the different types of complications. To assist comparability, costs are reported in international dollars (Int$, which represent a hypothetical currency that allows for the same quantities of goods or services to be purchased regardless of country, standardized on purchasing power in the United States. A cost calculator accompanying this paper enables the estimation of costs for individual countries and translation of these costs into local currency units. The probability of attending a hospital following an event was highest for heart failure (93%-96% across regions and lowest for nephropathy (15%-26%. The average numbers of days in hospital given at least one admission were greatest for stroke (17-32 d across

  9. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    Science.gov (United States)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

    Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus is yet reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and might aid to identify hot-spot regions for potential climate-triggered violent social conflicts.
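
    The core of a coincidence analysis can be sketched in a few lines: count how often an event in one series (e.g. a high-impact climate extreme) is followed within a tolerance window by an event in another (e.g. conflict onset). This is only the basic statistic; the study's significance testing and regional stratification are more elaborate, and the data here are made up.

```python
def coincidence_rate(a_events, b_events, delta_t):
    """Fraction of events in a_events that are followed, within delta_t time
    units, by at least one event in b_events."""
    hits = sum(
        any(0 <= b - a <= delta_t for b in b_events)
        for a in a_events
    )
    return hits / len(a_events)

# Of the three "trigger" events, two are followed within 1 time unit
# by a "response" event:
rate = coincidence_rate([1, 5, 10], [2, 11, 20], delta_t=1)
```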

  10. Spousal communication and contraceptive use in rural Nepal: an event history analysis.

    Science.gov (United States)

    Link, Cynthia F

    2011-06-01

    This study analyzes longitudinal data from couples in rural Nepal to investigate the influence of spousal communication about family planning on their subsequent contraceptive use. The study expands current understanding of the communication-contraception link by (a) exploiting monthly panel data to conduct an event history analysis, (b) incorporating both wives' and husbands' perceptions of communication, and (c) distinguishing effects of spousal communication on the use of four contraceptive methods. The findings provide new evidence of a strong positive impact of spousal communication on contraceptive use, even when controlling for confounding variables. Wives' reports of communication are substantial explanatory factors in couples' initiation of all contraceptive methods examined. Husbands' reports of communication predict couples' subsequent use of male-controlled methods. This analysis advances our understanding of how marital dynamics--as well as husbands' perceptions of these dynamics--influence fertility behavior, and should encourage policies to promote greater integration of men into family planning programs.
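
    The data setup behind a discrete-time event history analysis of monthly panel data can be sketched as a person-period expansion: each month a couple remains at risk contributes one record carrying the time-varying covariate and the outcome indicator. Field names and coding below are illustrative, not the study's actual variables.

```python
def person_periods(couples):
    """couples: (couple_id, months_observed, month_communication_began
    (0 = never), month_of_contraceptive_initiation (0 = never)) tuples.
    Returns one record per couple-month at risk."""
    rows = []
    for cid, months, comm_month, event_month in couples:
        for m in range(1, months + 1):
            rows.append({
                "couple": cid,
                "month": m,
                # time-varying covariate: has communication been reported yet?
                "communication": int(m >= comm_month) if comm_month else 0,
                # outcome: contraceptive initiation in this month
                "initiated": int(event_month == m),
            })
    return rows

# One couple observed 3 months, communication reported from month 2,
# contraception initiated in month 3:
rows = person_periods([(1, 3, 2, 3)])
```

    A logistic regression of `initiated` on `communication` over such records estimates the discrete-time hazard.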

  11. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    International Nuclear Information System (INIS)

    FORESTER, JOHN A.; BLEY, DENNIS C.; COOPER, SUSANE; KOLACZKOWSKI, ALAN M.; THOMPSON, CATHERINE; RAMEY-SMITH, ANN; WREATHALL, JOHN

    2000-01-01

    This paper describes the most recent version of a human reliability analysis (HRA) method called "A Technique for Human Event Analysis" (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided, and critical and unique aspects of the revised method are discussed

  12. Defining the Costs of Reusable Flexible Ureteroscope Reprocessing Using Time-Driven Activity-Based Costing.

    Science.gov (United States)

    Isaacson, Dylan; Ahmad, Tessnim; Metzler, Ian; Tzou, David T; Taguchi, Kazumi; Usawachintachit, Manint; Zetumer, Samuel; Sherer, Benjamin; Stoller, Marshall; Chi, Thomas

    2017-10-01

    Careful decontamination and sterilization of reusable flexible ureteroscopes used in ureterorenoscopy cases prevent the spread of infectious pathogens to patients and technicians. However, inefficient reprocessing and unavailability of ureteroscopes sent out for repair can contribute to expensive operating room (OR) delays. Time-driven activity-based costing (TDABC) was applied to describe the time and costs involved in reprocessing. Direct observation and timing were performed for all steps in reprocessing of reusable flexible ureteroscopes following operative procedures. Estimated times needed for each step by which damaged ureteroscopes identified during reprocessing are sent for repair were characterized through interviews with purchasing analyst staff. Process maps were created for reprocessing and repair detailing individual step times and their variances. Cost data for labor and disposables used were applied to calculate per minute and average step costs. Ten ureteroscopes were followed through reprocessing. Process mapping for ureteroscope reprocessing averaged 229.0 ± 74.4 minutes, whereas sending a ureteroscope for repair required an estimated 143 minutes per repair. Most steps demonstrated low variance between timed observations. Ureteroscope drying was the longest and highest variance step at 126.5 ± 55.7 minutes and was highly dependent on manual air flushing through the ureteroscope working channel and ureteroscope positioning in the drying cabinet. Total costs for reprocessing totaled $96.13 per episode, including the cost of labor and disposable items. Utilizing TDABC delineates the full spectrum of costs associated with ureteroscope reprocessing and identifies areas for process improvement to drive value-based care. At our institution, ureteroscope drying was one clearly identified target area. Implementing training in ureteroscope drying technique could save up to 2 hours per reprocessing event, potentially preventing expensive OR delays.
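
    The arithmetic behind time-driven activity-based costing is simple: each mapped step's time is multiplied by the cost rate of the resources it consumes, and the products are summed over the process map. The step names and rates below are hypothetical placeholders, not the paper's observed figures (only the 126.5-minute drying time is taken from the record).

```python
def tdabc_cost(steps):
    """steps: list of (minutes, cost_per_minute) pairs for each process step.
    Returns the total cost of one reprocessing episode."""
    return sum(minutes * rate for minutes, rate in steps)

# Hypothetical process map for one ureteroscope reprocessing episode:
reprocessing = [
    (15.0, 0.50),   # manual cleaning: technician labor (assumed rate)
    (45.0, 0.30),   # automated sterilization cycle (assumed rate)
    (126.5, 0.10),  # drying: the longest, highest-variance observed step
]
total = tdabc_cost(reprocessing)
```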

  13. Event based neutron activation spectroscopy and analysis algorithm using MLE and meta-heuristics

    International Nuclear Information System (INIS)

    Wallace, B.

    2014-01-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes involved was used to create a statistical model. Maximum likelihood estimation was combined with meta-heuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research. (author)
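
    The combination of maximum likelihood estimation with a stochastic search can be illustrated on the simplest case: estimating a decay constant from event times. A random-restart search stands in for the paper's metaheuristics; for an exponential decay the MLE is analytic (the reciprocal of the mean time), which provides a check. This is a generic sketch, not the author's curve-fitting code.

```python
import math
import random

def neg_log_likelihood(lam, times):
    """Negative log-likelihood of decay constant lam for an exponential
    model: p(t) = lam * exp(-lam * t)."""
    return -sum(math.log(lam) - lam * t for t in times)

def fit_decay(times, trials=20000, seed=1):
    """Crude stochastic search over lam (a stand-in for metaheuristics):
    keep the randomly sampled candidate with the lowest NLL."""
    rng = random.Random(seed)
    best_lam, best_nll = None, float("inf")
    for _ in range(trials):
        lam = rng.uniform(1e-3, 10.0)
        nll = neg_log_likelihood(lam, times)
        if nll < best_nll:
            best_lam, best_nll = lam, nll
    return best_lam

# Mean event time 0.5 -> analytic MLE for lam is 2.0; the search should
# land close to it.
times = [0.1, 0.3, 0.5, 0.7, 0.9]
lam_hat = fit_decay(times)
```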

  14. Clinical usefulness and feasibility of time-frequency analysis of chemosensory event-related potentials.

    Science.gov (United States)

    Huart, C; Rombaux, Ph; Hummel, T; Mouraux, A

    2013-09-01

    The clinical usefulness of olfactory event-related brain potentials (OERPs) to assess olfactory function is limited by the relatively low signal-to-noise ratio of the responses identified using conventional time-domain averaging. Recently, it was shown that time-frequency analysis of the obtained EEG signals can markedly improve the signal-to-noise ratio of OERPs in healthy controls, because it enhances both phase-locked and non phase-locked EEG responses. The aim of the present study was to investigate the clinical usefulness of this approach and evaluate its feasibility in a clinical setting. We retrospectively analysed EEG recordings obtained from 45 patients (15 anosmic, 15 hyposmic and 15 normosmic). The responses to olfactory stimulation were analysed using conventional time-domain analysis and joint time-frequency analysis. The ability of the two methods to discriminate between anosmic, hyposmic and normosmic patients was assessed using a Receiver Operating Characteristic analysis. The discrimination performance of OERPs identified using conventional time-domain averaging was poor. In contrast, the discrimination performance of the EEG response identified in the time-frequency domain was relatively high. Furthermore, we found a significant correlation between the magnitude of this response and the psychophysical olfactory score. Time-frequency analysis of the EEG responses to olfactory stimulation could be used as an effective and reliable diagnostic tool for the objective clinical evaluation of olfactory function in patients.
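
    Why spectral averaging enhances non phase-locked responses, as the record states, can be demonstrated with a toy simulation: a response whose phase varies randomly across trials cancels in the averaged waveform but is fully preserved when single-trial spectral power is averaged. Trial counts and frequencies are arbitrary illustration, not the study's EEG parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples, fs = 50, 256, 256
t = np.arange(n_samples) / fs

# 10 Hz "response" with a random phase on every trial (non phase-locked):
trials = np.array([
    np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi))
    for _ in range(n_trials)
])

# Time-domain averaging: the random phases cancel each other out.
avg_waveform_power = np.abs(np.fft.rfft(trials.mean(axis=0)))[10]

# Averaging single-trial spectral power: the response survives intact
# (|rfft| of a unit sine at an exact bin is n_samples / 2 = 128).
avg_spectral_power = np.abs(np.fft.rfft(trials, axis=1))[:, 10].mean()
```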

  15. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    Science.gov (United States)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes [3] involved was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code prompting positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to be able to translate well into other fields of research.

  16. Trend analysis of human error events and assessment of their proactive prevention measure at Rokkasho reprocessing plant

    International Nuclear Information System (INIS)

    Yamazaki, Satoru; Tanaka, Izumi; Wakabayashi, Toshio

    2012-01-01

    A trend analysis of human error events is important for preventing their recurrence. We propose a new method for identifying common characteristics from the results of trend analysis, such as latent weaknesses of the organization, together with a management process for strategic error prevention. In this paper, we describe a trend analysis method for human error events that have accumulated in the organization, and the use of the analysis results to prevent accidents proactively. Although systematic analysis of human error events, monitoring of their overall trend, and utilization of the analyzed results have been examined for plant operation, such information has never been fully utilized. Sharing information on human error events and analyzing their causes leads to the clarification of problems in management and human factors. This new method was applied to the human error events that occurred in the Rokkasho reprocessing plant from October 2010. Results revealed that the output of this method is effective in judging the error prevention plan and that the number of human error events was reduced to about 50% of that observed in 2009 and 2010. (author)

  17. Towards a DNA Nanoprocessor: Reusable Tile-Integrated DNA Circuits.

    Science.gov (United States)

    Gerasimova, Yulia V; Kolpashchikov, Dmitry M

    2016-08-22

    Modern electronic microprocessors use semiconductor logic gates organized on a silicon chip to enable efficient inter-gate communication. Here, arrays of communicating DNA logic gates integrated on a single DNA tile were designed and used to process nucleic acid inputs in a reusable format. Our results lay the foundation for the development of a DNA nanoprocessor, a small and biocompatible device capable of performing complex analyses of DNA and RNA inputs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. CLARAty: Challenges and Steps Toward Reusable Robotic Software

    Directory of Open Access Journals (Sweden)

    Richard Madison

    2008-11-01

    Full Text Available We present in detail some of the challenges in developing reusable robotic software. We base that on our experience in developing the CLARAty robotics software, which is a generic object-oriented framework used for the integration of new algorithms in the areas of motion control, vision, manipulation, locomotion, navigation, localization, planning and execution. CLARAty was adapted to a number of heterogeneous robots with different mechanisms and hardware control architectures. In this paper, we also describe how we addressed some of these challenges in the development of the CLARAty software.

  19. CLARAty: Challenges and Steps toward Reusable Robotic Software

    Directory of Open Access Journals (Sweden)

    Issa A.D. Nesnas

    2006-03-01

    Full Text Available We present in detail some of the challenges in developing reusable robotic software. We base that on our experience in developing the CLARAty robotics software, which is a generic object-oriented framework used for the integration of new algorithms in the areas of motion control, vision, manipulation, locomotion, navigation, localization, planning and execution. CLARAty was adapted to a number of heterogeneous robots with different mechanisms and hardware control architectures. In this paper, we also describe how we addressed some of these challenges in the development of the CLARAty software.

  20. A reusable suture anchor for arthroscopy psychomotor skills training.

    Science.gov (United States)

    Tillett, Edward D; Rogers, Rainie; Nyland, John

    2003-03-01

    For residents to adequately develop the early arthroscopy psychomotor skills required to better learn how to manage the improvisational situations they will encounter during actual patient cases, they need to experience sufficient practice repetitions within a contextually relevant environment. Unfortunately, the cost of suture anchors can be a practice repetition-limiting factor in learning arthroscopic knot-tying techniques. We describe a technique for creating inexpensive reusable suture anchors and provide an example of their application to repair the anterior glenoid labrum during an arthroscopy psychomotor skills laboratory training session.

  1. Analysis of the events on the operating of the wrong compartment of NPPs

    International Nuclear Information System (INIS)

    Zheng Lixin; Zhou Hong; Zhang Hao; Che Shuwei; Zhang Jiajun

    2013-01-01

    In this paper, an operational event is introduced in which a unit trip was caused by operating equipment in the wrong compartment due to personnel error. Through in-depth research on this kind of event, the causes are identified and some suggestions are put forward. This can provide a reference for preventing similar events from recurring at other NPPs. (authors)

  2. Statistical methods for the time-to-event analysis of individual participant data from multiple epidemiological studies

    DEFF Research Database (Denmark)

    Thompson, Simon; Kaptoge, Stephen; White, Ian

    2010-01-01

    Meta-analysis of individual participant time-to-event data from multiple prospective epidemiological studies enables detailed investigation of exposure-risk relationships, but involves a number of analytical challenges....

  3. Analysis of the highest transverse energy events seen in the UA1 detector at the Spp̄S collider

    International Nuclear Information System (INIS)

    1987-06-01

    The first full solid angle analysis is presented of large transverse energy events in pp̄ collisions at the CERN collider. Events with transverse energies in excess of 200 GeV at √s = 630 GeV are studied for any non-standard physics and quantitatively compared with expectations from perturbative QCD Monte Carlo models. A corrected differential cross section is presented. A detailed examination is made of jet profiles, event jet multiplicities and the fraction of the transverse energy carried by the two jets with the highest transverse jet energies. There is good agreement with standard theory for events with transverse energies up to the largest observed values (approx. √s/2) and the analysis shows no evidence for any non-QCD mechanism to account for the event characteristics. (author)

  4. Exploratory trend and pattern analysis of 1981 through 1983 Licensee Event Report data. Main report. Volume 1

    International Nuclear Information System (INIS)

    Hester, O.V.; Groh, M.R.; Farmer, F.G.

    1986-10-01

    This report presents an overview of the 1981 through 1983 Sequence Coding and Search System (SCSS) data base that contains nuclear power plant operational data derived from Licensee Event Reports (LERs) submitted to the United States Nuclear Regulatory Commission (USNRC). Both overall event reporting and events related to specific components, subsystems, systems, and personnel are discussed. At all of these levels of information, software is used to generate count data for contingency tables. Contingency table analysis is the main tool for the trend and pattern analysis. The tables focus primarily on faults associated with various components and other items of interest across different plants. The abstracts and other SCSS information on the LERs accounting for unusual counts in the tables were examined to gain insights from the events. Trends and patterns in LER reporting and reporting of events for various component groups were examined through log-linear modeling techniques
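
    The contingency-table step at the heart of such a trend and pattern analysis can be sketched with a from-scratch chi-square test of independence on event counts (e.g. fault type by plant). The counts here are invented for illustration; the report's log-linear modeling generalizes this basic test.

```python
def chi_square(table):
    """Pearson chi-square statistic for independence on a contingency
    table given as a list of rows of counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Perfectly proportional counts give a statistic of 0 (no association);
# diagonal-heavy counts give a large statistic.
independent = [[10, 20], [20, 40]]
associated = [[10, 0], [0, 10]]
```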

  5. Analysis of the highest transverse energy events seen in the UA1 detector at the Sp̄pS collider

    International Nuclear Information System (INIS)

    Albajar, C.; Bezaguet, A.; Cennini, P.

    1987-01-01

    This is the first full solid angle analysis of large transverse energy events in p̄p collisions at the CERN collider. Events with transverse energies in excess of 200 GeV at √s=630 GeV are studied for any non-standard physics and quantitatively compared with expectations from perturbative QCD Monte Carlo models. A corrected differential cross section is presented. A detailed examination is made of jet profiles, event jet multiplicities and the fraction of the transverse energy carried by the two jets with the highest transverse jet energies. There is good agreement with standard theory for events with transverse energies up to the largest observed values (≅ √s/2) and the analysis shows no evidence for any non-QCD mechanism to account for the event characteristics. (orig.)

  6. SENTINEL EVENTS

    Directory of Open Access Journals (Sweden)

    Andrej Robida

    2004-09-01

    Full Text Available Background. The objective of the article is to present two-year statistics on sentinel events in hospitals. Results of a survey on sentinel events and the attitude of hospital leaders and staff are also included. Some recommendations regarding patient safety and the handling of sentinel events are given. Methods. In March 2002 the Ministry of Health introduced a voluntary reporting system on sentinel events in Slovenian hospitals. Sentinel events were analyzed according to the place of the event, its content, and root causes. To show results of the first year, a conference for hospital directors and medical directors was organized. A survey was conducted among the participants with the purpose of gathering information about their view on sentinel events. One hundred questionnaires were distributed. Results. Sentinel events: There were 14 reports of sentinel events in the first year and 7 in the second. In 4 cases reports were received only after written reminders were sent to the responsible persons; in one case no report was obtained. There were 14 deaths, 5 of which were in-hospital suicides, 6 were due to an adverse event, and 3 were unexplained. Events not leading to death were a suicide attempt, a wrong-side surgery, paraplegia after spinal anaesthesia, a fall with a femoral neck fracture, damage to the spleen during pleural space drainage, inadvertent embolization of absolute alcohol into a femoral artery, and a physical attack on a physician by a patient. Analysis of the root causes of the sentinel events showed that in most cases processes were inadequate. Survey: One quarter of those surveyed did not know about the sentinel events reporting system, 16% had actual problems when reporting events, and 47% believed that there was an attempt to blame individuals. Obstacles to reporting events openly were fear of consequences, moral shame, fear of public disclosure of the names of participants in the event, and exposure in the mass media. 
The majority of

  7. A framework for analysis of sentinel events in medical student education.

    Science.gov (United States)

    Cohen, Daniel M; Clinchot, Daniel M; Werman, Howard A

    2013-11-01

    Although previous studies have addressed student factors contributing to dismissal or withdrawal from medical school for academic reasons, little information is available regarding institutional factors that may hinder student progress. The authors describe the development and application of a framework for sentinel event (SE) root cause analysis to evaluate cases in which students are dismissed or withdraw because of failure to progress in the medical school curriculum. The SE in medical student education (MSE) framework was piloted at the Ohio State University College of Medicine (OSUCOM) during 2010-2012. Faculty presented cases using the framework during academic oversight committee discussions. Nine SEs in MSE were presented using the framework. Major institution-level findings included the need for improved communication, documentation of cognitive and noncognitive (e.g., mental health) issues, clarification of requirements for remediation and fitness for duty, and additional psychological services. Challenges related to alternative and combined programs were identified as well. The OSUCOM undertook system changes based on the action plans developed through the discussions of these SEs. An SE analysis process appears to be a useful method for making system changes in response to institutional issues identified in evaluation of cases in which students fail to progress in the medical school curriculum. The authors plan to continue to refine the SE in MSE framework and analysis process. Next steps include assessing whether analysis using this framework yields improved student outcomes with universal applications for other institutions.

  8. Recent adaptive events in human brain revealed by meta-analysis of positively selected genes.

    Directory of Open Access Journals (Sweden)

    Yue Huang

    Full Text Available BACKGROUND AND OBJECTIVES: Analysis of positively-selected genes can help us understand how humans evolved, especially the evolution of highly developed cognitive functions. However, previous works have reached conflicting conclusions regarding whether human neuronal genes are over-represented among genes under positive selection. METHODS AND RESULTS: We divided positively-selected genes into four groups according to the identification approaches, compiling a comprehensive list from 27 previous studies. We showed that genes that are highly expressed in the central nervous system are enriched in recent positive selection events in human history identified by intra-species genomic scans, especially in brain regions related to cognitive functions. This pattern holds when different datasets, parameters, and analysis pipelines are used. Functional category enrichment analysis supported these findings, showing that synapse-related functions are enriched in genes under recent positive selection. In contrast, immune-related functions, for instance, are enriched in genes under ancient positive selection revealed by inter-species coding region comparison. We further demonstrated that most of these patterns still hold even after controlling for genomic characteristics that might bias genome-wide identification of positively-selected genes, including gene length, gene density, GC composition, and intensity of negative selection. CONCLUSION: Our rigorous analysis resolved previous conflicting conclusions and revealed recent adaptation of human brain functions.
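Functional category enrichment of the kind used here is commonly assessed with a one-sided hypergeometric test. A minimal sketch in Python; all gene counts below are hypothetical placeholders, not figures from the study:

```python
from math import comb

def enrichment_p_value(total_genes, category_genes, selected_genes, overlap):
    """One-sided hypergeometric test: probability of drawing at least
    `overlap` category genes when `selected_genes` genes are sampled
    from `total_genes` without replacement."""
    p = 0.0
    for k in range(overlap, min(category_genes, selected_genes) + 1):
        p += (comb(category_genes, k)
              * comb(total_genes - category_genes, selected_genes - k)
              / comb(total_genes, selected_genes))
    return p

# Hypothetical: 20,000 genes, 500 synapse-related, 300 positively selected,
# 15 of which are synapse-related (expected by chance: ~7.5).
print(enrichment_p_value(20000, 500, 300, 15))
```

A small p-value here would indicate that the category is over-represented among positively selected genes beyond what chance predicts.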

  9. Analysis of syntactic and semantic features for fine-grained event-spatial understanding in outbreak news reports

    Directory of Open Access Journals (Sweden)

    Chanlekha Hutchatai

    2010-03-01

    Full Text Available Abstract Background Previous studies have suggested that epidemiological reasoning needs a fine-grained modelling of events, especially their spatial and temporal attributes. While the temporal analysis of events has been intensively studied, far less attention has been paid to their spatial analysis. This article aims at filling the gap concerning automatic event-spatial attribute analysis in order to support health surveillance and epidemiological reasoning. Results In this work, we propose a methodology that provides a detailed analysis of each event reported in news articles to recover the most specific locations where it occurs. Various features for recognizing spatial attributes of the events were studied and incorporated into the models, which were trained by several machine learning techniques. The best performance for spatial attribute recognition is very promising: 85.9% F-score (86.75% precision/85.1% recall). Conclusions We extended our work on event-spatial attribute recognition by focusing on machine learning techniques, namely CRF, SVM, and decision trees. Our approach avoided the costly development of an external knowledge base by employing feature sources that can be acquired locally from the analyzed document. The results showed that the CRF model performed the best. Our study indicated that the nearest location and the previous event location are the most important features for the CRF and SVM models, while the location extracted from the verb's subject is the most important for the decision tree model.
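As a quick consistency check, the F-score reported here is the harmonic mean of the reported precision and recall:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall (the F1 measure)."""
    return 2 * precision * recall / (precision + recall)

# Figures from the abstract: 86.75% precision, 85.1% recall.
print(round(f1(86.75, 85.1), 1))  # → 85.9, matching the reported F-score
```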

  10. Multi dimensional analysis of Design Basis Events using MARS-LMR

    International Nuclear Information System (INIS)

    Woo, Seung Min; Chang, Soon Heung

    2012-01-01

    Highlights: ► The one-dimensionally analyzed sodium hot pool is modified to a three dimensional node system, because a one dimensional analysis cannot represent the phenomena inside a large pool with many internal components. ► The results of the multi-dimensional analysis are compared with the one dimensional analysis results in normal operation, TOP (Transient of Over Power), LOF (Loss of Flow), and LOHS (Loss of Heat Sink) conditions. ► The differences in the sodium flow pattern due to structure effects in the hot pool and in the core mass flow rates lead to different sodium temperatures and temperature histories under transient conditions. - Abstract: KALIMER-600 (Korea Advanced Liquid Metal Reactor), which is a pool type SFR (Sodium-cooled Fast Reactor), was developed by KAERI (Korea Atomic Energy Research Institute). DBE (Design Basis Events) for KALIMER-600 have been analyzed in one dimension. In this study, the one-dimensionally analyzed sodium hot pool is modified to a three dimensional node system, because a one dimensional analysis cannot represent the phenomena inside a large pool with many internal components, such as the UIS (Upper Internal Structure), IHX (Intermediate Heat eXchanger), DHX (Decay Heat eXchanger), and pump. The results of the multi-dimensional analysis are compared with the one dimensional analysis results in normal operation, TOP, LOF, and LOHS conditions. First, the results in the normal operation condition show good agreement between the one and multi-dimensional analyses. However, according to the sodium temperatures of the core inlet, outlet, the fuel centerline, cladding and PDRC (Passive Decay heat Removal Circuit), the temperatures of the one dimensional analysis are generally higher than those of the multi-dimensional analysis in conditions other than the normal operation state, and the PDRC operation time in the one dimensional analysis is generally longer than

  11. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    Science.gov (United States)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during the previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by distances of 13 kilometers on average. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom high-speed camera version 9.1, set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5 - 5.6 G in the stationary sensors, and model AF-S ED 24 mm - 1:1.4 in the mobile sensor). All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four RAMMER manual operation days in the 2012 and 2013 campaigns. On Feb. 18th the data set comprises 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes were registered by two cameras and 1 flash by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing current. Problems in the temporal measurement of the continuing current can generate imprecision during the optical analysis; therefore this work aims to evaluate the effect of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  12. Monitoring As A Helpful Means In Forensic Analysis Of Dams Static Instability Events

    Science.gov (United States)

    Solimene, Pellegrino

    2013-04-01

    Monitoring is a means of controlling the behavior of a structure, which during its operational life is subject to external actions, both ordinary loading conditions and disturbing ones; these factors overlap in a random manner characterized by the statistical parameter of the return period. The analysis of monitoring data is crucial to gain a reasoned opinion on the reliability of the structure and its components, and also makes it possible to identify, within the overall operational scenario, the time to prepare interventions aimed at maintaining optimum levels of functionality and safety. The concept of monitoring in terms of prevention is coupled with the activity of the forensic engineer who, appointed by the judiciary after the occurrence of an accident, turns his experience - the "scientific knowledge" - into an "inverse analysis" in which he sums up the results of a survey, which also draws on data sets arising in the course of the constant control of causes and effects, so as to determine the correlations between these factors. His activity aims at contributing to the identification of the typicality of an event, which represents, together with the "causal link" between conduct and event and its contra-juridical character, the factors for judging whether there is a hypothesis of crime, and therefore liability according to law. In Italy there are about 10,000 dams of varying sizes, but only a small portion of them are considered "large dams" and subjected to a rigorous program of regular inspections and monitoring, in application of specific rules. The rest - "small" dams, conventionally defined as such by the standard, but not by their impact on the area - receives a heterogeneous response from the local authorities entrusted with this task: there is therefore a potentially high-risk scenario, determined by the presence of not fully controlled structures that stand even in heavily populated areas. Risk can be traced back to acceptable levels if they were implemented with the

  13. Probabilistic safety analysis on an SBWR 72 hours after the initiating event

    International Nuclear Information System (INIS)

    Dominguez Bautista, M.T.; Peinador Veira, M.

    1996-01-01

    Passive plants, including SBWRs, are designed to carry out safety functions with passive systems during the first 72 hours after the initiating event, with no need for manual actions or external support. After this period, some recovery actions are required to enable the passive systems to continue performing their safety functions. The study was carried out by the INITEC-Empresarios Agrupados Joint Venture within the framework of the international group collaborating with GE on this project. Its purpose has been to assess, by means of probabilistic criteria, the importance to safety of each of these support actions, in order to define possible requirements to be considered in the design in respect of said recovery actions. In brief, the methodology developed for this objective consists of (1) quantifying success event trees from the PSA up to 72 hours, (2) determining the actions required in each sequence to maintain a steady state after 72 hours, (3) identifying available alternative core cooling methods in each sequence, (4) establishing the approximate (order of magnitude) realizability of each alternative method, (5) calculating the frequency of core damage as a function of the failure probability of post-72-hour actions, and (6) analysing the importance of post-72-hour actions. The results of this analysis permit the establishment, right from the conceptual design phase, of the requirements needed to ensure these actions in the long term, enhancing their reliability and preventing the accident from continuing beyond this period. (Author)
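Step (5) of the methodology can be sketched as a sum over accident sequences, each weighted by the probability that all of its post-72-hour recovery alternatives fail. The sequence frequencies and alternative counts below are hypothetical placeholders, not values from the study:

```python
# Each sequence: frequency of reaching a stable state at 72 h (per year),
# and the number of independent alternative core cooling methods available.
sequences = [
    {"freq": 1.0e-4, "n_alternatives": 2},
    {"freq": 5.0e-6, "n_alternatives": 1},
]

def core_damage_frequency(seqs, p_fail):
    """Core damage frequency as a function of the failure probability of
    post-72-hour actions, assuming the alternatives fail independently."""
    return sum(s["freq"] * p_fail ** s["n_alternatives"] for s in seqs)

for p in (0.1, 0.01):
    print(f"p_fail={p}: CDF={core_damage_frequency(sequences, p):.1e}/yr")
```

Varying `p_fail` in this way shows how strongly the long-term core damage frequency depends on the reliability of the post-72-hour recovery actions.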

  14. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    Science.gov (United States)

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
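The concordance index used here as the prediction performance measure can be computed directly for right-censored data. A minimal pure-Python sketch (toy inputs, not the study's simulated datasets):

```python
def concordance_index(times, events, risk_scores):
    """Fraction of comparable pairs where the higher-risk subject fails
    first. A pair (i, j) is comparable when subject i has an observed
    event (events[i] == 1) strictly before subject j's time."""
    concordant, comparable = 0.0, 0
    for i in range(len(times)):
        for j in range(len(times)):
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5  # ties get half credit
    return concordant / comparable

# Toy data: event=1 observed, event=0 censored; higher score = higher hazard.
times  = [2, 4, 6, 8]
events = [1, 1, 0, 1]
risk   = [0.9, 0.7, 0.5, 0.2]
print(concordance_index(times, events, risk))  # → 1.0 (perfectly concordant)
```

Censored observations contribute only as the later member of a pair, which is why the index remains well defined under right censoring.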

  15. ERPLAB: An Open-Source Toolbox for the Analysis of Event-Related Potentials

    Directory of Open Access Journals (Sweden)

    Javier eLopez-Calderon

    2014-04-01

    Full Text Available ERPLAB Toolbox is a freely available, open-source toolbox for processing and analyzing event-related potential (ERP data in the MATLAB environment. ERPLAB is closely integrated with EEGLAB, a popular open-source toolbox that provides many EEG preprocessing steps and an excellent user interface design. ERPLAB adds to EEGLAB’s EEG processing functions, providing additional tools for filtering, artifact detection, re-referencing, and sorting of events, among others. ERPLAB also provides robust tools for averaging EEG segments together to create averaged ERPs, for creating difference waves and other recombinations of ERP waveforms through algebraic expressions, for filtering and re-referencing the averaged ERPs, for plotting ERP waveforms and scalp maps, and for quantifying several types of amplitudes and latencies. ERPLAB’s tools can be accessed either from an easy-to-learn graphical user interface or from MATLAB scripts, and a command history function makes it easy for users with no programming experience to write scripts. Consequently, ERPLAB provides both ease of use and virtually unlimited power and flexibility, making it appropriate for the analysis of both simple and complex ERP experiments. Several forms of documentation are available, including a detailed user’s guide, a step-by-step tutorial, a scripting guide, and a set of video-based demonstrations.
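ERPLAB itself runs in MATLAB; purely as an illustration of the averaging and difference-wave operations it provides (not ERPLAB's actual API), a minimal Python sketch:

```python
def average_erp(epochs):
    """Point-by-point average of time-locked EEG segments (epochs)."""
    n = len(epochs)
    return [sum(ep[t] for ep in epochs) / n for t in range(len(epochs[0]))]

def difference_wave(erp_a, erp_b):
    """Algebraic recombination of two averaged ERPs (e.g. rare minus frequent)."""
    return [a - b for a, b in zip(erp_a, erp_b)]

# Two toy conditions, two epochs each, three time samples per epoch.
rare     = [[1.0, 3.0, 2.0], [3.0, 5.0, 4.0]]
frequent = [[0.0, 1.0, 1.0], [2.0, 3.0, 1.0]]
print(difference_wave(average_erp(rare), average_erp(frequent)))  # → [1.0, 2.0, 2.0]
```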

  16. Reusable LH2 tank technology demonstration through ground test

    Science.gov (United States)

    Bianca, C.; Greenberg, H. S.; Johnson, S. E.

    1995-01-01

    The paper presents the project plan to demonstrate, by March 1997, the reusability of an integrated composite LH2 tank structure, cryogenic insulation, and thermal protection system (TPS). The plan includes establishment of design requirements and a comprehensive trade study to select the most suitable Reusable Hydrogen Composite Tank System (RHCTS) within the most suitable of 4 candidate structural configurations. The 4 candidate vehicles are winged bodies with the capability to deliver 25,000 lbs of payload to a circular 220 nm, 51.6 degree inclined orbit (and 40,000 lbs to a 28.5 degree inclined 150 nm orbit). A prototype design of the selected RHCTS is established to identify the construction, fabrication, and stress simulation and test requirements necessary for an 8 foot diameter tank structure/insulation/TPS test article. A comprehensive development test program supports the 8 foot test article development and involves the composite tank itself, the cryogenic insulation, and the integrated tank/insulation/TPS designs. The 8 foot diameter tank will contain the integrated cryogenic insulation and TPS designs resulting from this development and from that of the concurrent lightweight durable TPS program. Tank ground testing will include 330 cycles of LH2 filling, pressurization, body loading, depressurization, draining, and entry heating.

  17. Russian aluminum-lithium alloys for advanced reusable spacecraft

    International Nuclear Information System (INIS)

    Charette, Ray O.; Leonard, Bruce G.; Bozich, William F.; Deamer, David A.

    1998-01-01

    Cryotanks that are cost-affordable, robust, fuel-compatible, and lighter weight than current aluminum designs are needed to support next-generation launch system performance and operability goals. The Boeing (McDonnell Douglas Aerospace-MDA) and NASA Delta Clipper-Experimental Program (DC-XA) flight demonstrator test bed vehicle provided the opportunity for technology transfer of Russia's extensive experience base with weight-efficient, highly weldable aluminum-lithium (Al-Li) alloys for cryogenic tank usage. As part of NASA's overall reusable launch vehicle (RLV) program to help provide technology and operations data for use in advanced RLVs, MDA contracted with the Russian Academy of Sciences (RAS/IMASH) for the design, test, and delivery of 1460 Al-Li alloy liquid oxygen (LO2) cryotanks: one for development, one for ground tests, and one for DC-XA flight tests. This paper describes the development of Al-Li 1460 alloy for reusable LO2 tanks, including alloy composition tailoring, a mechanical properties database, forming, welding, chemical milling, dissimilar metal joining, corrosion protection, proof testing of completed tanks, and qualification testing. Mechanical properties of the parent and welded materials exceeded expectations, particularly the fracture toughness, which promises excellent reuse potential. The LO2 cryotank was successfully demonstrated in DC-XA flight tests.

  18. Reusable Software Usability Specifications for mHealth Applications.

    Science.gov (United States)

    Cruz Zapata, Belén; Fernández-Alemán, José Luis; Toval, Ambrosio; Idri, Ali

    2018-01-25

    One of the key factors for the adoption of mobile technologies, and in particular of mobile health applications, is usability. A usable application will be easier to use and understand by users, and will improve the user's interaction with it. This paper proposes a software requirements catalog for usable mobile health applications, which can be used for the development of new applications or the evaluation of existing ones. The catalog is based on the main sources identified in the literature on usability and mobile health applications. Our catalog was organized according to the ISO/IEC/IEEE 29148:2011 standard and follows the SIREN methodology to create reusable catalogs. The applicability of the catalog was verified by the creation of an audit method, which was used to perform the evaluation of a real app, S Health, an application created by Samsung Electronics Co. The usability requirements catalog, along with the audit method, identified several usability flaws in the evaluated app, which scored 83%. Some flaws were detected in the app related to the navigation pattern. Further issues related to the startup experience, empty screens, or writing style were also found. The way a user navigates through an application improves or deteriorates the user's experience with it. We proposed a reusable usability catalog and an audit method. This proposal was used to evaluate a mobile health application. An audit report was created with the usability issues identified in the evaluated application.

  19. Mountain Rivers and Climate Change: Analysis of hazardous events in torrents of small alpine watersheds

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    events dating back several decades is analysed. Precipitation thresholds varying in space and time are established using highly resolved INCA data of the Austrian weather service. Parameters possibly controlling the basic susceptibility of catchments are evaluated in a regional GIS analysis (vegetation, geology, topography, stream network, proxies for sediment availability). Similarity measures are then used to group catchments into sensitivity classes. Applying different climate scenarios, the spatiotemporal distribution of catchments sensitive towards heavier and more frequent precipitation can be determined giving valuable advice for planning and managing mountain protection zones.

  20. Time compression of soil erosion by the effect of largest daily event. A regional analysis of USLE database.

    Science.gov (United States)

    Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.

    2009-04-01

    When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedented erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focussed on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes. It means that a substantial share of the total soil eroded is produced in very short temporal intervals, i.e. in a few events, mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, produce a large amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive permanent attention, heightened at present by scenarios of global change. Notwithstanding, the time compression of geomorphological processes can be approached not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a new complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by rank, whatever their magnitude. In previous research on the time compression of soil erosion, using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total number of daily erosive events recorded per plot and the percentage contribution of the n largest aggregated daily events to total soil erosion. Now we offer a further refined analysis comparing different agricultural regions in the USA. To do that we have analyzed data from 594 erosion plots from USLE
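The rank-based measure described here, i.e. the share of total soil loss contributed by the n largest daily events, can be sketched as follows; the plot record below is a hypothetical example, not USLE data:

```python
def largest_event_contribution(daily_erosion, n):
    """Percentage of total soil loss contributed by the n largest daily
    events, ranked by magnitude rather than by a deviation threshold."""
    ranked = sorted(daily_erosion, reverse=True)
    return 100.0 * sum(ranked[:n]) / sum(ranked)

# Hypothetical plot record (soil loss per erosive day): a few events dominate.
record = [0.1, 0.3, 12.0, 0.2, 5.5, 0.4, 0.1, 2.5, 0.2, 0.7]
print(round(largest_event_contribution(record, 3), 1))  # → 90.9
```

In this toy record the 3 largest of 10 erosive days account for roughly 91% of the total, the kind of concentration the time-compression argument describes.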

  1. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not address the internal data structure concerning extremes adequately (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the extremes concept provides new information on time series properties, variability, trends, and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. The findings described above could also be proven for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle), showing the strong influence of atmospheric
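The extreme value approach starts from threshold exceedances rather than deviations from the mean; a generalized Pareto distribution is then typically fitted to the exceedances. A minimal counting sketch with hypothetical thresholds and data (not the Arosa record):

```python
def threshold_exceedances(series, low, high):
    """Split a total ozone series (Dobson units) into days below a low
    threshold and days above a high threshold; these exceedances, not
    deviations from the mean, feed the extreme value analysis."""
    lows = [v for v in series if v < low]
    highs = [v for v in series if v > high]
    return len(lows), len(highs)

# Hypothetical daily column ozone values around a ~330 DU level.
ozone = [300, 335, 250, 340, 420, 310, 280, 360, 450, 330]
print(threshold_exceedances(ozone, low=280, high=400))  # → (1, 2)
```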

  2. Investigation of Lab Fire Prevention Management System of Combining Root Cause Analysis and Analytic Hierarchy Process with Event Tree Analysis

    Directory of Open Access Journals (Sweden)

    Cheng-Chan Shih

    2016-01-01

    Full Text Available This paper proposed a new approach, combining root cause analysis (RCA), analytic hierarchy process (AHP), and event tree analysis (ETA) in a loop to systematically evaluate various laboratory safety prevention strategies. First, 139 fire accidents were reviewed to identify the root causes and draw out prevention strategies. Most fires were caused by runaway reactions, operation error and equipment failure, and flammable material release. These mostly occurred in workplaces with no prompt fire protection. We also used AHP to evaluate the priority of these strategies and found that chemical fire prevention strategy is the most important control element, and strengthening maintenance and safety inspection intensity is the most important action. Together with our survey results, we proposed that equipment design is also critical for fire prevention. Therefore a technical improvement was propounded: installing a fire detector, an automatic sprinkler, and a manual extinguisher in the lab hood as proactive fire protection. ETA was then used as a tool to evaluate laboratory fire risks. The results indicated that the total risk of a fire occurring decreases from 0.0351 without to 0.0042 with the protective equipment in place. Establishing such a system allows the Environment, Health and Safety (EH&S) office not only to analyze and prioritize fire prevention policies more practically, but also to demonstrate how much effective protective equipment improvement can achieve and the probability of the initiating event developing into a serious accident or being controlled by the existing safety system.
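The ETA step above multiplies an initiating event frequency by the failure probabilities of successive protection layers along a branch of the event tree. A minimal sketch; all numbers below are hypothetical, not those reported in the study:

```python
def worst_branch_frequency(init_freq, barrier_fail_probs):
    """Frequency of the event tree branch in which the initiating fire
    defeats every protection layer in turn (layers assumed independent)."""
    p = init_freq
    for q in barrier_fail_probs:
        p *= q
    return p

# Without added protection, no layer intercepts the fire (failure prob 1.0);
# with detector, sprinkler, and extinguisher, each layer stops some fires.
without = worst_branch_frequency(0.04, [1.0, 1.0, 1.0])
with_eq = worst_branch_frequency(0.04, [0.5, 0.3, 0.6])
print(without, round(with_eq, 4))  # → 0.04 0.0036
```

Each added layer multiplies the worst-case branch frequency by its failure probability, which is why stacking even modestly reliable barriers reduces the total risk so sharply.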

  3. Single-Event Effects in High-Frequency Linear Amplifiers: Experiment and Analysis

    Science.gov (United States)

    Zeinolabedinzadeh, Saeed; Ying, Hanbin; Fleetwood, Zachary E.; Roche, Nicolas J.-H.; Khachatrian, Ani; McMorrow, Dale; Buchner, Stephen P.; Warner, Jeffrey H.; Paki-Amouzou, Pauline; Cressler, John D.

    2017-01-01

    The single-event transient (SET) response of two different silicon-germanium (SiGe) X-band (8-12 GHz) low noise amplifier (LNA) topologies is fully investigated in this paper. The two LNAs were designed and implemented in a 130 nm SiGe HBT BiCMOS process technology. Two-photon absorption (TPA) laser pulses were utilized to induce transients within various devices in these LNAs. Impulse response theory is identified as a useful tool for predicting the settling behavior of the LNAs subjected to heavy ion strikes. Comprehensive device- and circuit-level modeling and simulations were performed to accurately simulate the behavior of the circuits under ion strikes. The simulations agree well with the TPA measurements. The simulation, modeling, and analysis presented in this paper can be applied to other circuit topologies for SET modeling and prediction.

  4. A Study on Degree of Conservatism of PZR Inventory during Event Analysis

    International Nuclear Information System (INIS)

    Lee, Sang Seob; Park, Min Soo; Huh, Jae Yong; Lee, Gyu Cheon

    2016-01-01

    The pressurizer safety valves (PSVs) are installed in OPR1000 plants. While the pressurizer pilot-operated safety relief valve (POSRV) of the APR1400 is designed to discharge steam and/or water, the PSV is designed to discharge steam only. To check the degree of conservatism of the PZR water level during PSV operation, a study was performed using the RELAP5/MOD3.3 computer code. The degree of conservatism is described herein, and results are shown to evaluate it with respect to the PZR inventory for an OPR1000 plant. It can be concluded that there is no possibility that liquid passes through the PSVs during a PLCS malfunction, because the expected maximum PZR inventory remains below the PSV nozzle under the conservative assumptions. With the site-specific PSV characteristics, a degree of conservatism would be determined to guarantee the PSV integrity during the event. To guarantee the PSV integrity, an independent analysis is recommended.

  5. A Study on Degree of Conservatism of PZR Inventory during Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Seob; Park, Min Soo; Huh, Jae Yong; Lee, Gyu Cheon [KEPCO Engineering and Construction Co. Ltd., Deajeon (Korea, Republic of)

    2016-10-15

    The pressurizer safety valves (PSVs) are installed in OPR1000 plants. While the pressurizer pilot-operated safety relief valve (POSRV) of the APR1400 is designed to discharge steam and/or water, the PSV is designed to discharge steam only. To check the degree of conservatism of the PZR water level during PSV operation, a study was performed using the RELAP5/MOD3.3 computer code. The degree of conservatism is described herein, and results are shown to evaluate it with respect to the PZR inventory for an OPR1000 plant. It can be concluded that there is no possibility that liquid passes through the PSVs during a PLCS malfunction, because the expected maximum PZR inventory remains below the PSV nozzle under the conservative assumptions. With the site-specific PSV characteristics, a degree of conservatism would be determined to guarantee the PSV integrity during the event. To guarantee the PSV integrity, an independent analysis is recommended.

  6. Emergency Load Shedding Strategy Based on Sensitivity Analysis of Relay Operation Margin against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun

    2012-01-01

    In order to prevent long-term voltage instability and induced cascading events, a load shedding strategy based on the sensitivity of relay operation margin to load powers is discussed and proposed in this paper. The operation margin of the critical impedance backup relay is defined to identify the runtime emergent states of the related system components. Based on a sensitivity analysis between the relay operation margin and power system state variables, an optimal load shedding strategy is applied to adjust the emergent states in time, before the unwanted relay operation. Load dynamics are also taken into account to compensate the load shedding amount calculation, and multi-agent technology is applied for the whole strategy implementation. A test system built in a real-time digital simulator (RTDS) has demonstrated the effectiveness of the proposed strategy.

  7. Superposed ruptile deformational events revealed by field and VOM structural analysis

    Science.gov (United States)

    Kumaira, Sissa; Guadagnin, Felipe; Keller Lautert, Maiara

    2017-04-01

    Virtual outcrop models (VOM) are becoming an important tool in the analysis of geological structures due to the possibility of obtaining the geometry, and in some cases kinematic aspects, of the analyzed structures in a three-dimensional photorealistic space. These data are used to gain quantitative information on deformational features which, coupled with numeric models, can assist in understanding deformational processes. Old basement units commonly register superposed deformational events, either ductile or ruptile, along their evolution. The Porongos Belt, located in southern Brazil, has a complex deformational history registering at least five ductile and ruptile deformational events. In this study, we present a structural analysis of a quarry in the Porongos Belt, coupling field and VOM structural information to understand the processes involved in the last two deformational events. Field information was acquired using traditional structural methods for the analysis of ruptile structures, such as descriptions, drawings, acquisition of orientation vectors, and kinematic analysis. The VOM was created with the image-based modeling method through photogrammetric data acquisition and orthorectification. Photogrammetric data were acquired using a Sony a3500 camera; a total of 128 photographs were taken from ca. 10-20 m from the outcrop in different orientations. Thirty-two control point coordinates were acquired using a combination of RTK dGPS surveying and total station work, providing a precision of a few millimeters in x, y and z. Photographs were imported into the PhotoScan software to create a 3D dense point cloud with a structure-from-motion algorithm, which was triangulated and textured to generate the VOM. The VOM was georeferenced (oriented and scaled) using the ground control points, and later analyzed in the OpenPlot software to extract structural information. Data were imported into the Wintensor software to obtain tensor orientations, and the Move software to process and

  8. Analysis of core-concrete interaction event with flooding for the Advanced Neutron Source reactor

    International Nuclear Information System (INIS)

    Kim, S.H.; Taleyarkhan, R.P.; Georgevich, V.; Navarro-Valenti, S.

    1993-01-01

    This paper discusses salient aspects of the methodology, assumptions, and modeling of various features related to estimation of source terms from an accident involving a molten core-concrete interaction event (with and without flooding) in the Advanced Neutron Source (ANS) reactor at the Oak Ridge National Laboratory. Various containment configurations are considered for this postulated severe accident. Several design features (such as rupture disks) are examined to study containment response during this severe accident. Also, thermal-hydraulic response of the containment and radionuclide transport and retention in the containment are studied. The results are described as transient variations of source terms, which are then used for studying off-site radiological consequences and health effects for the support of the Conceptual Safety Analysis Report for ANS. The results are also to be used to examine the effectiveness of subpile room flooding during this type of severe accident

  9. Retrospective Analysis of Communication Events - Understanding the Dynamics of Collaborative Multi-Party Discourse

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, Andrew J.; Haack, Jereme N.; McColgin, Dave W.

    2006-06-08

    This research is aimed at understanding the dynamics of collaborative multi-party discourse across multiple communication modalities. Before we can truly make significant strides in devising collaborative communication systems, there is a need to understand how typical users utilize computationally supported communications mechanisms such as email, instant messaging, video conferencing, chat rooms, etc., both singularly and in conjunction with traditional means of communication such as face-to-face meetings, telephone calls and postal mail. Attempting to understand an individual's communications profile with access to only a single modality is challenging at best and often futile. Here, we discuss the development of RACE - Retrospective Analysis of Communications Events - a test-bed prototype to investigate issues relating to multi-modal multi-party discourse.

  10. Transition-Region Ultraviolet Explosive Events in IRIS Si IV: A Statistical Analysis

    Science.gov (United States)

    Bartz, Allison

    2018-01-01

    Explosive events (EEs) in the solar transition region are characterized by broad, non-Gaussian line profiles with wings at Doppler velocities exceeding the speed of sound. We present a statistical analysis of 23 IRIS (Interface Region Imaging Spectrograph) sit-and-stare observations made between April 2014 and March 2017. Using the IRIS Si IV 1394 Å and 1403 Å spectral windows and the 1400 Å slit-jaw images, we identified 581 EEs. We found that most EEs last less than 20 minutes and have a spatial scale on the slit of less than 10″, agreeing with measurements in previous work. We observed most EEs in active regions, regardless of date of observation, but selection bias in IRIS observations cannot be ruled out. We also present preliminary findings on optical depth effects from our statistical study.

  11. An Entry Point for Formal Methods: Specification and Analysis of Event Logs

    Directory of Open Access Journals (Sweden)

    Howard Barringer

    2010-03-01

    Formal specification languages have long languished, due to the grave scalability problems faced by complete verification methods. Runtime verification promises to use formal specifications to automate part of the more scalable art of testing, but has not been widely applied to real systems, and often falters due to the cost and complexity of instrumentation for online monitoring. In this paper we discuss work in progress to apply an event-based specification system to the logging mechanism of the Mars Science Laboratory mission at JPL. By focusing on log analysis, we exploit the "instrumentation" already implemented and required for communicating with the spacecraft. We argue that this work both shows a practical method for using formal specifications in testing and opens interesting research avenues, including a challenging specification learning problem.

  12. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the precipitation registered between 28 November and 7 December was so remarkable that the 1876 episode still holds the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of the time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intensely negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper-air fields in order to shed some light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over
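The multi-day extreme statistic cited in the abstract (maximum average daily precipitation over windows of 2 to 10 days) is a simple sliding-window computation. The daily series below is invented for illustration; it is not the Lisbon or Evora record.

```python
# Sketch: for a window length n, find the largest mean precipitation over any
# n consecutive days. The daily values (mm/day) are hypothetical.

def max_n_day_mean(daily_mm, n):
    """Largest mean over any n consecutive days of a daily series."""
    if n > len(daily_mm):
        raise ValueError("window longer than series")
    return max(sum(daily_mm[i:i + n]) / n
               for i in range(len(daily_mm) - n + 1))

daily = [0.0, 5.2, 31.0, 64.5, 80.1, 42.3, 12.0, 3.1]  # hypothetical mm/day
for n in range(2, 6):
    print(n, round(max_n_day_mean(daily, n), 1))
```

Longer windows dilute the peak, which is why a single episode dominating every scale from 2 to 10 days (as in December 1876) is so unusual.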

  13. Analysis of factors associated with hiccups based on the Japanese Adverse Drug Event Report database.

    Science.gov (United States)

    Hosoya, Ryuichiro; Uesawa, Yoshihiro; Ishii-Nozawa, Reiko; Kagaya, Hajime

    2017-01-01

    Hiccups are occasionally experienced by most individuals. Although hiccups are not life-threatening, they may lead to a decline in quality of life. Previous studies showed that hiccups may occur as an adverse effect of certain medicines during chemotherapy. Furthermore, a male dominance in hiccups has been reported. However, due to the limited number of studies conducted on this phenomenon, the factors influencing hiccups remain under debate. The present study aimed to investigate the influence of medicines and patient characteristics on hiccups using a large adverse drug event report database, specifically the Japanese Adverse Drug Event Report (JADER) database. Cases of adverse effects associated with medications were extracted from JADER, and Fisher's exact test was performed to assess the presence or absence of hiccups for each medication. In a multivariate analysis, we conducted a multiple logistic regression analysis using the medication and patient-characteristic variables that showed significance. We also examined the role of dexamethasone in inducing hiccups during chemotherapy. Medicines associated with hiccups included dexamethasone, levofolinate, fluorouracil, oxaliplatin, carboplatin, and irinotecan. Patient characteristics associated with hiccups included male gender and greater height. The combination of anti-cancer agent and dexamethasone use was noted in more than 95% of patients in the dexamethasone-use group. Hiccups also occurred in patients in the anti-cancer agent-use group who did not use dexamethasone. Most of the medications that induce hiccups are used in chemotherapy. The results of the present study suggest that it is possible to predict a high risk of hiccups using patient characteristics. We confirmed that dexamethasone was the drug with the strongest influence on the induction of hiccups. However, the influence of anti-cancer agents on the induction of hiccups cannot be denied.
We consider the results of the present
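The per-medication screen described above amounts to a 2x2 contingency table (reports with/without hiccups, for the drug vs. all other drugs) tested with Fisher's exact test. A minimal stdlib sketch, with hypothetical counts rather than actual JADER figures:

```python
# Hedged sketch of a disproportionality screen. Counts are invented.
from math import comb

def fisher_one_sided_p(a, b, c, d):
    """P(X >= a) under the hypergeometric null for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, col1 = a + b, a + c          # drug reports; hiccup reports
    denom = comb(n, col1)
    p = 0.0
    for x in range(a, min(row1, col1) + 1):
        if 0 <= col1 - x <= c + d:
            p += comb(row1, x) * comb(c + d, col1 - x) / denom
    return p

# hypothetical: 40/1000 reports for the drug mention hiccups vs. 60/9000 others
a, b, c, d = 40, 960, 60, 8940
ror = (a * d) / (b * c)                # reporting odds ratio
p = fisher_one_sided_p(a, b, c, d)
print(round(ror, 2), p < 0.05)         # a strong, significant signal
```

In practice a library routine (e.g. SciPy's two-sided Fisher test) would replace the hand-rolled hypergeometric sum; the sketch only shows the shape of the screen applied per medication.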

  14. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    Science.gov (United States)

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
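The intersection matrix at the core of ASSET can be illustrated in a few lines, under simplifying assumptions: spike trains are discretized into time bins, and entry (i, j) counts how many neurons are active in both bin i and bin j, so a repeated SSE shows up as off-diagonal entries of high overlap. The toy data below are not from the paper.

```python
# Sketch of ASSET's intersection matrix on binned 0/1 spike trains.
def intersection_matrix(binned):
    """binned: list of per-neuron 0/1 activity lists -> bin-by-bin overlap counts."""
    n_bins = len(binned[0])
    return [[sum(neuron[i] * neuron[j] for neuron in binned)
             for j in range(n_bins)]
            for i in range(n_bins)]

# 4 neurons, 6 bins; neurons 0-2 fire together in bins 1 and 4 (a repeated SSE)
binned = [
    [0, 1, 0, 0, 1, 0],
    [0, 1, 0, 0, 1, 0],
    [0, 1, 0, 0, 1, 0],
    [1, 0, 0, 1, 0, 0],
]
M = intersection_matrix(binned)
print(M[1][4])  # 3: the three-neuron event in bin 1 recurs in bin 4
```

ASSET's contribution is the automated part that follows this step: statistically assessing which entries are significant, clustering them into diagonal structures, and recovering the participating neurons.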

  15. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6, with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency, with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs
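Mean core damage frequencies with 5%/95% bounds, as quoted above, are typically produced by propagating per-sequence uncertainties through a Monte Carlo sampling step. A generic sketch of that propagation, with hypothetical medians and error factors rather than the NUREG/CR-4550 inputs:

```python
# Hedged sketch of uncertainty propagation to a total core damage frequency.
# Accident-class medians and error factors below are invented for illustration.
import math
import random

random.seed(0)

# (median frequency per reactor-year, error factor = p95 / median)
classes = {"station blackout": (1.5e-6, 5.0),
           "ATWS": (1.3e-6, 5.0),
           "other transients": (4.0e-7, 3.0)}

samples = []
for _ in range(20000):
    total = 0.0
    for median, ef in classes.values():
        sigma = math.log(ef) / 1.645       # lognormal with p95 = median * EF
        total += random.lognormvariate(math.log(median), sigma)
    samples.append(total)

samples.sort()
mean = sum(samples) / len(samples)
p05 = samples[int(0.05 * len(samples))]
p95 = samples[int(0.95 * len(samples))]
print(f"mean={mean:.2e}  5%={p05:.2e}  95%={p95:.2e}")
```

The asymmetry of the reported bounds (5% bound an order of magnitude below the mean, 95% bound only a factor of a few above) is characteristic of summing skewed, lognormal-like sequence frequencies.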

  16. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, based on past studies and using analysts experienced in these techniques, the analyses can be focused on certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights obtained by the more costly studies. 82 refs., 35 figs., 27 tabs

  17. Top-down and bottom-up definitions of human failure events in human reliability analysis

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2014-01-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down - defined as a subset of the PRA - whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up - derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  18. Top-down proteomics for the analysis of proteolytic events - Methods, applications and perspectives.

    Science.gov (United States)

    Tholey, Andreas; Becker, Alexander

    2017-11-01

    Mass spectrometry based proteomics is an indispensable tool for almost all research areas relevant to the understanding of proteolytic processing, ranging from the identification of substrates, products and cleavage sites up to the analysis of structural features influencing protease activity. The majority of methods for these studies are based on bottom-up proteomics, performing analysis at the peptide level. As this approach is characterized by a number of pitfalls, e.g. the loss of molecular information, there is an ongoing effort to establish top-down proteomics, performing both separation and MS analysis at the intact protein level. We briefly introduce the major approaches of bottom-up proteomics used in the field of protease research and highlight the shortcomings of these methods. We then discuss the present state of the art of top-down proteomics. Together with a discussion of known challenges, we show the potential of this approach and present a number of successful applications of top-down proteomics in protease research. This article is part of a Special Issue entitled: Proteolysis as a Regulatory Event in Pathophysiology edited by Stefan Rose-John. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Cardiopulmonary resuscitation in the elderly: analysis of the events in the emergency department

    Directory of Open Access Journals (Sweden)

    Augusto Tricerri

    2013-10-01

    With the increasing number of old people in all western countries and increasing life expectancy at birth, many seniors spend the last period of their life with various afflictions that may lead to cardiac arrest. Bystander cardiopulmonary resuscitation (CPR) increases survival rates. Octogenarians are the fastest growing segment of the population, and despite empirical evidence that CPR is of questionable effectiveness in seniors with comorbidities, it is still the only treatment among life-sustaining ones. Cardiopulmonary resuscitation is frequently unsuccessful, but if survival is achieved, a fairly good quality of life can be expected. Various papers have analyzed the effect of CPR on hospitalized patients or on cardiac arrest occurring at home or in public places, while less is known about events occurring in the emergency room (ER). We performed a retrospective analysis of cardiac arrest events that occurred in the ER over 54 months: we analyzed 415,001 records of ER visits (from 01/01/1999 to 30/06/2003) in San Giovanni Addolorata Hospital. Data were analyzed in terms of age and outcome. We identified 475 records with the outcome of death in the ER or death on arrival. Of these, we selected 290 medical records with sufficient data to be analyzed. Of the 290 patients evaluated, 225 died in the ER, 18 were deemed dead on arrival, and 47 survived the cardiac arrest and were admitted to the intensive care unit (ICU). The overall mortality was 0.11%, while the incidence of the selected events was 0.072%. The mean age of the analyzed population was 71.3 years. The only possible diagnosis was often cardiac arrest, though most of the time we could specify and group the diagnosis further. The analysis of the procedures showed that cardiac arrest treated by direct-current (DC) shock was similarly distributed across the age groups, and no difference was detectable between the two groups. The mean age of the patients who underwent tracheal intubation (TI) was

  20. Climate change impacts on extreme events in the United States: an uncertainty analysis

    Science.gov (United States)

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  1. A CORBA BASED ARCHITECTURE FOR ACCESSING REUSABLE SOFTWARE COMPONENTS ON THE WEB.

    Directory of Open Access Journals (Sweden)

    R. Cenk ERDUR

    2003-01-01

    In the very near future, as a result of the continuous growth of the Internet and advances in networking technologies, the Internet will become the common software repository for people and organizations who employ a component-based reuse approach in their software development life cycles. In order to use reusable components such as source code, analyses, designs, and design patterns during new software development processes, environments that support the identification of components over the Internet are needed. The basic elements of such an environment are the coordinator programs which deliver user requests to appropriate component libraries, user interfaces for querying, and programs that wrap the component libraries. First, a CORBA-based architecture is proposed for such an environment. Then, an alternative architecture based on the Java 2 platform technologies is given for the same environment. Finally, the two architectures are compared.

  2. Managing Complexity in Activity Specifications by Separation of Concerns and Reusability

    Directory of Open Access Journals (Sweden)

    Peter Forbrig

    2016-10-01

    The specification of the activities of the different stakeholders is an important step in software development. Currently, many specification languages, such as task models, activity diagrams, state charts, and business specifications, are used to document the results of domain analysis in most projects. The paper discusses the aspect of reusability by considering generic submodels. This approach increases the quality of models. Additionally, the separation of concerns between cooperation and individual work through subject-oriented specifications is discussed. It will be demonstrated how task models can support subject-oriented specification by so-called team models and role models in a more precise way than S-BPM specifications. More precise restrictions on instances of roles can be specified.

  3. RAGE Reusable Game Software Components and Their Integration into Serious Game Engines

    NARCIS (Netherlands)

    Van der Vegt, Wim; Nyamsuren, Enkhbold; Westera, Wim

    2016-01-01

    This paper presents and validates a methodology for integrating reusable software components in diverse game engines. While conforming to the RAGE component-based architecture described elsewhere, the paper explains how the interactions and data exchange processes between a reusable software

  4. 14 CFR 437.95 - Inspection of additional reusable suborbital rockets.

    Science.gov (United States)

    2010-01-01

    ... suborbital rockets. 437.95 Section 437.95 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL... of an Experimental Permit § 437.95 Inspection of additional reusable suborbital rockets. A permittee may launch or reenter additional reusable suborbital rockets of the same design under the permit after...

  5. Coordination activities of human planners during rescheduling: Case analysis and event handling procedure

    OpenAIRE

    2010-01-01

    Abstract This paper addresses the process of event handling and rescheduling in manufacturing practice. Firms are confronted with many diverse events, like new or changed orders, machine breakdowns, and material shortages. These events influence the feasibility and optimality of schedules, and thus induce rescheduling. In many manufacturing firms, schedules are created by several human planners. Coordination between them is needed to respond to events adequately. In this paper,...

  6. 6C polarization analysis - seismic direction finding in coherent noise, automated event identification, and wavefield separation

    Science.gov (United States)

    Schmelzbach, C.; Sollberger, D.; Greenhalgh, S.; Van Renterghem, C.; Robertsson, J. O. A.

    2017-12-01

    Polarization analysis of standard three-component (3C) seismic data is an established tool to determine the propagation directions of seismic waves recorded by a single station. A major limitation of seismic direction-finding methods using 3C recordings, however, is that a correct propagation-direction determination is only possible if the wave mode is known. Furthermore, 3C polarization analysis techniques break down in the presence of coherent noise (i.e., when more than one event is present in the analysis time window). Recent advances in sensor technology (e.g., fibre-optical sensors, magnetohydrodynamic angular rate sensors, and ring laser gyroscopes) have made it possible to accurately measure all three components of rotational ground motion exhibited by seismic waves, in addition to the conventionally recorded three components of translational motion. Here, we present an extension of the theory of single-station 3C polarization analysis to six-component (6C) recordings of collocated translational and rotational ground motions. We demonstrate that the information contained in rotation measurements can help to overcome some of the main limitations of standard 3C seismic direction finding, such as handling multiple arrivals simultaneously. We show that the 6C polarization of elastic waves measured at the Earth's free surface depends not only on the seismic wave type and propagation direction, but also on the local P- and S-wave velocities just beneath the recording station. Using an adaptation of the multiple signal classification algorithm (MUSIC), we demonstrate how seismic events can be unambiguously identified and characterized in terms of their wave type. Furthermore, we show how the local velocities can be inferred from single-station 6C data, in addition to the direction angles (inclination and azimuth) of seismic arrivals.
A major benefit of our proposed 6C method is that it also allows the accurate recovery of the wave type, propagation directions, and phase
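The classical 3C step that the 6C method extends can be sketched as follows: within an analysis window, the dominant particle-motion direction is the principal eigenvector of the 3x3 covariance of the (Z, N, E) traces, found here by power iteration. The synthetic "recording" is a rectilinear, P-like arrival plus noise along an assumed direction; it is illustrative only and not the paper's method.

```python
# Hedged sketch of single-station 3C polarization analysis on synthetic data.
import math
import random

random.seed(1)
true_dir = (0.6, 0.64, 0.48)   # assumed unit propagation direction

# synthetic rectilinear motion along true_dir with small additive noise
window = []
for t in range(200):
    amp = math.sin(0.3 * t)
    window.append(tuple(amp * d + random.gauss(0, 0.02) for d in true_dir))

# 3x3 covariance matrix of the three components
mean = [sum(s[k] for s in window) / len(window) for k in range(3)]
cov = [[sum((s[i] - mean[i]) * (s[j] - mean[j]) for s in window) / len(window)
        for j in range(3)] for i in range(3)]

# power iteration for the dominant eigenvector (the polarization direction)
v = [1.0, 1.0, 1.0]
for _ in range(100):
    w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

alignment = abs(sum(a * b for a, b in zip(v, true_dir)))
print(round(alignment, 2))  # alignment with the true direction (close to 1 here)
```

This is exactly the step that fails when two arrivals overlap in the window or the wave mode is unknown, which motivates the rotational (6C) extension and the MUSIC-based classification in the abstract.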

  7. Economic impact and market analysis of a special event: The Great New England Air Show

    Science.gov (United States)

    Rodney B. Warnick; David C. Bojanic; Atul Sheel; Apurv Mather; Deepak Ninan

    2010-01-01

    We conducted a post-event evaluation for the Great New England Air Show to assess its general economic impact and to refine economic estimates where possible. In addition to the standard economic impact variables, we examined travel distance, purchase decision involvement, event satisfaction, and frequency of attendance. Graphic mapping of event visitors' home ZIP...

  8. Analysis of economic and social costs of adverse events associated with blood transfusions in Spain

    Directory of Open Access Journals (Sweden)

    Borja Ribed-Sánchez

    2018-05-01

    Objective: To calculate, for the first time, the direct and social costs of transfusion-related adverse events, so that they can be included in the National Healthcare System's budgets, calculations, and studies. In Spain, more than 1,500 patients are diagnosed with such adverse events every year. Method: Blood transfusion-related adverse events recorded yearly in Spanish haemovigilance reports were studied retrospectively (2010-2015). The adverse events were coded according to the Diagnosis-Related Groups classification. Direct healthcare costs were obtained from public information sources. The productivity loss (social cost) associated with adverse events was calculated using the human capital and hedonic salary methodologies. Results: In 2015, 1,588 patients had adverse events that resulted in direct healthcare costs (4,568,914 €) and social costs due to hospitalization (200,724 €). Three adverse reactions resulted in patient death (at a social cost of 1,364,805 €). In total, the cost of blood transfusion-related adverse events in Spain was 6,134,443 €. Over the period 2010-2015, the total number of transfusions fell (2.00 vs. 1.91 million; -4.4%), while the number of adverse events increased (822 vs. 1,588; +93%), as did the related direct healthcare cost (3.22 vs. 4.57 M€; +42%) and the social cost of hospitalization (0.11 vs. 0.20 M€; +83%). Mortality costs decreased (2.65 vs. 1.36 M€; -48%). Discussion: This is the first time that the costs of post-transfusion adverse events have been calculated in Spain. These new figures and trends should be taken into consideration in any cost-effectiveness study or trial of new surgical techniques or health policies that influence blood transfusion activities.

  9. The use of geoinformatic data and spatial analysis to predict faecal pollution during extreme precipitation events

    Science.gov (United States)

    Ward, Ray; Purnell, Sarah; Ebdon, James; Nnane, Daniel; Taylor, Huw

    2013-04-01

    be a major factor contributing to increased levels of FIO. This study identifies areas within the catchment that are likely to demonstrate elevated erosion rates during extreme precipitation events, which are likely to result in raised levels of FIO. The results also demonstrate that increases in the human faecal marker were associated with the discharge points of wastewater treatment works, and that levels of the marker increased whenever the works discharged untreated wastewaters during extreme precipitation. Spatial analysis also highlighted locations where human faecal pollution was present in areas away from wastewater treatment plants, highlighting the potential significance of inputs from septic tanks and other un-sewered domestic wastewater systems. Increases in the frequency of extreme precipitation events in many parts of Europe are likely to result in increased levels of water pollution from both point- and diffuse-sources, increasing the input of pathogens into surface waters, and elevating the health risks to downstream consumers of abstracted drinking water. This study suggests an approach that integrates water microbiology and geoinformatic data to support a 'prediction and prevention' approach, in place of the traditional focus on water quality monitoring. This work may therefore make a significant contribution to future European water resource management and health protection.

  10. The Association of Unfavorable Traffic Events and Cannabis Usage: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Sorin Hostiuc

    2018-02-01

    Background: In recent years, many epidemiological articles have been published that aim to link driving under the influence of cannabis (DUIC) with the risk of various unfavorable traffic events (UTEs), sometimes with contradictory results. Aim: The primary objective of this study was to analyze whether there is a significant association between DUIC and UTEs. Materials and Methods: We used two meta-analytical methods to assess the statistical significance of the effect size: a random-effects model and an inverse variance heterogeneity model. Results: Twenty-four studies were included in the meta-analysis. We obtained significant increases in the effect size for DUIC tested through blood analysis, with an odds ratio (OR) of 1.97 and a confidence interval (CI) between 1.35 and 2.87; for death as an outcome, with an OR of 1.56 and a CI between 1.16 and 2.09; and for case-control as the type of study, with an OR of 1.99 and a CI between 1.05 and 3.80. Publication bias was very high. Conclusion: Our analysis suggests that the overall effect size for DUIC on UTEs is not statistically significant, but there are significant differences obtained through subgroup analysis. This result might be caused by methodological flaws (often encountered in articles on this topic), the indiscriminate employment of the term “cannabis use,” or an actual absence of an adverse effect. When a driver is found in traffic with a positive reaction suggesting cannabis use, the result should be corroborated by either objective data regarding marijuana usage (such as blood analyses, with clear cut-off values) or a clinical assessment of impairment, before establishing his/her fitness to drive.
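The random-effects pooling mentioned in the methods can be sketched with the classic DerSimonian-Laird estimator, which recovers study-level standard errors from the reported 95% confidence intervals and down-weights studies by between-study heterogeneity. The study-level odds ratios below are illustrative placeholders, not the paper's actual data set.

```python
import numpy as np

def dersimonian_laird(or_values, ci_low, ci_high):
    # Random-effects pooling of odds ratios (DerSimonian-Laird).
    y = np.log(or_values)                          # log-odds ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE from 95% CI
    w = 1.0 / se**2                                # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)             # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance
    w_re = 1.0 / (se**2 + tau2)                    # random-effects weights
    mu = np.sum(w_re * y) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(mu), np.exp(mu - 1.96 * se_mu), np.exp(mu + 1.96 * se_mu)

# Hypothetical study-level odds ratios and 95% CIs (illustrative only)
ors = np.array([1.97, 1.56, 1.99, 0.92, 1.10])
lo  = np.array([1.35, 1.16, 1.05, 0.70, 0.85])
hi  = np.array([2.87, 2.09, 3.80, 1.21, 1.42])
pooled, ci_l, ci_h = dersimonian_laird(ors, lo, hi)
```

If the pooled CI straddles 1.0, the overall association is not statistically significant, which mirrors the kind of subgroup-dependent conclusion the abstract reports.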

  12. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditional, human-factors-driven approaches tend to look first for opportunities for human error in a task analysis and then identify which of those errors are risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure that a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  13. Evaluation of Visual Field Progression in Glaucoma: Quasar Regression Program and Event Analysis.

    Science.gov (United States)

    Díaz-Alemán, Valentín T; González-Hernández, Marta; Perera-Sanz, Daniel; Armas-Domínguez, Karintia

    2016-01-01

    To determine the sensitivity, specificity, and agreement between the Quasar program, glaucoma progression analysis (GPA II) event analysis, and expert opinion in the detection of glaucomatous progression. The Quasar program is based on linear regression analysis of both mean defect (MD) and pattern standard deviation (PSD). Each series of visual fields was evaluated by three methods: Quasar, GPA II, and four experts. The sensitivity, specificity, and agreement (kappa) for each method were calculated, using expert opinion as the reference standard. The study included 439 SITA Standard visual fields of 56 eyes of 42 patients, with a mean of 7.8 ± 0.8 visual fields per eye. When suspected cases of progression were considered stable, the sensitivity and specificity of Quasar, GPA II, and the experts were 86.6% and 70.7%, 26.6% and 95.1%, and 86.6% and 92.6%, respectively. When suspected cases of progression were considered as progressing, the sensitivity and specificity of Quasar, GPA II, and the experts were 79.1% and 81.2%, 45.8% and 90.6%, and 85.4% and 90.6%, respectively. The agreement between Quasar and GPA II when suspected cases were considered stable or progressing was 0.03 and 0.28, respectively. The degree of agreement between Quasar and the experts when suspected cases were considered stable or progressing was 0.472 and 0.507. The degree of agreement between GPA II and the experts when suspected cases were considered stable or progressing was 0.262 and 0.342. The combination of MD and PSD regression analysis in the Quasar program showed better agreement with the experts and higher sensitivity than GPA II.
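The trend-based detection underlying Quasar-style analysis can be sketched as an ordinary least-squares regression of a global visual-field index over time, flagging progression when the slope is significantly negative. The series below is hypothetical, and the p-value uses a normal approximation to keep the example numpy-only (a real implementation would use the t distribution and the program's own criteria).

```python
import numpy as np
from math import erf

def slope_pvalue(t, y):
    # OLS slope of index y over time t, with a two-sided test on the slope
    # (normal approximation in place of the exact t distribution).
    t = np.asarray(t, float); y = np.asarray(y, float)
    n = len(t)
    tc = t - t.mean()
    sxx = np.sum(tc ** 2)
    slope = np.sum(tc * (y - y.mean())) / sxx
    resid = y - (y.mean() + slope * tc)
    se = np.sqrt(np.sum(resid ** 2) / (n - 2) / sxx)   # SE of the slope
    z = slope / se
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / np.sqrt(2.0))))
    return slope, p

# Hypothetical series: 8 yearly mean-defect (MD) values worsening ~0.5 dB/yr
years = np.arange(8)
md = -2.0 - 0.5 * years + np.array([0.1, -0.05, 0.08, -0.1,
                                    0.05, -0.02, 0.07, -0.03])
slope, p = slope_pvalue(years, md)
progressing = slope < 0 and p < 0.05
```

Combining two such regressions (on MD and on PSD), as the abstract describes, is one plausible way to trade off the sensitivity of trend analysis against the specificity of event analysis.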

  14. Multiple daytime nucleation events in semi-clean savannah and industrial environments in South Africa: analysis based on observations

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2013-06-01

    Recent studies have shown very high frequencies of atmospheric new particle formation in different environments in South Africa. Our aim here was to investigate the causes of two or three consecutive daytime nucleation events, followed by subsequent particle growth during the same day. We analysed 108 and 31 such days observed in a polluted industrial environment and a moderately polluted rural environment, respectively, in South Africa. The analysis was based on two years of measurements at each site. After rejecting the days having notable changes in the air mass origin or local wind direction, i.e. two major reasons for observed multiple nucleation events, we were able to investigate other factors causing this phenomenon. Clouds were present during, or in between, most of the analysed multiple particle formation events. Therefore, some of these events may have been single events, interrupted somehow by the presence of clouds. From further analysis, we propose that the first nucleation and growth event of the day was often associated with the mixing of a residual air layer rich in SO2 (oxidized to sulphuric acid) into the shallow surface-coupled layer. The second nucleation and growth event of the day usually started before midday and was sometimes associated with renewed SO2 emissions of industrial origin. However, it was also evident that vapours other than sulphuric acid were required for the particle growth during both events. This was especially the case when two simultaneously growing particle modes were observed. Based on our analysis, we conclude that the relative contributions of estimated H2SO4 and other vapours to the first and second nucleation and growth events of the day varied from day to day, depending on anthropogenic and natural emissions, as well as atmospheric conditions.

  15. SYSTEMS SAFETY ANALYSIS FOR FIRE EVENTS ASSOCIATED WITH THE ECRB CROSS DRIFT

    International Nuclear Information System (INIS)

    R. J. Garrett

    2001-01-01

    The purpose of this analysis is to systematically identify and evaluate fire hazards related to the Yucca Mountain Site Characterization Project (YMP) Enhanced Characterization of the Repository Block (ECRB) East-West Cross Drift (commonly referred to as the ECRB Cross-Drift). This analysis builds upon prior Exploratory Studies Facility (ESF) System Safety Analyses and incorporates Topopah Springs (TS) Main Drift fire scenarios and ECRB Cross-Drift fire scenarios. Accident scenarios involving the fires in the Main Drift and the ECRB Cross-Drift were previously evaluated in "Topopah Springs Main Drift System Safety Analysis" (CRWMS M and O 1995) and the "Yucca Mountain Site Characterization Project East-West Drift System Safety Analysis" (CRWMS M and O 1998). In addition to listing required mitigation/control features, this analysis identifies the potential need for procedures and training as part of defense-in-depth mitigation/control features. The inclusion of this information in the System Safety Analysis (SSA) is intended to assist the organization(s) (e.g., Construction, Environmental Safety and Health, Design) responsible for these aspects of the ECRB Cross-Drift in developing mitigation/control features for fire events, including Emergency Refuge Station(s). This SSA was prepared, in part, in response to Condition/Issue Identification and Reporting/Resolution System (CIRS) item 1966. The SSA is an integral part of the systems engineering process, whereby safety is considered during planning, design, testing, and construction. A largely qualitative approach is used which incorporates operating experiences and recommendations from vendors, the constructor and the operating contractor. The risk assessment in this analysis characterizes the scenarios associated with fires in terms of relative risk and includes recommendations for mitigating all identified hazards.
The priority for recommending and implementing mitigation control features is: (1) Incorporate

  16. Analysis of core damage frequency due to external events at the DOE [Department of Energy] N-Reactor

    International Nuclear Information System (INIS)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L.; Baxter, J.T.; Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P.; Brosseau, D.A.

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-5 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs

  18. No rationale for 1 variable per 10 events criterion for binary logistic regression analysis

    Directory of Open Access Journals (Sweden)

    Maarten van Smeden

    2016-11-01

    Background: Ten events per variable (EPV) is a widely advocated minimal criterion for sample size considerations in logistic regression analysis. Of three previous simulation studies that examined this minimal EPV criterion, only one supports the use of a minimum of 10 EPV. In this paper, we examine the reasons for the substantial differences between these extensive simulation studies. Methods: The current study uses Monte Carlo simulations to evaluate small-sample bias, coverage of confidence intervals, and mean square error of logit coefficients. Logistic regression models fitted by maximum likelihood and by a modified estimation procedure, known as Firth's correction, are compared. Results: The results show that, besides EPV, the problems associated with low EPV depend on other factors such as the total sample size. It is also demonstrated that simulation results can be dominated by even a few simulated data sets for which the prediction of the outcome by the covariates is perfect ('separation'). We reveal that different approaches to identifying and handling separation lead to substantially different simulation results. We further show that Firth's correction can be used to improve the accuracy of regression coefficients and alleviate the problems associated with separation. Conclusions: The current evidence supporting EPV rules for binary logistic regression is weak. Given our findings, there is an urgent need for new research to provide guidance on sample size considerations for binary logistic regression analysis.
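The separation problem and Firth's remedy can be illustrated in a few lines: on a completely separated data set, plain maximum-likelihood logistic regression drives the slope toward infinity, while Firth's Jeffreys-prior penalty (implemented below via the modified score with hat-matrix leverages) yields a finite estimate. This is a minimal sketch of the standard method, not the paper's own simulation code, and the toy data are invented for illustration.

```python
import numpy as np

def firth_logit(X, y, n_iter=50, tol=1e-8):
    # Firth's bias-reduced logistic regression (Jeffreys-prior penalty).
    # Modified score: U* = X^T (y - p + h * (0.5 - p)), h = leverages.
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        cov = np.linalg.inv(X.T @ (X * W[:, None]))     # (X'WX)^-1
        h = np.einsum('ij,jk,ik->i', X, cov, X) * W     # hat-matrix diagonal
        step = cov @ (X.T @ (y - p + h * (0.5 - p)))    # Newton step
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Completely separated toy data: all y=0 for x<0, all y=1 for x>0.
# Ordinary ML estimates diverge here; Firth's estimate stays finite.
x = np.array([-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0])
X = np.column_stack([np.ones_like(x), x])               # intercept + slope
y = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)
beta = firth_logit(X, y)
```

The same mechanism explains why simulation studies that silently discard (or fail to detect) separated data sets at low EPV can reach very different conclusions about the 10 EPV rule.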

  19. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and, to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.

  20. Analysis on ingress of coolant event in vacuum vessel using modified TRAC-BF1 code

    International Nuclear Information System (INIS)

    Ajima, Toshio; Kurihara, Ryoichi; Seki, Yasushi

    1999-08-01

    The Transient Reactor Analysis Code (TRAC-BF1) was modified on the basis of ICE experimental results so as to analyze the Ingress of Coolant Event (ICE) in the vacuum vessel of a nuclear fusion reactor. In a previous report, the TRAC-BF1 code, originally developed for the safety analysis of light water reactors, had been modified for the ICE of the fusion reactor: a flat structural plate model had been added to the VESSEL component, along with support for specifying an arbitrary gravity direction. The TRAC-BF1 code was then modified further. The flat structural plate model of the VESSEL component was extended to allow division into multiple layers of different materials, some of which can include a buried heater. Moreover, the TRAC-BF1 code was modified to analyze low-pressure conditions close to vacuum, within the range of the steam table. This paper describes the additional functions of the modified TRAC-BF1 code and its analytical evaluation using ICE experimental data and the ITER model with final design report (FDR) data. (author)