WorldWideScience

Sample records for key events analysis

  1. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Based on the statistical results from simulation of the CPN models, key event paths are identified by a sensitivity analysis approach. This approach focuses on the logical structure of CPN models, is reliable, and can serve as the basis of structured analysis for discrete event systems. An example of a radar model illustrates the application of this approach, and the results are trustworthy.
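As an illustrative aside, the core graph-search step behind event path analysis, enumerating cycle-free paths between two events in a directed event graph, can be sketched in plain Python; the toy event graph below is invented for illustration and is not the paper's radar PESG:

```python
from collections import defaultdict

def simple_paths(edges, start, end):
    """Enumerate all simple (cycle-free) paths from start to end by DFS."""
    graph = defaultdict(list)
    for u, v in edges:
        graph[u].append(v)
    paths, stack = [], [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == end:
            paths.append(path)
            continue
        for nxt in graph[node]:
            if nxt not in path:          # skip nodes already on this path
                stack.append((nxt, path + [nxt]))
    return paths

# Toy event graph with a cycle B <-> C between the start event A and end event D
edges = [("A", "B"), ("B", "D"), ("A", "C"), ("C", "D"), ("B", "C"), ("C", "B")]
print(sorted(simple_paths(edges, "A", "D")))
```

Cycle event paths would correspond to the B-C-B loop the search skips; the sequence paths are the four acyclic routes it returns.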

  2. Decision Trajectories in Dementia Care Networks: Decisions and Related Key Events.

    Science.gov (United States)

    Groen-van de Ven, Leontine; Smits, Carolien; Oldewarris, Karen; Span, Marijke; Jukema, Jan; Eefsting, Jan; Vernooij-Dassen, Myrra

    2017-10-01

This prospective multiperspective study provides insight into the decision trajectories of people with dementia by studying the decisions made and related key events. This study includes three waves of interviews, conducted between July 2010 and July 2012, with 113 purposefully selected respondents (people with beginning to advanced stages of dementia and their informal and professional caregivers) completed in 12 months (285 interviews). Our multilayered qualitative analysis consists of content analysis, timeline methods, and constant comparison. Four decision themes emerged: managing daily life, arranging support, community living, and preparing for the future. Eight key events delineate the decision trajectories of people with dementia. Decisions and key events differ between people with dementia living alone and those living with a caregiver. Our study clarifies that decisions relate not only to the disease itself but also to living with dementia. Individual differences in decision content and sequence may affect shared decision-making and advance care planning.

  3. Predicting Key Events in the Popularity Evolution of Online Information.

    Science.gov (United States)

    Hu, Ying; Hu, Changjun; Fu, Shushen; Fang, Mingzhe; Xu, Wenwen

    2017-01-01

The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task: predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, this new prediction task is very challenging for two reasons. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we predict them accurately yet promptly? In this paper we address these two issues. To handle the first, we use a simple moving average to smooth variation, and then present a universal method to identify the key events across different patterns of popularity evolution. To deal with the second, we extract different types of features that may have an impact on the key events, and then conduct a correlation analysis in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and to emphasize prediction promptness, we design a new evaluation metric that considers both accuracy and promptness. Experimental and comparative results show the superiority of our prediction solution.
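The smoothing-then-thresholding idea described in this abstract can be sketched as follows; the moving-average window, the 50% threshold, and the popularity series are all invented for illustration and are not taken from the paper:

```python
def moving_average(series, w=3):
    """Simple moving average; windows are shorter at the left edge."""
    out = []
    for i in range(len(series)):
        window = series[max(0, i - w + 1):i + 1]
        out.append(sum(window) / len(window))
    return out

def key_events(series, frac=0.5):
    """Locate burst, peak and fade as the first rise above, the maximum of,
    and the last fall below a fraction of the smoothed peak (illustrative
    thresholds, not the paper's universal method)."""
    s = moving_average(series)
    peak = max(range(len(s)), key=s.__getitem__)
    thresh = frac * s[peak]
    burst = next(i for i, v in enumerate(s) if v >= thresh)
    fade = max(i for i, v in enumerate(s) if v >= thresh)
    return burst, peak, fade

# Invented popularity counts per hour for one online item
pop = [1, 2, 8, 30, 60, 55, 40, 20, 9, 3, 1, 1]
print(key_events(pop))  # indices of burst, peak and fade
```

Smoothing first keeps a single noisy spike from being mistaken for the peak, which is the role the moving average plays in the paper's pipeline.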

  4. Predicting Key Events in the Popularity Evolution of Online Information.

    Directory of Open Access Journals (Sweden)

    Ying Hu

Full Text Available The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task: predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, this new prediction task is very challenging for two reasons. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we predict them accurately yet promptly? In this paper we address these two issues. To handle the first, we use a simple moving average to smooth variation, and then present a universal method to identify the key events across different patterns of popularity evolution. To deal with the second, we extract different types of features that may have an impact on the key events, and then conduct a correlation analysis in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and to emphasize prediction promptness, we design a new evaluation metric that considers both accuracy and promptness. Experimental and comparative results show the superiority of our prediction solution.

  5. Power quality event classification: an overview and key issues ...

    African Journals Online (AJOL)

... used for PQ events' classifications. Various artificial intelligence techniques used in PQ event classification are also discussed. Major key issues and challenges in classifying PQ events are critically examined and outlined. Keywords: Power quality, PQ event classifiers, artificial intelligence techniques, PQ noise, ...

  6. Negated bio-events: analysis and identification

    Science.gov (United States)

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The
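A crude cue-based baseline for negation detection, far simpler than the machine learning framework the abstract describes, might look like this; the cue list, window size, and example sentences are illustrative assumptions:

```python
NEGATION_CUES = {"not", "no", "never", "fail", "fails", "failed",
                 "unable", "absence", "lack", "without"}

def is_negated(sentence, trigger, window=4):
    """Flag an event trigger as negated if a cue appears within a few
    tokens before it (a toy baseline, not the paper's learned model)."""
    tokens = sentence.lower().replace(",", " ").split()
    if trigger not in tokens:
        return False
    idx = tokens.index(trigger)
    return any(t in NEGATION_CUES for t in tokens[max(0, idx - window):idx])

print(is_negated("IL-2 does not induce expression of STAT5", "induce"))   # True
print(is_negated("IL-2 induces expression of STAT5", "induces"))          # False
```

Cue selection and the scope window are exactly the aspects the abstract says benefit from feature engineering and learned classifiers rather than fixed rules like these.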

  7. Genetic stratigraphy of key demographic events in Arabia.

    Science.gov (United States)

    Fernandes, Verónica; Triska, Petr; Pereira, Joana B; Alshamali, Farida; Rito, Teresa; Machado, Alison; Fajkošová, Zuzana; Cavadas, Bruno; Černý, Viktor; Soares, Pedro; Richards, Martin B; Pereira, Luísa

    2015-01-01

At the crossroads between Africa and Eurasia, Arabia is necessarily a melting pot, its peoples enriched by successive gene flow over the generations. Estimating the timing and impact of these multiple migrations is an important step in reconstructing the key demographic events in human history. However, current methods based on genome-wide information identify admixture events inefficiently, tending to estimate only the more recent ages, as here in the case of admixture events across the Red Sea (~8-37 generations for African input into Arabia, and 30-90 generations for "back-to-Africa" migrations). An mtDNA-based founder analysis, corroborated by detailed analysis of the whole-mtDNA genome, affords an alternative means by which to identify, date and quantify multiple migration events at greater time depths, across the full range of modern human history, albeit for the maternal line of descent only. In Arabia, this approach enables us to infer several major pulses of dispersal between the Near East and Arabia, most likely via the Gulf corridor. Although some relict lineages survive in Arabia from the time of the out-of-Africa dispersal, 60 ka, the major episodes in the peopling of the Peninsula took place from north to south in the Late Glacial and, to a lesser extent, the immediate post-glacial/Neolithic. Exchanges across the Red Sea were mainly due to the Arab slave trade and maritime dominance (from ~2.5 ka to very recent times), but had already begun by the early Holocene, fuelled by the establishment of maritime networks since ~8 ka. The main "back-to-Africa" migrations, again undetected by genome-wide dating analyses, occurred in the Late Glacial period for introductions into eastern Africa, whilst the Neolithic was more significant for migrations towards North Africa.

  8. Key events in the history of sustainable development

    OpenAIRE

    Sustainable Development Commission

    2005-01-01

This document is a table summarizing the key events in the history of sustainable development, adapted from the International Institute for Sustainable Development's sustainable development timeline. Publisher PDF

  9. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    Science.gov (United States)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to
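The fault tree step mentioned above rests on standard gate arithmetic: for independent basic events, an AND gate multiplies probabilities, while an OR gate combines complements. A minimal sketch, with a toy tree and invented probabilities unrelated to the actual LFTR analysis:

```python
def ft_prob(node, probs):
    """Evaluate the top-event probability of a fault tree, assuming
    independent basic events. A node is either a basic-event name or a
    ('AND' | 'OR', children) tuple."""
    if isinstance(node, str):
        return probs[node]
    gate, children = node
    ps = [ft_prob(c, probs) for c in children]
    if gate == "AND":
        out = 1.0
        for p in ps:
            out *= p
        return out
    # OR gate: 1 minus the product of complements
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Toy tree: top event occurs if cooling fails AND (pump fails OR valve fails)
tree = ("AND", ["cooling", ("OR", ["pump", "valve"])])
probs = {"cooling": 0.01, "pump": 0.1, "valve": 0.05}
print(ft_prob(tree, probs))  # 0.01 * (1 - 0.9*0.95) = 0.00145
```

As the thesis notes, results for the LFTR stay qualitative because such basic-event probabilities are exactly what early-stage designs lack.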

  10. Advanced event reweighting using multivariate analysis

    International Nuclear Information System (INIS)

    Martschei, D; Feindt, M; Honc, S; Wagner-Kuhr, J

    2012-01-01

Multivariate analysis (MVA) methods, especially discrimination techniques such as neural networks, are key ingredients in modern data analysis and play an important role in high energy physics. They are usually trained on simulated Monte Carlo (MC) samples to discriminate so-called 'signal' from 'background' events and are then applied to data to select real events of signal type. Here we address procedures that improve this workflow: first, enhancing data/MC agreement by reweighting MC samples on a per-event basis; then, training MVAs on real data using the sPlot technique. Finally we address the construction of MVAs whose discriminator is independent of a certain control variable, i.e., cuts on this variable will not change the discriminator shape.
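A minimal sketch of per-event reweighting by a data/MC histogram ratio, the first procedure the abstract mentions; the binning and toy samples are invented for illustration:

```python
def ratio_weights(mc, data, edges):
    """Per-event weights that make the MC histogram of one variable match
    the data: w = (data density) / (MC density) in the event's bin."""
    def hist(sample):
        counts = [0] * (len(edges) - 1)
        for x in sample:
            for b in range(len(counts)):
                if edges[b] <= x < edges[b + 1]:
                    counts[b] += 1
                    break
        return counts
    h_mc, h_data = hist(mc), hist(data)
    n_mc, n_data = len(mc), len(data)
    w_bin = [(h_data[b] / n_data) / (h_mc[b] / n_mc) if h_mc[b] else 1.0
             for b in range(len(h_mc))]
    # assign each MC event the weight of its bin
    weights = []
    for x in mc:
        for b in range(len(w_bin)):
            if edges[b] <= x < edges[b + 1]:
                weights.append(w_bin[b])
                break
    return weights

mc = [0.1, 0.2, 0.6, 0.7, 0.8, 0.9]     # MC overpopulates the high bin
data = [0.1, 0.3, 0.4, 0.2, 0.8, 0.6]   # data favour the low bin
w = ratio_weights(mc, data, edges=[0.0, 0.5, 1.0])
print(w)
```

In practice such weights are derived in several variables at once (the per-event aspect), but the one-dimensional ratio already conveys the mechanism.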

  11. Finite key analysis in quantum cryptography

    International Nuclear Information System (INIS)

    Meyer, T.

    2007-01-01

    finite number of input signals, without making any approximations. As an application, we investigate the so-called ''Tomographic Protocol'', which is based on the Six-State Protocol and where Alice and Bob can obtain the additional information which quantum state they share after the distribution step of the protocol. We calculate the obtainable secret key rate under the assumption that the eavesdropper only conducts collective attacks and give a detailed analysis of the dependence of the key rate on various parameters: The number of input signals (the block size), the error rate in the sifted key (the QBER), and the security parameter. Furthermore, we study the influence of multi-photon events which naturally occur in a realistic implementation (orig.)
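For orientation, the asymptotic (infinite block size) BB84 secret-key rate r = 1 - 2*h2(Q) gives a feel for how the key rate depends on the QBER; this is a much simpler bound than the finite-key, six-state tomographic analysis the thesis develops:

```python
from math import log2

def h2(p):
    """Binary Shannon entropy."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bb84_rate(qber):
    """Asymptotic BB84 secret-key rate r = 1 - 2*h2(Q), the standard
    Shor-Preskill bound; the finite-key rate above is strictly smaller
    for any finite number of input signals."""
    return max(0.0, 1 - 2 * h2(qber))

for q in (0.0, 0.05, 0.11):
    print(f"QBER {q:.2f}: rate {bb84_rate(q):.3f}")
```

The rate vanishing near an 11% QBER is the familiar one-way BB84 threshold; finite block sizes and security parameters push the usable threshold lower, which is the thesis's subject.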

  12. Finite key analysis in quantum cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, T.

    2007-10-31

    the obtainable key rate for any finite number of input signals, without making any approximations. As an application, we investigate the so-called ''Tomographic Protocol'', which is based on the Six-State Protocol and where Alice and Bob can obtain the additional information which quantum state they share after the distribution step of the protocol. We calculate the obtainable secret key rate under the assumption that the eavesdropper only conducts collective attacks and give a detailed analysis of the dependence of the key rate on various parameters: The number of input signals (the block size), the error rate in the sifted key (the QBER), and the security parameter. Furthermore, we study the influence of multi-photon events which naturally occur in a realistic implementation (orig.)

  13. Key events and their effects on cycling behaviour in Dar-es-Salaam : abstract + powerpoint

    NARCIS (Netherlands)

    Nkurunziza, A.; Zuidgeest, M.H.P.; Brussel, M.J.G.; van Maarseveen, M.F.A.M.

    2012-01-01

    The paper explores key events and investigates their effects on cycling behaviour in the city of Dar-es-Salaam, Tanzania. The objective of the study is to identify specific key events during a person’s life course with a significant effect on change of travel behaviour towards cycling in relation to

  14. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

Initiating event analysis and evaluation is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of nuclear safety analysis. Current initiating event analysis methods and experience focus on water reactors; there are as yet no methods or theories for the thorium-based molten salt reactor (TMSR). With TMSR research and development underway in China, initiating event analysis and evaluation is increasingly important. The research can build on PWR analysis theories and methods. Based on the TMSR design, the theories and methods of its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of Generation II and III PWRs, the high-temperature gas-cooled reactor, and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are discussed and developed by logical analysis. The analysis of TMSR initiating events is preliminarily studied and described. The research is important for clarifying the event analysis rules, and useful for TMSR design and nuclear safety analysis. (authors)

  15. Key events and their effects on cycling behaviour in Dar-es-Salaam : abstract + powerpoint

    OpenAIRE

    Nkurunziza, A.; Zuidgeest, M.H.P.; Brussel, M.J.G.; van Maarseveen, M.F.A.M.

    2012-01-01

    The paper explores key events and investigates their effects on cycling behaviour in the city of Dar-es-Salaam, Tanzania. The objective of the study is to identify specific key events during a person’s life course with a significant effect on change of travel behaviour towards cycling in relation to stage of change. Stage of change is a key construct of the transtheoretical model of behaviour change that defines behavioural readiness (intentions and actions) into six distinct categories (i.e....

  16. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the probabilities of accidents and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are remarkably meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering an icing region
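The Monte Carlo idea, sampling the fluctuating state variables repeatedly and recording when they first cross an accident threshold, can be sketched with a toy random-walk SSV; the drift, threshold, and accident criterion are invented for illustration:

```python
import random

def risk_evolution(threshold=10.0, steps=50, runs=20000, seed=1):
    """Monte Carlo estimate of accident probability over time: a single
    system state variable drifts as a random walk, and an 'accident'
    occurs when it first exceeds the threshold (a toy stand-in for the
    ESD-based model)."""
    random.seed(seed)
    first_crossing = [0] * steps
    for _ in range(runs):
        x = 0.0
        for t in range(steps):
            x += random.gauss(0.3, 1.0)   # drift plus fluctuation of the SSV
            if x > threshold:
                first_crossing[t] += 1
                break
    # cumulative accident probability at each time step
    cum, out = 0, []
    for c in first_crossing:
        cum += c
        out.append(cum / runs)
    return out

risk = risk_evolution()
print(f"P(accident by t=25): {risk[25]:.3f}, by t=49: {risk[49]:.3f}")
```

A risk critical event in this picture would be a time step where the cumulative curve steepens sharply; sensitivity of that curve to the drift parameter is the analogue of an RSV.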

  17. Preparedness of newly qualified midwives to deliver clinical care: an evaluation of pre-registration midwifery education through an analysis of key events.

    Science.gov (United States)

    Skirton, Heather; Stephen, Nicole; Doris, Faye; Cooper, Maggie; Avis, Mark; Fraser, Diane M

    2012-10-01

This study was part of a larger project commissioned to ascertain whether midwife teachers bring a unique contribution to the preparation of midwives for practice. The aim of this phase was to determine whether the student midwives' educational programme had equipped them to practise competently after entry to the professional register. This was a prospective, longitudinal qualitative study, using participant diaries to collect data. Data were collected from newly qualified midwives during the initial six months after they commenced their first post as a qualified midwife. The potential participants were all student midwives who were completing their education at one of six universities (three in England, one in Scotland, one in Wales and one in Northern Ireland). Diary data were submitted by 35 newly qualified midwives; 28 were graduates of the three year programme and seven of the shortened programme. Diary entries were analysed using thematic analysis (Braun and Clarke, 2006), with a focus on identification of key events in the working lives of the newly qualified midwives. A total of 263 key events were identified, under three main themes: (1) impact of the event on confidence, (2) gaps in knowledge or experience and (3) articulated frustration, conflict or distress. Essentially, pre-registration education, delivered largely by midwife teachers and supported by clinical mentors, has been shown to equip newly qualified midwives to work effectively as autonomous practitioners caring for mothers and babies. While newly qualified midwives are able to cope with a range of challenging clinical situations in a safe manner, they lack confidence in key areas. Positive reinforcement by supportive colleagues plays a significant role in enabling them to develop as practitioners. Whilst acknowledging the importance of normality in childbearing, there is a need within the curriculum to enable midwives to recognise and respond to complex care situations by providing theory

  18. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
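A nonhomogeneous Poisson process with an exponentially increasing rate, as fitted in this abstract, can be simulated by the standard thinning (acceptance-rejection) method; the parameters below are illustrative, not the paper's fitted values:

```python
import random
from math import exp

def nhpp_thinning(lam0, b, t_max, seed=7):
    """Simulate a nonhomogeneous Poisson process with exponentially
    increasing rate lambda(t) = lam0 * exp(b*t) by Lewis thinning."""
    random.seed(seed)
    lam_max = lam0 * exp(b * t_max)            # rate bound on [0, t_max]
    t, events = 0.0, []
    while True:
        t += random.expovariate(lam_max)       # candidate from the bounding homogeneous process
        if t > t_max:
            return events
        if random.random() < lam0 * exp(b * t) / lam_max:
            events.append(t)                   # accept with probability lambda(t)/lam_max

# Illustrative parameters: rate roughly doubles every 7 h over a 48 h observation
events = nhpp_thinning(lam0=0.05, b=0.1, t_max=48.0)
gaps = [y - x for x, y in zip(events, events[1:])]
print(len(events), "simulated mitotic events; interevent gaps shrink as the rate rises")
```

Comparing observed interevent-time statistics against such simulated series is one way to test whether a measured event train is consistent with a given rate model.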

  19. Mining the key predictors for event outbreaks in social networks

    Science.gov (United States)

    Yi, Chengqi; Bao, Yuanyuan; Xue, Yibo

    2016-04-01

It would be beneficial to devise a method to predict a so-called event outbreak. Existing works mainly focus on exploring effective methods for improving the accuracy of predictions, while ignoring the underlying causes: What makes an event go viral? What factors significantly influence the prediction of an event outbreak in social networks? In this paper, we propose a novel definition of an event outbreak, taking into account the structural changes to a network during the propagation of content. In addition, we investigate features that are sensitive for predicting an event outbreak. In order to investigate the universality of these features at different stages of an event, we split the entire lifecycle of an event into 20 equal segments according to the proportion of the propagation time. We extract 44 features, including features related to content, users, structure, and time, from each segment of the event. Based on these features, we propose a prediction method using supervised classification algorithms to predict event outbreaks. Experimental results indicate that, as time goes by, our method is highly accurate, with a precision rate ranging from 79% to 97% and a recall rate ranging from 74% to 97%. In addition, after applying a feature-selection algorithm, the top five selected features can considerably improve the accuracy of the prediction. Data-driven experimental results show that the entropy of the eigenvector centrality, the entropy of the PageRank, the standard deviation of the betweenness centrality, the proportion of re-shares without content, and the average path length are the key predictors of an event outbreak. Our findings are especially useful for further exploring the intrinsic characteristics of outbreak prediction.
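One of the reported key predictors, the entropy of the PageRank distribution, can be computed with a short sketch; the graphs and damping factor are illustrative, and this is not the authors' pipeline:

```python
from math import log2

def pagerank(graph, d=0.85, iters=50):
    """Plain power-iteration PageRank over an adjacency dict."""
    nodes = list(graph)
    n = len(nodes)
    pr = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}
        for v in nodes:
            out = graph[v]
            if out:
                share = pr[v] / len(out)
                for u in out:
                    new[u] += d * share
            else:                       # dangling node: spread rank evenly
                for u in nodes:
                    new[u] += d * pr[v] / n
        pr = new
    return pr

def entropy(dist):
    """Shannon entropy of a normalised score distribution: low entropy means
    propagation concentrated on a few hubs, high entropy means spread out."""
    total = sum(dist.values())
    ps = [v / total for v in dist.values() if v > 0]
    return -sum(p * log2(p) for p in ps)

star = {"hub": ["a", "b", "c"], "a": ["hub"], "b": ["hub"], "c": ["hub"]}
ring = {"a": ["b"], "b": ["c"], "c": ["d"], "d": ["a"]}
print(entropy(pagerank(star)), "<", entropy(pagerank(ring)))
```

The hub-and-spoke graph concentrates PageRank and so has lower entropy than the symmetric ring; tracking that entropy per lifecycle segment is the kind of structural feature the paper feeds to its classifier.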

  20. Human performance analysis of industrial radiography radiation exposure events

    International Nuclear Information System (INIS)

    Reece, W.J.; Hill, S.G.

    1995-01-01

A set of radiation overexposure event reports was reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven-year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  1. Artist concept illustrating key events on day by day basis during Apollo 9

    Science.gov (United States)

    1969-01-01

Artist concept illustrating key events on a day-by-day basis during the Apollo 9 mission. The first photograph illustrates activities on the first day of the mission, including flight crew preparation, orbital insertion, the 103-nautical-mile orbit, separations, docking, and the docked Service Propulsion System burn (19792); Second day events include landmark tracking, pitch maneuver, yaw-roll maneuver, and high apogee orbits (19793); Third day events include crew transfer and Lunar Module system evaluation (19794); Fourth day events include use of camera, day-night extravehicular activity, use of golden slippers, and television over Texas and Louisiana (19795); Fifth day events include vehicles undocked, Lunar Module burns for rendezvous, maximum separation, ascent propulsion system burn, formation flying and docking, and Lunar Module jettison ascent burn (19796); Sixth thru ninth day events include service propulsion system burns and landmark sightings, photograph special tests (19797); Tenth day events i

  2. Application of Key Events Dose Response Framework to defining the upper intake level of leucine in young men.

    Science.gov (United States)

    Pencharz, Paul B; Russell, Robert M

    2012-12-01

Leucine is sold in large doses in health food stores and is ingested by weight-training athletes. The safety of ingestion of large doses of leucine is unknown. Before designing chronic high-dose leucine supplementation experiments, we decided to determine the effect of graded doses of leucine in healthy participants. The Key Events Dose Response Framework is an organizational and analytical framework that dissects the various biologic steps (key events) that occur between exposure to a substance and an eventual adverse effect. Each biologic event is examined for its unique dose-response characteristics. For nutrients, a number of biologic homeostatic mechanisms work to keep circulating/tissue levels in a safe, nontoxic range. If a response mechanism at a particular key event is especially vulnerable and easily overwhelmed, this is known as a determining event, because this event drives the overall slope or shape of the dose-response relationship. In this paper, the Key Events Dose Response Framework has been applied to the problem of leucine toxicity and leucine's tolerable upper level. After analyzing the experimental data vis-à-vis key events for leucine leading to toxicity, it became evident that the rate of leucine oxidation was the determining event. A dose-response study has been conducted with graded intakes of leucine in healthy human adult male volunteers. All participants were started at the mean requirement level of leucine [50 mg/(kg · d)] and the highest leucine intake was 1250 mg/(kg · d), which is 25 times the mean requirement. No gut intolerance was seen. Blood glucose fell progressively but remained within normal values without any changes in plasma insulin. Maximal leucine oxidation occurred at an intake of 550 mg leucine/(kg · d), after which plasma leucine progressively increased, and plasma ammonia also increased in response to leucine intakes >500 mg/(kg · d). Thus, the "key determining event" appears to be when the

  3. Management of investment-construction projects basing on the matrix of key events

    Directory of Open Access Journals (Sweden)

    Morozenko Andrey Aleksandrovich

    2016-11-01

Full Text Available The article considers current problems in the management of investment-construction projects and examines how to increase the efficiency of construction operations by forming a reflex-adaptive organizational structure. The authors analyze the need to form a matrix of key events in the investment-construction project (ICP), which will create the optimal structure of the project based on the work program for its implementation. For convenience of representing the project implementation program in time, the authors recommend consolidating the works into separate, economically independent functional blocks. An algorithm for forming the matrix of an investment-construction project is proposed, considering the economic independence of the functional blocks and the stages of ICP implementation. The use of an extended network model is justified, supplemented by organizational and structural constraints at different stages of the project, highlighting key events that fundamentally influence the further course of ICP implementation.

  4. ELIMINATION OF THE DISADVANTAGES OF SCHEDULING-NETWORK PLANNING BY APPLYING THE MATRIX OF KEY PROJECT EVENTS

    Directory of Open Access Journals (Sweden)

    Morozenko Andrey Aleksandrovich

    2017-07-01

    Full Text Available The article discusses the current disadvantages of scheduling-network planning in managing the terms of an investment-construction project. Problems associated with constructing the schedule and defining the duration of the construction project are studied. The problems this poses for the management apparatus are shown: there are no mechanisms for prompt response to deviations in the parameters of the scheduling-network diagram. A new approach to planning the implementation of an investment-construction project is proposed, based on a matrix of key events and a rejection of the current practice of determining the duration from unreliable regulatory data. An algorithm for determining the key events of the project is presented. To increase the reliability of the organizational structure, a load factor of the functional block in the process of achieving the key event is proposed. Recommendations for improving the interaction of the participants in the investment-construction project are given.

  5. Safety Analysis for Key Design Features of KALIMER-600 Design Concept

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Bum; Kwon, Y. M.; Kim, E. K.; Suk, S. D.; Chang, W. P.; Jeong, H. Y.; Ha, K. S

    2007-02-15

    This report contains the safety analyses of the KALIMER-600 conceptual design, which KAERI has been developing under the Long-term Nuclear R and D Program. The analyses reflect the design developments during the second year of the 4th design phase of the program. The specific presentations are the key design features with the safety principles for achieving the safety objectives, the event categorization and safety criteria, and results of the safety analyses for the DBAs and ATWS events, the containment performance, and the channel blockages. The safety analyses for both the DBAs and ATWS events have been performed using SSC-K version 1.3, and the results have shown fulfillment of the safety criteria for DBAs under conservative assumptions. The safety margins as well as the inherent safety have also been confirmed for the ATWS events. For the containment performance analysis, ORIGEN-2.1 and CONTAIN-LMR have been used; the structural integrity was found acceptable and the evaluated exposure dose rate complied with 10 CFR 100 and PAG limits. The analysis results for flow blockages of 6 subchannels, 24 subchannels, and 54 subchannels with the MATRA-LMR-FB code have assured the integrity of the subassemblies.

  6. MGR External Events Hazards Analysis

    International Nuclear Information System (INIS)

    Booth, L.

    1999-01-01

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences, as determined during Design Basis Event (DBE) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  7. Probabilistic analysis of external events with focus on the Fukushima event

    International Nuclear Information System (INIS)

    Kollasko, Heiko; Jockenhoevel-Barttfeld, Mariana; Klapp, Ulrich

    2014-01-01

    External hazards are those natural or man-made hazards to a site and its facilities that originate externally to both the site and its processes, i.e. the duty holder may have very little or no control over the hazard. External hazards can have the potential of causing initiating events at the plant, typically transients such as loss of offsite power. Simultaneously, external events may affect the safety systems required to control the initiating event and, where applicable, the back-up systems implemented for risk reduction. Plant safety may especially be threatened when loads from external hazards exceed the load assumptions considered in the design of safety-related systems, structures and components. Another potential threat is posed by hazards inducing initiating events not otherwise considered in the safety demonstration. An example is loss of offsite power combined with prolonged plant isolation; offsite support (e.g., delivery of diesel fuel oil), usually credited in the deterministic safety analysis, may not be possible in this case. As the Fukushima events have shown, the biggest threat is likely posed by hazards inducing both effects. Such hazards may well be dominant risk contributors even if their return period is very high. In order to identify relevant external hazards for a certain Nuclear Power Plant (NPP) location, a site-specific screening analysis is performed, both for single events and for combinations of external events. As a result of the screening analysis, risk-significant and therefore relevant (screened-in) single external events and combinations of them are identified for a site. The screened-in events are further considered in a detailed event tree analysis in the frame of the Probabilistic Safety Analysis (PSA) to calculate the core damage/large release frequency resulting from each relevant external event or combination. Screening analyses of external events performed at AREVA are based on the approach provided

  8. Safety analysis for key design features of KALIMER-600 design concept

    International Nuclear Information System (INIS)

    Lee, Yong-Bum; Kwon, Y. M.; Kim, E. K.; Suk, S. D.; Chang, W. P.; Joeng, H. Y.; Ha, K. S.; Heo, S.

    2005-03-01

    KAERI is developing the conceptual design of a Liquid Metal Reactor, KALIMER-600 (Korea Advanced LIquid MEtal Reactor), under the Long-term Nuclear R and D Program. KALIMER-600 addresses key issues regarding future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, key safety design features are described and safety analysis results for typical ATWS accidents, containment design basis accidents, and flow blockages in the KALIMER design are presented. First, the basic approach to achieving the safety goal and the main design features of KALIMER-600 are introduced in Chapter 1, and the event categorization and acceptance criteria for the KALIMER-600 safety analysis are described in Chapter 2. In Chapter 3, results of inherent safety evaluations for the KALIMER-600 conceptual design are presented. The KALIMER-600 core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed using the SSC-K code to investigate the KALIMER-600 system response to the events. The objectives of Chapter 4 are to assess the response of the KALIMER-600 containment to the design basis accidents and to evaluate whether the consequences are acceptable in terms of structural integrity and exposure dose rate. In Chapter 5, the analysis of flow blockage for KALIMER-600 with the MATRA-LMR-FB code, which has been developed for internal flow blockage in an LMR subassembly, is described. Cases with blockages of 6 subchannels, 24 subchannels, and 54 subchannels are analyzed.

  9. Using variable transformations to perform common event analysis

    International Nuclear Information System (INIS)

    Worrell, R.B.

    1977-01-01

    Any analytical method for studying the effect of common events on the behavior of a system is considered a form of common event analysis. The particular common events involved often represent quite different phenomena, and this has led to the development of different kinds of common event analysis: common mode failure analysis, common cause analysis, critical location analysis, etc., each involve common events that represent different phenomena. However, the problem that must be solved for each of these kinds of analysis is essentially the same: determine the effect of common events on the behavior of a system. Thus, a technique that is useful in achieving one kind of common event analysis is often useful in achieving the others.
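
The shared problem the abstract identifies, namely determining the effect of a common event on system behavior, can be illustrated with a minimal cut-set sketch (the system, event names, and common-event mapping below are hypothetical, not from the paper):

```python
# Minimal cut sets of a hypothetical two-train system: the system fails
# if both pumps fail, or if both valves fail.
cut_sets = [frozenset({"pump_A", "pump_B"}), frozenset({"valve_A", "valve_B"})]

# A hypothetical common event (one shared power bus) whose occurrence
# causes several basic events at once.
common_cause = {"bus_1": {"pump_A", "pump_B"}}

def system_failed(occurred, cut_sets):
    """True if the set of occurred basic events completes any cut set."""
    return any(cs <= occurred for cs in cut_sets)

# A single independent failure does not fail the system ...
print(system_failed({"pump_A"}, cut_sets))             # False
# ... but the single common event completes a whole cut set by itself.
print(system_failed(common_cause["bus_1"], cut_sets))  # True
```

Whatever phenomenon the common event represents (common mode, common cause, critical location), the computation is the same substitution, which is the point the abstract makes about one technique serving several kinds of analysis.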

  10. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and use of the PSAEA guidelines (PSA-based Event Analysis). In 2000, risk-based precursor analysis increasingly became a part of the AVN process of feedback of operating experience, and constitutes in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions allowed AVN to identify, and in some cases also to quantify, several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events) and resulted in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding, for instance, event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  11. The Unfolding of LGBT Lives: Key Events Associated With Health and Well-being in Later Life

    Science.gov (United States)

    Fredriksen-Goldsen, Karen I.; Bryan, Amanda E. B.; Jen, Sarah; Goldsen, Jayn; Kim, Hyun-Jun; Muraco, Anna

    2017-01-01

    Purpose of the Study: Life events are associated with the health and well-being of older adults. Using the Health Equity Promotion Model, this article explores historical and environmental context as it frames life experiences and adaptation of lesbian, gay, bisexual, and transgender (LGBT) older adults. Design and Methods: This was the largest study to date of LGBT older adults to identify life events related to identity development, work, and kin relationships and their associations with health and quality of life (QOL). Using latent profile analysis (LPA), clusters of life events were identified and associations between life event clusters were tested. Results: On average, LGBT older adults first disclosed their identities in their 20s; many experienced job-related discrimination. More had been in opposite-sex marriage than in same-sex marriage. Four clusters emerged: “Retired Survivors” were the oldest and one of the most prevalent groups; “Midlife Bloomers” first disclosed their LGBT identities in mid-40s, on average; “Beleaguered At-Risk” had high rates of job-related discrimination and few social resources; and “Visibly Resourced” had a high degree of identity visibility and were socially and economically advantaged. Clusters differed significantly in mental and physical health and QOL, with the Visibly Resourced faring best and Beleaguered At-Risk faring worst on most indicators; Retired Survivors and Midlife Bloomers showed similar health and QOL. Implications: Historical and environmental contexts frame normative and non-normative life events. Future research will benefit from the use of longitudinal data and an assessment of timing and sequencing of key life events in the lives of LGBT older adults. PMID:28087792

  12. The Unfolding of LGBT Lives: Key Events Associated With Health and Well-being in Later Life.

    Science.gov (United States)

    Fredriksen-Goldsen, Karen I; Bryan, Amanda E B; Jen, Sarah; Goldsen, Jayn; Kim, Hyun-Jun; Muraco, Anna

    2017-02-01

    Life events are associated with the health and well-being of older adults. Using the Health Equity Promotion Model, this article explores historical and environmental context as it frames life experiences and adaptation of lesbian, gay, bisexual, and transgender (LGBT) older adults. This was the largest study to date of LGBT older adults to identify life events related to identity development, work, and kin relationships and their associations with health and quality of life (QOL). Using latent profile analysis (LPA), clusters of life events were identified and associations between life event clusters were tested. On average, LGBT older adults first disclosed their identities in their 20s; many experienced job-related discrimination. More had been in opposite-sex marriage than in same-sex marriage. Four clusters emerged: "Retired Survivors" were the oldest and one of the most prevalent groups; "Midlife Bloomers" first disclosed their LGBT identities in mid-40s, on average; "Beleaguered At-Risk" had high rates of job-related discrimination and few social resources; and "Visibly Resourced" had a high degree of identity visibility and were socially and economically advantaged. Clusters differed significantly in mental and physical health and QOL, with the Visibly Resourced faring best and Beleaguered At-Risk faring worst on most indicators; Retired Survivors and Midlife Bloomers showed similar health and QOL. Historical and environmental contexts frame normative and non-normative life events. Future research will benefit from the use of longitudinal data and an assessment of timing and sequencing of key life events in the lives of LGBT older adults. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  13. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
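
As a toy illustration of the rejection-sampling view of Bayesian updating that BUS builds on (the Normal prior, the measurement, and all numbers below are invented for this sketch, not taken from the paper):

```python
import math
import random

random.seed(0)

# Hypothetical prior: a capacity C ~ Normal(10, 2); the rare event is C < 4.
# One noisy measurement c_obs = 8 (sigma = 1) updates the prior via the
# classical rejection-sampling view of Bayesian analysis that BUS reinterprets.
def likelihood(c, c_obs=8.0, sigma=1.0):
    return math.exp(-0.5 * ((c - c_obs) / sigma) ** 2)  # unnormalised, max = 1

n = 200_000
posterior = []
prior_hits = 0
for _ in range(n):
    c = random.gauss(10.0, 2.0)          # draw from the prior
    prior_hits += c < 4.0
    if random.random() < likelihood(c):  # accept with probability L / L_max
        posterior.append(c)

p_prior = prior_hits / n
p_post = sum(c < 4.0 for c in posterior) / len(posterior)
print(f"prior P(C<4) ~ {p_prior:.5f}, posterior P(C<4) ~ {p_post:.6f}")
```

Plain rejection sampling like this wastes most samples exactly when the event of interest is rare, which is why the paper couples the same reinterpretation with FORM, tailored importance sampling, and Subset Simulation.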

  14. Human reliability analysis using event trees

    International Nuclear Information System (INIS)

    Heslinga, G.

    1983-01-01

    The shut-down procedure of a technologically complex installation such as a nuclear power plant consists of many human actions, some of which have to be performed several times. The procedure is regarded as a chain of modules of specific actions, some of which are analyzed separately. The analysis is carried out by making a Human Reliability Analysis event tree (HRA event tree) of each action, breaking down each action into small elementary steps. The application of event trees in human reliability analysis involves more difficulties than in the case of technical systems, where event trees have mainly been used until now. The most important reason is that the operator is able to recover from a wrong performance; memory influences play a significant role. In this study these difficulties are dealt with theoretically. The following conclusions can be drawn: (1) in principle, event trees may be used in human reliability analysis; (2) although in practice the operator will only partly recover from a fault, theoretically this can be described as starting the whole event tree again; (3) compact formulas have been derived by which the probability of reaching a specific failure consequence on passing through the HRA event tree after several recoveries can be calculated. (orig.)
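
The compact formulas themselves are not reproduced in the abstract, but the "restart the whole tree" interpretation leads to a geometric series; a minimal sketch with hypothetical per-pass probabilities:

```python
def p_failure(f, r, max_recoveries):
    """Probability of ending in the failure consequence when one pass through
    the HRA event tree ends in unrecoverable failure with probability f, in a
    recoverable error (restart the whole tree) with probability r, and in
    success otherwise. Closed-form geometric sum over the allowed restarts."""
    assert 0 <= f and 0 <= r < 1 and f + r <= 1
    return f * (1 - r ** (max_recoveries + 1)) / (1 - r)

# Hypothetical single-pass values: unrecoverable failure f = 0.01,
# recoverable error r = 0.2.
print(p_failure(0.01, 0.2, 0))     # no recovery allowed: just f = 0.01
print(p_failure(0.01, 0.2, 2))     # up to two restarts: 0.01 * (1 + 0.2 + 0.04)
print(p_failure(0.01, 0.2, 1000))  # effectively unlimited: approaches f / (1 - r)
```

With unlimited recoveries the sum converges to f/(1 - r), so repeated restarts raise the failure probability only modestly when the recoverable-error probability r is small.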

  15. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (www) has become a prominent and huge source of user-generated content and opinionative data. Among various social media, Twitter has gained popularity as it offers a fast and effective way of sharing users’ perspectives on critical and other issues in different domains, such as the ‘Political’, ‘Entertainment’ and ‘Business’ domains. As this data is generated at huge scale in the cloud, it has opened doors for researchers in the field of data science and analysis. Twitter provides several APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collection of user profiles, friends and followers; 3) the Streaming API, which collects details like tweets, hashtags and geo locations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically happening event. For this we focus on the ‘Entertainment’ domain, especially ‘Sports’, as IPL-T20 is the currently trending on-going event. We collect this large number of tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the temporal data of the event and finding a ranking among the players or teams based on popularity, which helps people understand key influencers on the social media platform.
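
A stripped-down sketch of the two analyses on JSON tweet documents (the documents and field names here are invented stand-ins; real Streaming API payloads and the MongoDB layer are omitted):

```python
import json
from collections import Counter
from datetime import datetime

# Hypothetical tweet documents in JSON form, standing in for records
# fetched via the Streaming API and stored in MongoDB.
docs = [
    '{"created_at": "2017-05-10T19:05:00", "text": "What a catch! #IPL"}',
    '{"created_at": "2017-05-10T19:06:30", "text": "Huge six! #IPL what a moment"}',
    '{"created_at": "2017-05-10T20:15:00", "text": "Slow over, quiet crowd"}',
]
tweets = [json.loads(d) for d in docs]

# Temporal analysis: bucket tweet counts by hour; bursts of activity mark
# the "interesting moments" of the event.
per_hour = Counter(datetime.fromisoformat(t["created_at"]).hour for t in tweets)

# Term-frequency analysis: naive whitespace tokenisation, alphabetic tokens only.
terms = Counter(w.lower() for t in tweets for w in t["text"].split() if w.isalpha())

print(per_hour.most_common(1))  # [(19, 2)]: the 19:00 hour was busiest
print(terms.most_common(3))
```

A production pipeline would add stop-word filtering and proper tokenisation, but the bucketing-plus-counting core is the same.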

  16. PRISM reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.

    1991-01-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristic and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. (author)

  17. The key events of 2012

    International Nuclear Information System (INIS)

    2013-01-01

    The article reviews the main events, changes, and issues that occurred in 2012 in France in the different sectors of activity of the ASN (control, public information, management of accidental situations, and international cooperation) or that had an impact on the activities of the ASN (changes in national or European regulations, for instance).

  18. External event analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1989-01-01

    The US Nuclear Regulatory Commission is sponsoring probabilistic risk assessments of six operating commercial nuclear power plants as part of a major update of the understanding of risk as provided by the original WASH-1400 risk assessments. In contrast to the WASH-1400 studies, at least two of the NUREG-1150 risk assessments will include an analysis of risks due to earthquakes, fires, floods, etc., which are collectively known as external events. This paper summarizes the methods to be used in the external event analysis for NUREG-1150 and the results obtained to date. The two plants for which external events are being considered are Surry and Peach Bottom, a PWR and a BWR, respectively. The external event analyses (through core damage frequency calculations) were completed in June 1989, with final documentation available in September. In contrast to most past external event analyses, wherein rudimentary systems models were developed for each external event under consideration, the simplified NUREG-1150 analyses are based on the availability of the full internal-event PRA systems models (event trees and fault trees) and make use of extensive computer-aided screening to reduce them to the sequence cut sets important to each external event. This provides two major advantages: consistency and scrutability with respect to the internal event analysis are achieved, and the full gamut of random and test/maintenance unavailabilities is automatically included, while only those that are probabilistically important survive the screening process. Thus, full benefit of the internal event analysis is obtained by performing the internal and external event analyses sequentially.

  19. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis
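
The report's central quantities, event counts and demand or exposure time, combine directly into a rate estimate; a minimal sketch with invented data (the units, counts, and interval formula below are illustrative, not from the report):

```python
import math

# Hypothetical collected records: unit studied, counted events, and
# exposure time -- the data elements the report asks collectors to define.
records = [
    {"unit": "pump_A", "events": 3, "exposure_years": 12.5},
    {"unit": "pump_B", "events": 1, "exposure_years": 9.0},
]

# Pooled Poisson rate estimate: total events over total exposure.
n_events = sum(r["events"] for r in records)
exposure = sum(r["exposure_years"] for r in records)
rate = n_events / exposure

# Crude large-sample 90% interval: rate +/- 1.645 * sqrt(n) / T.
half_width = 1.645 * math.sqrt(n_events) / exposure
print(f"rate = {rate:.3f}/yr, approx 90% CI = "
      f"({rate - half_width:.3f}, {rate + half_width:.3f})")
```

The sketch also shows why the report insists on defining the unit and the exposure time up front: both denominators must be consistent across records before counts can be pooled.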

  20. Preliminary safety analysis for key design features of KALIMER with breakeven core

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, Do Hee; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, Y. B.; Jeong, K. S

    2001-06-01

    KAERI is currently developing the conceptual design of a Liquid Metal Reactor, KALIMER (Korea Advanced Liquid MEtal Reactor), under the Long-term Nuclear R and D Program. KALIMER addresses key issues regarding future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, descriptions of safety design features and safety analysis results for selected ATWS accidents for the breakeven-core KALIMER are presented. First, the basic approach to achieving the safety goal is introduced in Chapter 1, and the safety evaluation procedure for the KALIMER design is described in Chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In Chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. In Chapter 4, the design of the KALIMER containment dome and the results of its performance analyses are presented, and the designs of the existing containment and the KALIMER containment dome are compared. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced to investigate the core energetics behavior during an HCDA in Chapter 5. Sensitivity analyses have been performed for the KALIMER core behavior during super-prompt critical excursions, using mathematical formulations developed in the framework of the Modified Bethe-Tait method. The work energy potential was then calculated based on the isentropic fuel expansion model.

  1. EVENT PLANNING USING FUNCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  2. Key terms for the assessment of the safety of vaccines in pregnancy: Results of a global consultative process to initiate harmonization of adverse event definitions.

    Science.gov (United States)

    Munoz, Flor M; Eckert, Linda O; Katz, Mark A; Lambach, Philipp; Ortiz, Justin R; Bauwens, Jorgen; Bonhoeffer, Jan

    2015-11-25

    The variability of terms and definitions of Adverse Events Following Immunization (AEFI) represents a missed opportunity for optimal monitoring of safety of immunization in pregnancy. In 2014, the Brighton Collaboration Foundation and the World Health Organization (WHO) collaborated to address this gap. Two Brighton Collaboration interdisciplinary taskforces were formed. A landscape analysis included: (1) a systematic literature review of adverse event definitions used in vaccine studies during pregnancy; (2) a worldwide stakeholder survey of available terms and definitions; and (3) a series of taskforce meetings. Based on available evidence, the taskforces proposed key terms and concept definitions to be refined, prioritized, and endorsed by a global expert consultation convened by WHO in Geneva, Switzerland in July 2014. Using pre-specified criteria, 45 maternal and 62 fetal/neonatal events were prioritized, and key terms and concept definitions were endorsed. In addition, recommendations to further improve safety monitoring of immunization in pregnancy programs were specified. These include elaboration of disease concepts into standardized case definitions with sufficient applicability and positive predictive value to be of use for monitoring the safety of immunization in pregnancy globally, as well as the development of guidance, tools, and datasets in support of a globally concerted approach. There is a need to improve the safety monitoring of immunization in pregnancy programs. A consensus list of terms and concept definitions of key events for monitoring immunization in pregnancy is available. Immediate actions to further strengthen monitoring of immunization in pregnancy programs are identified and recommended. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Satellite-Observed Black Water Events off Southwest Florida: Implications for Coral Reef Health in the Florida Keys National Marine Sanctuary

    OpenAIRE

    Zhao, Jun; Hu, Chuanmin; Lapointe, Brian; Melo, Nelson; Johns, Elizabeth; Smith, Ryan

    2013-01-01

    A “black water” event, as observed from satellites, occurred off southwest Florida in 2012. Satellite observations suggested that the event started in early January and ended in mid-April 2012. The black water patch formed off central west Florida and advected southward towards Florida Bay and the Florida Keys with the shelf circulation, which was confirmed by satellite-tracked surface drifter trajectories. Compared with a previous black water event in 2002, the 2012 event was weaker in terms...

  4. Satellite-Observed Black Water Events off Southwest Florida: Implications for Coral Reef Health in the Florida Keys National Marine Sanctuary

    Directory of Open Access Journals (Sweden)

    Brian Lapointe

    2013-01-01

    Full Text Available A “black water” event, as observed from satellites, occurred off southwest Florida in 2012. Satellite observations suggested that the event started in early January and ended in mid-April 2012. The black water patch formed off central west Florida and advected southward towards Florida Bay and the Florida Keys with the shelf circulation, which was confirmed by satellite-tracked surface drifter trajectories. Compared with a previous black water event in 2002, the 2012 event was weaker in terms of spatial and temporal coverage. An in situ survey indicated that the 2012 black water patch contained toxic K. brevis and had relatively low CDOM (colored dissolved organic matter) and turbidity but high chlorophyll-a concentrations, while salinity was somewhat high compared with historical values. Further analysis revealed that the 2012 black water was formed by the K. brevis bloom initiated off central west Florida in late September 2011, while river runoff, Trichodesmium and possibly submarine groundwater discharge also played important roles in its formation. Black water patches can affect benthic coral reef communities by decreasing light availability at the bottom, and enhanced nutrient concentrations from black water patches support massive macroalgae growth that can overgrow coral reefs. It is thus important to continue the integrated observations, in which satellites provide synoptic and repeated views of such adverse water quality events.

  5. The Run 2 ATLAS Analysis Event Data Model

    CERN Document Server

    SNYDER, S; The ATLAS collaboration; NOWAK, M; EIFERT, T; BUCKLEY, A; ELSING, M; GILLBERG, D; MOYSE, E; KOENEKE, K; KRASZNAHORKAY, A

    2014-01-01

    During the LHC's first Long Shutdown (LS1) ATLAS set out to establish a new analysis model, based on the experience gained during Run 1. A key component of this is a new Event Data Model (EDM), called the xAOD. This format, which is now in production, provides the following features: a separation of the EDM into interface classes that the user code directly interacts with, and data storage classes that hold the payload data (the user sees an Array of Structs (AoS) interface, while the data is stored in a Struct of Arrays (SoA) format in memory, thus making it possible to efficiently auto-vectorise reconstruction code); a simple way of augmenting and reducing the information saved for different data objects, which makes it possible to easily decorate objects with new properties during data analysis, and to remove properties that the analysis does not need; and a persistent file format that can be explored directly with ROOT, either with or without loading any additional libraries. This allows fast interactive naviga...

  6. Analysis and Modelling of Taste and Odour Events in a Shallow Subtropical Reservoir

    Directory of Open Access Journals (Sweden)

    Edoardo Bertone

    2016-08-01

    Full Text Available Understanding and predicting Taste and Odour events is as difficult as it is critical for drinking water treatment plants. Following a number of events in recent years, a comprehensive statistical analysis of data from Lake Tingalpa (Queensland, Australia) was conducted. Historical manual sampling data, as well as data remotely collected by a vertical profiler, were collected; regression analysis and self-organising maps were then used to determine correlations between Taste and Odour compounds and potential input variables. Results showed that the predominant Taste and Odour compound was geosmin. Although one of the main predictors was the occurrence of cyanobacteria blooms, it was noticed that the cyanobacteria species was also critical. Additionally, water temperature, reservoir volume and oxidised nitrogen availability were key inputs determining the occurrence and magnitude of the geosmin peak events. Based on the results of the statistical analysis, a predictive regression model was developed to provide indications of the potential occurrence, and magnitude, of peaks in geosmin concentration. Additionally, it was found that the blue green algae probe of the lake’s vertical profiler has the potential to be used as one of the inputs for an automated geosmin early warning system.
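    A minimal sketch of the kind of predictive regression model described in this abstract, fitted on synthetic data; the predictors (water temperature, reservoir volume, oxidised nitrogen) follow the abstract, but all coefficients, units, and values are invented for illustration and are not from the Lake Tingalpa study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical predictor values (ranges and units are assumptions)
temp = rng.uniform(15, 30, n)      # water temperature, deg C
volume = rng.uniform(0.5, 1.0, n)  # reservoir volume, fraction of capacity
nox = rng.uniform(0.0, 0.5, n)     # oxidised nitrogen, mg/L

# Synthetic geosmin response with known coefficients plus noise
geosmin = 2.0 * temp - 30.0 * volume + 40.0 * nox + 5.0 + rng.normal(0, 1, n)

# Ordinary least-squares fit (column of ones gives the intercept)
X = np.column_stack([temp, volume, nox, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, geosmin, rcond=None)

# Coefficient of determination of the fitted model
pred = X @ coef
r2 = 1 - np.sum((geosmin - pred) ** 2) / np.sum((geosmin - geosmin.mean()) ** 2)
```

On synthetic data the fit recovers the known coefficients; on real monitoring data, predictor selection and validation would of course dominate the effort.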

  7. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others]

    2003-07-01

    This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of Korea Standard Nuclear Plants (KSNPs). The purpose of a Level I PSA is to delineate the accident sequences leading to core damage and to estimate their frequencies. It has been directly used for assessing and modifying system safety and reliability as a key and basic part of PSA. Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report consists of six major procedural steps for Level I PSA: familiarization with the plant, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level I PSAs for NPPs. A particular aim is to promote a standardized framework, terminology and form of documentation for PSAs. In addition, this report would be useful for managers or regulatory persons involved in risk-informed regulation, and also for conducting PSAs for other industries.
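    The accident sequence quantification step can be illustrated with a toy minimal-cut-set calculation under the rare-event approximation; the basic events, their probabilities, and the initiating-event frequency below are invented for illustration and are not values from the KSNP PSA.

```python
# Basic event probabilities (illustrative values only)
p = {"DG_A": 1e-2, "DG_B": 1e-2, "AFW_PUMP": 5e-3, "OP_ERR": 3e-2}

# Minimal cut sets for a hypothetical core-damage top event:
# core damage occurs if all events in any one cut set occur
cut_sets = [{"DG_A", "DG_B"}, {"AFW_PUMP", "OP_ERR"}]

def cut_set_prob(cs):
    """Probability of a cut set, assuming independent basic events."""
    prob = 1.0
    for ev in cs:
        prob *= p[ev]
    return prob

# Rare-event approximation: top event probability is the sum over cut sets
top = sum(cut_set_prob(cs) for cs in cut_sets)

# Core damage frequency = initiating-event frequency times top probability
init_freq = 0.1  # per reactor-year (assumed)
cdf = init_freq * top
```

The rare-event approximation slightly overestimates the exact union probability, which is conservative and conventional for small basic event probabilities.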

  8. Prism reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.

    1991-08-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristic and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. 6 refs., 4 figs

  9. PRISM reactor system design and analysis of postulated unscrammed events

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.; Slovik, G.C.; Rosztoczy, Z.; Lane, J.

    1991-01-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristics and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. 6 refs., 4 figs

  10. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    Science.gov (United States)

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents-food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework-the Key Events Dose-Response Framework (KEDRF)-for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  11. Preliminary safety analysis of unscrammed events for KLFR

    International Nuclear Information System (INIS)

    Kim, S.J.; Ha, G.S.

    2005-01-01

    The report presents the design features of the KLFR, the safety analysis code, steady-state calculation results, and analysis results for unscrammed events. The calculations of the steady state and the unscrammed events were performed for the conceptual design of the KLFR using the SSC-K code. A UTOP event results in no fuel damage and no centre-line melting. The inherent safety features are demonstrated through the analysis of a ULOHS event. Although the analysis of ULOF involves many uncertainties in the pump design, the analysis results show the inherent safety characteristics. Natural circulation at 6% of rated flow is established in the case of ULOF. In the metallic fuel rod, the cladding temperature is somewhat high due to the low heat transfer coefficient of lead. The ULOHS event should be considered in the design of the RVACS for long-term cooling

  12. Analysis of external events - Nuclear Power Plant Dukovany

    International Nuclear Information System (INIS)

    Hladky, Milan

    2000-01-01

    The level 1 PSA covers internal events, floods, and fires; other external events are not yet included. The shutdown PSA takes into account internal events, floods, fires, and heavy load drop; other external events are not yet included. A final safety analysis report was completed after 10 years of operation for all Dukovany units. A probabilistic approach was used for the analysis of aircraft crash and external man-induced events. The risk caused by man-induced events was found to be negligible, a finding accepted by the State Office for Nuclear Safety (SONS)

  13. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Given that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because of its advantages of intuitiveness and effectiveness. To analyse events' spatio-temporal distribution characteristics, correlations among event items, and development trends, terrorism events' spatio-temporal characteristics are discussed. A suitable event data table structure based on "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the feasibility of the methods.
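    A sketch of a "5W"-style event record and a simple spatio-temporal aggregation of the kind that could feed a thematic map or statistical chart; the field names and example records are assumptions, not the paper's actual schema or Global Terrorism Database data.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date

@dataclass
class TerrorEvent:
    """One event record structured by the '5W' questions."""
    who: str     # perpetrator group
    what: str    # attack type
    when: date   # date of the event
    where: str   # location / region code
    why: str     # stated motive, if known

# Hypothetical example records
events = [
    TerrorEvent("group_a", "bombing", date(2014, 3, 1), "region_1", "unknown"),
    TerrorEvent("group_b", "assault", date(2014, 7, 9), "region_2", "unknown"),
    TerrorEvent("group_a", "bombing", date(2015, 1, 5), "region_1", "unknown"),
]

# Spatio-temporal aggregation: event counts by (region, year),
# the kind of table a choropleth thematic map would be drawn from
by_region_year = Counter((e.where, e.when.year) for e in events)
```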

  14. Risk and sensitivity analysis in relation to external events

    International Nuclear Information System (INIS)

    Alzbutas, R.; Urbonas, R.; Augutis, J.

    2001-01-01

    This paper presents a risk and sensitivity analysis of external event impacts on safe operation in general, and on the Ignalina Nuclear Power Plant safety systems in particular. The analysis is based on deterministic and probabilistic assumptions and an assessment of the external hazards. Real statistical data are used, as well as initial external event simulation. Preliminary screening criteria are applied. For the investigated external hazards, the analysis of external event impacts on safe NPP operation, assessment of event occurrence, sensitivity analysis, and recommendations for safety improvements are performed. Events such as aircraft crash, extreme rains and winds, forest fire, and flying parts of the turbine are analysed. Models are developed and probabilities are calculated. As an example of sensitivity analysis, the model of aircraft impact is presented. The sensitivity analysis takes into account the uncertainty raised by the external event and its model. Even when the external events analysis shows rather limited danger, sensitivity analysis can identify the causes with the greatest influence; such possible future variations can be significant for safety levels and risk-based decisions. Calculations show that external events cannot significantly influence the safety level of Ignalina NPP operation; however, event occurrence and propagation remain considerably uncertain. (author)
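    A one-at-a-time sensitivity sketch for a toy aircraft-impact model; the linear model form (hit probability = area crash rate times effective target area) and all parameter values are assumptions for illustration, not those of the Ignalina analysis.

```python
# Toy aircraft-impact model (assumed form):
# P(hit per year) = crash_rate [crashes/km^2/yr] * effective_area [km^2]
base = {"crash_rate": 1e-5, "effective_area": 0.02}

def hit_prob(params):
    return params["crash_rate"] * params["effective_area"]

baseline = hit_prob(base)

# One-at-a-time sensitivity: perturb each parameter by +50% and record
# the relative change in the output
sensitivity = {}
for name in base:
    perturbed = dict(base)
    perturbed[name] = base[name] * 1.5
    sensitivity[name] = (hit_prob(perturbed) - baseline) / baseline
```

Because this toy model is linear in each parameter, both sensitivities come out equal; in a realistic model with thresholds or nonlinear structure, such a scan is exactly what reveals the dominant uncertainty contributors.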

  15. Joint Attributes and Event Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm established on a correlation vector that correlates them to a target event. Consequently, we could incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.
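    The correlation-vector idea, relating per-video attribute scores to a target event, can be sketched on synthetic data; the Pearson correlation used here is one plausible instantiation, not necessarily the paper's exact formulation, and the data are random stand-ins for classifier outputs.

```python
import numpy as np

rng = np.random.default_rng(0)
n_videos, n_attrs = 100, 5

# Attribute scores per video (stand-ins for object/scene/action detector outputs)
attrs = rng.normal(size=(n_videos, n_attrs))

# Binary target-event labels, constructed to depend mostly on attribute 0
labels = (attrs[:, 0] + 0.1 * rng.normal(size=n_videos) > 0).astype(float)

# Correlation vector: correlation of each attribute with the target event
corr = np.array([np.corrcoef(attrs[:, j], labels)[0, 1] for j in range(n_attrs)])

# The attribute most correlated (in magnitude) with the event
most_relevant = int(np.argmax(np.abs(corr)))
```

A detector could then weight attribute channels by this vector when fusing them with features learnt from the event videos themselves.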

  16. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    Science.gov (United States)

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve ones, "electrical medical equipment" with four ones, and "diagnosis, surgery" with four ones. 
The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new
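    The standardization-plus-trend screening described in this abstract might be sketched as follows; the field names echo the abstract, but the yearly counts are invented, and per-year z-scores plus an ordinary least-squares slope stand in for the paper's standardization and regression analysis.

```python
import statistics

# Yearly patent-application counts, 2006-2012, for three fields
# (illustrative numbers, not the study's data)
counts = {
    "natural products and polymers": [120, 135, 150, 160, 170, 185, 200],
    "fermentation industry":         [ 80,  95, 110, 130, 150, 175, 205],
    "electrical medical equipment":  [ 60,  58,  57,  55,  56,  54,  53],
}
years = list(range(2006, 2013))

def slope(xs, ys):
    """Ordinary least-squares slope of ys against xs (the trend)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# For each field: standardize within each year across fields, average the
# z-scores (relative amount), and regress counts on year (trend); a field
# lands in the "key" quadrant when either axis is positive
quadrant = {}
for field, ys in counts.items():
    zs = []
    for i, y in enumerate(ys):
        col = [counts[f][i] for f in counts]
        zs.append((y - statistics.fmean(col)) / statistics.pstdev(col))
    rel_amount = statistics.fmean(zs)
    trend = slope(years, ys)
    quadrant[field] = (rel_amount > 0, trend > 0)
```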

  17. External events analysis of the Ignalina Nuclear Power Plant

    International Nuclear Information System (INIS)

    Liaukonis, Mindaugas; Augutis, Juozas

    1999-01-01

    This paper presents an analysis of external event impacts on the safe operation of the Ignalina Nuclear Power Plant (INPP) safety systems. The analysis was based on probabilistic estimation and modelling of the external hazards. Screening criteria were applied to a number of external hazards. The following external events requiring further bounding study were analysed: aircraft crash on the INPP, external flooding, fire, and extreme winds. Mathematical models were developed and event probabilities were calculated. The external events analysis showed rather limited external event danger to the Ignalina NPP. Results of the analysis were compared to analogous analyses in western NPPs, and no great differences were identified. The calculations performed show that external events cannot significantly influence the safety level of Ignalina NPP operation. (author)

  18. Crop damage by primates: quantifying the key parameters of crop-raiding events.

    Directory of Open Access Journals (Sweden)

    Graham E Wallace

    Full Text Available Human-wildlife conflict often arises from crop-raiding, and insights regarding which aspects of raiding events determine crop loss are essential when developing and evaluating deterrents. However, because accounts of crop-raiding behaviour are frequently indirect, these parameters are rarely quantified or explicitly linked to crop damage. Using systematic observations of the behaviour of non-human primates on farms in western Uganda, this research identifies number of individuals raiding and duration of raid as the primary parameters determining crop loss. Secondary factors include distance travelled onto farm, age composition of the raiding group, and whether raids are in series. Regression models accounted for greater proportions of variation in crop loss when increasingly crop and species specific. Parameter values varied across primate species, probably reflecting differences in raiding tactics or perceptions of risk, and thereby providing indices of how comfortable primates are on-farm. Median raiding-group sizes were markedly smaller than the typical sizes of social groups. The research suggests that key parameters of raiding events can be used to measure the behavioural impacts of deterrents to raiding. Furthermore, farmers will benefit most from methods that discourage raiding by multiple individuals, reduce the size of raiding groups, or decrease the amount of time primates are on-farm. This study demonstrates the importance of directly relating crop loss to the parameters of raiding events, using systematic observations of the behaviour of multiple primate species.

  19. Crop Damage by Primates: Quantifying the Key Parameters of Crop-Raiding Events

    Science.gov (United States)

    Wallace, Graham E.; Hill, Catherine M.

    2012-01-01

    Human-wildlife conflict often arises from crop-raiding, and insights regarding which aspects of raiding events determine crop loss are essential when developing and evaluating deterrents. However, because accounts of crop-raiding behaviour are frequently indirect, these parameters are rarely quantified or explicitly linked to crop damage. Using systematic observations of the behaviour of non-human primates on farms in western Uganda, this research identifies number of individuals raiding and duration of raid as the primary parameters determining crop loss. Secondary factors include distance travelled onto farm, age composition of the raiding group, and whether raids are in series. Regression models accounted for greater proportions of variation in crop loss when increasingly crop and species specific. Parameter values varied across primate species, probably reflecting differences in raiding tactics or perceptions of risk, and thereby providing indices of how comfortable primates are on-farm. Median raiding-group sizes were markedly smaller than the typical sizes of social groups. The research suggests that key parameters of raiding events can be used to measure the behavioural impacts of deterrents to raiding. Furthermore, farmers will benefit most from methods that discourage raiding by multiple individuals, reduce the size of raiding groups, or decrease the amount of time primates are on-farm. This study demonstrates the importance of directly relating crop loss to the parameters of raiding events, using systematic observations of the behaviour of multiple primate species. PMID:23056378

  20. External events analysis for experimental fusion facilities

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1990-01-01

    External events are those off-normal events that threaten facilities either from outside or inside the building. These events, such as floods, fires, and earthquakes, are among the leading risk contributors for fission power plants, and the nature of fusion facilities indicates that they may also be leading contributors to fusion risk. This paper gives overviews of analysis methods, references good analysis guidance documents, and gives design tips for mitigating the effects of floods and fires, seismic events, and aircraft impacts. Implications for future fusion facility siting are also discussed. Sites similar to fission plant sites are recommended. 46 refs

  1. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
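    A minimal deterministic sketch of the discrete event simulation idea: jobs with arrival and service times are processed by a single server (a stand-in for one virtualized cloud resource), and per-job waiting times fall out of the event schedule. A real cloud DES model, as described in the abstract, would draw both arrivals and service times from probability distributions per request type and model many contending resources.

```python
# (arrival_time, service_time) pairs; values are illustrative
arrivals = [(0.0, 4.0), (1.0, 3.0), (2.0, 2.0)]

def simulate(jobs):
    """FIFO single-server queue; returns the waiting time of each job.

    The simulation clock advances from event to event: each job starts
    at the later of its arrival time and the instant the server frees up.
    """
    server_free_at = 0.0
    waits = []
    for arrive, service in sorted(jobs):
        start = max(arrive, server_free_at)
        waits.append(start - arrive)
        server_free_at = start + service
    return waits

waits = simulate(arrivals)
```

Even this tiny model shows why static analysis fails for shared resources: waiting time depends on the full interleaving of demand, not on any single request in isolation.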

  2. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

    Full Text Available (ii) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...

  3. LOSP-initiated event tree analysis for BWR

    International Nuclear Information System (INIS)

    Watanabe, Norio; Kondo, Masaaki; Uno, Kiyotaka; Chigusa, Takeshi; Harami, Taikan

    1989-03-01

    As a preliminary study for a 'Japanese Model Plant PSA', a LOSP (loss of off-site power)-initiated Event Tree Analysis for a typical Japanese BWR was carried out, based solely on open documents such as the 'Safety Analysis Report'. The objectives of this analysis are as follows; - to delineate core-melt accident sequences initiated by LOSP, - to evaluate the importance of core-melt accident sequences in terms of occurrence frequency, and - to develop a foundation of plant information and analytical procedures for efficiently performing the further 'Japanese Model Plant PSA'. This report describes the procedure and results of the LOSP-initiated Event Tree Analysis. In this analysis, two types of event trees, Functional Event Trees and Systemic Event Trees, were developed to delineate core-melt accident sequences and to quantify their frequencies. A Front-line System Event Tree was prepared as well to provide core-melt sequence delineation for the accident progression analysis of the Level 2 PSA, which will follow in the future. Applying U.S. operational experience data such as component failure rates and a LOSP frequency, we obtained the following results; - The total frequency of core-melt accident sequences initiated by LOSP is estimated at 5 x 10^-4 per reactor-year. - The dominant sequences are 'Loss of Decay Heat Removal' and 'Loss of Emergency Electric Power Supply', which account for more than 90% of the total core-melt frequency. In this analysis, a value of 0.13/R·Y, higher than Japanese experience, was used for the LOSP frequency, and no recovery actions were considered. In fact, however, there has been no experience of a LOSP event in Japanese nuclear power plants so far, and it is also expected that off-site power and/or the PCS would be recovered before core melt. Considering Japanese operating experience and recovery factors would reduce the total core-melt frequency to less than 10^-6 per reactor-year. (J.P.N.)
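    The event-tree quantification step can be sketched as the initiating-event frequency multiplied by branch probabilities along each path. Only the 0.13/R·Y LOSP frequency below comes from the abstract; the branch failure probabilities and the recovery factor are invented for illustration.

```python
# Event-tree quantification sketch for a LOSP initiator
losp_freq = 0.13  # LOSP frequency per reactor-year (value from the report)

# Hypothetical branch failure probabilities (illustrative only)
p_fail = {
    "emergency_power": 3e-3,
    "decay_heat_removal": 2e-3,
}

# Each core-melt sequence frequency = initiator frequency times the
# probabilities of the failed (and successful) branches along its path
seq_loss_of_ep = losp_freq * p_fail["emergency_power"]
seq_loss_of_dhr = (losp_freq
                   * (1 - p_fail["emergency_power"])
                   * p_fail["decay_heat_removal"])
total_cmf = seq_loss_of_ep + seq_loss_of_dhr

# Crediting off-site power recovery before core melt (assumed factor)
p_no_recovery = 0.1
total_with_recovery = total_cmf * p_no_recovery
```

This mirrors the abstract's observation: crediting recovery multiplies every sequence by a small non-recovery probability and pulls the total core-melt frequency down by an order of magnitude or more.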

  4. Key components of financial-analysis education for clinical nurses.

    Science.gov (United States)

    Lim, Ji Young; Noh, Wonjung

    2015-09-01

    In this study, we identified key components of financial-analysis education for clinical nurses. We used a literature review, focus group discussions, and a content validity index survey to develop key components of financial-analysis education. First, a wide range of references were reviewed, and 55 financial-analysis education components were gathered. Second, two focus group discussions were held; the participants were 11 nurses who had worked for more than 3 years in a hospital, and nine components were agreed upon. Third, 12 professionals, including professors, a nurse executive, nurse managers, and an accountant, participated in the content validity index survey. Finally, six key components of financial-analysis education were selected. These key components were as follows: understanding the need for financial analysis, introduction to financial analysis, reading and implementing balance sheets, reading and implementing income statements, understanding the concepts of financial ratios, and interpretation and practice of financial ratio analysis. The results of this study will be used to develop an education program to increase financial-management competency among clinical nurses. © 2015 Wiley Publishing Asia Pty Ltd.

  5. Gait Event Detection in Real-World Environment for Long-Term Applications: Incorporating Domain Knowledge Into Time-Frequency Analysis.

    Science.gov (United States)

    Khandelwal, Siddhartha; Wickstrom, Nicholas

    2016-12-01

    Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment, which involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns, among other factors. Existing domain knowledge, in the form of principles or underlying fundamental gait relationships, can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93,600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.
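    One simple way domain knowledge enters frequency analysis is by restricting the search for the dominant gait frequency to a physiologically plausible band. The synthetic-signal sketch below illustrates that idea only; it is not the paper's algorithm, and the sampling rate, band limits, and cadence are assumed values.

```python
import numpy as np

fs = 100.0                    # sampling rate, Hz (assumed)
t = np.arange(0, 20, 1 / fs)  # 20 s of synthetic accelerometer signal
cadence_hz = 1.8              # true step frequency to recover

# Sinusoid at the step frequency plus measurement noise
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * cadence_hz * t) + 0.3 * rng.normal(size=t.size)

# Magnitude spectrum of the real-valued signal
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Domain knowledge: human step frequency lies roughly in 0.5-3 Hz,
# so restrict the peak search to that band and ignore drift/noise peaks
band = (freqs >= 0.5) & (freqs <= 3.0)
est = freqs[band][np.argmax(spectrum[band])]
```

Constraining the search band is what keeps the estimate robust when low-frequency drift or high-frequency sensor noise would otherwise dominate the spectrum.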

  6. Screening key candidate genes and pathways involved in insulinoma by microarray analysis.

    Science.gov (United States)

    Zhou, Wuhua; Gong, Li; Li, Xuefeng; Wan, Yunyan; Wang, Xiangfei; Li, Huili; Jiang, Bin

    2018-06-01

    Insulinoma is a rare type of tumor and its genetic features remain largely unknown. This study aimed to search for potential key genes and relevant enriched pathways of insulinoma. The gene expression data from GSE73338 were downloaded from the Gene Expression Omnibus database. Differentially expressed genes (DEGs) were identified between insulinoma tissues and normal pancreas tissues, followed by pathway enrichment analysis, protein-protein interaction (PPI) network construction, and module analysis. The expressions of candidate key genes were validated by quantitative real-time polymerase chain reaction (RT-PCR) in insulinoma tissues. A total of 1632 DEGs were obtained, including 1117 upregulated genes and 514 downregulated genes. Pathway enrichment results showed that upregulated DEGs were significantly implicated in insulin secretion, and downregulated DEGs were mainly enriched in pancreatic secretion. PPI network analysis revealed 7 hub genes with degrees of more than 10, including GCG (glucagon), GCGR (glucagon receptor), PLCB1 (phospholipase C, beta 1), CASR (calcium sensing receptor), F2R (coagulation factor II thrombin receptor), GRM1 (glutamate metabotropic receptor 1), and GRM5 (glutamate metabotropic receptor 5). DEGs involved in the significant modules were enriched in the calcium signaling pathway, protein ubiquitination, and platelet degranulation. Quantitative RT-PCR data confirmed that the expression trends of these hub genes were similar to the results of the bioinformatic analysis. The present study demonstrated that the candidate DEGs and enriched pathways are potential critical molecular events involved in the development of insulinoma, and these findings are useful for a better understanding of insulinoma genesis.
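    A sketch of the DEG screening and hub-gene selection steps on toy data. The gene symbols echo the abstract, but the fold-changes, p-values, edge list, and thresholds (|log2FC| >= 1, adjusted p < 0.05, degree >= 3) are common conventions and invented values, not the study's actual data or cutoffs.

```python
from collections import Counter

# (gene, log2 fold-change, adjusted p-value) — illustrative values only
results = [
    ("GCG", 3.2, 1e-6), ("GCGR", 2.1, 1e-4), ("PLCB1", 1.5, 0.003),
    ("ACTB", 0.1, 0.9), ("CASR", -2.4, 1e-5), ("INS", 0.3, 0.2),
]

# Typical DEG screen: effect size and significance thresholds together
degs = [g for g, fc, p in results if abs(fc) >= 1 and p < 0.05]

# Hub genes: nodes whose degree in the PPI network exceeds a cutoff
ppi_edges = [("GCG", "GCGR"), ("GCG", "PLCB1"), ("GCGR", "CASR"),
             ("GCG", "CASR"), ("PLCB1", "CASR")]
degree = Counter(g for edge in ppi_edges for g in edge)
hubs = [g for g, d in degree.items() if d >= 3]
```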

  7. Root cause analysis of critical events in neurosurgery, New South Wales.

    Science.gov (United States)

    Perotti, Vanessa; Sheridan, Mark M P

    2015-09-01

    Adverse events reportedly occur in 5% to 10% of health care episodes. Not all adverse events are the result of error; they may arise from systemic faults in the delivery of health care. Catastrophic events are not only physically devastating to patients, but they also attract medical liability and increase health care costs. Root cause analysis (RCA) has become a key tool for health care services to understand those adverse events. This study is a review of all the RCA case reports involving neurosurgical patients in New South Wales between 2008 and 2013. The case reports and data were obtained from the Clinical Excellence Commission database. The data was then categorized by the root causes identified and the recommendations suggested by the RCA committees. Thirty-two case reports were identified in the RCA database. Breaches in policy account for the majority of root causes identified, for example, delays in transfer of patients or wrong-site surgery, which always involved poor adherence to correct patient and site identification procedures. The RCA committees' recommendations included education for staff, and improvements in rostering and procedural guidelines. RCAs have improved the patient safety profile; however, the RCA committees have no power to enforce any recommendation or ensure compliance. A single RCA may provide little learning beyond the unit and staff involved. However, through aggregation of RCA data and dissemination strategies, health care workers can learn from adverse events and prevent future events from occurring. © 2015 Royal Australasian College of Surgeons.

  8. Analysis for Human-related Events during the Overhaul

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Tae; Kim, Min Chull; Choi, Dong Won; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-10-15

Since 2008, the frequency of events caused by human error has been decreasing among the 20 operating Nuclear Power Plants (NPPs), excluding the plant in the commissioning stage (Shin-Kori unit 1). However, events due to human error during overhaul (O/H) still occur every year (see Table I). An analysis of human-related events during the O/H was therefore performed. From the analysis, similar problems were identified across the events, and organizational and safety-culture factors were identified as well.

  9. TEMAC, Top Event Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem - Maxima of: 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement.

  10. Statistical analysis of solar proton events

    Directory of Open Access Journals (Sweden)

    V. Kurt

    2004-06-01

A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm²·s·sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. Of these proton events, 231 are flare related and only 22 are not associated with Hα flares. It is also noteworthy that 42 of these events were registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with western flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  11. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  12. NPP unusual events: data, analysis and application

    International Nuclear Information System (INIS)

    Tolstykh, V.

    1990-01-01

The subject of this paper is the IAEA's cooperative treatment of unusual-event data and the utilization of operating safety experience feedback. The Incident Reporting System (IRS) and the Analysis of Safety Significant Event Team (ASSET) are discussed. The IRS methodology for collecting, handling, assessing and disseminating data on NPP unusual events (deviations, incidents and accidents) occurring during operation, surveillance and maintenance is outlined through the report gathering and issuing practice, the expert assessment procedures and the parameters of the system. After 7 years of existence, the IAEA-IRS contains over 1000 reports and receives 1.5-4% of the total information on unusual events. The author considers the reports only as detailed technical 'records' of events requiring assessment. The ASSET approach, implying an in-depth analysis of occurrences directed towards level-1 PSA utilization, is commented on. The experts evaluated root causes for the reported events, and some trends are presented. Generally, internal events due to unexpected paths of water in the nuclear installations, occurrences related to the integrity of the primary heat transport systems, events associated with the engineered safety systems, and events involving the human factor represent the large groups deserving close attention. The author's recommendations on how to use event-related information for NPP safety improvement are given. 2 tabs (R.Ts)

  13. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
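The key-space scan the authors describe can be sketched numerically. Below is a minimal 1-D double random phase encoding model in NumPy; the signal length, trial count, and normalized-error metric are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def encrypt(f, phi1, phi2):
    """1-D double random phase encoding: phase masks in the input
    plane (phi1) and the Fourier plane (phi2)."""
    x = f * np.exp(2j * np.pi * phi1)
    return np.fft.ifft(np.fft.fft(x) * np.exp(2j * np.pi * phi2))

def decrypt(c, phi1, phi2):
    """Invert both phase masks and return the recovered amplitude."""
    X = np.fft.fft(c) * np.exp(-2j * np.pi * phi2)
    return np.abs(np.fft.ifft(X) * np.exp(-2j * np.pi * phi1))

n = 64
f = rng.random(n)                          # plaintext signal
phi1, phi2 = rng.random(n), rng.random(n)  # secret key: two phase masks
c = encrypt(f, phi1, phi2)

# Brute-force probe of the key-space: decrypt with many wrong
# Fourier-plane masks and record the normalized decryption error.
errors = []
for _ in range(200):
    trial = rng.random(n)
    g = decrypt(c, phi1, trial)
    errors.append(np.mean((g - f) ** 2) / np.mean(f ** 2))

right = np.mean((decrypt(c, phi1, phi2) - f) ** 2) / np.mean(f ** 2)
```

With the correct masks the error sits at machine precision, while every sampled wrong key leaves a large residual error; the distribution of such errors over the key-space is the quantity the authors plot.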

  14. Sentiment analysis on tweets for social events

    DEFF Research Database (Denmark)

    Zhou, Xujuan; Tao, Xiaohui; Yong, Jianming

    2013-01-01

Sentiment analysis or opinion mining is an important type of text analysis that aims to support decision making by extracting and analyzing opinion oriented text, identifying positive and negative opinions, and measuring how positively or negatively an entity (i.e., people, organization, event, location, product, topic, etc.) is regarded. As more and more users express their political and religious views on Twitter, tweets become valuable sources of people's opinions. Tweets data can be efficiently used to infer people's opinions for marketing or social studies. This paper proposes a Tweets Sentiment Analysis Model (TSAM) that can spot the societal interest and general people's opinions in regard to a social event. In this paper, the Australian federal election 2010 event was taken as an example for sentiment analysis experiments. We are primarily interested in the sentiment of the specific...
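TSAM itself is not specified in this abstract; as a sketch of the lexicon-scoring step common to such systems, the toy scorer below (with an invented mini-lexicon) assigns each tweet a signed sentiment score:

```python
# Invented mini-lexicon; real systems use large sentiment resources.
LEXICON = {"great": 1, "win": 1, "good": 1, "love": 1,
           "bad": -1, "lose": -1, "terrible": -1, "hate": -1}

def sentiment(tweet: str) -> int:
    """Sum of signed lexicon weights over the tweet's tokens."""
    return sum(LEXICON.get(tok.strip("#@!.,"), 0)
               for tok in tweet.lower().split())

tweets = ["Great debate, love the policy #election",
          "Terrible result, we lose again"]
scores = [sentiment(t) for t in tweets]   # positive vs. negative tweet
```

Aggregating such per-tweet scores over time is one simple way to track public opinion around an event such as an election.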

  15. Interpretation Analysis as a Competitive Event.

    Science.gov (United States)

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  16. Cogeneration: Key feasibility analysis parameters

    International Nuclear Information System (INIS)

    Coslovi, S.; Zulian, A.

    1992-01-01

This paper first reviews the essential requirements, in terms of scope, objectives and methods, of technical/economic feasibility analyses applied to cogeneration systems proposed for industrial plants in Italy. Attention is given to the influence on overall feasibility of the following factors: electric power and fuel costs, equipment coefficients of performance, operating schedules, maintenance costs, and Italian Government taxes and financial and legal incentives. Through an examination of several feasibility studies done on cogeneration proposals for different industrial sectors, a sensitivity analysis is performed on the effects of varying the weights of different cost-benefit analysis parameters. With the use of statistical analyses, standard deviations are then determined for key analysis parameters, and guidelines are suggested for simplifying the analysis.
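A one-at-a-time sensitivity analysis of the kind described can be sketched as follows; the cost model and all parameter names and values are invented for illustration:

```python
def annual_savings(p):
    """Toy cost-benefit model (parameter names are invented):
    value of displaced electricity minus extra fuel and maintenance."""
    return (p["elec_mwh"] * p["elec_price"]
            - p["fuel_mwh"] * p["fuel_price"]
            - p["maint_cost"])

base = {"elec_mwh": 8000, "elec_price": 90.0,
        "fuel_mwh": 10000, "fuel_price": 30.0,
        "maint_cost": 50000.0}

def sensitivity(base, delta=0.10):
    """One-at-a-time +/-10% swings; returns the elasticity of the
    annual savings with respect to each parameter."""
    s0 = annual_savings(base)
    out = {}
    for k in base:
        hi = annual_savings(dict(base, **{k: base[k] * (1 + delta)}))
        lo = annual_savings(dict(base, **{k: base[k] * (1 - delta)}))
        out[k] = (hi - lo) / (2 * delta * s0)
    return out
```

Ranking the resulting elasticities shows which cost-benefit parameters dominate overall feasibility, which is the purpose of the sensitivity analysis described in the record.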

  17. Surface Management System Departure Event Data Analysis

    Science.gov (United States)

    Monroe, Gilena A.

    2010-01-01

This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  18. Second-order analysis of semiparametric recurrent event processes.

    Science.gov (United States)

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
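A simple screen for the non-Poisson behavior the authors test for is the index of dispersion of per-subject event counts; under a homogeneous Poisson model it is close to 1, while subject-level heterogeneity inflates it. The simulation below (pure Python, invented rates) is a sketch of that idea, not the paper's semiparametric procedure:

```python
import random
random.seed(1)

def dispersion_index(counts):
    """Sample variance / mean of event counts; ~1 under a
    homogeneous Poisson model, >1 under overdispersion."""
    n = len(counts)
    m = sum(counts) / n
    var = sum((c - m) ** 2 for c in counts) / (n - 1)
    return var / m

def n_events(rate_per_step, steps=100):
    """Count events in a discretized follow-up period."""
    return sum(1 for _ in range(steps) if random.random() < rate_per_step)

# Homogeneous subjects: counts are Binomial(100, 0.05) ~ Poisson(5).
poisson_counts = [n_events(0.05) for _ in range(500)]

# Heterogeneous subjects (frailty): two subpopulations with very
# different rates give the same mean count but inflated variance.
frail_counts = [n_events(random.choice([0.01, 0.09])) for _ in range(500)]
```

Rejecting the Poisson hypothesis in this way motivates richer models of the within-subject dependence structure, which is what the article's second-order analysis provides.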

  19. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

The impact of extreme events on crude oil markets is of great importance in crude oil price analysis due to the fact that those events generally exert strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales from fine-to-coarse and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)

  20. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

The impact of extreme events on crude oil markets is of great importance in crude oil price analysis due to the fact that those events generally exert strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales from fine-to-coarse and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)
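The fine-to-coarse decomposition both records describe rests on EMD's sifting procedure, which can be sketched compactly. The version below is a deliberate simplification (linear rather than cubic-spline envelopes, a fixed number of sifting iterations), so it is illustrative rather than a faithful EMD implementation:

```python
import numpy as np

def local_extrema(x):
    """Indices of interior local maxima and minima."""
    mx = [i for i in range(1, len(x) - 1) if x[i] >= x[i - 1] and x[i] > x[i + 1]]
    mn = [i for i in range(1, len(x) - 1) if x[i] <= x[i - 1] and x[i] < x[i + 1]]
    return mx, mn

def sift(x, n_iter=8):
    """Extract one intrinsic mode by repeatedly removing the mean of
    the upper and lower extrema envelopes (linear envelopes here)."""
    h, t = x.copy(), np.arange(len(x))
    for _ in range(n_iter):
        mx, mn = local_extrema(h)
        if len(mx) < 2 or len(mn) < 2:
            return None                     # too few extrema: a residue
        upper = np.interp(t, [0] + mx + [len(h) - 1],
                          [h[0]] + list(h[mx]) + [h[-1]])
        lower = np.interp(t, [0] + mn + [len(h) - 1],
                          [h[0]] + list(h[mn]) + [h[-1]])
        h = h - (upper + lower) / 2.0
    return h

def emd(x, max_imfs=6):
    """Decompose x into intrinsic modes (fine to coarse) plus a trend."""
    imfs, residue = [], x.copy()
    for _ in range(max_imfs):
        imf = sift(residue)
        if imf is None:
            break
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue

t = np.linspace(0.0, 1.0, 400)
x = np.sin(2 * np.pi * 25 * t) + np.sin(2 * np.pi * 3 * t) + 0.5 * t
imfs, res = emd(x)          # fast mode, slow mode, then the trend
```

In the papers' application, the mode(s) on the time scale of an extreme event carry its impact on the oil price, while the residual trend carries the long-run level.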

  1. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Science.gov (United States)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
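The inlier/outlier segmentation can be illustrated on synthetic data: model each subsequence by simple statistics, build an affinity matrix from model distances, and read outliers off the dominant eigenvector. The window length, Gaussian affinity kernel, and per-window statistics below are assumptions of this sketch, standing in for the paper's statistical subsequence models:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic feature stream: quiet background with one loud burst
# (the "highlight" event) between samples 1200 and 1300.
signal = rng.normal(0.0, 1.0, 2000)
signal[1200:1300] += rng.normal(0.0, 6.0, 100)

# Model each subsequence (window) by simple statistics.
win = 100
stats = np.array([[signal[i:i + win].mean(), signal[i:i + win].std()]
                  for i in range(0, len(signal), win)])

# Affinity matrix from pairwise distances between window models.
d = np.linalg.norm(stats[:, None, :] - stats[None, :, :], axis=-1)
A = np.exp(-d ** 2 / d.mean() ** 2)

# Large entries of the dominant eigenvector mark the background
# cluster; small entries mark outlier windows.
_, vecs = np.linalg.eigh(A)
v = np.abs(vecs[:, -1])         # eigenvector of the largest eigenvalue
outlier = int(np.argmin(v))     # window index of the strongest outlier
```

Ranking windows by how small their eigenvector entry is gives the departure-from-background ordering that the framework uses to build summaries of any desired length.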

  2. Analysis of the differential-phase-shift-keying protocol in the quantum-key-distribution system

    International Nuclear Information System (INIS)

    Rong-Zhen, Jiao; Chen-Xu, Feng; Hai-Qiang, Ma

    2009-01-01

The analysis is based on the error rate and the secure communication rate as functions of distance for three quantum-key-distribution (QKD) protocols: the Bennett–Brassard 1984, the Bennett–Brassard–Mermin 1992, and the coherent differential-phase-shift keying (DPSK) protocols. We consider the secure communication rate of the DPSK protocol against an arbitrary individual attack, including the most commonly considered intercept-resend and photon-number splitting attacks, and conclude that the simple and efficient differential-phase-shift-keying protocol allows for more than 200 km of secure communication distance with high communication rates. (general)

  3. Preliminary safety analysis for key design features of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, D. H.; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, S. O.; Lee, Y. B.; Jeong, K. S

    2000-07-01

KAERI is currently developing the conceptual design of a liquid metal reactor, KALIMER (Korea Advanced Liquid Metal Reactor), under the long-term nuclear R and D program. In this report, descriptions of the KALIMER safety design features and safety analysis results for selected ATWS accidents are presented. First, the basic approach to achieving the safety goal is introduced in chapter 1, and the safety evaluation procedure for the KALIMER design is described in chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure design performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. They are categorized as bounding events (BEs) because of their low probability of occurrence. In chapter 4, the design of the KALIMER containment dome and the results of its performance analysis are presented. The designs of the existing LMR containment and the KALIMER containment dome are compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced in chapter 5 to investigate the core kinetics and hydraulic behavior during an HCDA. Mathematical formulations have been developed in the framework of the modified Bethe-Tait method, and scoping analyses have been performed for the KALIMER core behavior during super-prompt critical excursions.

  4. Application of the International Life Sciences Institute Key Events Dose-Response Framework to food contaminants.

    Science.gov (United States)

    Fenner-Crisp, Penelope A

    2012-12-01

Contaminants are undesirable constituents in food. They may be formed during production of a processed food, present as a component in a source material, deliberately added to substitute for the proper substance, or the consequence of poor food-handling practices. Contaminants may be chemicals or pathogens. Chemicals generally degrade over time and become of less concern as a health threat. Pathogens have the ability to multiply, potentially resulting in an increased threat level. Formal structures have been lacking for systematically generating and evaluating hazard and exposure data for bioactive agents when problem situations arise. We need to know what the potential risk may be to determine whether intervention to reduce or eliminate contact with the contaminant is warranted. We need tools to aid us in assembling and assessing all available relevant information in an expeditious and scientifically sound manner. One such tool is the International Life Sciences Institute (ILSI) Key Events Dose-Response Framework (KEDRF). Developed as an extension of the WHO International Programme on Chemical Safety/ILSI mode of action/human relevance framework, it allows risk assessors to understand not only how a contaminant exerts its toxicity but also the dose response(s) for each key event and the ultimate outcome, including whether a threshold exists. This presentation illustrates use of the KEDRF with case studies included in its development (chloroform and Listeria monocytogenes), after its publication in the peer-reviewed scientific literature (chromium VI), and in a work in progress (3-monochloro-1,2-propanediol).

  5. Dynamic Event Tree Analysis Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that the timing/sequencing of events and the system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPPs). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) control logic driver for the new thermal-hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, a Dynamic PRA analysis, using a Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.
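A dynamic event tree differs from a classical one in that branchings are attached to points in simulated time. The toy model below (all names and numbers invented: diesel-generator start attempts racing a battery-depletion deadline after a station blackout) enumerates the branches depth-first, in the spirit of a DET but far simpler than RAVEN's coupled simulation:

```python
# Invented scenario: after a station blackout, the diesel generator may
# be started at each branching time; batteries last `battery_life` hours.
branch_times = [1.0, 2.0, 3.0]   # hours at which a start is attempted
p_start = 0.7                    # per-attempt start probability
battery_life = 2.5               # hours of DC power available

def expand(i=0, prob=1.0, recovered_at=None):
    """Depth-first walk of the dynamic event tree; returns a list of
    (path probability, recovery time or None, core damage flag)."""
    if recovered_at is not None or i == len(branch_times):
        damage = recovered_at is None or recovered_at > battery_life
        return [(prob, recovered_at, damage)]
    t = branch_times[i]
    return (expand(i + 1, prob * p_start, t)              # start succeeds at t
            + expand(i + 1, prob * (1 - p_start), None))  # attempt fails

paths = expand()
total = sum(p for p, _, _ in paths)        # should be 1.0
cdp = sum(p for p, _, d in paths if d)     # core damage probability
```

Because recovery at 3.0 h comes after the 2.5 h battery deadline, both the late-recovery and never-recovered paths end in core damage, showing how outcome depends on event timing and not just on which events occur.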

  6. An application of different dioids in public key cryptography

    International Nuclear Information System (INIS)

    Durcheva, Mariana I.

    2014-01-01

Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, a digital signature scheme is presented.

  7. An application of different dioids in public key cryptography

    Energy Technology Data Exchange (ETDEWEB)

    Durcheva, Mariana I., E-mail: mdurcheva66@gmail.com [Technical University of Sofia, Faculty of Applied Mathematics and Informatics, 8 Kliment Ohridski St., Sofia 1000 (Bulgaria)

    2014-11-18

Dioids provide a natural framework for analyzing a broad class of discrete event dynamical systems such as the design and analysis of bus and railway timetables, scheduling of high-throughput industrial processes, solution of combinatorial optimization problems, and the analysis and improvement of flow systems in communication networks. They have appeared in several branches of mathematics such as functional analysis, optimization, stochastic systems and dynamic programming, tropical geometry, and fuzzy logic. In this paper we show how to involve dioids in public key cryptography. The main goal is to create key-exchange protocols based on dioids. Additionally, a digital signature scheme is presented.
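A Diffie-Hellman-style exchange over a dioid can be sketched in the max-plus semiring, where tropical polynomials evaluated at a shared public matrix commute under the max-plus product. This is an illustrative construction in the spirit of tropical key exchange, not necessarily the scheme of these papers:

```python
import random
random.seed(7)
NEG = float("-inf")

def mul(A, B):
    """Max-plus matrix product: (A (x) B)_ij = max_k (A_ik + B_kj)."""
    n = len(A)
    return [[max(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def poly(coeffs, M):
    """Max-plus polynomial p(M) = (+)_k (c_k (x) M^k), where (+) is
    the entrywise max and M^0 is the max-plus identity matrix."""
    n = len(M)
    acc = [[NEG] * n for _ in range(n)]
    P = [[0 if i == j else NEG for j in range(n)] for i in range(n)]
    for c in coeffs:
        acc = [[max(acc[i][j], c + P[i][j]) for j in range(n)]
               for i in range(n)]
        P = mul(P, M)
    return acc

M = [[random.randint(-5, 5) for _ in range(3)] for _ in range(3)]  # public
a = [random.randint(0, 4) for _ in range(4)]   # Alice's secret coefficients
b = [random.randint(0, 4) for _ in range(4)]   # Bob's secret coefficients

PA, PB = poly(a, M), poly(b, M)                # the exchanged matrices
key_alice = mul(poly(a, M), PB)
key_bob = mul(PA, poly(b, M))                  # both equal p_a(M) (x) p_b(M)
```

The shared key exists because distributivity and the commuting powers of M give p_a(M) ⊗ p_b(M) = p_b(M) ⊗ p_a(M), the dioid analogue of commuting exponentiation in classical Diffie-Hellman.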

  8. Top event prevention analysis: A deterministic use of PRA

    International Nuclear Information System (INIS)

    Worrell, R.B.; Blanchard, D.P.

    1996-01-01

This paper describes the application of Top Event Prevention Analysis. The analysis finds prevention sets, which are combinations of basic events that can prevent the occurrence of a fault tree top event such as core damage. The problem analyzed in this application is that of choosing a subset of Motor-Operated Valves (MOVs) for testing under the Generic Letter 89-10 program such that the desired level of safety is achieved while providing economic relief from the burden of testing all safety-related valves. A brief summary of the method is given, and the process used to produce a core damage expression from Level 1 PRA models for a PWR is described. The analysis provides an alternative to the use of importance measures for finding the important combinations of events in a core damage expression. This application of Top Event Prevention Analysis to the MOV problem was achieved with currently available software.
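Finding prevention sets amounts to finding (minimal) hitting sets of the cut sets: a set of basic events whose guaranteed prevention intersects every cut set, so the top event cannot occur. A brute-force sketch on an invented MOV example:

```python
from itertools import combinations

# Invented minimal cut sets of a core-damage top event.
cut_sets = [{"MOV1", "MOV2"}, {"MOV1", "PUMP"}, {"MOV3"}, {"MOV2", "MOV3"}]
events = sorted(set().union(*cut_sets))

def is_prevention_set(candidate):
    """Guaranteeing these events breaks (intersects) every cut set."""
    return all(cs & candidate for cs in cut_sets)

# Brute-force search for the smallest prevention sets.
smallest = None
for r in range(1, len(events) + 1):
    found = [set(c) for c in combinations(events, r)
             if is_prevention_set(set(c))]
    if found:
        smallest = found
        break
```

In the GL 89-10 setting, such a smallest prevention set corresponds to the subset of valves whose testing alone guarantees the desired level of safety; real core damage expressions need more scalable algorithms than this exhaustive search.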

  9. Spacecraft-to-Earth Communications for Juno and Mars Science Laboratory Critical Events

    Science.gov (United States)

    Soriano, Melissa; Finley, Susan; Jongeling, Andre; Fort, David; Goodhart, Charles; Rogstad, David; Navarro, Robert

    2012-01-01

Deep space communications typically utilize closed-loop receivers and Binary Phase Shift Keying (BPSK) or Quadrature Phase Shift Keying (QPSK). Critical spacecraft events include orbit insertion and entry, descent, and landing. Low-gain antennas yield a low signal-to-noise ratio, and high dynamics such as parachute deployment or spin introduce Doppler shift. During critical events, open-loop receivers and Multiple Frequency Shift Keying (MFSK) are therefore used. The Entry, Descent, and Landing (EDL) Data Analysis (EDA) system detects the tones in real time.
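Open-loop MFSK tone detection reduces to an energy search over candidate frequency bins of an FFT, with no carrier or symbol tracking loop. The tone set, symbol length, sample rate, and noise level below are illustrative assumptions, not the EDA system's parameters:

```python
import numpy as np

fs = 8000.0                                  # sample rate (assumed)
tones = [1000.0, 1500.0, 2000.0, 2500.0]     # 4-ary MFSK alphabet (assumed)
t = np.arange(0, 0.05, 1 / fs)               # one 50 ms symbol

rng = np.random.default_rng(5)
tx = 2                                       # transmitted symbol index
x = np.sin(2 * np.pi * tones[tx] * t) + rng.normal(0.0, 1.0, t.size)

# Open-loop detection: no carrier tracking; just pick the strongest
# spectral bin among the candidate tone frequencies.
X = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
bins = [int(np.argmin(np.abs(freqs - f))) for f in tones]
detected = int(np.argmax([X[b] for b in bins]))
```

Because detection only asks which bin holds the most energy, it tolerates the low SNR of a low-gain antenna far better than a closed tracking loop would; handling large Doppler shift would additionally require searching over frequency offsets.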

  10. Synopsis of key persons, events, and associations in the history of Latino psychology.

    Science.gov (United States)

    Padilla, Amado M; Olmedo, Esteban

    2009-10-01

    In this article, we present a brief synopsis of six early Latino psychologists, several key conferences, the establishment of research centers, and early efforts to create an association for Latino psychologists. Our chronology runs from approximately 1930 to 2000. This history is a firsthand account of how these early leaders, conferences, and efforts to bring Latinos and Latinas together served as a backdrop to current research and practice in Latino psychology. This history of individuals and events is also intertwined with the American Psychological Association and the National Institute of Mental Health and efforts by Latino psychologists to obtain the professional support necessary to lay down the roots of a Latino presence in psychology. Copyright 2009 APA, all rights reserved.

  11. Event analysis in a primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Jaerventausta, P; Paulasaari, H [Tampere Univ. of Technology (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

The target of the project was to develop applications which observe the functioning of a protection system by using modern microprocessor-based relays. Microprocessor-based relays have three essential capabilities: communication with the SCADA, an internal clock to produce time-stamped event data, and the capability to register certain values during a fault. Using the above features, some new functions for event analysis were developed in the project.

  12. Data analysis of event tape and connection

    International Nuclear Information System (INIS)

    Gong Huili

    1995-01-01

The data analysis on the VAX-11/780 computer is briefly described; the data come from the event tapes recorded by the JUHU data acquisition system on the PDP-11/44 computer. The connection of the recorded event tapes to the XSYS data acquisition system on the VAX computer is also introduced.

  13. ANALYSIS OF EVENT TOURISM IN RUSSIA, ITS FUNCTIONS, WAYS TO IMPROVE THE EFFICIENCY OF EVENT

    Directory of Open Access Journals (Sweden)

    Mikhail Yur'evich Grushin

    2016-01-01

This article considers one of the important directions of development of the national economy in the area of tourist services: the development of event tourism in the Russian Federation. The market of event management in Russia is still in the process of formation, so its impact on the socio-economic development of the regions and of Russia as a whole is minimal, and analyses of this influence are not performed. The problem comes to the fore in the regions of Russia that specialize in creating event-oriented tourist-recreational clusters. The article analyzes the existing event-management market and the functions of event tourism, proposes ways to improve the efficiency of event management, and offers recommendations for event organizers in the regions. It shows the specific role of event tourism within national tourism and provides directions for developing organizational and methodological recommendations on its formation in the regions of Russia and on the creation of an effective management system at the regional level. The purpose of this article is to analyze the emerging event tourism market in Russia and its specifics. On the basis of these studies, the formation patterns of the new market are considered and its impact on the modern national tourism industry is assessed. Methodology. Comparative and economic-statistical analysis methods are used. Conclusions/significance. The practical importance of this article lies in resolving a contradiction existing in the national tourism industry: on the one hand, a large number of events, including world-class ones, are held annually in all regions of the Russian Federation, and tourist trips to events are spoken of, yet event tourism as such does not yet exist. All regions have domestic and inbound tourism, but it has little connection with event tourism. The article's practical conclusions demonstrate the need to adapt the

  14. Repeated Time-to-event Analysis of Consecutive Analgesic Events in Postoperative Pain

    DEFF Research Database (Denmark)

    Juul, Rasmus Vestergaard; Rasmussen, Sten; Kreilgaard, Mads

    2015-01-01

BACKGROUND: Reduction in consumption of opioid rescue medication is often used as an endpoint when investigating analgesic efficacy of drugs by adjunct treatment, but appropriate methods are needed to analyze analgesic consumption in time. Repeated time-to-event (RTTE) modeling is proposed as a way to describe analgesic consumption by analyzing the timing of consecutive analgesic events. METHODS: Retrospective data were obtained from 63 patients receiving standard analgesic treatment including morphine on request after surgery following hip fracture. Times of analgesic events up to 96 h after surgery were extracted from hospital medical records. Parametric RTTE analysis was performed with exponential, Weibull, or Gompertz distribution of analgesic events using NONMEM®, version 7.2 (ICON Development Solutions, USA). The potential influences of night versus day, sex, and age were investigated...
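For the simplest member of the model family above, a constant-hazard (exponential) repeated time-to-event model, the maximum-likelihood rate estimate has a closed form: total events divided by total time at risk. The simulation below checks this on synthetic data (cohort size and rate invented); it is a sketch of the model idea, not of the NONMEM analysis:

```python
import random
random.seed(11)

def simulate_subject(rate, follow_up):
    """Times of consecutive analgesic requests under a constant hazard."""
    t, events = 0.0, []
    while True:
        t += random.expovariate(rate)
        if t > follow_up:
            return events
        events.append(t)

follow_up = 96.0     # hours of follow-up, as in the study design
true_rate = 0.25     # requests per hour (illustrative value)
subjects = [simulate_subject(true_rate, follow_up) for _ in range(200)]

# For a constant-hazard (exponential) RTTE model, the maximum-likelihood
# rate estimate is total events / total time at risk.
total_events = sum(len(s) for s in subjects)
rate_hat = total_events / (len(subjects) * follow_up)
```

Weibull or Gompertz hazards generalize this by letting the request rate decline as postoperative pain resolves, which is why the study compares those distributions.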

  15. Computer-aided event tree analysis by the impact vector method

    International Nuclear Information System (INIS)

    Lima, J.E.P.

    1984-01-01

    In the development of the Probabilistic Risk Analysis of Angra 1, the 'large event tree/small fault tree' approach was adopted for the analysis of plant behavior in an emergency situation. In this work, the event tree methodology is presented along with the adaptations that had to be made in order to attain a correct description of the safety system performance according to the selected analysis method. The problems appearing in the application of the methodology, and their respective solutions, are presented and discussed, with special emphasis on the impact vector technique. A description of the ETAP code ('Event Tree Analysis Program'), developed for constructing and quantifying event trees, is also given in this work. A preliminary version of the small-break LOCA analysis for Angra 1 is presented as an example of application of the methodology and of the code. It is shown that the use of the ETAP code significantly reduces the time spent on event tree analyses, making the practical application of the above analysis approach viable. (author) [pt
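
    The quantification step that a code like ETAP automates reduces, for a single sequence, to multiplying the initiating-event frequency by the conditional branch probabilities along the path through the tree. A minimal sketch with hypothetical numbers (not Angra 1 data):

```python
def quantify_sequence(ie_freq, branch_probs):
    """Frequency of one event-tree sequence: initiating-event frequency
    times the product of the conditional failure probabilities of the
    safety-system branches along the path."""
    f = ie_freq
    for p in branch_probs:
        f *= p
    return f

# Hypothetical sequence: small-break LOCA at 1e-3/yr followed by the
# failure of two mitigating systems with conditional probabilities 0.01, 0.05.
seq_freq = quantify_sequence(1e-3, [0.01, 0.05])
```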

  16. Trending analysis of precursor events

    International Nuclear Information System (INIS)

    Watanabe, Norio

    1998-01-01

    The Accident Sequence Precursor (ASP) Program of the United States Nuclear Regulatory Commission (U.S. NRC) identifies and categorizes operational events at nuclear power plants in terms of the potential for core damage. The ASP analysis has been performed on a yearly basis and the results have been published in annual reports. This paper describes the trends in initiating events and dominant sequences for the 459 precursors identified in the ASP Program during the 1969-94 period, and discusses a comparison with the dominant sequences predicted in past Probabilistic Risk Assessment (PRA) studies. These trends were examined for three time periods: 1969-81, 1984-87, and 1988-94. Although different models were used in the ASP analyses for these three periods, the distributions of precursors by dominant sequence show similar trends. For example, sequences involving loss of both main and auxiliary feedwater were identified in many PWR events, and sequences involving loss of both high- and low-pressure coolant injection were found in many BWR events. It was also found that these dominant sequences are comparable to those determined to be dominant in the predictions of past PRAs. In addition, a list of the 459 identified precursors is provided in an Appendix, indicating initiating event types, unavailable systems, dominant sequences, conditional core damage probabilities, and so on. (author)

  17. Extracting useful knowledge from event logs

    DEFF Research Database (Denmark)

    Djenouri, Youcef; Belhadi, Asma; Fournier-Viger, Philippe

    2018-01-01

    Business process analysis is a key activity that aims at increasing the efficiency of business operations. In recent years, several data mining based methods have been designed for discovering interesting patterns in event logs. A popular type of methods consists of applying frequent itemset mini...

  18. Serious adverse events with infliximab: analysis of spontaneously reported adverse events.

    Science.gov (United States)

    Hansen, Richard A; Gartlehner, Gerald; Powell, Gregory E; Sandler, Robert S

    2007-06-01

    Serious adverse events such as bowel obstruction, heart failure, infection, lymphoma, and neuropathy have been reported with infliximab. The aims of this study were to explore adverse event signals with infliximab by using a long period of post-marketing experience, stratifying by indication. The relative reporting of infliximab adverse events to the U.S. Food and Drug Administration (FDA) was assessed with the public release version of the adverse event reporting system (AERS) database from 1968 to third quarter 2005. On the basis of a systematic review of adverse events, Medical Dictionary for Regulatory Activities (MedDRA) terms were mapped to predefined categories of adverse events, including death, heart failure, hepatitis, infection, infusion reaction, lymphoma, myelosuppression, neuropathy, and obstruction. Disproportionality analysis was used to calculate the empiric Bayes geometric mean (EBGM) and corresponding 90% confidence intervals (EB05, EB95) for adverse event categories. Infliximab was identified as the suspect medication in 18,220 reports in the FDA AERS database. We identified a signal for lymphoma (EB05 = 6.9), neuropathy (EB05 = 3.8), infection (EB05 = 2.9), and bowel obstruction (EB05 = 2.8). The signal for granulomatous infections was stronger than the signal for non-granulomatous infections (EB05 = 12.6 and 2.4, respectively). The signals for bowel obstruction and infusion reaction were specific to patients with IBD; this suggests potential confounding by indication, especially for bowel obstruction. In light of this additional evidence of risk of lymphoma, neuropathy, and granulomatous infections, clinicians should stress this risk in the shared decision-making process.
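
    Disproportionality analysis of the kind described compares how often an event is reported with the suspect drug versus with all other drugs. The empiric Bayes method (EBGM) adds shrinkage machinery on top, but the underlying signal statistic can be illustrated with the simpler proportional reporting ratio, computed here on hypothetical counts:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for a 2x2 drug-event table:
    a = reports of the event with the drug,   b = other events with the drug,
    c = the event with all other drugs,       d = other events, other drugs."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts for one adverse-event category (not FDA AERS data).
signal = prr(120, 5000, 300, 200000)
```

    A ratio well above 1 suggests the event is reported disproportionately often with the drug; EBGM shrinks such ratios toward 1 when counts are small.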

  19. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepis, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease of subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  20. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan human reliability analysis (IJS-HRA) and standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance

  1. Multi-Unit Initiating Event Analysis for a Single-Unit Internal Events Level 1 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong San; Park, Jin Hee; Lim, Ho Gon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The Fukushima nuclear accident in 2011 highlighted the importance of considering the risks from multi-unit accidents at a site. The ASME/ANS probabilistic risk assessment (PRA) standard also includes some requirements related to multi-unit aspects, one of which (IE-B5) is as follows: 'For multi-unit sites with shared systems, DO NOT SUBSUME multi-unit initiating events if they impact mitigation capability [1].' However, the existing single-unit PSA models do not explicitly consider multi-unit initiating events, and hence systems shared by multiple units (e.g., the alternate AC diesel generator) are fully credited to the single unit, ignoring the needs of the other units at the same site [2]. This paper describes the results of the multi-unit initiating event (IE) analysis performed as a part of the at-power internal events Level 1 probabilistic safety assessment (PSA) for an OPR1000 single unit ('reference unit'). In this study, a multi-unit initiating event analysis for a single-unit PSA was performed and, using the results, a dual-unit LOOP initiating event was added to the existing PSA model for the reference unit (OPR1000 type). Event trees were developed for dual-unit LOOP and for dual-unit SBO, which can be transferred from dual-unit LOOP. Moreover, CCF basic events for the 5 diesel generators were modelled. For simultaneous SBO occurrences in both units, this study compared two different assumptions on the availability of the AAC D/G. As a result, when the dual-unit LOOP initiating event was added to the existing single-unit PSA model, the total CDF increased by 1∼2% depending on the probability that the AAC D/G is available to a specific unit in case of simultaneous SBO in both units.
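
    The effect of crediting a shared AAC D/G can be sketched with a screening-level station-blackout estimate. The beta-factor common-cause model and every number below are illustrative assumptions for the mechanism only, not values or structure from the OPR1000 study:

```python
def sbo_frequency(loop_freq, p_dg_fail, n_dgs, beta, p_aac_unavail):
    """Screening estimate of station-blackout frequency (hypothetical):
    LOOP frequency times failure of all emergency DGs, using a simple
    beta-factor common-cause split, times unavailability of the shared AAC DG."""
    p_ccf = beta * p_dg_fail                    # common-cause contribution
    p_indep = (1.0 - beta) * p_dg_fail          # independent contribution
    p_all_dgs = p_ccf + p_indep ** n_dgs
    return loop_freq * p_all_dgs * p_aac_unavail

# Illustrative inputs: LOOP 1e-2/yr, DG failure 2e-2, beta 0.05, AAC busy 10%.
f_sbo = sbo_frequency(1e-2, 2e-2, 2, 0.05, 0.1)
```

    Making the AAC D/G unavailable (last factor set to 1.0) shows directly how much a single-unit model over-credits a system that the neighbouring unit may claim.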

  2. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  3. Analysis of Paks NPP Personnel Activity during Safety Related Event Sequences

    International Nuclear Information System (INIS)

    Bareith, A.; Hollo, Elod; Karsa, Z.; Nagy, S.

    1998-01-01

    Within the AGNES Project (Advanced Generic and New Evaluation of Safety), the Level-1 PSA model of Paks NPP Unit 3 was developed in the form of a detailed event tree/fault tree structure (53 initiating events, 580 event sequences, and 6300 basic events are involved). This model gives a good basis for quantitative evaluation of the potential consequences of actually occurring safety-related events, i.e. for precursor event studies. To make these studies possible and efficient, the current qualitative event analysis practice should be reviewed, and a new additional quantitative analysis procedure and system should be developed and applied. The present paper gives an overview of the method outlined for both qualitative and quantitative analyses of operator crew activity during off-normal situations. First, the operator performance experienced during past operational events is discussed. Sources of raw information, the qualitative evaluation process, the follow-up actions, as well as the documentation requirements are described. Second, the general concept of the proposed precursor event analysis is described. The types of modeled interactions and the considered performance influences are presented. The quantification of the potential consequences of the identified precursor events is based on the task-oriented Level-1 PSA model of the plant unit. A precursor analysis system covering the evaluation of operator activities is now under development. Preliminary results gained during a case study evaluation of a past historical event are presented. (authors)

  4. Cause analysis and preventives for human error events in Daya Bay NPP

    International Nuclear Information System (INIS)

    Huang Weigang; Zhang Li

    1998-01-01

    Daya Bay Nuclear Power Plant was put into commercial operation in 1994. Up to 1996, there were 368 human error events in the operating and maintenance areas, accounting for 39% of total events. These events occurred mainly in the processes of maintenance, testing, equipment isolation, and bringing systems on-line, in particular during refuelling and maintenance. The author analyses the root causes of these human error events, which are mainly: operator omission or error; procedure deficiency; procedure not followed; lack of training; communication failures; and inadequate work management. The preventive measures and treatment principles for human error events are also discussed, and several examples of their application are given. Finally, it is argued that the key to preventing human error events lies in coordination and management, in the person in charge of the work, and in the good work habits of the staff

  5. Parallel processor for fast event analysis

    International Nuclear Information System (INIS)

    Hensley, D.C.

    1983-01-01

    Current maximum data rates from the Spin Spectrometer of approx. 5000 events/s (up to 1.3 MBytes/s), combined with a minimum analysis requiring at least 3000 operations/event, call for a CPU cycle time near 70 ns. In order to achieve an effective cycle time of 70 ns, a parallel processing device is proposed in which up to 4 independent processors are implemented in parallel. The individual processors are designed around the Am2910 Microsequencer, the Am29116 μP, and the Am29517 Multiplier. Satellite histogramming in a mass memory system will be managed by a commercial 16-bit μP system

  6. Analysis of event-mode data with Interactive Data Language

    International Nuclear Information System (INIS)

    De Young, P.A.; Hilldore, B.B.; Kiessel, L.M.; Peaslee, G.F.

    2003-01-01

    We have developed an analysis package for event-mode data based on Interactive Data Language (IDL) from Research Systems Inc. This high-level language is fast, array-oriented, and object-oriented, and has extensive visual (multi-dimensional plotting) and mathematical functions. We have developed a general framework, written in IDL, for the analysis of a variety of experimental data that does not require significant customization for each analysis. Unlike many traditional analysis packages, spectra and gates are applied after the data are read and are easily changed as the analysis proceeds, without rereading the data. The events are not sequentially processed into predetermined arrays subject to predetermined gates
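
    The design point of the abstract, gates applied after the data are read so they can be changed without rereading, is easy to mirror outside IDL. A hypothetical Python sketch of event-mode gating and histogramming (names and values are illustrative):

```python
from collections import Counter

def histogram(events, value, gate=lambda e: True, nbins=16, lo=0.0, hi=4096.0):
    """Bin a derived quantity over gated events. Gates are plain predicates,
    so they can be swapped and re-applied to the in-memory event list
    without rereading the raw data."""
    width = (hi - lo) / nbins
    h = Counter()
    for e in events:
        if gate(e):
            v = value(e)
            if lo <= v < hi:
                h[int((v - lo) // width)] += 1
    return h

# Hypothetical event stream: each event carries two detector amplitudes.
events = [{"e1": 100.0 * i, "e2": 50.0 * i} for i in range(40)]
h = histogram(events, value=lambda e: e["e1"], gate=lambda e: e["e2"] > 500.0)
```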

  7. Attack Graph Construction for Security Events Analysis

    Directory of Open Access Journals (Sweden)

    Andrey Alexeevich Chechulin

    2014-09-01

    Full Text Available The paper is devoted to investigation of the attack graph construction and analysis task for network security evaluation and real-time security event processing. The main object of this research is the attack modeling process. The paper describes the technique for building, modifying, and analysing attack graphs, as well as an overview of an implemented prototype for network security analysis based on the attack graph approach.
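
    An attack graph in its simplest form is a reachability structure over hosts and privileges. The sketch below enumerates simple attack paths by breadth-first search over a hypothetical network; real tools additionally attach exploit preconditions and scores to each edge:

```python
from collections import deque

def attack_paths(graph, start, target):
    """Enumerate simple paths (no revisited node) from an attacker's
    foothold to a target asset in a host/privilege reachability graph."""
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            paths.append(path)
            continue
        for nxt in graph.get(node, ()):
            if nxt not in path:  # keep paths simple
                queue.append(path + [nxt])
    return paths

# Hypothetical network: two footholds lead through an app server to the DB.
g = {"internet": ["web", "mail"], "web": ["app"], "mail": ["app"], "app": ["db"]}
paths = attack_paths(g, "internet", "db")
```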

  8. Simple Public Key Infrastructure Protocol Analysis and Design

    National Research Council Canada - National Science Library

    Vidergar, Alexander G

    2005-01-01

    ...). This thesis aims at proving the applicability of the Simple Public Key Infrastructure (SPKI) as a means of PKC. The strand space approach of Guttman and Thayer is used to provide an appropriate model for analysis...

  9. Analysis of Faraday Mirror in Auto-Compensating Quantum Key Distribution

    International Nuclear Information System (INIS)

    Wei Ke-Jin; Ma Hai-Qiang; Li Rui-Xue; Zhu Wu; Liu Hong-Wei; Zhang Yong; Jiao Rong-Zhen

    2015-01-01

    The ‘plug and play’ quantum key distribution system is the most stable and the earliest commercial system in the quantum communication field. The Jones matrix and Jones calculus are widely used in the analysis of this system and of its improved version, the auto-compensating quantum key distribution system. Unfortunately, existing analyses have two drawbacks: only the auto-compensating process is analyzed, and the laser phase imparted by a Faraday mirror (FM) is not fully considered. In this work, we present a detailed Jones-calculus analysis of the output of a light pulse transmitted through a plug and play quantum key distribution system that contains only an FM. A similar analysis is made for a home-made auto-compensating system that contains two FMs to compensate for environmental effects. More importantly, we show that theoretical and experimental results differ in the plug and play interferometric setup because the conventional Jones matrix of an FM neglects an additional phase π on the orthogonal polarization direction. To resolve this problem, we give a new Jones matrix of an FM according to the coordinate rotation. This new Jones matrix not only resolves the above contradiction in the plug and play interferometric setup, but is also suitable for previous analyses of auto-compensating quantum key distribution. (paper)
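
    The compensation property at the heart of the auto-compensating scheme can be checked numerically. In one common textbook convention, an ideal FM is FM = [[0, 1], [-1, 0]], and the return pass through a reciprocal fiber is the transpose of the forward Jones matrix; the algebraic identity J^T FM J = det(J) FM then makes the fiber birefringence drop out. This is a sketch of that convention only, not the corrected FM matrix derived in the paper:

```python
import numpy as np

# Ideal Faraday mirror in one common Jones-calculus convention.
FM = np.array([[0.0, 1.0], [-1.0, 0.0]])

def double_pass(J, fm=FM):
    """Forward pass J, reflection at the FM, return pass J.T (reciprocal fiber)."""
    return J.T @ fm @ J

# Arbitrary unitary fiber birefringence (hypothetical parameters).
theta, phi = 0.7, 1.3
J = np.array([[np.cos(theta), -np.sin(theta) * np.exp(1j * phi)],
              [np.sin(theta) * np.exp(-1j * phi), np.cos(theta)]])

round_trip = double_pass(J)
# J.T @ FM @ J == det(J) * FM for ANY 2x2 J, so the birefringence cancels:
# this is why the FM auto-compensates environmental effects in the fiber.
```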

  10. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    CERN Document Server

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs Boson and super-symmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate with the ATLAS detector, the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and it is one of several analysis schemes for ATLAS physics user analysis. At the core of the EventView is the representative "view" of an event, which defines the contents of event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  11. Extreme Space Weather Events: From Cradle to Grave

    Science.gov (United States)

    Riley, Pete; Baker, Dan; Liu, Ying D.; Verronen, Pekka; Singer, Howard; Güdel, Manuel

    2018-02-01

    Extreme space weather events, while rare, can have a substantial impact on our technologically-dependent society. And, although such events have only occasionally been observed, through careful analysis of a wealth of space-based and ground-based observations, historical records, and extrapolations from more moderate events, we have developed a basic picture of the components required to produce them. Several key issues, however, remain unresolved. For example, what limits are imposed on the maximum size of such events? What are the likely societal consequences of a so-called "100-year" solar storm? In this review, we summarize our current scientific understanding of extreme space weather events as we follow several examples from the Sun, through the solar corona and inner heliosphere, across the magnetospheric boundary, into the ionosphere and atmosphere, into the Earth's lithosphere, and, finally, to their impact on man-made structures and activities, such as spacecraft, GPS signals, radio communication, and the electric power grid. We describe preliminary attempts to provide probabilistic forecasts of extreme space weather phenomena, and we conclude by identifying several key areas that must be addressed if we are to better understand and, ultimately, predict extreme space weather events.

  12. Physics analysis of the gang partial rod drive event

    International Nuclear Information System (INIS)

    Boman, C.; Frost, R.L.

    1992-08-01

    During the routine positioning of partial-length control rods in Gang 3 on the afternoon of Monday, July 27, 1992, the partial-length rods continued to drive into the reactor even after the operator released the controlling toggle switch. In response to this occurrence, the Safety Analysis and Engineering Services Group (SAEG) requested that the Applied Physics Group (APG) analyze the gang partial rod drive event. Although similar accident scenarios were considered in analysis for Chapter 15 of the Safety Analysis Report (SAR), APG and SAEG conferred and agreed that this particular type of gang partial-length rod motion event was not included in the SAR. This report details this analysis

  13. Multistate event history analysis with frailty

    Directory of Open Access Journals (Sweden)

    Govert Bijwaard

    2014-05-01

    Full Text Available Background: In survival analysis a large literature using frailty models, or models with unobserved heterogeneity, exists. In the growing literature and modelling on multistate models, this issue is only in its infancy. Ignoring frailty can, however, produce incorrect results. Objective: This paper presents how frailties can be incorporated into multistate models, with an emphasis on semi-Markov multistate models with a mixed proportional hazard structure. Methods: First, the aspects of frailty modeling in univariate (proportional hazard, Cox) and multivariate event history models are addressed. The implications of choosing shared or correlated frailty are highlighted. The relevant differences with recurrent events data are covered next. Multistate models are event history models that can have both multivariate and recurrent events. Incorporating frailty in multistate models therefore brings all the previously addressed issues together. Assuming a discrete frailty distribution allows for a very general correlation structure among the transition hazards in a multistate model. Although some estimation procedures are covered, the emphasis is on conceptual issues. Results: The importance of multistate frailty modeling is illustrated with data on labour market and migration dynamics of recent immigrants to the Netherlands.
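
    The effect of a shared frailty is easy to demonstrate by simulation: conditional on a gamma frailty Z, two transition times are independent exponentials, yet marginally they become positively correlated because both hazards scale with the same Z. A hypothetical sketch (all rates and the frailty shape are illustrative):

```python
import random

def sample_times(n, rate1, rate2, frailty_shape, seed=1):
    """Paired transition times whose hazards share a gamma frailty Z (mean 1):
    conditional on Z, T1 ~ Exp(Z * rate1) and T2 ~ Exp(Z * rate2)."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z = rng.gammavariate(frailty_shape, 1.0 / frailty_shape)
        pairs.append((rng.expovariate(z * rate1), rng.expovariate(z * rate2)))
    return pairs

def corr(pairs):
    """Sample Pearson correlation of the paired times."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxy = sum((x - mx) * (y - my) for x, y in pairs)
    sxx = sum((x - mx) ** 2 for x, _ in pairs)
    syy = sum((y - my) ** 2 for _, y in pairs)
    return sxy / (sxx * syy) ** 0.5

pairs = sample_times(4000, 0.5, 0.2, frailty_shape=2.0)
rho = corr(pairs)  # positive: the shared frailty induces dependence
```

    Ignoring this induced dependence is exactly the modeling error the abstract warns about.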

  14. Probabilistic analysis of extreme wind events

    Energy Technology Data Exchange (ETDEWEB)

    Chaviaropoulos, P.K. [Center for Renewable Energy Sources (CRES), Pikermi Attikis (Greece)

    1997-12-31

    A vital task in wind engineering and meteorology is to understand, measure, analyse and forecast extreme wind conditions, due to their significant effects on human activities and installations like buildings, bridges or wind turbines. The latest version of the IEC standard (1996) pays particular attention to the extreme wind events that have to be taken into account when designing or certifying a wind generator. Actually, the extreme wind events within a 50 year period are those which determine the ``static`` design of most of the wind turbine components. The extremes which are important for the safety of wind generators are those associated with the so-called ``survival wind speed``, the extreme operating gusts and the extreme wind direction changes. A probabilistic approach for the analysis of these events is proposed in this paper. Emphasis is put on establishing the relation between extreme values and physically meaningful ``site calibration`` parameters, like the probability distribution of the annual wind speed, turbulence intensity and power spectra properties. (Author)
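
    A standard building block of such a probabilistic approach is fitting an extreme-value distribution to annual maxima and reading off a 50-year return level. The sketch below uses a method-of-moments Gumbel fit on hypothetical wind data; an IEC-style analysis would layer turbulence and gust models on top:

```python
import math

EULER_GAMMA = 0.5772156649015329

def gumbel_fit(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale parameter
    mu = mean - EULER_GAMMA * beta          # location parameter
    return mu, beta

def return_level(mu, beta, T):
    """Wind speed exceeded on average once every T years under the fit."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual maximum 10-min mean wind speeds (m/s).
maxima = [22.1, 25.3, 21.8, 27.9, 24.4, 23.0, 26.1, 22.7, 28.4, 24.9]
mu, beta = gumbel_fit(maxima)
v50 = return_level(mu, beta, 50.0)
```

    The 50-year level sits well above every observed maximum, which is the point of extrapolating with a fitted tail rather than taking the sample maximum.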

  15. High-dimensional quantum key distribution with the entangled single-photon-added coherent state

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yang [Zhengzhou Information Science and Technology Institute, Zhengzhou, 450001 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Bao, Wan-Su, E-mail: 2010thzz@sina.com [Zhengzhou Information Science and Technology Institute, Zhengzhou, 450001 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China); Bao, Hai-Ze; Zhou, Chun; Jiang, Mu-Sheng; Li, Hong-Wei [Zhengzhou Information Science and Technology Institute, Zhengzhou, 450001 (China); Synergetic Innovation Center of Quantum Information and Quantum Physics, University of Science and Technology of China, Hefei, Anhui 230026 (China)

    2017-04-25

    High-dimensional quantum key distribution (HD-QKD) can generate more secure bits for one detection event so that it can achieve long distance key distribution with a high secret key capacity. In this Letter, we present a decoy state HD-QKD scheme with the entangled single-photon-added coherent state (ESPACS) source. We present two tight formulas to estimate the single-photon fraction of postselected events and Eve's Holevo information and derive lower bounds on the secret key capacity and the secret key rate of our protocol. We also present finite-key analysis for our protocol by using the Chernoff bound. Our numerical results show that our protocol using one decoy state can perform better than that of previous HD-QKD protocol with the spontaneous parametric down conversion (SPDC) using two decoy states. Moreover, when considering finite resources, the advantage is more obvious. - Highlights: • Implement the single-photon-added coherent state source into the high-dimensional quantum key distribution. • Enhance both the secret key capacity and the secret key rate compared with previous schemes. • Show an excellent performance in view of statistical fluctuations.

  16. High-dimensional quantum key distribution with the entangled single-photon-added coherent state

    International Nuclear Information System (INIS)

    Wang, Yang; Bao, Wan-Su; Bao, Hai-Ze; Zhou, Chun; Jiang, Mu-Sheng; Li, Hong-Wei

    2017-01-01

    High-dimensional quantum key distribution (HD-QKD) can generate more secure bits for one detection event so that it can achieve long distance key distribution with a high secret key capacity. In this Letter, we present a decoy state HD-QKD scheme with the entangled single-photon-added coherent state (ESPACS) source. We present two tight formulas to estimate the single-photon fraction of postselected events and Eve's Holevo information and derive lower bounds on the secret key capacity and the secret key rate of our protocol. We also present finite-key analysis for our protocol by using the Chernoff bound. Our numerical results show that our protocol using one decoy state can perform better than that of previous HD-QKD protocol with the spontaneous parametric down conversion (SPDC) using two decoy states. Moreover, when considering finite resources, the advantage is more obvious. - Highlights: • Implement the single-photon-added coherent state source into the high-dimensional quantum key distribution. • Enhance both the secret key capacity and the secret key rate compared with previous schemes. • Show an excellent performance in view of statistical fluctuations.
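
    The finite-key step mentioned in the abstract rests on concentration bounds of the Chernoff type: with finitely many detection events, observed rates can deviate from their expectations, and the bound quantifies by how much. A generic sketch of one standard multiplicative Chernoff bound (the protocol-specific bound in the paper is more involved):

```python
import math

def chernoff_upper(n, p, delta):
    """Chernoff bound on P[X >= (1 + delta) * n * p] for X ~ Binomial(n, p),
    valid for delta > 0: exp(-delta^2 * mu / (2 + delta)) with mu = n * p."""
    mu = n * p
    return math.exp(-delta * delta * mu / (2.0 + delta))

# Finite-key flavor: with 1e5 detection events at a 2% error rate, bound the
# probability that the observed error count exceeds its expectation by 20%.
bound = chernoff_upper(10**5, 0.02, 0.2)
```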

  17. Fumonisin exposure in women linked to inhibition of an enzyme that is a key event in farm and laboratory animal diseases.

    Science.gov (United States)

    Fumonisin B1 (FB1) is a toxic chemical produced by molds. The molds that produce fumonisin are common in corn. Consumption of contaminated corn by farm animals has been shown to be the cause of animal disease. The proximate cause (key event) in the induction of diseases in animals is inhibition of t...

  18. Key drivers and economic consequences of high-end climate scenarios: uncertainties and risks

    DEFF Research Database (Denmark)

    Halsnæs, Kirsten; Kaspersen, Per Skougaard; Drews, Martin

    2015-01-01

    The consequences of high-end climate scenarios and the risks of extreme events involve a number of critical assumptions and methodological challenges related to key uncertainties in climate scenarios and modelling, impact analysis, and economics. A methodological framework for integrated analysis...... of extreme events increase beyond scaling, and in combination with economic assumptions we find a very wide range of risk estimates for urban precipitation events. A sensitivity analysis addresses 32 combinations of climate scenarios, damage cost curve approaches, and economic assumptions, including risk...... aversion and equity represented by discount rates. Major impacts of alternative assumptions are investigated. As a result, this study demonstrates that in terms of decision making the actual expectations concerning future climate scenarios and the economic assumptions applied are very important...

  19. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses...... for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs....
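
    For readers unfamiliar with the SAX style, a minimal example: the handler receives start-element events as the parser streams the input, so nothing resembling a DOM tree is ever materialized. Python's stdlib xml.sax stands in here for the Java framework discussed in the paper:

```python
import xml.sax
from io import StringIO

class TagCounter(xml.sax.ContentHandler):
    """Streaming handler: counts element names without building a tree."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def startElement(self, name, attrs):
        # Called once per opening tag as the parser streams the document.
        self.counts[name] = self.counts.get(name, 0) + 1

handler = TagCounter()
xml.sax.parse(StringIO("<root><item/><item/><note>hi</note></root>"), handler)
```

    The static-analysis problem the paper tackles is the mirror image on the output side: proving that the events a program emits always assemble into well-formed, valid XML.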

  20. Event by event physics in ALICE

    CERN Document Server

    Christakoglou, Panos

    2009-01-01

    Fluctuations of thermodynamic quantities are fundamental for the study of the QGP phase transition. The ALICE experiment is well suited for precise event-by-event measurements of various quantities. In this article, we review the capabilities of ALICE to study the fluctuations of several key observables such as the net charge, the temperature, and the particle ratios. Among the observables related to correlations, we review the balance functions and the long range correlations.

  1. Internal event analysis of Laguna Verde Unit 1 Nuclear Power Plant. System Analysis

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the 'Internal Event Analysis of Laguna Verde Unit 1 Nuclear Power Plant', CNSNS-TR-004, in five volumes. The reports are organized as follows: CNSNS-TR-004 Volume 1: Introduction and Methodology. CNSNS-TR-004 Volume 2: Initiating Events and Accident Sequences. CNSNS-TR-004 Volume 3: System Analysis. CNSNS-TR-004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR-004 Volume 5: Appendices A, B and C. This volume presents the results of the system analysis for the Laguna Verde Unit 1 Nuclear Power Plant. The system analysis involved the development of logical models for all the systems included in the accident sequence event tree headings and for all the support systems required to operate the front-line systems. For the internal event analysis for Laguna Verde, 16 front-line systems and 5 support systems were included. Detailed fault trees were developed for most of the important systems. Simplified fault trees focusing on major faults were constructed for those systems that can be adequately represented using this kind of modeling. For those systems where fault tree models were not constructed, actual data were used to represent the dominant failures of the systems. The main failures included in the fault trees are hardware failures, test and maintenance unavailabilities, common cause failures, and human errors. The SETS and TEMAC codes were used to perform the qualitative and quantitative fault tree analyses. (Author)
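
    Once fault trees are reduced to minimal cut sets, quantification codes evaluate the top-event probability; to first order this is the rare-event approximation summed over cut sets. A sketch with hypothetical basic events (not Laguna Verde data, and not the actual SETS/TEMAC algorithms):

```python
def cutset_probability(cut_sets, basic_event_probs):
    """Rare-event approximation of the top-event probability: the sum over
    minimal cut sets of the product of their basic-event probabilities."""
    total = 0.0
    for cs in cut_sets:
        p = 1.0
        for e in cs:
            p *= basic_event_probs[e]
        total += p
    return total

# Hypothetical system: fails if (pump A AND pump B) or (valve AND power).
probs = {"pump_a": 1e-2, "pump_b": 2e-2, "valve": 1e-3, "power": 5e-3}
p_top = cutset_probability([["pump_a", "pump_b"], ["valve", "power"]], probs)
```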

  2. Florbetaben PET in the Early Diagnosis of Alzheimer's Disease: A Discrete Event Simulation to Explore Its Potential Value and Key Data Gaps

    Science.gov (United States)

    Guo, Shien; Getsios, Denis; Hernandez, Luis; Cho, Kelly; Lawler, Elizabeth; Altincatal, Arman; Lanes, Stephan; Blankenburg, Michael

    2012-01-01

    The growing understanding of the use of biomarkers in Alzheimer's disease (AD) may enable physicians to make more accurate and timely diagnoses. Florbetaben, a beta-amyloid tracer used with positron emission tomography (PET), is one of these diagnostic biomarkers. This analysis was undertaken to explore the potential value of florbetaben PET in the diagnosis of AD among patients with suspected dementia and to identify key data that are needed to further substantiate its value. A discrete event simulation was developed to conduct exploratory analyses from both US payer and societal perspectives. The model simulates the lifetime course of disease progression for individuals, evaluating the impact of their patient management from initial diagnostic work-up to final diagnosis. Model inputs were obtained from specific analyses of a large longitudinal dataset from the New England Veterans Healthcare System and supplemented with data from public data sources and assumptions. The analyses indicate that florbetaben PET has the potential to improve patient outcomes and reduce costs under certain scenarios. Key data on the use of florbetaben PET, such as its influence on time to confirmation of final diagnosis, treatment uptake, and treatment persistency, are unavailable and would be required to confirm its value. PMID:23326754

  3. Florbetaben PET in the Early Diagnosis of Alzheimer's Disease: A Discrete Event Simulation to Explore Its Potential Value and Key Data Gaps

    Directory of Open Access Journals (Sweden)

    Shien Guo

    2012-01-01

    Full Text Available The growing understanding of the use of biomarkers in Alzheimer's disease (AD) may enable physicians to make more accurate and timely diagnoses. Florbetaben, a beta-amyloid tracer used with positron emission tomography (PET), is one of these diagnostic biomarkers. This analysis was undertaken to explore the potential value of florbetaben PET in the diagnosis of AD among patients with suspected dementia and to identify key data that are needed to further substantiate its value. A discrete event simulation was developed to conduct exploratory analyses from both US payer and societal perspectives. The model simulates the lifetime course of disease progression for individuals, evaluating the impact of their patient management from initial diagnostic work-up to final diagnosis. Model inputs were obtained from specific analyses of a large longitudinal dataset from the New England Veterans Healthcare System and supplemented with data from public data sources and assumptions. The analyses indicate that florbetaben PET has the potential to improve patient outcomes and reduce costs under certain scenarios. Key data on the use of florbetaben PET, such as its influence on time to confirmation of final diagnosis, treatment uptake, and treatment persistency, are unavailable and would be required to confirm its value.

  4. Event Shape Analysis in ALICE

    CERN Document Server

    AUTHOR|(CDS)2073367; Paic, Guy

    2009-01-01

    Jets are the final-state manifestation of hard parton scattering. Since at LHC energies the production of hard processes in proton-proton collisions will be copious and varied, it is important to develop methods to identify them through the study of their final states. In the present work we describe a method based on the use of shape variables to discriminate events according to their topologies. A very attractive feature of this analysis is the possibility of using the tracking information of the TPC+ITS to identify specific events such as jets. Through the correlation between two quantities, thrust and recoil, calculated in minimum bias simulations of proton-proton collisions at 10 TeV, we show the sensitivity of the method to select specific topologies and high multiplicity. The presented results were obtained both at generator level and after reconstruction. It remains that with any kind of jet reconstruction algorithm one will in general be confronted with overlapping jets. The present meth...
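The thrust and recoil variables named in this abstract can be sketched numerically. The following is a toy illustration (not the ALICE analysis code); the brute-force scan over candidate axis angles is a simplifying assumption for the transverse plane:

```python
import math

def transverse_thrust(px, py, n_steps=3600):
    """Approximate transverse thrust: max over axis directions n of
    sum(|p_i . n|) / sum(|p_i|). T -> 1 for pencil-like (dijet-like)
    events and decreases for isotropic ones."""
    pt_sum = sum(math.hypot(x, y) for x, y in zip(px, py))
    best = 0.0
    for k in range(n_steps):
        phi = math.pi * k / n_steps      # scanning 0..pi is sufficient
        nx, ny = math.cos(phi), math.sin(phi)
        proj = sum(abs(x * nx + y * ny) for x, y in zip(px, py))
        best = max(best, proj / pt_sum)
    return best

def recoil(px, py):
    """Recoil: magnitude of the vector pt sum over the scalar pt sum;
    ~0 for balanced (e.g. back-to-back dijet) events."""
    pt_sum = sum(math.hypot(x, y) for x, y in zip(px, py))
    return math.hypot(sum(px), sum(py)) / pt_sum

# Back-to-back two-track event: thrust ~ 1, recoil ~ 0
print(transverse_thrust([5.0, -5.0], [0.0, 0.0]), recoil([5.0, -5.0], [0.0, 0.0]))
```

A perfectly isotropic four-track star gives a thrust of 1/sqrt(2), illustrating the discrimination between jetty and isotropic topologies.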

  5. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    Science.gov (United States)

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events, such as sea level rise and its contribution to storm surges, extreme heat events, droughts, or heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing skill of seasonal forecast prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of simulations of regional climate models to run two different analyses: one to represent the current climate as it was observed, and one to represent the same events in a world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations.
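The two-ensemble comparison described above reduces to an exceedance-probability contrast between the factual and counterfactual worlds. A minimal sketch with toy ensembles (not GloSea5 output); the probability ratio and fraction of attributable risk (FAR) are the standard attribution metrics:

```python
def attribution_stats(factual, counterfactual, threshold):
    """Compare how often an event magnitude is reached in the factual
    ('world as observed') ensemble vs the counterfactual ('world that
    might have been') ensemble. Returns (p1, p0, probability ratio, FAR)."""
    p1 = sum(x >= threshold for x in factual) / len(factual)
    p0 = sum(x >= threshold for x in counterfactual) / len(counterfactual)
    pr = p1 / p0 if p0 > 0 else float('inf')
    far = 1.0 - p0 / p1 if p1 > 0 else 0.0   # fraction of attributable risk
    return p1, p0, pr, far

# Toy example: the event threshold is exceeded twice as often with forcing
print(attribution_stats([3, 4, 5, 6], [1, 2, 3, 4], threshold=3))
```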

  6. Screening Analysis of Criticality Features, Events, and Processes for License Application

    International Nuclear Information System (INIS)

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in ''Configuration Generator Model'', a component of the methodology from ''Disposal Criticality Analysis Methodology Topical Report''. The total probability per package of criticality is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be

  7. Preliminary Analysis of the Common Cause Failure Events for Domestic Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Daeil; Han, Sanghoon

    2007-01-01

    It is known that common cause failure (CCF) events have a great effect on the safety and probabilistic safety assessment (PSA) results of nuclear power plants (NPPs). However, domestic studies have mainly focused on the analysis method and modeling of CCF events. Thus, an analysis of the CCF events for domestic NPPs was performed to establish a domestic CCF event database and to deliver the data to the operating office of the international common cause failure data exchange (ICDE) project. This paper presents the results of that analysis for domestic nuclear power plants.
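The abstract does not name a parametric model, but CCF event counts of the kind collected for ICDE are commonly used to estimate alpha factors. A sketch under that assumption, using the staggered-testing form of the alpha-factor model and made-up event counts:

```python
from math import comb

def alpha_factors(event_counts):
    """event_counts[k] = number of observed events that failed exactly k
    components in a common cause component group. Returns point
    estimates alpha_k = n_k / sum(n_j)."""
    total = sum(event_counts.values())
    return {k: n / total for k, n in event_counts.items()}

def ccf_probability(k, m, alphas, q_total):
    """Probability of a CCF event failing exactly k of m components
    (staggered-testing form of the alpha-factor model), given the total
    component failure probability q_total."""
    return k / comb(m - 1, k - 1) * alphas[k] * q_total

# Hypothetical counts: 90 single failures, 8 double, 2 triple
a = alpha_factors({1: 90, 2: 8, 3: 2})
print(a, ccf_probability(2, 3, a, 1e-3))
```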

  8. Analysis of system and of course of events

    International Nuclear Information System (INIS)

    Hoertner, H.; Kersting, E.J.; Puetter, B.M.

    1986-01-01

    The analysis of the system and of the course of events is used to determine the frequency of core melt-out accidents and to describe the safety-related boundary conditions of appropriate accidents. The lecture is concerned with the effect of system changes in the reference plant and the effect of triggering events not assessed in detail or not sufficiently assessed in detail in phase A of the German Risk Study on the frequency of core melt-out accidents, the minimum requirements for system functions for controlling triggering events, i.e. to prevent core melt-out accidents, the reliability data important for reliability investigations and frequency assessments. (orig./DG) [de

  9. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    Science.gov (United States)

    2016-03-24

    ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION. AFIT-ENV-MS-16-M-166. Erich W...

  10. Analysis of selected methods for the recovery of encrypted WEP key

    Science.gov (United States)

    Wójtowicz, Sebastian; Belka, Radosław

    2014-11-01

    This paper deals with some of the WEP (Wired Equivalent Privacy) key recovery methods based on the aircrack-ng software embedded in the BackTrack operating system (a Linux distribution). The weakness of 64-bit (40-bit) and 128-bit (104-bit) keys encrypted with the RC4 cipher is demonstrated. Experiments were carried out in different network environments. In this work we compared different types of keys to check how strong the RC4 stream cipher can be. The 40-bit and 104-bit WEP keys were tested on an IEEE 802.11-based wireless LAN using a laptop with a live-CD Linux operating system. A short analysis of key creation methods was performed to compare the amount of time necessary to recover random and non-random WEP keys.
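The cipher whose weakness underlies these attacks is plain RC4; a minimal reference implementation makes the KSA/PRGA structure explicit. For illustration only — RC4 and WEP are broken and must never be used to protect real traffic:

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """Minimal RC4. Encryption and decryption are the same operation
    (XOR with the keystream). WEP's flaw is reusing related RC4 keys
    (IV prepended to a static key), which leaks key bytes statistically."""
    # Key-scheduling algorithm (KSA)
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA)
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

print(rc4(b"Key", b"Plaintext").hex())  # bbf316e8d940af0ad3 (known test vector)
```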

  11. Analysis of catchments response to severe drought event for ...

    African Journals Online (AJOL)

    Nafiisah

    The run sum analysis method was a sound method which indicates in ... intensity and duration of stream flow depletion between nearby catchments. ... threshold level analysis method, and allows drought events to be described in more.
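The threshold-level idea mentioned in this record is mechanical enough to sketch: a drought event is a maximal run of time steps with flow below a threshold, characterized by its duration and deficit volume. A toy illustration (daily flows and threshold are made-up values):

```python
def drought_events(flows, threshold):
    """Threshold-level method: return a list of (duration, deficit_volume)
    for each maximal run of time steps with flow below the threshold."""
    events, duration, deficit = [], 0, 0.0
    for q in flows:
        if q < threshold:
            duration += 1
            deficit += threshold - q      # accumulated deficit volume
        elif duration:
            events.append((duration, deficit))
            duration, deficit = 0, 0.0
    if duration:                           # event still open at series end
        events.append((duration, deficit))
    return events

print(drought_events([5, 3, 2, 6, 1, 1, 7], threshold=4))
```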

  12. Event history analysis and the cross-section

    DEFF Research Database (Denmark)

    Keiding, Niels

    2006-01-01

    Examples are given of problems in event history analysis, where several time origins (generating calendar time, age, disease duration, time on study, etc.) are considered simultaneously. The focus is on complex sampling patterns generated around a cross-section. A basic tool is the Lexis diagram....

  13. Use of PSA for the analysis of operational events in nuclear power plants

    International Nuclear Information System (INIS)

    Hulsmans, M.

    2006-01-01

    An operational event is a safety-relevant incident that occurred in an industrial installation like a nuclear power plant (NPP). The probabilistic approach to event analysis focuses on the potential consequences of an operational event. Within its scope of application, it provides a quantitative assessment of the risk significance of this event (and similar events): it calculates the risk increase induced by the event. Such analyses may result in a more objective and a more accurate event severity measure than those provided by commonly used qualitative methods. Probabilistic event analysis complements the traditional event analysis approaches that are oriented towards the understanding of the (root) causes of an event. In practice, risk-based precursor analysis consists of the mapping of an operational event on a risk model of the installation, such as a probabilistic safety analysis (PSA) model. Precursor analyses result in an objective risk ranking of safety-significant events, called accident precursors. An unexpectedly high (or low) risk increase value is in itself already an important finding. This assessment also yields a lot of information on the structure of the risk, since the underlying dominant factors can easily be determined. Relevant 'what if' studies on similar events and conditions can be identified and performed (which is generally not considered in conventional event analysis), with the potential to yield even broader findings. The findings of such a structured assessment can be used for other purposes than merely risk ranking. The operational experience feedback process can be improved by helping to identify design measures and operational practices in order to prevent re-occurrence or in order to mitigate future consequences, and even to evaluate their expected effectiveness, contributing to the validation and prioritization of corrective measures. Confirmed and re-occurring precursors with correlated characteristics may point out opportunities
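The risk increase described above is typically expressed as the event's conditional core damage probability (CCDP) relative to the nominal risk over the same exposure time. A minimal sketch; the screening thresholds below are illustrative assumptions, not regulatory values:

```python
def risk_increase(ccdp, baseline_cdp):
    """Precursor significance: conditional core damage probability of the
    event minus the nominal core damage probability over the same period."""
    return ccdp - baseline_cdp

def classify(importance):
    """Illustrative screening bands (assumed thresholds, for the sketch only)."""
    if importance >= 1e-4:
        return "significant precursor"
    if importance >= 1e-6:
        return "precursor"
    return "below screening threshold"

# A degraded-safety-system event dominating the nominal risk:
print(classify(risk_increase(ccdp=5e-4, baseline_cdp=1e-6)))
```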

  14. External events analysis in PSA studies for Czech NPPs

    International Nuclear Information System (INIS)

    Holy, J.; Hustak, S.; Kolar, L.; Jaros, M.; Hladky, M.; Mlady, O.

    2014-01-01

    The purpose of this paper is to summarize the current status of natural external hazard analysis in the PSA projects maintained in the Czech Republic for both Czech NPPs, Dukovany and Temelin. The presentation focuses on the basic milestones of the external event analysis effort: identification of external hazards important for the Czech NPP sites, screening out of irrelevant hazards, modeling of the plant response to the initiating events (including the basic activities of vulnerability and fragility analysis, supported with on-site analysis), quantification of accident sequences, interpretation of results, and development of measures decreasing external event risk. The following external hazards, addressed during the last several years in the PSA projects for the Czech NPPs, are discussed in the paper: 1) seismicity; 2) extremely low temperatures; 3) extremely high temperatures; 4) extreme wind; 5) extreme precipitation (water, snow); 6) transport of dangerous substances (an example of a man-made hazard with some differences compared with natural hazards); 7) other hazards that are not considered very important for the Czech NPPs and were screened out in the initial phase of the analysis, but are known as potential problem areas abroad. The paper is the result of a coordinated effort by experts and staff from the engineering support organization UJV Rez, a.s. and the NPPs located in the Czech Republic, Dukovany and Temelin. (authors)

  15. Analysis of event tree with imprecise inputs by fuzzy set theory

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Chun, Moon Hyun

    1990-01-01

    A fuzzy set theory approach is proposed as a method to analyze event trees with imprecise or linguistic input variables such as 'likely' or 'improbable' instead of numerical probabilities. In this paper, it is shown how fuzzy set theory can be applied to event tree analysis. The results of this study show that the fuzzy set theory approach is an acceptable and effective tool for the analysis of event trees with fuzzy inputs. Comparisons of the fuzzy approach with the probabilistic approach of computing the probabilities of the final states of the event tree through subjective weighting factors and the LHS technique show that the two approaches have common factors and give reasonable results.
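A sequence probability in an event tree is a product of branch probabilities; with triangular fuzzy numbers that product can be approximated component-wise. A sketch of the idea; the linguistic-to-fuzzy mappings are assumptions for illustration, not taken from the paper:

```python
def tfn_mul(p, q):
    """Component-wise product of two positive triangular fuzzy numbers
    (low, mode, high) -- the usual first-order approximation."""
    return (p[0] * q[0], p[1] * q[1], p[2] * q[2])

def sequence_probability(branch_probs):
    """Fuzzy probability of one event-tree sequence: the product of the
    fuzzy branch probabilities along its path."""
    result = (1.0, 1.0, 1.0)
    for p in branch_probs:
        result = tfn_mul(result, p)
    return result

# Assumed mappings of the linguistic inputs (illustrative only)
LIKELY = (0.7, 0.85, 1.0)
IMPROBABLE = (0.0, 0.05, 0.1)

print(sequence_probability([LIKELY, IMPROBABLE]))
```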

  16. Analysis of Key Factors Driving Japan’s Military Normalization

    Science.gov (United States)

    2017-09-01

    no change to our policy of not giving in to terrorism." Though the prime minister was democratically supported, Koizumi's leadership style took... analysis of the key driving factors of Japan's normalization. The areas of prime ministerial leadership, regional security threats, alliance issues, and...

  17. Regression analysis of mixed recurrent-event and panel-count data.

    Science.gov (United States)

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L

    2014-07-01

    In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20: 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. The methods are also applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation.
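For the continuously monitored subjects, the basic nonparametric estimand is the mean cumulative function (MCF) of the recurrent-event process. A sketch of its standard estimator; this does not implement the paper's estimating equations for the mixed panel-count part:

```python
def mean_cumulative_function(subjects):
    """Nelson-Aalen-type estimate of the mean cumulative number of events
    for continuously monitored recurrent-event data. Each subject is
    (event_times, follow_up_end); at each distinct event time t the MCF
    jumps by (#events at t) / (#subjects still under observation at t)."""
    times = sorted({t for events, _ in subjects for t in events})
    value, curve = 0.0, []
    for t in times:
        at_risk = sum(1 for _, end in subjects if end >= t)
        jumps = sum(events.count(t) for events, _ in subjects)
        value += jumps / at_risk
        curve.append((t, value))
    return curve

# Two subjects: events at t=1,3 (followed to t=5) and t=2 (followed to t=4)
print(mean_cumulative_function([([1, 3], 5), ([2], 4)]))
```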

  18. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    International Nuclear Information System (INIS)

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

    A one Main Circulation Pump (MCP) trip event is an anticipated transient with an expected frequency of approximately one event per year. There have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of a one-MCP trip event at the Ignalina NPP. According to international practice, the transient analysis should consist of a deterministic analysis employing best-estimate codes and an uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. Uncertainty analysis of flow energy losses in different parts of the Main Circulation Circuit, initial conditions, and code-selected models was performed. Such analysis makes it possible to estimate the influence of separate parameters on the calculation results and to find the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for further improvement of the model have been developed. (author)
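GRS/SUSA-style uncertainty analyses commonly size their random samples with Wilks' formula for nonparametric tolerance limits; for a one-sided 95%/95% first-order limit the familiar answer is 59 code runs. A sketch of that calculation (the use of Wilks' formula here is the standard practice for this methodology, not a detail stated in the abstract):

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest N such that the maximum of N runs bounds the 'coverage'
    quantile with the given confidence: 1 - coverage**N >= confidence
    (one-sided, first order)."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())            # 95%/95% one-sided -> 59 runs
print(wilks_sample_size(0.95, 0.99))  # 95%/99% one-sided -> 90 runs
```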

  19. Root Cause Analysis: Learning from Adverse Safety Events.

    Science.gov (United States)

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. © RSNA, 2015.
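Of the RCA tools listed above, the Pareto chart is the most mechanical: rank contributing causes by frequency and isolate the "vital few" that account for most occurrences. A sketch with hypothetical event categories (the 80% cutoff is the conventional rule of thumb):

```python
def pareto_ranking(cause_counts, cutoff=0.8):
    """Sort causes by frequency and flag the 'vital few' that together
    account for at least the cutoff share of all occurrences."""
    total = sum(cause_counts.values())
    ranked = sorted(cause_counts.items(), key=lambda kv: -kv[1])
    vital, cum = [], 0.0
    for cause, n in ranked:
        vital.append(cause)
        cum += n / total
        if cum >= cutoff:
            break
    return ranked, vital

# Hypothetical adverse-event cause counts from an RCA review
ranked, vital = pareto_ranking({"handoff": 50, "protocol": 30,
                                "equipment": 15, "other": 5})
print(vital)
```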

  20. Six key elements' analysis of FAC effective management in nuclear power plant

    International Nuclear Information System (INIS)

    Zhong Zhaojiang; Chen Hanming

    2010-01-01

    Corporate commitment, analysis, operating experience, inspection, training and engineering judgment, and long-term strategy are the six key elements of effective FAC management in a nuclear power plant. Corporate commitment is the economic basis of FAC management and the guarantee of the management system; analysis is the means of optimizing and refining the FAC program; operating experience provides reference and complementary information; inspection is the basis for accumulating FAC data; training and engineering judgment provide technical support and depth; and a long-term strategy is the key to successful FAC management. The six key elements supplement each other and together make up a complete system of effective FAC management. For present FAC management in our national nuclear power plants, the six key elements are the core and bring out the best in each other, establishing an effective FAC management system and preventing severe FAC occurrences. (authors)

  1. Analysis of human error and organizational deficiency in events considering risk significance

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

    In this study, we analyzed human and organizational deficiencies in the trip events of Korean nuclear power plants. K-HPES items were used in human error analysis, and the organizational factors by Jacobs and Haber were used for organizational deficiency analysis. We proposed the use of CCDP as a risk measure to consider risk information in prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the process of analysis is necessary for effective enhancement of nuclear power plant safety by focusing on causes of human error and organizational deficiencies that are associated with significant risk

  2. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    Energy Technology Data Exchange (ETDEWEB)

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K. (and others)

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. (c) 1999. The American Astronomical Society.
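The core of difference imaging is subtracting a scaled reference frame from each science frame so that only variable sources (such as a brightening microlensed star) leave residuals. A heavily simplified sketch (not the MACHO pipeline): real DIA also fits a spatially varying convolution kernel to match the point-spread functions of the two frames, which is omitted here:

```python
def _median(vals):
    """Median of a list of numbers."""
    s = sorted(vals)
    n = len(s)
    return (s[n // 2] + s[(n - 1) // 2]) / 2.0

def difference_image(science, reference, nsigma=5.0):
    """Toy difference imaging on 2-D lists of pixel values: scale the
    reference to the science frame with a median flux ratio, subtract,
    and flag residuals above nsigma times a robust scatter estimate."""
    ratio = (_median([v for row in science for v in row])
             / _median([v for row in reference for v in row]))
    diff = [[s - ratio * r for s, r in zip(srow, rrow)]
            for srow, rrow in zip(science, reference)]
    flat = [v for row in diff for v in row]
    dmed = _median(flat)
    sigma = 1.4826 * _median([abs(v - dmed) for v in flat])  # MAD-based std
    return [(i, j) for i, row in enumerate(diff)
            for j, v in enumerate(row) if v - dmed > nsigma * sigma]
```

With a constant background and a single brightened pixel, the subtraction isolates exactly that pixel regardless of the overall flux scaling between the frames.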

  3. Key Concept Identification: A Comprehensive Analysis of Frequency and Topical Graph-Based Approaches

    Directory of Open Access Journals (Sweden)

    Muhammad Aman

    2018-05-01

    Full Text Available Automatic key concept extraction from text is a central challenge in information extraction, information retrieval and digital libraries, ontology learning, and text analysis. Statistical frequency ranking and topical graph-based ranking are two potentially powerful and leading unsupervised approaches in this area, devised to address the problem. To utilize the potential of these approaches and improve key concept identification, a comprehensive performance analysis of these approaches on datasets from different domains is needed. The objective of the study presented in this paper is to perform a comprehensive empirical analysis of selected frequency and topical graph-based algorithms for key concept extraction on three different datasets, in order to identify the major sources of error in these approaches. For the experimental analysis, we selected TF-IDF, KP-Miner and TopicRank. Three major sources of error, i.e., frequency errors, syntactic errors and semantic errors, and the factors that contribute to these errors are identified. Analysis of the results reveals that the performance of the selected approaches is significantly degraded by these errors. These findings can help us develop an intelligent solution for key concept extraction in the future.
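The frequency-based baseline named above (TF-IDF) is easy to sketch. A toy ranking over whitespace tokens; a real extractor such as KP-Miner would instead work on filtered noun phrases and add position and cutoff heuristics:

```python
import math
from collections import Counter

def tfidf_key_concepts(docs, doc_index, top_k=3):
    """Rank candidate key concepts of docs[doc_index] by TF-IDF.
    Candidates here are plain lowercase tokens (a simplifying assumption)."""
    tokenized = [d.lower().split() for d in docs]
    df = Counter(t for doc in tokenized for t in set(doc))   # document frequency
    tf = Counter(tokenized[doc_index])
    n = len(tokenized[doc_index])
    scores = {t: (c / n) * math.log(len(docs) / df[t]) for t, c in tf.items()}
    return [t for t, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]]

docs = ["event tree analysis of nuclear events",
        "market analysis of trends",
        "soccer match report"]
print(tfidf_key_concepts(docs, 0))  # terms unique to doc 0 outrank shared ones
```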

  4. Civil protection and Damaging Hydrogeological Events: comparative analysis of the 2000 and 2015 events in Calabria (southern Italy

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2017-11-01

    Full Text Available Calabria (southern Italy) is a flood-prone region, due to both its rough orography and the fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach to the comparative analysis of two events affecting the same area 15 years apart, by collecting all the qualitative and quantitative features useful for describing both rain and damage. The aim is to understand whether similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, which occurred between 8 and 10 September 2000, damaged 109 out of 409 municipalities of the region and killed 13 people in a campsite due to a flood. The second event, which occurred between 30 October and 1 November 2015, damaged 79 municipalities and killed a man due to a flood. The comparative analysis highlights that, although the exceptionality of the triggering daily rain was higher in the 2015 event, the damage caused by the 2000 event to both infrastructure and belongings was greater, and it was strongly increased by the 13 flood victims. We conclude that, in the 2015 event, the management of the pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people in situations made hazardous by landslides or floods, contributed to reducing the number of victims.

  5. Analysis of operational events by ATHEANA framework for human factor modelling

    International Nuclear Information System (INIS)

    Bedreaga, Luminita; Constntinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that current methods have not correctly represented the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods for modelling the human factor have not taken into account human performance and reliability as observed in operational events. ATHEANA - A Technique for Human Error ANAlysis - is a new methodology for human analysis that incorporates specific data from operational events as well as psychological models of human behaviour. The method introduces new elements such as the unsafe action and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred in different nuclear power plants during 1979-2002. The analysis of the operational events consisted of: identification of the unsafe actions; classifying each unsafe action as an omission or a commission; establishing the type of error corresponding to the unsafe action (slip, lapse, mistake or circumvention); and establishing the influence of performance shaping factors and some corrective actions. (authors)

  6. Initiating events in the safety probabilistic analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Stasiulevicius, R.

    1989-01-01

    The importance of initiating events in the probabilistic safety analysis of nuclear power plants is discussed, and the basic procedures necessary for preparing reports, quantification, and grouping of the events are described. Examples of initiating events with their mean occurrence frequencies, including those calculated for the Oconee and Angra-1 reactors, are presented. (E.G.)

  7. Safety culture in nuclear installations: Bangladesh perspectives and key lessons learned from major events

    International Nuclear Information System (INIS)

    Jalil, A.; Rabbani, G.

    2002-01-01

    Steps necessary to ensure safety in nuclear installations are suggested, one of which is enhancing the safety culture. It is necessary to gain a common understanding of the concept itself and of the development stages of safety culture, by way of good management practices and leadership, for long-term safety culture improvement. International topical meetings on safety culture may serve as an important forum for the exchange of experience. From such conventions, new initiatives and programmes may emerge which, when implemented around the world, are likely to improve safety management and thus strengthen the safety culture in nuclear installations. International cooperation and learning should be promoted to facilitate the sharing of achievements, to face the challenges involved in the management of safety, to set priorities for future work, and to identify areas of cooperation. Key lessons learned from some major events are reported. The present status and future trend of nuclear safety culture in Bangladesh are also dealt with. (author)

  8. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, comprising physical, logical, and cognitive frames, was adopted to derive and analyze digital I and C failure events for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, extended and enhanced in the feedwater, recirculation, and steam line systems. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control, feedwater control, recirculation control, and automated power regulation control systems. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow; hence, in addition to transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derived from actual non-ABWR digital I and C software failure events reported to the LER of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status were successfully identified. The operator might not easily recognize the abnormal condition, because the computer status appears to progress normally.

  9. An analysis of post-event processing in social anxiety disorder.

    Science.gov (United States)

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

    Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construct of repetitive self-focused thought pertaining to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration, and potential reconstruction, of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  10. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    Science.gov (United States)

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and the length of the free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication-related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication-related patient safety events, and the models were compared. Well-performing NLP models were generated to categorize medication-related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors, with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81, respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication-related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
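The area under the ROC curve quoted in this abstract has a simple rank interpretation: it is the probability that a randomly chosen positive example is scored higher than a randomly chosen negative one (ties counting half). A minimal pure-Python sketch of that computation follows; the classifier scores and labels are invented for illustration, and the paper's actual models are not reproduced here:

```python
def roc_auc(scores, labels):
    """Rank-based AUC: fraction of positive/negative pairs in which the
    positive example receives the higher score (ties count as 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy scores from a hypothetical "dispensing error" classifier:
scores = [0.9, 0.8, 0.3, 0.2, 0.7]
labels = [1,   1,   0,   0,   1]
assert roc_auc(scores, labels) == 1.0  # every positive outranks every negative
```

An AUC of 0.5 corresponds to random ranking, which is why the reported values of 0.81-0.96 indicate usefully discriminative models.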

  11. Disruptive event analysis: volcanism and igneous intrusion

    International Nuclear Information System (INIS)

    Crowe, B.M.

    1979-01-01

    Three basic topics are addressed in the disruptive event analysis: first, the range of consequences of disruption of a radioactive waste repository by volcanic activity; second, the possible reduction of the risk of such disruption through selective siting of a repository; and third, the quantification of the probability of repository disruption by volcanic activity.

  12. Radionuclide data analysis in connection of DPRK event in May 2009

    Science.gov (United States)

    Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei

    2010-05-01

    The seismic event detected in the DPRK on 25.5.2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work, no traces linked to the DPRK event were found. After three weeks of high alert, the PTS resumed its normal operational routine. This case illustrates the importance of objectivity and a procedural approach in data evaluation. All the data coming from particulate and noble gas stations were evaluated daily, some of the samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds for the key (CTBT-relevant) radionuclides achieved across the DPRK event area and to assess the radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for a site owing to legitimate, non-nuclear-test-related activities. Therefore, a set of hypotheses was used to check whether each detection was consistent with the event time and location through atmospheric transport modelling. The consistency of the event timing and the isotopic ratios was also used in the evaluation work. It was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS would have had no difficulty detecting it. This case also showed the importance of on-site inspections for verifying the nuclear traces of possible tests.

  13. Organization of pulse-height analysis programs for high event rates

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, C E [Argonne National Lab., Ill. (USA)

    1976-09-01

    The ability of a pulse-height analysis program to handle high event rates can be enhanced by organizing it so as to minimize the time spent in interrupt housekeeping. Specifically, the routine that services the data-ready interrupt from the ADC should test whether another event is ready before performing the interrupt return.
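The optimization described here — having the data-ready interrupt service routine check for a further pending event before returning, so that housekeeping is paid once per burst rather than once per event — can be sketched as a toy event loop. This is a hedged illustration in Python (the queue, histogram and function names are hypothetical; the original ran on minicomputer hardware):

```python
from collections import deque

def drain_events(adc_queue, histogram):
    """Service one data-ready interrupt: keep accepting events while more
    are pending, instead of taking a fresh interrupt for each one."""
    serviced = 0
    while adc_queue:  # test whether another event is ready before "returning"
        channel = adc_queue.popleft()
        histogram[channel] = histogram.get(channel, 0) + 1
        serviced += 1
    return serviced

# A burst of five ADC conversions handled by a single interrupt entry:
burst = deque([3, 7, 3, 12, 7])
hist = {}
assert drain_events(burst, hist) == 5
assert hist == {3: 2, 7: 2, 12: 1}
```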

  14. External events analysis for the Savannah River Site K reactor

    International Nuclear Information System (INIS)

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 x 10^-4 per year, of which seismic events are the major contributor (1.2 x 10^-4 per year). Fire-initiated events contribute 1.4 x 10^-7 per year, tornados 5.8 x 10^-7 per year, dam failures 1.5 x 10^-6 per year and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs
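As a quick consistency check, the named contributions from the abstract can be summed and compared against the quoted total; the implied crane share below is inferred from the totals, not stated explicitly in the abstract:

```python
# Contributions to core melt frequency (per year) quoted in the abstract.
contributions = {
    "seismic":     1.2e-4,
    "fire":        1.4e-7,
    "tornado":     5.8e-7,
    "dam failure": 1.5e-6,
}
total = 2.2e-4  # quoted total from external initiators

# The crane failure scenario is only bounded ("less than 10^-4 per year");
# the share implied by the other numbers should respect that bound.
crane_implied = total - sum(contributions.values())
assert crane_implied < 1e-4          # consistent with the quoted bound
assert crane_implied > 9e-5          # and it dominates the non-seismic remainder
```

The check confirms the abstract's figures are internally consistent: seismic events plus the crane scenario account for essentially all of the external-initiator core melt frequency.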

  15. Contingency Analysis of Cascading Line Outage Events

    Energy Technology Data Exchange (ETDEWEB)

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.

  16. Influence of coronal holes on CMEs in causing SEP events

    International Nuclear Information System (INIS)

    Shen Chenglong; Yao Jia; Wang Yuming; Ye Pinzhong; Wang Shui; Zhao Xuepu

    2010-01-01

    The issue of the influence of coronal holes (CHs) on coronal mass ejections (CMEs) in causing solar energetic particle (SEP) events is revisited. It is a continuation and extension of our previous work, in which no evident effects of CHs on CMEs in generating SEPs were found by statistically investigating 56 CME events. This result is consistent with the conclusion obtained by Kahler in 2004. We extrapolate the coronal magnetic field, define CHs as the regions consisting of only open magnetic field lines and perform a similar analysis on this issue for 76 events in total by extending the study interval to the end of 2008. Three key parameters, CH proximity, CH area and CH relative position, are involved in the analysis. The new result confirms the previous conclusion that CHs did not show any evident effect on CMEs in causing SEP events. (research papers)

  17. Analysis of adverse events occurred at overseas nuclear power plants in 2003

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa; Sato, Masahiro; Takagawa, Kenichi; Fushimi, Yasuyuki; Shimada, Hiroki; Shimada, Yoshio

    2004-01-01

    Adverse events that have occurred at overseas nuclear power plants can be studied to indicate how to improve the safety and reliability of nuclear power plants in Japan. The Institute of Nuclear Safety Systems (INSS) obtains information on overseas adverse events and incidents and, by evaluating it, proposes improvements to prevent similar occurrences at Japanese PWR plants. In 2003, INSS obtained approximately 2800 pieces of information and, through their evaluation, proposed nine recommendations to Japanese utilities. This report summarizes the evaluation activity and the tendency analysis based on the individual events analyzed in 2003. The tendency analysis covered about 1600 analyzed events from the viewpoints of mechanics, electrics, instrumentation and control, and operations, examining the causes, countermeasures, affected equipment and the possible lessons to be learned from overseas events. The report presents the overall tendency of overseas events and incidents for the improvement of the safety and reliability of domestic PWR plants. (author)

  18. Identification and analysis of external event combinations for Hanhikivi 1 PRA

    Energy Technology Data Exchange (ETDEWEB)

    Helander, Juho [Fennovoima Oy, Helsinki (Finland)

    2017-03-15

    Fennovoima's nuclear power plant, Hanhikivi 1, in Pyhäjoki, Finland, is currently in the design phase; its construction is scheduled to begin in 2018 and electricity production in 2024. The objective of this paper is to produce a preliminary list of safety-significant external event combinations, including preliminary probability estimates, to be used in the probabilistic risk assessment of the Hanhikivi 1 plant. Starting from the list of relevant single events, the relevant event combinations are identified based on seasonal variation, preconditions related to different events, and dependencies (fundamental and cascade type) between events. This method yields 30 relevant combinations of two events for the Hanhikivi site. The preliminary probability of each combination is evaluated, and combinations with extremely low probability are excluded from further analysis. Combinations of three or more events are identified by adding possible events to the remaining two-event combinations. Finally, 10 relevant combinations of two events and three relevant combinations of three events remain. The results shall be considered preliminary and will be updated after evaluating the more detailed effects of different events on plant safety.

  19. Microprocessor event analysis in parallel with Camac data acquisition

    International Nuclear Information System (INIS)

    Cords, D.; Eichler, R.; Riege, H.

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a Camac system (GEC-ELLIOTT System Crate) and shares the Camac access with a Nord-10S computer. Interfaces have been designed and tested for the execution of Camac cycles, communication with the Nord-10S computer and DMA transfer from Camac to the MIPROC-16 memory. The system is used in the JADE data-acquisition system at PETRA, where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events, and the results of various checks are appended to the event. In case of spurious triggers or clear beam-gas events, the Nord-10S buffer is reset and the event omitted from further processing. (orig.)

  20. Using discriminant analysis as a nucleation event classification method

    Directory of Open Access Journals (Sweden)

    S. Mikkonen

    2006-01-01

    More than three years of measurements of aerosol size distribution and various gas and meteorological parameters made in the Po Valley, Italy were analysed for this study to examine which of the meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, as part of a non-parametric density estimation method. The best classification result on our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation, whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events, but because of the large number of missing observations we had to exclude them from the final analysis.
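The classification scheme this abstract describes — kernel density estimation with an Epanechnikov kernel, assigning an observation to the class whose estimated density at that point is higher — can be sketched in pure Python. The one-dimensional RH values below are invented toy data, not the Po Valley measurements, and the bandwidth is arbitrary:

```python
def epanechnikov(u):
    """Epanechnikov kernel: 0.75 * (1 - u^2) on [-1, 1], zero elsewhere."""
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def kde(x, sample, h):
    """Kernel density estimate at point x from a 1-D sample with bandwidth h."""
    return sum(epanechnikov((x - xi) / h) for xi in sample) / (len(sample) * h)

def classify(x, event_sample, nonevent_sample, h=1.0):
    """Non-parametric discriminant rule: pick the class with higher density."""
    return "event" if kde(x, event_sample, h) > kde(x, nonevent_sample, h) else "non-event"

# Toy example: nucleation-event days tend to have low relative humidity.
event_rh = [30, 35, 40, 42]
nonevent_rh = [70, 75, 80, 85]
assert classify(33, event_rh, nonevent_rh, h=10.0) == "event"
assert classify(78, event_rh, nonevent_rh, h=10.0) == "non-event"
```

The paper's actual classifier combined several predictors (RH, O3, a radiation polynomial); the same rule applies there with a multivariate kernel in place of the 1-D one.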

  1. Survival analysis using S: analysis of time-to-event data

    CERN Document Server

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...

  2. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  3. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes...

  4. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models...

  5. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared with the previous four years; there is no trend in the significance category, and there has been no increase in the proportion of occurrences reported in the higher significance category. The frequency of events, 42 reported through August 2010, is also not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences were reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences were reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences than in the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  6. Ultimate design load analysis of planetary gearbox bearings under extreme events

    DEFF Research Database (Denmark)

    Gallego Calderon, Juan Felipe; Natarajan, Anand; Cutululis, Nicolaos Antonio

    2017-01-01

    This paper investigates the impact of extreme events on the planet bearings of a 5 MW wind turbine gearbox. The system is simulated using an aeroelastic tool, where the turbine structure is modeled, and MATLAB/Simulink, where the drivetrain (gearbox and generator) is modeled using a lumped-parameter approach. Three extreme events are assessed: low-voltage ride-through, emergency stop and normal stop. The analysis is focused on finding which event has the most negative impact on the bearing extreme radial loads. The two latter events are carried out following the guidelines of the International

  7. Analysis of loss of offsite power events reported in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Volkanovski, Andrija, E-mail: Andrija.VOLKANOVSKI@ec.europa.eu [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Ballesteros Avila, Antonio; Peinador Veira, Miguel [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Kančev, Duško [Kernkraftwerk Goesgen-Daeniken AG, CH-4658 Daeniken (Switzerland); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) gGmbH, Schwertnergasse 1, 50667 Köln (Germany); Stephan, Jean-Luc [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 – 92262 Fontenay-aux-Roses Cedex (France)

    2016-10-15

    Highlights: • Loss of offsite power events were identified in four databases. • Engineering analysis of relevant events was done. • The dominant root cause of LOOP is human failure. • Improved maintenance procedures can decrease the number of LOOP events. - Abstract: This paper presents the results of an analysis of loss of offsite power (LOOP) events in four databases of operational events. The screened databases include: the Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, the IAEA International Reporting System for Operating Experience (IRS) and the U.S. Licensee Event Reports (LER). In total, 228 relevant loss of offsite power events were identified in the IRSN database, 190 in the GRS database, 120 in the U.S. LER and 52 in the IRS database. Identified events were classified into predefined categories. The results show that the largest percentage of LOOP events was registered during the On Power operational mode and lasted for two minutes or more. Plant-centered events are the main contributor to the LOOP events identified in the IRSN, GRS and IAEA IRS databases, while switchyard-centered events are the main contributor among events registered in the NRC LER database. The main type of failed equipment is switchyard failures in the IRSN and IAEA IRS databases, main or secondary lines in the NRC LER database and busbar failures in the GRS database. The dominant root cause of LOOP events is human failure during test, inspection and maintenance, followed by human failure due to insufficient or wrong procedures. The largest number of LOOP events resulted in a reactor trip followed by EDG start. Actions that can reduce the number of LOOP events and minimize their consequences for plant safety are identified and presented.

  8. Analysis of unprotected overcooling events in the Integral Fast Reactor

    International Nuclear Information System (INIS)

    Vilim, R.B.

    1989-01-01

    Simple analytic models are developed for predicting the response of a metal-fueled, liquid-metal-cooled reactor to unprotected overcooling events in the balance of plant. All overcooling initiators are shown to fall into two categories. The first category contains those events for which there is no final equilibrium state of constant overcooling, as is the case for a large steam leak. These events are analyzed using a non-flow control-mass approach. The second category contains those events which eventually equilibrate, such as a loss of feedwater heaters. A steady-flow control-volume analysis shows that these latter events ultimately affect the plant through the feedwater inlet to the steam generator. The models developed for analyzing these two categories provide upper bounds on the reactor's passive response to overcooling accident initiators. Calculation of these bounds for a prototypic plant indicates that failure limits -- eutectic melting, sodium boiling, fuel pin failure -- are not exceeded in any overcooling event. 2 refs

  9. Microprocessor event analysis in parallel with CAMAC data acquisition

    CERN Document Server

    Cords, D; Riege, H

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a CAMAC System (GEC-ELLIOTT System Crate) and shares the CAMAC access with a Nord-10S computer. Interfaces have been designed and tested for execution of CAMAC cycles, communication with the Nord-10S computer and DMA transfer from CAMAC to the MIPROC-16 memory. The system is used in the JADE data-acquisition system at PETRA, where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events, and the results of various checks are appended to the event. In case of spurious triggers or clear beam-gas events, the Nord-10S buffer is reset and the event omitted from further processing. (5 refs).

  10. Regression analysis of mixed panel count data with dependent terminal events.

    Science.gov (United States)

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types above, may occur; furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrence of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimating the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to the childhood cancer study that motivated this work. Copyright © 2017 John Wiley & Sons, Ltd.

  11. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  12. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    Science.gov (United States)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
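The core arithmetic of such an event tree analysis is multiplying conditional branch probabilities along each path from the initiating event to an outcome, then summing over paths. A minimal sketch with invented numbers (not Arenal values) illustrates how relocating between sectors changes individual risk:

```python
# Illustrative event-tree branches (all probabilities are hypothetical):
p_collapse = 1e-3                              # crater-wall collapse triggers an ATPF
p_reach = {"north": 0.4, "south": 0.1}         # flow reaches the given sector
p_present = {"resident": 0.9, "tourist": 0.2}  # individual is present when it does

def annual_risk(sector, person):
    """Multiply conditional probabilities along one event-tree path."""
    return p_collapse * p_reach[sector] * p_present[person]

# Relocating from the north to the south lowers a resident's risk fourfold
# in this toy tree, mirroring the risk-class reductions described above:
assert annual_risk("north", "resident") > annual_risk("south", "resident")
```

The raw numbers produced this way would then be mapped onto linguistic risk classes (VERY HIGH, HIGH, MODERATE, ...) via a conversion table, as the paper describes.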

  13. Key steps in the strategic analysis of a dental practice.

    Science.gov (United States)

    Armstrong, J L; Boardman, A E; Vining, A R

    1999-01-01

    As dentistry is becoming increasingly competitive, dentists must focus more on strategic analysis. This paper lays out seven initial steps that are the foundation of strategic analysis. It introduces and describes the use of service-customer matrices and location-proximity maps as tools in competitive positioning. The paper also contains a brief overview of the role of differentiation and cost-control in determining key success factors for dental practices.

  14. Analysis of internal events for Unit 1 of the Laguna Verde nuclear power station

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    This volume presents the results of the initiating event analysis and the event tree analysis for Unit 1 of the Laguna Verde nuclear power station. The initiating event analysis covers the identification of all internal events that disturb normal operation of the plant and require mitigation; so-called external events are beyond the scope of this study. For the Laguna Verde plant, eight transient categories were identified, along with three categories of loss-of-coolant accident (LOCA) inside the containment, a LOCA outside the primary containment, and vessel rupture. The event tree analysis involves developing the possible accident sequences for each category of initiating event. System event trees were constructed for the different types of LOCA and for all the transients. An event tree was constructed for the total loss of alternating current, which extends the event tree for the loss-of-offsite-power transient, and a system event tree was also developed for anticipated transients without scram (ATWS). The event trees for the accident sequences include the evaluation of sequences with a vulnerable core, that is, sequences in which core cooling is adequate but the residual heat removal systems have failed. To model this adequately, headings were added to the event tree so that sequences are developed up to the point where the core state is resolved. This process includes: determining the failure pressure of the primary containment, evaluating the environment generated in the reactor building as a result of containment failure or leakage, determining the location of components in the reactor building, and constructing Boolean expressions to estimate the failure of components exposed to a severe environment. (Author)

  15. Markov chains and semi-Markov models in time-to-event analysis.

    Science.gov (United States)

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
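
As a minimal illustration of the Markov chain approach, the sketch below propagates state occupation probabilities through a three-state illness-death model, a common competing-risks setup. The transition probabilities are invented for illustration, not taken from any clinical study.

```python
# Illustrative three-state discrete-time Markov chain for time-to-event
# data with a competing risk: Healthy -> Ill -> Dead, plus Healthy -> Dead.
# State occupation probabilities after n steps come from powers of the
# one-step transition matrix.

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# States: 0 = Healthy, 1 = Ill, 2 = Dead (absorbing); rows sum to 1
P = [
    [0.90, 0.07, 0.03],
    [0.00, 0.80, 0.20],
    [0.00, 0.00, 1.00],
]

# Distribution after 2 steps, starting from Healthy:
P2 = mat_mult(P, P)
print(P2[0])  # [P(Healthy), P(Ill), P(Dead)] at step 2
```

Unlike a Kaplan-Meier curve, the same matrix simultaneously yields the probability of the competing outcome (here, moving to Ill) and of the terminal event, which is what makes the chain formulation attractive for multi-state data.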

  16. Transcriptome analysis reveals key differentially expressed genes involved in wheat grain development

    Directory of Open Access Journals (Sweden)

    Yonglong Yu

    2016-04-01

    Full Text Available Wheat seed development is an important physiological process of seed maturation and directly affects wheat yield and quality. In this study, we performed dynamic transcriptome microarray analysis of an elite Chinese bread wheat cultivar (Jimai 20 during grain development using the GeneChip Wheat Genome Array. Grain morphology and scanning electron microscope observations showed that the period of 11–15 days post-anthesis (DPA was a key stage for the synthesis and accumulation of seed starch. Genome-wide transcriptional profiling and significance analysis of microarrays revealed that the period from 11 to 15 DPA was more important than the 15–20 DPA stage for the synthesis and accumulation of nutritive reserves. Series test of cluster analysis of differential genes revealed five statistically significant gene expression profiles. Gene ontology annotation and enrichment analysis gave further information about differentially expressed genes, and MapMan analysis revealed expression changes within functional groups during seed development. Metabolic pathway network analysis showed that major and minor metabolic pathways regulate one another to ensure regular seed development and nutritive reserve accumulation. We performed gene co-expression network analysis to identify genes that play vital roles in seed development and identified several key genes involved in important metabolic pathways. The transcriptional expression of eight key genes involved in starch and protein synthesis and stress defense was further validated by qRT-PCR. Our results provide new insight into the molecular mechanisms of wheat seed development and the determinants of yield and quality.

  17. Top event prevention analysis - a deterministic use of PRA

    International Nuclear Information System (INIS)

    Blanchard, D.P.; Worrell, R.B.

    1995-01-01

    Risk importance measures are popular for many applications of probabilistic analysis. Inherent in the derivation of risk importance measures are implicit assumptions that those using these numerical results should be aware of in their decision making. These assumptions and potential limitations include the following: (1) The risk importance measures are derived for a single event at a time and are therefore valid only if all other event probabilities are unchanged at their current values. (2) The results for which risk importance measures are derived may not be complete for reasons such as truncation
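
The one-event-at-a-time character of these measures can be seen in a toy example. The sketch below computes the Birnbaum and Fussell-Vesely importance of a basic event A for a fault tree with top event TOP = A OR (B AND C); the event probabilities are illustrative assumptions, and each measure holds all other probabilities fixed at their current values, exactly as the caveat above describes.

```python
# Toy importance-measure calculation for TOP = A OR (B AND C),
# with independent basic events A, B, C.

def p_top(pA, pB, pC):
    # Exact for independent events: P(A or (B and C))
    return pA + pB * pC - pA * pB * pC

def birnbaum(pA, pB, pC):
    # Birnbaum importance of A: P(TOP | A occurs) - P(TOP | A does not),
    # computed with pB and pC held fixed at their current values.
    return p_top(1.0, pB, pC) - p_top(0.0, pB, pC)

def fussell_vesely(pA, pB, pC):
    # Fussell-Vesely importance of A: fraction of the top-event
    # probability contributed by cut sets containing A (here, {A} alone).
    return pA / p_top(pA, pB, pC)

pA, pB, pC = 1e-3, 1e-2, 5e-2
print(birnbaum(pA, pB, pC), fussell_vesely(pA, pB, pC))
```

If pB or pC were changed, both measures for A would change as well, which is the implicit assumption the text warns decision makers about.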

  18. Analysis of events occurred at overseas nuclear power plants in 2004

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa; Nishioka, Hiromasa; Sato, Masahiro; Chiba, Gorou; Takagawa, Kenichi; Shimada, Hiroki

    2005-01-01

    The Institute of Nuclear Safety Systems (INSS) investigates information on events and incidents at overseas nuclear power plants and, by evaluating it, proposes recommendations for improving the safety and reliability of domestic PWR plants. Following the 2003 report, this report summarizes the evaluation activity and the trend analysis based on roughly 2,800 pieces of information obtained in 2004. The trend analysis covered about 1,700 analyzed events, examined from mechanical, electrical, and operational viewpoints with respect to their causes, the equipment involved, and so on. (author)

  19. [Analysis on the adverse events of cupping therapy in the application].

    Science.gov (United States)

    Zhou, Xin; Ruan, Jing-wen; Xing, Bing-feng

    2014-10-01

    An in-depth analysis was performed on the cases of adverse events and common injuries from cupping therapy encountered in recent years, in terms of manipulation technique and patient constitution. The adverse events of cupping therapy are commonly caused by practitioners' improper manipulation and by ignoring contraindications and the patient's constitution. Clinical practitioners should apply cupping therapy cautiously, strictly follow the rules of standard manipulation and the core medical system, pay attention to contraindications, and take strict precautions against the occurrence of adverse events.

  20. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. Under the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. Such a system is structured, controlled, and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementing the hierarchical discrete-event model as a real-time operating system on a computer network connected via a serial bus is possible, with each computer dedicated to the local Petri model of one subsystem of the global robotic system. Because Petri models can be applied on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved using Petri nets; discrete-event systems are a pragmatic tool for modelling industrial systems. To capture auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulating the robotic system with timed Petri nets offers the opportunity to observe its timing: by measuring transport and transmission times on the spot, graphs are obtained showing the average time of the transport activity for given sets of finished products.
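
A minimal token-game sketch of the underlying formalism (not the paper's Visual Object Net++ model): places hold tokens, and a transition fires only when every input place holds enough tokens, consuming them and producing tokens on its output places. The robot-themed place and transition names are invented for illustration.

```python
# Minimal Petri-net sketch: marking = tokens per place; a transition
# fires when all its input places are sufficiently marked.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Hypothetical robot cell: picking a part needs one part and a free gripper
net = PetriNet({"parts": 2, "gripper_free": 1, "holding": 0})
net.add_transition("pick", {"parts": 1, "gripper_free": 1}, {"holding": 1})
net.fire("pick")
print(net.marking)
```

A hierarchical model in the paper's sense would nest such nets: a high-level transition stands for a whole lower-level subnet that is analysed separately.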

  1. The patient's safety - For a dynamics of improvement. Nr 3. How to analyze your significant radiation protection events?

    International Nuclear Information System (INIS)

    2012-07-01

    The objective of this publication is to present the event analysis methods most frequently used by radiotherapy departments. After indicating key figures concerning radiotherapy patients, sessions and events, the document states the objectives and steps of an event analysis. It presents various analysis methods: the Ishikawa diagram (or 5M method, or cause-and-effect diagram), the cause tree, the ALARM method (association of litigation and risk management), and the ORION method. It offers a comparison of these five methods, their possibilities and their limits. Good practices are outlined in terms of data acquisition, method choice, event analysis, and improvement actions. The use of cause tree analysis is commented on by members of the Limoges hospital radiotherapy department, and that of the Ishikawa method by a member of the Beauvais hospital

  2. Passage Key Inlet, Florida; CMS Modeling and Borrow Site Impact Analysis

    Science.gov (United States)

    2016-06-01

    Impact Analysis by Kelly R. Legault and Sirisha Rayaprolu PURPOSE: This Coastal and Hydraulics Engineering Technical Note (CHETN) describes the...driven sediment transport at Passage Key Inlet. This analysis resulted in issuing a new Florida Department of Environmental Protection (FDEP) permit to...Funding for this study was provided by the USACE Regional Sediment Management (RSM) Program, a Navigation Research, Development, and Technology Portfolio

  3. Evaluation of Fourier integral. Spectral analysis of seismic events

    International Nuclear Information System (INIS)

    Chitaru, Cristian; Enescu, Dumitru

    2003-01-01

    Spectral analysis of seismic events represents a method for great-earthquake prediction. The seismic signal is not sinusoidal; it is therefore necessary to find a method that best approximates the real signal by sinusoidal components. The 'Quanterra' broadband station allows access to the data in numerical and/or graphical form. From the numerical form we can easily build a computer program (MS Office Excel) for spectral analysis. (authors)
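
The same spectral-analysis idea can be sketched with NumPy rather than a spreadsheet (an assumption of this example, not the authors' tooling): estimate the amplitude spectrum with the FFT and locate the dominant frequency of the sampled signal.

```python
# FFT-based amplitude spectrum of a sampled signal; the synthetic
# "seismic" trace below is invented for illustration.
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (Hz) with the largest amplitude in the spectrum."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Synthetic trace: strong 2 Hz component plus a weaker 7 Hz one
fs = 100.0
t = np.arange(0, 10, 1 / fs)
x = 1.0 * np.sin(2 * np.pi * 2.0 * t) + 0.3 * np.sin(2 * np.pi * 7.0 * t)
print(dominant_frequency(x, fs))  # ≈ 2.0
```

In practice a window function (e.g. Hann) would be applied first to reduce leakage when the signal's frequencies do not fall exactly on FFT bins.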

  4. Analysis of Loss-of-Offsite-Power Events 1997-2015

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Nancy Ellen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on the operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.

  5. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    International Nuclear Information System (INIS)

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-01-01

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility

  6. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    Science.gov (United States)

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  7. Parallel Key Frame Extraction for Surveillance Video Service in a Smart City.

    Science.gov (United States)

    Zheng, Ran; Yao, Chuanwei; Jin, Hai; Zhu, Lei; Zhang, Qin; Deng, Wei

    2015-01-01

    Surveillance video service (SVS) is one of the most important services provided in a smart city. Designing efficient surveillance video analysis techniques is very important for the utilization of SVS, and key frame extraction is a simple yet effective technique for achieving this goal. In surveillance video applications, key frames are typically used to summarize important video content, so it is essential to extract them accurately and efficiently. A novel approach is proposed to extract key frames from traffic surveillance videos based on GPUs (graphics processing units) to ensure high efficiency and accuracy. For the determination of key frames, motion is a highly salient feature in presenting actions or events, especially in surveillance videos. The motion feature is extracted on the GPU to reduce running time. It is also smoothed to reduce noise, and the frames with local maxima of motion information are selected as the final key frames. The experimental results show that this approach can extract key frames more accurately and efficiently than several other methods.
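
A CPU-only sketch of this pipeline (the paper's implementation runs on GPU) might look as follows: score each frame by mean absolute difference from its predecessor, smooth the scores, and keep the frames at local maxima of the smoothed motion curve. The synthetic frames are invented for illustration.

```python
# Key-frame selection by smoothed frame-difference motion, CPU sketch.
import numpy as np

def key_frames(frames, window=3):
    frames = np.asarray(frames, dtype=float)
    # Motion score of frame i+1: mean absolute difference from frame i
    motion = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    # Smooth with a simple moving average to suppress noise
    kernel = np.ones(window) / window
    smooth = np.convolve(motion, kernel, mode="same")
    # Frames at local maxima of the smoothed motion curve
    return [i + 1 for i in range(1, len(smooth) - 1)
            if smooth[i] > smooth[i - 1] and smooth[i] >= smooth[i + 1]]

# Ten 4x4 synthetic frames, all black except a burst of change at frame 5
frames = [np.zeros((4, 4)) for _ in range(10)]
frames[5] = np.ones((4, 4))
print(key_frames(frames))
```

The GPU version parallelizes the per-pixel differencing step, which dominates the cost on real video resolutions; the maxima selection itself is cheap.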

  8. Parallel Key Frame Extraction for Surveillance Video Service in a Smart City.

    Directory of Open Access Journals (Sweden)

    Ran Zheng

    Full Text Available Surveillance video service (SVS) is one of the most important services provided in a smart city. Designing efficient surveillance video analysis techniques is very important for the utilization of SVS, and key frame extraction is a simple yet effective technique for achieving this goal. In surveillance video applications, key frames are typically used to summarize important video content, so it is essential to extract them accurately and efficiently. A novel approach is proposed to extract key frames from traffic surveillance videos based on GPUs (graphics processing units) to ensure high efficiency and accuracy. For the determination of key frames, motion is a highly salient feature in presenting actions or events, especially in surveillance videos. The motion feature is extracted on the GPU to reduce running time. It is also smoothed to reduce noise, and the frames with local maxima of motion information are selected as the final key frames. The experimental results show that this approach can extract key frames more accurately and efficiently than several other methods.

  9. Climate Change Risks – Methodological Framework and Case Study of Damages from Extreme Events in Cambodia

    DEFF Research Database (Denmark)

    Halsnæs, Kirsten; Kaspersen, Per Skougaard; Trærup, Sara Lærke Meltofte

    2016-01-01

    Climate change imposes some special risks on Least Developed Countries, and the chapter presents a methodological framework which can be used to assess the impacts of key assumptions related to damage costs, risks and equity implications on current and future generations. The methodological framework is applied to a case study of severe storms in Cambodia based on statistical information on past storm events, including information about buildings damaged and victims. Although limited data are available on the probability of severe storm events under climate change, as well as on the actual damage costs associated with the events in the case of Cambodia, we use the past storm events as proxy data in a sensitivity analysis. It is here demonstrated how key assumptions on future climate change, income levels of victims, and income distribution over time, reflected in discount rates

  10. Differential Fault Analysis on CLEFIA with 128, 192, and 256-Bit Keys

    Science.gov (United States)

    Takahashi, Junko; Fukunaga, Toshinori

    This paper describes a differential fault analysis (DFA) attack against CLEFIA. The proposed attack can be applied to CLEFIA with all supported keys: 128, 192, and 256-bit keys. DFA is a type of side-channel attack. This attack enables the recovery of secret keys by injecting faults into a secure device during its computation of the cryptographic algorithm and comparing the correct ciphertext with the faulty one. CLEFIA is a 128-bit blockcipher with 128, 192, and 256-bit keys developed by the Sony Corporation in 2007. CLEFIA employs a generalized Feistel structure with four data lines. We developed a new attack method that uses this characteristic structure of the CLEFIA algorithm. On the basis of the proposed attack, only 2 pairs of correct and faulty ciphertexts are needed to retrieve the 128-bit key, and 10.78 pairs on average are needed to retrieve the 192 and 256-bit keys. The proposed attack is more efficient than any previously reported. In order to verify the proposed attack and estimate the calculation time to recover the secret key, we conducted an attack simulation using a PC. The simulation results show that we can obtain each secret key within three minutes on average. This result shows that we can obtain the entire key within a feasible computational time.

  11. Urbanization and fertility: an event-history analysis of coastal Ghana.

    Science.gov (United States)

    White, Michael J; Muhidin, Salut; Andrzejewski, Catherine; Tagoe, Eva; Knight, Rodney; Reed, Holly

    2008-11-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relationship to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field.
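
The discrete-time machinery behind such an analysis can be sketched with a simple life-table estimator: expand each subject's exposure into yearly risk sets and estimate the hazard in each year as events divided by the number still at risk. The data below are invented; the study itself fits regression models on person-period records with covariates.

```python
# Discrete-time hazard estimation from (duration, event) pairs.
# Each record is (years observed until the event or censoring,
# whether the event was observed). Data are invented for illustration.

def discrete_hazard(records):
    """Return {year: hazard}, hazard = events / number at risk that year."""
    max_t = max(t for t, _ in records)
    hazard = {}
    for year in range(1, max_t + 1):
        at_risk = sum(1 for t, _ in records if t >= year)
        events = sum(1 for t, e in records if t == year and e)
        if at_risk:
            hazard[year] = events / at_risk
    return hazard

sample = [(1, True), (2, True), (2, False), (3, True), (3, False)]
print(discrete_hazard(sample))
```

Replacing the per-year ratio with a logistic regression on the same person-period records is what lets covariates such as urban residence, parity, and education enter the hazard, as in the article's analysis.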

  12. External Events PSA for the Paks NPP

    International Nuclear Information System (INIS)

    Bareith, Attila; Karsa, Zoltan; Siklossy, Tamas; Vida, Zoltan

    2014-01-01

    Initially, probabilistic safety assessment of external events was limited to the analysis of earthquakes for the Paks Nuclear Power Plant in Hungary. The level 1 seismic PSA was completed in 2002 showing a significant contribution of seismic failures to core damage risk. Although other external events of natural origin had previously been screened out from detailed plant PSA mostly on the basis of event frequencies, a review of recent experience on extreme weather phenomena made during the periodic safety review of the plant led to the initiation of PSA for external events other than earthquakes in 2009. In the meantime, the accident of the Fukushima Dai-ichi Nuclear Power Plant confirmed further the importance of such an analysis. The external event PSA for the Paks plant followed the commonly known steps: selection and screening of external hazards, hazard assessment for screened-in external events, analysis of plant response and fragility, PSA model development, and risk quantification and interpretation of results. As a result of event selection and screening the following weather related external hazards were subject to detailed analysis: extreme wind, extreme rainfall (precipitation), extreme snow, extremely high and extremely low temperatures, lightning, frost and ice formation. The analysis proved to be a significant challenge due to scarcity of data, lack of knowledge, as well as limitations of existing PSA methodologies. This paper presents an overview of the external events PSA performed for the Paks NPP. Important methodological aspects are summarised. Key analysis findings and unresolved issues that need further elaboration are highlighted. Development of external events PSA for the Paks NPP was completed by the end of 2012. The analysis followed the commonly known steps: selection and screening of external hazards, hazard assessment for screened-in external events, analysis of plant response and fragility, PSA model development, and risk

  13. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Science.gov (United States)

    Fedosimova, Anastasiya; Gaitinov, Adigam; Grushevskaya, Ekaterina; Lebedev, Igor

    2017-06-01

    In this work, a study of the peculiarities of multiparticle production in interactions of asymmetric nuclei is performed, searching for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of an analysis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC), and mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method according to the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: LC-type events are processes of full destruction of the projectile nucleus, in which multicharged fragments are absent, while in mixed-type events several multicharged fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions: the mean multiplicity of LC-type events is significantly higher than in mixed-type events. From the dependence of multiplicity on the number of target-nucleus fragments for events of various types, it is revealed that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e. nuclei strongly asymmetric in volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and disintegration of the projectile nucleus into several multicharged fragments.

  14. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Directory of Open Access Journals (Sweden)

    Fedosimova Anastasiya

    2017-01-01

    Full Text Available In this work, a study of the peculiarities of multiparticle production in interactions of asymmetric nuclei is performed, searching for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of an analysis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC), and mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method according to the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: LC-type events are processes of full destruction of the projectile nucleus, in which multicharged fragments are absent, while in mixed-type events several multicharged fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions: the mean multiplicity of LC-type events is significantly higher than in mixed-type events. From the dependence of multiplicity on the number of target-nucleus fragments for events of various types, it is revealed that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e. nuclei strongly asymmetric in volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target nucleus and disintegration of the projectile nucleus into several multicharged fragments.

  15. Analysis of convection-permitting simulations for capturing heavy rainfall events over Myanmar Region

    Science.gov (United States)

    Acierto, R. A. E.; Kawasaki, A.

    2017-12-01

    Perennial flooding due to heavy rainfall events has strong impacts on society and the economy. With the increasing pressures of rapid development and the potential for climate change impacts, Myanmar is experiencing a rapid increase in disaster risk. Heavy rainfall hazard assessment is key to quantifying such disaster risk under both current and future conditions. Downscaling using regional climate models (RCMs) such as the Weather Research and Forecasting model has been used extensively for assessing such heavy rainfall events. However, the use of convective parameterizations can introduce large errors in simulating rainfall. Convection-permitting simulations address this problem by increasing the resolution of RCMs to 4 km. This study focuses on heavy rainfall events during the wet seasons (May to September) of the six-year period 2010-2015 in Myanmar. The investigation primarily utilizes rain gauge observations to compare downscaled heavy rainfall events at 4 km resolution, driven by ERA-Interim boundary conditions through a 12 km to 4 km one-way nesting method. The study aims to provide a basis for the production of high-resolution climate projections over Myanmar to contribute to flood hazard and risk assessment.

  16. Synchronization Techniques in Parallel Discrete Event Simulation

    OpenAIRE

    Lindén, Jonatan

    2018-01-01

    Discrete event simulation is an important tool for evaluating system models in many fields of science and engineering. To improve the performance of large-scale discrete event simulations, several techniques to parallelize discrete event simulation have been developed. In parallel discrete event simulation, the work of a single discrete event simulation is distributed over multiple processing elements. A key challenge in parallel discrete event simulation is to ensure that causally dependent ...

  17. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS) as prescribed in DOE Order 232.2 “Occurrence Reporting and Processing of Operations Information” requires a quarterly analysis of events, both reportable and not reportable for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not reportable events) identified at the Idaho National Laboratory (INL) during the period of October 2012 through September 2013.

  18. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    Directory of Open Access Journals (Sweden)

    J. Ajayakumar

    2017-10-01

    Full Text Available With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data could be highly skewed based on the variations of population density with respect to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news, and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis on four major extreme events in the United States: the “North American storm complex” in December 2015, the “snowstorm Jonas” in January 2016, the “West Virginia floods” in June 2016, and the “Hurricane Matthew” in October 2016. The analysis is conducted on geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, while developing software solutions to support analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources, and to adapt techniques and enhance capabilities to mitigate the bias. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with

  19. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    Science.gov (United States)

    Ajayakumar, J.; Shook, E.; Turner, V. K.

    2017-10-01

    With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data could be highly skewed based on the variations of population density with respect to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news, and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis on four major extreme events in the United States: the "North American storm complex" in December 2015, the "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and the "Hurricane Matthew" in October 2016. The analysis is conducted on geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, while developing software solutions to support analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources, and to adapt techniques and enhance capabilities to mitigate the bias. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with social media data and will be useful
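The population-based normalization the two records above argue for can be reduced to a per-capita rate computation. The following is a minimal sketch (not the authors' SEDE implementation); the region names, counts, and populations are hypothetical:

```python
def tweets_per_capita(tweet_counts, populations, per=100_000):
    """Normalize raw geo-tagged tweet counts to a rate per `per` residents,
    reducing the population-density bias discussed above. Regions with no
    tweets get a rate of zero."""
    return {region: per * tweet_counts.get(region, 0) / populations[region]
            for region in populations}

# Hypothetical counts: a dense urban county vs. a sparse rural one.
counts = {"urban_county": 500, "rural_county": 20}
pops = {"urban_county": 1_000_000, "rural_county": 10_000}
rates = tweets_per_capita(counts, pops)
```

Here the rural county's 20 tweets yield a higher per-capita response rate (200 per 100,000) than the urban county's 500 (50 per 100,000), illustrating how raw counts alone would invert the apparent severity ranking.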

  20. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  1. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    This analysis report, ''Disruptive Event Biosphere Dose Conversion Factor Analysis'', is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  2. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    International Nuclear Information System (INIS)

    Mercurio, D.

    2011-01-01

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly based on Logic Trees. The evolution of these sequences is a result of the interactions between the crew and the plant; in current PRA methodologies, simplified models of these complex interactions are used. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used for the simulation of crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' is made up of three operators whose behavior is guided by a set of rules-of-behavior (which represent the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing the crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups. These techniques are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied for the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions or context of operator actions, which can be used in an HRA, in the analysis of the reliability of the actions. By providing information on the evolution of system indications, dynamics of cues, crew timing in performing procedure steps, situation

  3. Identification of fire modeling issues based on an analysis of real events from the OECD FIRE database

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, Dominik [Swiss Federal Nuclear Safety Inspectorate ENSI, Brugg (Switzerland)

    2017-03-15

    Precursor analysis is widely used in the nuclear industry to judge the significance of events relevant to safety. However, in the case of events that may damage equipment through effects that are not ordinary functional dependencies, the analysis may not always fully appreciate the potential for further evolution of the event. For fires, which are one class of such events, this paper discusses modelling challenges that need to be overcome when performing a probabilistic precursor analysis. The events analyzed are selected from the Organisation for Economic Co-operation and Development (OECD) Fire Incidents Records Exchange (FIRE) Database.

  4. Hazard analysis of typhoon-related external events using extreme value theory

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Chan; Jang, Seung Cheol [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lim, Tae Jin [Dept. of Industrial Information Systems Engineering, Soongsil University, Seoul (Korea, Republic of)

    2015-02-15

    After the Fukushima accident, the importance of hazard analysis for extreme external events was highlighted. To analyze typhoon-induced hazards, which are among the most significant disasters in East Asian countries, a statistical analysis using extreme value theory, a method for estimating the annual exceedance frequency of a rare event, was conducted to estimate occurrence intervals and hazard levels. For the four meteorological variables (maximum wind speed, instantaneous wind speed, hourly precipitation, and daily precipitation), the parameters of the predictive extreme value theory models were estimated. The 100-year return levels for each variable were predicted using the developed models and compared with previously reported values. It was also found that there exist significant long-term climate changes in wind speed and precipitation. A fragility analysis should be conducted to ensure the safety of a nuclear power plant at levels of wind speed and precipitation that exceed the results of the previous analysis.
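The 100-year return level mentioned in this record is the value exceeded on average once per century, obtained by inverting a fitted extreme value distribution. A minimal illustration, using a method-of-moments Gumbel (EV Type I) fit rather than the specific models of the paper, and an entirely hypothetical annual-maximum precipitation series:

```python
import math
import statistics

def gumbel_fit(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to an
    annual-maximum series: scale = s*sqrt(6)/pi, loc = mean - gamma*scale."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    scale = std * math.sqrt(6) / math.pi
    loc = mean - 0.5772156649 * scale  # Euler-Mascheroni constant
    return loc, scale

def return_level(loc, scale, T):
    """Level exceeded on average once every T years: solve F(x) = 1 - 1/T
    for the Gumbel CDF F(x) = exp(-exp(-(x - loc)/scale))."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual-maximum daily precipitation (mm), 20 years.
maxima = [310, 280, 355, 402, 290, 330, 375, 260, 410, 300,
          345, 385, 270, 320, 395, 305, 360, 285, 340, 370]
loc, scale = gumbel_fit(maxima)
level_100yr = return_level(loc, scale, 100)
```

The fitted `level_100yr` necessarily exceeds the 50-year level, reflecting that rarer events correspond to higher thresholds; real analyses would also quantify the fit uncertainty.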

  5. Antipsychotics, glycemic disorders, and life-threatening diabetic events: a Bayesian data-mining analysis of the FDA adverse event reporting system (1968-2004).

    Science.gov (United States)

    DuMouchel, William; Fram, David; Yang, Xionghu; Mahmoud, Ramy A; Grogg, Amy L; Engelhart, Luella; Ramaswamy, Krishnan

    2008-01-01

    This analysis compared diabetes-related adverse events associated with use of different antipsychotic agents. A disproportionality analysis of the US Food and Drug Administration (FDA) Adverse Event Reporting System (AERS) was performed. Data from the FDA postmarketing AERS database (1968 through first quarter 2004) were evaluated. Drugs studied included aripiprazole, clozapine, haloperidol, olanzapine, quetiapine, risperidone, and ziprasidone. Fourteen Medical Dictionary for Regulatory Activities (MedDRA) Primary Terms (MPTs) were chosen to identify diabetes-related adverse events; 3 groupings into higher-level descriptive categories were also studied. Three methods of measuring drug-event associations were used: proportional reporting ratio, the empirical Bayes data-mining algorithm known as the Multi-Item Gamma Poisson Shrinker, and logistic regression (LR) analysis. Quantitative measures of association strength, with corresponding confidence intervals, between drugs and specified adverse events were computed and graphed. Some of the LR analyses were repeated separately for reports from patients under and over 45 years of age. Differences in association strength were declared statistically significant if the corresponding 90% confidence intervals did not overlap. Association with various glycemic events differed for different drugs. On average, the rankings of association strength agreed with the following ordering: low association, ziprasidone, aripiprazole, haloperidol, and risperidone; medium association, quetiapine; and strong association, clozapine and olanzapine. The median rank correlation between the above ordering and the 17 sets of LR coefficients (1 set for each glycemic event) was 93%. Many of the disproportionality measures were significantly different across drugs, and ratios of disproportionality factors of 5 or more were frequently observed. There are consistent and substantial differences between atypical antipsychotic drugs in the
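Of the three association measures this record names, the proportional reporting ratio (PRR) is the simplest: the proportion of a drug's reports that mention the event, divided by the same proportion for all other drugs. A minimal sketch with hypothetical counts (not data from the AERS study):

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR from a 2x2 table of spontaneous reports:
    a = reports of the event for the drug of interest
    b = all other reports for that drug
    c = reports of the event for all other drugs
    d = all other reports for all other drugs
    """
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: 20 of 1,000 reports for the drug mention the event,
# vs. 100 of 100,000 reports for all other drugs.
prr = proportional_reporting_ratio(20, 980, 100, 99_900)
```

A PRR near 1 suggests no disproportionality; the Bayesian shrinkage method also used in the study (Multi-Item Gamma Poisson Shrinker) additionally damps spuriously large ratios arising from small counts.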

  6. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    Science.gov (United States)

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear, dose-response, heterogeneity, publication bias, subgroup, and meta-regression analysis were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD events risk (0.91, 0.85 to 0.97) per 0.1 mEq/L (Pnonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (Pnonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD events risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
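The pooled relative risks reported above come from combining per-study estimates; the standard fixed-effect approach is inverse-variance weighting on the log scale, with each study's standard error recovered from its 95% confidence interval. A minimal sketch with made-up study values, not the meta-analysis's actual data or its (random-effects) model choice:

```python
import math

def pooled_relative_risk(rrs, cis):
    """Fixed-effect inverse-variance pooling of relative risks.
    rrs: per-study relative risks; cis: matching 95% CIs as (lo, hi).
    SE of each log-RR is (ln hi - ln lo) / (2 * 1.96)."""
    logs, weights = [], []
    for rr, (lo, hi) in zip(rrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        logs.append(math.log(rr))
        weights.append(1.0 / se ** 2)
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Two hypothetical cohort studies of high vs. low magnesium intake.
rr, lo, hi = pooled_relative_risk([0.80, 0.90],
                                  [(0.70, 0.92), (0.75, 1.08)])
```

The pooled estimate always lands between the individual study estimates, weighted toward the more precise (narrower-CI) study; a random-effects version would add a between-study variance term to each weight.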

  7. In situ simulation: Taking reported critical incidents and adverse events back to the clinic

    DEFF Research Database (Denmark)

    Juul, Jonas; Paltved, Charlotte; Krogh, Kristian

    2014-01-01

    for content analysis [4] and thematic analysis [5]. Medical experts and simulation faculty will design scenarios for in situ simulation training based on the analysis. Short-term observations using time logs will be performed along with interviews with key informants at the departments. Video data will be collected...... improve patient safety if coupled with training and organisational support [2]. Insight into the nature of reported critical incidents and adverse events can be used in writing in situ simulation scenarios and thus lead to interventions that enhance patient safety. The patient safety literature emphasises...... well-developed non-technical skills in preventing medical errors [3]. Furthermore, critical incidents and adverse events reporting systems comprise a knowledge base to gain in-depth insights into patient safety issues. This study explores the use of critical incidents and adverse events reports to inform...

  8. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self- Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  9. Probabilistic Dynamics for Integrated Analysis of Accident Sequences considering Uncertain Events

    Directory of Open Access Journals (Sweden)

    Robertas Alzbutas

    2015-01-01

    Full Text Available Analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. However, this is an issue in currently used probabilistic safety assessment: the lack of treatment of dynamic interactions between the physical processes on one hand and random events on the other limits the assessment. In general, there are many mathematical modelling theories, which can be used separately or integrated in order to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach of non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test case simulation, with a view to applying the various methods to severe accident scenario simulation and uncertainty analysis. For this, and for a wider analysis of accident sequences, the initial test case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and representation of their uncertainty. The developed approach of accident modelling and analysis can be efficiently used to estimate the reliability of hybrid systems and at the same time to analyze and possibly decrease the uncertainty of this estimate.

  10. Trend analysis of human error events and assessment of their proactive prevention measure at Rokkasho reprocessing plant

    International Nuclear Information System (INIS)

    Yamazaki, Satoru; Tanaka, Izumi; Wakabayashi, Toshio

    2012-01-01

    A trend analysis of human error events is important for preventing the recurrence of human error events. We propose a new method for identifying common characteristics from the results of trend analysis, such as the latent weaknesses of an organization, and a management process for strategic error prevention. In this paper, we describe a trend analysis method for human error events that have been accumulated in the organization and the utilization of the results of trend analysis to prevent accidents proactively. Although the systematic analysis of human error events, the monitoring of their overall trend, and the utilization of the analyzed results have been examined for plant operation, such information has never been utilized completely. Sharing information on human error events and analyzing their causes lead to the clarification of problems in management and human factors. This new method was applied to the human error events that occurred in the Rokkasho reprocessing plant from October 2010. Results revealed that the output of this method is effective in judging the error prevention plan and that the number of human error events was reduced to about 50% of that observed in 2009 and 2010. (author)

  11. The January 2001, El Salvador event: a multi-data analysis

    Science.gov (United States)

    Vallee, M.; Bouchon, M.; Schwartz, S. Y.

    2001-12-01

    On January 13, 2001, a large normal-faulting event (Mw=7.6) occurred 100 kilometers away from the Salvadorian coast (Central America) with a centroid depth of about 50 km. The size of this event is surprising according to the classical idea that such events have to be much weaker than thrust events in subduction zones. We analysed this earthquake with different types of data: because teleseismic waves are the only data which offer good azimuthal coverage, we first built a kinematic source model with P and SH waves provided by the IRIS-GEOSCOPE networks. The ambiguity between the 30° plane (plunging toward the Pacific Ocean) and the 60° plane (plunging toward Central America) led us to conduct a parallel analysis of the two possible planes. We used a simple point-source modelling in order to define the main characteristics of the event and then used an extended source to retrieve the kinematic features of the rupture. For the two possible planes, this analysis reveals a downdip and northwest rupture propagation, but the difference of fit remains subtle even when using the extended source. In a second part we confronted our models for the two planes with other seismological data, which are (1) regional data, (2) surface wave data through an Empirical Green Function given by a similar but much weaker earthquake which occurred in July 1996, and lastly (3) near-field data provided by Universidad Centroamericana (UCA) and Centro de Investigaciones Geotecnicas (CIG). Regional data do not allow discrimination between the two planes either, but surface waves and especially near-field data confirm that the fault plane is the steepest one, plunging toward Central America. Moreover, the slight directivity toward the north is confirmed by surface waves.

  12. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    Science.gov (United States)

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
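Among the center-adjustment methods this record reviews, the Mantel-Haenszel estimator illustrates the basic mechanics of stratifying a treatment effect by center. A minimal sketch of the classical M-H odds ratio over per-center 2x2 tables, with hypothetical counts (this is an illustration of stratified adjustment in general, not of the trials reviewed):

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel pooled odds ratio across strata (centers).
    Each stratum is a 2x2 table (a, b, c, d):
    a, b = events / non-events in the treatment arm,
    c, d = events / non-events in the control arm."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical trial with two centers, each with very few events --
# the sparse-strata setting the paper is concerned with.
strata = [(3, 97, 1, 99), (2, 98, 2, 98)]
pooled_or = mantel_haenszel_or(strata)
```

Each stratum contributes to the pooled numerator and denominator rather than requiring its own odds-ratio estimate, which is why methods like this degrade more gracefully than per-center models; the paper's recommended alternatives (random-effects models, GEEs) instead model the center effect explicitly.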

  13. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and a classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. Differences in the abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.

  14. Detection of Abnormal Events via Optical Flow Feature Analysis

    Science.gov (United States)

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and a classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. Differences in the abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
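The descriptor at the heart of the two records above bins optical-flow vectors by orientation. A minimal pure-Python stand-in (the paper's actual descriptor and its magnitude weighting may differ; flow vectors here are assumed precomputed, e.g. by a dense optical-flow routine):

```python
import math

def orientation_histogram(flow, bins=8):
    """Magnitude-weighted histogram of optical-flow orientations for one
    frame. `flow` is a list of (u, v) flow vectors; the result is
    L1-normalized so frames with different motion energy are comparable."""
    hist = [0.0] * bins
    for u, v in flow:
        mag = math.hypot(u, v)
        if mag == 0.0:
            continue  # static pixels carry no orientation information
        angle = math.atan2(v, u) % (2 * math.pi)
        hist[int(angle / (2 * math.pi) * bins) % bins] += mag
    total = sum(hist) or 1.0
    return [h / total for h in hist]
```

A one-class SVM would then be trained on such histograms from "normal" frames, flagging frames whose descriptor falls outside the learned region.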

  15. Preterm Versus Term Children: Analysis of Sedation/Anesthesia Adverse Events and Longitudinal Risk.

    Science.gov (United States)

    Havidich, Jeana E; Beach, Michael; Dierdorf, Stephen F; Onega, Tracy; Suresh, Gautham; Cravero, Joseph P

    2016-03-01

    Preterm and former preterm children frequently require sedation/anesthesia for diagnostic and therapeutic procedures. Our objective was to determine the age at which children born preterm are no longer at increased risk for sedation/anesthesia adverse events. Our secondary objective was to describe the nature and incidence of adverse events. This is a prospective observational study of children receiving sedation/anesthesia for diagnostic and/or therapeutic procedures outside of the operating room by the Pediatric Sedation Research Consortium. A total of 57,227 patients 0 to 22 years of age were eligible for this study. All adverse events and descriptive terms were predefined. Logistic regression and locally weighted scatterplot regression were used for analysis. Preterm and former preterm children had higher adverse event rates (14.7% vs 8.5%) compared with children born at term. Our analysis revealed a biphasic pattern for the development of adverse sedation/anesthesia events. Airway and respiratory adverse events were most commonly reported. MRI scans were the most commonly performed procedures in both categories of patients. Patients born preterm are nearly twice as likely to develop sedation/anesthesia adverse events, and this risk continues up to 23 years of age. We recommend obtaining birth history during the formulation of an anesthetic/sedation plan, with heightened awareness that preterm and former preterm children may be at increased risk. Further prospective studies focusing on the etiology and prevention of adverse events in former preterm patients are warranted. Copyright © 2016 by the American Academy of Pediatrics.

  16. Root cause analysis of relevant events

    International Nuclear Information System (INIS)

    Perez, Silvia S.; Vidal, Patricia G.

    2000-01-01

    During 1998 the research work followed more specific guidelines, which entailed focusing exclusively on the two selected methods (ASSET and HPIP) and incorporating some additional human behaviour elements based on the documents of reference. Once resident inspectors were incorporated in the project (and trained accordingly), events occurring in Argentine nuclear power plants were analysed. Several events were analysed (all of them from the Atucha I and Embalse nuclear power plants), and it was concluded that the systematic methodology used also allows the investigation of minor events that were precursors of the selected events. (author)

  17. EVNTRE, Code System for Event Progression Analysis for PRA

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions for the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases of time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. 
Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
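The traversal described above — supplying split fractions that sum to unity at each question, multiplying them along each path, and pruning paths below a user-specified cutoff tolerance — can be sketched as follows (the input format and names are illustrative, not EVNTRE's actual syntax):

```python
def enumerate_paths(tree, init_freq=1.0, cutoff=1e-3):
    """Depth-first enumeration of event-tree leaves.
    `tree` is a list of questions; each question is a list of
    (branch_label, split_fraction) pairs whose fractions sum to unity.
    Paths whose running frequency falls below `cutoff` are pruned."""
    leaves = []

    def walk(level, path, freq):
        if freq < cutoff:              # cutoff tolerance terminates exploration
            return
        if level == len(tree):
            leaves.append((tuple(path), freq))
            return
        for label, fraction in tree[level]:
            walk(level + 1, path + [label], freq * fraction)

    walk(0, [], init_freq)
    return leaves
```

With `cutoff=0.0` every leaf is enumerated and the leaf frequencies sum to the initiating frequency; raising the cutoff drops low-frequency pathways, mirroring the tolerance described in the abstract.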

  18. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
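The simulation loop described above — executing events through a time-and-event schema until the event queue in the simulator is emptied — is the heart of any discrete event kernel. A minimal illustrative version (not the disclosed tool's implementation):

```python
import heapq

class Simulator:
    """Minimal discrete-event kernel: a time-ordered event queue
    that is processed until it is emptied."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0          # tie-breaker for events scheduled at equal times
        self.log = []

    def schedule(self, delay, action, *args):
        """Queue `action` to fire `delay` time units from now."""
        heapq.heappush(self._queue, (self.now + delay, self._seq, action, args))
        self._seq += 1

    def run(self):
        """Pop the earliest event, advance the clock, execute, repeat."""
        while self._queue:
            self.now, _, action, args = heapq.heappop(self._queue)
            action(self, *args)
```

Because an action receives the simulator, it can schedule further events, which is how continuous behavior is approximated discretely via invocation statements and time delays.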

  19. Event dependent sampling of recurrent events

    DEFF Research Database (Denmark)

    Kvist, Tine Kajsa; Andersen, Per Kragh; Angst, Jules

    2010-01-01

    The effect of event-dependent sampling of processes consisting of recurrent events is investigated when analyzing whether the risk of recurrence increases with event count. We study the situation where processes are selected for study if an event occurs in a certain selection interval. Motivation...... retrospective and prospective disease course histories are used. We examine two methods to correct for the selection depending on which data are used in the analysis. In the first case, the conditional distribution of the process given the pre-selection history is determined. In the second case, an inverse...

  20. First Dutch Process Control Security Event

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2008-01-01

    On May 21st , 2008, the Dutch National Infrastructure against Cyber Crime (NICC) organised their first Process Control Security Event. Mrs. Annemarie Zielstra, the NICC programme manager, opened the event. She welcomed the over 100 representatives of key industry sectors. “Earlier studies in the

  1. Formal Analysis of Key Integrity in PKCS#11

    Science.gov (United States)

    Falcone, Andrea; Focardi, Riccardo

    PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.

  2. Analysis of area events as part of probabilistic safety assessment for Romanian TRIGA SSR 14 MW reactor

    International Nuclear Information System (INIS)

    Mladin, D.; Stefan, I.

    2005-01-01

    The international experience has shown that the external events could be an important contributor to plant/ reactor risk. For this reason such events have to be included in the PSA studies. In the context of PSA for nuclear facilities, external events are defined as events originating from outside the plant, but with the potential to create an initiating event at the plant. To support plant safety assessment, PSA can be used to find methods for identification of vulnerable features of the plant and to suggest modifications in order to mitigate the impact of external events or the producing of initiating events. For that purpose, probabilistic assessment of area events concerning fire and flooding risk and impact is necessary. Due to the relatively large power level amongst research reactors, the approach to safety analysis of Romanian 14 MW TRIGA benefits from an ongoing PSA project. In this context, treatment of external events should be considered. The specific tasks proposed for the complete evaluation of area event analysis are: identify the rooms important for facility safety, determine a relative area event risk index for these rooms and a relative area event impact index if the event occurs, evaluate the rooms specific area event frequency, determine the rooms contribution to reactor hazard state frequencies, analyze power supply and room dependencies of safety components (as pumps, motor operated valves). The fire risk analysis methodology is based on Berry's method [1]. This approach provides a systematic procedure to carry out a relative index of different rooms. The factors, which affect the fire probability, are: personal presence in the room, number and type of ignition sources, type and area of combustibles, fuel available in the room, fuel location, and ventilation. The flooding risk analysis is based on the amount of piping in the room. For accuracy of the information regarding piping a facility walk-about is necessary. In case of flooding risk

  3. Probabilistic safety analysis for fire events for the NPP Isar 2

    International Nuclear Information System (INIS)

    Schmaltz, H.; Hristodulidis, A.

    2007-01-01

    The 'Probabilistic Safety Analysis for Fire Events' (Fire-PSA KKI2) for the NPP Isar 2 was performed in addition to the PSA for full power operation and considers all possible events that can be initiated by a fire. The aim of the plant-specific Fire-PSA was to perform a state-of-the-art quantitative assessment of fire events during full power operation. Based on simplified assumptions regarding fire-induced failures, the influence of system and component failures on the frequency of core damage states was analysed. The Fire-PSA considers, on the one hand, events in which fire-induced equipment failures result in a SCRAM and, on the other hand, events that have no direct operational effects but in which the plant is shut down as a precautionary measure because of the fire-induced failure of safety-related installations. The latter events are considered because, owing to the reduced redundancy of safety-related systems during plant shutdown, they may have a non-negligible influence on the frequency of core damage states if further failures occur. (orig.)

  4. The first SEPServer event catalogue ~68-MeV solar proton events observed at 1 AU in 1996–2010

    Directory of Open Access Journals (Sweden)

    Rodríguez-Gasén Rosa

    2013-03-01

    Full Text Available SEPServer is a three-year collaborative project funded by the seventh framework programme (FP7-SPACE of the European Union. The objective of the project is to provide access to state-of-the-art observations and analysis tools for the scientific community on solar energetic particle (SEP events and related electromagnetic (EM emissions. The project will eventually lead to better understanding of the particle acceleration and transport processes at the Sun and in the inner heliosphere. These processes lead to SEP events that form one of the key elements of space weather. In this paper we present the first results from the systematic analysis work performed on the following datasets: SOHO/ERNE, SOHO/EPHIN, ACE/EPAM, Wind/WAVES and GOES X-rays. A catalogue of SEP events at 1 AU, with complete coverage over solar cycle 23, based on high-energy (~68-MeV protons from SOHO/ERNE and electron recordings of the events by SOHO/EPHIN and ACE/EPAM are presented. A total of 115 energetic particle events have been identified and analysed using velocity dispersion analysis (VDA for protons and time-shifting analysis (TSA for electrons and protons in order to infer the SEP release times at the Sun. EM observations during the times of the SEP event onset have been gathered and compared to the release time estimates of particles. Data from those events that occurred during the European day-time, i.e., those that also have observations from ground-based observatories included in SEPServer, are listed and a preliminary analysis of their associations is presented. We find that VDA results for protons can be a useful tool for the analysis of proton release times, but if the derived proton path length is out of a range of 1 AU < s ≲ 3 AU, the result of the analysis may be compromised, as indicated by the anti-correlation of the derived path length and release time delay from the associated X-ray flare. The average path length derived from VDA is about 1.9 times
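Velocity dispersion analysis (VDA) as used above rests on the relation t_onset(v) = t_release + s/v: a straight-line fit of onset time against inverse particle speed yields the solar release time as the intercept and the apparent path length s as the slope. A minimal illustrative least-squares fit (the units and data are invented, not SEPServer's):

```python
def velocity_dispersion_analysis(onsets):
    """Least-squares fit of onset time versus inverse speed.
    `onsets` is a list of (speed in AU/min, onset time in min) pairs;
    returns (release time in min, apparent path length s in AU)."""
    xs = [1.0 / v for v, _ in onsets]
    ys = [t for _, t in onsets]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope   # intercept = release time, slope = s
```

On synthetic dispersion-free data the fit recovers the release time and path length exactly; with real onsets, a fitted path length far outside 1 AU < s ≲ 3 AU signals, as the abstract notes, that the result may be compromised.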

  5. Event Registration System for INR Linac

    International Nuclear Information System (INIS)

    Grekhov, O.V.; Drugakov, A.N.; Kiselev, Yu.V.

    2006-01-01

    The software of the Event Registration System for linear accelerators is described. The system receives information on changes in the accelerator's operating modes and supervises hundreds of key parameters of the accelerator's various subsystems. The Event Registration System consists of event sources and event listeners. The sources of events are subroutines built into the existing ACS of the Linac. The listeners of events are the Supervisor and Client ERS programs, which warn the operator about changes in controlled parameters of the accelerator

  6. Developing Health-Based Pre-Planning Clearance Goals for Airport Remediation Following Chemical Terrorist Attack: Introduction and Key Assessment Considerations

    OpenAIRE

    Watson, Annetta; Hall, Linda; Raber, Ellen; Hauschild, Veronique D.; Dolislager, Fredrick; Love, Adam H.; Hanna, M. Leslie

    2011-01-01

    In the event of a chemical terrorist attack on a transportation hub, post-event remediation and restoration activities necessary to attain unrestricted facility reuse and re-entry could require hours to multiple days. While restoration timeframes are dependent on numerous variables, a primary controlling factor is the level of pre-planning and decision-making completed prior to chemical terrorist release. What follows is the first of a two-part analysis identifying key considerations, critica...

  7. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation, unlike other tools for the analysis of manufacturing systems, takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause production plans never to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.

  8. Corrective action program at the Krsko NPP. Trending and analysis of minor events

    International Nuclear Information System (INIS)

    Bach, B.; Kavsek, D.

    2007-01-01

    Industry and on-site operating experience has shown that significant events, minor events and near misses all share something in common: latent weaknesses that result in failed barriers, and the same or similar (root) causes for those failures. These types of events differ only in their resulting consequences: minor events and near misses have no immediate or significant impact on plant safety or reliability. Significant events, however, are usually preceded by a number of such events, and could be prevented from occurring if the root cause(s) of these precursor events could be identified and eliminated. It would therefore be poor management to leave minor events and near misses unreported and unanalysed. Reporting and analysing minor events allows the detection of latent weaknesses that may indicate the need for improvement. The benefit of low-level event analysis is that deficiencies can be found in barriers that normally go unchallenged, and that may not be known to be ineffective in stopping a significant event. In addition, a large number of minor events and near misses may increase the probability of occurrence of a significant event, which in itself is sufficient reason for addressing these types of events. However, as it is often neither practical nor feasible to perform a detailed root cause determination for every minor event, trending and trend analysis are used to identify and correct the causes before they result in a significant event. Trending is monitoring a change in the frequency of occurrence of similar minor events. An adverse trend is an increase in the frequency of minor events sorted by commonality, such as common equipment failure, human factors, common or similar causal factors, or activity, or a worsening performance of processes being trended.
The primary goal of any trending programme should be to identify an adverse trend early enough that the operating organization can initiate an investigation to help
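The trending approach described above — sorting minor events by commonality and flagging an increase in their frequency of occurrence — can be illustrated with a toy counter. The category names, time split and comparison rule below are invented for illustration:

```python
from collections import defaultdict

def adverse_trends(events, split):
    """`events` is a list of (month, category) pairs. Categories whose
    event count from month `split` onward exceeds their count in the
    earlier months are flagged as adverse trends worth investigating."""
    before, after = defaultdict(int), defaultdict(int)
    for month, cat in events:
        (after if month >= split else before)[cat] += 1
    return sorted(cat for cat in after if after[cat] > before[cat])
```

A category that suddenly accumulates reports in recent months is flagged even though no single event was significant, which is exactly the early-warning role trending plays in the corrective action programme.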

  9. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    Science.gov (United States)

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  10. A hydrological analysis of the 4 November 2011 event in Genoa

    Directory of Open Access Journals (Sweden)

    F. Silvestro

    2012-09-01

    Full Text Available On the 4 November 2011 a flash flood event hit the area of Genoa with dramatic consequences. Such an event represents, from the meteorological and hydrological perspective, a paradigm of flash floods in the Mediterranean environment.

    The hydro-meteorological probabilistic forecasting system for small and medium size catchments in use at the Civil Protection Centre of Liguria region exhibited excellent performance for the event, predicting, 24–48 h in advance, the potential level of risk associated with the forecast. It greatly helped the decision makers in issuing a timely and correct alert.

    In this work we present the operational outputs of the system provided during the Liguria events and the post event hydrological modelling analysis that has been carried out accounting also for the crowd sourcing information and data. We discuss the benefit of the implemented probabilistic systems for decision-making under uncertainty, highlighting how, in this case, the multi-catchment approach used for predicting floods in small basins has been crucial.

  11. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated over a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we use the concept of rain event. In fact, the discrete and intermittent nature of rain processes makes some features inadequate when defined over a fixed duration. Long integration times (hour, day) mix rainy and clear-air periods in the same sample; short integration times (seconds, minutes) lead to noisy data with great sensitivity to detector characteristics. Analysing whole rain events instead of individual short samples of fixed duration clarifies relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses part of the intra-event variability due to measurement uncertainties and allows focusing on physical processes. An algorithm based on a genetic algorithm (GA) and self-organising maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of a self-organising map is justified by the fact that it maps a high-dimensional data space into a two-dimensional space while preserving the initial space topology as much as possible, in an unsupervised way. The obtained SOM reveals the dependencies between variables and consequently allows removing redundant variables, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features the corresponding SOM is analysed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature.
This confirms
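As a rough sketch of the SOM idea used above — mapping a high-dimensional data space onto a low-dimensional grid while preserving topology, in an unsupervised way — here is a tiny one-dimensional map. The grid size, learning schedule and data are invented, not the authors' configuration:

```python
import math
import random

def best_match(nodes, vec):
    """Index of the node whose weight vector is closest to `vec`."""
    return min(range(len(nodes)),
               key=lambda i: sum((w - x) ** 2 for w, x in zip(nodes[i], vec)))

def train_som(data, grid=4, epochs=200, seed=1):
    """Tiny 1-D self-organizing map: the best-matching node and its
    grid neighbours are pulled toward each sample, with learning rate
    and neighbourhood radius shrinking over the epochs."""
    random.seed(seed)
    dim = len(data[0])
    nodes = [[random.random() for _ in range(dim)] for _ in range(grid)]
    for epoch in range(epochs):
        lr = 0.5 * (1 - epoch / epochs)
        radius = max(1.0, grid / 2 * (1 - epoch / epochs))
        for vec in data:
            bmu = best_match(nodes, vec)
            for i in range(grid):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                nodes[i] = [w + lr * h * (v - w)
                            for w, v in zip(nodes[i], vec)]
    return nodes
```

After training on two well-separated clusters, different clusters activate different map nodes; in the feature-selection setting, inspecting which nodes co-activate for two features is what reveals their dependence or redundancy.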

  12. Geohazard assessment through the analysis of historical alluvial events in Southern Italy

    Science.gov (United States)

    Esposito, Eliana; Violante, Crescenzo

    2015-04-01

    The risk associated with extreme water events such as flash floods results from a combination of overflow and landslide hazards. A multi-hazard approach has been utilized to analyze the 1773 flood, which occurred in conjunction with heavy rainfall, causing major damage in terms of lost lives and economic cost over an area of 200 km2, including both the coastal strip between Salerno and Maiori and the Apennine hinterland, Campania region - Southern Italy. This area has been affected by a total of 40 flood events over the last five centuries, 26 of them between 1900 and 2000. Streamflow events have produced severe impacts on Cava de' Tirreni (SA) and its territory; in particular, four catastrophic floods in 1581, 1773, 1899 and 1954 caused a pervasive pattern of destruction. In the study area, rainstorm events typically occur in small and medium-sized fluvial systems, characterized by small catchment areas and high-elevation drainage basins, causing the detachment of large amounts of volcaniclastic and siliciclastic covers from the carbonate bedrock. The mobilization of these deposits (slope debris) mixed with rising floodwaters along the water paths can produce fast-moving streamflows of large proportions with significant hazardous implications (Violante et al., 2009). In this context the study of the 1773 historical flood allows the detection and definition of those areas where catastrophic events have repeatedly taken place over time. Moreover, it improves the understanding of the phenomena themselves, including some key elements in the management of risk mitigation, such as the restoration of the damage suffered by buildings and/or the environmental effects caused by the floods.

  13. Analysis of events related to cracks and leaks in the reactor coolant pressure boundary

    Energy Technology Data Exchange (ETDEWEB)

    Ballesteros, Antonio, E-mail: Antonio.Ballesteros-Avila@ec.europa.eu [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Sanda, Radian; Peinador, Miguel; Zerger, Benoit [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Negri, Patrice [IRSN: Institut de Radioprotection et de Sûreté Nucléaire (France); Wenke, Rainer [GRS: Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH (Germany)

    2014-08-15

    Highlights: • The important role of Operating Experience Feedback is emphasised. • Events related to cracks and leaks in the reactor coolant pressure boundary are analysed. • A methodology for event investigation is described. • Some illustrative results of the analysis of events for specific components are presented. - Abstract: The presence of cracks and leaks in the reactor coolant pressure boundary may jeopardise the safe operation of nuclear power plants. The analysis of events related to cracks and leaks is an important task for the prevention of their recurrence, and should be performed in the context of activities on Operating Experience Feedback. In response to this concern, the EU Clearinghouse operated by the JRC-IET supports and develops technical and scientific work to disseminate the lessons learned from past operating experience. In particular, concerning cracks and leaks, the studies carried out in collaboration with IRSN and GRS have made it possible to identify the areas of the plant primary system most sensitive to degradation and to elaborate recommendations for upgrading the maintenance, ageing management and inspection programmes. An overview of the methodology used in the analysis of events related to cracks and leaks is presented in this paper, together with the relevant results obtained in the study.

  14. Integration of risk matrix and event tree analysis: a natural stone ...

    Indian Academy of Sciences (India)

    M Kemal Özfirat

    2017-09-27

    Sep 27, 2017 ... Different types of accidents may occur in natural stone facilities during movement, dimensioning, cutting ... are numerous risk analysis methods such as preliminary ..... machine type and maintenance (MM) event, block control.

  15. Making sense of root cause analysis investigations of surgery-related adverse events.

    Science.gov (United States)

    Cassin, Bryce R; Barach, Paul R

    2012-02-01

    This article discusses the limitations of root cause analysis (RCA) for surgical adverse events. Making sense of adverse events involves an appreciation of the unique features in a problematic situation, which resist generalization to other contexts. The top priority of adverse event investigations must be to inform the design of systems that help clinicians to adapt and respond effectively in real time to undesirable combinations of design, performance, and circumstance. RCAs can create opportunities in the clinical workplace for clinicians to reflect on local barriers and identify enablers of safe and reliable outcomes. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this

  17. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. Wasiolek

    2004-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this analysis was to develop the BDCFs for the volcanic ash

  18. Extensive Analysis of Worldwide Events Related to The Construction and Commissioning of Nuclear Power Plants: Lessons Learned and Recommendations

    International Nuclear Information System (INIS)

    Noel, M.; Zerger, B.; Vuorio, U.

    2011-01-01

    Lessons learnt from past experience are extensively used to improve the safety of nuclear power plants (NPPs) worldwide. Although the process of analyzing operational experience is now widespread and well developed, the need to establish a similar process for construction experience was highlighted by several countries embarking on the construction of new NPPs and in some international forums, including the Working Group on the Regulation of New Reactors (WGRNR) of the OECD-NEA. In 2008, EU Member State Safety Authorities participating in the EU Clearinghouse on Operational Experience Feedback decided to launch a topical study on events related to the pre-operational stages of NPPs. The aim of this topical study is to reduce the recurrence of events related to the construction, initial component manufacturing, and commissioning of NPPs by identifying the main recurring and safety-significant issues. For this study, 1090 IRS event reports, 857 US Licensee Event Reports (LERs) and approximately 100 WGRNR reports were preselected based on keyword searches and screened. The screening period starts from the beginning of the databases' operation (the 1980s for the IRS and LER databases) and ends in November 2009. After this initial screening, a total of 582 reports were found applicable (247 IRS reports, 309 LERs and 26 WGRNR reports). The events considered for this study were those initiated before the start of commercial operation and detected before, or even long after, commercial operation. The events have been classified into 3 main categories (construction, manufacturing and commissioning), and into further sub-categories (building structures, metallic liners, electrical components, anchors, I&C, penetrations and building seals, emergency diesel generators, pipes, valves, welds, pumps, etc.) in order to facilitate the detailed analysis, with the final objective of formulating both equipment specific

  19. Centrality of event across cultures. Emotionally positive and negative events in Mexico, China, Greenland, and Denmark

    DEFF Research Database (Denmark)

    Zaragoza Scherman, Alejandra; Salgado, Sinué; Shao, Zhifang

    During their lifetime, people experience both emotionally positive and negative events. The Centrality of Event Scale (CES; Berntsen and Rubin, 2006; Berntsen, Rubin and Siegler, 2011) measures the extent to which an event is central to someone’s identity and life story. An event becomes central...... disorder (PTSD) and depression symptoms: Participants with higher PTSD and depression scores reported that a traumatic or negative event was highly central to their identity and life story; and 3) A significant number of positive events occurred during participants’ adolescence and early adulthood, while...... an emotional event into our life story and our identity. Key findings: 1) Positive events are rated as more central to identity than negative events; 2) The extent to which highly traumatic and negative events become central to a person’s life story and identity varies as a function of post-traumatic stress...

  20. Event-by-event jet quenching

    Energy Technology Data Exchange (ETDEWEB)

    Fries, R.J.; Rodriguez, R.; Ramirez, E.

    2010-08-14

    High momentum jets and hadrons can be used as probes for the quark gluon plasma (QGP) formed in nuclear collisions at high energies. We investigate the influence of fluctuations in the fireball on jet quenching observables by comparing propagation of light quarks and gluons through averaged, smooth QGP fireballs with event-by-event jet quenching using realistic inhomogeneous fireballs. We find that the transverse momentum and impact parameter dependence of the nuclear modification factor R_AA can be fit well in an event-by-event quenching scenario within experimental errors. However the transport coefficient q̂ extracted from fits to the measured nuclear modification factor R_AA in averaged fireballs underestimates the value from event-by-event calculations by up to 50%. On the other hand, after adjusting q̂ to fit R_AA in the event-by-event analysis we find residual deviations in the azimuthal asymmetry v_2 and in two-particle correlations, that provide a possible faint signature for a spatial tomography of the fireball. We discuss a correlation function that is a measure for spatial inhomogeneities in a collision and can be constrained from data.
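The nuclear modification factor at the center of this record has a simple general form, R_AA(pT) = (dN_AA/dpT) / (N_coll · dN_pp/dpT). A minimal sketch with an invented power-law pp spectrum and a flat suppression factor (illustrative numbers, not the paper's data):

```python
# Illustrative sketch of the nuclear modification factor
# R_AA(pT) = (dN_AA/dpT) / (N_coll * dN_pp/dpT).
# Spectra and N_coll below are made-up numbers, not data from the paper.

def r_aa(yield_aa, yield_pp, n_coll):
    """Nuclear modification factor for one pT bin."""
    return yield_aa / (n_coll * yield_pp)

n_coll = 1000.0                                   # binary-collision scaling
pt_bins = [2.0, 5.0, 10.0, 20.0]                  # GeV
pp_spectrum = [pt ** -6 for pt in pt_bins]        # toy power-law pp yield
aa_spectrum = [n_coll * y * 0.2 for y in pp_spectrum]  # flat 80% suppression

for pt, y_aa, y_pp in zip(pt_bins, aa_spectrum, pp_spectrum):
    print(f"pT = {pt:5.1f} GeV: R_AA = {r_aa(y_aa, y_pp, n_coll):.2f}")
```

With the flat suppression assumed here, every bin gives R_AA = 0.2; measured R_AA instead varies with pT and impact parameter, which is what the fits in the record constrain.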

  1. Event-by-event jet quenching

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, R. [Cyclotron Institute and Physics Department, Texas A and M University, College Station, TX 77843 (United States); Fries, R.J., E-mail: rjfries@comp.tamu.ed [Cyclotron Institute and Physics Department, Texas A and M University, College Station, TX 77843 (United States); RIKEN/BNL Research Center, Brookhaven National Laboratory, Upton, NY 11973 (United States); Ramirez, E. [Physics Department, University of Texas El Paso, El Paso, TX 79968 (United States)

    2010-09-27

    High momentum jets and hadrons can be used as probes for the quark gluon plasma (QGP) formed in nuclear collisions at high energies. We investigate the influence of fluctuations in the fireball on jet quenching observables by comparing propagation of light quarks and gluons through averaged, smooth QGP fireballs with event-by-event jet quenching using realistic inhomogeneous fireballs. We find that the transverse momentum and impact parameter dependence of the nuclear modification factor R_AA can be fit well in an event-by-event quenching scenario within experimental errors. However the transport coefficient q̂ extracted from fits to the measured nuclear modification factor R_AA in averaged fireballs underestimates the value from event-by-event calculations by up to 50%. On the other hand, after adjusting q̂ to fit R_AA in the event-by-event analysis we find residual deviations in the azimuthal asymmetry v_2 and in two-particle correlations, that provide a possible faint signature for a spatial tomography of the fireball. We discuss a correlation function that is a measure for spatial inhomogeneities in a collision and can be constrained from data.

  2. Investigation and analysis of hydrogen ignition and explosion events in foreign nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okuda, Yasunori [Institute of Nuclear Safety System, Inc., Mihama, Fukui (Japan)

    2002-09-01

    Reports about hydrogen ignition and explosion events in foreign nuclear power plants from 1980 to 2001 were investigated, and 31 events were identified. Analysis showed that they were categorized in (1) outer leakage ignition events and (2) inner accumulation ignition events. The dominant event for PWR (pressurized water reactor) was outer leakage ignition in the main generator, and in BWR (boiling water reactor) it was inner accumulation ignition in the off-gas system. The outer leakage ignition was a result of work process failure with the ignition source, operator error, or main generator hydrogen leakage. The inner accumulation ignition events were caused by equipment failure or insufficient monitoring. With careful preventive measures, the factors leading to these events could be eliminated. (author)

  3. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Directory of Open Access Journals (Sweden)

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step in transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used to identify the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the mentioned event was reported, the details of the incident were elaborated through reviewing records and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems, and to discuss their underlying causes, safeguards, and preventive measures. Root cause analysis of the mentioned event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also pointed out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and introduce preventive measures to find areas where resources need to be allocated to improve patient safety.

  4. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, de H.C.W.; Zwaan, L.; Timmermans, D.R.M.; Groenewegen, P.P.; Wagner, C.

    2009-01-01

    BACKGROUND: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  5. Analysis of unintended events in hospitals : inter-rater reliability of constructing causal trees and classifying root causes

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Groenewegen, P.P.; Timmermans, D.

    2009-01-01

    Background. Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  6. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes.

    NARCIS (Netherlands)

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Timmermans, D.; Groenewegen, P.; Wagner, C.

    2009-01-01

    Background: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the
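The PRISMA causal-tree idea in the records above lends itself to a small data structure: the top node is the unintended event, children are antecedent causes, and the leaves are root causes carrying a classification label. The example event, node descriptions, and the Eindhoven-style category labels below are invented illustrations of that structure, not material from these studies:

```python
# Sketch of a PRISMA-style causal tree. The event and categories are
# invented; leaf nodes carry an Eindhoven-style root-cause label
# (technical / organizational / human).
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    description: str
    category: Optional[str] = None        # set on leaves (root causes) only
    causes: List["Node"] = field(default_factory=list)

def root_causes(node):
    """Collect the classified leaves of the causal tree."""
    if not node.causes:
        return [(node.description, node.category)]
    found = []
    for child in node.causes:
        found.extend(root_causes(child))
    return found

tree = Node("wrong medication administered", causes=[
    Node("labels of two drugs nearly identical", category="technical"),
    Node("double-check protocol skipped", causes=[
        Node("ward understaffed that shift", category="organizational"),
        Node("nurse relied on memory", category="human"),
    ]),
])

for desc, cat in root_causes(tree):
    print(f"{cat:>14}: {desc}")
```

Inter-rater reliability, the subject of these records, would then be measured by having several coders build such trees and assign such labels for the same event reports and comparing the results.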

  7. Economic Multipliers and Mega-Event Analysis

    OpenAIRE

    Victor Matheson

    2004-01-01

    Critics of economic impact studies that purport to show that mega-events such as the Olympics bring large benefits to the communities “lucky” enough to host them frequently cite the use of inappropriate multipliers as a primary reason why these impact studies overstate the true economic gains to the hosts of these events. This brief paper shows in a numerical example how mega-events may lead to inflated multipliers and exaggerated claims of economic benefits.
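The mechanics of the inflated-multiplier argument can be sketched with the simple Keynesian spending multiplier k = 1 / (1 − c), where c is the share of each dollar re-spent locally. The numbers below are illustrative, not Matheson's: mega-event spending typically leaks out of the host economy (visitors, outside contractors), so the appropriate c, and hence the multiplier, is smaller than in ordinary local commerce.

```python
# Toy illustration (not the paper's actual numbers): the simple
# spending multiplier k = 1 / (1 - c), where c is the locally
# re-spent share of each dollar. Leakage shrinks c for mega-events.

def multiplier(local_respend_share):
    return 1.0 / (1.0 - local_respend_share)

ordinary = multiplier(0.5)    # 50 cents of each dollar re-spent locally
mega_event = multiplier(0.2)  # heavy leakage: only 20 cents stays local

print(f"ordinary local multiplier:   {ordinary:.2f}")   # 2.00
print(f"mega-event local multiplier: {mega_event:.2f}") # 1.25
```

Applying the ordinary multiplier of 2.0 to mega-event spending, when 1.25 is appropriate, overstates the economic impact by 60% in this toy case.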

  8. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    Science.gov (United States)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.
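The composite-analysis step described above reduces to averaging a meteorological field over the extreme-precipitation days and differencing against the all-days mean. A minimal sketch on synthetic data (the field, threshold choice, and grid size are illustrative assumptions, not from the study):

```python
import numpy as np

# Sketch of a composite analysis: average a daily field (think 500 hPa
# height anomalies) over extreme-precipitation days and compare to the
# all-days mean. All data here are synthetic.
rng = np.random.default_rng(0)
n_days, ny, nx = 365, 10, 12
field = rng.normal(0.0, 1.0, size=(n_days, ny, nx))   # daily anomaly maps
precip = rng.gamma(2.0, 5.0, size=n_days)             # daily precipitation

threshold = np.percentile(precip, 95)    # define "extreme" days (top 5%)
extreme_days = precip >= threshold

composite = field[extreme_days].mean(axis=0)   # mean pattern on extreme days
climatology = field.mean(axis=0)
anomaly = composite - climatology              # composite anomaly pattern

print("extreme days:", int(extreme_days.sum()))
print("max composite anomaly:", float(anomaly.max()))
```

The self-organizing maps step then clusters the individual extreme-day maps to expose the within-composite variability the abstract mentions.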

  9. Vulnerability analysis of a PWR to an external event

    International Nuclear Information System (INIS)

    Aruety, S.; Ilberg, D.; Hertz, Y.

    1980-01-01

    The vulnerability of a nuclear power plant (NPP) to external events is affected by several factors such as: the degree of redundancy of the reactor systems, subsystems and components; the separation of systems provided in the general layout; the extent of the vulnerable area, i.e., the area which upon being affected by an external event will result in system failure; and the time required to repair or replace the systems, when allowed. The present study offers a methodology, using Probabilistic Safety Analysis, to evaluate the relative importance of the above parameters in reducing the vulnerability of reactor safety systems. Several safety systems of typical PWRs are analyzed as examples. It was found that the degree of redundancy and physical separation of the systems has the most prominent effect on the vulnerability of the NPP.
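The redundancy effect highlighted in this record can be quantified in the idealized case: with n redundant trains that fail independently (the benefit physical separation is meant to buy), the system fails only if all n trains do. A toy sketch with an invented per-train failure probability:

```python
# Idealized redundancy effect: n independent redundant trains, each
# failing with probability p when the external event strikes. The
# per-train probability 0.1 is invented for illustration.

def system_failure_prob(p_train, n_trains):
    return p_train ** n_trains

for n in (1, 2, 3):
    print(f"{n} train(s): P(system failure) = {system_failure_prob(0.1, n):.0e}")
```

Common-cause effects of the external event (e.g., one fire reaching several trains) break the independence assumption, which is why separation matters as much as the train count.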

  10. The limiting events transient analysis by RETRAN02 and VIPRE01 for an ABWR

    International Nuclear Information System (INIS)

    Tsai Chiungwen; Shih Chunkuan; Wang Jongrong; Lin Haotzu; Jin Jiunan; Cheng Suchin

    2009-01-01

    This paper describes the transient analysis of the generator load rejection (LR) and One Turbine Control Valve Closure (OTCVC) events for the Lungmen nuclear power plant (LMNPP). According to the Critical Power Ratio (CPR) criterion, the Preliminary Safety Analysis Report (PSAR) concluded that LR and OTCVC are the first and second limiting events, respectively. In addition, the fuel type has now been changed from GE12 to GE14, so it is necessary to re-analyze these two events for safety considerations. In this study, to quantify the impact on the reactor, the difference between the initial critical power ratio (ICPR) and the minimum critical power ratio (MCPR), i.e., ΔCPR, is calculated. The ΔCPRs of the LR and OTCVC events are calculated with the combination of the RETRAN02 and VIPRE01 codes. In the RETRAN02 calculation, a thermal-hydraulic model was prepared for the transient analysis. The data, including upper plenum pressure, core inlet flow, normalized power, and axial power shapes during the transient, are then supplied to VIPRE01 for the ΔCPR calculation. In the VIPRE01 calculation, a hot channel model was built to simulate the hottest fuel bundle. Based on the thermal-hydraulic data from RETRAN02, the ΔCPRs are calculated by the VIPRE01 hot channel model. Additionally, different TCV control modes are considered to study the influence of different TCV closure curves on the transient behavior. Meanwhile, sensitivity studies including different initial system pressures and different initial power/flow conditions are also performed. Based on this analysis, the maximum ΔCPRs for LR and OTCVC are 0.162 and 0.191, respectively. According to the CPR criterion, the result shows that the impact of the OTCVC event is larger than that of the LR event. (author)
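The figure of merit used in this record is simply ΔCPR = ICPR − MCPR, the drop in critical power ratio over the transient. The ΔCPR maxima below are the values reported in the abstract; the ICPR of 1.30 is an illustrative assumption, not from the paper:

```python
# ΔCPR = ICPR - MCPR. The maximum ΔCPR values come from the abstract;
# the initial CPR (ICPR) of 1.30 is an illustrative assumption.
icpr = 1.30
max_delta_cpr = {"LR": 0.162, "OTCVC": 0.191}

for event, dcpr in max_delta_cpr.items():
    mcpr = icpr - dcpr   # minimum CPR reached during the transient
    print(f"{event:6s}: dCPR = {dcpr:.3f}, MCPR = {mcpr:.3f}")

limiting = max(max_delta_cpr, key=max_delta_cpr.get)
print("more limiting event:", limiting)  # OTCVC
```

A larger ΔCPR pushes the hot bundle closer to the boiling-transition limit, which is why OTCVC comes out as the more limiting event under the CPR criterion.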

  11. Analysis of syntactic and semantic features for fine-grained event-spatial understanding in outbreak news reports

    Directory of Open Access Journals (Sweden)

    Chanlekha Hutchatai

    2010-03-01

    Full Text Available Abstract Background Previous studies have suggested that epidemiological reasoning needs a fine-grained modelling of events, especially their spatial and temporal attributes. While the temporal analysis of events has been intensively studied, far less attention has been paid to their spatial analysis. This article aims at filling the gap concerning automatic event-spatial attribute analysis in order to support health surveillance and epidemiological reasoning. Results In this work, we propose a methodology that provides a detailed analysis of each event reported in news articles to recover the most specific locations where it occurs. Various features for recognizing spatial attributes of the events were studied and incorporated into the models, which were trained by several machine learning techniques. The best performance for spatial attribute recognition is very promising: 85.9% F-score (86.75% precision/85.1% recall). Conclusions We extended our work on event-spatial attribute recognition by focusing on machine learning techniques, namely CRF, SVM, and decision tree. Our approach avoided the costly development of an external knowledge base by employing feature sources that can be acquired locally from the analyzed document. The results showed that the CRF model performed the best. Our study indicated that the nearest location and previous event location are the most important features for the CRF and SVM models, while the location extracted from the verb's subject is the most important for the decision tree model.
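The headline figure in this record can be checked directly from the reported precision and recall, since F1 = 2PR / (P + R):

```python
# Check the record's reported F-score from its precision and recall:
# F1 = 2PR / (P + R).

def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

score = f1(86.75, 85.1)
print(f"F1 = {score:.1f}%")  # 85.9, matching the reported value
```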

  12. Emerging risk – Conceptual definition and a relation to black swan type of events

    International Nuclear Information System (INIS)

    Flage, R.; Aven, T.

    2015-01-01

    The concept of emerging risk has gained increasing attention in recent years. The term has an intuitive appeal and meaning but a consistent and agreed definition is missing. We perform an in-depth analysis of this concept, in particular its relation to black swan type of events, and show that these can be considered meaningful and complementary concepts by relating emerging risk to known unknowns and black swans to unknown knowns, unknown unknowns and a subset of known knowns. The former is consistent with saying that we face emerging risk related to an activity when the background knowledge is weak but contains indications/justified beliefs that a new type of event (new in the context of that activity) could occur in the future and potentially have severe consequences to something humans value. The weak background knowledge among other things results in difficulty specifying consequences and possibly also in fully specifying the event itself; i.e. in difficulty specifying scenarios. Here knowledge becomes the key concept for both emerging risk and black swan type of events, allowing for taking into consideration time dynamics since knowledge develops over time. Some implications of our findings in terms of risk assessment and risk management are pointed out. - Highlights: • We perform an in-depth analysis of the concept of emerging risk. • Emerging risk and black swan type of events are shown to be complementary concepts. • We propose a definition of emerging risk where knowledge becomes the key term. • Some implications for risk assessment and risk management are pointed out.
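The mapping proposed in this record (emerging risk ↔ known unknowns; black swans ↔ unknown knowns, unknown unknowns, and a subset of known knowns) is essentially a lookup table, which a few lines make concrete. The function below is our illustrative reading of the abstract, not the authors' formalization:

```python
# Illustrative encoding of the abstract's mapping from knowledge
# states to risk concepts. This is a reading of the abstract, not
# the authors' own formalization.

def classify(knowledge_state):
    """knowledge_state: 'known known', 'known unknown',
    'unknown known', or 'unknown unknown'."""
    if knowledge_state == "known unknown":
        return "emerging risk"
    if knowledge_state in ("unknown known", "unknown unknown"):
        return "black swan"
    # 'known known': ordinary risk; only a subset qualifies as black swan
    return "conventional risk (black swan only in a subset of cases)"

print(classify("known unknown"))    # emerging risk
print(classify("unknown unknown"))  # black swan
```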

  13. DISRUPTIVE EVENT BIOSPHERE DOSE CONVERSION FACTOR ANALYSIS

    International Nuclear Information System (INIS)

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The Biosphere Model Report (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis'' (Figure 1-1). The objective of this analysis was to develop the BDCFs for the volcanic

  14. Shock events and flood risk management: a media analysis of the institutional long-term effects of flood events in the Netherlands and Poland

    Directory of Open Access Journals (Sweden)

    Maria Kaufmann

    2016-12-01

    Full Text Available Flood events that have proven to create shock waves in society, which we will call shock events, can open windows of opportunity that allow different actor groups to introduce new ideas. Shock events, however, can also strengthen the status quo. We will take flood events as our object of study. Whereas others focus mainly on the immediate impact and disaster management, we will focus on the long-term impact on and resilience of flood risk governance arrangements. Over the last 25 years, both the Netherlands and Poland have suffered several flood-related events. These triggered strategic and institutional changes, but to different degrees. In a comparative analysis these endogenous processes, i.e., the importance of framing of the flood event, its exploitation by different actor groups, and the extent to which arrangements are actually changing, are examined. In line with previous research, our analysis revealed that shock events test the capacity to resist and bounce back and provide opportunities for adapting and learning. They "open up" institutional arrangements and make them more susceptible to change, increasing the opportunity for adaptation. In this way they can facilitate a shift toward different degrees of resilience, i.e., by adjusting the current strategic approach or by moving toward another strategic approach. The direction of change is influenced by the actors and the frames they introduce, and their ability to increase the resonance of the frame. The persistence of change seems to be influenced by the evolution of the initial management approach, the availability of resources, or the willingness to allocate resources.

  15. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for Analysis of External Events in the PSA of the Jose Cabrera NPP: In the Fire Analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained, applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The Flood Analysis methodology is based on the construction of event trees to represent flood propagation dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood area level, the CAINZO-EA code, adapted to specific plant characteristics, has been developed. In both the Fire and Flood Analyses a quantification methodology has been adopted, which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the Internal Events models, the gates, basic events or headers to which safe failure (probability 1) due to damage is assigned. (Author)

  16. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    Directory of Open Access Journals (Sweden)

    Matthew Bucknor

    2017-03-01

    Full Text Available Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  17. Advanced reactor passive system reliability demonstration analysis for an external event

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)

    2017-03-15

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  18. Advanced reactor passive system reliability demonstration analysis for an external event

    International Nuclear Information System (INIS)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2017-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
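The simulation-based side of the approach described in these records can be caricatured with a Monte Carlo sketch: sample the uncertain boundary condition (here, a flood level), run a mechanistic survival model for each sample, and count survivals. The fragility model, the flood-level distribution, and all numbers below are invented for illustration, not taken from the study:

```python
import random

# Illustrative Monte Carlo reliability estimate for a passive system
# under an uncertain boundary condition. The "mechanistic" model and
# all numbers are invented, not from the paper.
random.seed(1)

def rccs_survives(flood_level_m):
    # Toy deterministic model: cooling function assumed lost above 4 m
    # of standing water.
    return flood_level_m < 4.0

n_trials = 100_000
failures = sum(
    not rccs_survives(random.gauss(1.0, 1.0))  # uncertain flood level
    for _ in range(n_trials)
)
reliability = 1.0 - failures / n_trials
print(f"estimated reliability: {reliability:.4f}")
```

The appeal of the hybrid approach in the records is that estimates like this can be folded back into a conventional PRA event tree without restructuring it.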

  19. An Initiating-Event Analysis for PSA of Hanul Units 3 and 4: Results and Insights

    International Nuclear Information System (INIS)

    Kim, Dong-San; Park, Jin Hee

    2015-01-01

    As a part of the PSA, an initiating-event (IE) analysis was newly performed by considering the current state of knowledge and the requirements of the ASME/ANS probabilistic risk assessment (PRA) standard related to IE analysis. This paper describes the methods, results, and some insights of the IE analysis for the PSA of Hanul units 3 and 4. In comparison with the previous IE analysis, this study performed a more systematic and detailed analysis to identify potential initiating events, and calculated the IE frequencies by using state-of-the-art methods and the latest data. As a result, several IE frequencies are quite different from the previous ones, which can change the major accident sequences obtained from the quantification of the PSA model.
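A common way to estimate initiating-event frequencies from sparse operating experience, and a plausible (but assumed, the record does not specify) reading of "state-of-the-art methods", is a Bayesian update with a Jeffreys prior: with n observed events in T reactor-years, the posterior mean frequency is (n + 0.5) / T.

```python
# Jeffreys-prior estimate of an initiating-event frequency:
# posterior mean = (n + 0.5) / T for n events in T reactor-years.
# The example numbers are illustrative, not from the paper.

def jeffreys_frequency(n_events, reactor_years):
    return (n_events + 0.5) / reactor_years

# Example: zero observed events of a given type in 25 reactor-years
print(f"{jeffreys_frequency(0, 25.0):.3f} per reactor-year")  # 0.020
```

Unlike the naive estimate n/T, this does not return zero for events never yet observed, which matters precisely for the rare initiators that drive PSA results.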

  20. User-Centric Key Entropy: Study of Biometric Key Derivation Subject to Spoofing Attacks

    Directory of Open Access Journals (Sweden)

    Lavinia Mihaela Dinca

    2017-02-01

    Full Text Available Biometric data can be used as input for PKI key pair generation. The concept of not saving the private key is very appealing, but the implementation of such a system should not be rushed because it might prove less secure than the current PKI infrastructure. A single biometric characteristic can be easily spoofed, so it was believed that multi-modal biometrics would offer more security, because spoofing two or more biometrics would be very hard. This notion of increased security of multi-modal biometric systems was disproved for authentication and matching: studies showed that multi-modal biometric systems are not only no more secure, but introduce additional vulnerabilities. This paper is a study of the implications of spoofing biometric data for retrieving the derived key. We demonstrate that spoofed biometrics can yield the same key, which in turn allows an attacker to obtain the private key. A practical implementation is proposed using fingerprint and iris as biometrics and a fuzzy extractor for biometric key extraction. Our experiments show what happens when the biometric data is spoofed for both uni-modal and multi-modal systems. In the multi-modal case, tests were performed when spoofing one biometric or both. We provide a detailed analysis of every scenario with regard to successful tests and overall key entropy. Our paper defines a biometric PKI scenario and an in-depth security analysis for it. The analysis can be viewed as a blueprint for implementations of future similar systems, because it highlights the main security vulnerabilities of bioPKI. The analysis is not constrained to the biometric part of the system, but covers CA security, sensor security, communication interception, RSA encryption vulnerabilities regarding key entropy, and much more.
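The spoofing risk is easy to see in a toy fuzzy extractor: any template within the error-correction radius of the enrolled one, genuine or spoofed, reproduces the same key. This fuzzy-commitment sketch with a repetition code is illustrative only, and is far weaker than the constructions used in practice:

```python
def fuzzy_commit(key_bits, template_bits, r=5):
    """Fuzzy-commitment sketch: encode each key bit with an r-bit
    repetition code and XOR with the biometric template (helper data)."""
    code = [b for b in key_bits for _ in range(r)]
    return [c ^ t for c, t in zip(code, template_bits)]

def fuzzy_recover(helper, noisy_template, r=5):
    """Recover the key from helper data and a template that may differ
    from the enrolled one in a few bits (majority decoding per block)."""
    code = [h ^ t for h, t in zip(helper, noisy_template)]
    return [int(sum(code[i * r:(i + 1) * r]) > r // 2)
            for i in range(len(code) // r)]
```

Any template within two bit-flips per block, including one reconstructed from a spoofed fingerprint, yields the identical key, which is the attack surface the paper analyzes.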

  1. Erectile dysfunction and cardiovascular events in diabetic men: a meta-analysis of observational studies.

    Directory of Open Access Journals (Sweden)

    Tomohide Yamada

    Full Text Available BACKGROUND: Several studies have shown that erectile dysfunction (ED) influences the risk of cardiovascular events (CV events). However, a meta-analysis of the overall risk of CV events associated with ED in patients with diabetes has not been performed. METHODOLOGY/PRINCIPAL FINDINGS: We searched MEDLINE and the Cochrane Library for pertinent articles (including references) published between 1951 and April 22, 2012. English-language reports of original observational cohort studies and cross-sectional studies were included. Pooled effect estimates were obtained by random-effects meta-analysis. A total of 3,791 CV events were reported in 3 cohort studies and 9 cross-sectional studies (covering 22,586 subjects). Across the cohort studies, the overall odds ratio (OR) of diabetic men with ED versus those without ED was 1.74 (95% confidence interval [CI]: 1.34-2.27; P<0.05). Moreover, meta-regression analysis found no relationship between the method used to assess ED (questionnaire or interview), mean age, mean hemoglobin A(1c), mean body mass index, or mean duration of diabetes and the risk of CV events or CHD. In the cross-sectional studies, the OR of diabetic men with ED versus those without ED was 3.39 (95% CI: 2.58-4.44; P<0.001) for CV events (N = 9), 3.43 (95% CI: 2.46-4.77; P<0.001) for CHD (N = 7), and 2.63 (95% CI: 1.41-4.91; P = 0.002) for peripheral vascular disease (N = 5). CONCLUSION/SIGNIFICANCE: ED was associated with an increased risk of CV events in diabetic patients. Prevention and early detection of cardiovascular disease are important in the management of diabetes, especially in view of the rapid increase in its prevalence.
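Random-effects pooling of odds ratios is typically done with the DerSimonian-Laird estimator: inverse-variance weights on the log-OR scale, inflated by a between-study variance term. A minimal sketch (the input data below are made up, not the study's):

```python
import math

def dersimonian_laird(log_ors, variances):
    """Pooled odds ratio under a DerSimonian-Laird random-effects model.
    Inputs are per-study log odds ratios and their sampling variances."""
    w = [1 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
    # Cochran's Q heterogeneity statistic and between-study variance tau^2
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's variance
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
    return math.exp(pooled)  # back-transform to an odds ratio
```

Because the pooled estimate is a weighted mean on the log scale, it always lands between the smallest and largest study OR, as in the 1.74 cohort result above.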

  2. [Analysis on the key factors affecting the inheritance of the acupuncture learning].

    Science.gov (United States)

    Li, Su-yun; Zhang, Li-jian; Gang, Wei-juan; Xu, Wen-bin; Xu, Qing-yan

    2010-06-01

    On the basis of systematically reviewing the developmental history of acupuncture and moxibustion and profoundly understanding its academic connotations, the authors of the present article summarize and analyze the key factors influencing the development of acupuncturology. These key factors are (1) the emergence of "microacupuncture needle regulating-Qi" and the establishment of its corresponding theory system, (2) the large number of practitioners who passed on the learning of acupuncturology from generation to generation, and the abundant medical classics that recorded the valuable academic thoughts and clinical experience of their predecessors, (3) the application of acupuncture charts and manikins, and (4) the modernizing changes of acupuncture learning after the introduction of Western medicine to China. Under the influence of these key factors, acupuncture medicine separated itself from the level of simple experience-based medicine, formed a distinct theory system, and developed into a mature subject.

  3. Keys to Successful EPIQ Business Demonstrator Implementation

    NARCIS (Netherlands)

    Shoikova, Elena; Denishev, Vladislav

    2009-01-01

    Shoikova, E., & Denishev, V. (2009). Keys to Successful EPIQ Business Demonstrator Implementation. Paper presented at the 'Open workshop of TENCompetence - Rethinking Learning and Employment at a Time of Economic Uncertainty-event'. November, 19, 2009, Manchester, United Kingdom: TENCompetence.

  4. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT/Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in near real time, reaction rules are concerned with the invocation of actions in response to events and actionable situations: they state the conditions under which actions must be taken. Over the last decades, various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
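A reaction rule is often written as an event-condition-action (ECA) triple: when an event of a given type arrives and the condition holds, fire the action. A minimal dispatcher sketch (the event schema and names are invented):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ReactionRule:
    """An ECA rule: event type to match, guard condition, action to invoke."""
    event_type: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], str]

class RuleEngine:
    def __init__(self):
        self.rules = []

    def add(self, rule: ReactionRule) -> None:
        self.rules.append(rule)

    def dispatch(self, event: dict):
        """Fire every rule whose type matches and whose condition holds."""
        return [r.action(event) for r in self.rules
                if r.event_type == event["type"] and r.condition(event)]
```

For example, a rule `ReactionRule("temperature", lambda e: e["value"] > 30, lambda e: "cooling on")` stays silent on a 20-degree reading and fires on a 35-degree one; a stream processor would sit in front of this dispatcher, turning raw readings into the composite events the rules react to.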

  5. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) mbH, Schwetnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)

    2014-07-01

    Highlights: • Analysis of operating experience related to emergency diesel generators events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper is aimed at studying the operating experience related to emergency diesel generators (EDGs) events at nuclear power plants collected from the past 20 years. Events related to EDGs failures and/or unavailability as well as all the supporting equipment are in the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the type of failures, attributes that contributed to the failure, failure modes (potential or real), discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper is closely tied to the statistical analysis of the operating experience. For the purpose of this study EDG failure is defined as EDG failure to function on demand (i.e. fail to start, fail to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports were screened. The screening methodology applied for each of the four different databases is presented. Further on, analyses aimed at delineating the causes, root causes, contributing factors and consequences were performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  6. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    International Nuclear Information System (INIS)

    Kančev, Duško; Duchac, Alexander; Zerger, Benoit; Maqua, Michael; Wattrelos, Didier

    2014-01-01

    Highlights: • Analysis of operating experience related to emergency diesel generators events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper is aimed at studying the operating experience related to emergency diesel generators (EDGs) events at nuclear power plants collected from the past 20 years. Events related to EDGs failures and/or unavailability as well as all the supporting equipment are in the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the type of failures, attributes that contributed to the failure, failure modes (potential or real), discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper is closely tied to the statistical analysis of the operating experience. For the purpose of this study EDG failure is defined as EDG failure to function on demand (i.e. fail to start, fail to run) or during testing, or an unavailability of an EDG, except unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports were screened. The screening methodology applied for each of the four different databases is presented. Further on, analyses aimed at delineating the causes, root causes, contributing factors and consequences were performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  7. Technology and economic impacts of mega-sports events: A key issue? Exploratory insights from literature

    Directory of Open Access Journals (Sweden)

    Chanaron Jean Jacques

    2014-01-01

    Full Text Available Mega-sport events such as the Olympic Games or the Football World Cup are always presented as providing the hosting nation and/or city with huge benefits. Supporters of such events cite economic, social and cultural impacts for the region, as well as contributions to scientific and technological progress and innovation. Obviously, they need to politically justify the impressive and growing financial investment required to organize the Olympic Games or a World Cup. The article reviews the fairly abundant academic literature with the objective of defining the various potential impacts and the methods used for their assessment. It concludes that there is no universal and scientifically valid model for evaluating the benefits of mega-sport events, and that organizers should be very cautious when arguing in favor of hosting such events.

  8. Features, Events, and Processes: Disruptive Events

    Energy Technology Data Exchange (ETDEWEB)

    J. King

    2004-03-31

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  9. Features, Events, and Processes: Disruptive Events

    International Nuclear Information System (INIS)

    J. King

    2004-01-01

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  10. Modeling Multi-Event Non-Point Source Pollution in a Data-Scarce Catchment Using ANN and Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2017-06-01

    Full Text Available Event-based runoff–pollutant relationships have been the key for water quality management, but the scarcity of measured data results in poor model performance, especially for multiple rainfall events. In this study, a new framework was proposed for event-based non-point source (NPS) prediction and evaluation. The artificial neural network (ANN) was used to extend the runoff–pollutant relationship from complete data events to other data-scarce events. The interpolation method was then used to solve the problem of tail deviation in the simulated pollutographs. In addition, the entropy method was utilized to train the ANN for comprehensive evaluations. A case study was performed in the Three Gorges Reservoir Region, China. Results showed that the ANN performed well in the NPS simulation, especially for light rainfall events, and the phosphorus predictions were always more accurate than the nitrogen predictions under scarce data conditions. In addition, peak pollutant data scarcity had a significant impact on the model performance. Furthermore, traditional evaluation indicators can lose information during model evaluation, whereas the entropy weighting method provides a more accurate assessment. These results would be valuable for monitoring schemes and the quantitation of event-based NPS pollution, especially in data-poor catchments.
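The entropy weighting idea follows a standard recipe: indicators whose values vary more across samples carry more information and therefore receive larger weights. A generic sketch of the entropy weight method (not the authors' exact implementation; assumes positive indicator values):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method. `matrix[i][j]` is the positive value of
    indicator j for sample i; returns one weight per indicator."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)  # normalizes entropy to [0, 1]
    entropies = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        entropies.append(e)
    # Degree of divergence: low entropy (high dispersion) -> high weight
    d = [1 - e for e in entropies]
    return [di / sum(d) for di in d]
```

A constant indicator has maximal entropy and gets weight zero, which is how the method automatically down-weights evaluation criteria that fail to discriminate between model runs.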

  11. Modeling time-to-event (survival) data using classification tree analysis.

    Science.gov (United States)

    Linden, Ariel; Yarnold, Paul R

    2017-12-01

    Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (i.e., easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.
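Censoring is the feature that separates survival models from ordinary classifiers: a censored subject leaves the risk set without counting as an event. The Kaplan-Meier estimator makes this concrete; a self-contained sketch (not tied to the CTA or Cox software the paper used):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. `events[i]` is 1 for an observed
    event, 0 for a censored follow-up time. Returns (time, S(t)) pairs."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    curve, s = [], 1.0
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = at_t = 0
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]  # censored entries add 0 deaths
            at_t += 1
            i += 1
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= at_t  # both events and censorings leave the risk set
    return curve
```

Note that a censored time shrinks the risk set for later event times but never produces a step in the curve itself; this is exactly the information a naïve classifier trained on "event by end of study: yes/no" throws away.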

  12. Mining key elements for severe convection prediction based on CNN

    Science.gov (United States)

    Liu, Ming; Pan, Ning; Zhang, Changan; Sha, Hongzhou; Zhang, Bolei; Liu, Liang; Zhang, Meng

    2017-04-01

    Severe convective weather is a class of weather disaster accompanied by heavy rainfall, gusty wind, hail, etc. Along with recent developments in remote sensing and numerical modeling, high-volume, long-term observational and modeling data have accumulated that capture massive severe convective events over particular areas and time periods. With such high-volume, high-variety weather data, most existing studies and methods address the dynamical laws, cause analysis, potential rules, and prediction enhancement by utilizing the governing equations of fluid dynamics and thermodynamics. In this study, a key-element mining method is proposed for severe convection prediction based on a convolutional neural network (CNN). It aims to identify the key areas and key elements from huge amounts of historical weather data, including conventional measurements, weather radar and satellite observations, as well as numerical modeling and/or reanalysis data. In this manner, the machine-learning-based method can support human forecasters' decision-making in operational forecasts of severe convective weather by extracting key information from real-time and historical weather big data. The method first utilizes computer vision technology to complete the preprocessing of the meteorological variables. It then utilizes information such as radar maps and expert knowledge to annotate all images automatically. Finally, using the CNN model, it can analyze and evaluate each weather element (e.g., particular variables, patterns, features, etc.), identify the key areas of those critical weather elements, and help forecasters quickly screen out the key elements from huge amounts of observation data under current weather conditions.
Based on the rich weather measurement and model data (up to 10 years) over Fujian province in China, where the severe convective weathers are very active during the summer months, experimental tests are conducted with
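At the core of any CNN-based field analysis is the convolution that scans a gridded meteorological field for local patterns. A minimal valid-mode 2-D convolution (as in CNN practice, implemented as cross-correlation, i.e. without kernel flipping) over a toy field:

```python
def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation, the basic CNN building block.
    `image` and `kernel` are lists of rows of numbers."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out
```

Applied with a [1, -1] gradient kernel to a field with a sharp west-east step (a crude stand-in for a gust front in a radar field), the output lights up exactly at the discontinuity, which is the sense in which stacked learned kernels can localize "key areas" in the data.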

  13. Risk analysis of brachytherapy events

    International Nuclear Information System (INIS)

    Buricova, P.; Zackova, H.; Hobzova, L.; Novotny, J.; Kindlova, A.

    2005-01-01

    To prevent radiological events it is necessary to identify hazardous situations and to analyse the nature of the errors committed. Although a recommendation on the classification and prevention of radiological events (radiological accidents) was prepared in the framework of the Czech Society of Radiation Oncology, Biology and Physics and approved by the Czech regulatory body (SONS) in 1999, only a few reports from brachytherapy practice have been submitted so far. At radiotherapy departments, attention has mainly been paid to the problems of the dominant teletherapy treatments. In the last two decades, however, the use of brachytherapy methods has gradually increased, because the nature of this treatment as well as the capabilities of the operating facilities have changed completely: new radionuclides of high activity have been introduced and sophisticated afterloading systems controlled by computers are used. Consequently, the nature of the errors that can occur in clinical practice has also been changing. To determine the potentially hazardous parts of the procedure, a so-called 'process tree', which follows the flow of the entire treatment process, was created for the most frequent types of applications. Marking the location of errors on the process tree indicates where failures occurred, and the accumulation of marks along branches shows weak points in the process. The analysed data provide useful information for preventing medical events in brachytherapy. The results strengthen the requirements given in the Recommendations of SONS and reveal the need for its amendment, calling especially for systematic registration of events. (authors)

  14. Human reliability analysis of dependent events

    International Nuclear Information System (INIS)

    Swain, A.D.; Guttmann, H.E.

    1977-01-01

    In the human reliability analysis in WASH-1400, the continuous variable of degree of interaction among human events was approximated by selecting four points on this continuum to represent the entire continuum. The four points selected were identified as zero coupling (i.e., zero dependence), complete coupling (i.e., complete dependence), and two intermediate points--loose coupling (a moderate level of dependence) and tight coupling (a high level of dependence). The paper expands the WASH-1400 treatment of common mode failure due to the interaction of human activities. Mathematical expressions for the above four levels of dependence are derived for parallel and series systems. The psychological meaning of each level of dependence is illustrated by examples, with probability tree diagrams to illustrate the use of conditional probabilities resulting from the interaction of human actions in nuclear power plant tasks
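The four dependence levels translate into conditional error probabilities for the second of two coupled human actions. The closed forms below are the ones later standardized in THERP (NUREG/CR-1278) for low, moderate, high, and complete dependence; mapping "loose" to the moderate-dependence formula and "tight" to the high-dependence formula is an assumption made here for illustration:

```python
def conditional_hep(p: float, level: str) -> float:
    """Conditional human error probability for a task, given failure of
    the preceding coupled task. `p` is the unconditional probability."""
    return {
        "zero": p,                    # zero coupling: independence
        "loose": (1 + 6 * p) / 7,     # THERP moderate-dependence formula
        "tight": (1 + p) / 2,         # THERP high-dependence formula
        "complete": 1.0,              # complete coupling
    }[level]
```

For a nominal error probability of 0.01, the conditional probabilities climb from 0.01 (zero) through roughly 0.15 (loose) and 0.5 (tight) to 1.0 (complete), which is why ignoring dependence can understate common-mode human failure by orders of magnitude.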

  15. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    International Nuclear Information System (INIS)

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J.

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses: both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transients without scram (ATWS). The steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment-bypass nature. After being benchmarked with the APETs in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.
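Quantifying an event-tree sequence, SCET or otherwise, amounts to multiplying the initiating-event frequency by the branch probabilities along the path. A generic sketch with invented numbers (not Sequoyah values):

```python
def quantify_event_tree(ie_freq, branches, sequences):
    """Multiply the initiating-event frequency by branch probabilities.
    `branches` maps each top event to its failure probability;
    `sequences` maps a sequence name to a list of (top_event, failed)
    pairs describing the path through the tree."""
    results = {}
    for name, path in sequences.items():
        p = ie_freq
        for top, failed in path:
            p *= branches[top] if failed else (1 - branches[top])
        results[name] = p
    return results
```

For a hypothetical SBO-like tree with an initiating frequency of 0.1/yr, diesel failure probability 1e-2, and battery depletion probability 1e-3, the core-damage path frequency is 0.1 x 1e-2 x 1e-3 = 1e-6/yr; a SCET keeps few enough branches that such paths remain individually legible.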

  16. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    Science.gov (United States)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
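The finite-size penalty such analyses manage is, in its simplest form, a concentration bound on the error rate estimated from a sample. A Hoeffding-style sketch of the fluctuation term (the paper's actual bounds, based on Bernoulli/binomial sampling, are tighter than this):

```python
import math

def error_rate_upper_bound(observed_errors: int, n_sampled: int,
                           epsilon: float = 1e-10) -> float:
    """Upper confidence bound on the true error rate given a finite
    sample, failing with probability at most `epsilon` (Hoeffding)."""
    delta = math.sqrt(math.log(1 / epsilon) / (2 * n_sampled))
    return min(1.0, observed_errors / n_sampled + delta)
```

The bound always exceeds the observed rate and the gap shrinks as 1/sqrt(n), which is why the key rate only approaches its asymptotic value for large blocks and why reducing the number of estimated parameters, as the paper does, directly buys key rate.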

  17. Comparative analysis as a basic research orientation: Key methodological problems

    Directory of Open Access Journals (Sweden)

    N P Narbut

    2015-12-01

    Full Text Available To date, the Sociological Laboratory of the Peoples’ Friendship University of Russia has accumulated vast experience in the field of cross-cultural studies, reflected in publications based on the results of mass surveys conducted in Moscow, Maikop, Beijing, Guangzhou, Prague, Belgrade, and Pristina. However, these publications mainly focus on comparisons of the empirical data rather than on methodological and technical issues; that is why the aim of this article is to identify the key problems of comparative analysis in cross-cultural studies, which become evident only when you conduct an empirical study yourself - from the first step of setting the problem and having it approved by all the sides (countries) involved, to the last step of interpreting and comparing the data obtained. The authors are sure that no sociologist would ever doubt the necessity and importance of comparative analysis in the broadest sense of the word, but at the same time very few are ready to discuss its key methodological challenges, preferring to ignore them completely. We summarize the problems of comparative analysis in sociology as follows: (1) applying research techniques to a sample in another country - both in translating and in adapting them to different social realities and worldviews (in particular, the problematic status of standardization and of the qualitative approach); (2) choosing the "right" respondents to question and the relevant cases (cultures) to study; (3) designing the research scheme, i.e. justifying the sequence of steps (what should go first - methodology or techniques); (4) accepting, for cross-cultural work, procedures that are correct within one country (whether or not that is an appropriate choice).

  18. Neural network approach in multichannel auditory event-related potential analysis.

    Science.gov (United States)

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.
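As a stand-in for the ANN classifier, a minimal logistic unit trained by gradient descent illustrates the rule-extraction idea on a one-dimensional synthetic feature (think of it as a normalized P300 amplitude; the data and scale are invented, not the study's):

```python
import math

def train_logistic(xs, ys, lr=0.5, epochs=500):
    """Single logistic unit trained by stochastic gradient descent;
    a toy stand-in for the ANN classifier on ERP features."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))  # sigmoid activation
            g = p - y                    # gradient of the log loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    """Decision rule: class 1 iff the weighted sum crosses zero."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

The learned threshold on the feature is exactly the kind of explicit classification rule the paper's methodology extracts from its trained multichannel networks; a real replication would use all channels and a hidden layer.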

  19. Analysis of transverse momentum and event shape in νN scattering

    International Nuclear Information System (INIS)

    Bosetti, P.C.; Graessler, H.; Lanske, D.; Schulte, R.; Schultze, K.; Simopoulou, E.; Vayaki, A.; Barnham, K.W.J.; Hamisi, F.; Miller, D.B.; Mobayyen, M.M.; Wainstein, S.; Aderholz, M.; Hantke, D.; Hoffmann, E.; Katz, U.F.; Kern, J.; Schmitz, N.; Wittek, W.; Albajar, C.; Batley, J.R.; Myatt, G.; Perkins, D.H.; Radojicic, D.; Renton, P.; Saitta, S.; Bullock, F.W.; Burke, S.

    1990-01-01

    The transverse momentum distributions of hadrons produced in neutrino-nucleon charged current interactions and their dependence on W are analysed in detail. It is found that the components of the transverse momentum in the event plane and normal to it increase with W at about the same rate throughout the available W range. A comparison with e+e- data is made. Studies of the energy flow and angular distributions in the events classified as planar do not show clear evidence for high-energy, wide-angle gluon radiation, in contrast to the conclusion of a previous analysis of similar neutrino data. (orig.)

  20. A Market Analysis of Publications, Trade Conferences, and Key Events for Fleet Readiness Center Southwest

    Science.gov (United States)

    2007-12-01

    MBA Professional Report, Naval Postgraduate School, Monterey, California, December 2007. Cited references include Kotler, Philip and Kevin Lane Keller, Marketing Management (Upper Saddle River, NJ) and Win and Keep Big Customers (Austin: Bard Press, 2005).

  1. Transcriptome and metabolome of synthetic Solanum autotetraploids reveal key genomic stress events following polyploidization.

    Science.gov (United States)

    Fasano, Carlo; Diretto, Gianfranco; Aversano, Riccardo; D'Agostino, Nunzio; Di Matteo, Antonio; Frusciante, Luigi; Giuliano, Giovanni; Carputo, Domenico

    2016-06-01

    Polyploids are generally classified as autopolyploids, derived from a single species, and allopolyploids, arising from interspecific hybridization. The former represent ideal materials with which to study the consequences of genome doubling and ascertain whether there are molecular and functional rules operating following polyploidization events. To investigate whether the effects of autopolyploidization are common to different species, or if species-specific or stochastic events are prevalent, we performed a comprehensive transcriptomic and metabolomic characterization of diploids and autotetraploids of Solanum commersonii and Solanum bulbocastanum. Autopolyploidization remodelled the transcriptome and the metabolome of both species. In S. commersonii, differentially expressed genes (DEGs) were highly enriched in pericentromeric regions. Most changes were stochastic, suggesting a strong genotypic response. However, a set of robustly regulated transcripts and metabolites was also detected, including purine bases and nucleosides, which are likely to underlie a common response to polyploidization. We hypothesize that autopolyploidization results in nucleotide pool imbalance, which in turn triggers a genomic shock responsible for the stochastic events observed. The more extensive genomic stress and the higher number of stochastic events observed in S. commersonii with respect to S. bulbocastanum could be the result of the higher nucleoside depletion observed in this species. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.

  2. Incident sequence analysis; event trees, methods and graphical symbols

    International Nuclear Information System (INIS)

    1980-11-01

    When analyzing incident sequences, unwanted events resulting from a certain cause are identified. Graphical symbols and explanations of graphical representations are presented. The method applies to the analysis of incident sequences in all types of facilities. By means of the incident sequence diagram, incident sequences, i.e. the logical and chronological course of repercussions initiated by the failure of a component or by an operating error, can be presented and analyzed simply and clearly.

  3. Analysis of external flooding events occurred in foreign nuclear power plant sites

    International Nuclear Information System (INIS)

    Li Dan; Cai Hankun; Xiao Zhi; An Hongzhen; Mao Huan

    2013-01-01

    This paper screens and studies 17 external flooding events that occurred at foreign NPP sites and analyses the characteristics of external flooding events based on the source of the flooding; the impact on buildings, systems, and equipment; and the threat to nuclear safety. Furthermore, based on the experiences and lessons learned from the Fukushima nuclear accident relating to external flooding and the countermeasures carried out worldwide, some suggestions are proposed to improve the external flooding response capacity of Chinese NPPs. (authors)

  4. RETRIEVAL EVENTS EVALUATION

    International Nuclear Information System (INIS)

    Wilson, T.

    1999-01-01

    The purpose of this analysis is to evaluate impacts to the retrieval concept presented in the design analysis ''Retrieval Equipment and Strategy'' (Reference 6) from abnormal events based on Design Basis Events (DBE) and Beyond Design Basis Events (BDBE) as defined in two recent analyses: (1) DBE/Scenario Analysis for Preclosure Repository Subsurface Facilities (Reference 4); and (2) Preliminary Preclosure Design Basis Event Calculations for the Monitored Geologic Repository (Reference 5). The objective of this task is to determine what impacts the DBEs and BDBEs have on the equipment developed for retrieval. The analysis lists potential impacts and recommends changes to be analyzed in subsequent design analyses for the developed equipment, or recommends where additional equipment may be needed, to allow retrieval to be performed in all DBE or BDBE situations. This analysis supports License Application design and therefore complies with the Systems Description Document input criteria comparison as presented in Section 7, Conclusions. In addition, the analysis discusses the impacts associated with not using concrete inverts in the emplacement drifts; the ''Retrieval Equipment and Strategy'' analysis was based on a concrete invert configuration in the emplacement drift. The scope of the analysis, as presented in ''Development Plan for Retrieval Events Evaluation'' (Reference 3), includes evaluation and criteria for the following: impacts to retrieval from the emplacement drifts based on DBE/BDBEs, and changes to the invert configuration, for the preclosure period; and impacts to retrieval from the main drifts based on DBE/BDBEs for the preclosure period

  5. Features, Events, and Processes: Disruptive Events

    International Nuclear Information System (INIS)

    P. Sanchez

    2004-01-01

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the disruptive events features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for license application (TSPA-LA). A screening decision, either ''Included'' or ''Excluded,'' is given for each FEP, along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), and (f) [DIRS 156605]. The FEPs addressed in this report deal with both seismic and igneous disruptive events, such as fault displacements through the repository and an igneous intrusion into the repository. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). Previous versions of this report were developed to support the total system performance assessments (TSPA) for various prior repository designs. This revision addresses the repository design for the license application (LA)

  6. Features, Events, and Processes: Disruptive Events

    Energy Technology Data Exchange (ETDEWEB)

    P. Sanchez

    2004-11-08

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the disruptive events features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for license application (TSPA-LA). A screening decision, either ''Included'' or ''Excluded,'' is given for each FEP, along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.114 (d), (e), and (f) [DIRS 156605]. The FEPs addressed in this report deal with both seismic and igneous disruptive events, such as fault displacements through the repository and an igneous intrusion into the repository. For included FEPs, this analysis summarizes the implementation of the FEP in TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from TSPA-LA (i.e., why the FEP is excluded). Previous versions of this report were developed to support the total system performance assessments (TSPA) for various prior repository designs. This revision addresses the repository design for the license application (LA).

  7. Review of the severe accident risk reduction program (SARRP) containment event trees

    International Nuclear Information System (INIS)

    1986-05-01

    As part of the Severe Accident Risk Reduction Program, researchers at Sandia National Laboratories have constructed a group of containment event trees to be used in the analysis of key accident sequences for light water reactors (LWR) during postulated severe accidents. The ultimate goal of the program is to provide to the NRC staff a current assessment of the risk from severe reactor accidents for a group of five light water reactors. This review specifically focuses on the development and construction of the containment event trees and the results for containment failure probability, modes, and timing. The report first gives the background on the program, the review criteria, and a summary of the observations, findings, and recommendations. Second, the individual reviews of each committee member on the event trees are presented. Finally, a review is provided of the computer model used to construct and evaluate the event trees

  8. A multiprocessor system for the analysis of pictures of nuclear events

    CERN Document Server

    Bacilieri, P; Matteuzzi, P; Sini, G P; Zanotti, U

    1979-01-01

    The pictures of nuclear events obtained from the bubble chambers such as Gargamelle and BEBC at CERN and others from Serpukhov are geometrically processed at CNAF (Centro Nazionale Analysis Photogrammi) in Bologna. The analysis system includes an Erasme table and a CRT flying spot digitizer. The difficulties connected with the pictures of the four stereoscopic views of the bubble chambers are overcome by the choice of a strong interactive system. (0 refs).

  9. Characterization Of Dissolved Organic Mattter In The Florida Keys Ecosystem

    Science.gov (United States)

    Adams, D. G.; Shank, G. C.

    2009-12-01

    Over the past few decades, Scleractinian coral populations in the Florida Keys have increasingly experienced mortality due to bleaching events as well as microbial mediated illnesses such as black band and white band disease. Such pathologies seem to be most correlated with elevated sea surface temperatures, increased UV exposures, and shifts in the microbial community living on the coral itself. Recent studies indicate that corals’ exposure to UV in the Florida Keys is primarily controlled by the concentration of CDOM (Chromophoric Dissolved Organic Matter) in the water column. Further, microbial community alterations may be linked to changes in concentration and chemical composition of the larger DOM (Dissolved Organic Matter) pool. Our research characterized the spatial and temporal properties of DOM in Florida Bay and along the Keys ecosystems using DOC analyses, in-situ water column optical measurements, and spectral analyses including absorbance and fluorescence measurements. We analyzed DOM characteristics along transects running from the mouth of the Shark River at the southwest base of the Everglades, through Florida Bay, and along near-shore Keys coastal waters. Two 12 hour time-series samplings were also performed at the Seven-Mile Bridge, the primary Florida Bay discharge channel to the lower Keys region. Photo-bleaching experiments showed that the chemical characteristics of the DOM pool are altered by exposure to solar radiation. Results also show that DOC (~0.8-5.8 mg C/L) and CDOM (~0.5-16.5 absorbance coefficient at 305nm) concentrations exhibit seasonal fluctuations in our study region. EEM analyses suggest seasonal transitions between primarily marine (summer) and terrestrial (winter) sources along the Keys. We are currently combining EEM-PARAFAC analysis with in-situ optical measurements to model changes in the spectral properties of DOM in the water column. Additionally, we are using stable δ13C isotopic analysis to further characterize DOM

  10. Modeling the recurrent failure to thrive in less than two-year children: recurrent events survival analysis.

    Science.gov (United States)

    Saki Malehi, Amal; Hajizadeh, Ebrahim; Ahmadi, Kambiz; Kholdi, Nahid

    2014-01-01

    This study aims to evaluate the failure to thrive (FTT) recurrent event over time. This longitudinal study was conducted from February 2007 to July 2009. The primary outcome was growth failure. The analysis was done using 1283 children who had experienced FTT several times, based on recurrent events analysis. Fifty-nine percent of the children had experienced FTT at least once and 5.3% of them had experienced it up to four times. The Prentice-Williams-Peterson (PWP) model revealed significant relationships between diarrhea (HR=1.26), respiratory infections (HR=1.25), urinary tract infections (HR=1.51), discontinuation of breast-feeding (HR=1.96), teething (HR=1.18), initiation age of complementary feeding (HR=1.11) and the hazard rate of the first FTT event. The recurrent nature of FTT is a central issue; taking it into account increases the accuracy of the analysis of the FTT event process and can help identify different risk factors for each FTT recurrence.
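The PWP model cited above stratifies on event number and analyzes the gap times between successive events. A minimal sketch of laying out recurrent-event data in that gap-time counting-process format, using hypothetical event ages (the function name and numbers are illustrative, not from the study):

```python
# Convert recurrent-event ages into PWP-style (gap-time) counting-process rows.
# Data are hypothetical: ages (months) at successive FTT episodes for one child,
# plus the age at end of follow-up.

def pwp_gap_time_rows(event_ages, followup_age):
    """Return (stratum, start, stop, event) rows; stratum = event number."""
    rows = []
    prev = 0.0
    for k, age in enumerate(event_ages, start=1):
        rows.append((k, 0.0, age - prev, 1))   # gap time since previous event
        prev = age
    if followup_age > prev:                     # censored gap after last event
        rows.append((len(event_ages) + 1, 0.0, followup_age - prev, 0))
    return rows

for row in pwp_gap_time_rows([4.0, 9.0, 15.0], followup_age=24.0):
    print(row)
```

Rows in this layout can then be fed to any stratified Cox implementation, which is how the PWP hazard ratios above are typically estimated.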

  11. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual. Companion CD

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self- Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  12. Bisphosphonates and risk of cardiovascular events: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Dae Hyun Kim

    Full Text Available Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Absolute risks over 25-36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; OR [95% CI]: 0.98 [0.84-1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92-1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69-1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82-1.19]; I2 = 5.8%), or CV death (14 trials; 0.88 [0.72-1.07]; I2 = 0.0%), with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96-1.61]; I2 = 0.0%), but not for oral bisphosphonates (26 trials; 1.02 [0.83-1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large reduction in fractures with bisphosphonates, changes in
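The Mantel-Haenszel fixed-effects pooling used in this meta-analysis has a simple closed form: OR_MH = Σ(a_i·d_i/n_i) / Σ(b_i·c_i/n_i) over the per-trial 2×2 tables. A sketch with made-up trial counts (not the study's data):

```python
# Mantel-Haenszel fixed-effects pooled odds ratio across trials.
# Each trial is (events_treated, no_event_treated, events_control, no_event_control).
# The counts below are illustrative, not the meta-analysis data.

def mantel_haenszel_or(tables):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

trials = [(12, 488, 14, 486), (30, 970, 28, 972), (5, 195, 7, 193)]
print(round(mantel_haenszel_or(trials), 3))
```

Unlike inverse-variance pooling, the MH estimator remains stable when per-trial event counts are small, which is why it is common for rare CV outcomes like these.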

  13. Practical guidance for statistical analysis of operational event data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies
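One standard building block in such reliability estimates is combining independent failure modes: if each mode has a constant rate (exponential lifetime), the component fails at the first mode to occur, so the rates simply add. A sketch under that assumption, with illustrative rates not taken from the report:

```python
import math

# Independent failure modes with constant rates (exponential lifetimes):
# the combined rate is the sum of the per-mode rates. Rates are per hour
# and purely illustrative.
rates = {"mechanical": 1e-5, "electrical": 4e-6, "human_error": 6e-6}

total_rate = sum(rates.values())                # combined failure rate
mttf = 1.0 / total_rate                         # mean time to failure, hours
p_fail_1yr = 1 - math.exp(-total_rate * 8760)   # failure probability over 1 year

print(total_rate, round(mttf), round(p_fail_1yr, 4))
```

The same additivity underlies the advice in the report on which failure modes to model: a mode whose rate is orders of magnitude below the sum contributes negligibly to the combined estimate.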

  14. Water resources and environmental input-output analysis and its key study issues: a review

    Science.gov (United States)

    YANG, Z.; Xu, X.

    2013-12-01

    Used to study material and energy flows in socioeconomic systems, Input-Output Analysis (IOA) has been an effective analysis tool since its appearance, and its research fields have been increasingly expanded and studied in depth as the underlying theory has developed. In this paper, starting with an introduction to the development of the theory, water resources input-output analysis and environmental input-output analysis are reviewed in detail, and two key study issues are discussed. Input-Occupancy-Output Analysis and Grey Input-Output Analysis, whose proposal and development are introduced first, can be regarded as effective complements to traditional IOA theory; because of the hypotheses of homogeneity, stability, and proportionality, however, they have inevitably been restricted in practical application. On the applied side, drawing on an extensive survey of the literature, research on water resources input-output analysis and environmental input-output analysis is comprehensively reviewed and analyzed. Regional water resources flows between different economic sectors are systematically analyzed, and several types of environmental input-output analysis models combined with other effective analysis tools are summarized. The development of water resources and environmental input-output analysis models is explained from both international and Chinese perspectives, and several typical case studies from recent years are listed. Through this literature analysis, development trends and study hotspots are also summarized. In recent years, Chinese studies reporting water resources consumption analysis and virtual water study have occupied a large share. Water resources consumption analysis has long been the emphasis of domestic water resources IOA. Virtual water study has been considered the new hotspot of
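The Leontief model underlying all of these IOA variants solves x = Ax + d, i.e. x = (I − A)⁻¹d, for total sectoral output x given the technical-coefficient matrix A and final demand d. A minimal two-sector sketch with illustrative numbers:

```python
# Leontief input-output model: total output x satisfies x = A x + d,
# so x = (I - A)^(-1) d. The 2-sector coefficients and demand below are
# illustrative (e.g. A could encode water use per unit of sectoral output).

def solve_2x2(A, d):
    """Solve (I - A) x = d for a 2x2 coefficient matrix A via Cramer's rule."""
    m = [[1 - A[0][0], -A[0][1]],
         [-A[1][0], 1 - A[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    x0 = (d[0] * m[1][1] - m[0][1] * d[1]) / det
    x1 = (m[0][0] * d[1] - d[0] * m[1][0]) / det
    return [x0, x1]

A = [[0.2, 0.3],   # units of sector-1 and sector-2 input per unit of output
     [0.1, 0.4]]
d = [50.0, 30.0]   # final demand by sector
print(solve_2x2(A, d))
```

Water resources and environmental extensions attach resource-use or emission coefficients to the same x, which is why the Leontief inverse recurs throughout the literature reviewed here.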

  15. Key European Grid event to take place in Geneva

    CERN Multimedia

    2006-01-01

    EGEE'06 is the main conference of the EGEE project, which is co-funded by the European Union and hosted by CERN. More than 90 partners all over Europe and beyond are working together in EGEE to provide researchers in both academia and industry with access to major computing resources, independent of their geographic location. The largest user community of the EGEE Grid is the High-Energy Physics community and in particular the LHC experiments, which are already making heavy use of the infrastructure to prepare for data taking. However, with the many new challenges faced by EGEE in its second phase that started in April this year, an even broader audience than at previous EGEE conferences is expected. In particular, a large number of related Grid projects will feature prominently in both plenary and parallel sessions during the 5 days of this event. Industry will also be well represented, highlighting the EGEE project's commitment to technology transfer to industry. CERN is the host of the conference, which i...

  16. A Hierarchical Convolutional Neural Network for vesicle fusion event classification.

    Science.gov (United States)

    Li, Haohan; Mao, Yunxiang; Yin, Zhaozheng; Xu, Yingke

    2017-09-01

    Quantitative analysis of vesicle exocytosis and classification of different modes of vesicle fusion from fluorescence microscopy are of primary importance for biomedical research. In this paper, we propose a novel Hierarchical Convolutional Neural Network (HCNN) method to automatically identify vesicle fusion events in time-lapse Total Internal Reflection Fluorescence Microscopy (TIRFM) image sequences. First, a detection and tracking method is developed to extract image patch sequences containing potential fusion events. Then, a Gaussian Mixture Model (GMM) is applied to each image patch of the patch sequence, with outliers rejected for robust Gaussian fitting. By utilizing the high-level time-series intensity change features introduced by the GMM and the visual appearance features embedded in key moments of the fusion process, the proposed HCNN architecture is able to classify each candidate patch sequence into three classes: full fusion event, partial fusion event, and non-fusion event. Finally, we validate the performance of our method on 9 challenging datasets annotated by cell biologists, and our method achieves better performance than three previous methods. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Drift Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dwayne C. Kicker

    2001-09-28

    A statistical description of the probable block sizes formed by fractures around the emplacement drifts has been developed for each of the lithologic units of the repository host horizon. A range of drift orientations with the drift azimuth varied in 15° increments has been considered in the static analysis. For the quasi-static seismic analysis, and the time-dependent and thermal effects analysis, two drift orientations have been considered: a drift azimuth of 105° and the current emplacement drift azimuth of 75°. The change in drift profile resulting from progressive deterioration of the emplacement drifts has been assessed both with and without backfill. Drift profiles have been determined for four different time increments, including static (i.e., upon excavation), 200 years, 2,000 years, and 10,000 years. The effect of seismic events on rock fall has been analyzed. Block size distributions and drift profiles have been determined for three seismic levels, including a 1,000-year event, a 5,000-year event, and a 10,000-year event. Data developed in this modeling and analysis activity have been entered into the TDMS (DTN: MO0109RDDAAMRR.003). The following conclusions have resulted from this drift degradation analysis: (1) The available fracture data are suitable for supporting a detailed key block analysis of the repository host horizon rock mass. The available data from the north-south Main Drift and the east-west Cross Drift provide a sufficient representative fracture sample of the repository emplacement drift horizon. However, the Tptpln fracture data are only available from a relatively small section of the Cross Drift, resulting in a smaller fracture sample size compared to the other lithologic units. This results in a lower degree of confidence that the key block data based on the Tptpln data set is actually representative of the overall Tptpln key block population. (2) The seismic effect on the rock fall size distribution for all events

  18. [Causes of underreporting of occupational injuries and adverse events in Chile].

    Science.gov (United States)

    Luengo, Carolina; Paravic, Tatiana; Valenzuela, Sandra

    2016-02-01

    Objective To describe the causes of underreporting of occupational injuries and adverse events as identified in the international literature and by key informants in the area of health and risk prevention in Chile. Methods The study uses a qualitative descriptive approach. This includes a systematized literature review that follows the SALSA method (Search, Appraisal, Synthesis and Analysis) and is in line with the PRISMA statement (Preferred Reporting Items for Systematic Reviews and Meta-Analyses). In addition, interviews were conducted with informants in the area of health and risk prevention in Chile. Results The leading causes of underreporting of occupational injuries as described in the literature and by key informants were economic factors and ignorance. With regard to adverse events, the principal causes indicated were fear of sanctions, limited support provided by the authorities, lack of knowledge, and excessive workload. Conclusions It is important to continue working to strengthen the reporting of occupational injuries and adverse events and to implement measures aimed at minimizing factors that appear to be the leading causes of underreporting. In the case of occupational injuries, this means making sure that economic factors are not an impediment but rather an incentive to reporting. With respect to adverse events, steps should be taken to eliminate the fear of sanctions and to develop recommendations, focusing more on systemic improvements than on individuals, to promote joint learning. In both cases it will be necessary to combat ignorance through continuous, systematic training and support.

  19. Using Web Crawler Technology for Geo-Events Analysis: A Case Study of the Huangyan Island Incident

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2014-04-01

    Full Text Available Social networking and network socialization bring abundant text information and social relationships into our daily lives. Making full use of these data in the big data era is of great significance for better understanding the changing world and the information-based society. Although politics has been integrally involved in hyperlinked world issues since the 1990s, text analysis and data visualization of geo-events have faced the bottleneck of traditional manual analysis. Although automatic assembly of different geospatial web services and distributed geospatial information systems utilizing service chaining has been explored and built recently, data mining and information collection are not comprehensive enough because of the sensitivity, complexity, relativity, timeliness, and unexpected characteristics of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency, and dissemination path of the Huangyan Island incident were studied using web crawler technology and text analysis. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios, and dissemination flow graph, based on the crawled information and data processing, not only highlight the characteristics of the geo-event itself, but also implicate many interesting phenomena and deep-seated problems behind it, such as related topics, theme vocabularies, subject contents, hot countries, event bodies, opinion leaders, high-frequency vocabularies, information sources, semantic structure, propagation paths, distribution of different attitudes, and regional differences in net citizens' responses in the Huangyan Island incident. Furthermore, text analysis of network information with the help of a focused web crawler is able to express the time-space relationship of the crawled information and the information characteristics of the semantic network of geo-events. Therefore, it is a useful tool to
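The word-frequency tallies behind tag clouds and high-frequency vocabulary lists reduce to counting tokens after stop-word removal. A minimal sketch with a stand-in corpus (the text and stop-word list are illustrative, not the crawled data):

```python
from collections import Counter
import re

# Word-frequency tally of crawled text, the basis for tag clouds and
# high-frequency vocabulary lists. The corpus below is a stand-in snippet.
corpus = """China and the Philippines dispute the Huangyan Island standoff;
net citizens debate the island dispute while media track the standoff."""

words = re.findall(r"[a-z]+", corpus.lower())
stopwords = {"and", "the", "while"}
freq = Counter(w for w in words if w not in stopwords)

print(freq.most_common(3))
```

A real pipeline would run the same counting over pages fetched by the crawler, with a proper tokenizer and a fuller stop-word list; sentiment tendency and dissemination paths require additional models on top of these counts.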

  20. EVENT-MARKETING – FEATURES OF APPLICATION IN MODERN TOURISM

    Directory of Open Access Journals (Sweden)

    Oksana Vlasenko

    2016-03-01

    Full Text Available The article analyzes the modern features of the development and use of event marketing. It outlines the essence and characteristics of event management, its principles and methods of application, and characterizes the features and importance of tourism and the benefits of event marketing as a promising method of indirect marketing communications. Examples of the practical application of event marketing are given, and the correlation of event management and marketing, and its subordination to event marketing purposes, is determined. Key words: tourism, event-tourism, event-management, event-marketing, socio-cultural sphere. JEL: M 31

  1. The analysis of competing events like cause-specific mortality--beware of the Kaplan-Meier method

    NARCIS (Netherlands)

    Verduijn, Marion; Grootendorst, Diana C.; Dekker, Friedo W.; Jager, Kitty J.; le Cessie, Saskia

    2011-01-01

    Kaplan-Meier analysis is a popular method for analysing time-to-event data. In the case of competing event analyses, however, such as that of cardiovascular and non-cardiovascular mortality, the Kaplan-Meier method profoundly overestimates the cumulative mortality probabilities for each of the
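The overestimation can be demonstrated in a few lines: treating competing events as censored, 1 − KM for one cause exceeds the Aalen-Johansen cumulative incidence, which weights each cause-specific hazard by overall event-free survival. A sketch with hypothetical counts (not data from the paper):

```python
# Hypothetical competing-risks data: at each time, counts of cause-A events,
# cause-B (competing) events, and the number at risk just before that time.
# 1 - KM (with cause-B "censored") overestimates the cause-A cumulative incidence.

times = [(1, 2, 3, 100),   # (time, cause_A_events, cause_B_events, at_risk)
         (2, 4, 5, 95),
         (3, 6, 4, 86)]

km_surv, overall_surv, cuminc_a = 1.0, 1.0, 0.0
for _, d_a, d_b, n in times:
    cuminc_a += overall_surv * d_a / n    # Aalen-Johansen increment
    overall_surv *= 1 - (d_a + d_b) / n   # survival from any event
    km_surv *= 1 - d_a / n                # KM, competing events treated as censored

print(round(1 - km_surv, 4), round(cuminc_a, 4))
assert 1 - km_surv > cuminc_a             # KM overestimates
```

The gap widens as competing events become more frequent, which is exactly the cardiovascular versus non-cardiovascular mortality setting the abstract warns about.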

  2. Superposed epoch analysis of O+ auroral outflow during sawtooth events and substorms

    Science.gov (United States)

    Nowrouzi, N.; Kistler, L. M.; Lund, E. J.; Cai, X.

    2017-12-01

    Sawtooth events are repeated injections of energetic particles at geosynchronous orbit. Studies have shown that 94% of sawtooth events occurred during magnetic storm times. The main factor that causes a sawtooth event is still an open question. Simulations have suggested that heavy ions like O+ may play a role in triggering the injections. One of the sources of O+ in the Earth's magnetosphere is the nightside aurora. O+ ions coming from the nightside auroral region have direct access to the near-Earth magnetotail. A model (Brambles et al. 2013) for interplanetary coronal mass ejection driven sawtooth events found that nightside O+ outflow caused the subsequent teeth of the sawtooth event through a feedback mechanism. This work is a superposed epoch analysis to test whether the observed auroral outflow supports this model. Using FAST spacecraft data from 1997-2007, we examine the auroral O+ outflow as a function of time relative to an injection onset. We then determine whether the profile of O+ outflow flux during sawtooth events differs from the outflow observed during isolated substorms. The auroral region boundaries are estimated using the method of Andersson et al. (2004). Subsequently, the O+ outflow flux inside these boundaries is calculated and binned as a function of superposed epoch time for substorms and sawtooth "teeth". In this way, we will determine whether sawtooth events do in fact have greater O+ outflow, and whether that outflow is predominantly from the nightside, as suggested by the model results.
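Superposed epoch analysis itself amounts to aligning a time series on each onset and averaging the values in bins of time relative to onset. A sketch with synthetic data (the series, onsets, and bins are illustrative):

```python
# Superposed epoch analysis: align a series on a list of onset times and
# average values in relative-time bins. All data here are synthetic.

def superposed_epoch(series, onsets, rel_bins):
    """series: list of (t, value); rel_bins: list of (lo, hi) windows of t - onset.
    Returns the mean value in each relative-time bin, pooled over all onsets."""
    sums = [0.0] * len(rel_bins)
    counts = [0] * len(rel_bins)
    for t0 in onsets:
        for t, v in series:
            dt = t - t0
            for i, (lo, hi) in enumerate(rel_bins):
                if lo <= dt < hi:
                    sums[i] += v
                    counts[i] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

# Synthetic series: unit response within 1 time step of each onset, else zero.
series = [(t, 1.0 if any(abs(t - o) <= 1 for o in (10, 30)) else 0.0)
          for t in range(40)]
print(superposed_epoch(series, onsets=[10, 30], rel_bins=[(-5, -2), (-2, 2), (2, 5)]))
```

Pooling across many onsets in this way is what lets the epoch average separate an onset-locked outflow enhancement from uncorrelated background variability.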

  3. Event Sequence Analysis of the Air Intelligence Agency Information Operations Center Flight Operations

    National Research Council Canada - National Science Library

    Larsen, Glen

    1998-01-01

    This report applies Event Sequence Analysis, methodology adapted from aircraft mishap investigation, to an investigation of the performance of the Air Intelligence Agency's Information Operations Center (IOC...

  4. Broadband analysis of landslides seismic signal : example of the Oso-Steelhead landslide and other recent events

    Science.gov (United States)

    Hibert, C.; Stark, C. P.; Ekstrom, G.

    2014-12-01

    Landslide failures on the scale of mountains are spectacular, dangerous, and spontaneous, making direct observations hard to obtain. Measurement of their dynamic properties during runout is a high research priority, but a logistical and technical challenge. Seismology has begun to help in several important ways. Taking advantage of broadband seismic stations, recent advances now allow: (i) the seismic detection and location of large landslides in near-real-time, even for events in very remote areas that may have remain undetected, such as the 2014 Mt La Perouse supraglacial failure in Alaska; (ii) inversion of long-period waves generated by large landslides to yield an estimate of the forces imparted by the bulk accelerating mass; (iii) inference of the landslide mass, its center-of-mass velocity over time, and its trajectory.Key questions persist, such as: What can the short-period seismic data tell us about the high-frequency impacts taking place within the granular flow and along its boundaries with the underlying bedrock? And how does this seismicity relate to the bulk acceleration of the landslide and the long-period seismicity generated by it?Our recent work on the joint analysis of short- and long-period seismic signals generated by past and recent events, such as the Bingham Canyon Mine and the Oso-Steelhead landslides, provides new insights to tackle these issues. Qualitative comparison between short-period signal features and kinematic parameters inferred from long-period surface wave inversion helps to refine interpretation of the source dynamics and to understand the different mechanisms for the origin of the short-period wave radiation. Our new results also suggest that quantitative relationships can be derived from this joint analysis, in particular between the short-period seismic signal envelope and the inferred momentum of the center-of-mass. In the future, these quantitative relationships may help to constrain and calibrate parameters used in

  5. Event shape analysis in ultrarelativistic nuclear collisions

    OpenAIRE

    Kopecna, Renata; Tomasik, Boris

    2016-01-01

    We present a novel method for sorting events. So far, single variables like the flow vector magnitude were used for sorting events. Our approach takes into account the whole azimuthal angle distribution rather than a single variable. This method allows us to determine a good measure of the event shape, providing multiplicity-independent insight. We discuss the advantages and disadvantages of this approach, the possible usage in femtoscopy, and other more exclusive experimental studies.

  6. Earth Science Data Fusion with Event Building Approach

    Science.gov (United States)

    Lukashin, C.; Bartle, Ar.; Callaway, E.; Gyurjyan, V.; Mancilla, S.; Oyarzun, R.; Vakhnin, A.

    2015-01-01

    Objectives of the NASA Information And Data System (NAIADS) project are to develop a prototype of a conceptually new middleware framework to modernize and significantly improve efficiency of Earth Science data fusion, big data processing and analytics. The key components of the NAIADS include: a Service Oriented Architecture (SOA) multi-lingual framework, a multi-sensor coincident data Predictor, fast in-memory data Staging, a multi-sensor data-Event Builder, complete data-Event streaming (a workflow with minimized IO), and on-line data processing control and analytics services. The NAIADS project leverages the CLARA framework, developed at Jefferson Lab, integrated with the ZeroMQ messaging library. The science services are prototyped and incorporated into the system. Merging of SCIAMACHY Level-1 observations, MODIS/Terra Level-2 (Clouds and Aerosols) data products, and ECMWF re-analysis will be used for NAIADS demonstration and performance tests in compute Cloud and Cluster environments.

  7. Analysis of events with isolated leptons and missing transverse momentum in ep collisions at HERA

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, G.

    2007-02-07

    A study of events with isolated leptons and missing transverse momentum in ep collisions is presented. Within the Standard Model (SM) such topologies are expected mainly from production of real W bosons with subsequent leptonic decay. This thesis continues the analysis of such events done in the HERA-1 period, where an excess over the SM prediction was observed for events with high hadronic transverse momentum P{sup X}{sub T}>25 GeV. New data of the HERA-2 period are added. The analysed data sample recorded in e{sup +}p collisions corresponds to an integrated luminosity of 220 pb{sup -1}, which is a factor of two more with respect to the HERA-1 analysis. The e{sup -}p data correspond to 186 pb{sup -1}, which is a factor of 13 more with respect to HERA-1. All three lepton generations (electrons, muons and tau leptons) are analysed. In the electron and muon channels a total of 53 events are observed in 406 pb{sup -1}. This compares well to the SM expectation of 53.7{+-}6.5 events, dominated by W production. However, a difference in the event rate is observed for different electron beam charges. In e{sup +}p data the excess of events with P{sup X}{sub T}>25 GeV is sustained, while the e{sup -}p data agree with the SM. In the tau channel 18 events are observed in all HERA data, with 20{+-}3 expected from the SM. The events are dominated by irreducible background from charged currents. The contribution from W production amounts to about 22%. One event with P{sup X}{sub T}>25 GeV is observed, where 1.4{+-}0.3 are expected from the SM. (orig.)

  8. A limited area model intercomparison on the 'Montserrat-2000' flash-flood event using statistical and deterministic methods

    Directory of Open Access Journals (Sweden)

    S. Mariani

    2005-01-01

    Full Text Available In the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison is performed for intense events that caused significant damage to people and territory. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors useful for producing a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event, also known as the 'Montserrat-2000' event. The study is performed using forecast data from seven operational LAMs, placed at partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify Catalonia regions affected by misses and false alarms using contingency table elements. Moreover, the standard 'eyeball' analysis of forecast and observed precipitation fields has been supported by the use of a state-of-the-art diagnostic method, the contiguous rain area (CRA) analysis. This method allows quantification of the spatial shift forecast error and identification of the error sources that affected each model's forecasts. High-resolution modelling and domain size seem to have a key role in providing a skillful forecast. Further work is needed to support this statement, including verification using a wider observational data set.
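The non-parametric skill scores mentioned above are computed from a 2x2 contingency table of forecast/observation outcomes. A minimal sketch (the function name and the particular scores chosen here are illustrative standards, not taken from the paper):

```python
def skill_scores(hits, false_alarms, misses, correct_negatives):
    """Standard non-parametric skill scores from a 2x2 contingency table,
    as commonly used in quantitative precipitation forecast verification."""
    n = hits + false_alarms + misses + correct_negatives
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    # Equitable threat score corrects the threat score for random hits.
    hits_random = (hits + false_alarms) * (hits + misses) / n
    ets = (hits - hits_random) / (hits + false_alarms + misses - hits_random)
    return {"POD": pod, "FAR": far, "BIAS": bias, "ETS": ets}
```

A forecast with POD near 1, FAR near 0, and BIAS near 1 is performing well; ETS is often preferred for comparing models because it is equitable with respect to random forecasts.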

  9. WANO Activities Related to Identifying and Reducing the Likelihood for Recurring Events

    International Nuclear Information System (INIS)

    Llewellyn, Michael

    2003-01-01

    Since its inception, WANO has encouraged members to share operating experience and events information through the WANO Operating Experience Programme. Preventing recurring events is a prime reason for sharing events information. Over 2500 events have been shared through WANO since 1989. However, a review of WANO activities in 1997 identified that this information was not being used very well by WANO members, and that WANO was not adding much 'value' to the events sharing process. At the time, WANO only provided the 'postal exchange' function for events sharing, and was not reviewing events across WANO to help members focus on the really important issues. Often, these very important issues involve recurring events. As a result of the 1997 review of the WANO operating experience process, WANO re-sourced and developed new analysis capabilities and began producing new types of reports for its members. The resource commitment includes four seconded engineers (one from each WANO Regional Centre) brought to Paris to staff a WANO Operating Experience Central Team. This team analyses events across WANO, writes WANO event reports, provides operating experience-related training, and provides technical support to members to improve their use of operating experience information. One focus area for events analysis by WANO is the identification and subsequent industry notification of recurring events. As a result of the 1997 review of WANO activities, in 1998 WANO began production of several new types of event-based reports to communicate significant industry events to our members. A key focus of analysis of these significant events is whether they are recurring events (that is, very similar to previous events either at that NPP or another NPP in the industry). The reports are called Significant Operating Experience Reports (SOERs) and Significant Event Reports (SERs). Significant Operating Experience Reports (SOERs) are written by WANO when several event reports indicate that

  10. The Analysis of the Properties of Super Solar Proton Events and the Associated Phenomena

    Science.gov (United States)

    Cheng, L. B.; Le, G. M.; Lu, Y. P.; Chen, M. H.; Li, P.; Yin, Z. Q.

    2014-05-01

    The solar flares, the propagation speeds of the shocks driven by coronal mass ejections (CMEs) from the Sun to the Earth, the source and Carrington longitudes, and the geomagnetic storms associated with each super solar proton event (peak flux equal to or exceeding 10000 pfu) have been investigated. The analysis results show that the source longitudes of super solar proton events ranged from E30° to W75°. The Carrington longitudes of the source regions of super solar proton events were distributed in two longitude bands, 130°˜220° and 260°˜320°, respectively. All super solar proton events were accompanied by major solar flares and fast CMEs. The average speeds of the shocks propagating from the Sun to the Earth were greater than 1200 km/s. Eight super solar proton events were followed by major geomagnetic storms (Dst≤-100 nT). One super solar proton event was followed by a geomagnetic storm with Dst=-96 nT.

  11. Sources of Error and the Statistical Formulation of M S: m b Seismic Event Screening Analysis

    Science.gov (United States)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H 0: explosion characteristics using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m b) computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m b greater than 3.5. The Rayleigh wave magnitude (denoted M S) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (physical correction model) for path and distance effects between event and station. Relative to m b, earthquakes generally have a larger M S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M S and m b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H 0: explosion characteristics.
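The screening idea above contrasts M S and m b while inflating the standard error for correction-model inadequacy. A hedged sketch of such a statistic (the `offset` and `sigma_model` values are hypothetical placeholders, not the IDC's calibrated parameters, and the exact IDC formulation differs):

```python
import math

def ms_mb_screen(ms, mb, offset=1.25, sigma_model=0.2):
    """Illustrative M_S:m_b screening statistic. Explosions tend to have
    m_b - M_S larger than earthquakes, so large positive values are
    consistent with H0: explosion characteristics. sigma_model stands in
    for the physical-correction-model inadequacy term added to the
    standard error; offset is a hypothetical screening threshold."""
    d = [b - s for b, s in zip(mb, ms)]          # per-station m_b - M_S
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1) if n > 1 else 0.0
    se = math.sqrt(var_d / n + sigma_model ** 2)  # inflate SE for model inadequacy
    return (mean_d - offset) / se
```

Without the `sigma_model` term the standard error would shrink toward zero for consistent station measurements, overstating confidence; the inflation is the paper's central point.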

  12. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    Science.gov (United States)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth climate. Among the foremost results reported, network analyses of the synchronicity of extreme events as captured by so-called event synchronization have been proposed to be powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme flooding. Rooted in the analysis of spike train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically - or even dynamically - interrelated) or not, without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in the case of the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in the case of climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctively different type of distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two
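Event coincidence rates, the alternative measure named above, require an explicit maximum delay between events. A minimal sketch of one symmetric variant (the precise definition used by the authors, e.g. directionality and precursor/trigger distinction, may differ):

```python
def event_coincidence_rate(a, b, delta_t):
    """Fraction of events in series `a` matched by at least one event in
    series `b` within +/- delta_t time units. `a` and `b` are lists of
    event times (e.g. days on which precipitation exceeded a percentile
    threshold at two locations)."""
    if not a:
        return 0.0
    matched = sum(1 for ta in a if any(abs(ta - tb) <= delta_t for tb in b))
    return matched / len(a)
```

Unlike event synchronization, the tolerable delay `delta_t` is chosen by the analyst, which is exactly the trade-off the abstract discusses for heavy-tailed waiting-time distributions.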

  13. Data Science Solution to Event Prediction in Outsourced Clinical Trial Models.

    Science.gov (United States)

    Dalevi, Daniel; Lovick, Susan; Mann, Helen; Metcalfe, Paul D; Spencer, Stuart; Hollis, Sally; Ruau, David

    2015-01-01

    Late phase clinical trials are regularly outsourced to a Contract Research Organisation (CRO) while the risk and accountability remain within the sponsor company. Many statistical tasks are delivered by the CRO and later revalidated by the sponsor. Here, we report a technological approach to standardised event prediction. We have built a dynamic web application around an R-package with the aim of delivering reliable event predictions, simplifying communication and increasing trust between the CRO and the in-house statisticians via transparency. Short learning curve, interactivity, reproducibility and data diagnostics are key here. The current implementation is motivated by time-to-event prediction in oncology. We demonstrate a clear benefit of standardisation for both parties. The tool can be used for exploration, communication, sensitivity analysis and generating standard reports. At this point we wish to present this tool and share some of the insights we have gained during the development.

  14. Time to Tenure in Spanish Universities: An Event History Analysis

    Science.gov (United States)

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199
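An event history analysis of time to tenure typically starts from a nonparametric survival estimate before covariate effects are modelled. A sketch of the Kaplan-Meier estimator under that assumption (the study's covariate effects would require something like a Cox regression, not shown here):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. `times` are durations (e.g. years
    from PhD to tenure or to censoring); `events` flags 1 = tenured
    (event observed), 0 = censored. Returns [(time, S(time)), ...]."""
    pairs = sorted(zip(times, events))
    surv, curve = 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = sum(e for tt, e in pairs if tt == t)    # events at time t
        n_t = sum(1 for tt, _ in pairs if tt >= t)  # still at risk at t
        if d:
            surv *= 1 - d / n_t
            curve.append((t, surv))
        i += sum(1 for tt, _ in pairs if tt == t)   # skip past ties
    return curve
```

Censoring matters here: faculty who left academia before tenure contribute risk time without an event, which a naive mean of tenure times would ignore.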

  15. Time to tenure in Spanish universities: an event history analysis.

    Science.gov (United States)

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility.

  16. Time to tenure in Spanish universities: an event history analysis.

    Directory of Open Access Journals (Sweden)

    Luis Sanz-Menéndez

    Full Text Available Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated with academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated with a faster transition. Factors associated with the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains are confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those who combine social embeddedness and team integration with good credentials regarding past and potential future performance, rather than high levels of mobility.

  17. Implementing recovery: an analysis of the key technologies in Scotland

    Science.gov (United States)

    2011-01-01

    Background Over the past ten years the promotion of recovery has become a stated aim of mental health policies within a number of English speaking countries, including Scotland. Implementation of a recovery approach involves a significant reorientation of mental health services and practices, which often poses significant challenges for reformers. This article examines how four key technologies of recovery have assisted in the move towards the creation of a recovery-oriented mental health system in Scotland. Methods Drawing on documentary analysis and a series of interviews we examine the construction and implementation of four key recovery 'technologies' as they have been put to use in Scotland: recovery narratives, the Scottish Recovery Indicator (SRI), Wellness Recovery Action Planning (WRAP) and peer support. Results Our findings illuminate how each of these technologies works to instantiate, exemplify and disseminate a 'recovery orientation' at different sites within the mental health system in order to bring about a 'recovery oriented' mental health system. They also enable us to identify some of the factors that facilitate or hinder the effectiveness of those technologies in bringing about a change in how mental health services are delivered in Scotland. These findings provide a basis for some general reflections on the utility of 'recovery technologies' to implement a shift towards recovery in mental health services in Scotland and elsewhere. Conclusions Our analysis of this process within the Scottish context will be valuable for policy makers and service coordinators wishing to implement recovery values within their own national mental health systems. PMID:21569633

  18. Implementing recovery: an analysis of the key technologies in Scotland

    Directory of Open Access Journals (Sweden)

    Sturdy Steve

    2011-05-01

    Full Text Available Abstract Background Over the past ten years the promotion of recovery has become a stated aim of mental health policies within a number of English speaking countries, including Scotland. Implementation of a recovery approach involves a significant reorientation of mental health services and practices, which often poses significant challenges for reformers. This article examines how four key technologies of recovery have assisted in the move towards the creation of a recovery-oriented mental health system in Scotland. Methods Drawing on documentary analysis and a series of interviews we examine the construction and implementation of four key recovery 'technologies' as they have been put to use in Scotland: recovery narratives, the Scottish Recovery Indicator (SRI), Wellness Recovery Action Planning (WRAP) and peer support. Results Our findings illuminate how each of these technologies works to instantiate, exemplify and disseminate a 'recovery orientation' at different sites within the mental health system in order to bring about a 'recovery oriented' mental health system. They also enable us to identify some of the factors that facilitate or hinder the effectiveness of those technologies in bringing about a change in how mental health services are delivered in Scotland. These findings provide a basis for some general reflections on the utility of 'recovery technologies' to implement a shift towards recovery in mental health services in Scotland and elsewhere. Conclusions Our analysis of this process within the Scottish context will be valuable for policy makers and service coordinators wishing to implement recovery values within their own national mental health systems.

  19. Two-Dimensional Key Table-Based Group Key Distribution in Advanced Metering Infrastructure

    Directory of Open Access Journals (Sweden)

    Woong Go

    2014-01-01

    Full Text Available A smart grid provides two-way communication by using information and communication technology. In order to establish two-way communication, the advanced metering infrastructure (AMI) is used in the smart grid as the core infrastructure. This infrastructure consists of smart meters, data collection units, maintenance data management systems, and so on. However, potential security problems of the AMI increase owing to the use of the public network, because the transmitted information is electricity consumption data used for billing. Thus, in order to establish a secure connection to transmit electricity consumption data, encryption is necessary, for which key distribution is required. Further, a group key is more efficient than a pairwise key in the hierarchical structure of the AMI. Therefore, we propose a group key distribution scheme using a two-dimensional key table, informed by analysis of existing sensor network group key distribution schemes. The proposed scheme has three phases: group key predistribution, selection of the group key generation element, and generation of the group key.
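The two-dimensional key table idea can be sketched as follows: each cell holds a pre-distributed secret, a generation element (a row/column pair) is selected, and every member holding the corresponding secrets derives the same group key. This is an illustrative toy, not the paper's exact three-phase scheme; `make_key_table` and `derive_group_key` are hypothetical names:

```python
import hashlib
import secrets

def make_key_table(rows, cols):
    """Phase 1 (sketch): predistribute a random 16-byte secret into every
    cell of a rows x cols key table."""
    return [[secrets.token_bytes(16) for _ in range(cols)] for _ in range(rows)]

def derive_group_key(table, gen_row, gen_col):
    """Phases 2-3 (sketch): given the selected generation element
    (gen_row, gen_col), hash the corresponding row and column secrets
    together. All members holding those secrets derive the same key
    without any pairwise exchange."""
    return hashlib.sha256(table[gen_row][0] + table[0][gen_col]).hexdigest()
```

The point of the 2D structure is storage efficiency: a meter stores one row and one column of secrets rather than a full pairwise key set, while still allowing many distinct group keys to be formed.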

  20. Identification of independent modules in fault trees which contain dependent basic events

    International Nuclear Information System (INIS)

    Sun, H.; Andrews, J.D.

    2004-01-01

    The reliability performance of a system is frequently a function of component failures, of which some are independent whilst others are interdependent. It is possible to represent the system failure logic in a fault tree diagram; however, only the sections containing independent events can be assessed using the conventional fault tree analysis methodology. The analysis of the dependent sections requires a Markov analysis. Since the efficiency of the Markov analysis largely depends on the size of the established Markov model, the key is to extract from the fault tree the smallest sections which contain dependencies. This paper proposes a method aimed at establishing the smallest Markov model for the dependencies contained within the fault tree.

  1. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    Science.gov (United States)

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

    Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trial patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events, defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data were obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight different age groups to support cross-age group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest-incidence age group (20-29 years) as the reference group, the incidence rate of the 0-9 years group was 31.41%, approximately 1.51 times higher (P=.04) than that of the young adult group (20-29 years) at 20.76%. The second-highest rate was in the 50-59 years group, at 30.09%, also significantly higher than the reference group. Adverse event diversity also increased with patient age: clinical studies that recruited older patients (older than 40 years) were more likely to observe a diverse range of adverse events than studies recruiting from younger age groups.
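Comparing an age group's incidence rate against the reference group can be illustrated with a standard two-proportion z test (a simplified sketch; the paper aggregates at the trial level and reports t tests, which is a different aggregation):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for comparing two incidence proportions, e.g. the
    number of participants affected by adverse events (x) out of all
    participants (n) in an age group versus the reference group."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                     # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se
```

Values of |z| above 1.96 correspond to P < .05 under the usual normal approximation, which requires adequately large counts in both groups.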

  2. Analysis of post-blasting source mechanisms of mining-induced seismic events in Rudna copper mine, Poland

    Directory of Open Access Journals (Sweden)

    Caputa Alicja

    2015-10-01

    Full Text Available The exploitation of georesources by underground mining can be responsible for seismic activity in areas considered aseismic. Since strong seismic events are connected with rockburst hazard, there is a continuous requirement to reduce seismic risk. One of the most effective methods to do so is blasting in potentially hazardous mining panels. In this way, small to moderate tremors are provoked and stress accumulation is substantially reduced. In this paper we present an analysis of post-blasting events using Full Moment Tensor (MT) inversion of data from the underground seismic network at the Rudna mine, Poland. In addition, we describe the problems we faced when analyzing the seismic signals. Our studies show that focal mechanisms for events that occurred after blasts exhibit common features in the MT solution. The strong isotropic and small Double Couple (DC) components of the MT indicate that these events were provoked by detonations. On the other hand, the post-blasting MT is considerably different from the MT obtained for strong mining events. We believe that seismological analysis of provoked and unprovoked events can be a very useful tool in confirming the effectiveness of blasting in seismic hazard reduction in mining areas.

  3. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

    Interactions of human error factors in NPP operation events are introduced, and 645 WANO operation event reports from 1999 to 2008 were analyzed, among which 432 were found to be related to human errors. After classifying these errors by root causes or causal factors and applying SPSS for correlation analysis, we conclude: (1) Personnel work practices are restricted by many factors; forming good personnel work practices is systematic work which needs support in many aspects. (2) Verbal communications, personnel work practices, man-machine interface, and written procedures and documents play great roles. They are four interacting factors which often come in a bundle; if improvements need to be made to one of them, synchronous measures are also necessary for the others. (3) Management direction and decision process, which are related to management, have a significant interaction with personnel factors. (authors)

  4. Analysis of operation events for HFETR emergency diesel generator set

    International Nuclear Information System (INIS)

    Li Zhiqiang; Ji Xifang; Deng Hong

    2015-01-01

    Through statistical analysis of the historical failure data of the emergency diesel generator set, the specific mode, attributes, and direct and root causes of each failure are reviewed and summarized. Considering the current status of the emergency diesel generator set, preventive measures and solutions in terms of operation, handling and maintenance are proposed, and the potential events for the emergency diesel generator set are analyzed. (authors)

  5. Analysis of internal events for the Unit 1 of the Laguna Verde Nuclear Power Station. Appendixes

    International Nuclear Information System (INIS)

    Huerta B, A.; Lopez M, R.

    1995-01-01

    This volume contains the appendices for the accident sequence analysis of internally initiated events for the Laguna Verde Unit 1 Nuclear Power Plant. Appendix A presents the comments raised by the Sandia National Laboratories technical staff as a result of their review of the Internal Event Analysis for the Laguna Verde Unit 1 Nuclear Power Plant. This review was performed during a joint Sandia/CNSNS multi-day meeting at the end of 1992. Also included is a brief evaluation of the applicability of these comments to the present study. Appendix B presents the fault tree models printed for each of the systems included and analyzed in the Internal Event Analysis for LVNPP. Appendix C presents the outputs of the TEMAC code, used for the quantification of the dominant accident sequences as well as for the final core damage evaluation. (Author)

  7. Tipping the Balance: Hepatotoxicity and the Four Apical Key Events of Hepatic Steatosis

    Science.gov (United States)

    Adverse outcome pathways (AOPs) are descriptive biological sequences that start from a molecular initiating event (MIE) and end with an adverse health outcome. AOPs provide biological context for high throughput chemical testing and further prioritize environmental health risk r...

  8. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    Science.gov (United States)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths were recorded for natural mortality, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had both immediate and delayed effects on mortality.
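
The odds ratios quoted above come from a case-crossover (conditional logistic) model, but the basic quantity can be illustrated with a minimal sketch: a Wald-type odds ratio and 95% confidence interval from a 2×2 exposure table. The counts below are invented for illustration and are not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: deaths on haze days vs. matched control days
or_, lo, hi = odds_ratio_ci(40, 30, 60, 63)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A real case-crossover analysis conditions on the matched sets rather than pooling counts, but the interpretation of the resulting OR is the same.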

  9. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique

    International Nuclear Information System (INIS)

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time-consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and to evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program.
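
The arithmetic such a tool automates is, at its core, multiplying conditional branch probabilities along each event-tree path. A minimal sketch of that calculation (the tree and its probabilities are invented for illustration, not taken from CETAT):

```python
def path_probability(branch_probs):
    """Probability of one event-tree path: the product of the
    conditional probabilities of its branches."""
    p = 1.0
    for prob in branch_probs:
        p *= prob
    return p

# Invented tree: after an initiating event, two operator actions can
# each succeed or fail; failure of the first action ends the path.
paths = {
    "both succeed": [0.95, 0.90],
    "first fails": [0.05],
    "second fails": [0.95, 0.10],
}
probs = {name: path_probability(p) for name, p in paths.items()}
print(probs)  # the three mutually exclusive paths sum to 1
```

Because the paths are exhaustive and mutually exclusive, their probabilities must sum to one, which is a useful consistency check on any entered tree.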

  10. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    International Nuclear Information System (INIS)

    Attrill, Gemma D. R.

    2010-01-01

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of purely an MHD wave interpretation for the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions and result from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  11. Logistic Organization of Mass Events in the Light of SWOT Analysis - Case Study

    Directory of Open Access Journals (Sweden)

    Joanna Woźniak

    2018-02-01

    Full Text Available Rzeszow Juwenalia is the largest free-entry student event in Subcarpathia and, at the same time, one of the best in Poland. On average, more than 25,000 people stay on the campus of Rzeszow University of Technology on every single day of the event. Such an enormous undertaking requires developing a strategy which will make it possible to design and coordinate the event effectively. In connection with that, the principal objective of this paper is to present the strengths and weaknesses of Rzeszow Juwenalia, and also to attempt to verify opportunities and threats related to the event. SWOT analysis was used in order to attain the adopted objective. Its use yielded results that make it possible to conduct a detailed assessment of the undertaking. The paper also presents proposals for improvement activities which may be implemented in the future.

  12. Political Shocks and Abnormal Returns During the Taiwan Crisis: An Event Study Analysis

    National Research Council Canada - National Science Library

    Steeves, Geoffrey

    2002-01-01

    .... Focusing on the 1996 Taiwan Crisis, by means of event study analysis, this paper attempts to determine the extent to which this political shock affected the Taiwanese, and surrounding Japanese stock markets...

  13. Reverse translation of adverse event reports paves the way for de-risking preclinical off-targets.

    Science.gov (United States)

    Maciejewski, Mateusz; Lounkine, Eugen; Whitebread, Steven; Farmer, Pierre; DuMouchel, William; Shoichet, Brian K; Urban, Laszlo

    2017-08-08

    The Food and Drug Administration Adverse Event Reporting System (FAERS) remains the primary source for post-marketing pharmacovigilance. The system is largely un-curated, unstandardized, and lacks a method for linking drugs to the chemical structures of their active ingredients, increasing noise and artefactual trends. To address these problems, we mapped drugs to their ingredients and used natural language processing to classify and correlate drug events. Our analysis exposed key idiosyncrasies in FAERS, for example reports of thalidomide causing a deadly ADR when used against myeloma, a likely result of the disease itself; multiplications of the same report, unjustifiably increasing its importance; correlation of reported ADRs with public events, regulatory announcements, and with publications. Comparing the pharmacological, pharmacokinetic, and clinical ADR profiles of methylphenidate, aripiprazole, and risperidone, and of kinase drugs targeting the VEGF receptor, demonstrates how underlying molecular mechanisms can emerge from ADR co-analysis. The precautions and methods we describe may enable investigators to avoid confounding chemistry-based associations and reporting biases in FAERS, and illustrate how comparative analysis of ADRs can reveal underlying mechanisms.

  14. Event-by-event fluctuations of mean transverse momentum in Au+Au ...

    Indian Academy of Sciences (India)

    Abstract. We report results on event-by-event fluctuations in mean transverse momentum in ... 0.150 < pt < 2.0 GeV/c are selected for the analysis. ... for gamma distribution are the mean event multiplicity, M, and the mean and variance of.

  15. Implementation of the Jakarta Goes Pink Special Event by Lovepink Indonesia

    Directory of Open Access Journals (Sweden)

    Nugroho Ajie Hartono

    2017-01-01

    This study aims to determine the event management process used by Lovepink Indonesia in Jakarta Goes Pink 2015 to raise awareness. The study uses the Event Management Process by Joe Goldblatt as its cornerstone concept, within a qualitative, descriptive research approach. Data were collected through in-depth interviews, passive participant observation, and literature study. Key informants were selected by purposive sampling. Data were analyzed in three stages: data reduction, data display, and conclusion drawing. Validity of the data was established through triangulation of data sources. The results indicate that the event management process of Jakarta Goes Pink 2015 can be categorized into several stages: research, design, planning, coordination, and evaluation. The research stage analyzed the situation regarding Indonesian community awareness of breast cancer; it indicated that awareness is still low, especially compared to Pink Ribbon activities abroad, and drew on the evaluation of the previous year's event. Event design used the color element to bring the Jakarta Goes Pink concept alive, educated visitors about breast cancer, and incorporated entertainment within a fair-and-festival concept. Planning involved setting goals, determining the time and location, budgeting, human resource management, and publication through social media and media partners. Coordination managed communication with external parties such as communities, volunteers, sponsors, and media partners, as well as with the internal committee of Jakarta Goes Pink itself. Evaluation included an evaluation of the event, direct feedback from those closest to it, and a count of the amount and nature of media coverage. Keywords: Special Event, Awareness, Organization, Breast Cancer, Event Management Process

  16. Analysis methodology for the post-trip return to power steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-06-01

    An analysis of Steam Line Break (SLB) events which result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. The analysis methodology for post-trip RTP SLB is quite different from that for non-RTP SLB and is more difficult. Therefore, it is necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to local heatup in the vicinity of the stuck CEA, and compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance: Departure from Nucleate Boiling Ratio (DNBR) and Linear Heat Generation Rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  17. Exploratory trend and pattern analysis of 1981 through 1983 Licensee Event Report data. Main report. Volume 1

    International Nuclear Information System (INIS)

    Hester, O.V.; Groh, M.R.; Farmer, F.G.

    1986-10-01

    This report presents an overview of the 1981 through 1983 Sequence Coding and Search System (SCSS) data base, which contains nuclear power plant operational data derived from Licensee Event Reports (LERs) submitted to the United States Nuclear Regulatory Commission (USNRC). Both overall event reporting and events related to specific components, subsystems, systems, and personnel are discussed. At all of these levels of information, software is used to generate count data for contingency tables. Contingency table analysis is the main tool for the trend and pattern analysis. The tables focus primarily on faults associated with various components and other items of interest across different plants. The abstracts and other SCSS information on the LERs accounting for unusual counts in the tables were examined to gain insights from the events. Trends and patterns in LER reporting, and in the reporting of events for various component groups, were examined through log-linear modeling techniques.
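
Contingency table analysis of count data like the tables described above can be sketched with a hand-rolled Pearson chi-square statistic. The fault counts below are invented; a real trend analysis would also compute a p-value or fit a log-linear model.

```python
def pearson_chi2(table):
    """Pearson chi-square statistic for a two-way contingency
    table given as a list of rows of counts."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Invented counts: two component fault types (rows) at two plants (columns)
stat = pearson_chi2([[30, 10], [20, 40]])
print(round(stat, 3))
```

A large statistic relative to the chi-square distribution with (rows-1)(cols-1) degrees of freedom flags the kind of "unusual counts" the report says were examined further.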

  18. Internal event analysis for Laguna Verde Unit 1 Nuclear Power Plant. Accident sequence quantification and results

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1994-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the Internal Event Analysis for Laguna Verde Unit 1 Nuclear Power Plant, CNSNS-TR 004, in five volumes. The reports are organized as follows: CNSNS-TR 004 Volume 1: Introduction and Methodology. CNSNS-TR 004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR 004 Volume 3: System Analysis. CNSNS-TR 004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR 005 Volume 5: Appendices A, B and C. This volume presents the development of the dependent failure analysis, the treatment of the support system dependencies, the identification of the shared-component dependencies, and the treatment of common cause failure. Also presented is the identification of the main human actions considered, along with the possible recovery actions included. The development of the data base, and the assumptions and limitations in the data base, are also described in this volume. The accident sequence quantification process and the resolution of the core vulnerable sequences are presented. In this volume, the source and treatment of uncertainties associated with failure rates, component unavailabilities, initiating event frequencies, and human error probabilities are also presented. Finally, the main results and conclusions of the Internal Event Analysis for Laguna Verde Nuclear Power Plant are presented. The total core damage frequency calculated is 9.03×10⁻⁵ per year for internal events. The most dominant accident sequences found are the transients involving the loss of offsite power, the station blackout accidents, and the anticipated transients without SCRAM (ATWS). (Author)

  19. ANCA: Anharmonic Conformational Analysis of Biomolecular Simulations.

    Science.gov (United States)

    Parvatikar, Akash; Vacaliuc, Gabriel S; Ramanathan, Arvind; Chennubhotla, S Chakra

    2018-05-08

    Anharmonicity in time-dependent conformational fluctuations is noted to be a key feature of the functional dynamics of biomolecules. Although anharmonic events are rare, long-timescale (μs-ms and beyond) simulations facilitate probing of such events. We have previously developed quasi-anharmonic analysis to resolve higher-order spatial correlations and characterize anharmonicity in biomolecular simulations. In this article, we have extended this toolbox to resolve higher-order temporal correlations and built a scalable Python package called anharmonic conformational analysis (ANCA). ANCA has modules to: 1) measure anharmonicity in the form of higher-order statistics and its variation as a function of time, 2) output a storyboard representation of the simulations to identify key anharmonic conformational events, and 3) identify putative anharmonic conformational substates and visualize transitions between these substates. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
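
The idea behind measuring anharmonicity with higher-order statistics can be sketched with excess kurtosis as the fourth-order statistic: near zero for Gaussian (harmonic-like) fluctuations, large and positive for heavy-tailed ones. This is an illustration of the concept only, not ANCA's actual implementation.

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth-order statistic used here as a simple anharmonicity
    indicator: ~0 for Gaussian (harmonic-like) fluctuations,
    large and positive for heavy-tailed (anharmonic) ones."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return float((z ** 4).mean() - 3.0)

rng = np.random.default_rng(0)
harmonic = rng.normal(size=100_000)     # Gaussian stand-in for harmonic motion
anharmonic = rng.laplace(size=100_000)  # heavy-tailed stand-in for rare events
print(excess_kurtosis(harmonic), excess_kurtosis(anharmonic))
```

Computing this statistic in sliding windows along a trajectory gives its "variation as a function of time" in the sense of module 1 above.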

  20. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and for the rain flood events with a high runoff coefficient. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
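
A systematic/random error split of the kind used in such evaluations can be sketched as a mean bias plus a centered root-mean-square error. This is an illustrative decomposition under that assumption, not necessarily the exact metrics of the study.

```python
import numpy as np

def error_components(sim, ref):
    """Split the error of a simulated series against a reference into
    a systematic part (mean bias) and a random part (centered RMSE)."""
    sim = np.asarray(sim, dtype=float)
    ref = np.asarray(ref, dtype=float)
    bias = sim.mean() - ref.mean()  # systematic component
    crmse = np.sqrt(np.mean(((sim - sim.mean()) - (ref - ref.mean())) ** 2))
    return bias, crmse

# Toy hydrographs: the simulation overestimates depth but matches the shape
bias, crmse = error_components([2.0, 3.0, 4.0], [1.0, 2.0, 3.0])
print(bias, crmse)  # a uniform offset makes the error purely systematic
```

Applying the decomposition separately to hyetograph and hydrograph pairs would show the dampening effect described above as a reduction in both components for the runoff series.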

  1. Cryogenic dark matter search (CDMS II): Application of neural networks and wavelets to event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Attisha, Michael J. [Brown U.

    2006-01-01

    The Cryogenic Dark Matter Search (CDMS) experiment is designed to search for dark matter in the form of Weakly Interacting Massive Particles (WIMPs) via their elastic scattering interactions with nuclei. This dissertation presents the CDMS detector technology and the commissioning of two towers of detectors at the deep underground site in Soudan, Minnesota. CDMS detectors comprise crystals of Ge and Si at temperatures of 20 mK which provide ~keV energy resolution and the ability to perform particle identification on an event-by-event basis. Event identification is performed via a two-fold interaction signature: an ionization response and an athermal phonon response. Photons and charged particles result in electron recoils in the crystal, while neutrons and WIMPs result in nuclear recoils. Since the ionization response is quenched by a factor ~3 (2) in Ge (Si) for nuclear recoils compared to electron recoils, the relative amplitude of the two detector responses allows discrimination between recoil types. The primary source of background events in CDMS arises from electron recoils in the outer 50 µm of the detector surface, which have a reduced ionization response. We develop a quantitative model of this 'dead layer' effect and successfully apply the model to Monte Carlo simulation of CDMS calibration data. Analysis of data from the two-tower run of March-August 2004 is performed, resulting in the world's most sensitive limits on the spin-independent WIMP-nucleon cross-section, with a 90% C.L. upper limit of 1.6 × 10⁻⁴³ cm² on Ge for a 60 GeV WIMP. An approach to performing surface event discrimination using neural networks and wavelets is developed. A Bayesian methodology to classifying surface events using neural networks is found to provide an optimized method based on minimization of the expected dark matter limit. The discrete wavelet analysis of CDMS phonon pulses improves surface event discrimination in conjunction with the neural

  2. A Fourier analysis of extreme events

    DEFF Research Database (Denmark)

    Mikosch, Thomas Valentin; Zhao, Yuwei

    2014-01-01

    The extremogram is an asymptotic correlogram for extreme events constructed from a regularly varying stationary sequence. In this paper, we define a frequency domain analog of the correlogram: a periodogram generated from a suitable sequence of indicator functions of rare events. We derive basic properties of the periodogram, such as the asymptotic independence at the Fourier frequencies, and use this property to show that weighted versions of the periodogram are consistent estimators of a spectral density derived from the extremogram.
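
The construction described above can be sketched numerically: threshold the series at a high quantile, form the centered indicator sequence of rare events, and take its periodogram at the Fourier frequencies. This is a rough illustration of the idea, ignoring the weighting and asymptotics treated in the paper.

```python
import numpy as np

def extreme_event_periodogram(x, q=0.95):
    """Periodogram of the centered indicator series 1{X_t > u},
    where u is the q-quantile of x, at the Fourier frequencies."""
    x = np.asarray(x, dtype=float)
    ind = (x > np.quantile(x, q)).astype(float)
    ind -= ind.mean()  # center so the zero frequency carries no mass
    n = len(ind)
    per = np.abs(np.fft.rfft(ind)) ** 2 / n
    return np.fft.rfftfreq(n), per

rng = np.random.default_rng(1)
freqs, per = extreme_event_periodogram(rng.normal(size=256))
```

For an i.i.d. series the resulting periodogram is flat on average; peaks would indicate periodic clustering of the extremes.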

  3. A tandem regression-outlier analysis of a ligand cellular system for key structural modifications around ligand binding.

    Science.gov (United States)

    Lin, Ying-Ting

    2013-04-30

    A tandem technique of hard equipment is often used for the chemical analysis of a single cell: the first part isolates the wanted chemicals from the bulk of the cell; the second part detects the important identities. To identify the key structural modifications around ligand binding, the present study aims to develop a counterpart tandem technique for cheminformatics, in which a statistical regression and its outliers act as the computational separation step. A PPARγ (peroxisome proliferator-activated receptor gamma) agonist cellular system was subjected to such an investigation. Results show that this tandem regression-outlier analysis, or the prioritization of the context equations tagged with features of the outliers, is an effective cheminformatics regression technique for detecting key structural modifications, as well as their likely impact on ligand binding. The key structural modifications around ligand binding are effectively extracted or characterized out of cellular reactions. This is because molecular binding is the paramount factor in such a ligand cellular system, and key structural modifications around ligand binding are expected to create outliers. Therefore, such outliers can be captured by this tandem regression-outlier analysis.
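
The separation step of a regression-outlier analysis can be sketched as fitting a regression and flagging large-residual points for closer inspection. This is an illustrative toy, not the study's context-equation procedure.

```python
import numpy as np

def flag_outliers(x, y, z=2.0):
    """Fit a least-squares line, then flag points whose residuals
    exceed z standard deviations -- the 'separation' step."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return np.abs(resid) > z * resid.std(), slope, intercept

# Toy data: a linear trend with one 'structural modification'
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
y[7] += 10.0  # the outlier to be separated and then inspected
mask, slope, intercept = flag_outliers(x, y)
print(np.flatnonzero(mask))  # -> [7]
```

The "detection" half of the tandem then examines what the flagged points have in common, here playing the role of the structural features tagged to the outliers.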

  4. Safety assessment of the advanced CANDU reactor in postulated LOCA/LOECC events

    International Nuclear Information System (INIS)

    Hazen Hezhi Fan; Zoran Bilanovic

    2005-01-01

    The Advanced CANDU Reactor™ (ACR™) retains the proven strengths and features of CANDU reactors, and incorporates innovative new features and state-of-the-art technology. In addition to the enhanced emergency core cooling system, the reserve water system is designed to be available to inject reserve water by gravity into the reactor inlet headers after a postulated loss-of-coolant accident (LOCA). To assist in the ACR design and in the analysis of beyond-design-basis events, simulations are needed to demonstrate the effectiveness of these two independent systems on core cooling, and to assess the consequences of the postulated accident coincident with the impairment of either of the two systems. The current paper presents an assessment of a postulated large LOCA coincident with loss of the emergency core cooling (LOECC) system. A postulated LOCA/LOECC has very low probability, in the range usually associated with severe core damage events. However, in the CANDU design, including ACR, the presence of moderator water surrounding the fuel channels acts as an effective heat sink, together with other safety features, to prevent severe core damage following a postulated LOCA/LOECC. Therefore, it is possible to analyse LOCA/LOECC using the same deterministic tools that are used for analysis of events with much higher frequencies, in the design basis event range. The assessment is conducted based on the current ACR-700 design; however, the analysis methodology, scope, computer tools, and results are, in principle, applicable to larger ACR designs. This assessment includes system (circuit), fuel channel, and fuel analyses. Some assessment results are needed in subsequent moderator analysis and containment analysis. In the assessment, several simulations were performed to analyse the full circuit and individual fuel channel transient behaviours, as well as the fission product release behaviour. The assessment has captured the key responses of the reactor heat

  5. Fuel element thermo-mechanical analysis during transient events using the FMS and FETMA codes

    International Nuclear Information System (INIS)

    Hernandez Lopez Hector; Hernandez Martinez Jose Luis; Ortiz Villafuerte Javier

    2005-01-01

    At the Instituto Nacional de Investigaciones Nucleares of Mexico, the Fuel Management System (FMS) software package has long been used to simulate the operation of a BWR nuclear power plant in steady state as well as in transient events. To evaluate the fuel element thermo-mechanical performance during transient events, an interface between the FMS codes and our own Fuel Element Thermo-Mechanical Analysis (FETMA) code is currently being developed and implemented. In this work, results for the thermo-mechanical behavior of fuel rods in the hot channel during the simulation of transient events of a BWR nuclear power plant are shown. The transient events considered for this work are a load rejection and a feedwater control failure, which are among the most important events that can occur in a BWR. The results showed that conditions leading to fuel rod failure never appeared in either event. It is also shown that a load rejection transient is more demanding in terms of safety than a feedwater controller failure. (authors)

  6. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    Science.gov (United States)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
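
The core idea of treating heartbeats as an event series can be sketched by placing unit impulses at event times on a common sampling grid and computing a cross-spectrum via the FFT. This is a rough illustration of the approach, not the authors' exact estimator.

```python
import numpy as np

def event_cross_spectrum(events_a, events_b, n):
    """Cross-spectrum of two event series, each given as integer event
    times on a common grid of n samples."""
    a = np.zeros(n)
    b = np.zeros(n)
    a[np.asarray(events_a)] = 1.0
    b[np.asarray(events_b)] = 1.0
    a -= a.mean()
    b -= b.mean()
    fa, fb = np.fft.rfft(a), np.fft.rfft(b)
    return np.fft.rfftfreq(n), fa * np.conj(fb) / n

# Identical event trains: the cross-spectrum reduces to a real,
# non-negative power spectrum of the event series
freqs, cross = event_cross_spectrum([3, 50, 97], [3, 50, 97], 128)
```

With two different event trains (e.g., heartbeat and breath onsets), the magnitude and phase of the cross-spectrum at a shared frequency are what distinguish modulation from synchronization in this framework.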

  7. Analysis of warm convective rain events in Catalonia

    Science.gov (United States)

    Ballart, D.; Figuerola, F.; Aran, M.; Rigo, T.

    2009-09-01

    Between the end of September and November, events with high amounts of rainfall are quite common in Catalonia. The high sea surface temperature of the Mediterranean Sea near the Catalan coast is one of the most important factors that help the development of this type of storm. Some of these events have particular characteristics: elevated rain rate during short time periods, not very deep convection, and low lightning activity. Consequently, the use of remote sensing tools for surveillance is quite limited. The high rain efficiency is caused by internal mechanisms of the clouds and by the air mass in which the precipitation structure develops. As aforementioned, the contribution of the sea to the air mass is very relevant, not only through the increase of large condensation nuclei, but also through the high temperature of the low layers of the atmosphere, which allows clouds with 5 or 6 km of particles in the liquid phase; in fact, the freezing level in these clouds can be found at about -15°C. Due to these characteristics, this type of rainy structure can produce high quantities of rainfall in a relatively brief period of time and, if quasi-stationary, precipitation values at the surface can be very important. From the point of view of remote sensing tools, the cloud nature implies that the tools and methodologies commonly used for the analysis of heavy rain events are not useful, for the following reasons: lightning is rarely observed, the cloud-top temperatures are not cold enough to be enhanced in the satellite imagery, and reflectivity radar values are lower than in other heavy rain cases. The third point to take into account is the vulnerability of the affected areas. A high percentage of the Catalan population lives in the coastal region. In the central coast of Catalonia, the urban areas are surrounded by a not very high mountain range with small basins and

  8. Video Analysis Verification of Head Impact Events Measured by Wearable Sensors.

    Science.gov (United States)

    Cortes, Nelson; Lincoln, Andrew E; Myer, Gregory D; Hepburn, Lisa; Higgins, Michael; Putukian, Margot; Caswell, Shane V

    2017-08-01

    Wearable sensors are increasingly used to quantify the frequency and magnitude of head impact events in multiple sports, but there is a paucity of evidence verifying the head impact events they record. The purpose of this study was to utilize video analysis to verify head impact events recorded by wearable sensors and to describe their frequency and magnitude. Cohort study (diagnosis); Level of evidence, 2. Thirty male (mean age, 16.6 ± 1.2 years; mean height, 1.77 ± 0.06 m; mean weight, 73.4 ± 12.2 kg) and 35 female (mean age, 16.2 ± 1.3 years; mean height, 1.66 ± 0.05 m; mean weight, 61.2 ± 6.4 kg) players volunteered to participate in this study during the 2014 and 2015 lacrosse seasons. Participants were instrumented with GForceTracker (GFT; boys) and X-Patch sensors (girls). Simultaneous game video was recorded by a trained videographer using a single camera located at the highest midfield location. One-third of the field was framed and panned to follow the ball during games. Videographic and accelerometer data were time-synchronized. Head impact counts were compared with video recordings and were deemed valid if (1) the linear acceleration was ≥20 g, (2) the player was identified on the field, (3) the player was in camera view, and (4) the head impact mechanism could be clearly identified. Descriptive statistics of peak linear acceleration (PLA) and peak rotational velocity (PRV) were calculated for all verified head impacts ≥20 g. For the boys, a total of 1063 impacts (2014: n = 545; 2015: n = 518) were logged by the GFT between game start and end times (mean PLA, 46 ± 31 g; mean PRV, 1093 ± 661 deg/s) during 368 player-games. Of these impacts, 690 were verified via video analysis (65%; mean PLA, 48 ± 34 g; mean PRV, 1242 ± 617 deg/s). The X-Patch sensors, worn by the girls, recorded a total of 180 impacts during the course of the games, and 58 (2014: n = 33; 2015: n = 25) were verified via video analysis (32%; mean PLA, 39 ± 21 g; mean PRV, 1664

  9. Analysis of early initiating event(s) in radiation-induced thymic lymphomagenesis

    International Nuclear Information System (INIS)

    Muto, Masahiro; Ying Chen; Kubo, Eiko; Mita, Kazuei

    1996-01-01

Since the T cell receptor rearrangement is a sequential process and unique to the progeny of each clone, we investigated the early initiating events in radiation-induced thymic lymphomagenesis by comparing the oncogenic alterations with the pattern of γ T cell receptor (TCR) rearrangements. We reported previously that after leukemogenic irradiation, preneoplastic cells developed, albeit infrequently, from thymic leukemia antigen-2-positive (TL-2+) thymocytes. Limited numbers of TL-2+ cells from individual irradiated B10.Thy-1.1 mice were injected into B10.Thy-1.2 mice intrathymically, and the common genetic changes among the donor-type T cell lymphomas were investigated with regard to p53 gene and chromosome aberrations. The results indicated that some mutations in the p53 gene had taken place in these lymphomas, but there was no common mutation among the donor-type lymphomas from individual irradiated mice, suggesting that these mutations were late-occurring events in the process of oncogenesis. On the other hand, there were common chromosome aberrations or translocations such as trisomy 15, t(7F; 10C), t(1A; 13D) or t(6A; XB) among the donor-type lymphomas derived from half of the individual irradiated mice. This indicated that the aberrations/translocations, which occurred in single progenitor cells at the early T cell differentiation either just before or after γ T cell receptor rearrangements, might be important candidates for initiating events. In the donor-type lymphomas from the other half of the individual irradiated mice, microgenetic changes were suggested to be initial events and also might take place in single progenitor cells just before or right after γ TCR rearrangements. (author)

  10. Analysis of the highest transverse energy events seen in the UA1 detector at the Spp-barS collider

    International Nuclear Information System (INIS)

    1987-06-01

    The first full solid angle analysis is presented of large transverse energy events in pp-bar collisions at the CERN collider. Events with transverse energies in excess of 200 GeV at √s = 630 GeV are studied for any non-standard physics and quantitatively compared with expectations from perturbative QCD Monte Carlo models. A corrected differential cross section is presented. A detailed examination is made of jet profiles, event jet multiplicities and the fraction of the transverse energy carried by the two jets with the highest transverse jet energies. There is good agreement with standard theory for events with transverse energies up to the largest observed values (approx. √s/2) and the analysis shows no evidence for any non-QCD mechanism to account for the event characteristics. (author)

  11. Identification of homogeneous regions for rainfall regional frequency analysis considering typhoon event in South Korea

    Science.gov (United States)

    Heo, J. H.; Ahn, H.; Kjeldsen, T. R.

    2017-12-01

South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. However, traditionally, regional frequency analysis models did not consider this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events occurring only in some years (typhoon). The available annual maximum 24 hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum is from either (1) a typhoon event, or (2) a non-typhoon event. Then, a three-parameter GEV distribution was fitted to each sub-sample, along with a weighting parameter characterizing the proportion of historical events associated with typhoons. Spatial patterns of the model parameters were analyzed and showed that typhoon events are less commonly associated with annual maximum rainfall in the North-West part of the country (Seoul area), and more prevalent in the southern and eastern parts of the country, leading to the formation of two distinct typhoon regions: (1) North-West; and (2) Southern and Eastern. Using a leave-one-out procedure, the new regional frequency model was tested and compared to a more traditional index flood method. The results showed that the impact of typhoons on design events might previously have been underestimated in the Seoul area. This suggests that the mixture model should be preferred where the typhoon phenomenon is less frequent yet can still have a significant effect on the rainfall-frequency curve. This research was supported by a grant(2017-MPSS31
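As a rough illustration of the mixture idea described in this record, the annual-maximum distribution can be written as F(x) = w·F_typhoon(x) + (1 − w)·F_non-typhoon(x), with a GEV fitted to each sub-sample and w the proportion of typhoon years. The sketch below is a hypothetical implementation, not the authors' code; the use of SciPy's `genextreme`, the synthetic data, and all function names are assumptions.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import brentq

def fit_mixture(typhoon_maxima, non_typhoon_maxima):
    """Fit a GEV to each sub-sample; w = proportion of typhoon years."""
    n_t, n_nt = len(typhoon_maxima), len(non_typhoon_maxima)
    w = n_t / (n_t + n_nt)
    gev_t = genextreme.fit(typhoon_maxima)       # (shape, loc, scale)
    gev_nt = genextreme.fit(non_typhoon_maxima)
    return w, gev_t, gev_nt

def mixture_cdf(x, w, gev_t, gev_nt):
    # Annual maximum comes from a typhoon year with probability w.
    return w * genextreme.cdf(x, *gev_t) + (1 - w) * genextreme.cdf(x, *gev_nt)

def design_rainfall(return_period, w, gev_t, gev_nt, lo=0.0, hi=2000.0):
    """Invert the mixture CDF numerically for a T-year design event."""
    p = 1.0 - 1.0 / return_period
    return brentq(lambda x: mixture_cdf(x, w, gev_t, gev_nt) - p, lo, hi)

# Synthetic 24 h annual maxima (mm); typhoon years drawn heavier-tailed.
rng = np.random.default_rng(7)
typhoon = genextreme.rvs(-0.2, loc=180, scale=60, size=40, random_state=rng)
monsoon = genextreme.rvs(0.0, loc=120, scale=30, size=60, random_state=rng)
w, gev_t, gev_nt = fit_mixture(typhoon, monsoon)
x100 = design_rainfall(100, w, gev_t, gev_nt)
```

With the heavier-tailed typhoon component included, the 100-year design value exceeds the one implied by the non-typhoon sample alone, which is the effect the study attributes to previously underestimated typhoon risk.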

  12. Disruptive event uncertainties in a perturbation approach to nuclear waste repository risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, T.F.

    1980-09-01

    A methodology is developed for incorporating a full range of the principal forecasting uncertainties into a risk analysis of a nuclear waste repository. The result of this methodology is a set of risk curves similar to those used by Rasmussen in WASH-1400. The set of curves is partially derived from a perturbation approach to analyze potential disruptive event sequences. Such a scheme could be useful in truncating the number of disruptive event scenarios and providing guidance to those establishing data-base development priorities.

  13. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
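The clustering-based detection step can be pictured with a minimal NumPy-only sketch: fit cluster centers to the bulk of the data, then flag points unusually far from every center as outliers. This is illustrative only; the actual algorithm, features, and thresholds used in the framework are not specified in the abstract.

```python
import numpy as np

def kmeans(X, k, iters=50):
    # Deterministic init from the first k points (illustrative; k-means++ is better).
    centers = X[:k].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):            # guard against empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def detect_outliers(X, k=2, z=3.0):
    """Flag points whose distance to the nearest center is anomalously large."""
    centers = kmeans(X, k)
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2).min(axis=1)
    return d > d.mean() + z * d.std()          # unsupervised threshold

# Toy example: two "normal" blobs plus three injected anomalies.
rng = np.random.default_rng(42)
blob_a = rng.normal([0.0, 0.0], 0.5, size=(150, 2))
blob_b = rng.normal([5.0, 5.0], 0.5, size=(150, 2))
anomalies = np.array([[20.0, -20.0], [21.0, -19.0], [20.0, -21.0]])
X = np.vstack([blob_a, blob_b, anomalies])
mask = detect_outliers(X, k=2)
```

Grouping flagged points that are adjacent in space and time into "events", and ranking events by size or rarity, would then correspond to steps (2) and (3) described above.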

  14. Formal Analysis of BPMN Models Using Event-B

    Science.gov (United States)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.

  15. High speed motion neutron radiography of dynamic events

    International Nuclear Information System (INIS)

    Robinson, A.H.; Barton, J.P.

    1983-01-01

The development of a technique that permits neutron radiographic analysis of dynamic processes over a period lasting from one to ten milliseconds is described. The key to the technique is the use of a neutron pulse broad enough to span the duration of a brief event and intense enough to allow recording of the results on a high-speed movie film at frame rates of 10,000 frames/sec. Some typical application results in ballistic studies and two-phase flow are shown and discussed. The use of scintillator screens in the high-speed motion neutron radiography system is summarized and the statistical limitations of the technique are discussed.

  16. A Survey of Key Technology of Network Public Opinion Analysis

    Directory of Open Access Journals (Sweden)

    Li Su Ying

    2016-01-01

Full Text Available The internet has become an important base for internet users to make comments because of its interactivity and fast dissemination. The outbreak of internet public opinion has become a major risk for network information security. Domestic and foreign researchers have carried out extensive and in-depth studies on public opinion. Fruitful results have been achieved in basic theory research, emergency handling, and other aspects of public opinion. But research on public opinion in China is still at an initial stage, and the key technologies of public opinion analysis remain a starting point for in-depth study and discussion.

  17. An Analysis of Key Factors in Developing a Smart City

    Directory of Open Access Journals (Sweden)

    Aidana Šiurytė

    2016-06-01

Full Text Available The concept Smart City is used widely but it is perceived differently as well. A literature review reveals the key elements of the Smart City: Information and Communication Technologies and Smart Citizens. Nevertheless, raising public awareness is not a priority of the local municipalities that are trying to develop their cities. A focus group discussion aims to analyse citizens' insights with regard to the Smart City and their contribution to its creation. A case study of Vilnius examines the position of the municipality in developing the city as smart. The study contains suggestions for the improvement of communication in the city. Methods employed: comparative literature analysis, focus group investigation, case study.

  18. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-28

The purpose of this report was to document the process leading to, and the results of, the development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, the volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathways' contributions to the BDCFs. BDCFs for a volcanic eruption, when combined with the concentration of radioactivity deposited by the eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, as is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses to the receptor of interest are calculated from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  19. Selfish memes: An update of Richard Dawkins’ bibliometric analysis of key papers in sociobiology

    OpenAIRE

    Aaen-Stockdale, Craig

    2017-01-01

    This is an Open Access journal available from http://www.mdpi.com/ In the second edition of The Selfish Gene, Richard Dawkins included a short bibliometric analysis of key papers instrumental to the sociobiological revolution, the intention of which was to support his proposal that ideas spread within a population in an epidemiological manner. In his analysis, Dawkins primarily discussed the influence of an article by British evolutionary biologist William Donald Hamilton which had introdu...

  20. Social Network Analysis Identifies Key Participants in Conservation Development.

    Science.gov (United States)

    Farr, Cooper M; Reed, Sarah E; Pejchar, Liba

    2018-05-01

Understanding patterns of participation in private lands conservation, which is often implemented voluntarily by individual citizens and private organizations, could improve its effectiveness at combating biodiversity loss. We used social network analysis (SNA) to examine participation in conservation development (CD), a private land conservation strategy that clusters houses in a small portion of a property while preserving the remaining land as protected open space. Using data from public records for six counties in Colorado, USA, we compared CD participation patterns among counties and identified actors that most often work with others to implement CDs. We found that social network characteristics differed among counties. The network density, or proportion of connections in the network, varied from less than 2% to nearly 15%, and was higher in counties with smaller populations and fewer CDs. Centralization, or the degree to which connections are held disproportionately by a few key actors, was not correlated strongly with any county characteristics. Network characteristics were not correlated with the prevalence of wildlife-friendly design features in CDs. The most highly connected actors were biological and geological consultants, surveyors, and engineers. Our work demonstrates a new application of SNA to land-use planning, in which CD network patterns are examined and key actors are identified. For better conservation outcomes of CD, we recommend using network patterns to guide strategies for outreach and information dissemination, and engaging with highly connected actor types to encourage widespread adoption of best practices for CD design and stewardship.
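The two network measures reported here, density (proportion of possible ties present) and degree centralization (Freeman's formulation), are standard SNA quantities. A minimal illustration on a hypothetical toy network (not the study's data):

```python
def density(n, edges):
    """Proportion of possible undirected ties present: 2m / (n(n-1))."""
    return 2 * len(edges) / (n * (n - 1))

def degree_centralization(n, edges):
    """Freeman degree centralization: how concentrated ties are on few actors."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    dmax = max(deg)
    # Normalized by the maximum possible sum, attained by a star graph.
    return sum(dmax - d for d in deg) / ((n - 1) * (n - 2))

# Toy 5-actor network: actor 0 is a highly connected consultant.
edges = [(0, 1), (0, 2), (0, 3), (0, 4), (1, 2)]
print(round(density(5, edges), 2))                 # 5 ties of 10 possible -> 0.5
print(round(degree_centralization(5, edges), 2))   # 10/12 -> 0.83
```

A pure star network would give a centralization of 1.0, the pattern in which a few actor types (here, consultants, surveyors, and engineers) hold a disproportionate share of the connections.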

  1. SENTINEL EVENTS

    Directory of Open Access Journals (Sweden)

    Andrej Robida

    2004-09-01

Full Text Available Background. The objective of this article is to present two years of statistics on sentinel events in hospitals. Results of a survey on sentinel events and the attitude of hospital leaders and staff are also included, and some recommendations regarding patient safety and the handling of sentinel events are given.Methods. In March 2002 the Ministry of Health introduced a voluntary reporting system for sentinel events in Slovenian hospitals. Sentinel events were analyzed according to the place of the event, its content, and root causes. To present the results of the first year, a conference for hospital directors and medical directors was organized. A survey was conducted among the participants with the purpose of gathering information about their views on sentinel events. One hundred questionnaires were distributed.Results. Sentinel events. There were 14 reports of sentinel events in the first year and 7 in the second. In 4 cases reports were received only after written reminders were sent to the responsible persons; in one case no report was obtained. There were 14 deaths: 5 of these were in-hospital suicides, 6 were due to an adverse event, and 3 were unexplained. Events not leading to death were a suicide attempt, a wrong-side surgery, a paraplegia after spinal anaesthesia, a fall with a femoral neck fracture, damage to the spleen during pleural space drainage, inadvertent embolization with absolute alcohol into a femoral artery, and a physical attack on a physician by a patient. Analysis of the root causes of the sentinel events showed that in most cases processes were inadequate.Survey. One quarter of those surveyed did not know about the sentinel event reporting system. 16% had actual problems when reporting events and 47% believed that there was an attempt to blame individuals. Obstacles to reporting events openly were fear of consequences, moral shame, fear of public disclosure of the names of participants in the event, and exposure in the mass media. 
The majority of

  2. Analysis of Malaysian Nuclear Agency Key Performance Indicator (KPI) 2005-2013

    International Nuclear Information System (INIS)

    Aisya Raihan Abdul Kadir; Hazmimi Kasim; Azlinda Aziz; Noriah Jamal

    2014-01-01

Malaysia Nuclear Agency (Nuclear Malaysia) was established on 19 September 1972. Since its inception, Nuclear Malaysia has been entrusted with the responsibility to introduce and promote nuclear science and technology for national development. After more than 40 years of operation, Nuclear Malaysia remains significant as an excellent organization of science, technology and innovation. This paper presents an analysis of the key performance indicator (KPI) achievements in 2005-2013 as an indicator of the role of Nuclear Malaysia as a national research institution, established to promote, develop and encourage the application of nuclear technology. (author)

  3. AN ANALYSIS OF RISK EVENTS IN THE OIL-TANKER MAINTENANCE BUSINESS

    Directory of Open Access Journals (Sweden)

    Roque Rabechini Junior

    2012-12-01

Full Text Available This work presents the results of an investigation into risk events and their respective causes, carried out in ship-maintenance undertakings in the logistics sector of the Brazilian oil industry. Its theoretical and conceptual positioning lies in aspects of risk management that support decision making by executives in the tanker-maintenance business. The case-study method was used, with a qualitative approach of an exploratory nature, and a descriptive format was chosen for presenting the data. Through the analysis of 75 risk events in tanker-docking projects it was possible to extract the eight of greatest relevance. The risk analysis facilitated the identification of actions aimed at their mitigation. As a conclusion, it was possible to propose a risk-framework model in four categories, HSE (health, safety and the environment), technical, externalities and management, designed to provide tanker-docking business executives and administrators with evidence of actions to assist in their decision-making processes. Finally, the authors identified proposals for further study and noted the principal limitations of the study.

  4. Systems analysis of the CANDU 3 Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Wolfgong, J.R.; Linn, M.A.; Wright, A.L.; Olszewski, M.; Fontana, M.H. [Oak Ridge National Lab., TN (United States)

    1993-07-01

This report presents the results of a systems failure analysis study of the CANDU 3 reactor design; the study was performed for the US Nuclear Regulatory Commission. As part of the study a review of the CANDU 3 design documentation was performed, a plant assessment methodology was developed, representative plant initiating events were identified for detailed analysis, and a plant assessment was performed. The results of the plant assessment included classification of the CANDU 3 event sequences that were analyzed, determination of CANDU 3 systems that are "significant to safety," and identification of key operator actions for the analyzed events.

  5. Device-independent secret-key-rate analysis for quantum repeaters

    Science.gov (United States)

    Holz, Timo; Kampermann, Hermann; Bruß, Dagmar

    2018-01-01

The device-independent approach to quantum key distribution (QKD) aims to establish a secret key between two or more parties with untrusted devices, potentially under full control of a quantum adversary. The performance of a QKD protocol can be quantified by the secret key rate, which can be lower bounded via the violation of an appropriate Bell inequality in a setup with untrusted devices. We study secret key rates in the device-independent scenario for different quantum repeater setups and compare them to their device-dependent analogs. The quantum repeater setups under consideration are the original protocol by Briegel et al. [Phys. Rev. Lett. 81, 5932 (1998), 10.1103/PhysRevLett.81.5932] and the hybrid quantum repeater protocol by van Loock et al. [Phys. Rev. Lett. 96, 240501 (2006), 10.1103/PhysRevLett.96.240501]. For a given repeater scheme and a given QKD protocol, the secret key rate depends on a variety of parameters, such as the gate quality or the detector efficiency. We systematically analyze the impact of these parameters and suggest optimized strategies.
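For context, a widely used asymptotic lower bound on the device-independent secret key rate of CHSH-based protocols (due to Pironio et al. 2009, not quoted from this abstract) ties the rate to the quantum bit error rate Q and the CHSH value S:

```latex
r \;\geq\; 1 - h(Q) - h\!\left(\frac{1 + \sqrt{(S/2)^2 - 1}}{2}\right),
\qquad
h(x) = -x \log_2 x - (1-x)\log_2(1-x),
```

so that any S > 2 (a Bell violation) reduces the eavesdropper's information term, and the bound becomes trivial when S = 2.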

  6. Key parameters analysis of hybrid HEMP simulator

    International Nuclear Information System (INIS)

    Mao Congguang; Zhou Hui

    2009-01-01

According to the new standards on the high-altitude electromagnetic pulse (HEMP) developed by the International Electrotechnical Commission (IEC), the target parameter requirements of the key structures of the hybrid HEMP simulator are decomposed. First, the influences of the different excitation sources and biconical structures on the key parameters of the radiated electric field waveform are investigated and analyzed. Then, based on these influence curves, the target parameter requirements of the pulse generator are proposed. Finally, appropriate parameters of the biconical structure and the excitation sources are chosen, and the computational result for the electric field in free space is presented. The results are of great value for the design of the hybrid HEMP simulator. (authors)

  7. Mixed Methods Analysis of Medical Error Event Reports: A Report from the ASIPS Collaborative

    National Research Council Canada - National Science Library

    Harris, Daniel M; Westfall, John M; Fernald, Douglas H; Duclos, Christine W; West, David R; Niebauer, Linda; Marr, Linda; Quintela, Javan; Main, Deborah S

    2005-01-01

    .... This paper presents a mixed methods approach to analyzing narrative error event reports. Mixed methods studies integrate one or more qualitative and quantitative techniques for data collection and analysis...

  8. The Frasnian-Famennian mass killing event(s), methods of identification and evaluation

    Science.gov (United States)

    Geldsetzer, H. H. J.

    1988-01-01

The absence of an abnormally high number of earlier Devonian taxa from Famennian sediments was repeatedly documented and can hardly be questioned. Primary recognition of the event(s) was based on paleontological data, especially common macrofossils. Most paleontologists place the disappearance of these common forms at the gigas/triangularis contact and this boundary was recently proposed as the Frasnian-Famennian (F-F) boundary. Not unexpectedly, alternate F-F positions were suggested caused by temporary Frasnian survivors or sudden post-event radiations of new forms. Secondary supporting evidence for mass killing event(s) is supplied by trace element and stable isotope geochemistry but not with the same success as for the K/T boundary, probably due to an additional 300 Ma of tectonic and diagenetic overprinting. Another tool is microfacies analysis which is surprisingly rarely used even though it can explain geochemical anomalies or paleontological overlap not detectable by conventional macrofacies analysis. The combination of microfacies analysis and geochemistry was applied at two F-F sections in western Canada and showed how interdependent the two methods are. Additional F-F sections from western Canada, western United States, France, Germany and Australia were sampled or re-sampled and await geochemical/microfacies evaluation.

  9. Secret key rates in quantum key distribution using Renyi entropies

    Energy Technology Data Exchange (ETDEWEB)

    Abruzzo, Silvestre; Kampermann, Hermann; Mertz, Markus; Bratzik, Sylvia; Bruss, Dagmar [Institut fuer Theoretische Physik III, Heinrich-Heine-Universitaet Duesseldorf (Germany)

    2010-07-01

The secret key rate r of a quantum key distribution protocol depends on the involved number of signals and the accepted "failure probability". We reconsider a method to calculate r focusing on the analysis of the privacy amplification given by R. Renner and R. Koenig (2005). This approach involves an optimization problem with an objective function depending on the Renyi entropy of the density operator describing the classical outcomes and the eavesdropper system. This problem is analyzed for a generic class of QKD protocols and the current research status is presented.
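For reference, the Renyi entropy of order α that enters this optimization is the standard quantity (definition supplied here for readability; it is not spelled out in the abstract):

```latex
H_\alpha(\rho) \;=\; \frac{1}{1-\alpha}\,\log_2 \operatorname{tr}\!\left(\rho^{\alpha}\right),
\qquad \alpha \in (0,1) \cup (1,\infty),
```

which recovers the von Neumann entropy in the limit α → 1.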

  10. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    Science.gov (United States)

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  11. Common Elements in Operational Events across Technologies

    International Nuclear Information System (INIS)

    Bley, Dennis C.; Wreathall, John; Cooper, Susan E.

    1998-01-01

    The ATHEANA project, sponsored by the US NRC, began as a study of operational events during low power and shutdown conditions at US commercial nuclear power plants. The purpose was to develop an approach for human reliability analysis that is supported by the experience; i.e., with the history of operational events. As the analysis of operational events progressed, a multidisciplinary framework evolved that can structure the analysis, highlighting significant aspects of each event. The ATHEANA multidisciplinary framework has been used as the basis for retrospective analysis of human performance in operational events in the nuclear power, chemical process, aviation, and medical technologies. The results of these analyses are exemplified by three operational events from different industries. Attention is drawn to those common elements in serious operational events that have negative impacts on human performance. (authors)

  12. Tight finite-key analysis for quantum cryptography.

    Science.gov (United States)

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-17

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies.

  13. Analysis of key technologies for virtual instruments metrology

    Science.gov (United States)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on the metrological testing of VIs. Key approaches and technologies for the metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is the evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics, with the support of the powerful computing capability of the PC. Another concern is the evaluation of software qualities such as the correctness, reliability, stability, security and real-time behavior of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods such as vulnerability scanning and penetration analysis. To enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for such an automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed automatic framework.

  14. Economic Analysis on Key Challenges for Sustainable Aquaculture Development

    DEFF Research Database (Denmark)

    Gedefaw Abate, Tenaw

    challenges that could obstruct its sustainable development, such as a lack of suitable feed, which includes fishmeal, fish oil and live feed, and negative environmental externalities. If the aquaculture industry is to reach its full potential, it must be both environmentally and economically sustainable...... environmental externalities. A sustainable supply of high-quality live feeds at reasonable prices is absolutely essential for aquaculture hatcheries because many commercially produced high-value marine fish larval species, such as flounder, grouper, halibut, tuna and turbot, require live feed for their early...... developmental stage. The key challenge in this regard is that the conventional used live feed items, Artemia and rotifers, are nutritionally deficient. Thus, the first main purpose of the thesis is carrying out an economic analysis of the feasibility of commercial production and the use of an alternative live...

  15. Analysis of the Power oscillations event in Laguna Verde Nuclear Power Plant. Preliminary Report

    International Nuclear Information System (INIS)

    Gonzalez M, V.M.; Amador G, R.; Castillo, R.; Hernandez, J.L.

    1995-01-01

    The event that occurred at Unit 1 of the Laguna Verde Nuclear Power Plant on January 24, 1995, is analyzed using the Ramona 3B code. During this event, Unit 1 experienced power oscillations while operating just before the transfer to high-speed recirculation pumps. The phenomenon was detected in time by the reactor operator, who shut the reactor down with a manual scram. The oscillations reached a maximum amplitude of 10.5% of nominal power, peak to peak, at a frequency of 0.5 Hz. Preliminary evaluations show that the event did not endanger fuel integrity. The results of simulating the reactor core with the Ramona 3B code show that the code is capable of modeling reactor oscillations. Nevertheless, a more detailed simulation of the event will be necessary to prove that the code can predict the onset of oscillations. Additional analysis is also needed to identify the factors that influence reactor stability, so that recommendations can be made to avoid the recurrence of this kind of event. (Author)
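Extracting the oscillation frequency reported above from a sampled power signal is a small spectral exercise. The sketch below builds a synthetic 0.5 Hz signal (the amplitude, sampling rate and duration are illustrative, not plant data) and recovers the dominant frequency with a plain DFT:

```python
import math
import cmath

def dominant_frequency(signal, dt):
    """Return the dominant nonzero frequency (Hz) of a uniformly sampled
    signal, using a direct DFT over the mean-removed samples."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        coeff = sum(centered[j] * cmath.exp(-2j * math.pi * k * j / n)
                    for j in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k / (n * dt)

# Synthetic 0.5 Hz power oscillation sampled at 10 Hz for 20 s
dt = 0.1
sig = [100 + 5.25 * math.sin(2 * math.pi * 0.5 * i * dt) for i in range(200)]
freq = dominant_frequency(sig, dt)
```

With a 20 s record the frequency resolution is 0.05 Hz, so a 0.5 Hz oscillation falls exactly on a DFT bin and is recovered without leakage.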

  16. Prehospital Interventions During Mass-Casualty Events in Afghanistan: A Case Analysis.

    Science.gov (United States)

    Schauer, Steven G; April, Michael D; Simon, Erica; Maddry, Joseph K; Carter, Robert; Delorenzo, Robert A

    2017-08-01

    Mass-casualty (MASCAL) events are known to occur in the combat setting. There are very limited data at this time from the Joint Theater (Iraq and Afghanistan) wars specific to MASCAL events. The purpose of this report was to provide preliminary data for the development of prehospital planning and guidelines. Cases were identified using the Department of Defense (DoD; Virginia USA) Trauma Registry (DoDTR) and the Prehospital Trauma Registry (PHTR). These cases were identified as part of a research study evaluating Tactical Combat Casualty Care (TCCC) guidelines; cases designated as, or associated with, denoted MASCAL events were included. Fifty subjects were identified during the course of this project. Explosives were the most common cause of injuries, and there was a wide range of vital signs. Tourniquet placement and pressure dressings were the most common interventions, followed by analgesia administration. Oral transmucosal fentanyl citrate (OTFC) was the most common parenteral analgesic drug administered. Most casualties were evacuated as "routine." Follow-up data were available for 36 of the subjects, and 97% were discharged alive. The most common prehospital interventions were tourniquet and pressure dressing hemorrhage control, along with pain medication administration. Larger data sets are needed to guide the development of in-theater MASCAL clinical practice guidelines. Schauer SG, April MD, Simon E, Maddry JK, Carter R III, Delorenzo RA. Prehospital interventions during mass-casualty events in Afghanistan: a case analysis. Prehosp Disaster Med. 2017;32(4):465-468.

  17. Multiple daytime nucleation events in semi-clean savannah and industrial environments in South Africa: analysis based on observations

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2013-06-01

    Recent studies have shown very high frequencies of atmospheric new particle formation in different environments in South Africa. Our aim here was to investigate the causes of two or three consecutive daytime nucleation events, followed by subsequent particle growth, during the same day. We analysed 108 and 31 such days observed in a polluted industrial environment and a moderately polluted rural environment, respectively, in South Africa. The analysis was based on two years of measurements at each site. After rejecting the days having notable changes in the air mass origin or local wind direction, i.e. the two major reasons for observed multiple nucleation events, we were able to investigate other factors causing this phenomenon. Clouds were present during, or in between, most of the analysed multiple particle formation events. Therefore, some of these events may have been single events, interrupted somehow by the presence of clouds. From further analysis, we propose that the first nucleation and growth event of the day was often associated with the mixing of a residual air layer rich in SO2 (oxidized to sulphuric acid) into the shallow surface-coupled layer. The second nucleation and growth event of the day usually started before midday and was sometimes associated with renewed SO2 emissions of industrial origin. However, it was also evident that vapours other than sulphuric acid were required for the particle growth during both events. This was especially the case when two simultaneously growing particle modes were observed. Based on our analysis, we conclude that the relative contributions of estimated H2SO4 and other vapours to the first and second nucleation and growth events of the day varied from day to day, depending on anthropogenic and natural emissions, as well as atmospheric conditions.

  18. Is the efficacy of antidepressants in panic disorder mediated by adverse events? A mediational analysis.

    Directory of Open Access Journals (Sweden)

    Irene Bighelli

    It has been hypothesised that the perception of adverse events in placebo-controlled antidepressant clinical trials may induce patients to conclude that they have been randomized to the active arm of the trial, leading to the breaking of blind. This may enhance the expectancies for improvement and the therapeutic response. The main objective of this study is to test the hypothesis that the efficacy of antidepressants in panic disorder is mediated by the perception of adverse events. The present analysis is based on a systematic review of published and unpublished randomised trials comparing antidepressants with placebo for panic disorder. The Baron and Kenny approach was applied to investigate the mediational role of adverse events in the relationship between antidepressant treatment and efficacy. Fourteen placebo-controlled antidepressant trials were included in the analysis. We found that: (a) antidepressant treatment was significantly associated with better treatment response (β = 0.127, 95% CI 0.04 to 0.21, p = 0.003); (b) antidepressant treatment was not associated with adverse events (β = 0.094, 95% CI -0.05 to 0.24, p = 0.221); (c) adverse events were negatively associated with treatment response (β = -0.035, 95% CI -0.06 to -0.05, p = 0.022). Finally, after adjustment for adverse events, the relationship between antidepressant treatment and treatment response remained statistically significant (β = 0.122, 95% CI 0.01 to 0.23, p = 0.039). These findings do not support the hypothesis that the perception of adverse events in placebo-controlled antidepressant clinical trials leads to the breaking of blind and to an artificial inflation of the efficacy measures. Based on these results, we argue that the moderate therapeutic effect of antidepressants in individuals with panic disorder is not an artefact, and therefore reflects a genuine effect that doctors can expect to replicate under real-world conditions.
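The Baron and Kenny approach used above boils down to three regressions: outcome on treatment (total effect), mediator on treatment, and outcome on treatment plus mediator (direct effect). A minimal sketch on synthetic data; the variable names, sample size and effect sizes are invented to mimic the reported pattern (a treatment effect with essentially no mediation), not the trial data:

```python
import numpy as np

def ols_slope(y, X):
    """Least-squares coefficients of y on the columns of X plus an
    intercept; returns [intercept, slope_1, slope_2, ...]."""
    A = np.column_stack([np.ones(len(y))] + [np.asarray(x, float) for x in X])
    beta, *_ = np.linalg.lstsq(A, np.asarray(y, float), rcond=None)
    return beta

rng = np.random.default_rng(0)
n = 20_000
treatment = rng.integers(0, 2, n)                  # 0 = placebo, 1 = drug
adverse = 0.0 * treatment + rng.normal(0, 1, n)    # mediator: no treatment effect
response = 0.13 * treatment - 0.03 * adverse + rng.normal(0, 1, n)

step_a = ols_slope(response, [treatment])[1]           # total effect
step_b = ols_slope(adverse, [treatment])[1]            # treatment -> mediator
step_c = ols_slope(response, [treatment, adverse])[1]  # direct effect
```

Because the simulated treatment has no effect on the mediator, adjusting for adverse events barely changes the treatment coefficient, which is exactly the pattern the authors report against the breaking-of-blind hypothesis.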

  19. A study of the recovery from 120 events

    International Nuclear Information System (INIS)

    Baumont, Genevieve; Menage, F.; Bigot, F.

    1998-01-01

    The authors report a study aimed at providing additional information for improving safety through event analysis. The approach concentrates on the dynamics of error detection and the way errors and shortcomings are managed. The study is based on a systematic analysis of 120 events in nuclear power plants. The authors first outline the differences between the activities described in significant events and those assumed to take place during event and accident situations. They describe the methods used to transpose the human-reliability PSA model to event analysis and report the analysis itself (event selection, data studied during event analysis, types of errors). The studies concern events during power operation or plant outage. Results are analysed in terms of the number of events, the percentage of each error type, and the percentage of cases in which engineered safety features were activated before operators recovered the situation. The authors comment on who recovers the error and how it is recovered, and discuss in more detail the case of multiple-error situations

  20. Time compression of soil erosion by the effect of largest daily event. A regional analysis of USLE database.

    Science.gov (United States)

    Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.

    2009-04-01

    When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedent erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focused on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes: a large share of the total soil eroded is produced in very short time intervals, i.e. in a few events, mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, produce a large amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive constant attention, heightened at present by scenarios of global change. Notwithstanding, the time compression of geomorphological processes can be studied not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by rank, whatever their magnitude. In a previous study of the time compression of soil erosion, using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total number of daily erosive events recorded per plot and the percentage contribution to total soil erosion of the n-largest aggregated daily events. Here we offer a further refined analysis comparing different agricultural regions in the USA. To do that we have analyzed data from 594 erosion plots from USLE
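The rank-based definition above (largest events by rank, not by a magnitude threshold) reduces to a simple computation per plot: sort the daily events, take the top n, and express their sum as a share of the total. The daily values below are a made-up plot record, not USLE data:

```python
def largest_event_contribution(events, n_largest):
    """Percentage of total soil loss contributed by the n largest daily
    events, ranked by magnitude (rank-based, not threshold-based)."""
    ranked = sorted(events, reverse=True)
    total = sum(ranked)
    return 100.0 * sum(ranked[:n_largest]) / total

# Hypothetical plot record: daily erosive events (t/ha)
daily = [0.1, 0.3, 12.0, 0.2, 4.5, 0.05, 0.4, 7.1, 0.15, 0.2]
share = largest_event_contribution(daily, 3)
```

Here the three largest of ten events carry about 94% of the total loss, the kind of time compression the abstract describes.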

  1. Analysis of the highest transverse energy events seen in the UA1 detector at the Sp̄pS collider

    International Nuclear Information System (INIS)

    Albajar, C.; Bezaguet, A.; Cennini, P.

    1987-01-01

    This is the first full solid angle analysis of large transverse energy events in p̄p collisions at the CERN collider. Events with transverse energies in excess of 200 GeV at √s = 630 GeV are studied for any non-standard physics and quantitatively compared with expectations from perturbative QCD Monte Carlo models. A corrected differential cross section is presented. A detailed examination is made of jet profiles, event jet multiplicities and the fraction of the transverse energy carried by the two jets with the highest transverse jet energies. There is good agreement with standard theory for events with transverse energies up to the largest observed values (≅ √s/2), and the analysis shows no evidence for any non-QCD mechanism to account for the event characteristics. (orig.)

  2. Statistical Analysis of Solar Events Associated with SSC over Year of Solar Maximum during Cycle 23: 1. Identification of Related Sun-Earth Events

    Science.gov (United States)

    Grison, B.; Bocchialini, K.; Menvielle, M.; Chambodut, A.; Cornilleau-Wehrlin, N.; Fontaine, D.; Marchaudon, A.; Pick, M.; Pitout, F.; Schmieder, B.; Regnier, S.; Zouganelis, Y.

    2017-12-01

    Taking the 32 sudden storm commencements (SSCs) listed by the Observatori de l'Ebre / ISGI over the year 2002 (maximal solar activity) as a starting point, we performed a statistical analysis of the related solar sources, solar wind signatures, and terrestrial responses. For each event, we characterized and identified, as far as possible, (i) the sources on the Sun (coronal mass ejections, CMEs), with the help of a series of criteria detailed hereafter (velocities, drag coefficient, radio waves, polarity), as well as (ii) the structure and properties in the interplanetary medium, at L1, of the event associated with the SSC: magnetic clouds (MCs), non-MC interplanetary coronal mass ejections (ICMEs), co-rotating/stream interaction regions (SIRs/CIRs), shocks only, and unclear events that we call "miscellaneous" events. The categorization of the events at L1 is based on published catalogues. For each potential CME/L1 event association we compare the velocity observed at L1 with the one observed at the Sun and the estimated ballistic velocity. Observations of radio emissions (Type II, Type IV) detected from the ground and/or by WIND in association with the CMEs make the solar source more probable. We also compare the polarity of the magnetic clouds with the hemisphere of the solar source. The drag coefficient (estimated with the drag-based model) is calculated for each potential association and compared to the expected range of values. We identified a solar source for 26 SSC-related events; 12 of these 26 associations match all criteria. We finally discuss the difficulty of performing such associations.
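The drag-based model mentioned above assumes that beyond a few solar radii the CME's acceleration is pure aerodynamic drag against the solar wind, dv/dt = -γ(v - w)|v - w|, which has a closed-form solution. A small sketch; the launch speed, wind speed and drag parameter γ below are illustrative values, not figures from this study:

```python
def dbm_speed(v0, w, gamma, t):
    """CME speed after time t (s) under the drag-based model
    dv/dt = -gamma * (v - w) * |v - w|, using its analytic solution
    v(t) = w + (v0 - w) / (1 + gamma * |v0 - w| * t)."""
    dv = v0 - w
    return w + dv / (1.0 + gamma * abs(dv) * t)

# Hypothetical fast CME: 1200 km/s into a 400 km/s wind, gamma = 2e-8 /km
v_l1 = dbm_speed(1200.0, 400.0, 2e-8, 72 * 3600.0)  # speed after ~3 days
```

Comparing such a decelerated speed against the speed actually measured at L1 is one of the consistency checks the authors apply to each candidate CME/L1 association.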

  3. [Incidence rate of adverse reaction/event by Qingkailing injection: a Meta-analysis of single rate].

    Science.gov (United States)

    Ai, Chun-ling; Xie, Yan-ming; Li, Ming-quan; Wang, Lian-xin; Liao, Xing

    2015-12-01

    To systematically review the incidence rate of adverse drug reactions/events with Qingkailing injection, databases including PubMed, EMbase, the Cochrane Library, CNKI, VIP, WanFang Data and CBM were searched by computer from inception to July 30, 2015. Two reviewers independently screened the literature according to the inclusion and exclusion criteria, extracted data and cross-checked the data. Meta-analysis was then performed using the R 3.2.0 software, and subgroup sensitivity analyses were performed based on age, mode of medication, observation time and research quality. Sixty-three studies involving 9,793 patients given Qingkailing injection were included, with 367 cases of adverse reactions/events reported in total. The incidence rate of adverse reactions affecting skin and mucosa was 2% [95% CI (0.02; 0.03)]; of digestive system adverse reactions, 6% [95% CI (0.05; 0.07)]; and of injection site adverse reactions, 4% [95% CI (0.02; 0.07)]. For the digestive system as the main type of adverse reaction/event, the incidence in children and in adults was 4.6% [0.021 1; 0.097 7] and 6.9% [0.053 5; 0.089 8], respectively. For adverse reactions with skin and mucous membrane damage as the main manifestation, the incidence for observation times > 7 days and ≤ 7 days was 3% [0.012 9; 0.068 3] and 1.9% [0.007 8; 0.046 1], respectively. Subgroup analysis showed that, for the different types of adverse reactions, the incidence under combination therapy was higher than under the single drug, and the difference was statistically significant. Adverse reactions are influenced by factors such as rational clinical drug use, combination therapy and age, and these factors vary across populations. Therefore, special care is required when Qingkailing injection is used in children and the elderly, and individualized medication should be implemented.
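A single-rate meta-analysis of this kind pools study-level proportions; one common recipe is an inverse-variance weighted average on the logit scale. A minimal fixed-effect sketch; the three (events, n) study pairs are invented for illustration and are not data from this review:

```python
import math

def pooled_rate(events_and_totals):
    """Fixed-effect pooled incidence from (events, n) pairs via the logit
    transform with inverse-variance weights (0.5 continuity correction)."""
    num = den = 0.0
    for e, n in events_and_totals:
        e_adj, n_adj = e + 0.5, n + 1.0            # continuity correction
        p = e_adj / n_adj
        logit = math.log(p / (1 - p))
        var = 1.0 / e_adj + 1.0 / (n_adj - e_adj)  # approx. variance of logit
        w = 1.0 / var
        num += w * logit
        den += w
    pooled_logit = num / den
    return 1.0 / (1.0 + math.exp(-pooled_logit))   # back-transform

rate = pooled_rate([(12, 400), (5, 150), (20, 600)])  # hypothetical studies
```

A random-effects version (as typically used when heterogeneity is present, e.g. across age or co-medication subgroups) would add a between-study variance term to each weight, but the back-transformation step is the same.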

  4. The Influence of Sponsor-Event Congruence in Sponsorship of Music Festivals

    Directory of Open Access Journals (Sweden)

    Penny Hutabarat

    2014-04-01

    This paper focuses on the influence of sponsor-event congruence on brand image, attitudes toward the brand and purchase intention. Having reviewed the literature and formulated the hypotheses, data were gathered by distributing a questionnaire to 155 audience members at the Java Jazz Music Festival, first with a convenience sampling and then with a snowball sampling approach. The data were analysed with Structural Equation Modeling (SEM). The results show that the sponsor-event congruence variable has a positive impact on brand image and on attitudes toward the sponsor brand. Brand image also has a positive impact on purchase intention; by contrast, attitudes toward the brand do not have a positive impact on purchase intention. Given these results, the role of congruency in increasing sponsorship effectiveness is very significant. Congruency is a key influencer of sponsorship effectiveness: congruency (fit) between the event and the sponsor can boost brand image and bring out favorable attitudes toward the brand, supporting the success of marketing communication programs, particularly sponsorship. In addition, image transfer increases with congruency between sponsor and event, directing the creation of an intention to buy the sponsor's product or service (purchase intention). In conclusion, sponsor-event congruence affects consumer responses to sponsorship at the cognitive, affective and behavioral levels.

  5. Resilience Analysis of Key Update Strategies for Resource-Constrained Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2011-01-01

    Severe resource limitations in certain types of networks lead to various open issues in security. Since such networks usually operate in unattended or hostile environments, revoking the cryptographic keys and establishing (also distributing) new keys – which we refer to as key update – is a criti...

  6. Volunteer motivation in special events for people with disabilities ...

    African Journals Online (AJOL)

    There has been little research attention in the South African context on volunteer motivation for special events for people with disabilities. This study explored the key factors that motivated volunteers to volunteer their services at three major sport events for people with disabilities in South Africa. A 28-item questionnaire was ...

  7. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Science.gov (United States)

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  8. A fast and versatile quantum key distribution system with hardware key distillation and wavelength multiplexing

    International Nuclear Information System (INIS)

    Walenta, N; Gisin, N; Guinnard, O; Houlmann, R; Korzh, B; Lim, C W; Lunghi, T; Portmann, C; Thew, R T; Burg, A; Constantin, J; Caselunghe, D; Kulesza, N; Legré, M; Monat, L; Soucarros, M; Trinkler, P; Junod, P; Trolliet, G; Vannel, F

    2014-01-01

    We present a compactly integrated, 625 MHz clocked coherent one-way quantum key distribution system which continuously distributes secret keys over an optical fibre link. To support high secret key rates, we implemented a fast hardware key distillation engine which allows for key distillation rates up to 4 Mbps in real time. The system employs wavelength multiplexing in order to run over only a single optical fibre. Using fast gated InGaAs single photon detectors, we reliably distribute secret keys with a rate above 21 kbps over 25 km of optical fibre. We optimized the system considering a security analysis that respects finite-key-size effects, authentication costs and system errors for a security parameter of ε_QKD = 4 × 10⁻⁹. (paper)
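As a rough orientation for how errors eat into the distilled key, the asymptotic BB84-style secret fraction subtracts a privacy-amplification term and an error-correction leakage from the sifted key. The sketch below uses that generic textbook formula, not the coherent one-way finite-key analysis of this paper; the sifted rate, QBER and reconciliation efficiency are illustrative:

```python
import math

def binary_entropy(q):
    """Binary Shannon entropy h(q) in bits."""
    if q <= 0.0 or q >= 1.0:
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def secret_fraction(qber, f_ec=1.1):
    """Asymptotic BB84-style secret fraction: one h(Q) for privacy
    amplification plus f_ec * h(Q) leaked during error correction."""
    r = 1.0 - binary_entropy(qber) - f_ec * binary_entropy(qber)
    return max(r, 0.0)

# Hypothetical link: 100 kbps sifted rate at 2% QBER
secret_rate = 100e3 * secret_fraction(0.02)   # distilled bits per second
```

A finite-key analysis like the one the authors apply additionally subtracts terms that shrink the key for small block sizes and for the chosen security parameter ε_QKD.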

  9. The Blayais event

    International Nuclear Information System (INIS)

    2000-01-01

    This document describes the main events that occurred at the Blayais plant during the year 2000. For each event, the detailed chronology, the situation analysis, the crisis management and the public information are presented. Recommendations from the nuclear safety authorities are also provided. (A.L.B.)

  10. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....
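The general event concept characterized above, a short-duration process with participants, consequences, and properties, maps naturally onto a small data structure. A sketch; the class layout and the example event are my own illustration of the concept, not the paper's notation:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """A general event: a short-duration process with participants,
    consequences (follow-up events it causes), and properties, usable
    for both information modeling and process modeling."""
    name: str
    participants: list = field(default_factory=list)
    consequences: list = field(default_factory=list)
    properties: dict = field(default_factory=dict)

# Hypothetical modeled event from a sales process
order = Event("OrderPlaced",
              participants=["customer", "webshop"],
              consequences=["StockReserved"],
              properties={"duration_s": 0.2})
```

Treating both information-model facts and process-model steps as instances of one such structure is the unification the paper argues for.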

  11. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Science.gov (United States)

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...

  12. Second-Order Multiagent Systems with Event-Driven Consensus Control

    Directory of Open Access Journals (Sweden)

    Jiangping Hu

    2013-01-01

    Event-driven control scheduling strategies for multiagent systems play a key role in the future use of embedded microprocessors with limited resources that gather information and actuate the agent control updates. In this paper, a distributed event-driven consensus problem is considered for a multi-agent system with second-order dynamics. Firstly, two kinds of event-driven control laws are designed for leaderless and leader-follower systems, respectively. Then, the input-to-state stability of the closed-loop multi-agent system with the proposed event-driven consensus control is analyzed and a bound on the inter-event times is ensured. Finally, some numerical examples are presented to validate the proposed event-driven consensus control.
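The scheme described above lets each agent broadcast its state only when a local measurement error crosses a trigger condition, instead of at every sampling instant. A minimal leaderless sketch with three double-integrator agents on a complete graph; the fixed error threshold and unit gains are simplifying assumptions of mine, not the triggering functions from the paper:

```python
import numpy as np

# Event-driven second-order consensus (sketch): agents broadcast state
# only when |position error| + |velocity error| exceeds a threshold.
dt, steps, threshold = 0.01, 2000, 0.05
x = np.array([0.0, 1.0, 3.0])   # positions
v = np.zeros(3)                 # velocities
xb, vb = x.copy(), v.copy()     # last-broadcast states
triggers = 0

for _ in range(steps):
    # each agent checks its local trigger condition
    for i in range(3):
        if abs(x[i] - xb[i]) + abs(v[i] - vb[i]) > threshold:
            xb[i], vb[i] = x[i], v[i]
            triggers += 1
    # consensus control computed from broadcast states only
    u = np.array([-sum((xb[i] - xb[j]) + (vb[i] - vb[j]) for j in range(3))
                  for i in range(3)])
    x, v = x + dt * v, v + dt * u   # double-integrator dynamics (Euler)

spread = x.max() - x.min()          # residual disagreement
```

The agents settle near the average of the initial positions with far fewer than 3 x 2000 broadcasts, and the residual disagreement stays bounded by the trigger threshold, which is the practical payoff for resource-limited embedded controllers.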

  13. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    Science.gov (United States)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of a novel EventTracker platform for instantaneous sensitivity analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods, because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on system state is of value, as events can cause this state to change. This "event triggering" situation underpins the logic of the proposed approach. The event-tracking sensitivity analysis method describes the system variables and states as a collection of events. The more often an input variable occurs during the triggering of an event, the greater its potential impact on the final analysis of the system state. Experiments were designed to compare the proposed event-tracking sensitivity analysis with existing entropy-based sensitivity analysis methods. The results show a 10% improvement in computational efficiency with no compromise in accuracy. They also show that the computational time needed to perform the sensitivity analysis is 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs with the final scope of recovering the content
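The core idea, scoring an input variable by how often its activity coincides with event triggers, can be sketched as a simple co-occurrence count. All sensor names, timestamps and the one-second window below are invented for illustration and are not the EventTracker implementation:

```python
from collections import Counter

def event_tracking_sensitivity(input_log, event_times, window=1.0):
    """Rank input variables by how often they changed within `window`
    seconds before a triggered event (co-occurrence sketch of the
    event-tracking sensitivity idea)."""
    score = Counter()
    for t_event in event_times:
        for name, t_change in input_log:
            if 0.0 <= t_event - t_change <= window:
                score[name] += 1
    return score.most_common()

# Hypothetical drilling-rig log: (sensor, timestamp of a value change)
log = [("mud_flow", 0.2), ("hook_load", 0.9), ("mud_flow", 4.1),
       ("rpm", 4.5), ("mud_flow", 8.8)]
events = [1.0, 5.0, 9.0]         # times at which alarms triggered
ranking = event_tracking_sensitivity(log, events)
```

Because it only touches changes near each trigger rather than re-processing the whole history, this style of scoring can run incrementally as events arrive, which is what distinguishes it from batch, entropy-based sensitivity analysis.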

  14. An analysis on boron dilution events during SBLOCA for the KNGR

    International Nuclear Information System (INIS)

    Kim, Young In; Hwang, Young Dong; Park, Jong Kuen; Chung, Young Jong; Sim, Suk Gu

    1999-02-01

    An analysis of boron dilution events during a small break loss of coolant accident (LOCA) for the Korea Next Generation Reactor (KNGR) was performed using the computational fluid dynamics (CFD) code FLUENT. The maximum size of the water slug was determined from the sources of unborated water and the possible flow paths. An axisymmetric CFD model is applied for a conservative scoping analysis of the unborated water slug mixing with the recirculation water of the reactor system following a small break LOCA, assuming the restart of one reactor coolant pump (RCP). The computational grid was determined through a sensitivity study on grid size, selecting the grid that yields the most conservative results, and the preliminary boron mixing calculation was performed on that grid. (Author). 17 refs., 3 tabs., 26 figs

  15. Detailed Analysis of Solar Data Related to Historical Extreme Geomagnetic Storms: 1868 – 2010

    DEFF Research Database (Denmark)

    Lefèvre, Laure; Vennerstrøm, Susanne; Dumbović, Mateja

    2016-01-01

    An analysis of historical Sun–Earth connection events in the context of the most extreme space weather events of the last ∼ 150 years is presented. To identify the key factors leading to these extreme events, a sample of the most important geomagnetic storms was selected based mainly on the well-...

  16. Analysis of core damage frequency due to external events at the DOE [Department of Energy] N-Reactor

    International Nuclear Information System (INIS)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L.; Baxter, J.T.; Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P.; Brosseau, D.A.

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-05 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs

  17. Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L. (Sandia National Labs., Albuquerque, NM (USA)); Baxter, J.T. (Westinghouse Hanford Co., Richland, WA (USA)); Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P. (EQE, Inc., San Francisco, CA (USA)); Brosseau, D.A. (ERCE, Inc., Albuquerque, NM (USA))

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-05 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  18. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    International Nuclear Information System (INIS)

    Brinkmann, Markus; Eichbaum, Kathrin; Kammann, Ulrike; Hudjetz, Sebastian; Cofalla, Catrina; Buchinger, Sebastian; Reifferscheid, Georg; Schüttrumpf, Holger; Preuss, Thomas

    2014-01-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios.
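The mechanism the study highlights, that a larger effective respiratory volume during exercise raises internal dose, can be illustrated with a deliberately stripped-down one-compartment toxicokinetic model (a drastic simplification of a multi-compartment PBTK model; all rate constants and the activity multiplier below are invented):

```python
def one_compartment_tk(c_water, k_uptake, k_elim, activity=1.0,
                       t_end=48.0, dt=0.01):
    """Internal concentration under a one-compartment toxicokinetic model
    dC/dt = activity * k_uptake * C_w - k_elim * C, integrated with
    explicit Euler. `activity` scales uptake with effective respiratory
    volume (the exercise effect)."""
    c, t = 0.0, 0.0
    while t < t_end:
        c += dt * (activity * k_uptake * c_water - k_elim * c)
        t += dt
    return c

resting = one_compartment_tk(1.0, 0.5, 0.1)              # routine exposure
exercised = one_compartment_tk(1.0, 0.5, 0.1, activity=2.0)  # flood-event swim
```

Doubling the uptake term doubles the steady-state internal concentration (k_uptake·C_w/k_elim), which is the qualitative pattern, higher biliary metabolite levels under exhaustive exercise, that the PBTK study reports.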

  19. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    Energy Technology Data Exchange (ETDEWEB)

    Brinkmann, Markus; Eichbaum, Kathrin [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Kammann, Ulrike [Thünen-Institute of Fisheries Ecology, Palmaille 9, 22767 Hamburg (Germany); Hudjetz, Sebastian [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Cofalla, Catrina [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Buchinger, Sebastian; Reifferscheid, Georg [Federal Institute of Hydrology (BFG), Department G3: Biochemistry, Ecotoxicology, Am Mainzer Tor 1, 56068 Koblenz (Germany); Schüttrumpf, Holger [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Preuss, Thomas [Department of Environmental Biology and Chemodynamics, Institute for Environmental Research,ABBt- Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); and others

    2014-07-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors also need to be considered for bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites, and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance of, and the necessity to investigate, uptake of contaminants from suspended sediments under realistic exposure scenarios.

  20. Characterization of a Flood Event through a Sediment Analysis: The Tescio River Case Study

    Directory of Open Access Journals (Sweden)

    Silvia Di Francesco

    2016-07-01

    Full Text Available This paper presents the hydrological analysis and grain size characteristics of fluvial sediments in a river basin and their combination to characterize a flood event. The overall objective of the research is the development of a practical methodology, based on experimental surveys, to reconstruct the hydraulic history of ungauged river reaches from the modifications detected on the riverbed during the dry season. The grain size analysis of fluvial deposits usually requires considerable technical and economic effort, and traditional sieving based on physical sampling cannot adequately represent the spatial distribution of sediments across a wide area of riverbed with a reasonable number of samples. The use of photographic sampling techniques, on the other hand, allows for the quick and effective determination of the grain size distribution in large river stretches, using a digital camera and specific graphical algorithms. Photographic sampling was employed to characterize the riverbed in a 3 km ungauged reach of the Tescio River, a tributary of the Chiascio River located in central Italy and representative of many rivers in the same geographical area. To this end, the particle size distribution was reconstructed through the analysis of digital pictures of the sediments taken on the riverbed in dry conditions. The sampling was performed after a flood event of known duration, which allowed identification of the removal of the armor layer in one section along the river reach under investigation. The volume and composition of the eroded sediments made it possible to calculate the average flow rate associated with the flood event that caused the erosion, by means of sediment transport laws and the hydrological analysis of the river basin. A hydraulic analysis of the river stretch under investigation was employed to verify the validity of the proposed procedure.
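
    Photographic grain-size distributions of the kind described above are typically summarized by percentile diameters such as D50, the median grain size. A minimal sketch of that computation, assuming a plain list of measured grain diameters (the sample values below are hypothetical, not the Tescio data):

```python
def grain_size_percentile(diameters_mm, percent):
    """Return the Dx grain-size percentile: the diameter below which
    `percent` % of the sampled grains fall (linear interpolation)."""
    if not 0 < percent < 100:
        raise ValueError("percent must be in (0, 100)")
    data = sorted(diameters_mm)
    # Fractional rank within the sorted sample
    rank = (percent / 100) * (len(data) - 1)
    lo = int(rank)
    frac = rank - lo
    if lo + 1 >= len(data):
        return data[-1]
    return data[lo] + frac * (data[lo + 1] - data[lo])

# Hypothetical grain diameters (mm) measured from riverbed photographs
sample = [2.0, 4.0, 8.0, 16.0, 32.0]
d50 = grain_size_percentile(sample, 50)  # median grain diameter
```

    D50 (together with D84 or D90) is what typically enters the sediment transport laws used to back-calculate the flood discharge.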

  1. Men's and women's migration in coastal Ghana: An event history analysis

    Directory of Open Access Journals (Sweden)

    Holly E. Reed

    2010-04-01

    Full Text Available This article uses life history calendar (LHC) data from coastal Ghana and event history statistical methods to examine inter-regional migration for men and women, focusing on four specific migration types: rural-urban, rural-rural, urban-urban, and urban-rural. Our analysis is unique because it examines how key determinants of migration (including education, employment, marital status, and childbearing) differ by sex for these four types of migration. We find that women are significantly less mobile than men overall, but that more educated women are more likely to move (particularly to urban areas) than their male counterparts. Moreover, employment in the prior year is less of a deterrent to migration among women. While childbearing has a negative effect on migration, this impact is surprisingly stronger for men than for women, perhaps because women's search for assistance in childcare promotes migration. Meanwhile, being married or in union appears to have little effect on migration probabilities for either men or women. These results demonstrate the benefits of a LHC approach and suggest that migration research should further examine men's and women's mobility as it relates to both human capital and household and family dynamics, particularly in developing settings.
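
    Event history methods of the kind used above typically start from person-period data: each person-year contributes one row, and the discrete-time hazard for a period is the fraction of people still at risk who experience the event in that period. A minimal sketch under that assumption (the person-period rows below are hypothetical, not the Ghana LHC data):

```python
from collections import defaultdict

def discrete_time_hazard(person_periods):
    """Estimate the discrete-time hazard of an event (e.g., migration):
    h(t) = events at t / persons at risk at t.
    `person_periods` is a list of (period, event_occurred) records,
    one row per person-year."""
    at_risk = defaultdict(int)
    events = defaultdict(int)
    for period, occurred in person_periods:
        at_risk[period] += 1
        if occurred:
            events[period] += 1
    return {t: events[t] / at_risk[t] for t in sorted(at_risk)}

# Three person-years in period 1 (one migration), two in period 2 (one)
rows = [(1, False), (1, True), (1, False), (2, False), (2, True)]
hazard = discrete_time_hazard(rows)
```

    In practice these hazards are modeled with a logistic regression on covariates (education, employment, marital status, childbearing), which is how sex differences in the determinants are tested.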

  2. ISVASE: identification of sequence variant associated with splicing event using RNA-seq data.

    Science.gov (United States)

    Aljohi, Hasan Awad; Liu, Wanfei; Lin, Qiang; Yu, Jun; Hu, Songnian

    2017-06-28

    Precise and efficient exon recognition and splicing by the spliceosome is key to generating mature mRNAs. About one third to one half of disease-related mutations affect RNA splicing. The software PVAAS has been developed to identify variants associated with aberrant splicing directly from RNA-seq data; however, it is based on the assumption that annotated splicing sites represent normal splicing, which is not always true. We developed ISVASE, a tool for specifically identifying sequence variants associated with splicing events (SVASEs) using RNA-seq data. Compared with PVAAS, our tool has several advantages: multi-pass, stringent rule-dependent and statistical filters; use of split reads only; independent sequence variant identification in each part of a splicing junction; sequence variant detection for both known and novel splicing events; additional exon-exon junction shift event detection if known splicing events are provided; splicing signal evaluation; support for known DNA mutation and/or RNA editing data; higher precision and consistency; and short running time. Using a realistic RNA-seq dataset, we performed a case study to illustrate the functionality and effectiveness of our method. Moreover, the output SVASEs can be used for downstream analyses such as splicing regulatory element studies and sequence variant functional analysis. ISVASE is useful for researchers interested in sequence variants (DNA mutations and/or RNA editing) associated with splicing events. The package is freely available at https://sourceforge.net/projects/isvase/ .

  3. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    Science.gov (United States)

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at the transistor level. As analog cells can behave unpredictably when particle strikes interact with critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effect influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) has been analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients were propagated to a node where a parasitic diode was activated, affecting the offset cancellation and therefore the whole resolution of the ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.

  4. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Science.gov (United States)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

    Indonesia is very vulnerable to flood disasters because it experiences high rainfall throughout the year. Flooding is categorized as the most important hazard because it causes social, economic, and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) that preceded flood disasters in areas with monsoonal, equatorial, and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are CHIRPS satellite rainfall data at 0.05° spatial resolution and daily temporal resolution; the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution; and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  5. Participants of the "Grid: the Key to Scientific Collaboration", an outstanding UNESCO-ROSTE and CERN event sponsored by Hewlett Packard held on 28 and 29 September at CERN, Geneva.

    CERN Multimedia

    Maximilien Brice

    2005-01-01

    Based on the collaboration-fostering and research-enabling role of the grid, CERN and UNESCO are taking the opportunity to invite current and future grid participants, universities and research institutions to a grid event hosted by CERN in Geneva. Through presentations by key grid protagonists from CERN, the European Commission, the EGEE Grid, and the European research community, participants have been able to learn about the capabilities of the grid, opportunities to leverage their research work, and participation in international projects.

  6. RELAP5/MOD 3.3 analysis of Reactor Coolant Pump Trip event at NPP Krsko

    International Nuclear Information System (INIS)

    Bencik, V.; Debrecin, N.; Foretic, D.

    2003-01-01

    In this paper the results of the RELAP5/MOD 3.3 analysis of the Reactor Coolant Pump (RCP) Trip event at NPP Krsko are presented. The event was initiated by an operator action aimed at preventing damage to the RCP 2 bearing. The action consisted of a power reduction that lasted for 50 minutes, followed by a reactor trip and a subsequent RCP 2 trip once the reactor power had been reduced to 28%. Two minutes after the reactor trip, the Main Steam Isolation Valves (MSIVs) were isolated and the steam dump flow was closed. On the secondary side the Steam Generator (SG) pressure rose until SG 1 Safety Valve (SV) 1 opened. A realistic RELAP5/MOD 3.3 analysis was performed in order to model the particular plant behavior caused by the operator actions. The comparison of the RELAP5/MOD 3.3 results with the measurements for the power reduction transient showed small differences for the major parameters (nuclear power, average temperature, secondary pressure). The main trends and physical phenomena following the RCP Trip event were well reproduced in the analysis. The parameters that have the major influence on the transient results were identified. In the paper the influence of the SG 1 relief and safety valves on the transient results is investigated more closely. (author)

  7. Impact of environmental factors on the distribution of extreme scouring (gouging) event, Canadian Beaufort shelf

    Energy Technology Data Exchange (ETDEWEB)

    Blasco, Steve [Geological Survey of Canada, Dartmouth (Canada); Carr, Erin; Campbell, Patrick [Canadian Seabed Research Ltd., Porters Lake (Canada); Shearer, Jim [Shearer Consulting, Ottawa (Canada)

    2011-07-01

    Knowledge of the presence of scours, their dimensions and their return frequencies is highly important in the development of offshore hydrocarbon structures. Mapping surveys have identified 290 extreme scour events across the Canadian Beaufort shelf. This paper investigated the impact of environmental factors on the distribution of extreme scouring events on the Canadian Beaufort shelf. The study used the NEWBASE database of new ice scours to analyse the mechanisms by which scours form. The geotechnical zonation, the bathymetry and the shelf gradient were evaluated using these data, and estimates were made of the surficial sediment type, the surficial sediment thickness and the sea-ice regime. It was found that the spatial distribution of extreme scour events is controlled by the sea-ice regime, bathymetry and geotechnical zonation. The results obtained from the mapping surveys suggested that the key controlling environmental factors may combine to limit the depth of extreme scour events to 5 meters.

  8. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.
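
    The core of the signal processing chain outlined in this tutorial (epoch extraction, baseline correction, and averaging across trials) can be illustrated in a few lines. This is a simplified single-channel sketch with hypothetical sample values, not the tutorial's actual pipeline:

```python
def average_erp(eeg, event_onsets, pre, post):
    """Average event-locked EEG epochs into an ERP waveform.

    eeg: one channel as a list of samples; event_onsets: sample
    indices of stimulus events; pre/post: number of samples to keep
    before and after each onset."""
    epochs = []
    for onset in event_onsets:
        if onset - pre >= 0 and onset + post <= len(eeg):
            segment = eeg[onset - pre : onset + post]
            # Baseline-correct against the mean of the pre-stimulus interval
            baseline = sum(segment[:pre]) / pre
            epochs.append([s - baseline for s in segment])
    # Point-by-point average across epochs
    return [sum(col) / len(epochs) for col in zip(*epochs)]

# Two hypothetical trials locked to the same onset: a unit deflection
# one sample after the event
eeg = [0.0] * 8 + [1.0] + [0.0] * 11
erp = average_erp(eeg, event_onsets=[7, 7], pre=2, post=4)
```

    Real pipelines add filtering, artifact rejection, and re-referencing before this averaging step, which is what allows the small event-locked signal to emerge from the ongoing EEG.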

  9. Accuracy analysis of measurements on a stable power-law distributed series of events

    International Nuclear Information System (INIS)

    Matthews, J O; Hopcraft, K I; Jakeman, E; Siviour, G B

    2006-01-01

    We investigate how finite measurement time limits the accuracy with which the parameters of a stably distributed random series of events can be determined. The model process is generated by timing the emigration of individuals from a population that is subject to deaths and a particular choice of multiple immigration events. This leads to a scale-free discrete random process where customary measures, such as mean value and variance, do not exist. However, converting the number of events occurring in fixed time intervals to a 1-bit 'clipped' process allows the construction of well-behaved statistics that still retain vestiges of the original power-law and fluctuation properties. These statistics include the clipped mean and correlation function, from measurements of which both the power-law index of the distribution of events and the time constant of its fluctuations can be deduced. We report here a theoretical analysis of the accuracy of measurements of the mean of the clipped process. This indicates that, for a fixed experiment time, the error on measurements of the sample mean is minimized by an optimum choice of the number of samples. It is shown furthermore that this choice is sensitive to the power-law index and that the approach to Poisson statistics is dominated by rare events or 'outliers'. Our results are supported by numerical simulation
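
    The 1-bit clipping described above can be reproduced in a few lines: each interval count is reduced to event/no-event, and the clipped mean (the fraction of non-empty intervals) stays finite even when the raw counts follow a heavy-tailed law with no finite mean. A minimal sketch, using Pareto draws as a stand-in for the paper's death-and-multiple-immigration model:

```python
import random

def clipped_mean(counts):
    """1-bit 'clip' a series of event counts (0 stays 0, any positive
    count becomes 1) and return the mean of the clipped series."""
    clipped = [1 if n > 0 else 0 for n in counts]
    return sum(clipped) / len(clipped)

# Hypothetical heavy-tailed counts per time interval: Pareto draws with
# tail index below 1, so the raw counts have no finite mean or variance
random.seed(1)
counts = [int(random.paretovariate(0.8)) - 1 for _ in range(10_000)]
fraction_active = clipped_mean(counts)  # always well defined, in [0, 1]
```

    The paper's point is that the sampling error of this clipped mean, for a fixed total experiment time, is minimized by an optimum choice of the number of samples, and that the optimum depends on the power-law index.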

  10. Superposed ruptile deformational events revealed by field and VOM structural analysis

    Science.gov (United States)

    Kumaira, Sissa; Guadagnin, Felipe; Keller Lautert, Maiara

    2017-04-01

    Virtual outcrop models (VOMs) are becoming an important tool in the analysis of geological structures because they make it possible to obtain the geometry, and in some cases the kinematics, of the analyzed structures in a three-dimensional photorealistic space. These data are used to gain quantitative information on deformational features which, coupled with numerical models, can assist in understanding deformational processes. Old basement units commonly record superposed ductile and ruptile deformational events along their evolution. The Porongos Belt, located in southern Brazil, has a complex deformational history, recording at least five ductile and ruptile deformational events. In this study, we present a structural analysis of a quarry in the Porongos Belt, coupling field and VOM structural information to understand the processes involved in the last two deformational events. Field information was acquired using traditional structural methods for the analysis of ruptile structures, such as descriptions, drawings, acquisition of orientation vectors, and kinematic analysis. The VOM was created with the image-based modeling method, through photogrammetric data acquisition and orthorectification. Photogrammetric data were acquired using a Sony a3500 camera, and a total of 128 photographs were taken from ca. 10-20 m from the outcrop in different orientations. Thirty-two control point coordinates were acquired using a combination of RTK dGPS surveying and total station work, providing a precision of a few millimeters in x, y and z. The photographs were imported into the PhotoScan software to create a 3D dense point cloud with a structure-from-motion algorithm, which was triangulated and textured to generate the VOM. The VOM was georeferenced (oriented and scaled) using the ground control points, and later analyzed in the OpenPlot software to extract structural information. Data were imported into the Wintensor software to obtain tensor orientations, and Move software to process and

  11. Using Key Part-of-Speech Analysis to Examine Spoken Discourse by Taiwanese EFL Learners

    Science.gov (United States)

    Lin, Yen-Liang

    2015-01-01

    This study reports on a corpus analysis of samples of spoken discourse between a group of British and Taiwanese adolescents, with the aim of exploring the statistically significant differences in the use of grammatical categories between the two groups of participants. The key word method extended to a part-of-speech level using the web-based…

  12. Psychiatric adverse events during treatment with brodalumab: Analysis of psoriasis clinical trials.

    Science.gov (United States)

    Lebwohl, Mark G; Papp, Kim A; Marangell, Lauren B; Koo, John; Blauvelt, Andrew; Gooderham, Melinda; Wu, Jashin J; Rastogi, Shipra; Harris, Susan; Pillai, Radhakrishnan; Israel, Robert J

    2018-01-01

    Individuals with psoriasis are at increased risk for psychiatric comorbidities, including suicidal ideation and behavior (SIB). To distinguish between the underlying risk and potential for treatment-induced psychiatric adverse events in patients with psoriasis being treated with brodalumab, a fully human anti-interleukin 17 receptor A monoclonal antibody. Data were evaluated from a placebo-controlled, phase 2 clinical trial; the open-label, long-term extension of the phase 2 clinical trial; and three phase 3, randomized, double-blind, controlled clinical trials (AMAGINE-1, AMAGINE-2, and AMAGINE-3) and their open-label, long-term extensions of patients with moderate-to-severe psoriasis. The analysis included 4464 patients with 9161.8 patient-years of brodalumab exposure. The follow-up time-adjusted incidence rates of SIB events were comparable between the brodalumab and ustekinumab groups throughout the 52-week controlled phases (0.20 vs 0.60 per 100 patient-years). In the brodalumab group, 4 completed suicides were reported, 1 of which was later adjudicated as indeterminate; all patients had underlying psychiatric disorders or stressors. There was no comparator arm past week 52. Controlled study periods were not powered to detect differences in rare events such as suicide. Comparison with controls and the timing of events do not indicate a causal relationship between SIB and brodalumab treatment. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
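
    The follow-up time-adjusted incidence rates quoted above (events per 100 patient-years) are a simple exposure-adjusted ratio. A sketch of the computation, pairing a hypothetical event count with the reported 9161.8 patient-years of brodalumab exposure:

```python
def incidence_rate_per_100py(n_events, patient_years):
    """Follow-up time-adjusted incidence rate per 100 patient-years."""
    if patient_years <= 0:
        raise ValueError("patient_years must be positive")
    return 100.0 * n_events / patient_years

# Hypothetical illustration: 18 events over the reported 9161.8
# patient-years of exposure gives roughly 0.20 per 100 patient-years
rate = incidence_rate_per_100py(18, 9161.8)
```

    Adjusting by patient-years rather than patient counts is what makes rates comparable between arms with different exposure durations, which matters here because the open-label extensions ran far past the 52-week controlled phases.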

  13. Combining geomorphic and documentary flood evidence to reconstruct extreme events in Mediterranean basins

    Science.gov (United States)

    Thorndycraft, V. R.; Benito, G.; Barriendos, M.; Rico, M.; Sánchez-Moya, Y.; Sopeña, A.; Casas, A.

    2009-09-01

    Palaeoflood hydrology is the reconstruction of flood magnitude and frequency using geomorphological flood evidence and is particularly valuable for extending the record of extreme floods prior to the availability of instrumental data series. This paper will provide a review of recent developments in palaeoflood hydrology and will be presented in three parts: 1) an overview of the key methodological approaches used in palaeoflood hydrology and the use of historical documentary evidence for reconstructing extreme events; 2) a summary of the Llobregat River palaeoflood case study (Catalonia, NE Spain); and 3) analysis of the AD 1617 flood and its impacts across Catalonia (including the rivers Llobregat, Ter and Segre). The key findings of the Llobregat case study were that at least eight floods occurred with discharges significantly larger than events recorded in the instrumental record, for example at the Pont de Vilomara study reach the palaeodischarges of these events were 3700-4300 m3/s compared to the 1971 flood, the largest on record, of 2300 m3/s. Five of these floods were dated to the last 3000 years and the three events directly dated by radiocarbon all occurred during cold phases of global climate. Comparison of the palaeoflood record with documentary evidence indicated that one flood, radiocarbon dated to cal. AD 1540-1670, was likely to be the AD 1617 event, the largest flood of the last 700 years. Historical records indicate that this event was caused by rainfall occurring from the 2nd to 6th November and the resultant flooding caused widespread socio-economic impacts including the destruction of at least 389 houses, 22 bridges and 17 water mills. Discharges estimated from palaeoflood records and historical flood marks indicate that the Llobregat (4680 m3/s) and Ter (2700-4500 m3/s) rivers witnessed extreme discharges in comparison to observed floods in the instrumental record (2300 and 2350 m3/s, respectively); whilst further east in the Segre River

  14. Key drivers of precipitation isotopes in Windhoek, Namibia (2012-2016)

    Science.gov (United States)

    Kaseke, K. F.; Wang, L.; Wanke, H.

    2017-12-01

    Southern African climate is characterized by large variability, with precipitation model estimates varying by as much as 70% during summer. This difference between model estimates is partly because most models associate precipitation over Southern Africa with moisture inputs from the Indian Ocean while excluding inputs from the Atlantic Ocean. However, growing evidence suggests that the Atlantic Ocean may also contribute significant amounts of moisture to the region. This four-year (2012-2016) study investigates the isotopic composition (δ18O, δ2H and δ17O) of event-scale precipitation, the key drivers of isotope variations, and the origins of precipitation experienced in Windhoek, Namibia. Results indicate large storm-to-storm isotopic variability in δ18O (25‰), δ2H (180‰) and δ17O (13‰) over the study period. Univariate analysis showed significant correlations between event precipitation isotopes and local meteorological parameters: lifted condensation level, relative humidity (RH), precipitation amount, average wind speed, and surface and air temperature (p < 0.05). The number of significant correlations between local meteorological parameters and monthly isotopes was much lower, suggesting loss of information through data aggregation. Nonetheless, the most significant isotope driver at both event and monthly scales was RH, consistent with the semi-arid classification of the site. Multiple linear regression analysis suggested RH, precipitation amount and air temperature were the most significant local drivers of precipitation isotopes, accounting for about 50% of the variation, implying that about 50% could be attributed to source origins. HYSPLIT trajectories indicated that 78% of precipitation originated from the Indian Ocean while 21% originated from the Atlantic Ocean. Given that three of the four study years were droughts while two of the three drought years were El Niño related, our data also suggests that δ'17O-δ'18O could be a useful tool to

  15. Identifying key hospital service quality factors in online health communities.

    Science.gov (United States)

    Jung, Yuchul; Hur, Cinyoung; Jung, Dain; Kim, Minki

    2015-04-07

    classification performance still has room for improvement, but the extraction results are applicable to more detailed analysis. Further analysis of the extracted information reveals that there are differences in the details of social media-based key quality factors for hospitals according to the regions in Korea, and the patterns of change seem to accurately reflect social events (eg, influenza epidemics). These findings could be used to provide timely information to caregivers, hospital officials, and medical officials for health care policies.

  16. Glaciological parameters of disruptive event analysis

    International Nuclear Information System (INIS)

    Bull, C.

    1979-01-01

    The following disruptive events caused by ice sheets are considered: continental glaciation, erosion, loading and subsidence, deep ground water recharge, flood erosion, isostatic rebound rates, melting, and periodicity of ice ages

  17. Vitamin D status and risk of cardiovascular events: lessons learned via systematic review and meta-analysis.

    Science.gov (United States)

    Sokol, Seth I; Tsang, Pansy; Aggarwal, Vikas; Melamed, Michal L; Srinivas, V S

    2011-01-01

    Accumulating data linking hypovitaminosis D to cardiovascular (CV) events have contributed to large increases in vitamin D testing and supplementation. To evaluate the merits of this practice, we conducted a systematic review with meta-analysis providing a framework for interpreting the literature associating hypovitaminosis D with increased CV events. Prospective studies were identified by searching MEDLINE and EMBASE from inception to January 2010, restricted to English-language publications. Two authors independently extracted data and graded study quality. Pooled relative risks (RR) were calculated using a random effects model. Ten studies met criteria for review and 7 were included in the meta-analysis. The pooled RR for CV events using FAIR and GOOD quality studies was 1.67 (95% confidence interval, 1.23-2.28) during an average follow-up of 11.8 years. There was evidence of significant heterogeneity across studies (Q statistic = 16.6, P = 0.01, I² = 63.8%), which was eliminated after omitting 2 studies identified by sensitivity analysis (RR, 1.34 [1.08-1.67]; P for heterogeneity = 0.33). When restricting the analysis to GOOD quality studies (RR, 1.27 [1.04-1.56]), no significant heterogeneity was found (P = 0.602). The systematic review identified significant shortcomings in the literature, including variability in defining vitamin D status, seasonal adjustments, defining and determining CV outcomes, and the use of baseline vitamin D levels. In conclusion, a modest increased risk of CV events associated with hypovitaminosis D is tempered by significant limitations within the current literature. These findings underscore the importance of critical appraisal of the literature, looking beyond reported risk estimates before translating results into clinical practice.
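
    The pooled relative risks above come from a random-effects model. The standard DerSimonian-Laird estimator can be sketched directly from per-study log relative risks and their standard errors (the study inputs below are hypothetical, not the review's data):

```python
import math

def dersimonian_laird(log_rr, se):
    """Pool study log relative risks with a DerSimonian-Laird
    random-effects model; returns (pooled RR, 95% CI).
    Requires at least two studies."""
    w = [1.0 / s ** 2 for s in se]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))  # Cochran's Q
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in se]           # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

# Hypothetical three-study example (log RRs and standard errors)
pooled_rr, ci = dersimonian_laird([0.41, 0.18, 0.34], [0.15, 0.20, 0.10])
```

    When the between-study variance tau² estimate is zero (Q ≤ df), the model collapses to the fixed-effect inverse-variance pooling, which is why sensitivity analyses that remove heterogeneous studies can narrow the pooled interval.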

  18. Stock market returns and clinical trial results of investigational compounds: an event study analysis of large biopharmaceutical companies.

    Science.gov (United States)

    Hwang, Thomas J

    2013-01-01

    For biopharmaceutical companies, investments in research and development are risky, and the results from clinical trials are key inflection points in the process. Few studies have explored how and to what extent the public equity market values clinical trial results. Our study dataset matched announcements of clinical trial results for investigational compounds from January 2011 to May 2013 with daily stock market returns of large United States-listed pharmaceutical and biotechnology companies. Event study methodology was used to examine the relationship between clinical research events and changes in stock returns. We identified public announcements for clinical trials of 24 investigational compounds, including 16 (67%) positive and 8 (33%) negative events. The majority of announcements were for Phase 3 clinical trials (N = 13, 54%), and for oncologic (N = 7, 29%) and neurologic (N = 6, 24%) indications. The median cumulative abnormal returns on the day of the announcement were 0.8% (95% confidence interval [CI]: -2.3, 13.4%; P = 0.02) for positive events and -2.0% (95% CI: -9.1, 0.7%; P = 0.04) for negative events, with statistically significant differences from zero. In the day immediately following the announcement, firms with positive events were associated with stock price corrections, with median cumulative abnormal returns falling to 0.4% (95% CI: -3.8, 12.3%; P = 0.33). For firms with negative announcements, the median cumulative abnormal returns were -1.7% (95% CI: -9.5, 1.0%; P = 0.03), and remained significantly negative over the two day event window. The magnitude of abnormal returns did not differ statistically by indication, by trial phase, or between biotechnology and pharmaceutical firms. The release of clinical trial results is an economically significant event and has meaningful effects on market value for large biopharmaceutical companies. Stock return underperformance due to negative events is greater in magnitude and persists longer than
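
    The cumulative abnormal returns reported above follow the standard event-study recipe: estimate a market model over a pre-event window, then sum the deviations of realized returns from the model's prediction over the event window. A minimal sketch with hypothetical returns (alpha and beta would normally be estimated by OLS on the pre-event window):

```python
def abnormal_returns(stock, market, alpha, beta):
    """Market-model abnormal returns: AR_t = R_t - (alpha + beta * Rm_t)."""
    return [r - (alpha + beta * rm) for r, rm in zip(stock, market)]

def cumulative_abnormal_return(stock, market, alpha, beta):
    """Sum abnormal returns over the event window to get the CAR."""
    return sum(abnormal_returns(stock, market, alpha, beta))

# Hypothetical two-day event window around a trial-result announcement
stock_returns = [0.05, -0.01]    # firm's daily returns
market_returns = [0.01, 0.00]    # market index daily returns
car = cumulative_abnormal_return(stock_returns, market_returns,
                                 alpha=0.0, beta=1.0)
```

    Aggregating CARs across firms, as in the study above, then allows the announcement effect to be tested against the null of zero abnormal return.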

  19. A matter of definition--key elements identified in a discourse analysis of definitions of palliative care.

    Science.gov (United States)

    Pastrana, T; Jünger, S; Ostgathe, C; Elsner, F; Radbruch, L

    2008-04-01

    For more than 30 years, the term "palliative care" has been used. From the outset, the term has undergone a series of transformations in its definitions and, consequently, in its tasks and goals. There remains a lack of consensus on a definition. The aim of this article is to analyse the definitions of palliative care in the specialist literature and to identify the key elements of palliative care using discourse analysis, a qualitative methodology. The literature search focused on definitions of the terms 'palliative medicine' and 'palliative care' on the World Wide Web and in medical reference books in English and German. A total of 37 English and 26 German definitions were identified and analysed. Our study confirmed the lack of a consistent meaning of the investigated terms, reflecting ongoing discussion about the nature of the field among palliative care practitioners. Several common key elements were identified. Four main categories emerged from the discourse analysis of the definition of palliative care: target groups, structure, tasks and expertise. In addition, the theoretical principles and goals of palliative care were discussed and found to be key elements, with relief and prevention of suffering and improvement of quality of life as the main goals. The identified key elements can contribute to the definition of the concept of 'palliative care'. Our study confirms the importance of semantic and ethical influences on palliative care that should be considered in future research on semantics in different languages.

  20. CATEGORIZATION OF EVENT SEQUENCES FOR LICENSE APPLICATION

    Energy Technology Data Exchange (ETDEWEB)

    G.E. Ragan; P. Mecheret; D. Dexheimer

    2005-04-14

    The purposes of this analysis are: (1) Categorize (as Category 1, Category 2, or Beyond Category 2) internal event sequences that may occur before permanent closure of the repository at Yucca Mountain. (2) Categorize external event sequences that may occur before permanent closure of the repository at Yucca Mountain. This includes examining DBGM-1 seismic classifications and upgrading to DBGM-2, if appropriate, to ensure Beyond Category 2 categorization. (3) State the design and operational requirements that are invoked to make the categorization assignments valid. (4) Indicate the amount of material put at risk by Category 1 and Category 2 event sequences. (5) Estimate frequencies of Category 1 event sequences at the maximum capacity and receipt rate of the repository. (6) Distinguish occurrences associated with normal operations from event sequences. It is beyond the scope of the analysis to propose design requirements that may be required to control radiological exposure associated with normal operations. (7) Provide a convenient compilation of the results of the analysis in tabular form. The results of this analysis are used as inputs to the consequence analyses in an iterative design process that is depicted in Figure 1. Categorization of event sequences for permanent retrieval of waste from the repository is beyond the scope of this analysis. Cleanup activities that take place after an event sequence and other responses to abnormal events are also beyond the scope of the analysis.

  2. Working group of experts on rare events in human error analysis and quantification

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1977-01-01

    In dealing with the reference problem of rare events in nuclear power plants, the group has concerned itself with the man-machine system and, in particular, with human error analysis and quantification. The Group was requested to review methods of human reliability prediction, to evaluate the extent to which such analyses can be formalized and to establish criteria to be met by task conditions and system design which would permit a systematic, formal analysis. Recommendations are given on the Fessenheim safety system

  3. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), in which detailed data on the appliances used in a house are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point at which a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method comprises two processes: frequency feature analysis and a support vector machine. The experimental results indicate that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM reveals more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
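
The edge-detection step can be illustrated with a minimal threshold-on-differences sketch. The signal and the 500 W threshold below are invented for illustration; the paper's actual algorithm additionally analyzes the turn-on/turn-off transient waveform shapes and feeds frequency features to an SVM.

```python
import numpy as np

# Aggregate power signal (watts): an appliance switches on, then off.
signal = np.array([100.0] * 5 + [1600.0] * 5 + [100.0] * 5)

# Flag samples where the level shifts by more than the threshold.
diff = np.diff(signal)
threshold = 500.0                         # tuning is appliance-dependent
on_edges = np.flatnonzero(diff > threshold) + 1    # indices of turn-on events
off_edges = np.flatnonzero(diff < -threshold) + 1  # indices of turn-off events
```

Each detected edge index would then anchor a window of samples from which transient features are extracted for classification.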

  4. Analysis of thermal fatigue events in light water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Okuda, Yasunori [Institute of Nuclear Safety System Inc., Seika, Kyoto (Japan)

    2000-09-01

    Thermal fatigue events, which may force shutdown of nuclear power stations through through-wall cracks in pipes of the RCPB (Reactor Coolant Pressure Boundary), are reported by licensees in foreign countries as well as in Japan. In this paper, thermal fatigue events reported in the anomaly reports of light water reactors inside and outside Japan are investigated. As a result, it is clarified that thermal fatigue events can be classified into seven patterns by their characteristics, and that the trend of occurrence in PWRs (Pressurized Water Reactors) correlates more strongly with operating hours than that in BWRs (Boiling Water Reactors). It is also concluded that precise identification of the locations where thermal fatigue occurs, together with monitoring of those locations, is important for preventing thermal fatigue events caused by aging or improper modification. (author)

  5. Probabilistic risk assessment course documentation. Volume 4. System reliability and analysis techniques sessions B/C - event trees/fault trees

    International Nuclear Information System (INIS)

    Haasl, D.; Young, J.

    1985-08-01

    This course will employ a combination of lecture material and practical problem solving in order to develop competence in and understanding of the principles and techniques of event tree and fault tree analysis. The role of these techniques in the overall context of PRA will be described. The emphasis of this course will be on the basic, traditional methods of event tree and fault tree analysis
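
The basic fault-tree technique the course covers reduces, for independent basic events, to combining probabilities through AND/OR gates. A minimal sketch, with illustrative probabilities (not values from the course material):

```python
# Top-event probability for independent basic events.
def and_gate(*ps):
    """AND gate: the output fails only if every input fails."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def or_gate(*ps):
    """OR gate: the output fails if any input fails."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical system: two redundant pumps and a common valve.
pump_a, pump_b, valve = 1e-3, 1e-3, 5e-4

# Top event: both pumps fail, or the valve fails.
top = or_gate(and_gate(pump_a, pump_b), valve)
```

Note how the redundant pair contributes only 1e-6 to the top-event probability, so the single valve dominates — the kind of insight fault-tree analysis is meant to surface.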

  6. The Influence of Sponsor-Event Congruence in Sponsorship of Music Festivals

    Directory of Open Access Journals (Sweden)

    Penny Hutabarat

    2014-05-01

    Full Text Available This paper examines the influence of sponsor-event congruence on brand image, attitudes toward the brand, and purchase intention. After reviewing the literature and formulating hypotheses, data were gathered by distributing a questionnaire to 155 audience members at the Java Jazz Music Festival, first through convenience sampling and then through snowball sampling. The data were analyzed with structural equation modeling (SEM). The results show that sponsor-event congruence has a positive impact on brand image and on attitudes toward the sponsor's brand. Brand image also has a positive impact on purchase intention; attitudes toward the brand, by contrast, do not. Given these results, congruence between the event and the sponsor plays a significant role in sponsorship effectiveness: it strengthens brand image, fosters favorable attitudes toward the brand, and supports the success of marketing communication programs, particularly sponsorship. In addition, image transfer increases with the congruence (fit) between sponsor and event, which in turn drives the intention to buy the sponsor's products or services (purchase intention). In conclusion, sponsor-event congruence affects consumer responses to sponsorship at the cognitive, affective, and behavioral levels.

  7. Analysis of mechanics of verbal manipulation with key words of social vocabulary exemplified in journalistic article

    Directory of Open Access Journals (Sweden)

    Наталья Александровна Бубнова

    2012-03-01

    Full Text Available The article deals with the analysis of mechanism of speech manipulation on readers' consciousness by means of socially marked key words, forming four concept groups: power, nation, wealth, poverty (on the material of journalistic article.

  8. Intra-Hospital Transport of Patients on Non-Invasive Ventilation: Review, Analysis, and Key Practical Recommendations by the International NIV Committee

    Directory of Open Access Journals (Sweden)

    Annia Schreiber

    2017-12-01

    Full Text Available Intra-hospital transport is often needed for diagnostic and therapeutic procedures that cannot be performed at the bedside. However, moving patients from the safe environment of an Intensive Care Unit (ICU) can lead to a variety of complications and adverse events, and the risk is even higher in ventilated patients. This review is intended as a guide on how to prevent and avoid these adverse events during intra-hospital transport of patients on non-invasive ventilation (NIV). Greater attention should be paid to NIV indications and the selection of the patients to be transported. Detailed planning, preparation, and communication between the ward of origin and the destination site, appropriate equipment, skilled staff, and continuous monitoring are the key determinants of success in transporting critically ill patients on NIV. These points are discussed and analyzed in detail.

  9. Impacts of Extreme Events on Human Health. Chapter 4

    Science.gov (United States)

    Bell, Jesse E.; Herring, Stephanie C.; Jantarasami, Lesley; Adrianopoli, Carl; Benedict, Kaitlin; Conlon, Kathryn; Escobar, Vanessa; Hess, Jeremy; Luvall, Jeffrey; Garcia-Pando, Carlos Perez; hide

    2016-01-01

    Increased Exposure to Extreme Events. Key Finding 1: Health impacts associated with climate-related changes in exposure to extreme events include death, injury, or illness; exacerbation of underlying medical conditions; and adverse effects on mental health [High Confidence]. Climate change will increase exposure risk in some regions of the United States due to projected increases in the frequency and/or intensity of drought, wildfires, and flooding related to extreme precipitation and hurricanes [Medium Confidence]. Disruption of Essential Infrastructure. Key Finding 2: Many types of extreme events related to climate change cause disruption of infrastructure, including power, water, transportation, and communication systems, that are essential to maintaining access to health care and emergency response services and safeguarding human health [High Confidence]. Vulnerability to Coastal Flooding. Key Finding 3: Coastal populations with greater vulnerability to health impacts from coastal flooding include persons with disabilities or other access and functional needs, certain populations of color, older adults, pregnant women and children, low-income populations, and some occupational groups [High Confidence]. Climate change will increase exposure risk to coastal flooding due to increases in extreme precipitation and in hurricane intensity and rainfall rates, as well as sea level rise and the resulting increases in storm surge.

  10. Is thermodynamics of the universe bounded by event horizon a Bekenstein system?

    International Nuclear Information System (INIS)

    Chakraborty, Subenoy

    2012-01-01

    In this brief communication, we have studied the validity of the first law of thermodynamics for the universe bounded by event horizon with two examples. The key point is the appropriate choice of the temperature on the event horizon. Finally, we have concluded that universe bounded by the event horizon may be a Bekenstein system and Einstein's equations and the first law of thermodynamics on the event horizons are equivalent.

  12. The fuzzy set theory application to the analysis of accident progression event trees with phenomenological uncertainty issues

    International Nuclear Information System (INIS)

    Chun, Moon-Hyun; Ahn, Kwang-Il

    1991-01-01

    Fuzzy set theory provides a formal framework for dealing with the imprecision and vagueness inherent in expert judgement, and therefore it can be used for more effective analysis of accident progression in PRA, where expert opinion is a major means of quantifying some event probabilities and uncertainties. In this paper, an example application of fuzzy set theory is first made to a simple portion of a given accident progression event tree with typical qualitative fuzzy input data; computational algorithms suitable for applying fuzzy set theory to accident progression event tree analysis are thereby identified and illustrated with example applications. The procedure used in the simple example is then extended to extremely complex accident progression event trees with a number of phenomenological uncertainty issues, i.e., the typical plant damage state 'SEC' of the Zion Nuclear Power Plant risk assessment. The results show that the fuzzy averages of the fuzzy outcomes are very close to the mean values obtained by current methods. The main purpose of this paper is to provide a formal procedure for applying fuzzy set theory to accident progression event trees with imprecise, qualitative branch probabilities and/or a number of phenomenological uncertainty issues. (author)
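
The flavor of such fuzzy branch arithmetic can be illustrated with triangular fuzzy numbers. This is a minimal sketch under two assumptions that need not match the paper's exact algorithm: triangular membership functions for the qualitative judgements, and endpoint-product multiplication (a common approximation for combining branches along a path). The numbers are invented expert judgements.

```python
def tri_mul(a, b):
    """Product of two triangular fuzzy numbers (lo, mode, hi),
    approximated as the triangle through the endpoint products."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def defuzzify(t):
    """Centroid ('fuzzy average') of a triangular fuzzy number."""
    return sum(t) / 3.0

branch1 = (0.1, 0.2, 0.3)   # "unlikely" phenomenological branch
branch2 = (0.5, 0.7, 0.9)   # "likely" subsequent branch

path = tri_mul(branch1, branch2)   # fuzzy probability of the path
crisp = defuzzify(path)            # comparable to a point-estimate mean
```

The comparison reported in the abstract is between such fuzzy averages and the mean values from conventional point-estimate methods.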

  13. Multivariate hydrological frequency analysis for extreme events using Archimedean copula. Case study: Lower Tunjuelo River basin (Colombia)

    Science.gov (United States)

    Gómez, Wilmar

    2017-04-01

    By analyzing the spatial and temporal variability of extreme precipitation events we can prevent or reduce the associated threat and risk. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which cannot be assumed independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of the marginals through copulas. This paper presents a general framework for bivariate and multivariate frequency analysis using Archimedean copulas for extreme hydroclimatological events such as severe storms. The analysis was conducted for precipitation events in the lower Tunjuelo River basin in Colombia. The results show that in a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, yielding more accurate and reliable information on design storms and associated risks. The use of copulas greatly simplifies the study of multivariate distributions and introduces the concept of the joint return period, used to properly represent the needs of hydrological design in frequency analysis.
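
The joint-return-period idea can be sketched with the Gumbel copula, one Archimedean family often used for positively dependent hydrological extremes. The marginal non-exceedance probabilities, the dependence parameter theta, and the mean interarrival time below are illustrative values, not quantities fitted to the Tunjuelo data.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel (Archimedean) copula C(u, v); theta >= 1, theta = 1 is independence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

u = 0.99      # marginal non-exceedance prob. of the intensity threshold
v = 0.98      # marginal non-exceedance prob. of the duration threshold
theta = 2.0   # dependence parameter
mu = 1.0      # mean interarrival time between events, in years

# "AND" joint return period: both thresholds exceeded in the same event.
p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
t_and = mu / p_and
```

With positive dependence (theta > 1) the joint exceedance probability is larger than under independence, so the AND return period is shorter — exactly the correction that matters for design storms.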

  14. Event-shape analysis: Sequential versus simultaneous multifragment emission

    International Nuclear Information System (INIS)

    Cebra, D.A.; Howden, S.; Karn, J.; Nadasen, A.; Ogilvie, C.A.; Vander Molen, A.; Westfall, G.D.; Wilson, W.K.; Winfield, J.S.; Norbeck, E.

    1990-01-01

    The Michigan State University 4π array has been used to select central-impact-parameter events from the reaction ⁴⁰Ar + ⁵¹V at incident energies from 35 to 85 MeV/nucleon. The event shape in momentum space is an observable which is shown to be sensitive to the dynamics of the fragmentation process. A comparison of the experimental event-shape distribution to sequential- and simultaneous-decay predictions suggests that a transition in the breakup process may have occurred. At 35 MeV/nucleon, a sequential-decay simulation reproduces the data. For the higher energies, the experimental distributions fall between the two contrasting predictions

  15. THE CCAUV.A-K3 KEY COMPARISON OF PRESSURE RECIPROCITY CALIBRATION OF LS2P MICROPHONES: RESULTS AND ANALYSIS

    DEFF Research Database (Denmark)

    Cutanda Henríquez, Vicente; Rasmussen, Knud; Nielsen, Lars

    2006-01-01

    The CCAUV.A-K3 Key Comparison has involved 15 countries organized in two loops with two common laboratories, CENAM and DPLA. The measurements took place in 2003. This is the first CCAUV key comparison organized with more than one loop, and therefore the analysis of the results required a more ela...

  16. Community Response and Engagement During Extreme Water Events in Saskatchewan, Canada and Queensland, Australia

    Science.gov (United States)

    McMartin, Dena W.; Sammel, Alison J.; Arbuthnott, Katherine

    2018-01-01

    Technology alone cannot address the challenges of how societies, communities, and individuals understand water accessibility, water management, and water consumption, particularly under extreme conditions like floods and droughts. At the community level, people are increasingly aware of challenges related to responses to, and impacts of, extreme water events. This research begins with an assessment of the social and political capacities of communities in two Commonwealth jurisdictions, Queensland, Australia and Saskatchewan, Canada, in response to major flooding events. It then reviews how such capacities affect community engagement to address and mitigate the risks associated with extreme water events, and provides evidence of key gaps in skills, understanding, and agency for addressing impacts at the community level. Secondary data were collected using template analysis to elucidate challenges associated with education (formal and informal), social and political capacity, community ability to respond appropriately, and formal government responses to extreme water events in the two jurisdictions. The results indicate that enhanced community engagement, alongside elements of an empowerment model, can provide avenues for identifying and addressing community vulnerability to the negative impacts of flood and drought.

  17. The power of event-driven analytics in Large Scale Data Processing

    CERN Multimedia

    CERN. Geneva; Marques, Paulo

    2011-01-01

    FeedZai is a software company specialized in creating high-throughput, low-latency data processing solutions. FeedZai develops a product called "FeedZai Pulse" for continuous event-driven analytics that makes application development easier for end users. It automatically calculates key performance indicators and baselines, showing how current performance differs from previous history, creating timely business intelligence updated to the second. The tool does predictive analytics and trend analysis, displaying data on real-time web-based graphics. In 2010 FeedZai won the European EBN Smart Entrepreneurship Competition, in the Digital Models category, being considered one of the "top-20 smart companies in Europe". The main objective of this seminar/workshop is to explore the topic of large-scale data processing using Complex Event Processing and, in particular, the possible uses of Pulse in...

  18. 365 MAPPING MALARIA CASE EVENT AND FACTORS OF ...

    African Journals Online (AJOL)

    Osondu

    Key words: Malaria case event; prevention; vulnerability; GIS; Nigeria. Introduction. The mapping of ... Ethiopian Journal of Environmental Studies and Management Vol. 6 No.4 2013 ... review articles Tanser et al., (2000), indicate that. Satellite ...

  19. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    Science.gov (United States)

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may mislead about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials including the recent RV144 trial exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  20. Analysis and modeling of a hail event consequences on a building portfolio

    Science.gov (United States)

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

    North-West Switzerland was affected by a severe hail storm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 cantons (out of 26) with a public insurance scheme. The aim of this project is to use the collected insurance data to better understand and estimate the risk of such events. In a first step, a simple hail event simulator, developed for a previous hail episode, is modified: the geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians instead of 1D Gaussians on profiles, as was the case in the previous version. The tool is then tested on this new event to establish its ability to give a fast damage estimate based on the radar image and on building values and locations. In a further step, the geometric properties are used to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims to study the relation between building orientation and damage rate. Indeed, the orientation of the roof is expected to influence the aging of the material by controlling the frequency and amplitude of freeze-thaw cycles, thereby changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures; this information is then compared with insurance claims. A last part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the

  1. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs

  2. Analysis of double random phase encryption from a key-space perspective

    Science.gov (United States)

    Monaghan, David S.; Situ, Guohai; Ryle, James; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    The main advantage of the double random phase encryption technique is its physical implementation; however, to analyse its behaviour we perform the encryption/decryption numerically. A typically strong encryption scheme will have an extremely large key-space, which makes the probable success of any brute-force attack on that algorithm minuscule. Traditionally, designers of optical image encryption systems only demonstrate how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. We analyse this algorithm from a key-space perspective. The key-space of an encryption algorithm can be defined as the set of possible keys that can be used to encode data using that algorithm. For a range of problem instances we plot the distribution of decryption errors in the key-space, indicating the lack of feasibility of a simple brute-force attack.
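
The numerical encryption/decryption the authors describe can be sketched in one dimension: a random phase mask in the input plane, a second mask in the Fourier plane, and conjugate masks to decrypt. The signal length and keys below are arbitrary illustrations; a 2-D image version replaces `fft`/`ifft` with `fft2`/`ifft2`.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
img = rng.random(n)                          # signal ("image") to encrypt

phase1 = np.exp(2j * np.pi * rng.random(n))  # input-plane random phase key
phase2 = np.exp(2j * np.pi * rng.random(n))  # Fourier-plane random phase key

# Encrypt: mask, Fourier transform, mask again, inverse transform.
encrypted = np.fft.ifft(np.fft.fft(img * phase1) * phase2)

# Decrypt with the correct keys (unit-modulus masks are undone by conjugation).
decrypted = np.fft.ifft(np.fft.fft(encrypted) * np.conj(phase2)) * np.conj(phase1)
recovered = decrypted.real

# A wrong Fourier-plane key leaves the output noise-like.
wrong_key = np.exp(2j * np.pi * rng.random(n))
garbled = (np.fft.ifft(np.fft.fft(encrypted) * np.conj(wrong_key))
           * np.conj(phase1)).real
```

The key-space argument concerns how many distinct `phase2`-style masks exist once the phases are discretized, and how decryption error is distributed over that set.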

  3. Two damaging hydrogeological events in Calabria, September 2000 and November 2015. Comparative analysis of causes and effects

    Science.gov (United States)

    Petrucci, Olga; Caloiero, Tommaso; Aurora Pasqua, Angela

    2016-04-01

    Each year, especially during the winter season, episodes of intense rain affect Calabria, the southernmost Italian peninsular region, triggering flash floods and mass movements that cause damage and fatalities. This work presents a comparative analysis of two events that affected the southeast sector of the region in 2000 and 2015, respectively. The event that occurred on 9th-10th September 2000 is known in Italy as the Soverato event, after the name of the municipality where it reached its highest damage severity. In the Soverato area, more than 200 mm of rain falling in 24 hours caused a disastrous flood that swept away a campsite at about 4 a.m., killing 13 people and injuring 45. The rain also affected a larger area, causing damage in 89 (out of 409) municipalities of the region. Flooding was the most common process, damaging housing and trade; landslides mostly affected the road network, housing and cultivations. The more recent event affected the same regional sector between 30th October and 2nd November 2015. The daily rain recorded at some of the area's rain gauges almost reached 400 mm. Of the 409 municipalities of Calabria, 109 suffered damage. The most frequent processes were flash floods and landslides. The most heavily damaged element was the road network: the representative picture of the event is a railway bridge destroyed by the river flow. Housing was damaged too, and 486 people were temporarily evacuated from their homes. The event also caused one victim, killed by a flood. The event-centred study approach aims to highlight differences and similarities in both the causes and the effects of two events separated by 15 years. The comparative analysis focuses on three main aspects: the intensity of the triggering rain, the modifications of urbanised areas, and the evolution of emergency management. The comparative analysis of rain is made by comparing the return period of both daily and

  4. Events and mega events: leisure and business in tourism

    Directory of Open Access Journals (Sweden)

    Ricardo Alexandre Paiva

    2015-12-01

    Full Text Available The promotion of events and mega events mobilizes, whether in concert or not, both leisure and business practices, which the tourism industry captures as a stimulus for the reproduction of capitalism through the range of other activities they generate (primary, secondary, and tertiary), placing architecture and the city as protagonists in contemporary urban development. In this context, the article analyzes how events and mega events are articulated with the provision of architecture and urban infrastructure, as well as with the construction of the tourist image of places, motivated by leisure and business activities. The methodological procedures are theoretical and exploratory in character, with multidisciplinary intent. The discussion first traces, in historical perspective, the concepts of leisure and business as activities that involve movement or travel; it then delimits similarities and differences between event tourism and business tourism, before analyzing the distinctions between events and mega events, highlighting the complexity and role of mega events as a major symptom of globalization; finally, it presents the spatial repercussions for architecture and the city of hosting (mega) events, as well as their impact on the city's image. In synthesis, the spatial repercussions of business tourism, events, and mega events manifest at various scales and with different levels of complexity, revealing the strengths and/or weaknesses of places. Urban planning, architecture, and urbanism are important fields of knowledge and spatial intervention for providing the infrastructure and the urban and architectural facilities appropriate for events, and they should be sensitive to the demands of tourists and host communities alike.

  5. Event-by-event particle multiplicity fluctuations in Pb-Pb collisions with ALICE

    Energy Technology Data Exchange (ETDEWEB)

    Arslandok, Mesut [Institut fuer Kernphysik, Goethe-Universitaet Frankfurt (Germany); Collaboration: ALICE-Collaboration

    2014-07-01

The study of event-by-event fluctuations of identified hadrons may reveal the degrees of freedom of the strongly interacting matter created in heavy-ion collisions. Particle identification based on the measurement of the specific ionization energy loss dE/dx works well on a statistical basis but suffers from ambiguities when applied on the event-by-event level. A novel experimental technique called the "Identity Method" was recently proposed to overcome such limitations. The method follows a probabilistic approach using the inclusive dE/dx distributions measured in the ALICE TPC, and determines the moments of the multiplicity distributions by an unfolding procedure. In this contribution, the status of an event-by-event fluctuation analysis that applies the Identity Method to Pb-Pb data from ALICE is presented.

  6. Detector decoy quantum key distribution

    International Nuclear Information System (INIS)

    Moroder, Tobias; Luetkenhaus, Norbert; Curty, Marcos

    2009-01-01

    Photon number resolving detectors can enhance the performance of many practical quantum cryptographic setups. In this paper, we employ a simple method to estimate the statistics provided by such a photon number resolving detector using only a threshold detector together with a variable attenuator. This idea is similar in spirit to that of the decoy state technique, and is especially suited to those scenarios where only a few parameters of the photon number statistics of the incoming signals have to be estimated. As an illustration of the potential applicability of the method in quantum communication protocols, we use it to prove security of an entanglement-based quantum key distribution scheme with an untrusted source without the need for a squash model and by solely using this extra idea. In this sense, this detector decoy method can be seen as a different conceptual approach to adapt a single-photon security proof to its physical, full optical implementation. We show that in this scenario, the legitimate users can now even discard the double click events from the raw key data without compromising the security of the scheme, and we present simulations on the performance of the BB84 and the 6-state quantum key distribution protocols.

  7. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    Science.gov (United States)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re Inc. extreme events database and the Uppsala Conflict Data Program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and may help to identify hot-spot regions for potential climate-triggered violent social conflicts.

  8. OAE: The Ontology of Adverse Events.

    Science.gov (United States)

    He, Yongqun; Sarntivijai, Sirarat; Lin, Yu; Xiang, Zuoshuang; Guo, Abra; Zhang, Shelley; Jagannathan, Desikan; Toldo, Luca; Tao, Cui; Smith, Barry

    2014-01-01

    A medical intervention is a medical procedure or application intended to relieve or prevent illness or injury. Examples of medical interventions include vaccination and drug administration. After a medical intervention, adverse events (AEs) may occur which lie outside the intended consequences of the intervention. The representation and analysis of AEs are critical to the improvement of public health. The Ontology of Adverse Events (OAE), previously named Adverse Event Ontology (AEO), is a community-driven ontology developed to standardize and integrate data relating to AEs arising subsequent to medical interventions, as well as to support computer-assisted reasoning. OAE has over 3,000 terms with unique identifiers, including terms imported from existing ontologies and more than 1,800 OAE-specific terms. In OAE, the term 'adverse event' denotes a pathological bodily process in a patient that occurs after a medical intervention. Causal adverse events are defined by OAE as those events that are causal consequences of a medical intervention. OAE represents various adverse events based on patient anatomic regions and clinical outcomes, including symptoms, signs, and abnormal processes. OAE has been used in the analysis of several different sorts of vaccine and drug adverse event data. For example, using the data extracted from the Vaccine Adverse Event Reporting System (VAERS), OAE was used to analyse vaccine adverse events associated with the administrations of different types of influenza vaccines. OAE has also been used to represent and classify the vaccine adverse events cited in package inserts of FDA-licensed human vaccines in the USA. OAE is a biomedical ontology that logically defines and classifies various adverse events occurring after medical interventions. OAE has successfully been applied in several adverse event studies. The OAE ontological framework provides a platform for systematic representation and analysis of adverse events and of the factors (e

  9. #JeSuisCharlie: Towards a Multi-Method Study of Hybrid Media Events

    Directory of Open Access Journals (Sweden)

    Johanna Sumiala

    2016-10-01

Full Text Available This article suggests a new methodological model for the study of hybrid media events with global appeal. This model, developed in the project on the 2015 Charlie Hebdo attacks in Paris, was created specifically for researching digital media—and in particular, Twitter. The article is structured as follows. Firstly, the methodological scope is discussed against the theoretical context, e.g. the theory of media events. In the theoretical discussion, special emphasis is given to (i) disruptive, upsetting, or disintegrative media events and hybrid media events and (ii) the conditions of today’s heterogeneous and globalised media communication landscape. Secondly, the article introduces a multi-method approach developed for the analysis of hybrid media events. In this model, computational social science—namely, automated content analysis (ACA) and social network analytics (SNA)—are combined with a qualitative approach—specifically, digital ethnography. The article outlines three key phases for research in which the interplay between quantitative and qualitative approaches is played out. In the first phase, preliminary digital ethnography is applied to provide the outline of the event. In the second phase, quantitative social network analytics are applied to construct the digital field for research. In this phase, it is necessary to map (a) what is circulating on the websites and (b) where this circulation takes place. The third and final phase applies a qualitative approach and digital ethnography to provide a more nuanced, in-depth interpretation of what (substance/content) is circulating and how this material connects with the ‘where’ in the digital landscape, hence constituting links and connections in the hybrid media landscape. In conclusion, the article reflects on how this multi-method approach contributes to understanding the workings of today’s hybrid media events: how they create and maintain symbolic battles over certain imagined

  10. The economic burden of nurse-sensitive adverse events in 22 medical-surgical units: retrospective and matching analysis.

    Science.gov (United States)

    Tchouaket, Eric; Dubois, Carl-Ardy; D'Amour, Danielle

    2017-07-01

The aim of this study was to assess the economic burden of nurse-sensitive adverse events in 22 acute-care units in Quebec by estimating excess hospital-related costs and calculating the resulting additional hospital days. Recent changes in the worldwide economic and financial contexts have made the cost of patient safety a topical issue. Yet our knowledge about the economic burden of the safety of nursing care is quite limited in Canada in general and Quebec in particular. Retrospective analysis of the charts of 2699 patients hospitalized between July 2008 and August 2009 for at least 2 days of 30-day periods in 22 medical-surgical units in 11 hospitals in Quebec. Data were collected from September 2009 to August 2010. The nurse-sensitive adverse events analysed were pressure ulcers, falls, medication administration errors, pneumonia and urinary tract infections. Descriptive statistics identified the number of cases of each nurse-sensitive adverse event. A literature analysis was used to estimate the excess median hospital-related costs of treatment associated with these nurse-sensitive adverse events. Costs were calculated in 2014 Canadian dollars. Additional hospital days were estimated by comparing the lengths of stay of patients with nurse-sensitive adverse events with those of similar patients without them. This study found that the five adverse events considered nurse-sensitive caused nearly 1300 additional hospital days for 166 patients and generated more than CAD 600,000 in excess treatment costs. The results present the financial consequences of nurse-sensitive adverse events. Government should invest in prevention and in improvements to care quality and patient safety. Managers need to strengthen safety processes in their facilities and nurses should take greater precautions. © 2017 John Wiley & Sons Ltd.

  11. Statistical analysis of hydrodynamic cavitation events

    Science.gov (United States)

    Gimenez, G.; Sommer, R.

    1980-10-01

The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency follows a normal distribution whose parameters do not evolve over time.

  12. Material control study: a directed graph and fault tree procedure for adversary event set generation

    International Nuclear Information System (INIS)

    Lambert, H.E.; Lim, J.J.; Gilman, F.M.

    1978-01-01

In work for the United States Nuclear Regulatory Commission, Lawrence Livermore Laboratory is developing an assessment procedure to evaluate the effectiveness of a potential nuclear facility licensee's material control (MC) system. The purpose of an MC system is to prevent the theft of special nuclear material such as plutonium and highly enriched uranium. The key element of the assessment procedure is the generation and analysis of adversary event sets by a directed-graph and fault-tree methodology.

  13. Reducing the occurrence of plant events through improved human performance

    International Nuclear Information System (INIS)

    Ross, T.; Burkhart, A.D.

    1993-01-01

During a routine control room surveillance, the reactor operator is distracted by an alarming secondary annunciator and a telephone call. When the reactor operator resumes the surveillance, he inadvertently performs the procedural steps out of order. This causes a reportable nuclear event. How can procedure-related human performance problems such as this be prevented? The question is vitally important for the nuclear industry. The U.S. Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data observed, "With the perceived reduction in the number of events caused by equipment failures, INPO and other industry groups and human performance experts agree that a key to continued improvement in plant performance and safety is improved human performance." In fact, "more than 50% of the reportable events occurring at nuclear power plants involve human error." Prevention (or correction) of a human performance problem is normally based on properly balancing the following three factors: (1) supervisory involvement; (2) personnel training; and (3) procedures. The nuclear industry is implementing a formula known as ACME, which better balances supervisory involvement, personnel training, and procedures. Webster's New World Dictionary defines acme as the highest point, the peak. ACME human performance is the goal: Adherence to and use of procedures; self-Checking; Management involvement; and Event investigations.

  14. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We have indexed about 15 billion real data events and about 25 billion simulated events thus far, and have designed the system to accommodate future data, with expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  15. Analysis of the Steam Generator Tubes Rupture Initiating Event

    International Nuclear Information System (INIS)

    Trillo, A.; Minguez, E.; Munoz, R.; Melendez, E.; Sanchez-Perea, M.; Izquierd, J.M.

    1998-01-01

In PSA studies, Event Tree-Fault Tree techniques are used to analyse the consequences associated with the evolution of an initiating event. The Event Tree is built in the sequence identification stage, following the expected behaviour of the plant in a qualitative way. Computer simulation of the sequences is performed mainly to determine the time allowed for operator actions, and does not play a central role in ET validation. The simulation of sequence evolution can instead be performed using standard tools, helping the analyst obtain a more realistic ET. Long-existing methods and tools can be used to automate the construction of the event tree associated with a given initiator. These methods automatically construct the ET by simulating the plant behaviour following the initiator, allowing some of the systems to fail during the sequence evolution. Then the sequences with and without the failure are followed. The outcome of all this is a Dynamic Event Tree. The work described here is the application of one such method to the particular case of the SGTR initiating event. The DYLAM scheduler, designed at the Ispra (Italy) JRC of the European Communities, is used to automatically drive the simulation of all the sequences constituting the Event Tree. As with the static Event Tree, each time a system is demanded, two branches are opened: one corresponding to the success and the other to the failure of the system. Both branches are followed by the plant simulator until a new system is demanded, and the process repeats. The plant simulation modelling allows the treatment of degraded sequences that enter the severe accident domain as well as of success sequences in which long-term cooling is started. (Author)

  16. Keyed shear joints

    DEFF Research Database (Denmark)

    Hansen, Klaus

This report gives a summary of the present information on the behaviour of vertical keyed shear joints in large panel structures. An attempt is made to outline the implications that this information might have for the analysis and design of a complete wall. The publication also gives a short

  17. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Dale [Los Alamos National Laboratory; Selby, Neil [AWE Blacknest

    2012-08-14

Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope (β = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
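The combination of single-phenomenology tests that the abstract attributes to Fisher's method can be sketched in a few lines; the p-values below are illustrative assumptions, not values from the study:

```python
import math

def fisher_combined_pvalue(pvalues):
    """Fisher's method for combining k independent p-values.

    The statistic X = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under the joint H0.
    """
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    # The chi-square survival function with even dof 2k has a closed form:
    # P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

# Combining hypothetical Ms:mb and depth-screening p-values:
print(fisher_combined_pvalue([0.04, 0.10]))
```

Tippett's test would instead reject when the minimum of the k p-values falls below 1 - (1 - α)^(1/k).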

  18. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    Science.gov (United States)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

Social media are increasingly becoming an additional source of information for event-based early warning systems, in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual twitter feeds for emergency events. Specifically, we consider tsunami and earthquakes, as one possible originating cause of tsunami, and propose to analyze twitter messages for capturing testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunami, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets and thus being able to observe correlations of events across languages. One way to overcome this deficit is to identify geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points of interest such as the forecast points of the tsunami scenario. We also intend to use twitter analysis for situation picture
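A keyword-based relevance filter of the general kind described above can be sketched as follows. This is a deliberately simplified stand-in for the project's language-specific classifiers; the keyword list and example tweets are invented for illustration:

```python
def make_relevance_filter(keywords, threshold=1):
    """Return a predicate that flags a tweet as event-related when it
    contains at least `threshold` of the given crisis keywords."""
    keys = {k.lower() for k in keywords}

    def is_relevant(tweet):
        words = set(tweet.lower().split())
        return len(words & keys) >= threshold

    return is_relevant

# Illustrative English keyword list; a real system would use curated,
# language-specific term lists for Turkish, Greek and Romanian as well.
quake_filter = make_relevance_filter({"earthquake", "tsunami", "aftershock", "magnitude"})
print(quake_filter("Strong earthquake felt in Izmir, magnitude 5.9"))  # True
print(quake_filter("Great concert tonight!"))                          # False
```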

  19. Subjective Well-Being and Adaptation to Life Events: A Meta-Analysis on Differences Between Cognitive and Affective Well-Being

    Science.gov (United States)

    Luhmann, Maike; Hofmann, Wilhelm; Eid, Michael; Lucas, Richard E.

    2012-01-01

Previous research has shown that major life events can have short- and long-term effects on subjective well-being (SWB). The present meta-analysis examines (a) whether life events have different effects on cognitive and affective well-being and (b) how the rate of adaptation varies across different life events. Longitudinal data from 188 publications (313 samples, N = 65,911) were integrated to describe the reaction and adaptation to four family events (marriage, divorce, bereavement, childbirth) and four work events (unemployment, reemployment, retirement, relocation/migration). The findings show that life events have very different effects on affective and cognitive well-being, and that for most events the effects of life events on cognitive well-being are stronger and more consistent across samples. Different life events differ in their effects on SWB, but these effects are not a function of the alleged desirability of events. The results are discussed with respect to their theoretical implications, and recommendations for future studies on adaptation are given. PMID:22059843

  20. Preliminary analysis of beam trip and beam jump events in an ADS prototype

    International Nuclear Information System (INIS)

    D'Angelo, A.; Bianchini, G.; Carta, M.

    2001-01-01

A core dynamics analysis of some typical current transient events has been carried out for an 80 MW energy amplifier prototype (EAP) fuelled by mixed oxides and cooled by lead-bismuth. Fuel and coolant temperature trends for recovered beam trip and beam jump events have been preliminarily investigated. Beam trip results show that the drop in temperature of the core outlet coolant would be reduced a fair amount if the beam intensity could be recovered within a few seconds. Due to the low power density in the EAP fuel, the beam jump from 50% of the nominal power evolves benignly. The worst conceivable current transient, a beam jump with a cold reactor, mainly depends on the coolant flow conditions. In the EAP design, the primary loop coolant flow is assured by natural convection and is enhanced by a particular system of cover gas injection into the bottom part of the riser. If this system of coolant flow enhancement is assumed to be in operation, even the beam jump with a cold reactor evolves without severe consequences. (authors)

  1. Applicability of PRISM PRA Methodology to the Level II Probabilistic Safety Analysis of KALIMER-600 (I) (Core Damage Event Tree Analysis Part)

    International Nuclear Information System (INIS)

    Park, S. Y.; Kim, T. W.; Ha, K. S.; Lee, B. Y.

    2009-03-01

The Korea Atomic Energy Research Institute (KAERI) has been developing liquid metal reactor (LMR) design technologies under a National Nuclear R and D Program. Nevertheless, there is no domestic experience of PSA for a fast reactor with metal fuel. Therefore, the objective of this study is to establish risk assessment methodologies for the reference design of the KALIMER-600 reactor. The applicability of the PSA of the PRISM plant to the KALIMER-600 has been studied. The study is confined to a core damage event tree analysis, which is part of a level 2 PSA. Assuming that the accident types that can develop from a level 1 PSA are the same as in the PRISM PRA, core damage categories are defined and core damage event trees are developed for the KALIMER-600 reactor. Fission product release fractions for the core damage categories and branch probabilities of the core damage event trees are provisionally taken from the PRISM PRA. Plant-specific data will be used during the detailed analysis

  2. Changes in coral reef communities among the Florida Keys, 1996-2003

    Science.gov (United States)

    Somerfield, P. J.; Jaap, W. C.; Clarke, K. R.; Callahan, M.; Hackett, K.; Porter, J.; Lybolt, M.; Tsokos, C.; Yanev, G.

    2008-12-01

    Hard coral (Scleractinia and Milleporina) cover data were examined from 37 sites surveyed annually from 1996 to 2003 in the Florida reef tract, USA. Analyses of species numbers and total cover showed that site-to-site differences were generally very much greater than differences among times within sites. There were no significant differences among different geographical areas within the reef tract (Upper, Middle and Lower Keys). Large-scale changes documented included a reduction in species numbers and total cover on both deep and shallow offshore reefs between 1997 and 1999 followed by no recovery in cover, and only scant evidence of any recovery in species numbers by 2003. These changes coincided with bleaching events in 1997 and 1998, and the passage of Hurricane Georges through the Lower Keys in 1998. The lack of recovery among offshore reefs suggests that they were no longer resilient. Multivariate analyses revealed that some sites showed relatively little temporal variation in community composition, essentially random in direction, while others showed relatively large year-on-year changes. There was little evidence of any major region-wide changes affecting assemblage composition, or of any events that had impacted all of the sampling sites in any single year. Instead, different sites exhibited differing patterns of temporal variation, with certain sites displaying greater variation than others. Changes in community composition at some sites are interpreted in the light of knowledge of events at those sites and the relative sensitivities of species to various stressors, such as changes in cover of Acropora palmata and Millepora complanata at Sand Key following the bleaching events and hurricane in 1998, and declines in Montastraea annularis at Smith Shoal following a harmful algal bloom in 2002. For most sites, however, it is impossible to determine the causes of observed variation.

  3. Replica analysis of overfitting in regression models for time-to-event data

    Science.gov (United States)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in the literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  4. Statistical methods for the time-to-event analysis of individual participant data from multiple epidemiological studies

    DEFF Research Database (Denmark)

    Thompson, Simon; Kaptoge, Stephen; White, Ian

    2010-01-01

    Meta-analysis of individual participant time-to-event data from multiple prospective epidemiological studies enables detailed investigation of exposure-risk relationships, but involves a number of analytical challenges....

  5. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

Geometric process was first introduced by Lam [10,11]. A stochastic process {X_i, i = 1, 2, …} is called a geometric process (GP) if, for some a > 0, {a^{i-1} X_i, i = 1, 2, …} forms a renewal process. In this paper, the GP is used to analyze data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best of these four models for analyzing data from a series of events.
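As a minimal sketch of the GP definition above (not the paper's nonparametric estimator for all three parameters), one can simulate a GP with exponential renewal increments and recover the ratio a by least-squares regression of log X_i on i; the parameter values are illustrative assumptions:

```python
import math
import random

def simulate_gp(n, a, mean=1.0, seed=42):
    """Simulate a geometric process: X_i = Y_i / a**(i-1), where the
    Y_i are i.i.d. exponential with the given mean, so that
    {a**(i-1) * X_i} is a renewal process."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / mean) / a ** i for i in range(n)]

def estimate_ratio(xs):
    """Estimate a from log X_i = c - (i-1) log a + noise by
    ordinary least squares on the index."""
    n = len(xs)
    ys = [math.log(x) for x in xs]
    ts = list(range(n))
    tbar, ybar = sum(ts) / n, sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
             / sum((t - tbar) ** 2 for t in ts))
    return math.exp(-slope)

xs = simulate_gp(2000, a=1.05)
print(estimate_ratio(xs))  # expect a value near 1.05
```

With a > 1 the inter-event times shrink geometrically (a deteriorating system); a = 1 recovers an ordinary renewal process.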

  6. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

Recently, tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the internal data structure concerning extremes (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland; for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone-depleting substances led to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall, the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. The findings described above could also be proven for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle), showing that strong influence of atmospheric

  7. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    Science.gov (United States)

    Hu, H.; Ge, Y. J.

    2013-11-01

With social networking bringing ever more text information and social relationships into our daily lives, the question of whether big data can be fully used to study phenomena and disciplines in the natural sciences has prompted many specialists and scholars to innovate in their research. Although politics has been entangled with hyperlinked-world issues since the 1990s, and the automatic assembly of geospatial web services and distributed geospatial information systems using service chaining has recently been explored and built, information collection and data visualisation for geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpectedness of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology with text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationship of text information and to trace the dissemination of geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinions in political events.
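The word-frequency step of such a crawler-plus-text-analysis pipeline can be sketched as follows; the sample documents and stopword list are illustrative assumptions, not data from the study:

```python
import re
from collections import Counter

def word_frequencies(texts, stopwords=frozenset({"the", "a", "of", "in", "and", "to"})):
    """Count word frequencies across a collection of crawled texts:
    the raw input for a tag cloud or frequency map."""
    counts = Counter()
    for text in texts:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in stopwords:
                counts[word] += 1
    return counts

# Two invented snippets standing in for crawled news text:
docs = [
    "Huangyan Island dispute draws wide attention",
    "Attention turns to the Huangyan Island incident",
]
print(word_frequencies(docs).most_common(3))
```

A tag cloud is then just a rendering of `most_common` with font size proportional to count; sentiment tendency and dissemination paths would require further per-document analysis.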

  8. Relation of air mass history to nucleation events in Po Valley, Italy, using back trajectories analysis

    Directory of Open Access Journals (Sweden)

    L. Sogacheva

    2007-01-01

    In this paper, we study the transport of air masses to San Pietro Capofiume (SPC) in Po Valley, Italy, by means of back trajectory analysis. Our main aim is to investigate whether air masses originate over different regions on nucleation event days and on nonevent days, during three years when nucleation events have been continuously recorded at SPC. The results indicate that nucleation events occur frequently in air masses arriving from Central Europe, whereas the event frequency is much lower in air transported from southern directions and from the Atlantic Ocean. We also analyzed the behaviour of meteorological parameters during the 96 h transport to SPC, and found that, on average, event trajectories undergo stronger subsidence during the last 12 h before arrival at SPC than nonevent trajectories. This causes a reversal in the temperature and relative humidity (RH) differences between event and nonevent trajectories: between 96 and 12 h back time, temperature is lower and RH is higher for event than for nonevent trajectories, and between 12 and 0 h vice versa. Boundary layer mixing is stronger along the event trajectories than along the nonevent trajectories. The absolute humidity (AH) is similar for the event and nonevent trajectories between about 96 h and about 60 h back time, but after that, the AH of the event trajectories becomes lower due to stronger rain. We also studied the transport of SO2 to SPC, and conclude that although sources in Po Valley most probably dominate the measured concentrations, certain Central and Eastern European sources also make a substantial contribution.

  9. Soft error rate analysis methodology of multi-Pulse-single-event transients

    International Nuclear Information System (INIS)

    Zhou Bin; Huo Mingxue; Xiao Liyi

    2012-01-01

    As transistor feature sizes scale down, soft errors in combinational logic caused by high-energy particle radiation are gaining more and more concern. In this paper, a combinational logic soft error analysis methodology considering multi-pulse single-event transients (MPSETs) and re-convergence with multiple transient pulses is proposed. In the proposed approach, the voltage pulse produced at a standard cell output is approximated by a triangular waveform and characterized by three parameters: the pulse width, the transition time of the first edge, and the transition time of the second edge. For pulses whose amplitude is smaller than the supply voltage, an edge extension technique is proposed. Moreover, an efficient electrical masking model that comprehensively considers transition time, delay, width and amplitude is proposed, along with an approach that uses the transition times of the two edges and the pulse width to compute the pulse amplitude. Finally, our proposed firstly-independently-propagating, secondly-mutually-interacting (FIP-SMI) scheme is used to handle the more practical case of re-convergent gates with multiple transient pulses. For MPSETs, a random generation model is also proposed. Compared to estimates obtained from circuit-level simulations in HSpice, the proposed soft error rate analysis algorithm shows 10% error in SER estimation with a speed-up of 300 when single-pulse single-event transients (SPSETs) are considered. We also demonstrate that runtime and SER decrease as P0 increases, using designs from the ISCAS-85 benchmarks. (authors)
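
The triangular pulse parameterization can be illustrated geometrically: with linear edges, the peak implied by the edge transition times is clipped at the supply rail, and a pulse whose edges do not span its full width has a flat top at VDD. The fixed slew rate and this clipping rule are illustrative assumptions, not the paper's exact amplitude model.

```python
# Geometric sketch of a triangular single-event-transient (SET) pulse.
# Assumes linear edges with a fixed slew rate -- an illustration only,
# not the amplitude-computation model proposed in the paper.
def set_pulse_amplitude(t_rise, t_fall, width, slew=2.0, vdd=1.0):
    """Peak voltage [V] of a triangular SET pulse.

    If the two edge transition times span the whole pulse width, the peak
    is set by the slew rate (clipped at the rail); otherwise the pulse
    saturates at the supply voltage with a flat top.
    """
    if t_rise + t_fall >= width:
        return min(slew * min(t_rise, t_fall), vdd)
    return vdd  # flat-topped, full-amplitude pulse

print(set_pulse_amplitude(t_rise=0.2, t_fall=0.3, width=0.5))  # 0.4
```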

  10. MULTI-SPACECRAFT ANALYSIS OF ENERGETIC HEAVY ION AND INTERPLANETARY SHOCK PROPERTIES IN ENERGETIC STORM PARTICLE EVENTS NEAR 1 au

    Energy Technology Data Exchange (ETDEWEB)

    Ebert, R. W.; Dayeh, M. A.; Desai, M. I. [Southwest Research Institute, 6220 Culebra Road, San Antonio, TX 78238 (United States); Jian, L. K. [Department of Astronomy, University of Maryland, College Park, MD 20742 (United States); Li, G. [The Center for Space Plasma and Aeronomic Research (CSPAR), University of Alabama in Huntsville, Huntsville, AL 35756 (United States); Mason, G. M., E-mail: rebert@swri.edu [Johns Hopkins University/Applied Physics Laboratory, Laurel, MD 20273 (United States)

    2016-11-10

    We examine the longitude distribution of and relationship between interplanetary (IP) shock properties and ∼0.1–20 MeV nucleon{sup -1} O and Fe ions during seven multi-spacecraft energetic storm particle (ESP) events at 1 au. These ESP events were observed at two spacecraft and were primarily associated with low Mach number, quasi-perpendicular shocks. Key observations include the following: (i) the Alfvén Mach number increased from east to west of the coronal mass ejection source longitude, while the shock speed, compression ratios, and obliquity showed no clear dependence; (ii) the O and Fe time intensity profiles and peak intensities varied significantly between longitudinally separated spacecraft observing the same event, with peak intensities larger near the nose and smaller along the flank of the IP shock; (iii) the O and Fe peak intensities had weak to no correlations with the shock parameters; (iv) the Fe/O time profiles showed intra-event variations upstream of the shock that disappeared downstream of the shock, where values plateaued to those comparable to the mean Fe/O of solar cycle 23; (v) the O and Fe spectral indices ranged from ∼1.0 to 3.4, with the Fe spectra being softer in most events; and (vi) the observed spectral index was softer than the value predicted from the shock compression ratio in most events. We conclude that while the variations in IP shock properties may account for some variations in O and Fe properties within these multi-spacecraft events, detailed examination of the upstream seed population and IP turbulence, along with modeling, is required to fully characterize these observations.

  11. SHAREHOLDERS VALUE AND CATASTROPHE BONDS. AN EVENT STUDY ANALYSIS AT EUROPEAN LEVEL

    OpenAIRE

    Constantin, Laura-Gabriela; Cernat-Gruici, Bogdan; Lupu, Radu; Nadotti Loris, Lino Maria

    2015-01-01

    Considering that E.U.-based (re)insurance companies are increasingly active within the alternative risk transfer market segment, the aim of the present paper is to emphasize the impact of issuing cat bonds on shareholders’ value, highlighting the competitive advantages of the analysed (re)insurance companies as they pursue the consolidation of their resilience in a turbulent economic environment. Eminently an applicative research, the analysis employs an event study methodology w...

  12. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    Science.gov (United States)

    2011-01-01

    organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  13. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    Directory of Open Access Journals (Sweden)

    Lum Karl

    2011-03-01

    countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  14. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    Science.gov (United States)

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  15. Analysis of Secret Key Randomness Exploiting the Radio Channel Variability

    Directory of Open Access Journals (Sweden)

    Taghrid Mazloum

    2015-01-01

    A few years ago, physical layer based techniques started to be considered as a way to improve security in wireless communications. A well known problem is the management of ciphering keys, regarding both the generation and the distribution of these keys. A way to alleviate such difficulties is to use a common source of randomness for the legitimate terminals, not accessible to an eavesdropper. This is the case of the fading propagation channel, when exact or approximate reciprocity applies. Although this principle has long been known, not many works have evaluated the effect of radio channel properties in practical environments on the degree of randomness of the generated keys. To this end, we investigate indoor radio channel measurements in different environments and settings in either the 2.4625 GHz or the 5.4 GHz band, of particular interest for WiFi related standards. Key bits are extracted by quantizing the complex channel coefficients, and their randomness is evaluated using the NIST test suite. We then look at the impact of the carrier frequency and of the channel variability in the space, time, and frequency degrees of freedom used to construct a long secret key, in relation to the nature of the radio environment, such as its LOS/NLOS character.
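
The extraction step described here, quantizing complex channel coefficients into key bits, can be sketched with synthetic Rayleigh-fading coefficients standing in for measurements, followed by a simple frequency (monobit) check in the spirit of the NIST suite. The one-bit median-threshold quantizer is an assumed, common choice, not necessarily the one used in the paper.

```python
# Key-bit extraction by quantizing complex channel coefficients, plus a
# monobit-style sanity check. Synthetic fading replaces real measurements.
import numpy as np

rng = np.random.default_rng(7)
# Synthetic Rayleigh-fading channel coefficients (unit average power)
h = (rng.normal(size=512) + 1j * rng.normal(size=512)) / np.sqrt(2)

# One-bit quantization of real and imaginary parts against their medians
bits = np.concatenate([
    (h.real > np.median(h.real)).astype(int),
    (h.imag > np.median(h.imag)).astype(int),
])

# Frequency (monobit) statistic: the fraction of ones should be near 0.5
ones_ratio = bits.mean()
print(f"{len(bits)} key bits, ones ratio = {ones_ratio:.3f}")
```

With reciprocity, both terminals quantize their own channel estimates and obtain (nearly) the same bit string, which the full NIST suite then tests for randomness.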

  16. Development of time dependent safety analysis code for plasma anomaly events in fusion reactors

    International Nuclear Information System (INIS)

    Honda, Takuro; Okazaki, Takashi; Bartels, H.W.; Uckan, N.A.; Seki, Yasushi.

    1997-01-01

    A safety analysis code, SAFALY, has been developed to analyze plasma anomaly events in fusion reactors, e.g., a loss of plasma control. The code is a hybrid comprising a zero-dimensional plasma dynamics model and a one-dimensional thermal analysis of in-vessel components. The code evaluates the time evolution of plasma parameters and the temperature distributions of in-vessel components. As the plasma-safety interface model, we proposed a robust plasma physics model taking into account updated data for safety assessment. For example, physics safety guidelines for the beta limit, density limit and H-L mode confinement transition threshold power, etc., are provided in the model. The model of the in-vessel components is divided into twenty temperature regions in the poloidal direction, taking account of radiative heat transfer between the surfaces of the regions. The code can also describe coolant behaviour during hydraulic accidents using results from a hydraulics code, and treats vaporization (sublimation) from plasma facing components (PFCs). Furthermore, the code includes a model of impurity transport from PFCs using a transport probability and a time delay. Quantitative analysis based on the model is possible for a plasma passive shutdown scenario. We examined the suitability of the code as a safety analysis tool for plasma anomaly events in fusion reactors, with the prospect that it will contribute to the safety analysis of the International Thermonuclear Experimental Reactor (ITER). (author)

  17. Analysis on Outcome of 3537 Patients with Coronary Artery Disease: Integrative Medicine for Cardiovascular Events

    Directory of Open Access Journals (Sweden)

    Zhu-ye Gao

    2013-01-01

    Aims. To investigate the treatment of hospitalized patients with coronary artery disease (CAD) and the prognostic factors in Beijing, China. Materials and Methods. A multicenter prospective study was conducted through an integrative clinical and research platform at 12 hospitals in Beijing, China. The clinical information of 3537 hospitalized patients with CAD was collected from September 2009 to May 2011, and the efficacy of secondary prevention during a one-year follow-up was evaluated. In addition, a logistic regression analysis was performed to identify factors with an independent impact on prognosis. Results. The average age of the patients was 64.88 ± 11.97 years, and 65.42% were male. The medications for the patients were as follows: antiplatelet drugs, 91.97%; statins, 83.66%; β-receptor blockers, 72.55%; ACEI/ARB, 58.92%; and revascularization (including PCI and CABG), 40.29%. The overall incidence of cardiovascular events was 13.26% (469/3537). The logistic stepwise regression analysis showed that heart failure (OR, 3.707, 95% CI = 2.756–4.986), age ≥ 65 years (OR, 2.007, 95% CI = 1.587–2.53), and myocardial infarction (OR, 1.649, 95% CI = 1.322–2.057) were independent risk factors for cardiovascular events occurring during the one-year follow-up. Integrative medicine (IM) therapy showed a beneficial tendency toward decreasing the incidence of cardiovascular events, although no statistical significance was found (OR, 0.797, 95% CI = 0.613–1.036). Conclusions. Heart failure, age ≥ 65 years, and myocardial infarction were associated with an increased incidence of cardiovascular events, and treatment with IM showed a tendency toward decreasing the incidence of cardiovascular events.
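
The odds ratios and 95% confidence intervals reported in such analyses relate to the fitted logistic-regression coefficients by OR = exp(β) and CI = exp(β ± 1.96·SE). The β and SE below are illustrative inputs chosen to roughly reproduce the heart-failure figures quoted above; they are not taken from the study.

```python
# How a logistic-regression coefficient maps to an odds ratio with a 95% CI.
# beta and se are illustrative, picked to land near OR 3.707 (2.756-4.986).
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, (lower, upper)) for a coefficient and its standard error."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

or_, (lo, hi) = odds_ratio_ci(beta=1.310, se=0.151)
print(f"OR = {or_:.3f}, 95% CI = {lo:.3f}-{hi:.3f}")
```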

  18. A trend analysis of human error events for proactive prevention of accidents. Methodology development and effective utilization

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Ebisu, Mitsuhiro; Aikawa, Takeshi; Matsubara, Katsuyuki

    2006-01-01

    This paper describes methods for analyzing the human error events that have been accumulated at an individual plant and for utilizing the results to prevent accidents proactively. Firstly, a categorization framework for the trigger actions and causal factors of human error events was reexamined, and the procedure for analyzing human error events was reviewed based on this framework. Secondly, a method for identifying the common characteristics of the trigger action data and causal factor data accumulated through the analysis of human error events was clarified. In addition, to utilize the results of the trend analysis effectively, methods to develop teaching material for safety education, to develop checkpoints for error prevention, and to introduce an error management process for strategic error prevention were proposed. (author)

  19. Exploitation of a component event data bank for common cause failure analysis

    International Nuclear Information System (INIS)

    Games, A.M.; Amendola, A.; Martin, P.

    1985-01-01

    Investigations into using the European Reliability Data System Component Event Data Bank for common cause failure analysis have been carried out. Starting from early exercises where data were analyzed without computer aid, different types of linked multiple failures have been identified. A classification system is proposed based on this experience. It defines a multiple failure event space wherein each category defines causal, modal, temporal and structural links between failures. It is shown that a search algorithm which incorporates the specific interrogative procedures of the data bank can be developed in conjunction with this classification system. It is concluded that the classification scheme and the search algorithm are useful organizational tools in the field of common cause failure studies. However, it is also suggested that the term common cause failure should be avoided, since it embodies too many different types of linked multiple failures.

  20. A systemic approach for managing extreme risk events-dynamic financial analysis

    Directory of Open Access Journals (Sweden)

    Ph.D.Student Rodica Ianole

    2011-12-01

    Following the Black Swan logic, it often happens that what we do not know becomes more relevant than what we believe to know. The management of extreme risks falls under this paradigm in the sense that it cannot be limited to a static approach based only on objective and easily quantifiable variables. Making use of the operational tools developed primarily for the insurance industry, the present paper investigates how dynamic financial analysis (DFA) can be used within the framework of extreme risk events.

  1. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  2. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  3. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat events on this dataset. A training dataset for each of the eleven beat categories is developed, and a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels of the key frames in a particular shot are then used to assign a unique label to the shot. A simple sliding window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented in terms of precision, recall, and F-measure, and are compared with an existing technique, recording significant improvements.
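
The post-classification steps described here, a majority vote over key-frame labels per shot and then grouping adjacent same-label shots into beat events, can be sketched directly. The label names are invented for illustration, and the CNN itself is out of scope.

```python
# Majority vote per shot, then grouping adjacent same-label shots into beats.
from collections import Counter
from itertools import groupby

def shot_label(keyframe_labels):
    # Each shot gets the most common label among its key-frame labels
    return Counter(keyframe_labels).most_common(1)[0][0]

shots = [shot_label(kf) for kf in [
    ["fight", "fight", "chase"],
    ["fight", "fight", "fight"],
    ["chase", "chase", "chase"],
]]

# Runs of adjacent shots sharing a label form one beat event each
beats = [(label, len(list(group))) for label, group in groupby(shots)]
print(beats)  # [('fight', 2), ('chase', 1)]
```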

  4. Best Practices in Public Outreach Events

    Science.gov (United States)

    Cobb, Whitney; Buxner, Sanlyn; Shipp, Stephanie

    2015-11-01

    IntroductionEach year the National Aeronautics and Space Administration (NASA) sponsors public outreach events designed to increase student, educator, and general public engagement in its missions and goals. NASA SMD Education’s review of large-scale events, “Best Practices in Outreach Events,” highlighted planning and implementation best practices, which were used by the Dawn mission to strategize and implement its Ceres arrival celebration event, i C Ceres.BackgroundThe literature review focused on identifying best practices arising from evaluations of large-scale public outreach events. The following criteria guided the study:* Public, science-related events open to adults and children* Events that occurred during the last 5 years* Evaluations that included information on data collected from visitors and/or volunteers* Evaluations that specified the type of data collected, methodology, and associated resultsBest Practices: Planning and ImplementationThe literature review revealed key considerations for planning and implementing large-scale events. The best practices identified are pertinent for all event organizers and evaluators regardless of event size. A summary of related best practices is presented below.1) Advertise the event2) Use and advertise access to scientists* Attendees who reported an interaction with a science professional were 15% to 19% more likely to report positive learning impacts (SFA, 2012, p. 24).3) Recruit scientists using findings such as:* High percentages of scientists (85% to 96%) from most events were interested in participating again (SFA, 2012).4) Ensure that the event is group and, particularly, child friendly5) Target specific event outcomesBest Practices Informing Real-world Planning, Implementation and EvaluationDawn mission’s collaborative design of a series of events, i C Ceres, including in-person, interactive events geared to families and live presentations, will be shared, with focus on the family event, and the evidence

  5. Living with extreme weather events - perspectives from climatology, geomorphological analysis, chronicles and opinion polls

    Science.gov (United States)

    Auer, I.; Kirchengast, A.; Proske, H.

    2009-09-01

    The ongoing climate change debate focuses more and more on changing extreme events. Information on past events can be derived from a number of sources, such as instrumental data and residual impacts in the landscape, but also chronicles and people's memories. A project called "A Tale of Two Valleys", within the framework of the research program "proVision", allowed us to study past extreme events in two inner-alpine valleys from the sources mentioned before. Instrumental climate time series provided information for the past 200 years; however, great attention had to be given to the homogeneity of the series. To derive homogenized time series of selected climate change indices, methods like HOCLIS and Vincent have been applied. Trend analyses of climate change indices inform about the increase or decrease of extreme events. Traces of major geomorphodynamic processes of the past (e.g. rockfalls, landslides, debris flows) which were triggered or affected by extreme weather events are still apparent in the landscape and could be evaluated by geomorphological analysis using remote sensing and field data. Regional chronicles provided additional knowledge and covered longer periods back in time; however, compared to meteorological time series they carry a high degree of subjectivity, and intermittent recording cannot be ruled out. Finally, questionnaires and oral history complemented our picture of past extreme weather events. People were affected differently and have different memories of the events. The joint analysis of these four data sources showed agreement to some extent, but also some reasonable differences: meteorological data are point measurements only, sometimes with too coarse a temporal resolution. Due to land-use changes and improved constructional measures, the impact of an extreme meteorological event may be different today compared to earlier times.

  6. Event filter monitoring with the ATLAS tile calorimeter

    CERN Document Server

    Fiorini, L

    2008-01-01

    The ATLAS Tile Calorimeter detector is presently involved in an intense phase of subsystem integration and commissioning with muons of cosmic origin. Various monitoring programs have been developed at different levels of the data flow to tune the set-up of the detector running conditions and to provide a fast and reliable assessment of data quality already during data taking. This paper focuses on the monitoring system integrated in the highest level of the ATLAS trigger system, the Event Filter, and on its deployment during the Tile Calorimeter commissioning with cosmic ray muons. The key feature of Event Filter monitoring is the capability of performing detector and data quality control on complete physics events at the trigger level, hence before events are stored on disk. In the ATLAS online data flow, this is the only monitoring system capable of giving comprehensive event-quality feedback.

  7. Preliminary analysis on faint luminous lightning events recorded by multiple high speed cameras

    Science.gov (United States)

    Alves, J.; Saraiva, A. V.; Pinto, O.; Campos, L. Z.; Antunes, L.; Luz, E. S.; Medeiros, C.; Buzato, T. S.

    2013-12-01

    The objective of this work is the study of some faint luminous events produced by lightning flashes that were recorded simultaneously by multiple high-speed cameras during the previous RAMMER (Automated Multi-camera Network for Monitoring and Study of Lightning) campaigns. The RAMMER network is composed of three fixed cameras and one mobile color camera separated by, on average, distances of 13 kilometers. They were located in the Paraiba Valley (in the cities of São José dos Campos and Caçapava), SP, Brazil, arranged in a quadrilateral shape centered on the São José dos Campos region. This configuration allowed RAMMER to see a thunderstorm from different angles, registering the same lightning flashes simultaneously with multiple cameras. Each RAMMER sensor is composed of a triggering system and a Phantom v9.1 high-speed camera, set to operate at a frame rate of 2,500 frames per second with a Nikkor lens (model AF-S DX 18-55 mm 1:3.5-5.6 G on the stationary sensors, and model AF-S ED 24 mm 1:1.4 on the mobile sensor). All videos were GPS (Global Positioning System) time stamped. For this work we used a data set collected on four days of manual RAMMER operation during the 2012 and 2013 campaigns. On Feb. 18th the data set comprises 15 flashes recorded by two cameras and 4 flashes recorded by three cameras. On Feb. 19th a total of 5 flashes was registered by two cameras and 1 flash by three cameras. On Feb. 22nd we obtained 4 flashes registered by two cameras. Finally, on March 6th two cameras recorded 2 flashes. The analysis in this study proposes an evaluation methodology for faint luminous lightning events, such as continuing currents. Problems in the temporal measurement of the continuing current can generate imprecision in the optical analysis; therefore this work aims to evaluate the effects of distance on this parameter with this preliminary data set. In the cases that include the color camera we analyzed the RGB

  8. Partition of some key regulating services in terrestrial ecosystems: Meta-analysis and review

    Energy Technology Data Exchange (ETDEWEB)

    Viglizzo, E.F., E-mail: evigliz@cpenet.com.ar [INTA, EEA Anguil, Grupo de Investigaciones en Gestión Ambiental (GIGA), Av. Spinetto 785, 6300 Santa Rosa, La Pampa (Argentina); INCITAP-CONICET, Ruta 35, km 335, 6300 Santa Rosa, La Pampa (Argentina); UNLPam, Facultad de Ciencias Exactas y Naturales, Av. Uruguay 151, 6300 Santa Rosa, La Pampa (Argentina); Jobbágy, E.G. [CONICET, Andes 950, 5700 San Luis, San Luis (Argentina); Grupo de Estudios Ambientales IMASL, Ejército de los, Andes 950, 5700 San Luis, San Luis (Argentina); Ricard, M.F. [INCITAP-CONICET, Ruta 35, km 335, 6300 Santa Rosa, La Pampa (Argentina); UNLPam, Facultad de Ciencias Exactas y Naturales, Av. Uruguay 151, 6300 Santa Rosa, La Pampa (Argentina); Paruelo, J.M. [Laboratorio de Análisis Regional y Teledetección, Departamento de Métodos Cuantitativos Sistemas de información, Facultad de Agronomía and IFEVA, Universidad de Buenos Aires and CONICET, Av. San Martín 4453, 1417 Buenos Aires (Argentina)

    2016-08-15

    Our knowledge about the functional foundations of ecosystem service (ES) provision is still limited, and more research is needed to elucidate key functional mechanisms. Using a simplified eco-hydrological scheme, in this work we analyzed how land-use decisions modify the partition of some essential regulatory ES by altering basic relationships between biomass stocks and water flows. A comprehensive meta-analysis and review was conducted based on global, regional and local data from peer-reviewed publications. We analyzed five datasets comprising 1348 studies and 3948 records on precipitation (PPT), aboveground biomass (AGB), AGB change, evapotranspiration (ET), water yield (WY), WY change, runoff (R) and infiltration (I). The conceptual framework was focused on ES that are associated with the ecological functions (e.g., intermediate ES) of ET, WY, R and I. ES included soil protection, carbon sequestration, local climate regulation, water-flow regulation and water recharge. To address the problem of data normality, the analysis included both parametric and non-parametric regression analysis. Results demonstrate that PPT is a first-order biophysical factor that controls ES release at the broader scales. At decreasing scales, ES are partitioned as a result of PPT interactions with other biophysical and anthropogenic factors. At intermediate scales, land-use change interacts with PPT, modifying ES partition, as is the case of afforestation in dry regions, where ET and climate regulation may be enhanced at the expense of R and water-flow regulation. At smaller scales, site-specific conditions such as topography interact with PPT and AGB, displaying different ES partition formats. The probable implications of future land-use and climate change for some key ES production and partition are discussed. - Highlights: • The partition of regulatory services in ecosystems poses a major policy challenge. • We examined how partitions occur at the hydrosphere
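
The parametric/non-parametric pairing mentioned in the abstract can be sketched as a linear (Pearson) correlation alongside a rank-based (Spearman) correlation, the latter being robust when the data are not normally distributed. The synthetic PPT-vs-ET values and their relationship below are assumptions standing in for the review's datasets.

```python
# Parametric vs. non-parametric association on synthetic PPT/ET data.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
ppt = rng.uniform(200, 1500, size=200)          # precipitation [mm/yr]
et = 0.6 * ppt + rng.normal(0, 60, size=200)    # evapotranspiration [mm/yr]

r_p, p_p = pearsonr(ppt, et)   # parametric: assumes linearity/normality
r_s, p_s = spearmanr(ppt, et)  # non-parametric: rank-based, robust
print(f"Pearson r = {r_p:.3f}, Spearman rho = {r_s:.3f}")
```

Reporting both, as the review does, guards conclusions against departures from normality in the pooled records.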

  9. Partition of some key regulating services in terrestrial ecosystems: Meta-analysis and review

    International Nuclear Information System (INIS)

    Viglizzo, E.F.; Jobbágy, E.G.; Ricard, M.F.; Paruelo, J.M.

    2016-01-01

    Our knowledge about the functional foundations of ecosystem service (ES) provision is still limited, and more research is needed to elucidate key functional mechanisms. Using a simplified eco-hydrological scheme, in this work we analyzed how land-use decisions modify the partition of some essential regulatory ES by altering basic relationships between biomass stocks and water flows. A comprehensive meta-analysis and review was conducted based on global, regional and local data from peer-reviewed publications. We analyzed five datasets comprising 1348 studies and 3948 records on precipitation (PPT), aboveground biomass (AGB), AGB change, evapotranspiration (ET), water yield (WY), WY change, runoff (R) and infiltration (I). The conceptual framework was focused on ES that are associated with the ecological functions (e.g., intermediate ES) of ET, WY, R and I. ES included soil protection, carbon sequestration, local climate regulation, water-flow regulation and water recharge. To address the problem of data normality, the analysis included both parametric and non-parametric regression analysis. Results demonstrate that PPT is a first-order biophysical factor that controls ES release at the broader scales. At decreasing scales, ES are partitioned as a result of PPT interactions with other biophysical and anthropogenic factors. At intermediate scales, land-use change interacts with PPT, modifying ES partition, as is the case of afforestation in dry regions, where ET and climate regulation may be enhanced at the expense of R and water-flow regulation. At smaller scales, site-specific conditions such as topography interact with PPT and AGB, displaying different ES partition patterns. The probable implications of future land-use and climate change on some key ES production and partition are discussed. - Highlights: • The partition of regulatory services in ecosystems poses a major policy challenge. • We examined how partitions occur at the hydrosphere
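The abstract's pairing of parametric and non-parametric regression to cope with non-normal data can be illustrated with a toy comparison of an ordinary least squares slope against the rank-robust Theil-Sen slope. The data and variable roles below are made up for illustration and are not drawn from the study's datasets.

```python
# Parametric (OLS) vs non-parametric (Theil-Sen) slope estimation.
# With one strong outlier, OLS is pulled away while the median-of-
# pairwise-slopes estimator stays near the bulk of the data.
from statistics import median

def ols_slope(x, y):
    """Ordinary least squares slope (parametric)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def theil_sen_slope(x, y):
    """Median of all pairwise slopes (non-parametric, outlier-robust)."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return median(slopes)

# Hypothetical PPT-like predictor vs a response with one outlier.
x = [100, 200, 300, 400, 500]
y = [80, 160, 240, 320, 2000]
print(ols_slope(x, y))        # -> 4.0 (distorted by the outlier)
print(theil_sen_slope(x, y))  # -> 0.8 (robust estimate)
```

Running both estimators side by side, as the study's dual analysis does, makes it easy to spot where non-normality is driving the parametric fit.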

  10. A Global Geospatial Database of 5000+ Historic Flood Event Extents

    Science.gov (United States)

    Tellman, B.; Sullivan, J.; Doyle, C.; Kettner, A.; Brakenridge, G. R.; Erickson, T.; Slayback, D. A.

    2017-12-01

    A key dataset that is missing for global flood model validation and understanding historic spatial flood vulnerability is a global historical geo-database of flood event extents. Decades of earth observing satellites and cloud computing now make it possible not only to detect floods in near real time, but to run these water detection algorithms back in time to capture the spatial extent of large numbers of specific events. This talk will show results from the largest global historical flood database developed to date. We use the Dartmouth Flood Observatory flood catalogue to map over 5000 floods (from 1985-2017) using MODIS, Landsat, and Sentinel-1 satellites. All events are available for public download via the Earth Engine Catalogue and via a website that allows the user to query floods by area or date, assess population exposure trends over time, and download flood extents in geospatial format. In this talk, we will highlight major trends in global flood exposure per continent, land use type, and eco-region. We will also make suggestions on how to use this dataset in conjunction with other global datasets to i) validate global flood models, ii) assess the potential role of climatic change in flood exposure, iii) understand how urbanization and other land change processes may influence spatial flood exposure, iv) assess how innovative flood interventions (e.g. wetland restoration) influence flood patterns, v) control for event magnitude to assess the role of social vulnerability and damage assessment, and vi) aid in rapid probabilistic risk assessment to enable microinsurance markets. Authors on this paper are already using the database for the latter three applications and will show examples of wetland intervention analysis in Argentina, social vulnerability analysis in the USA, and microinsurance in India.

  11. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    International Nuclear Information System (INIS)

    Forester, John A.; Bley, Dennis C.; Cooper, Susan E.; Kolaczkowski, Alan M.; Thompson, Catherine; Ramey-Smith, Ann; Wreathall, John

    2000-01-01

    This paper describes the most recent version of a human reliability analysis (HRA) method called ''A Technique for Human Event Analysis'' (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided, and critical and unique aspects of the revised method are discussed.

  12. Gene expression profiling of resting and activated vascular smooth muscle cells by serial analysis of gene expression and clustering analysis

    NARCIS (Netherlands)

    Beauchamp, Nicholas J.; van Achterberg, Tanja A. E.; Engelse, Marten A.; Pannekoek, Hans; de Vries, Carlie J. M.

    2003-01-01

    Migration and proliferation of vascular smooth muscle cells (SMCs) are key events in atherosclerosis. However, little is known about alterations in gene expression upon transition of the quiescent, contractile SMC to the proliferative SMC. We performed serial analysis of gene expression (SAGE) of

  13. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    Science.gov (United States)

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large scale collection of driver, vehicle, and environment data in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for deceleration below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data are growing exponentially over time, this reviewing procedure may not be viable anymore in the very near future. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reactions may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state of the art subjective review procedures to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential
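The kinematic-trigger filtering described above (flagging candidate events wherever deceleration crosses a harsh-braking threshold) can be sketched as follows. The -0.5 g threshold and one-second merge window are illustrative assumptions, not euroFOT settings.

```python
# Illustrative kinematic trigger: flag candidate safety-critical events
# wherever longitudinal acceleration drops below a harsh-braking threshold.
# Nearby flagged samples are merged into a single candidate event.

G = 9.81  # m/s^2 per g

def kinematic_trigger(accel_samples, sample_rate_hz, threshold_g=-0.5, merge_gap_s=1.0):
    """Return (start, end) sample-index pairs for candidate events."""
    threshold = threshold_g * G
    flagged = [i for i, a in enumerate(accel_samples) if a <= threshold]
    events = []
    for i in flagged:
        if events and i - events[-1][1] <= merge_gap_s * sample_rate_hz:
            events[-1][1] = i  # close enough: extend the previous event
        else:
            events.append([i, i])
    return [(s, e) for s, e in events]

# Example: a 10 Hz acceleration trace (m/s^2) with one harsh-braking episode.
trace = [0.0, -1.0, -2.0, -6.0, -7.5, -5.5, -1.0, 0.0]
print(kinematic_trigger(trace, sample_rate_hz=10))  # -> [(3, 5)]
```

As the abstract notes, such a filter has low specificity: every returned interval still needs review (manual today, potentially automated video classification) to decide whether it is actually safety critical.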

  14. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    Science.gov (United States)

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window-based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
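The threshold event detection and event-triggered averaging mentioned in the abstract can be sketched as below. The function names, the simple refractory window, and all parameters are illustrative assumptions, not g-PRIME's actual interface.

```python
# Sketch of threshold-based event detection plus event-triggered averaging:
# detect upward threshold crossings on one channel, then average snippets
# of a second, simultaneously acquired channel around each event time.

def detect_events(signal, threshold, window):
    """Indices where the signal crosses `threshold` upward, at least
    `window` samples apart (a simple refractory window)."""
    events, last = [], -window
    for i in range(1, len(signal)):
        if signal[i - 1] < threshold <= signal[i] and i - last >= window:
            events.append(i)
            last = i
    return events

def event_triggered_average(other, events, pre, post):
    """Average snippets of a second channel around each event index."""
    snips = [other[t - pre:t + post] for t in events
             if t - pre >= 0 and t + post <= len(other)]
    n = len(snips)
    return [sum(col) / n for col in zip(*snips)] if n else []

spikes = [0, 0, 1, 0, 0, 1, 0]           # trigger channel
muscle = [0, 1, 2, 3, 4, 5, 6]           # simultaneously acquired channel
ev = detect_events(spikes, threshold=1, window=2)
print(ev)                                          # -> [2, 5]
print(event_triggered_average(muscle, ev, 1, 1))   # -> [2.5, 3.5]
```

The same pattern generalizes to the offline case: run detection over a logged file, then accumulate event-aligned snippets for the average.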

  15. Key-Phenomenon and Religious Meaning

    Directory of Open Access Journals (Sweden)

    Lomuscio Vincenzo

    2017-09-01

    In this paper I develop a phenomenology of religious experience through the notion of key-phenomenon. My analysis moves from a general phenomenology of situation, in which we have to relate different phenomena according to a sense. What does "according to a sense" mean? My suggestion is that we should look for a relationship among these data when we find a key-phenomenon (among a series of phenomena) that would enlighten all the others. This key-phenomenon would show a non-phenomenal meaning which would make all the others understandable. Each other datum, therefore, becomes the witness of invisible meaning through a key-witness. The key-phenomenon we choose determines the role (i.e., the truth) of each datum within its situation. This phenomenological relationship belongs both to the sense of everyday situations and to that of possible religious situations. If the religious interpretation of a situation depends on our choice of key-phenomenon, or key-witness, we have to define what kind of key-phenomenon constitutes a religious intuition.

  16. An expert system for prevention of abnormal event recurrence

    International Nuclear Information System (INIS)

    Nishiyama, Takuya

    1990-01-01

    A huge amount of information related to abnormal events occurring in nuclear power plants in Japan and abroad is collected and accumulated at the Nuclear Information Center at CRIEPI. This information contains a variety of knowledge which may be useful for preventing similar trouble. An expert system named 'Consultation System for Prevention of Abnormal-Event Recurrence' (CSPAR) is being developed with the objective of preventing recurrence of similar abnormal events by offering an effective means of utilizing such knowledge. This paper presents the key points in designing and constructing the system, an outline of the system's functions, and some demonstration examples. (author)

  17. A sequential threshold cure model for genetic analysis of time-to-event data

    DEFF Research Database (Denmark)

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. Existence of nonsusceptible individuals is particularly relevant under challenge testing with specific...... pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study...... survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection...
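A minimal sketch of the two-trait generating model described above, assuming a binary susceptibility trait and a constant per-period hazard (the "endurance" risk of dying per time unit). All parameter values and function names are illustrative, not those of the cited simulation study.

```python
# Mixture cure model sketch: an individual is either nonsusceptible
# (cured, never dies of the challenge) or susceptible with a constant
# per-period death hazard. Survival data are sequential binary records:
# 1 = survived the period, 0 = died (record ends at death).
import random

def simulate_records(n, p_susceptible, hazard, n_periods, seed=1):
    rng = random.Random(seed)
    records = []
    for _ in range(n):
        susceptible = rng.random() < p_susceptible
        row = []
        for _ in range(n_periods):
            if susceptible and rng.random() < hazard:
                row.append(0)  # died this period
                break
            row.append(1)      # survived this period
        records.append(row)
    return records

def survival(t, p_susceptible, hazard):
    """Population survival: S(t) = (1 - p) + p * (1 - h)^t.
    Plateaus at the cured fraction 1 - p instead of going to zero."""
    return (1 - p_susceptible) + p_susceptible * (1 - hazard) ** t
```

The plateau in `survival` is what distinguishes a cure model from a classical survival model: ignoring it forces the fitted hazard toward zero and, as the abstract notes, invalidates inference when cured individuals are present.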

  18. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  19. Evaluation of cool season precipitation event characteristics over the Northeast US in a suite of downscaled climate model hindcasts

    Science.gov (United States)

    Loikith, Paul C.; Waliser, Duane E.; Kim, Jinwon; Ferraro, Robert

    2017-08-01

    Cool season precipitation event characteristics are evaluated across a suite of downscaled climate models over the northeastern US. Downscaled hindcast simulations are produced by dynamically downscaling the Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA2) using the National Aeronautics and Space Administration (NASA)-Unified Weather Research and Forecasting (WRF) regional climate model (RCM) and the Goddard Earth Observing System Model, Version 5 (GEOS-5) global climate model. NU-WRF RCM simulations are produced at 24, 12, and 4-km horizontal resolutions using a range of spectral nudging schemes, while the MERRA2 global downscaled run is provided at 12.5 km. All model runs are evaluated using four metrics designed to capture key features of precipitation events: event frequency, event intensity, event total, and event duration. Overall, the downscaling approaches result in a reasonable representation of many of the key features of precipitation events over the region; however, considerable biases exist in the magnitude of each metric. Based on this evaluation there is no clear indication that higher resolution simulations produce more realistic results in general; however, many small-scale features such as orographic enhancement of precipitation are only captured at higher resolutions, suggesting some added value over coarser resolutions. While the differences between simulations produced using nudging and no nudging are small, there is some improvement in model fidelity when nudging is introduced, especially at a cutoff wavelength of 600 km compared to 2000 km. Based on the results of this evaluation, dynamical regional downscaling using NU-WRF results in a more realistic representation of precipitation event climatology than the global downscaling of MERRA2 using GEOS-5.
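The four event metrics above can be computed from a daily precipitation series by treating an event as a run of consecutive wet days. The 1 mm wet-day threshold below is an assumed definition for illustration; the paper's exact event definition may differ.

```python
# Compute event frequency, total, mean duration, and mean intensity
# from a daily precipitation series (mm/day). An "event" is a maximal
# run of consecutive days at or above the wet-day threshold.
def event_metrics(precip_mm, wet_threshold=1.0):
    events = []  # list of (duration_days, total_mm)
    run_len, run_sum = 0, 0.0
    for p in list(precip_mm) + [0.0]:  # sentinel closes a trailing run
        if p >= wet_threshold:
            run_len += 1
            run_sum += p
        elif run_len:
            events.append((run_len, run_sum))
            run_len, run_sum = 0, 0.0
    n = len(events)
    wet_days = sum(d for d, _ in events)
    total = sum(t for _, t in events)
    return {
        "frequency": n,                               # number of events
        "total": total,                               # mm per event period
        "duration": wet_days / n if n else 0.0,       # mean days per event
        "intensity": total / wet_days if n else 0.0,  # mean mm per wet day
    }

# One 2-day event (5 + 12 mm) and one 1-day event (3 mm).
print(event_metrics([0, 5, 12, 0, 0, 3, 0]))
```

Applying the same function to each model's daily output and to the observations gives directly comparable bias estimates for all four metrics.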

  20. KeyPathwayMinerWeb

    DEFF Research Database (Denmark)

    List, Markus; Alcaraz, Nicolas; Dissing-Hansen, Martin

    2016-01-01

    We present KeyPathwayMinerWeb, the first online platform for de novo pathway enrichment analysis directly in the browser. Given a biological interaction network (e.g. protein-protein interactions) and a series of molecular profiles derived from one or multiple OMICS studies (gene expression, for instance), KeyPathwayMiner extracts connected sub-networks containing a high number of active or differentially regulated genes (proteins, metabolites) in the molecular profiles. The web interface at http://keypathwayminer.compbio.sdu.dk implements all core functionalities of the KeyPathwayMiner tool set, such as data integration, input of background knowledge, batch runs for parameter optimization and visualization of extracted pathways. In addition to an intuitive web interface, we also implemented a RESTful API that now enables other online developers to integrate network enrichment as a web service...