WorldWideScience

Sample records for key events analysis

  1. A Key Event Path Analysis Approach for Integrated Systems

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. The definitions of cycle event paths, sequence event paths, and key event paths are then given. Based on the statistics gathered from simulation of the CPN models, key event paths are identified through sensitivity analysis. This approach focuses on the logical structure of CPN models, is reliable, and can serve as the basis of structured analysis for discrete event systems. An example of a radar model illustrates the application of this approach, and the results are trustworthy.

  2. A Market Analysis of Publications, Trade Conferences, and Key Events for Fleet Readiness Center Southwest

    2007-12-01

    Win and Keep Big Customers. Austin: Bard Press, 2005. Kotler, Philip, and Kevin Lane Keller. Marketing Management. Upper Saddle River, NJ... NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. MBA Professional Report: A Market Analysis of Publications, Trade Conferences... Report date: December 2007; report type and dates covered: MBA Professional Report. Title and subtitle: A Market

  3. The key events of 2012

    2013-01-01

    The article reviews the main events, changes, and issues that occurred in France during 2012 in the different sectors of activity of the ASN (control, public information, management of accident situations, and international cooperation), as well as those that had an impact on the activities of the ASN (for instance, changes in national or European regulations).

  4. Preparedness of newly qualified midwives to deliver clinical care: an evaluation of pre-registration midwifery education through an analysis of key events.

    Skirton, Heather; Stephen, Nicole; Doris, Faye; Cooper, Maggie; Avis, Mark; Fraser, Diane M

    2012-10-01

    This study was part of a larger project commissioned to ascertain whether midwife teachers bring a unique contribution to the preparation of midwives for practice. The aim of this phase was to determine whether the student midwives' educational programme had equipped them to practise competently after entry to the professional register. This was a prospective, longitudinal qualitative study, using participant diaries to collect data. Data were collected from newly qualified midwives during the initial six months after they commenced their first post as a qualified midwife. The potential participants were all student midwives who were completing their education at one of six universities (three in England, one in Scotland, one in Wales and one in Northern Ireland). Diary data were submitted by 35 newly qualified midwives; 28 were graduates of the three-year programme and seven of the shortened programme. Diary entries were analysed using thematic analysis (Braun and Clarke, 2006), with a focus on identification of key events in the working lives of the newly qualified midwives. A total of 263 key events were identified, under three main themes: (1) impact of the event on confidence, (2) gaps in knowledge or experience and (3) articulated frustration, conflict or distress. Essentially, pre-registration education, delivered largely by midwife teachers and supported by clinical mentors, has been shown to equip newly qualified midwives to work effectively as autonomous practitioners caring for mothers and babies. While newly qualified midwives are able to cope with a range of challenging clinical situations in a safe manner, they lack confidence in key areas. Positive reinforcement by supportive colleagues plays a significant role in enabling them to develop as practitioners. Whilst acknowledging the importance of normality in childbearing, there is a need within the curriculum to enable midwives to recognise and respond to complex care situations by providing theory

  5. Predicting Key Events in the Popularity Evolution of Online Information.

    Hu, Ying; Hu, Changjun; Fu, Shushen; Fang, Mingzhe; Xu, Wenwen

    2017-01-01

    The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task: predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and the containment of rumors. However, it is very challenging to solve this new prediction task, due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then present a universal method for identifying the key events across different patterns of popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then conduct a correlation analysis in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric that considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution.

  6. Predicting Key Events in the Popularity Evolution of Online Information.

    Ying Hu

    Full Text Available The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task: predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and the containment of rumors. However, it is very challenging to solve this new prediction task, due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then present a universal method for identifying the key events across different patterns of popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then conduct a correlation analysis in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric that considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution.
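
    The smoothing-then-identification step described above can be sketched in a few lines. This is a hedged illustration, not the authors' code: the window size, the growth-ratio threshold for "burst", and the decay threshold for "fade" are all assumed parameters.

```python
def moving_average(series, window=3):
    """Simple moving average; the first window-1 points use a shorter window."""
    return [sum(series[max(0, i - window + 1):i + 1]) /
            len(series[max(0, i - window + 1):i + 1])
            for i in range(len(series))]

def key_events(series, window=3, burst_ratio=2.0, fade_ratio=0.25):
    """Return (burst, peak, fade) indices in the smoothed series."""
    s = moving_average(series, window)
    peak = max(range(len(s)), key=lambda i: s[i])
    # burst: first point whose growth ratio exceeds burst_ratio
    burst = next((i for i in range(1, peak + 1)
                  if s[i - 1] > 0 and s[i] / s[i - 1] >= burst_ratio), 0)
    # fade: first point after the peak below fade_ratio of the peak value
    fade = next((i for i in range(peak + 1, len(s))
                 if s[i] <= fade_ratio * s[peak]), len(s) - 1)
    return burst, peak, fade

popularity = [1, 1, 2, 8, 30, 55, 60, 40, 20, 8, 3, 1]
print(key_events(popularity))  # → (3, 7, 10)
```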

  7. Power quality event classification: an overview and key issues ...

    ... used for PQ event classification. Various artificial intelligence techniques used in PQ event classification are also discussed. Major key issues and challenges in classifying PQ events are critically examined and outlined. Keywords: power quality, PQ event classifiers, artificial intelligence techniques, PQ noise, ...

  8. Key events in the history of sustainable development

    Sustainable Development Commission

    2005-01-01

    This document is a table which summarizes the key events in the history of sustainable development, adapted from the International Institute for Sustainable Development's sustainable development timeline. Publisher PDF

  9. Genetic stratigraphy of key demographic events in Arabia.

    Fernandes, Verónica; Triska, Petr; Pereira, Joana B; Alshamali, Farida; Rito, Teresa; Machado, Alison; Fajkošová, Zuzana; Cavadas, Bruno; Černý, Viktor; Soares, Pedro; Richards, Martin B; Pereira, Luísa

    2015-01-01

    At the crossroads between Africa and Eurasia, Arabia is necessarily a melting pot, its peoples enriched by successive gene flow over the generations. Estimating the timing and impact of these multiple migrations is an important step in reconstructing the key demographic events in human history. However, current methods based on genome-wide information identify admixture events inefficiently, tending to estimate only the more recent ages, as here in the case of admixture events across the Red Sea (~8-37 generations for African input into Arabia, and 30-90 generations for "back-to-Africa" migrations). An mtDNA-based founder analysis, corroborated by detailed analysis of the whole-mtDNA genome, affords an alternative means by which to identify, date and quantify multiple migration events at greater time depths, across the full range of modern human history, albeit for the maternal line of descent only. In Arabia, this approach enables us to infer several major pulses of dispersal between the Near East and Arabia, most likely via the Gulf corridor. Although some relict lineages survive in Arabia from the time of the out-of-Africa dispersal, 60 ka, the major episodes in the peopling of the Peninsula took place from north to south in the Late Glacial and, to a lesser extent, the immediate post-glacial/Neolithic. Exchanges across the Red Sea were mainly due to the Arab slave trade and maritime dominance (from ~2.5 ka to very recent times), but had already begun by the early Holocene, fuelled by the establishment of maritime networks since ~8 ka. The main "back-to-Africa" migrations, again undetected by genome-wide dating analyses, occurred in the Late Glacial period for introductions into eastern Africa, whilst the Neolithic was more significant for migrations towards North Africa.

  10. Analysis of extreme events

    Khuluse, S

    2009-04-01

    Full Text Available ... (ii) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...
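
    Extreme value theory, which this abstract names as the risk-analysis tool, can be illustrated with a minimal sketch: fitting a Gumbel distribution to annual maxima by the method of moments and reading off an exceedance probability. The data set and the 200-unit damage level below are invented for the example.

```python
import math

def fit_gumbel(maxima):
    """Method-of-moments fit of a Gumbel distribution to block maxima."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi   # scale parameter
    mu = mean - 0.5772 * beta             # location (Euler-Mascheroni constant)
    return mu, beta

def exceedance_prob(x, mu, beta):
    """P(annual maximum > x) under the fitted Gumbel distribution."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

annual_maxima = [102, 87, 140, 95, 121, 110, 133, 98, 150, 115]
mu, beta = fit_gumbel(annual_maxima)
print(exceedance_prob(200, mu, beta))  # probability of a 200-unit event
```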

  11. Mining the key predictors for event outbreaks in social networks

    Yi, Chengqi; Bao, Yuanyuan; Xue, Yibo

    2016-04-01

    It will be beneficial to devise a method to predict a so-called event outbreak. Existing works mainly focus on exploring effective methods for improving the accuracy of predictions, while ignoring the underlying causes: What makes an event go viral? What factors significantly influence the prediction of an event outbreak in social networks? In this paper, we proposed a novel definition for an event outbreak, taking into account the structural changes to a network during the propagation of content. In addition, we investigated features that were sensitive to predicting an event outbreak. In order to investigate the universality of these features at different stages of an event, we split the entire lifecycle of an event into 20 equal segments according to the proportion of the propagation time. We extracted 44 features, including features related to content, users, structure, and time, from each segment of the event. Based on these features, we proposed a prediction method using supervised classification algorithms to predict event outbreaks. Experimental results indicate that, as time goes by, our method is highly accurate, with a precision rate ranging from 79% to 97% and a recall rate ranging from 74% to 97%. In addition, after applying a feature-selection algorithm, the top five selected features can considerably improve the accuracy of the prediction. Data-driven experimental results show that the entropy of the eigenvector centrality, the entropy of the PageRank, the standard deviation of the betweenness centrality, the proportion of re-shares without content, and the average path length are the key predictors for an event outbreak. Our findings are especially useful for further exploring the intrinsic characteristics of outbreak prediction.
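
    The correlation-based feature selection the abstract describes can be sketched as follows. The thresholds, feature names, and toy data are assumptions for illustration; the paper's actual procedure and its 44 features are not reproduced here.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_features(features, label, relevance=0.5, redundancy=0.95):
    """features: dict name -> list of values; label: list of 0/1 outcomes."""
    ranked = sorted(features, key=lambda f: -abs(pearson(features[f], label)))
    selected = []
    for f in ranked:
        if abs(pearson(features[f], label)) < relevance:
            continue          # irrelevant to the outbreak label
        if any(abs(pearson(features[f], features[g])) >= redundancy
               for g in selected):
            continue          # redundant with an already selected feature
        selected.append(f)
    return selected

features = {
    "pagerank_entropy":   [0.1, 0.4, 0.9, 0.8, 0.2, 0.7],
    "pagerank_entropy_2": [0.1, 0.4, 0.9, 0.8, 0.2, 0.7],  # duplicate -> redundant
    "noise":              [0.5, 0.5, 0.4, 0.6, 0.5, 0.5],
}
label = [0, 0, 1, 1, 0, 1]
print(select_features(features, label))  # → ['pagerank_entropy']
```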

  12. Finite key analysis in quantum cryptography

    Meyer, T.

    2007-01-01

    finite number of input signals, without making any approximations. As an application, we investigate the so-called ''Tomographic Protocol'', which is based on the Six-State Protocol and where Alice and Bob can obtain the additional information which quantum state they share after the distribution step of the protocol. We calculate the obtainable secret key rate under the assumption that the eavesdropper only conducts collective attacks and give a detailed analysis of the dependence of the key rate on various parameters: The number of input signals (the block size), the error rate in the sifted key (the QBER), and the security parameter. Furthermore, we study the influence of multi-photon events which naturally occur in a realistic implementation (orig.)

  13. Cogeneration: Key feasibility analysis parameters

    Coslovi, S.; Zulian, A.

    1992-01-01

    This paper first reviews the essential requirements, in terms of scope, objectives and methods, of technical/economic feasibility analyses applied to cogeneration systems proposed for industrial plants in Italy. Attention is given to the influence on overall feasibility of the following factors: electric power and fuel costs, equipment coefficients of performance, operating schedules, maintenance costs, Italian Government taxes, and financial and legal incentives. Through an examination of several feasibility studies that were done on cogeneration proposals relative to different industrial sectors, a sensitivity analysis is performed on the effects of varying the weights of different cost-benefit analysis parameters. With the use of statistical analyses, standard deviations are then determined for key analysis parameters, and guidelines are suggested for analysis simplifications.

  14. Decision Trajectories in Dementia Care Networks: Decisions and Related Key Events.

    Groen-van de Ven, Leontine; Smits, Carolien; Oldewarris, Karen; Span, Marijke; Jukema, Jan; Eefsting, Jan; Vernooij-Dassen, Myrra

    2017-10-01

    This prospective multiperspective study provides insight into the decision trajectories of people with dementia by studying the decisions made and related key events. This study includes three waves of interviews, conducted between July 2010 and July 2012, with 113 purposefully selected respondents (people with beginning to advanced stages of dementia and their informal and professional caregivers) completed in 12 months (285 interviews). Our multilayered qualitative analysis consists of content analysis, timeline methods, and constant comparison. Four decision themes emerged: managing daily life, arranging support, community living, and preparing for the future. Eight key events delineate the decision trajectories of people with dementia. Decisions and key events differ between people with dementia living alone and living with a caregiver. Our study clarifies that decisions relate not only to the disease but to living with the dementia. Individual differences in decision content and sequence may affect shared decision-making and advance care planning.

  15. Finite key analysis in quantum cryptography

    Meyer, T.

    2007-10-31

    the obtainable key rate for any finite number of input signals, without making any approximations. As an application, we investigate the so-called ''Tomographic Protocol'', which is based on the Six-State Protocol and where Alice and Bob can obtain the additional information which quantum state they share after the distribution step of the protocol. We calculate the obtainable secret key rate under the assumption that the eavesdropper only conducts collective attacks and give a detailed analysis of the dependence of the key rate on various parameters: The number of input signals (the block size), the error rate in the sifted key (the QBER), and the security parameter. Furthermore, we study the influence of multi-photon events which naturally occur in a realistic implementation (orig.)

  16. EVENT PLANNING USING FUNCTION ANALYSIS

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  17. MGR External Events Hazards Analysis

    Booth, L.

    1999-01-01

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  18. Negated bio-events: analysis and identification

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The

  19. TEMAC, Top Event Sensitivity Analysis

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

    1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem - Maxima of: 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement
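
    A toy version of the computation TEMAC automates: evaluate a top event from its minimal cut sets, and estimate its probability by Monte Carlo sampling of basic event failures. The cut sets and probabilities below are invented; TEMAC's matrix representation and sensitivity machinery are not reproduced.

```python
import random

def top_event(cut_sets, failed):
    """Top event occurs if every basic event in some cut set has failed."""
    return any(all(e in failed for e in cs) for cs in cut_sets)

def mc_top_event_prob(cut_sets, probs, samples=20000, seed=1):
    """Monte Carlo estimate of the top event probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        failed = {e for e, p in probs.items() if rng.random() < p}
        hits += top_event(cut_sets, failed)
    return hits / samples

cut_sets = [{"A", "B"}, {"C"}]          # top event = (A and B) or C
probs = {"A": 0.1, "B": 0.2, "C": 0.01}
print(mc_top_event_prob(cut_sets, probs))  # exact value is 0.0298
```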

  20. Bayesian analysis of rare events

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
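
    The core idea of BUS, reinterpreting Bayesian updating as rejection sampling, can be sketched in miniature. The model below (Poisson failure counts with an exponential prior on the rate) and all numbers are assumptions for illustration; the paper couples this idea to FORM, importance sampling, and Subset Simulation rather than plain rejection sampling.

```python
import math
import random

def posterior_rare_event_prob(observed_failures, exposure, threshold,
                              prior_mean=0.5, budget=50000, seed=7):
    """Rejection-sampling Bayesian update of a failure rate, then the
    posterior probability that the rate exceeds a threshold."""
    rng = random.Random(seed)

    def likelihood(rate):  # Poisson likelihood of the observed count
        lam = rate * exposure
        return math.exp(-lam) * lam ** observed_failures / math.factorial(observed_failures)

    # rejection constant: a crude grid-search upper bound on the likelihood
    c = max(likelihood(k / 100) for k in range(1, 500))
    accepted = []
    while len(accepted) < 1000 and budget > 0:
        budget -= 1
        rate = rng.expovariate(1.0 / prior_mean)   # draw from the prior
        if rng.random() * c < likelihood(rate):    # accept w.p. L(rate)/c
            accepted.append(rate)
    # posterior probability that the failure rate exceeds the threshold
    return sum(r > threshold for r in accepted) / len(accepted)

print(posterior_rare_event_prob(observed_failures=2, exposure=10.0, threshold=0.5))
```

    With this conjugate setup the posterior is Gamma-distributed, so the estimate can be checked analytically; BUS's contribution is that the same accept/reject structure lets established rare-event estimators do the sampling when no closed form exists.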

  1. Advanced event reweighting using multivariate analysis

    Martschei, D; Feindt, M; Honc, S; Wagner-Kuhr, J

    2012-01-01

    Multivariate analysis (MVA) methods, especially discrimination techniques such as neural networks, are key ingredients in modern data analysis and play an important role in high energy physics. They are usually trained on simulated Monte Carlo (MC) samples to discriminate so-called 'signal' from 'background' events and are then applied to data to select real events of signal type. Here we address procedures that improve this workflow: first, the enhancement of data/MC agreement by reweighting MC samples on a per-event basis; then, training MVAs on real data using the sPlot technique. Finally we address the construction of MVAs whose discriminator is independent of a certain control variable, i.e. cuts on this variable will not change the discriminator shape.

  2. Key European Grid event to take place in Geneva

    2006-01-01

    EGEE'06 is the main conference of the EGEE project, which is co-funded by the European Union and hosted by CERN. More than 90 partners all over Europe and beyond are working together in EGEE to provide researchers in both academia and industry with access to major computing resources, independent of their geographic location. The largest user community of the EGEE Grid is the High-Energy Physics community and in particular the LHC experiments, which are already making heavy use of the infrastructure to prepare for data taking. However, with the many new challenges faced by EGEE in its second phase that started in April this year, an even broader audience than at previous EGEE conferences is expected. In particular, a large number of related Grid projects will feature prominently in both plenary and parallel sessions during the 5 days of this event. Industry will also be well represented, highlighting the EGEE project's commitment to technology transfer to industry. CERN is the host of the conference, which i...

  3. Key events and their effects on cycling behaviour in Dar-es-Salaam : abstract + powerpoint

    Nkurunziza, A.; Zuidgeest, M.H.P.; Brussel, M.J.G.; van Maarseveen, M.F.A.M.

    2012-01-01

    The paper explores key events and investigates their effects on cycling behaviour in the city of Dar-es-Salaam, Tanzania. The objective of the study is to identify specific key events during a person’s life course with a significant effect on change of travel behaviour towards cycling in relation to

  4. Trending analysis of precursor events

    Watanabe, Norio

    1998-01-01

    The Accident Sequence Precursor (ASP) Program of the United States Nuclear Regulatory Commission (U.S. NRC) identifies and categorizes operational events at nuclear power plants in terms of the potential for core damage. The ASP analysis has been performed on a yearly basis and the results have been published in the annual reports. This paper describes the trends in initiating events and dominant sequences for 459 precursors identified in the ASP Program during the 1969-94 period and also discusses a comparison with dominant sequences predicted in the past Probabilistic Risk Assessment (PRA) studies. These trends were examined for three time periods, 1969-81, 1984-87 and 1988-94. Although different models had been used in the ASP analyses for these three periods, the distributions of precursors by dominant sequences show similar trends to each other. For example, the sequences involving loss of both main and auxiliary feedwater were identified in many PWR events, and those involving loss of both high and low pressure coolant injection were found in many BWR events. Also, it was found that these dominant sequences were comparable to those determined to be dominant in the predictions by the past PRAs. As well, a list of the 459 precursors identified is provided in the Appendix, indicating initiating event types, unavailable systems, dominant sequences, conditional core damage probabilities, and so on. (author)

  5. Event Shape Analysis in ALICE

    AUTHOR|(CDS)2073367; Paic, Guy

    2009-01-01

    Jets are the final state manifestation of hard parton scattering. Since at LHC energies the production of hard processes in proton-proton collisions will be copious and varied, it is important to develop methods to identify them through the study of their final states. In the present work we describe a method based on the use of shape variables to discriminate events according to their topologies. A very attractive feature of this analysis is the possibility of using the tracking information of the TPC+ITS in order to identify specific events like jets. Through the correlation between the quantities thrust and recoil, calculated in minimum bias simulations of proton-proton collisions at 10 TeV, we show the sensitivity of the method to select specific topologies and high multiplicity. The presented results were obtained both at generator level and after reconstruction. It remains that with any kind of jet reconstruction algorithm one will be confronted in general with overlapping jets. The present meth...
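
    A transverse thrust calculation of the kind used in such event shape analyses can be sketched with a simple axis scan. This is an illustration only, not the ALICE implementation; the track momenta are invented.

```python
import math

def transverse_thrust(tracks, steps=360):
    """tracks: list of (px, py) in the transverse plane.
    Scan candidate axis directions and keep the one maximising the
    summed |projections| of the track momenta, normalised to sum |p|."""
    norm = sum(math.hypot(px, py) for px, py in tracks)
    best = 0.0
    for k in range(steps):
        phi = math.pi * k / steps            # axis angle (half-circle suffices)
        nx, ny = math.cos(phi), math.sin(phi)
        best = max(best, sum(abs(px * nx + py * ny) for px, py in tracks))
    return best / norm

# Pencil-like (jetty) event: thrust close to 1
jetty = [(2.0, 0.1), (1.5, -0.1), (-1.8, 0.0), (-1.2, 0.05)]
# Roughly isotropic event: thrust closer to 2/pi
iso = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0),
       (0.7, 0.7), (-0.7, 0.7), (0.7, -0.7), (-0.7, -0.7)]
print(transverse_thrust(jetty), transverse_thrust(iso))
```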

  6. Key events and their effects on cycling behaviour in Dar-es-Salaam : abstract + powerpoint

    Nkurunziza, A.; Zuidgeest, M.H.P.; Brussel, M.J.G.; van Maarseveen, M.F.A.M.

    2012-01-01

    The paper explores key events and investigates their effects on cycling behaviour in the city of Dar-es-Salaam, Tanzania. The objective of the study is to identify specific key events during a person’s life course with a significant effect on change of travel behaviour towards cycling in relation to stage of change. Stage of change is a key construct of the transtheoretical model of behaviour change that defines behavioural readiness (intentions and actions) into six distinct categories (i.e....

  7. Artist concept illustrating key events on day by day basis during Apollo 9

    1969-01-01

    Artist concept illustrating key events on a day-by-day basis during the Apollo 9 mission. The first photograph illustrates activities on the first day of the mission, including flight crew preparation, orbital insertion, a 103 nautical mile orbit, separations, docking, and the docked Service Propulsion System burn (19792); Second day events include landmark tracking, pitch maneuver, yaw-roll maneuver, and high apogee orbits (19793); Third day events include crew transfer and Lunar Module system evaluation (19794); Fourth day events include use of camera, day-night extravehicular activity, use of golden slippers, and television over Texas and Louisiana (19795); Fifth day events include vehicles undocked, Lunar Module burns for rendezvous, maximum separation, ascent propulsion system burn, formation flying and docking, and Lunar Module jettison ascent burn (19796); Sixth thru ninth day events include service propulsion system burns and landmark sightings, photograph special tests (19797); Tenth day events i

  8. Joint Attributes and Event Analysis for Multimedia Event Detection.

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm established on a correlation vector that correlates them to a target event. Consequently, we could incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  9. Collecting operational event data for statistical analysis

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis.
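
    The report's basic quantity, an event occurrence rate estimated from an event count and a demand or exposure time, can be sketched in a few lines. The function name and the normal-approximation interval below are illustrative assumptions, not taken from the report:

    ```python
    import math

    def poisson_rate_ci(events: int, exposure: float, z: float = 1.96):
        """Point estimate and approximate 95% confidence interval for an
        event occurrence rate (events per unit of exposure).

        events   -- observed event count
        exposure -- total exposure (e.g. component-hours or demands)
        Uses a normal approximation on the count; illustrative only.
        """
        rate = events / exposure
        half_width = z * math.sqrt(events) / exposure
        return rate, max(0.0, rate - half_width), rate + half_width

    # 8 events observed in 40,000 component-hours (hypothetical data)
    rate, lo, hi = poisson_rate_ci(8, 40000.0)
    ```

    With a careful event definition and exposure measure, as the report emphasizes, the same count can be converted into a per-demand or per-hour rate without changing the code.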

  10. Risk analysis of brachytherapy events

    Buricova, P.; Zackova, H.; Hobzova, L.; Novotny, J.; Kindlova, A.

    2005-01-01

    For the prevention of radiological events it is necessary to identify hazardous situations and to analyse the nature of committed errors. Although a recommendation on the classification and prevention of radiological events ('Radiological Accidents') was prepared within the Czech Society of Radiation Oncology, Biology and Physics and approved by the Czech regulatory body (SONS) in 1999, only a few reports from brachytherapy practice have been submitted up to now. At radiotherapy departments, attention has mostly been paid to the problems of the dominant teletherapy treatments. In the last two decades, however, the usage of brachytherapy methods has gradually increased, because the nature of this treatment as well as the possibilities of operating facilities have completely changed: new radionuclides of high activity have been introduced and sophisticated afterloading systems controlled by computers are used. Consequently, the nature of errors that can occur in clinical practice has also been changing. To determine the potentially hazardous parts of a procedure, a so-called 'process tree', which follows the flow of the entire treatment process, has been created for the most frequent types of applications. Marking the location of errors on the process tree indicates where failures occurred, and accumulation of marks along branches shows weak points in the process. The analysed data provide useful information for preventing medical events in brachytherapy. The results strengthen the requirements given in the Recommendations of SONS and revealed the need for its amendment. They call especially for systematic registration of the events. (authors)

  11. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to

  12. Surface Management System Departure Event Data Analysis

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance of departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance of push-back events and a significantly high overall detection performance of runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  13. External events analysis for experimental fusion facilities

    Cadwallader, L.C.

    1990-01-01

    External events are those off-normal events that threaten facilities either from outside or inside the building. These events, such as floods, fires, and earthquakes, are among the leading risk contributors for fission power plants, and the nature of fusion facilities indicates that they may also be leading contributors to fusion risk. This paper gives overviews of analysis methods, references good analysis guidance documents, and gives design tips for mitigating the effects of floods and fires, seismic events, and aircraft impacts. Implications for future fusion facility siting are also discussed. Sites similar to fission plant sites are recommended. 46 refs

  14. Event analysis in primary substation

    Paulasaari, H. [Tampere Univ. of Technology (Finland)]

    1996-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: the first is the communication with the SRIO and the SCADA system, the second is the internal clock, which is used to produce time stamped event data, and the third is the capability to register some values during the fault. For example, during a short circuit fault the relay registers the value of the short circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage

  15. Event analysis in primary substation

    Paulasaari, H [Tampere Univ. of Technology (Finland)]

    1997-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: the first is the communication with the SRIO and the SCADA system, the second is the internal clock, which is used to produce time stamped event data, and the third is the capability to register some values during the fault. For example, during a short circuit fault the relay registers the value of the short circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage

  16. External event analysis methods for NUREG-1150

    Bohn, M.P.; Lambright, J.A.

    1989-01-01

    The US Nuclear Regulatory Commission is sponsoring probabilistic risk assessments of six operating commercial nuclear power plants as part of a major update of the understanding of risk as provided by the original WASH-1400 risk assessments. In contrast to the WASH-1400 studies, at least two of the NUREG-1150 risk assessments will include an analysis of risks due to earthquakes, fires, floods, etc., which are collectively known as external events. This paper summarizes the methods to be used in the external event analysis for NUREG-1150 and the results obtained to date. The two plants for which external events are being considered are Surry and Peach Bottom, a PWR and a BWR respectively. The external event analyses (through core damage frequency calculations) were completed in June 1989, with final documentation available in September. In contrast to most past external event analyses, wherein rudimentary systems models were developed reflecting each external event under consideration, the simplified NUREG-1150 analyses are based on the availability of the full internal event PRA systems models (event trees and fault trees) and make use of extensive computer-aided screening to reduce them to sequence cut sets important to each external event. This provides two major advantages: consistency and scrutability with respect to the internal event analysis are achieved, and the full gamut of random and test/maintenance unavailabilities is automatically included, while only those probabilistically important survive the screening process. Thus, full benefit of the internal event analysis is obtained by performing the internal and external event analyses sequentially
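
    The screening step described above, reducing internal-event cut sets to those probabilistically important for a given external event, can be illustrated with a toy example. The event names, probabilities, and threshold below are all hypothetical, not values from the NUREG-1150 analyses:

    ```python
    # Hypothetical basic events and probabilities (illustrative only).
    basic_event_prob = {
        "OFFSITE_POWER_LOST": 1.0,   # set to 1.0: assumed guaranteed by the earthquake
        "DG_FAILS": 5e-3,            # emergency diesel generator fails to start
        "PUMP_A_FAILS": 1e-3,
        "PUMP_B_FAILS": 1e-3,
    }

    # Cut sets taken from a (hypothetical) internal-event fault tree model.
    cut_sets = [
        ("OFFSITE_POWER_LOST", "DG_FAILS"),
        ("PUMP_A_FAILS", "PUMP_B_FAILS"),
    ]

    def cut_set_probability(cut_set):
        """Probability of a cut set assuming independent basic events."""
        p = 1.0
        for event in cut_set:
            p *= basic_event_prob[event]
        return p

    # Screen out cut sets below an importance threshold.
    THRESHOLD = 1e-4
    important = [cs for cs in cut_sets if cut_set_probability(cs) >= THRESHOLD]
    ```

    Conditioning some basic events on the external event (here, loss of offsite power forced to probability 1.0) and then re-screening is what lets the same internal-event models serve each external hazard.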

  17. NPP unusual events: data, analysis and application

    Tolstykh, V.

    1990-01-01

    The subject of the paper is the IAEA's cooperative treatment of unusual-event data and utilization of the operating safety experience feedback. The Incident Reporting System (IRS) and the Assessment of Safety Significant Events Team (ASSET) are discussed. The IRS methodology in collecting, handling, assessing and disseminating data on NPP unusual events (deviations, incidents and accidents) occurring during operation, surveillance and maintenance is outlined through the report gathering and issuing practice, the expert assessment procedures and the parameters of the system. After 7 years of existence the IAEA-IRS contains over 1000 reports and receives 1.5-4% of the total information on unusual events. The author considers the reports only as detailed technical 'records' of events requiring assessment. The ASSET approach, implying an in-depth analysis of occurrences directed towards level-1 PSA utilization, is commented on. The experts evaluated root causes for the reported events, and some trends are presented. Generally, internal events due to unexpected paths of water in the nuclear installations, occurrences related to the integrity of the primary heat transport systems, events associated with the engineered safety systems, and events involving the human factor represent the large groups deserving close attention. Personal recommendations are given on how to use event-related information for NPP safety improvement. 2 tabs (R.Ts)

  18. Data analysis of event tape and connection

    Gong Huili

    1995-01-01

    The data analysis on the VAX-11/780 computer is briefly described; the data come from the recorded event tape of the JUHU data acquisition system on the PDP-11/44 computer. The connection of the recorded event tapes of the XSYS data acquisition system on the VAX computer is also introduced

  19. Event History Analysis in Quantitative Genetics

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...
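
    A minimal sketch of the kind of time-to-event estimate this field builds on is the Kaplan-Meier survival curve. The example below is generic and handles right-censoring; it is not the thesis's mixed survival models:

    ```python
    def kaplan_meier(times):
        """Kaplan-Meier survival estimate.

        times -- list of (t, event) pairs; event=1 for an observed event
                 (e.g. death), event=0 for right-censoring at time t.
        Returns [(t, S(t))] at each time where an event occurred.
        """
        times = sorted(times)
        n_at_risk = len(times)
        surv, curve = 1.0, []
        i = 0
        while i < len(times):
            t = times[i][0]
            deaths = at_t = 0
            # Group all subjects leaving the risk set at time t.
            while i < len(times) and times[i][0] == t:
                at_t += 1
                deaths += times[i][1]
                i += 1
            if deaths:
                surv *= 1.0 - deaths / n_at_risk
                curve.append((t, surv))
            n_at_risk -= at_t
        return curve

    # Three deaths at t=2, 5, 7 and one censoring at t=3 (hypothetical data)
    curve = kaplan_meier([(2, 1), (3, 0), (5, 1), (7, 1)])
    ```

    The censored subject at t=3 leaves the risk set without contributing a step, which is exactly what distinguishes survival methods from naive averaging of event times.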

  20. Interpretation Analysis as a Competitive Event.

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  1. Key parameters analysis of hybrid HEMP simulator

    Mao Congguang; Zhou Hui

    2009-01-01

    According to the new standards on the high-altitude electromagnetic pulse (HEMP) developed by the International Electrotechnical Commission (IEC), the target parameter requirements of the key structure of the hybrid HEMP simulator are decomposed. Firstly, the influences of the different excitation sources and biconical structures on the key parameters of the radiated electric field wave shape are investigated and analyzed. Then, based on the influence curves, the target parameter requirements of the pulse generator are proposed. Finally the appropriate parameters of the biconical structure and the excitation sources are chosen, and the computational result of the electric field in free space is presented. The results are of great value for the design of the hybrid HEMP simulator. (authors)

  2. Human reliability analysis using event trees

    Heslinga, G.

    1983-01-01

    The shut-down procedure of a technologically complex installation such as a nuclear power plant consists of many human actions, some of which have to be performed several times. The procedure is regarded as a chain of modules of specific actions, some of which are analyzed separately. The analysis is carried out by making a Human Reliability Analysis event tree (HRA event tree) of each action, breaking down each action into small elementary steps. The application of event trees in human reliability analysis implies more difficulties than in the case of technical systems, where event trees were mainly used until now. The most important reason is that the operator is able to recover from a wrong performance; memory influences play a significant role. In this study these difficulties are dealt with theoretically. The following conclusions can be drawn: (1) in principle event trees may be used in human reliability analysis; (2) although in practice the operator will recover his fault partly, theoretically this can be described as starting the whole event tree again; (3) compact formulas have been derived by which the probability of reaching a specific failure consequence on passing through the HRA event tree after several recoveries can be calculated. (orig.)
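
    Conclusion (3) can be illustrated with a small sketch: if each pass through the tree ends in an error with probability f, and an error is recovered (the whole tree restarted) with probability r, then the probability of reaching the failure consequence within k recoveries is a finite geometric sum. The model and parameter values below are illustrative, not the paper's derived formulas:

    ```python
    def failure_probability(f, r, k):
        """Probability of reaching the failure consequence within k restarts.

        f -- probability that one pass through the HRA event tree ends in error
        r -- probability that an error is recovered, i.e. the tree is restarted
        k -- maximum number of restarts considered
        Hypothetical model of 'recovery = restart the whole event tree'.
        """
        # Pass i is reached after i recovered errors, each with probability f*r;
        # the failure consequence then requires an unrecovered error: f*(1 - r).
        return sum((f * r) ** i * f * (1 - r) for i in range(k + 1))

    p = failure_probability(f=0.1, r=0.8, k=2)
    ```

    Allowing more restarts can only add positive terms, so the failure probability grows monotonically in k towards the limit f(1-r)/(1-fr).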

  3. Human performance analysis of industrial radiography radiation exposure events

    Reece, W.J.; Hill, S.G.

    1995-01-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures

  4. Application of the International Life Sciences Institute Key Events Dose-Response Framework to food contaminants.

    Fenner-Crisp, Penelope A

    2012-12-01

    Contaminants are undesirable constituents in food. They may be formed during production of a processed food, present as a component in a source material, deliberately added to substitute for the proper substance, or the consequence of poor food-handling practices. Contaminants may be chemicals or pathogens. Chemicals generally degrade over time and become of less concern as a health threat. Pathogens have the ability to multiply, potentially resulting in an increased threat level. Formal structures have been lacking for systematically generating and evaluating hazard and exposure data for bioactive agents when problem situations arise. We need to know what the potential risk may be to determine whether intervention to reduce or eliminate contact with the contaminant is warranted. We need tools to aid us in assembling and assessing all available relevant information in an expeditious and scientifically sound manner. One such tool is the International Life Sciences Institute (ILSI) Key Events Dose-Response Framework (KEDRF). Developed as an extension of the WHO's International Program on Chemical Safety/ILSI mode of action/human relevance framework, it allows risk assessors to understand not only how a contaminant exerts its toxicity but also the dose response(s) for each key event and the ultimate outcome, including whether a threshold exists. This presentation will illustrate use of the KEDRF with case studies included in its development (chloroform and Listeria monocytogenes), after its publication in the peer-reviewed scientific literature (chromium VI), and in a work in progress (3-monochloro-1,2-propanediol).

  5. Management of investment-construction projects basing on the matrix of key events

    Morozenko Andrey Aleksandrovich

    2016-11-01

    Full Text Available The article considers current problematic issues in the management of investment-construction projects and examines questions of increasing the efficiency of construction operations on the basis of the formation of a reflex-adaptive organizational structure. The authors analyzed the necessity of forming a matrix of key events in the investment-construction project (ICP), which will create the optimal structure of the project based on the work program for its implementation. For convenience of representing the programs of project implementation in time, the authors recommend consolidating the works into separate, economically independent functional blocks. It is proposed to use an algorithm for forming the matrix of an investment-construction project that considers the economic independence of the functional blocks and the stages of the ICP implementation. The use of an extended network model is justified, which is supplemented by organizational and structural constraints at different stages of the project, highlighting key events fundamentally influencing the further course of the ICP implementation.

  6. Statistical analysis of solar proton events

    V. Kurt

    2004-06-01

    Full Text Available A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm²·s·sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. It is outlined that 231 of these proton events are flare related and only 22 of them are not associated with Hα flares. It is also noteworthy that 42 of these events are registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with western flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  7. Sentiment analysis on tweets for social events

    Zhou, Xujuan; Tao, Xiaohui; Yong, Jianming

    2013-01-01

    Sentiment analysis or opinion mining is an important type of text analysis that aims to support decision making by extracting and analyzing opinion oriented text, identifying positive and negative opinions, and measuring how positively or negatively an entity (i.e., people, organization, event, location, product, topic, etc.) is regarded. As more and more users express their political and religious views on Twitter, tweets become valuable sources of people's opinions. Tweets data can be efficiently used to infer people's opinions for marketing or social studies. This paper proposes a Tweets Sentiment Analysis Model (TSAM) that can spot the societal interest and general people's opinions in regard to a social event. In this paper, the Australian federal election 2010 event was taken as an example for sentiment analysis experiments. We are primarily interested in the sentiment of the specific
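
    A minimal lexicon-based sketch of the kind of scoring such a model performs is shown below. The word lists and the aggregation rule are invented for illustration; they are not TSAM's actual method:

    ```python
    # Tiny hand-made opinion lexicons (illustrative; not TSAM's lexicons).
    POSITIVE = {"good", "great", "win", "support", "excellent"}
    NEGATIVE = {"bad", "poor", "lose", "against", "terrible"}

    def tweet_sentiment(text: str) -> int:
        """Score one tweet: +1 per positive word, -1 per negative word."""
        words = text.lower().split()
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    def event_sentiment(tweets):
        """Count positive and negative tweets about a social event."""
        scores = [tweet_sentiment(t) for t in tweets]
        return sum(s > 0 for s in scores), sum(s < 0 for s in scores)

    positive, negative = event_sentiment(
        ["Great win for the party", "Bad terrible night", "Counting continues"]
    )
    ```

    Aggregating per-tweet polarities over all tweets mentioning an event is what turns individual opinions into a picture of societal interest around that event.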

  8. Event analysis in a primary substation

    Jaerventausta, P; Paulasaari, H [Tampere Univ. of Technology (Finland)]; Partanen, J [Lappeenranta Univ. of Technology (Finland)]

    1998-08-01

    The target of the project was to develop applications which observe the functions of a protection system by using modern microprocessor based relays. Microprocessor based relays have three essential capabilities: communication with the SCADA, the internal clock to produce time stamped event data, and the capability to register certain values during the fault. Using the above features some new functions for event analysis were developed in the project

  9. Crop damage by primates: quantifying the key parameters of crop-raiding events.

    Graham E Wallace

    Full Text Available Human-wildlife conflict often arises from crop-raiding, and insights regarding which aspects of raiding events determine crop loss are essential when developing and evaluating deterrents. However, because accounts of crop-raiding behaviour are frequently indirect, these parameters are rarely quantified or explicitly linked to crop damage. Using systematic observations of the behaviour of non-human primates on farms in western Uganda, this research identifies number of individuals raiding and duration of raid as the primary parameters determining crop loss. Secondary factors include distance travelled onto farm, age composition of the raiding group, and whether raids are in series. Regression models accounted for greater proportions of variation in crop loss when increasingly crop and species specific. Parameter values varied across primate species, probably reflecting differences in raiding tactics or perceptions of risk, and thereby providing indices of how comfortable primates are on-farm. Median raiding-group sizes were markedly smaller than the typical sizes of social groups. The research suggests that key parameters of raiding events can be used to measure the behavioural impacts of deterrents to raiding. Furthermore, farmers will benefit most from methods that discourage raiding by multiple individuals, reduce the size of raiding groups, or decrease the amount of time primates are on-farm. This study demonstrates the importance of directly relating crop loss to the parameters of raiding events, using systematic observations of the behaviour of multiple primate species.

  10. Crop Damage by Primates: Quantifying the Key Parameters of Crop-Raiding Events

    Wallace, Graham E.; Hill, Catherine M.

    2012-01-01

    Human-wildlife conflict often arises from crop-raiding, and insights regarding which aspects of raiding events determine crop loss are essential when developing and evaluating deterrents. However, because accounts of crop-raiding behaviour are frequently indirect, these parameters are rarely quantified or explicitly linked to crop damage. Using systematic observations of the behaviour of non-human primates on farms in western Uganda, this research identifies number of individuals raiding and duration of raid as the primary parameters determining crop loss. Secondary factors include distance travelled onto farm, age composition of the raiding group, and whether raids are in series. Regression models accounted for greater proportions of variation in crop loss when increasingly crop and species specific. Parameter values varied across primate species, probably reflecting differences in raiding tactics or perceptions of risk, and thereby providing indices of how comfortable primates are on-farm. Median raiding-group sizes were markedly smaller than the typical sizes of social groups. The research suggests that key parameters of raiding events can be used to measure the behavioural impacts of deterrents to raiding. Furthermore, farmers will benefit most from methods that discourage raiding by multiple individuals, reduce the size of raiding groups, or decrease the amount of time primates are on-farm. This study demonstrates the importance of directly relating crop loss to the parameters of raiding events, using systematic observations of the behaviour of multiple primate species. PMID:23056378

  11. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied for from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology; these were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in the seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with a high relative amount or an increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied for from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve, "electrical medical equipment" with four, and "diagnosis, surgery" with four. The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new

  12. Attack Graph Construction for Security Events Analysis

    Andrey Alexeevich Chechulin

    2014-09-01

    Full Text Available The paper is devoted to the investigation of the attack graph construction and analysis task for network security evaluation and real-time security event processing. The main object of this research is the attack modeling process. The paper contains a description of the attack graph building, modification and analysis technique, as well as an overview of an implemented prototype for network security analysis based on the attack graph approach.
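
    The core of the attack-graph idea can be sketched as path enumeration over a state graph. The hosts, edges, and depth-first enumeration below are illustrative assumptions, not the paper's prototype:

    ```python
    # Hypothetical attack graph: nodes are attacker states (compromised hosts),
    # directed edges are exploits leading from one state to the next.
    attack_graph = {
        "internet": ["web_server"],
        "web_server": ["db_server", "workstation"],
        "workstation": ["domain_controller"],
        "db_server": ["domain_controller"],
        "domain_controller": [],
    }

    def attack_paths(graph, source, target, path=None):
        """Enumerate all acyclic attack paths from source to target by DFS."""
        path = (path or []) + [source]
        if source == target:
            return [path]
        found = []
        for nxt in graph.get(source, []):
            if nxt not in path:  # do not revisit an already-compromised state
                found.extend(attack_paths(graph, nxt, target, path))
        return found

    paths = attack_paths(attack_graph, "internet", "domain_controller")
    ```

    Once the paths are enumerated, a security metric (shortest path, number of distinct paths, weakest exploit on each path) can be computed over them, which is the kind of evaluation an attack-graph analyzer automates.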

  13. The Unfolding of LGBT Lives: Key Events Associated With Health and Well-being in Later Life.

    Fredriksen-Goldsen, Karen I; Bryan, Amanda E B; Jen, Sarah; Goldsen, Jayn; Kim, Hyun-Jun; Muraco, Anna

    2017-02-01

    Life events are associated with the health and well-being of older adults. Using the Health Equity Promotion Model, this article explores historical and environmental context as it frames life experiences and adaptation of lesbian, gay, bisexual, and transgender (LGBT) older adults. This was the largest study to date of LGBT older adults to identify life events related to identity development, work, and kin relationships and their associations with health and quality of life (QOL). Using latent profile analysis (LPA), clusters of life events were identified and associations between life event clusters were tested. On average, LGBT older adults first disclosed their identities in their 20s; many experienced job-related discrimination. More had been in opposite-sex marriage than in same-sex marriage. Four clusters emerged: "Retired Survivors" were the oldest and one of the most prevalent groups; "Midlife Bloomers" first disclosed their LGBT identities in mid-40s, on average; "Beleaguered At-Risk" had high rates of job-related discrimination and few social resources; and "Visibly Resourced" had a high degree of identity visibility and were socially and economically advantaged. Clusters differed significantly in mental and physical health and QOL, with the Visibly Resourced faring best and Beleaguered At-Risk faring worst on most indicators; Retired Survivors and Midlife Bloomers showed similar health and QOL. Historical and environmental contexts frame normative and non-normative life events. Future research will benefit from the use of longitudinal data and an assessment of timing and sequencing of key life events in the lives of LGBT older adults. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. The Unfolding of LGBT Lives: Key Events Associated With Health and Well-being in Later Life

    Fredriksen-Goldsen, Karen I.; Bryan, Amanda E. B.; Jen, Sarah; Goldsen, Jayn; Kim, Hyun-Jun; Muraco, Anna

    2017-01-01

    Purpose of the Study: Life events are associated with the health and well-being of older adults. Using the Health Equity Promotion Model, this article explores historical and environmental context as it frames life experiences and adaptation of lesbian, gay, bisexual, and transgender (LGBT) older adults. Design and Methods: This was the largest study to date of LGBT older adults to identify life events related to identity development, work, and kin relationships and their associations with health and quality of life (QOL). Using latent profile analysis (LPA), clusters of life events were identified and associations between life event clusters were tested. Results: On average, LGBT older adults first disclosed their identities in their 20s; many experienced job-related discrimination. More had been in opposite-sex marriage than in same-sex marriage. Four clusters emerged: “Retired Survivors” were the oldest and one of the most prevalent groups; “Midlife Bloomers” first disclosed their LGBT identities in mid-40s, on average; “Beleaguered At-Risk” had high rates of job-related discrimination and few social resources; and “Visibly Resourced” had a high degree of identity visibility and were socially and economically advantaged. Clusters differed significantly in mental and physical health and QOL, with the Visibly Resourced faring best and Beleaguered At-Risk faring worst on most indicators; Retired Survivors and Midlife Bloomers showed similar health and QOL. Implications: Historical and environmental contexts frame normative and non-normative life events. Future research will benefit from the use of longitudinal data and an assessment of timing and sequencing of key life events in the lives of LGBT older adults. PMID:28087792

  15. Synopsis of key persons, events, and associations in the history of Latino psychology.

    Padilla, Amado M; Olmedo, Esteban

    2009-10-01

    In this article, we present a brief synopsis of six early Latino psychologists, several key conferences, the establishment of research centers, and early efforts to create an association for Latino psychologists. Our chronology runs from approximately 1930 to 2000. This history is a firsthand account of how these early leaders, conferences, and efforts to bring Latinos and Latinas together served as a backdrop to current research and practice in Latino psychology. This history of individuals and events is also intertwined with the American Psychological Association and the National Institute of Mental Health and efforts by Latino psychologists to obtain the professional support necessary to lay down the roots of a Latino presence in psychology. Copyright 2009 APA, all rights reserved.

  16. Analysis of the differential-phase-shift-keying protocol in the quantum-key-distribution system

    Rong-Zhen, Jiao; Chen-Xu, Feng; Hai-Qiang, Ma

    2009-01-01

    The analysis is based on the error rate and the secure communication rate as functions of distance for three quantum-key-distribution (QKD) protocols: the Bennett–Brassard 1984, the Bennett–Brassard–Mermin 1992, and the coherent differential-phase-shift keying (DPSK) protocols. We consider the secure communication rate of the DPSK protocol against an arbitrary individual attack, including the most commonly considered intercept-resend and photon-number-splitting attacks, and conclude that the simple and efficient differential-phase-shift-keying protocol allows for more than 200 km of secure communication distance with high communication rates.
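The distance scaling behind such rate-versus-distance curves can be illustrated with a toy model. The loss coefficient, detector efficiency, mean photon number, and pulse rate below are invented for illustration, and this sketch deliberately ignores the security analysis (error correction and privacy amplification) that the paper performs:

```python
import math

def transmittance(dist_km, alpha_db_per_km=0.2, eta_det=0.1):
    # overall transmittance: fiber loss in dB/km combined with detector efficiency
    return eta_det * 10 ** (-alpha_db_per_km * dist_km / 10)

def sifted_rate(dist_km, pulse_rate_hz=1e9, mu=0.2):
    # expected detection rate; in DPSK every detection yields a sifted bit,
    # so no basis-sifting factor appears
    return pulse_rate_hz * mu * transmittance(dist_km)

rate_200km = sifted_rate(200.0)  # still on the kbit/s scale in this toy model
```

Even this crude model shows why DPSK's lack of basis sifting helps keep rates usable at long distance.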

  17. Event tree analysis using artificial intelligence techniques

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  18. Disruptive event analysis: volcanism and igneous intrusion

    Crowe, B.M.

    1979-01-01

    Three basic topics are addressed for the disruptive event analysis: first, the range of disruptive consequences of a radioactive waste repository by volcanic activity; second, the possible reduction of the risk of disruption by volcanic activity through selective siting of a repository; and third, the quantification of the probability of repository disruption by volcanic activity

  19. Key-space analysis of double random phase encryption technique

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space, indicating the lack of feasibility of a simple brute-force attack.
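The style of experiment described, measuring decryption error over trial keys, can be reproduced numerically. This is a hedged sketch: the image size, trial count, and amplitude-only error metric are arbitrary choices, not the authors' exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32
img = rng.random((N, N))  # plaintext amplitude image

def random_phase(rng):
    # unit-modulus random phase mask
    return np.exp(1j * 2 * np.pi * rng.random((N, N)))

k1, k2 = random_phase(rng), random_phase(rng)  # input- and Fourier-plane keys

def encrypt(f, k1, k2):
    return np.fft.ifft2(np.fft.fft2(f * k1) * k2)

def decrypt(g, k1, k2):
    # undo the Fourier-plane mask, then the input-plane mask
    return np.fft.ifft2(np.fft.fft2(g) * np.conj(k2)) * np.conj(k1)

cipher = encrypt(img, k1, k2)

def nmse(est, ref):
    # normalized mean-squared error on the recovered amplitude
    return np.sum((np.abs(est) - ref) ** 2) / np.sum(ref ** 2)

right = nmse(decrypt(cipher, k1, k2), img)
# brute-force attempts with independently drawn (wrong) key pairs
wrong = [nmse(decrypt(cipher, random_phase(rng), random_phase(rng)), img)
         for _ in range(20)]
```

Plotting the `wrong` errors over many more trials gives the kind of key-space error distribution the paper examines.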

  20. Parallel processor for fast event analysis

    Hensley, D.C.

    1983-01-01

    Current maximum data rates from the Spin Spectrometer of approx. 5000 events/s (up to 1.3 MBytes/s) and minimum analysis requiring at least 3000 operations/event require a CPU cycle time near 70 ns. In order to achieve an effective cycle time of 70 ns, a parallel processing device is proposed where up to 4 independent processors will be implemented in parallel. The individual processors are designed around the Am2910 Microsequencer, the AM29116 μP, and the Am29517 Multiplier. Satellite histogramming in a mass memory system will be managed by a commercial 16-bit μP system
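The cycle-time requirement quoted above follows from simple arithmetic, which a short sketch makes explicit (the four-way split assumes perfect parallel efficiency, which real hardware would not achieve):

```python
# figures taken from the abstract above
events_per_s = 5000
ops_per_event = 3000

required_ops_per_s = events_per_s * ops_per_event        # 1.5e7 ops/s
effective_cycle_ns = 1e9 / required_ops_per_s            # ~67 ns effective cycle

n_processors = 4
per_processor_cycle_ns = effective_cycle_ns * n_processors  # ~267 ns per processor
```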

  1. Dynamic Event Tree Analysis Through RAVEN

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional Event-Tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One disadvantage of these methods is that the timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies to perform safety assessment of Nuclear Power Plants (NPP). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) control logic driver for the new thermal-hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining-based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, a Dynamic PRA analysis, using a Dynamic Event Tree, of a simplified pressurized water reactor in a Station Black-Out scenario is presented.
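The branching idea behind a dynamic event tree can be sketched independently of RAVEN/RELAP-7. The branch points and probabilities below are invented for illustration (a real DET would branch at times determined by the simulated plant state):

```python
# each branch point splits every current sequence into its possible outcomes
branch_points = [
    ("diesel_start",   {"ok": 0.95, "fail": 0.05}),
    ("power_recovery", {"ok": 0.80, "fail": 0.20}),
]

def expand(branch_points):
    sequences = [((), 1.0)]  # (path, probability)
    for name, outcomes in branch_points:
        sequences = [
            (path + ((name, outcome),), p * q)
            for path, p in sequences
            for outcome, q in outcomes.items()
        ]
    return sequences

sequences = expand(branch_points)
# in this toy model the end state is "damage" only if every branching failed
p_damage = sum(p for path, p in sequences
               if all(outcome == "fail" for _, outcome in path))
```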

  2. Safety culture in nuclear installations: Bangladesh perspectives and key lessons learned from major events

    Jalil, A.; Rabbani, G.

    2002-01-01

    Steps necessary to ensure safety in nuclear installations are suggested, one of which is enhancing the safety culture. It is necessary to gain a common understanding of the concept itself and of the development stages of safety culture, achieved through good management practices and leadership, for long-term safety culture improvement. International topical meetings on safety culture may serve as an important forum for the exchange of experiences. From such conventions, new initiatives and programmes may emerge which, when implemented around the world, are likely to improve safety management and thus strengthen the safety culture in nuclear installations. International co-operation and learning should be promoted to facilitate the sharing of achievements, to face the challenges involved in the management of safety, to set priorities for future work, and to identify areas of co-operation. Key lessons learned from some major events are reported. The present status and future trends of nuclear safety culture in Bangladesh are also discussed. (author)

  3. Poisson-event-based analysis of cell proliferation.

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
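A nonhomogeneous Poisson process with an exponentially increasing rate, the model fitted above, can be simulated by thinning. The rate parameters below are invented, not the fitted values from the paper:

```python
import math, random

random.seed(1)

def simulate_nhpp(lam0, tau, horizon):
    """Thinning sampler for an event rate lam(t) = lam0 * exp(t / tau)."""
    t, events = 0.0, []
    lam_max = lam0 * math.exp(horizon / tau)  # rate is increasing on [0, horizon]
    while True:
        t += random.expovariate(lam_max)      # candidate from the bounding process
        if t > horizon:
            return events
        if random.random() < lam0 * math.exp(t / tau) / lam_max:
            events.append(t)                  # accept with probability lam(t)/lam_max

mitoses = simulate_nhpp(lam0=0.5, tau=24.0, horizon=48.0)  # events/hour over 48 h
gaps = [b - a for a, b in zip(mitoses, mitoses[1:])]        # interevent times
```

The `gaps` list corresponds to the interevent-time statistics analysed in the abstract; for an increasing rate, gaps shrink on average over the observation window.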

  4. Simple Public Key Infrastructure Protocol Analysis and Design

    Vidergar, Alexander G

    2005-01-01

    ...). This thesis aims at proving the applicability of the Simple Public Key Infrastructure (SPKI) as a means of PKC. The strand space approach of Guttman and Thayer is used to provide an appropriate model for analysis...

  5. Multistate event history analysis with frailty

    Govert Bijwaard

    2014-05-01

    Background: In survival analysis, a large literature exists on frailty models, or models with unobserved heterogeneity. In the growing literature and modelling on multistate models, this issue is only in its infant phase. Ignoring frailty can, however, produce incorrect results. Objective: This paper presents how frailties can be incorporated into multistate models, with an emphasis on semi-Markov multistate models with a mixed proportional hazard structure. Methods: First, the aspects of frailty modeling in univariate (proportional hazard, Cox) and multivariate event history models are addressed. The implications of choosing shared or correlated frailty are highlighted. The relevant differences with recurrent events data are covered next. Multistate models are event history models that can have both multivariate and recurrent events. Incorporating frailty in multistate models, therefore, brings all the previously addressed issues together. Assuming a discrete frailty distribution allows for a very general correlation structure among the transition hazards in a multistate model. Although some estimation procedures are covered, the emphasis is on conceptual issues. Results: The importance of multistate frailty modeling is illustrated with data on the labour market and migration dynamics of recent immigrants to the Netherlands.
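The effect of a shared discrete frailty on transition hazards can be sketched with a toy two-transition semi-Markov model. The frailty distribution and baseline hazards below are invented; the point is only that one frailty draw multiplies both transition hazards, inducing correlation between sojourn times:

```python
import random

random.seed(7)

# shared discrete frailty: each subject draws z once, and z multiplies
# the baseline hazards of both transitions (state 1 -> 2 -> 3)
FRAILTY = [(0.5, 0.5), (2.0, 0.5)]   # (value, probability)
BASE = {"1->2": 0.10, "2->3": 0.05}  # baseline exponential hazards

def draw_frailty():
    u, acc = random.random(), 0.0
    for z, p in FRAILTY:
        acc += p
        if u < acc:
            return z
    return FRAILTY[-1][0]

def simulate_subject():
    z = draw_frailty()
    t12 = random.expovariate(z * BASE["1->2"])        # sojourn in state 1
    t23 = t12 + random.expovariate(z * BASE["2->3"])  # entry time into state 3
    return z, t12, t23

subjects = [simulate_subject() for _ in range(20000)]
low = [t12 for z, t12, _ in subjects if z == 0.5]
high = [t12 for z, t12, _ in subjects if z == 2.0]
mean_low, mean_high = sum(low) / len(low), sum(high) / len(high)
```

High-frailty subjects transit earlier on both transitions, which is exactly the selection effect that biases naive multistate estimates when frailty is ignored.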

  6. Analysis hierarchical model for discrete event systems

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete event networks for robotic systems. In the hierarchical approach, the Petri net is analysed both as a network at the highest conceptual level and at the lowest level of local control, with extended Petri nets used for the modelling and control of complex robotic systems. Such a system is structured, controlled, and analysed here using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented and analysed on computers using specialized programs. Implementation of the hierarchical discrete event model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of a subsystem of the global robotic system. Since Petri models can be applied on general-purpose computers, the analysis, modelling, and control of complex manufacturing systems can be achieved using Petri nets, a pragmatic tool for modelling industrial discrete event systems. To highlight auxiliary times, the Petri model of the transport stream is divided into hierarchical levels whose sections are analysed successively. Simulation of the proposed robotic system using timed Petri nets offers the opportunity to view the timing of robotic and transmission activities; from spot measurements, graphics are obtained showing the average time for each transport activity, using the parameter sets of the finished products individually.
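The place/transition mechanics that such a hierarchical Petri model builds on can be sketched in a few lines. The robot-cell places and transitions here are invented, and this toy interpreter ignores arc weights and timing:

```python
# minimal place/transition Petri net: a transition is enabled when every
# input place holds at least one token; firing moves tokens along the arcs
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)  # place -> token count
        self.transitions = {}         # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# toy robot cell: parts move from a buffer through a machine to "done"
net = PetriNet({"buffer": 2, "machine_free": 1, "done": 0})
net.add_transition("load", ["buffer", "machine_free"], ["busy"])
net.add_transition("unload", ["busy"], ["machine_free", "done"])
net.fire("load")
net.fire("unload")
```

After one load/unload cycle, one part is done, the machine is free again, and `load` is re-enabled for the remaining buffered part.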

  7. System risk evolution analysis and risk critical event identification based on event sequence diagram

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change accident probabilities and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; for this, quantitative analysis of SRE is essential. Moreover, some events in the process of SRE are critical to system risk, because they act like “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are highly meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering an icing region.
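A Monte Carlo treatment of risk evolution along an event sequence can be sketched as follows. The pivotal events, the dependence of their probabilities on the state variable, and all numbers are invented; the structure (an SSV shifting branch probabilities along an ESD) is what matters:

```python
import random

random.seed(3)

def branch_prob(ssv):
    # hypothetical pivotal-event probability that rises with the state variable
    return min(1.0, 0.01 + 0.1 * ssv)

def run_once(ssv):
    # event sequence: initiating event -> protection fails -> recovery fails
    if random.random() >= branch_prob(ssv):        # protection holds
        return "safe"
    if random.random() >= branch_prob(ssv) * 0.5:  # recovery succeeds
        return "safe"
    return "accident"

def accident_freq(ssv, n=20000):
    return sum(run_once(ssv) == "accident" for _ in range(n)) / n

low, high = accident_freq(0.1), accident_freq(2.0)  # e.g. mild vs severe icing
```

Sweeping the SSV and watching `accident_freq` respond is a crude version of identifying risk-sensitive variables.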

  8. Using variable transformations to perform common event analysis

    Worrell, R.B.

    1977-01-01

    Any analytical method for studying the effect of common events on the behavior of a system is considered as being a form of common event analysis. The particular common events that are involved often represent quite different phenomena, and this has led to the development of different kinds of common event analysis. For example, common mode failure analysis, common cause analysis, critical location analysis, etc., are all different kinds of common event analysis for which the common events involved represent different phenomena. However, the problem that must be solved for each of these different kinds of common event analysis is essentially the same: Determine the effect of common events on the behavior of a system. Thus, a technique that is useful in achieving one kind of common event analysis is often useful in achieving other kinds of common event analysis
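One concrete way to "determine the effect of common events on the behavior of a system" is to keep the shared event as an explicit variable and enumerate states. The two-train parallel system and its probabilities below are invented for illustration:

```python
from itertools import product

# parallel system: train A fails if (a or c), train B fails if (b or c);
# c is the common event shared by both trains
basic = {"a": 0.01, "b": 0.01, "c": 0.005}

def prob_system_fails(include_common):
    total = 0.0
    for states in product([True, False], repeat=3):
        s = dict(zip(basic, states))
        p = 1.0
        for name, failed in s.items():
            p *= basic[name] if failed else 1 - basic[name]
        a_fail = s["a"] or (include_common and s["c"])
        b_fail = s["b"] or (include_common and s["c"])
        if a_fail and b_fail:  # the parallel system fails only if both trains fail
            total += p
    return total

with_common = prob_system_fails(True)
without_common = prob_system_fails(False)
```

The common event dominates: ignoring it underestimates system failure by roughly a factor of fifty in this example.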

  9. Probabilistic analysis of extreme wind events

    Chaviaropoulos, P.K. [Center for Renewable Energy Sources (CRES), Pikermi Attikis (Greece)

    1997-12-31

    A vital task in wind engineering and meteorology is to understand, measure, analyse and forecast extreme wind conditions, due to their significant effects on human activities and installations like buildings, bridges or wind turbines. The latest version of the IEC standard (1996) pays particular attention to the extreme wind events that have to be taken into account when designing or certifying a wind generator. Actually, the extreme wind events within a 50 year period are those which determine the "static" design of most of the wind turbine components. The extremes which are important for the safety of wind generators are those associated with the so-called "survival wind speed", the extreme operating gusts and the extreme wind direction changes. A probabilistic approach for the analysis of these events is proposed in this paper. Emphasis is put on establishing the relation between extreme values and physically meaningful "site calibration" parameters, like probability distribution of the annual wind speed, turbulence intensity and power spectra properties. (Author)
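Under the classical assumption that annual-maximum wind speeds follow a Gumbel distribution, the 50-year extreme follows directly from the location and scale parameters. The site parameters below are invented, not taken from the paper:

```python
import math

def gumbel_return_level(mu, beta, return_period_years):
    """Wind speed exceeded on average once per return period (Gumbel annual maxima)."""
    # invert the Gumbel CDF at the annual non-exceedance probability 1 - 1/T
    return mu - beta * math.log(-math.log(1.0 - 1.0 / return_period_years))

# hypothetical site: annual-maximum wind, location 28 m/s, scale 3 m/s
v50 = gumbel_return_level(28.0, 3.0, 50.0)  # "survival wind speed" candidate
v2 = gumbel_return_level(28.0, 3.0, 2.0)    # a typical year's maximum
```

Fitting `mu` and `beta` from measured annual maxima is the "site calibration" step the abstract alludes to.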

  10. Contingency Analysis of Cascading Line Outage Events

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
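The self-sustaining sequence of line outages described above can be caricatured with a threshold cascade model. The loads, capacities, and equal-redistribution rule below are invented (real cascades redistribute flow according to network physics, not equally):

```python
# toy threshold cascade: when a line trips, its load is spread equally
# over the surviving lines; any line pushed past capacity trips in turn
def cascade(loads, capacities, initial_outage):
    tripped = {initial_outage}
    changed = True
    while changed:
        changed = False
        alive = [i for i in range(len(loads)) if i not in tripped]
        shed = sum(loads[i] for i in tripped)
        for i in alive:
            if loads[i] + shed / len(alive) > capacities[i]:
                tripped.add(i)   # next "tier" of the cascade
                changed = True
                break            # re-evaluate with the new outage set
    return tripped

contained = cascade([0.6, 0.7, 0.8, 0.5], [1.0] * 4, initial_outage=2)
blackout = cascade([0.6, 0.7, 0.8, 0.5], [0.9] * 4, initial_outage=2)
```

With generous capacities the initiating outage is absorbed; with tighter margins the same event cascades tier by tier into a full collapse.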

  11. Application of Key Events Dose Response Framework to defining the upper intake level of leucine in young men.

    Pencharz, Paul B; Russell, Robert M

    2012-12-01

    Leucine is sold in large doses in health food stores and is ingested by weight-training athletes. The safety of ingestion of large doses of leucine is unknown. Before designing chronic high-dose leucine supplementation experiments, we decided to determine the effect of graded doses of leucine in healthy participants. The Key Events Dose Response Framework is an organizational and analytical framework that dissects the various biologic steps (key events) that occur between exposure to a substance and an eventual adverse effect. Each biologic event is looked at for its unique dose-response characteristics. For nutrients, there are a number of biologic homeostatic mechanisms that work to keep circulating/tissue levels in a safe, nontoxic range. If a response mechanism at a particular key event is especially vulnerable and easily overwhelmed, this is known as a determining event, because this event drives the overall slope or shape of the dose-response relationship. In this paper, the Key Events Dose Framework has been applied to the problem of leucine toxicity and leucine's tolerable upper level. After analyzing the experimental data vis-à-vis key events for leucine leading to toxicity, it became evident that the rate of leucine oxidation was the determining event. A dose-response study has been conducted to graded intakes of leucine in healthy human adult male volunteers. All participants were started at the mean requirement level of leucine [50 mg/(kg · d)] and the highest leucine intake was 1250 mg/(kg · d), which is 25 times the mean requirement. No gut intolerance was seen. Blood glucose fell progressively but remained within normal values without any changes in plasma insulin. Maximal leucine oxidation levels occurred at an intake of 550 mg leucine/(kg · d), after which plasma leucine progressively increased and plasma ammonia also increased in response to leucine intakes >500 mg/(kg · d). Thus, the "key determining event" appears to be when the

  12. Analysis of Key Factors Driving Japan’s Military Normalization

    2017-09-01

    no change to our policy of not giving in to terrorism.”40 Though the prime minister was democratically supported, Koizumi’s leadership style took...analysis of the key driving factors of Japan’s normalization. The areas of prime ministerial leadership, regional security threats, alliance issues, and...

  13. Prism reactor system design and analysis of postulated unscrammed events

    Van Tuyle, G.J.; Slovik, G.C.

    1991-08-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristic and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. 6 refs., 4 figs

  14. PRISM reactor system design and analysis of postulated unscrammed events

    Van Tuyle, G.J.; Slovik, G.C.

    1991-01-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristic and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. (author)

  15. PRISM reactor system design and analysis of postulated unscrammed events

    Van Tuyle, G.J.; Slovik, G.C.; Rosztoczy, Z.; Lane, J.

    1991-01-01

    Key safety characteristics of the PRISM reactor system include the passive reactor shutdown characteristics and the passive shutdown heat removal system, RVACS. While these characteristics are simple in principle, the physical processes are fairly complex, particularly for the passive reactor shutdown. It has been possible to adapt independent safety analysis codes originally developed for the Clinch River Breeder Reactor review, although some limitations remain. In this paper, the analyses of postulated unscrammed events are discussed, along with limitations in the predictive capabilities and plans to correct the limitations in the near future. 6 refs., 4 figs

  16. Key components of financial-analysis education for clinical nurses.

    Lim, Ji Young; Noh, Wonjung

    2015-09-01

    In this study, we identified key components of financial-analysis education for clinical nurses. We used a literature review, focus group discussions, and a content validity index survey to develop key components of financial-analysis education. First, a wide range of references were reviewed, and 55 financial-analysis education components were gathered. Second, two focus group discussions were performed; the participants were 11 nurses who had worked for more than 3 years in a hospital, and nine components were agreed upon. Third, 12 professionals, including professors, a nurse executive, nurse managers, and an accountant, participated in the content validity index survey. Finally, six key components of financial-analysis education were selected. These key components were as follows: understanding the need for financial analysis, introduction to financial analysis, reading and implementing balance sheets, reading and implementing income statements, understanding the concepts of financial ratios, and interpretation and practice of financial ratio analysis. The results of this study will be used to develop an education program to increase financial-management competency among clinical nurses. © 2015 Wiley Publishing Asia Pty Ltd.
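The last two components, ratio concepts and their interpretation, lend themselves to a worked example. All figures below are invented for illustration:

```python
# toy financial statements for a small clinical unit (invented figures)
balance_sheet = {"current_assets": 120_000, "current_liabilities": 80_000,
                 "total_liabilities": 300_000, "total_equity": 200_000}
income_statement = {"net_income": 45_000, "revenue": 600_000}

# three ratios commonly taught in introductory financial analysis
current_ratio = balance_sheet["current_assets"] / balance_sheet["current_liabilities"]
debt_to_equity = balance_sheet["total_liabilities"] / balance_sheet["total_equity"]
net_margin = income_statement["net_income"] / income_statement["revenue"]
```

Interpretation, the sixth component, is the step beyond arithmetic: a current ratio of 1.5 means short-term obligations are covered 1.5 times over by liquid assets.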

  17. DISRUPTIVE EVENT BIOSPHERE DOSE CONVERSION FACTOR ANALYSIS

    M.A. Wasiolek

    2005-01-01

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The Biosphere Model Report (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis" (Figure 1-1).
The objective of this analysis was to develop the BDCFs for the volcanic

  18. Preliminary safety analysis for key design features of KALIMER

    Hahn, D. H.; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, S. O.; Lee, Y. B.; Jeong, K. S

    2000-07-01

    KAERI is currently developing the conceptual design of a liquid metal reactor, KALIMER (Korea Advanced Liquid Metal Reactor), under the long-term nuclear R and D program. In this report, descriptions of the KALIMER safety design features and safety analyses results for selected ATWS accidents are presented. First, the basic approach to achieve the safety goal is introduced in chapter 1, and the safety evaluation procedure for the KALIMER design is described in chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure design performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. They are categorized as bounding events (BEs) because of their low probability of occurrence. In chapter 4, the design of the KALIMER containment dome and the results of its performance analysis are presented. The designs of the existing LMR containment and the KALIMER containment dome have been compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced to investigate the core kinetics and hydraulic behavior during HCDA in chapter 5. Mathematical formulations have been developed in the framework of the modified Bethe-Tait method, and scoping analyses have been performed for the KALIMER core behavior during super-prompt critical excursions.

  19. Disruptive Event Biosphere Dose Conversion Factor Analysis

    M. A. Wasiolek

    2003-07-21

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  20. Disruptive Event Biosphere Dose Conversion Factor Analysis

    M. A. Wasiolek

    2003-01-01

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during volcanic eruption (eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  1. Human reliability analysis of dependent events

    Swain, A.D.; Guttmann, H.E.

    1977-01-01

    In the human reliability analysis in WASH-1400, the continuous variable of degree of interaction among human events was approximated by selecting four points on this continuum to represent the entire continuum. The four points selected were identified as zero coupling (i.e., zero dependence), complete coupling (i.e., complete dependence), and two intermediate points--loose coupling (a moderate level of dependence) and tight coupling (a high level of dependence). The paper expands the WASH-1400 treatment of common mode failure due to the interaction of human activities. Mathematical expressions for the above four levels of dependence are derived for parallel and series systems. The psychological meaning of each level of dependence is illustrated by examples, with probability tree diagrams to illustrate the use of conditional probabilities resulting from the interaction of human actions in nuclear power plant tasks.
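The dependence levels described can be turned into a small calculator. A sketch: the closed-form conditional probabilities below are the THERP-style dependence expressions as later tabulated in the NUREG/CR-1278 handbook; the abstract itself does not reproduce the formulas, so treating them as the paper's exact expressions is an assumption.

```python
def conditional_hep(basic_hep, dependence):
    """Conditional human error probability for a task, given failure of the
    preceding task, under the four coupling levels named in the abstract.
    Expressions follow the THERP dependence model (assumed, not quoted)."""
    formulas = {
        "zero":     lambda p: p,                  # zero coupling: independence
        "loose":    lambda p: (1 + 6 * p) / 7,    # moderate dependence
        "tight":    lambda p: (1 + p) / 2,        # high dependence
        "complete": lambda p: 1.0,                # complete coupling
    }
    return formulas[dependence](basic_hep)

# Example: a redundant pair of operator actions with basic HEP = 0.01.
p = 0.01
for level in ("zero", "loose", "tight", "complete"):
    print(level, conditional_hep(p, level))
```

Even loose coupling raises the conditional error probability by more than an order of magnitude here, which is why approximating the dependence continuum by discrete points matters for common-mode failure estimates.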

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
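A minimal sketch of the kind of discrete event simulation described, reduced to a single served resource with stochastic demand. This is not the authors' two-part framework; the arrival and service rates are hypothetical stand-ins for the per-request-type demand distributions and resource constraints they model.

```python
import heapq
import random

def simulate(n_jobs=100, arrival_rate=1.0, service_rate=1.2, seed=7):
    """Event-driven simulation of one FIFO server: returns per-job queueing
    waits. Rates and job count are hypothetical."""
    rng = random.Random(seed)
    t = 0.0
    events = []  # min-heap of (time, kind, job_id)
    for j in range(n_jobs):            # pre-schedule Poisson arrivals
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, (t, "arrival", j))
    queue, busy = [], False
    arrival_time, starts, waits = {}, {}, {}
    while events:
        now, kind, j = heapq.heappop(events)
        if kind == "arrival":
            arrival_time[j] = now
            queue.append(j)
        else:                          # departure frees the server
            busy = False
            waits[j] = starts[j] - arrival_time[j]
        if not busy and queue:         # start the next queued job, if any
            nxt = queue.pop(0)
            starts[nxt] = now
            busy = True
            heapq.heappush(events, (now + rng.expovariate(service_rate),
                                    "departure", nxt))
    return waits

waits = simulate()
print(len(waits), sum(waits.values()) / len(waits))
```

Scaling this to many heterogeneous servers with a scheduler is what makes the cloud case resistant to static analysis: waits depend on the interleaving of events, not on closed-form utilization alone.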

  3. Tipping the Balance: Hepatotoxicity and the Four Apical Key Events of Hepatic Steatosis

    Adverse outcome pathways (AOPs) are descriptive biological sequences that start from a molecular initiating event (MIE) and end with an adverse health outcome. AOPs provide biological context for high throughput chemical testing and further prioritize environmental health risk r...

  4. Technology and economic impacts of mega-sports events: A key issue? Exploratory insights from literature

    Chanaron Jean Jacques

    2014-01-01

    Mega-sport events such as the Olympic Games or the Football World Cup are always presented as providing the hosting nation and/or city with huge benefits. Supporters of such events cite economic, social, and cultural impacts for the region, as well as contributions to scientific and technological progress and innovation. Obviously, they need to politically justify the impressive and growing financial investment required to organize the Olympic Games or the World Cup. The article reviews a fairly abundant academic literature with the objective of defining the various potential impacts and the methods used for their assessment. It concludes that there is no universal and scientifically valid model for evaluating the benefits of mega-sport events, and that organizers should be very cautious when arguing in favor of a decision to host such events.

  5. Key steps in the strategic analysis of a dental practice.

    Armstrong, J L; Boardman, A E; Vining, A R

    1999-01-01

    As dentistry is becoming increasingly competitive, dentists must focus more on strategic analysis. This paper lays out seven initial steps that are the foundation of strategic analysis. It introduces and describes the use of service-customer matrices and location-proximity maps as tools in competitive positioning. The paper also contains a brief overview of the role of differentiation and cost-control in determining key success factors for dental practices.

  6. Disruptive Event Biosphere Dose Conversion Factor Analysis

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis''. The objective of this

  8. Disruptive Event Biosphere Dose Conversion Factor Analysis

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by the eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, as is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  9. Analysis of external events - Nuclear Power Plant Dukovany

    Hladky, Milan

    2000-01-01

    The level 1 PSA of external events covers internal events, floods, and fires; other external events are not yet included. The shutdown PSA takes into account internal events, floods, fires, and heavy load drop; other external events are not yet included. A final safety analysis report was conducted after 10 years of operation for all Dukovany operational units. A probabilistic approach was used for the analysis of aircraft crash and external man-induced events. The risk caused by man-induced events was found to be negligible and was accepted by the State Office for Nuclear Safety (SONS)

  10. Transcriptome and metabolome of synthetic Solanum autotetraploids reveal key genomic stress events following polyploidization.

    Fasano, Carlo; Diretto, Gianfranco; Aversano, Riccardo; D'Agostino, Nunzio; Di Matteo, Antonio; Frusciante, Luigi; Giuliano, Giovanni; Carputo, Domenico

    2016-06-01

    Polyploids are generally classified as autopolyploids, derived from a single species, and allopolyploids, arising from interspecific hybridization. The former represent ideal materials with which to study the consequences of genome doubling and ascertain whether there are molecular and functional rules operating following polyploidization events. To investigate whether the effects of autopolyploidization are common to different species, or if species-specific or stochastic events are prevalent, we performed a comprehensive transcriptomic and metabolomic characterization of diploids and autotetraploids of Solanum commersonii and Solanum bulbocastanum. Autopolyploidization remodelled the transcriptome and the metabolome of both species. In S. commersonii, differentially expressed genes (DEGs) were highly enriched in pericentromeric regions. Most changes were stochastic, suggesting a strong genotypic response. However, a set of robustly regulated transcripts and metabolites was also detected, including purine bases and nucleosides, which are likely to underlie a common response to polyploidization. We hypothesize that autopolyploidization results in nucleotide pool imbalance, which in turn triggers a genomic shock responsible for the stochastic events observed. The more extensive genomic stress and the higher number of stochastic events observed in S. commersonii with respect to S. bulbocastanum could be the result of the higher nucleoside depletion observed in this species. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.

  11. Event shape analysis in ultrarelativistic nuclear collisions

    Kopecna, Renata; Tomasik, Boris

    2016-01-01

    We present a novel method for sorting events. So far, single variables like the flow vector magnitude have been used for sorting events. Our approach takes into account the whole azimuthal angle distribution rather than a single variable. This method allows us to determine a good measure of the event shape, providing a multiplicity-independent insight. We discuss the advantages and disadvantages of this approach, its possible usage in femtoscopy, and other more exclusive experimental studies.

  12. Economic Multipliers and Mega-Event Analysis

    Victor Matheson

    2004-01-01

    Critics of economic impact studies that purport to show that mega-events such as the Olympics bring large benefits to the communities “lucky” enough to host them frequently cite the use of inappropriate multipliers as a primary reason why these impact studies overstate the true economic gains to the hosts of these events. This brief paper shows in a numerical example how mega-events may lead to inflated multipliers and exaggerated claims of economic benefits.
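The arithmetic at issue can be sketched with the simple Keynesian multiplier k = 1/(1 - MPC), where MPC is the share of each dollar respent within the host region. The figures below are hypothetical illustrations, not the paper's own numerical example.

```python
def multiplier(mpc_local):
    """Keynesian spending multiplier for a given local marginal propensity
    to consume (share of each dollar respent inside the region)."""
    return 1.0 / (1.0 - mpc_local)

direct_spending = 100e6  # $100M of direct mega-event spending (hypothetical)

# Treating all respending as local inflates the multiplier:
naive = direct_spending * multiplier(0.5)      # k = 2.0
# Mega-event spending leaks out of the region (national hotel chains,
# imported labor), so the appropriate local MPC is lower:
adjusted = direct_spending * multiplier(0.2)   # k = 1.25

print(naive, adjusted)
```

With these hypothetical numbers, the naive calculation claims $200M of impact against $125M under the leakage-adjusted multiplier, which is the mechanism behind the exaggerated claims the paper criticizes.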

  13. Root cause analysis of relevant events

    Perez, Silvia S.; Vidal, Patricia G.

    2000-01-01

    During 1998 the research work followed more specific guidelines, which entailed focusing exclusively on the two selected methods (ASSET and HPIP) and incorporating some additional human behaviour elements based on the documents of reference. Once resident inspectors were incorporated in the project (and trained accordingly), events occurring in Argentine nuclear power plants were analysed. Several events were analysed (all of them from the Atucha I and Embalse nuclear power plants), leading to the conclusion that the systematic methodology used also allows investigation of minor events that were precursors of the selected events. (author)

  14. The analysis of the initiating events in thorium-based molten salt reactor

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    The analysis and evaluation of initiating events is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of nuclear safety analysis. Currently, initiating event analysis methods and experience focus on water reactors; there are no methods or theories for the thorium-based molten salt reactor (TMSR). With TMSR research and development underway in China, initiating event analysis and evaluation is increasingly important. The research can build on PWR analysis theories and methods. Based on the TMSR design, the theories and methods of its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of generation II and III PWRs, the high-temperature gas-cooled reactor, and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are then discussed and developed through logical analysis. This preliminary study of TMSR initiating events is important for clarifying the rules of initiating event analysis and useful for TMSR design and nuclear safety analysis. (authors)

  15. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    Brinkmann, Markus; Eichbaum, Kathrin [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Kammann, Ulrike [Thünen-Institute of Fisheries Ecology, Palmaille 9, 22767 Hamburg (Germany); Hudjetz, Sebastian [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Cofalla, Catrina [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Buchinger, Sebastian; Reifferscheid, Georg [Federal Institute of Hydrology (BFG), Department G3: Biochemistry, Ecotoxicology, Am Mainzer Tor 1, 56068 Koblenz (Germany); Schüttrumpf, Holger [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Preuss, Thomas [Department of Environmental Biology and Chemodynamics, Institute for Environmental Research,ABBt- Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); and others

    2014-07-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with a constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted well both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites, and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and necessity of investigating uptake of contaminants from suspended sediments under realistic exposure scenarios.

  17. ELIMINATION OF THE DISADVANTAGES OF SCHEDULING-NETWORK PLANNING BY APPLYING THE MATRIX OF KEY PROJECT EVENTS

    Morozenko Andrey Aleksandrovich

    2017-07-01

    The article discusses the current disadvantages of scheduling-network planning in managing the schedule of an investment-construction project. Problems associated with constructing the schedule and determining the duration of the construction project are studied. The problems project management poses for the management apparatus are shown, which consist in the absence of mechanisms for prompt response to deviations in the parameters of the scheduling-network diagram. A new approach to planning the implementation of an investment-construction project is proposed, based on a matrix of key events and a rejection of the current practice of determining durations from unreliable normative data. An algorithm for determining the key events of the project is presented. To increase the reliability of the organizational structure, a load factor for the functional block in the process of achieving a key event is proposed. Recommendations for improving the interaction of the participants in the investment-construction project are given.

  18. Analysis for Human-related Events during the Overhaul

    Kim, Ji Tae; Kim, Min Chull; Choi, Dong Won; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-10-15

    The frequency of events due to human error has been decreasing since 2008 among the 20 operating nuclear power plants (NPPs), excluding the NPP in the commissioning stage (Shin-Kori unit 1). However, events due to human error during overhaul (O/H) occur annually (see Table I). An analysis of human-related events during the O/H was performed. From the analysis, similar problems were identified for each event, and organizational and safety-culture factors were also identified

  19. Multiscale models and stochastic simulation methods for computing rare but key binding events in cell biology

    Guerrier, C. [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Holcman, D., E-mail: david.holcman@ens.fr [Applied Mathematics and Computational Biology, IBENS, Ecole Normale Supérieure, 46 rue d' Ulm, 75005 Paris (France); Mathematical Institute, Oxford OX2 6GG, Newton Institute (United Kingdom)

    2017-07-01

    The main difficulty in simulating diffusion processes at a molecular level in cell microdomains is due to the multiple scales involved, from nano- to micrometers. Few to many particles have to be simulated and simultaneously tracked while they are exploring a large portion of the space for binding small targets, such as buffers or active sites. Bridging the small and large spatial scales is achieved by rare events representing Brownian particles finding small targets, characterized by long-time distributions. These rare events are the bottleneck of numerical simulations. A naive stochastic simulation requires running many Brownian particles together, which is computationally greedy and inefficient. Solving the associated partial differential equations is also difficult due to the time-dependent boundary conditions, narrow passages and mixed boundary conditions at small windows. We present here two reduced modeling approaches for a fast computation of diffusing fluxes in microdomains. The first approach is based on Markov mass-action law equations coupled to a Markov chain. The second is a Gillespie method based on the narrow escape theory for coarse-graining the geometry of the domain into Poissonian rates. The main application concerns diffusion in cellular biology, where we compute as an example the distribution of arrival times of calcium ions to small hidden targets to trigger vesicular release.
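The Gillespie approach mentioned can be illustrated with a generic stochastic simulation of a single binding reaction. This is a textbook SSA sketch, not the authors' narrow-escape coarse-graining; the rate constant, copy numbers, and time horizon are hypothetical.

```python
import random

def gillespie_binding(nA=50, nB=50, k_bind=0.01, t_end=100.0, seed=1):
    """Gillespie simulation of the irreversible reaction A + B -> AB.
    Each event fires after an exponential waiting time whose rate is the
    current propensity (the Poissonian-rate idea in the abstract)."""
    rng = random.Random(seed)
    t, bound = 0.0, 0
    while t < t_end and nA > 0:
        a = k_bind * nA * nB        # total propensity of the one reaction
        t += rng.expovariate(a)     # exponential time to the next event
        if t >= t_end:
            break
        nA -= 1
        nB -= 1
        bound += 1
    return bound

print(gillespie_binding())
```

The point of such coarse-graining is that one exponential draw per binding event replaces tracking a Brownian particle until it finds a small target, which is exactly the rare-event bottleneck described above.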

  1. Glaciological parameters of disruptive event analysis

    Bull, C.

    1979-01-01

    The following disruptive events caused by ice sheets are considered: continental glaciation, erosion, loading and subsidence, deep ground water recharge, flood erosion, isostatic rebound rates, melting, and periodicity of ice ages

  2. A Fourier analysis of extreme events

    Mikosch, Thomas Valentin; Zhao, Yuwei

    2014-01-01

    The extremogram is an asymptotic correlogram for extreme events constructed from a regularly varying stationary sequence. In this paper, we define a frequency domain analog of the correlogram: a periodogram generated from a suitable sequence of indicator functions of rare events. We derive basic properties of the periodogram, such as the asymptotic independence at the Fourier frequencies, and use this property to show that weighted versions of the periodogram are consistent estimators of a spectral density derived from the extremogram.
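A minimal sketch of the construction described: a periodogram computed from the indicator sequence of threshold exceedances. The series, threshold, and plain (unweighted) periodogram are hypothetical choices for illustration; this is not the paper's weighted estimator.

```python
import cmath
import random

def extremal_periodogram(x, threshold):
    """Periodogram of the centered indicator sequence 1{x_t > threshold},
    evaluated at the Fourier frequencies 2*pi*k/n, k = 1..n//2."""
    ind = [1.0 if v > threshold else 0.0 for v in x]
    n = len(ind)
    mean = sum(ind) / n
    centered = [v - mean for v in ind]
    pgram = []
    for k in range(1, n // 2 + 1):
        s = sum(c * cmath.exp(-2j * cmath.pi * k * t / n)
                for t, c in enumerate(centered))
        pgram.append(abs(s) ** 2 / n)
    return pgram

random.seed(3)
x = [random.gauss(0, 1) for _ in range(64)]
p = extremal_periodogram(x, threshold=2.0)  # exceedances of the 2-sigma level
print(len(p))
```

For an i.i.d. series the exceedance indicators carry no serial dependence, so the periodogram should look flat; peaks would indicate periodic clustering of extremes, the feature the extremogram's spectral density captures.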

  3. Automatic measurement of key ski jumping phases and temporal events with a wearable system.

    Chardonnens, Julien; Favre, Julien; Le Callennec, Benoit; Cuendet, Florian; Gremion, Gérald; Aminian, Kamiar

    2012-01-01

    We propose a new method, based on inertial sensors, to automatically measure at high frequency the durations of the main phases of ski jumping (i.e. take-off release, take-off, and early flight). The kinematics of the ski jumping movement were recorded by four inertial sensors, attached to the thighs and shanks of junior athletes, for 40 jumps performed in indoor conditions and 36 jumps in field conditions. An algorithm was designed to detect temporal events from the recorded signals and to estimate the duration of each phase. These durations were evaluated against a reference camera-based motion capture system and by trainers conducting video observations. The precision of the take-off release and take-off durations was evaluated; the jumping technique did not influence the errors of take-off release and take-off. Therefore, the proposed system can provide valuable information for performance evaluation of ski jumpers during training sessions.
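The abstract does not give the actual detection rules, so the following is only a toy illustration of the general idea: detecting temporal events as threshold crossings in a sampled inertial signal and deriving a phase duration from consecutive events. The signal, sampling rate, and threshold are all hypothetical.

```python
def crossing_times(signal, fs, thresh):
    """Times (s) at which the sampled signal rises through `thresh`.
    A stand-in for the paper's unspecified event-detection rules."""
    return [i / fs for i in range(1, len(signal))
            if signal[i - 1] < thresh <= signal[i]]

# Synthetic 'shank angular rate' trace sampled at 500 Hz (hypothetical):
fs = 500
sig = [0.0] * 200 + [2.0] * 100 + [0.0] * 50 + [5.0] * 150
events = crossing_times(sig, fs, thresh=1.0)
# Duration between the first two detected events, e.g. one movement phase:
duration = events[1] - events[0]
print(events, duration)
```

Validation then amounts to comparing such durations against a reference motion capture system, as the study does.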

  4. Recovery of the coral Montastrea annularis in the Florida Keys after the 1987 Caribbean ``bleaching event''

    Fitt, William K.; Spero, Howard J.; Halas, John; White, Michael W.; Porter, James W.

    1993-07-01

    Many reef-building corals and other cnidarians lost photosynthetic pigments and symbiotic algae (zooxanthellae) during the coral bleaching event in the Caribbean in 1987. The Florida Reef Tract included some of the first documented cases, with widespread bleaching of the massive coral Montastrea annularis beginning in late August. Phototransects at Carysfort Reef showed discoloration of >90% of colonies of this species in March 1988 compared to 0% in July 1986; however, no mortality was observed between 1986 and 1988. Samples of corals collected in February and June 1988 had zooxanthellae densities ranging from 0.1x10^6 in the most lightly colored corals to 1.6x10^6 cells/cm^2 in the darker corals. Minimum densities increased to 0.5x10^6 cells/cm^2 by August 1989. Chlorophyll-a content of zooxanthellae and zooxanthellar mitotic indices were significantly higher in corals with lower densities of zooxanthellae, suggesting that zooxanthellae at low densities may be more nutrient-sufficient than those in unbleached corals. Ash-free dry weight of coral tissue was positively correlated with zooxanthellae density at all sample times and was significantly lower in June 1988 compared to August 1989. Proteins and lipids per cm^2 were significantly higher in August 1989 than in February or June 1988. Although recovery of zooxanthellae density and coral pigmentation to normal levels may occur in less than one year, regrowth of tissue biomass and energy stores lost during the period of low symbiont densities may take significantly longer.

  5. An Analysis of Key Factors in Developing a Smart City

    Aidana Šiurytė

    2016-06-01

    The concept of the Smart City is widely used, but it is perceived in different ways. A literature review reveals the key elements of the Smart City: information and communication technologies and smart citizens. Nevertheless, raising public awareness is not a priority of the local municipalities that are trying to develop their cities. A focus group discussion aims to analyse citizens' insights regarding the Smart City and their contribution to its creation. A case study of Vilnius examines the position of the municipality in developing the city as smart. The study contains suggestions for improving communication in the city. Methods employed: comparative literature analysis, focus group investigation, case study.

  6. A Survey of Key Technology of Network Public Opinion Analysis

    Li Su Ying

    2016-01-01

    The internet has become an important venue for users to make comments because of its interactivity and fast dissemination. The outbreak of internet public opinion has become a major risk for network information security. Domestic and foreign researchers have carried out extensive and in-depth studies on public opinion. Fruitful results have been achieved in basic theory research, emergency handling, and other aspects of public opinion. But research on public opinion in China is still at an initial stage, and the key technology of public opinion analysis remains a starting point for in-depth study and discussion.

  7. External events analysis of the Ignalina Nuclear Power Plant

    Liaukonis, Mindaugas; Augutis, Juozas

    1999-01-01

    This paper presents an analysis of the impact of external events on the safe operation of the Ignalina Nuclear Power Plant (INPP) safety systems. The analysis was based on probabilistic estimation and modelling of the external hazards. Screening criteria were applied to a number of external hazards. External events requiring further bounding study, such as an aircraft crash on the INPP, external flooding, fire, and extreme winds, were analysed. Mathematical models were developed and event probabilities were calculated. The external events analysis showed a rather limited danger from external events to the Ignalina NPP. The results of the analysis were compared to analogous analyses of western NPPs and no great differences were identified. The calculations performed show that external events cannot significantly influence the safety level of Ignalina NPP operation. (author)

  8. Statistical analysis of hydrodynamic cavitation events

    Gimenez, G.; Sommer, R.

    1980-10-01

    The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency is distributed according to a normal law, its parameters not being time-evolving.
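
    As a rough illustration of the statistic involved (a sketch, not the authors' actual procedure), pulse timestamps can be binned into equal windows to estimate the mean and spread of the per-window event frequency, i.e. the parameters a normal-law fit would use:

```python
import statistics

def pulse_rate_stats(timestamps, window):
    """Bin cavitation-pulse timestamps (seconds) into fixed windows
    and return the mean and standard deviation of per-window counts."""
    n_windows = int(max(timestamps) // window) + 1
    counts = [0] * n_windows
    for t in timestamps:
        counts[int(t // window)] += 1
    return statistics.mean(counts), statistics.stdev(counts)
```

    Checking that these parameters are stable across successive recording intervals would mirror the finding that the distribution's parameters are not time-evolving.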

  9. Research on Visual Analysis Methods of Terrorism Events

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    Given that terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' governance ability. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse the spatio-temporal distribution characteristics of events, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Six types of visual analysis are then proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and their results demonstrate the validity of the methods.
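
    A "5W" event table of this kind can be sketched as a simple record type; the field names and types below are assumptions for illustration, not the paper's actual schema:

```python
from dataclasses import dataclass

@dataclass
class EventRecord:
    """Illustrative "5W" event-table row (hypothetical schema)."""
    who: str      # perpetrating group or actor
    what: str     # event/attack type
    when: str     # date, e.g. "2014-06-01"
    where: tuple  # (latitude, longitude)
    why: str      # stated motive or target category
```

    Rows in this shape map directly onto the thematic-map views (via `where`) and statistical charts (via `what`/`when` aggregation) the paper describes.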

  10. Risk and sensitivity analysis in relation to external events

    Alzbutas, R.; Urbonas, R.; Augutis, J.

    2001-01-01

    This paper presents a risk and sensitivity analysis of the impact of external events on safe operation in general, and on the Ignalina Nuclear Power Plant safety systems in particular. The analysis is based on deterministic and probabilistic assumptions and assessment of the external hazards. Real statistical data are used, as well as initial external event simulation, and preliminary screening criteria are applied. The analysis of external event impact on safe NPP operation, assessment of event occurrence, sensitivity analysis, and recommendations for safety improvements are performed for the investigated external hazards. Events such as an aircraft crash, extreme rains and winds, forest fire, and flying parts of the turbine are analysed; models are developed and probabilities are calculated. As an example of sensitivity analysis, the model of aircraft impact is presented. The sensitivity analysis takes into account the uncertainty introduced by an external event and its model. Even when the external events analysis shows rather limited danger, the sensitivity analysis can determine the causes with the highest influence; such possible future variations can be significant for safety-level and risk-based decisions. Calculations show that external events cannot significantly influence the safety level of Ignalina NPP operation; however, the occurrence and propagation of the events can be substantially uncertain. (author)

  11. Probabilistic analysis of external events with focus on the Fukushima event

    Kollasko, Heiko; Jockenhoevel-Barttfeld, Mariana; Klapp, Ulrich

    2014-01-01

    External hazards are those natural or man-made hazards to a site and its facilities that originate externally to both the site and its processes, i.e. the duty holder may have very little or no control over the hazard. External hazards can have the potential to cause initiating events at the plant, typically transients such as loss of offsite power. Simultaneously, external events may affect the safety systems required to control the initiating event and, where applicable, also the back-up systems implemented for risk reduction. Plant safety may especially be threatened when loads from external hazards exceed the load assumptions considered in the design of safety-related systems, structures and components. Another potential threat is posed by hazards inducing initiating events not otherwise considered in the safety demonstration. An example is loss of offsite power combined with prolonged plant isolation: offsite support, e.g. delivery of diesel fuel oil, usually credited in the deterministic safety analysis, may not be possible in this case. As the Fukushima events have shown, the biggest threat is likely posed by hazards inducing both effects. Such hazards may well be dominant risk contributors even if their return period is very high. In order to identify relevant external hazards for a given Nuclear Power Plant (NPP) location, a site-specific screening analysis is performed, both for single events and for combinations of external events. As a result of the screening analysis, risk-significant and therefore relevant (screened-in) single external events and combinations of them are identified for a site. The screened-in events are further considered in a detailed event tree analysis in the frame of the Probabilistic Safety Analysis (PSA) to calculate the core damage/large release frequency resulting from each relevant external event or from each relevant combination. 
Screening analyses of external events performed at AREVA are based on the approach provided

  12. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
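
    To give a flavour of finite-size parameter estimation (a generic Hoeffding bound for Bernoulli sampling, not the paper's exact analysis), the error rate observed on a finite sample can be turned into an upper bound on the true error probability that fails with probability at most eps:

```python
import math

def error_rate_upper_bound(n_sampled, n_errors, eps):
    """Upper-bound the true error probability from a Bernoulli sample
    of n_sampled bits with n_errors observed errors; the Hoeffding
    deviation term fails with probability at most eps."""
    observed = n_errors / n_sampled
    deviation = math.sqrt(math.log(1.0 / eps) / (2.0 * n_sampled))
    return min(1.0, observed + deviation)
```

    The deviation term shrinks as 1/sqrt(n_sampled), which is why the finite-key rate approaches the asymptotic rate only for large data sizes.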

  13. Implementing recovery: an analysis of the key technologies in Scotland

    2011-01-01

    Background Over the past ten years the promotion of recovery has become a stated aim of mental health policies within a number of English speaking countries, including Scotland. Implementation of a recovery approach involves a significant reorientation of mental health services and practices, which often poses significant challenges for reformers. This article examines how four key technologies of recovery have assisted in the move towards the creation of a recovery-oriented mental health system in Scotland. Methods Drawing on documentary analysis and a series of interviews we examine the construction and implementation of four key recovery 'technologies' as they have been put to use in Scotland: recovery narratives, the Scottish Recovery Indicator (SRI), Wellness Recovery Action Planning (WRAP) and peer support. Results Our findings illuminate how each of these technologies works to instantiate, exemplify and disseminate a 'recovery orientation' at different sites within the mental health system in order to bring about a 'recovery oriented' mental health system. They also enable us to identify some of the factors that facilitate or hinder the effectiveness of those technologies in bringing about a change in how mental health services are delivered in Scotland. These findings provide a basis for some general reflections on the utility of 'recovery technologies' to implement a shift towards recovery in mental health services in Scotland and elsewhere. Conclusions Our analysis of this process within the Scottish context will be valuable for policy makers and service coordinators wishing to implement recovery values within their own national mental health systems. PMID:21569633

  14. Implementing recovery: an analysis of the key technologies in Scotland

    Sturdy Steve

    2011-05-01

    Full Text Available Abstract Background Over the past ten years the promotion of recovery has become a stated aim of mental health policies within a number of English speaking countries, including Scotland. Implementation of a recovery approach involves a significant reorientation of mental health services and practices, which often poses significant challenges for reformers. This article examines how four key technologies of recovery have assisted in the move towards the creation of a recovery-oriented mental health system in Scotland. Methods Drawing on documentary analysis and a series of interviews we examine the construction and implementation of four key recovery 'technologies' as they have been put to use in Scotland: recovery narratives, the Scottish Recovery Indicator (SRI), Wellness Recovery Action Planning (WRAP) and peer support. Results Our findings illuminate how each of these technologies works to instantiate, exemplify and disseminate a 'recovery orientation' at different sites within the mental health system in order to bring about a 'recovery oriented' mental health system. They also enable us to identify some of the factors that facilitate or hinder the effectiveness of those technologies in bringing about a change in how mental health services are delivered in Scotland. These findings provide a basis for some general reflections on the utility of 'recovery technologies' to implement a shift towards recovery in mental health services in Scotland and elsewhere. Conclusions Our analysis of this process within the Scottish context will be valuable for policy makers and service coordinators wishing to implement recovery values within their own national mental health systems.

  15. Satellite-Observed Black Water Events off Southwest Florida: Implications for Coral Reef Health in the Florida Keys National Marine Sanctuary

    Brian Lapointe

    2013-01-01

    Full Text Available A “black water” event, as observed from satellites, occurred off southwest Florida in 2012. Satellite observations suggested that the event started in early January and ended in mid-April 2012. The black water patch formed off central west Florida and advected southward towards Florida Bay and the Florida Keys with the shelf circulation, which was confirmed by satellite-tracked surface drifter trajectories. Compared with a previous black water event in 2002, the 2012 event was weaker in terms of spatial and temporal coverage. An in situ survey indicated that the 2012 black water patch contained toxic K. brevis and had relatively low CDOM (colored dissolved organic matter and turbidity but high chlorophyll-a concentrations, while salinity was somewhat high compared with historical values. Further analysis revealed that the 2012 black water was formed by the K. brevis bloom initiated off central west Florida in late September 2011, while river runoff, Trichodesmium and possibly submarine groundwater discharge also played important roles in its formation. Black water patches can affect benthic coral reef communities by decreasing light availability at the bottom, and enhanced nutrient concentrations from black water patches support massive macroalgae growth that can overgrow coral reefs. It is thus important to continue the integrated observations where satellites provide synoptic and repeated observations of such adverse water quality events.

  16. Key terms for the assessment of the safety of vaccines in pregnancy: Results of a global consultative process to initiate harmonization of adverse event definitions.

    Munoz, Flor M; Eckert, Linda O; Katz, Mark A; Lambach, Philipp; Ortiz, Justin R; Bauwens, Jorgen; Bonhoeffer, Jan

    2015-11-25

    The variability of terms and definitions of Adverse Events Following Immunization (AEFI) represents a missed opportunity for optimal monitoring of safety of immunization in pregnancy. In 2014, the Brighton Collaboration Foundation and the World Health Organization (WHO) collaborated to address this gap. Two Brighton Collaboration interdisciplinary taskforces were formed. A landscape analysis included: (1) a systematic literature review of adverse event definitions used in vaccine studies during pregnancy; (2) a worldwide stakeholder survey of available terms and definitions; and (3) a series of taskforce meetings. Based on available evidence, the taskforces proposed key terms and concept definitions to be refined, prioritized, and endorsed by a global expert consultation convened by WHO in Geneva, Switzerland in July 2014. Using pre-specified criteria, 45 maternal and 62 fetal/neonatal events were prioritized, and key terms and concept definitions were endorsed. In addition, recommendations to further improve safety monitoring of immunization in pregnancy programs were specified. This includes elaboration of disease concepts into standardized case definitions with sufficient applicability and positive predictive value to be of use for monitoring the safety of immunization in pregnancy globally, as well as the development of guidance, tools, and datasets in support of a globally concerted approach. There is a need to improve the safety monitoring of immunization in pregnancy programs. A consensus list of terms and concept definitions of key events for monitoring immunization in pregnancy is available. Immediate actions to further strengthen monitoring of immunization in pregnancy programs are identified and recommended. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. ANALYSIS OF EVENT TOURISM IN RUSSIA, ITS FUNCTIONS, WAYS TO IMPROVE THE EFFICIENCY OF EVENT

    Mikhail Yur'evich Grushin

    2016-01-01

    Full Text Available This article considers one of the important directions of development of the national economy in the area of tourist services: the development of event tourism in the Russian Federation. Today the event management market in Russia is still in the process of formation; consequently its impact on the socio-economic development of the regions and of Russia as a whole is minimal, and analysis of that influence is not performed. This problem comes to the fore in the regions of Russia specializing in the creation of event-oriented tourist-recreational clusters. The article analyses the existing event management market and the functions of event tourism, and provides ways to improve the efficiency of event management together with recommendations for event organizers in the regions. The article shows the specific role of event tourism in national tourism and provides direction for the development of organizational and methodical recommendations on its formation in the regions of Russia and the creation of an effective management system at the regional level. The purpose of this article is to analyze the emerging event tourism market in Russia and its specifics. On the basis of these studies, the formation patterns of the new market are considered and its impact on the modern national tourism industry is assessed. Methodology. This article uses comparative and economic-statistical analysis methods. Conclusions/significance. The practical importance of this article lies in eliminating a contradiction existing in the national tourism industry: on the one hand, a large number of events, including world-class ones, are held annually in all regions of the Russian Federation, and people speak of tourist trips to these events; on the other hand, event tourism as such does not yet exist. All regions have internal and inbound tourism, but it has nothing to do with event tourism. The article's practical conclusions demonstrate the need to adapt the

  18. Comparative analysis as a basic research orientation: Key methodological problems

    N P Narbut

    2015-12-01

    Full Text Available To date, the Sociological Laboratory of the Peoples’ Friendship University of Russia has accumulated vast experience in the field of cross-cultural studies, reflected in the publications based on the results of mass surveys conducted in Moscow, Maikop, Beijing, Guangzhou, Prague, Belgrade, and Pristina. However, these publications mainly focus on comparisons of the empirical data rather than on methodological and technical issues; that is why the aim of this article is to identify key problems of comparative analysis in cross-cultural studies that become evident only if you conduct empirical research yourself, from the first step of setting the problem and approving it by all the sides (countries) involved to the last step of interpreting and comparing the data obtained. The authors are sure that no sociologist would ever doubt the necessity and importance of comparative analysis in the broadest sense of the word, but at the same time very few are ready to discuss its key methodological challenges and prefer to ignore them completely. We summarize the problems of comparative analysis in sociology as follows: (1) applying research techniques to the sample in another country, both in translating and in adapting them to different social realities and worldviews (in particular, the problematic status of standardization and the qualitative approach); (2) choosing the "right" respondents to question and relevant cases (cultures) to study; (3) designing the research scheme, i.e. justifying the sequence of steps (what should go first: methodology or techniques?); (4) accepting the procedures that are correct within one country for cross-cultural work (whether or not that is an appropriate choice).

  19. Second-order analysis of semiparametric recurrent event processes.

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.
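
    A minimal sketch of one second-order diagnostic in this spirit (a generic variance-to-mean dispersion check, not the authors' proposed procedure): for a Poisson process, event counts in equal windows have variance approximately equal to their mean, so their ratio flags departures from the Poisson assumption:

```python
import statistics

def dispersion_index(counts):
    """Variance-to-mean ratio of event counts in equal time windows:
    ~1 for a Poisson process, >1 suggests clustering (overdispersion),
    <1 suggests regularity."""
    return statistics.variance(counts) / statistics.mean(counts)
```
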

  20. Analysis of catchments response to severe drought event for ...

    Nafiisah

    The run sum analysis method was a sound method which indicates in ... intensity and duration of stream flow depletion between nearby catchments. ... threshold level analysis method, and allows drought events to be described in more.

  1. The Run 2 ATLAS Analysis Event Data Model

    SNYDER, S; The ATLAS collaboration; NOWAK, M; EIFERT, T; BUCKLEY, A; ELSING, M; GILLBERG, D; MOYSE, E; KOENEKE, K; KRASZNAHORKAY, A

    2014-01-01

    During the LHC's first Long Shutdown (LS1) ATLAS set out to establish a new analysis model, based on the experience gained during Run 1. A key component of this is a new Event Data Model (EDM), called the xAOD. This format, which is now in production, provides the following features: A separation of the EDM into interface classes that the user code directly interacts with, and data storage classes that hold the payload data. The user sees an Array of Structs (AoS) interface, while the data is stored in a Struct of Arrays (SoA) format in memory, thus making it possible to efficiently auto-vectorise reconstruction code. A simple way of augmenting and reducing the information saved for different data objects. This makes it possible to easily decorate objects with new properties during data analysis, and to remove properties that the analysis does not need. A persistent file format that can be explored directly with ROOT, either with or without loading any additional libraries. This allows fast interactive naviga...
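
    The AoS-interface-over-SoA-storage idea, together with decoration, can be sketched in a few lines of Python; this is an illustrative toy, not the actual xAOD implementation (which is C++):

```python
class SoAContainer:
    """Toy xAOD-style container: payload stored as a struct of arrays,
    exposed to the user through array-of-structs proxy objects."""

    def __init__(self, **arrays):
        self._arrays = dict(arrays)

    def decorate(self, name, values):
        # Augment every element with a new property (a "decoration").
        self._arrays[name] = list(values)

    def __len__(self):
        return len(next(iter(self._arrays.values())))

    def __getitem__(self, index):
        return _ElementProxy(self._arrays, index)


class _ElementProxy:
    """Object-like view over one index of the column-wise payload."""

    def __init__(self, arrays, index):
        self._arrays = arrays
        self._index = index

    def __getattr__(self, name):
        return self._arrays[name][self._index]


jets = SoAContainer(pt=[50.0, 30.0], eta=[0.1, -1.2])
jets.decorate("btag", [0.9, 0.1])   # add a property after the fact
```

    The user code sees objects (`jets[0].pt`), while each property lives in a contiguous array, the layout that enables auto-vectorisation and easy augmentation or reduction of the stored information.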

  2. Preliminary safety analysis of unscrammed events for KLFR

    Kim, S.J.; Ha, G.S.

    2005-01-01

    The report presents the design features of the KLFR, the safety analysis code, steady-state calculation results, and analysis results for unscrammed events. Calculations of the steady state and of unscrammed events have been performed for the conceptual design of the KLFR using the SSC-K code. The UTOP event results in no fuel damage and no centre-line melting. The inherent safety features are demonstrated through the analysis of the ULOHS event. Although the analysis of ULOF carries large uncertainties in the pump design, the results show inherently safe characteristics: in the ULOF case, natural circulation of about 6% of rated flow is established. In the metallic fuel rod, the cladding temperature is somewhat high due to the low heat transfer coefficient of lead. The ULOHS event should be considered in the design of the RVACS for long-term cooling

  3. Social Network Analysis Identifies Key Participants in Conservation Development.

    Farr, Cooper M; Reed, Sarah E; Pejchar, Liba

    2018-05-01

    Understanding patterns of participation in private lands conservation, which is often implemented voluntarily by individual citizens and private organizations, could improve its effectiveness at combating biodiversity loss. We used social network analysis (SNA) to examine participation in conservation development (CD), a private land conservation strategy that clusters houses in a small portion of a property while preserving the remaining land as protected open space. Using data from public records for six counties in Colorado, USA, we compared CD participation patterns among counties and identified actors that most often work with others to implement CDs. We found that social network characteristics differed among counties. The network density, or proportion of connections in the network, varied from fewer than 2% to nearly 15%, and was higher in counties with smaller populations and fewer CDs. Centralization, or the degree to which connections are held disproportionately by a few key actors, was not correlated strongly with any county characteristics. Network characteristics were not correlated with the prevalence of wildlife-friendly design features in CDs. The most highly connected actors were biological and geological consultants, surveyors, and engineers. Our work demonstrates a new application of SNA to land-use planning, in which CD network patterns are examined and key actors are identified. For better conservation outcomes of CD, we recommend using network patterns to guide strategies for outreach and information dissemination, and engaging with highly connected actor types to encourage widespread adoption of best practices for CD design and stewardship.
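
    The two network measures discussed, density and degree centralization, have simple closed forms. A sketch using standard SNA formulas (Freeman degree centralization), assuming an undirected network given as an edge list over nodes 0..n-1:

```python
def density(n_nodes, edges):
    """Proportion of possible undirected ties that are present."""
    return 2 * len(edges) / (n_nodes * (n_nodes - 1))

def degree_centralization(n_nodes, edges):
    """Freeman degree centralization: 1.0 for a star network dominated
    by one actor, 0.0 when every actor has the same degree."""
    degree = [0] * n_nodes
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    d_max = max(degree)
    return sum(d_max - d for d in degree) / ((n_nodes - 1) * (n_nodes - 2))
```

    For a five-actor star network (one consultant tied to four others), density is 0.4 and centralization is 1.0, the "few key actors hold the connections" extreme the abstract describes.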

  4. Economic Analysis on Key Challenges for Sustainable Aquaculture Development

    Gedefaw Abate, Tenaw

    challenges that could obstruct its sustainable development, such as a lack of suitable feed, which includes fishmeal, fish oil and live feed, and negative environmental externalities. If the aquaculture industry is to reach its full potential, it must be both environmentally and economically sustainable...... environmental externalities. A sustainable supply of high-quality live feeds at reasonable prices is absolutely essential for aquaculture hatcheries because many commercially produced high-value marine fish larval species, such as flounder, grouper, halibut, tuna and turbot, require live feed for their early...... developmental stage. The key challenge in this regard is that the conventional used live feed items, Artemia and rotifers, are nutritionally deficient. Thus, the first main purpose of the thesis is carrying out an economic analysis of the feasibility of commercial production and the use of an alternative live...

  5. Top event prevention analysis: A deterministic use of PRA

    Worrell, R.B.; Blanchard, D.P.

    1996-01-01

    This paper describes the application of Top Event Prevention Analysis. The analysis finds prevention sets, which are combinations of basic events that can prevent the occurrence of a fault tree top event such as core damage. The problem analyzed in this application is that of choosing a subset of Motor-Operated Valves (MOVs) for testing under the Generic Letter 89-10 program such that the desired level of safety is achieved while providing economic relief from the burden of testing all safety-related valves. A brief summary of the method is given, and the process used to produce a core damage expression from Level 1 PRA models for a PWR is described. The analysis provides an alternative to the use of importance measures for finding the important combinations of events in a core damage expression. This application of Top Event Prevention Analysis to the MOV problem was achieved with currently available software
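
    The core idea, finding a set of basic events whose prevention blocks every minimal cut set, is a hitting-set problem. A greedy heuristic sketch (illustrative only, not the algorithm of the software used in the paper):

```python
def greedy_prevention_set(cut_sets):
    """Greedily build a prevention set: a set of basic events that
    intersects every minimal cut set, so preventing those events
    blocks every path to the top event."""
    remaining = [set(cs) for cs in cut_sets]
    chosen = set()
    while remaining:
        # Pick the event appearing in the most still-uncovered cut sets.
        counts = {}
        for cs in remaining:
            for event in cs:
                counts[event] = counts.get(event, 0) + 1
        best = max(counts, key=counts.get)
        chosen.add(best)
        remaining = [cs for cs in remaining if best not in cs]
    return chosen
```

    A greedy pass gives a valid but not necessarily minimum prevention set; finding a minimum one is NP-hard in general, which is why dedicated tooling matters for full-size PRA models.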

  6. Satellite-Observed Black Water Events off Southwest Florida: Implications for Coral Reef Health in the Florida Keys National Marine Sanctuary

    Zhao, Jun; Hu, Chuanmin; Lapointe, Brian; Melo, Nelson; Johns, Elizabeth; Smith, Ryan

    2013-01-01

    A “black water” event, as observed from satellites, occurred off southwest Florida in 2012. Satellite observations suggested that the event started in early January and ended in mid-April 2012. The black water patch formed off central west Florida and advected southward towards Florida Bay and the Florida Keys with the shelf circulation, which was confirmed by satellite-tracked surface drifter trajectories. Compared with a previous black water event in 2002, the 2012 event was weaker in terms...

  7. Resonant experience in emergent events of analysis

    Revsbæk, Line

    2018-01-01

    Theory, and the traditions of thought available and known to us, give shape to what we are able to notice of our field of inquiry, and so also of our practice of research. Building on G. H. Mead’s Philosophy of the Present (1932), this paper draws attention to ‘emergent events’ of analysis when...... in responsive relating to (case study) others is made generative as a dynamic in and of case study analysis. Using a case of being a newcomer (to research communities) researching newcomer innovation (of others), ‘resonant experience’ is illustrated as a heuristic in interview analysis to simultaneously...

  8. Safety Analysis for Key Design Features of KALIMER-600 Design Concept

    Lee, Yong Bum; Kwon, Y. M.; Kim, E. K.; Suk, S. D.; Chang, W. P.; Jeong, H. Y.; Ha, K. S

    2007-02-15

    This report contains the safety analyses of the KALIMER-600 conceptual design, which KAERI has been developing under the Long-term Nuclear R and D Program. The analyses reflect the design developments during the second year of the 4th design phase of the program. Specifically presented are the key design features together with the safety principles for achieving the safety objectives, the event categorization and safety criteria, and results of the safety analyses for the DBA and ATWS events, the containment performance, and channel blockages. The safety analyses for both the DBA and ATWS events have been performed using SSC-K version 1.3, and the results have shown fulfillment of the safety criteria for DBAs under conservative assumptions. The safety margins as well as the inherent safety have also been confirmed for the ATWS events. For the containment performance analysis, ORIGEN-2.1 and CONTAIN-LMR have been used; the structural integrity was found acceptable and the evaluated exposure dose rate complied with 10 CFR 100 and PAG limits. The analysis results for flow blockages of 6 subchannels, 24 subchannels, and 54 subchannels, obtained with the MATRA-LMR-FB code, have assured the integrity of the subassemblies.

  9. External events analysis for the Savannah River Site K reactor

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2x10^-4 per year, of which seismic events are the major contributor (1.2x10^-4 per year). Fire-initiated events contribute 1.4x10^-7 per year, tornados 5.8x10^-7 per year, dam failures 1.5x10^-6 per year, and the crane failure scenario less than 10^-4 per year to the core melt frequency. 8 refs., 3 figs., 5 tabs

  10. Analysis of key technologies for virtual instruments metrology

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs covers two aspects: measurement functions and software characteristics. The complexity of the software makes metrological testing of VIs difficult. Key approaches and technologies for the metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is the evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing, and statistics, supported by the powerful computing capability of the PC. Another concern is the evaluation of software characteristics of VIs such as correctness, reliability, stability, security, and real-time behaviour. Technologies from the software engineering, software testing, and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing, and modeling approaches can be used to evaluate the reliability of modules, components, applications, and the whole VI software. The security of a VI can be assessed by methods such as vulnerability scanning and penetration analysis. In order to allow metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for such an automatic tool is proposed in this paper. Investigation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed framework.
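
    Monte Carlo evaluation of measurement uncertainty, one of the simulation-and-statistics techniques listed above, can be sketched as follows; the V/I resistance example and its input values are hypothetical:

```python
import random
import statistics

def monte_carlo_uncertainty(func, inputs, n=100_000, seed=1):
    """Propagate input uncertainties through a measurement function by
    Monte Carlo simulation; returns (mean, standard uncertainty).
    Each input is a (mean, standard deviation) pair, sampled as Gaussian."""
    rng = random.Random(seed)
    samples = [func(*(rng.gauss(mu, sigma) for mu, sigma in inputs))
               for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical example: resistance R = V / I with
# V = 5.000 V +/- 0.050 V and I = 1.000 A +/- 0.010 A.
mean_r, u_r = monte_carlo_uncertainty(lambda v, i: v / i,
                                      [(5.0, 0.05), (1.0, 0.01)])
```

    This numerical approach works for arbitrary measurement functions, including the software-defined processing chains of a VI where analytic propagation is impractical.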

  11. Formal Analysis of Key Integrity in PKCS#11

    Falcone, Andrea; Focardi, Riccardo

    PKCS#11 is a standard API to cryptographic devices such as smartcards, hardware security modules and USB crypto-tokens. Though widely adopted, this API has been shown to be prone to attacks in which a malicious user gains access to the sensitive keys stored in the devices. In 2008, Delaune, Kremer and Steel proposed a model to formally reason about this kind of attack. We extend this model to also describe flaws that are based on integrity violations of the stored keys. In particular, we consider scenarios in which a malicious overwriting of keys might fool honest users into using the attacker's own keys while performing sensitive operations. We further enrich the model with a trusted key mechanism ensuring that only controlled, non-tampered keys are used in cryptographic operations, and we show how this modified API prevents the above-mentioned key-replacement attacks.

  12. Preliminary safety analysis for key design features of KALIMER with breakeven core

    Hahn, Do Hee; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, Y. B.; Jeong, K. S

    2001-06-01

    KAERI is currently developing the conceptual design of a Liquid Metal Reactor, KALIMER (Korea Advanced Liquid MEtal Reactor), under the Long-term Nuclear R and D Program. KALIMER addresses key issues regarding future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, descriptions of safety design features and safety analysis results for selected ATWS accidents for the breakeven core KALIMER are presented. First, the basic approach to achieving the safety goal is introduced in Chapter 1, and the safety evaluation procedure for the KALIMER design is described in Chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In Chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. In Chapter 4, the design of the KALIMER containment dome and the results of its performance analyses are presented. The designs of the existing containment and the KALIMER containment dome are compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced in Chapter 5 to investigate the core energetics behavior during an HCDA. Sensitivity analyses have been performed for the KALIMER core behavior during super-prompt critical excursions, using mathematical formulations developed in the framework of the Modified Bethe-Tait method. The work energy potential was then calculated based on the isentropic fuel expansion model.

  13. A Fourier analysis of extremal events

    Zhao, Yuwei

    is the extremal periodogram. The extremal periodogram shares numerous asymptotic properties with the periodogram of a linear process in classical time series analysis: the asymptotic distribution of the periodogram ordinates at the Fourier frequencies have a similar form and smoothed versions of the periodogram...

  14. Event analysis using a massively parallel processor

    Bale, A.; Gerelle, E.; Messersmith, J.; Warren, R.; Hoek, J.

    1990-01-01

    This paper describes a system for performing histogramming of n-tuple data at interactive rates using a commercial SIMD processor array connected to a workstation running the well-known Physics Analysis Workstation software (PAW). Results indicate that an order of magnitude performance improvement over current RISC technology is easily achievable

  15. Multi-Unit Initiating Event Analysis for a Single-Unit Internal Events Level 1 PSA

    Kim, Dong San; Park, Jin Hee; Lim, Ho Gon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The Fukushima nuclear accident in 2011 highlighted the importance of considering the risks from multi-unit accidents at a site. The ASME/ANS probabilistic risk assessment (PRA) standard also includes some requirements related to multi-unit aspects, one of which (IE-B5) is as follows: 'For multi-unit sites with shared systems, DO NOT SUBSUME multi-unit initiating events if they impact mitigation capability [1].' However, existing single-unit PSA models do not explicitly consider multi-unit initiating events, and hence systems shared by multiple units (e.g., the alternate AC diesel generator) are fully credited for the single unit, ignoring the need of other units at the same site for the shared systems [2]. This paper describes the results of the multi-unit initiating event (IE) analysis performed as a part of the at-power internal events Level 1 probabilistic safety assessment (PSA) for an OPR1000 single unit ('reference unit'). In this study, a multi-unit initiating event analysis for a single-unit PSA was performed, and using the results, a dual-unit LOOP initiating event was added to the existing PSA model for the reference unit (OPR1000 type). Event trees were developed for dual-unit LOOP and dual-unit SBO, which can be transferred from dual-unit LOOP. Moreover, CCF basic events for 5 diesel generators were modelled. In case of simultaneous SBO occurrences in both units, this study compared two different assumptions on the availability of the AAC D/G. As a result, when the dual-unit LOOP initiating event was added to the existing single-unit PSA model, the total CDF increased by 1-2%, depending on the probability that the AAC D/G is available to a specific unit in case of simultaneous SBO in both units.
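    The reported 1-2% CDF increase is, at bottom, a sum of per-initiator contributions (initiating-event frequency times conditional core-damage probability). A minimal sketch of that bookkeeping with made-up illustrative numbers, not the actual OPR1000 PSA values:

```python
# Illustrative (invented) initiating-event frequencies (/yr) and conditional
# core-damage probabilities; not the actual OPR1000 PSA quantification.
initiators = {
    "transients":       (2.0,    1.0e-6),
    "single-unit LOOP": (1.0e-1, 2.0e-5),
    "dual-unit LOOP":   (1.0e-2, 4.0e-6),   # newly added multi-unit IE
}

def core_damage_frequency(events):
    """Total CDF as the sum of frequency x conditional CD probability."""
    return sum(freq * p_cd for freq, p_cd in events.values())

base = core_damage_frequency(
    {k: v for k, v in initiators.items() if k != "dual-unit LOOP"})
total = core_damage_frequency(initiators)
increase_pct = 100.0 * (total - base) / base
```

    With these invented numbers the added dual-unit LOOP contributes a 1% increment, the same order as the 1-2% reported in the abstract.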

  16. Key Sustainability Performance Indicator Analysis for Czech Breweries

    Edward Kasem

    2015-01-01

    Sustainability performance can be described as the ability of an organization to remain productive over time and to hold on to its potential for maintaining long-term profitability. Since the brewery sector is one of the most important and leading markets in the foodstuff industry of the Czech Republic, this study depicts the Czech breweries' formal entry into sustainability reporting and performance. The purpose of this paper is to provide an efficiency-level evaluation representing the level of corporate performance of Czech breweries. For this reason, Data Envelopment Analysis (DEA) is introduced. In order to apply it, we utilize a set of key performance indicators (KPIs) based on two international standard frameworks: the Global Reporting Initiative (GRI) and its GRI 4 guidelines, and the guideline KPIs for ESG 3.0 published by the DVFA Society. Four sustainability dimensions (economic, environmental, social and governance) are covered, thus making it possible to adequately evaluate sustainability performance in Czech breweries. The main output is not only the efficiency score of the company but also the input weights. These weights are used to determine the contribution of particular criteria to the breweries' achieved score. According to the achieved efficiency results for Czech breweries, the percentage of women supervising the company does not affect sustainability performance.
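    The DEA evaluation the paper relies on can be illustrated with the standard input-oriented CCR model in multiplier form, solved as a linear program. This is a generic textbook formulation with toy data, not the paper's actual KPI set, and it assumes SciPy's `linprog` is available:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0 (multiplier form):
    max u.y0  s.t.  v.x0 = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    _, s = Y.shape
    c = np.concatenate([-Y[j0], np.zeros(m)])           # minimise -u.y0
    A_ub = np.hstack([Y, -X])                           # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None]   # v.x0 = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0])
    return -res.fun

# toy data: 3 hypothetical breweries, one input (cost), one output (KPI score)
X = np.array([[1.0], [2.0], [4.0]])
Y = np.array([[1.0], [2.0], [2.0]])
scores = [ccr_efficiency(X, Y, j) for j in range(3)]
```

    The optimal multipliers (`res.x`) play the role of the input weights the abstract mentions: they show which criteria drive each brewery's score.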

  17. A Sensitivity Analysis Approach to Identify Key Environmental Performance Factors

    Xi Yu

    2014-01-01

    Life cycle assessment (LCA) has been widely used in the design phase over the last two decades to reduce a product's environmental impacts across the whole product life cycle (PLC). Traditional LCA is restricted to assessing the environmental impacts of a product, and the results cannot reflect the effects of changes within the life cycle. In order to improve the quality of ecodesign, there is a growing need to develop an approach which can reflect the relationship between the design parameters and a product's environmental impacts. A sensitivity analysis approach based on LCA and ecodesign is proposed in this paper. The key environmental performance factors which have a significant influence on the product's environmental impacts can be identified by analyzing the relationship between environmental impacts and the design parameters. Users without much environmental knowledge can use this approach to determine which design parameter should be considered first when (re)designing a product. A printed circuit board (PCB) case study is conducted; eight design parameters are chosen to be analyzed by our approach. The result shows that the carbon dioxide emission during PCB manufacture is highly sensitive to the area of the PCB panel.
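    The sensitivity idea here reduces to perturbing one design parameter at a time and comparing the relative change in impact to the relative change in the parameter. A minimal sketch, where the linear `lca_impact` model and its coefficients are invented for illustration and stand in for a real LCA inventory model:

```python
def lca_impact(params):
    """Hypothetical impact model: CO2 (kg) from PCB manufacture.
    Invented coefficients; a stand-in for a real LCA inventory model."""
    return (2.5 * params["panel_area_m2"]
            + 0.8 * params["copper_mass_kg"]
            + 0.1 * params["n_layers"])

def relative_sensitivity(model, params, name, delta=0.01):
    """One-at-a-time sensitivity: (dI/I) / (dp/p), i.e. the elasticity of the
    impact with respect to one design parameter."""
    base = model(params)
    bumped = dict(params)
    bumped[name] *= (1.0 + delta)
    return ((model(bumped) - base) / base) / delta

params = {"panel_area_m2": 0.5, "copper_mass_kg": 0.2, "n_layers": 4}
s_area = relative_sensitivity(lca_impact, params, "panel_area_m2")
```

    Ranking the parameters by this elasticity identifies the key environmental performance factors; in this toy model, panel area dominates, mirroring the PCB finding in the abstract.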

  18. Analysis of event-mode data with Interactive Data Language

    De Young, P.A.; Hilldore, B.B.; Kiessel, L.M.; Peaslee, G.F.

    2003-01-01

    We have developed an analysis package for event-mode data based on Interactive Data Language (IDL) from Research Systems Inc. This high-level language is fast, array oriented, object oriented, and has extensive visual (multi-dimensional plotting) and mathematical functions. We have developed a general framework, written in IDL, for the analysis of a variety of experimental data that does not require significant customization for each analysis. Unlike many traditional analysis packages, spectra and gates are applied after the data are read and are easily changed as the analysis proceeds, without rereading the data. The events are not sequentially processed into predetermined arrays subject to predetermined gates
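    The key design point, gates applied as predicates after the events are in memory rather than baked into the read loop, can be sketched in a few lines (Python stands in for IDL here; the field names and values are invented):

```python
# Events are kept in memory as records; gates are plain predicates applied
# afterwards, so changing a gate does not require re-reading the data.
events = [
    {"energy": 5.2, "time": 0.1},
    {"energy": 1.1, "time": 0.4},
    {"energy": 7.8, "time": 0.2},
]

def histogram(events, key, edges, gate=lambda e: True):
    """Histogram one field of the gated events into bins given by edges."""
    counts = [0] * (len(edges) - 1)
    for e in (ev for ev in events if gate(ev)):
        for i in range(len(counts)):
            if edges[i] <= e[key] < edges[i + 1]:
                counts[i] += 1
                break
    return counts

all_counts = histogram(events, "energy", [0, 4, 8])
gated = histogram(events, "energy", [0, 4, 8], gate=lambda e: e["time"] < 0.3)
```

    Re-gating is just another call with a different predicate, which is the workflow advantage the abstract describes.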

  19. Balboa: A Framework for Event-Based Process Data Analysis

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  20. Analysis of unprotected overcooling events in the Integral Fast Reactor

    Vilim, R.B.

    1989-01-01

    Simple analytic models are developed for predicting the response of a metal fueled, liquid-metal cooled reactor to unprotected overcooling events in the balance of plant. All overcooling initiators are shown to fall into two categories. The first category contains those events for which there is no final equilibrium state of constant overcooling, as is the case for a large steam leak. These events are analyzed using a non-flow control mass approach. The second category contains those events which will eventually equilibrate, such as a loss of feedwater heaters. A steady flow control volume analysis shows that these latter events ultimately affect the plant through the feedwater inlet to the steam generator. The models developed for analyzing these two categories provide upper bounds for the reactor's passive response to overcooling accident initiators. Calculation of these bounds for a prototypic plant indicates that failure limits -- eutectic melting, sodium boiling, fuel pin failure -- are not exceeded in any overcooling event. 2 refs

  1. Repeated Time-to-event Analysis of Consecutive Analgesic Events in Postoperative Pain

    Juul, Rasmus Vestergaard; Rasmussen, Sten; Kreilgaard, Mads

    2015-01-01

    BACKGROUND: Reduction in consumption of opioid rescue medication is often used as an endpoint when investigating analgesic efficacy of drugs by adjunct treatment, but appropriate methods are needed to analyze analgesic consumption in time. Repeated time-to-event (RTTE) modeling is proposed as a way to describe analgesic consumption by analyzing the timing of consecutive analgesic events. METHODS: Retrospective data were obtained from 63 patients receiving standard analgesic treatment including morphine on request after surgery following hip fracture. Times of analgesic events up to 96 h after surgery were extracted from hospital medical records. Parametric RTTE analysis was performed with exponential, Weibull, or Gompertz distribution of analgesic events using NONMEM®, version 7.2 (ICON Development Solutions, USA). The potential influences of night versus day, sex, and age were investigated...
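    The RTTE setup can be illustrated by simulating consecutive analgesic events from a constant hazard (the exponential case mentioned in the abstract) and recovering the rate by maximum likelihood. All numbers are illustrative, not the study's estimates:

```python
import random

def simulate_rtte(rate_per_h, horizon_h, rng):
    """Consecutive analgesic-request times for one patient from a constant
    hazard (exponential inter-event times), censored at the horizon."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate_per_h)
        if t > horizon_h:
            return times
        times.append(t)

rng = random.Random(42)
patients = [simulate_rtte(rate_per_h=0.2, horizon_h=96.0, rng=rng)
            for _ in range(200)]

# MLE for a constant hazard: total events / total follow-up time
n_events = sum(len(p) for p in patients)
rate_hat = n_events / (200 * 96.0)
```

    A Weibull or Gompertz RTTE model would replace the constant hazard with a time-varying one, which is what distinguishes the parametric analyses compared in the paper.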

  2. Sovereign Default Analysis through Extreme Events Identification

    Vasile George MARICA

    2015-06-01

    This paper investigates contagion in international credit markets through the use of a novel jump detection technique proposed by Chan and Maheu (2002). This econometric methodology is preferred because it is non-linear by definition and not subject to volatility bias. Also, the identified jumps in CDS premiums are considered as outliers positioned beyond any stochastic movement that can be, and already is, modelled through well-known linear analysis. Though contagion is hard to define, we show that extreme discrete movements in default probabilities inferred from CDS premiums can lead to sound economic conclusions about the risk profile of sovereign nations in international bond markets. We find evidence of investor sentiment clustering for countries with unstable political regimes or that are engaged in armed conflict. Countries that have in their recent history faced currency or financial crises are less vulnerable to external unexpected shocks. First we present a brief history of sovereign defaults with an emphasis on their increased frequency and geographical reach as financial markets become more and more integrated. We then pass to a literature review of the most important definitions of contagion, and discuss what quantitative methods are available to detect its presence. The paper continues with the details of the methodology of jump detection through non-linear modelling and its use in the field of contagion identification. In the last sections we present the estimation results for simultaneous jumps between emerging market CDS and draw conclusions on the difference in behavior in times of extreme movement versus tranquil periods.
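    As rough intuition for jump detection (though not the Chan-Maheu model, which embeds jumps in a conditional volatility framework), extreme discrete movements can be flagged with a robust outlier rule; a MAD-based scale avoids the bias a single large jump would otherwise introduce into the threshold. The data below are invented:

```python
import statistics

def flag_jumps(changes, k=3.0):
    """Flag changes lying beyond k robust (MAD-based) standard deviations.
    A deliberately simple outlier rule, NOT the Chan-Maheu jump model."""
    med = statistics.median(changes)
    mad = statistics.median(abs(c - med) for c in changes)
    robust_sd = 1.4826 * mad  # MAD -> sd for Gaussian data
    return [i for i, c in enumerate(changes)
            if abs(c - med) > k * robust_sd]

# invented daily CDS premium changes in basis points; index 5 is extreme
changes = [1.2, -0.8, 0.5, -1.1, 0.9, 45.0, -0.6, 0.3, 1.0, -0.4]
jumps = flag_jumps(changes)
```

    A plain mean/standard-deviation threshold would be inflated by the jump itself, which is one reason the paper prefers a non-linear jump model over linear analysis.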

  3. Discrete event simulation versus conventional system reliability analysis approaches

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author's experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  4. Glaciological parameters of disruptive event analysis

    Bull, C.

    1980-04-01

    The possibility of complete glaciation of the earth is small and probably need not be considered in the consequence analysis by the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program. However, within a few thousand years an ice sheet may well cover proposed waste disposal sites in Michigan. Those in the Gulf Coast region and New Mexico are unlikely to be ice covered. The probability of ice cover at Hanford in the next million years is finite, perhaps about 0.5. Sea level will fluctuate as a result of climatic changes. As ice sheets grow, sea level will fall. Melting of ice sheets will be accompanied by a rise in sea level. Within the present interglacial period there is a definite chance that the West Antarctic ice sheet will melt. Ice sheets are agents of erosion, and some estimates of the amount of material they erode have been made. As an average over the area glaciated by late Quaternary ice sheets, only a few tens of meters of erosion is indicated. There were perhaps 3 meters of erosion per glaciation cycle. Under glacial conditions the surface boundary conditions for ground water recharge will be appreciably changed. In future glaciations melt-water rivers generally will follow pre-existing river courses. Some salt dome sites in the Gulf Coast region could be susceptible to changes in the course of the Mississippi River. The New Mexico site, which is on a high plateau, seems to be immune from this type of problem. The Hanford Site is only a few miles from the Columbia River, and in the future, lateral erosion by the Columbia River could cause changes in its course. A prudent assumption in the AEGIS study is that the present interglacial will continue for only a limited period and that subsequently an ice sheet will form over North America. Other factors being equal, it seems unwise to site a nuclear waste repository (even at great depth) in an area likely to be glaciated

  5. Safety analysis for key design features of KALIMER-600 design concept

    Lee, Yong-Bum; Kwon, Y. M.; Kim, E. K.; Suk, S. D.; Chang, W. P.; Joeng, H. Y.; Ha, K. S.; Heo, S.

    2005-03-01

    KAERI is developing the conceptual design of a Liquid Metal Reactor, KALIMER-600 (Korea Advanced LIquid MEtal Reactor), under the Long-term Nuclear R and D Program. KALIMER-600 addresses key issues regarding future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, key safety design features are described and safety analysis results for typical ATWS accidents, containment design basis accidents, and flow blockages in the KALIMER design are presented. First, the basic approach to achieving the safety goal and the main design features of KALIMER-600 are introduced in Chapter 1, and the event categorization and acceptance criteria for the KALIMER-600 safety analysis are described in Chapter 2. In Chapter 3, results of inherent safety evaluations for the KALIMER-600 conceptual design are presented. The KALIMER-600 core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed using the SSC-K code to investigate the KALIMER-600 system response to the events. The objectives of Chapter 4 are to assess the response of the KALIMER-600 containment to the design basis accidents and to evaluate whether the consequences are acceptable in terms of structural integrity and exposure dose rate. In Chapter 5, the analysis of flow blockage for KALIMER-600 with the MATRA-LMR-FB code, which has been developed for internal flow blockage in an LMR subassembly, is described. Cases with a blockage of 6 subchannels, 24 subchannels, and 54 subchannels are analyzed

  6. Screening key candidate genes and pathways involved in insulinoma by microarray analysis.

    Zhou, Wuhua; Gong, Li; Li, Xuefeng; Wan, Yunyan; Wang, Xiangfei; Li, Huili; Jiang, Bin

    2018-06-01

    Insulinoma is a rare type of tumor and its genetic features remain largely unknown. This study aimed to search for potential key genes and relevant enriched pathways of insulinoma. The gene expression data from GSE73338 were downloaded from the Gene Expression Omnibus database. Differentially expressed genes (DEGs) were identified between insulinoma tissues and normal pancreas tissues, followed by pathway enrichment analysis, protein-protein interaction (PPI) network construction, and module analysis. The expressions of candidate key genes were validated by quantitative real-time polymerase chain reaction (RT-PCR) in insulinoma tissues. A total of 1632 DEGs were obtained, including 1117 upregulated genes and 514 downregulated genes. Pathway enrichment results showed that upregulated DEGs were significantly implicated in insulin secretion, and downregulated DEGs were mainly enriched in pancreatic secretion. PPI network analysis revealed 7 hub genes with degrees greater than 10, including GCG (glucagon), GCGR (glucagon receptor), PLCB1 (phospholipase C, beta 1), CASR (calcium sensing receptor), F2R (coagulation factor II thrombin receptor), GRM1 (glutamate metabotropic receptor 1), and GRM5 (glutamate metabotropic receptor 5). DEGs involved in the significant modules were enriched in the calcium signaling pathway, protein ubiquitination, and platelet degranulation. Quantitative RT-PCR data confirmed that the expression trends of these hub genes were similar to the results of the bioinformatic analysis. The present study demonstrated that the candidate DEGs and enriched pathways are potential critical molecular events involved in the development of insulinoma, and these findings are useful for a better understanding of insulinoma genesis.

  7. Analysis of Secret Key Randomness Exploiting the Radio Channel Variability

    Taghrid Mazloum

    2015-01-01

    A few years ago, physical layer based techniques started to be considered as a way to improve security in wireless communications. A well-known problem is the management of ciphering keys, regarding both the generation and the distribution of these keys. A way to alleviate such difficulties is to use a common source of randomness for the legitimate terminals, not accessible to an eavesdropper. This is the case of the fading propagation channel, when exact or approximate reciprocity applies. Although this principle has been known for a long time, not many works have evaluated the effect of radio channel properties in practical environments on the degree of randomness of the generated keys. To this end, we investigate indoor radio channel measurements in different environments and settings in either the 2.4625 GHz or the 5.4 GHz band, of particular interest for WiFi related standards. Key bits are extracted by quantizing the complex channel coefficients and their randomness is evaluated using the NIST test suite. We then look at the impact of the carrier frequency and the channel variability in the space, time, and frequency degrees of freedom used to construct a long secret key, in relation to the nature of the radio environment such as the LOS/NLOS character.
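    Quantizing complex channel coefficients into key bits, as described here, can be sketched with a simple sign-based scheme (one of several possible quantizers; the coefficients below are invented and the small perturbation stands in for estimation noise at the two ends of a reciprocal channel):

```python
def channel_to_bits(coeffs):
    """Quantise complex channel coefficients into key bits using the signs
    of the real and imaginary parts (2 bits per coefficient)."""
    bits = []
    for h in coeffs:
        bits.append(1 if h.real >= 0 else 0)
        bits.append(1 if h.imag >= 0 else 0)
    return bits

# reciprocal channel: both ends observe nearly the same coefficients
h_alice = [0.80 + 0.30j, -0.20 + 0.90j, 0.10 - 0.70j]
h_bob   = [0.82 + 0.28j, -0.19 + 0.88j, 0.12 - 0.69j]  # estimation noise

key_a = channel_to_bits(h_alice)
key_b = channel_to_bits(h_bob)
disagreement = sum(a != b for a, b in zip(key_a, key_b))
```

    In practice coefficients near a quantization boundary cause bit disagreements, which is why measured key streams are then checked with the NIST randomness test suite and reconciled before use.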

  8. Serious adverse events with infliximab: analysis of spontaneously reported adverse events.

    Hansen, Richard A; Gartlehner, Gerald; Powell, Gregory E; Sandler, Robert S

    2007-06-01

    Serious adverse events such as bowel obstruction, heart failure, infection, lymphoma, and neuropathy have been reported with infliximab. The aims of this study were to explore adverse event signals with infliximab by using a long period of post-marketing experience, stratifying by indication. The relative reporting of infliximab adverse events to the U.S. Food and Drug Administration (FDA) was assessed with the public release version of the adverse event reporting system (AERS) database from 1968 to third quarter 2005. On the basis of a systematic review of adverse events, Medical Dictionary for Regulatory Activities (MedDRA) terms were mapped to predefined categories of adverse events, including death, heart failure, hepatitis, infection, infusion reaction, lymphoma, myelosuppression, neuropathy, and obstruction. Disproportionality analysis was used to calculate the empiric Bayes geometric mean (EBGM) and corresponding 90% confidence intervals (EB05, EB95) for adverse event categories. Infliximab was identified as the suspect medication in 18,220 reports in the FDA AERS database. We identified a signal for lymphoma (EB05 = 6.9), neuropathy (EB05 = 3.8), infection (EB05 = 2.9), and bowel obstruction (EB05 = 2.8). The signal for granulomatous infections was stronger than the signal for non-granulomatous infections (EB05 = 12.6 and 2.4, respectively). The signals for bowel obstruction and infusion reaction were specific to patients with IBD; this suggests potential confounding by indication, especially for bowel obstruction. In light of this additional evidence of risk of lymphoma, neuropathy, and granulomatous infections, clinicians should stress this risk in the shared decision-making process.
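    The disproportionality idea behind the EBGM can be illustrated with the simpler proportional reporting ratio (PRR) computed from a 2x2 report-count table; the full EBGM additionally applies empirical-Bayes shrinkage, which is omitted here. The counts are invented, not AERS data:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 spontaneous-report table:
        a: target drug & target event    b: target drug, other events
        c: other drugs, target event     d: other drugs, other events
    A simpler disproportionality measure than the EBGM used in the study."""
    return (a / (a + b)) / (c / (c + d))

# invented illustrative counts
value = prr(a=50, b=950, c=200, d=98800)
```

    A value well above 1 means the event is reported disproportionately often with the drug; signal criteria such as EB05 > 2 in the abstract play the analogous role for the shrunken EBGM estimate.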

  9. Chronic Absenteeism: A Key Indicator of Student Success. Policy Analysis

    Rafa, Alyssa

    2017-01-01

    Research shows that chronic absenteeism can affect academic performance in later grades and is a key early warning sign that a student is more likely to drop out of high school. Several states enacted legislation to address this issue, and many states are currently discussing the utility of chronic absenteeism as an indicator of school quality or…

  10. Time to Tenure in Spanish Universities: An Event History Analysis

    Sanz-Menéndez, Luis; Cruz-Castro, Laura; Alva, Kenedy

    2013-01-01

    Understanding how institutional incentives and mechanisms for assigning recognition shape access to a permanent job is important. This study, based on data from questionnaire survey responses and publications of 1,257 university science, biomedical and engineering faculty in Spain, attempts to understand the timing of getting a permanent position and the relevant factors that account for this transition, in the context of dilemmas between mobility and permanence faced by organizations. Using event history analysis, the paper looks at the time to promotion and the effects of some relevant covariates associated to academic performance, social embeddedness and mobility. We find that research productivity contributes to career acceleration, but that other variables are also significantly associated to a faster transition. Factors associated to the social elements of academic life also play a role in reducing the time from PhD graduation to tenure. However, mobility significantly increases the duration of the non-tenure stage. In contrast with previous findings, the role of sex is minor. The variations in the length of time to promotion across different scientific domains is confirmed, with faster career advancement for those in the Engineering and Technological Sciences compared with academics in the Biological and Biomedical Sciences. Results show clear effects of seniority, and rewards to loyalty, in addition to some measurements of performance and quality of the university granting the PhD, as key elements speeding up career advancement. Findings suggest the existence of a system based on granting early permanent jobs to those that combine social embeddedness and team integration with some good credentials regarding past and potential future performance, rather than high levels of mobility. PMID:24116199


  13. Tight finite-key analysis for quantum cryptography.

    Tomamichel, Marco; Lim, Charles Ci Wen; Gisin, Nicolas; Renner, Renato

    2012-01-17

    Despite enormous theoretical and experimental progress in quantum cryptography, the security of most current implementations of quantum key distribution is still not rigorously established. One significant problem is that the security of the final key strongly depends on the number, M, of signals exchanged between the legitimate parties. Yet, existing security proofs are often only valid asymptotically, for unrealistically large values of M. Another challenge is that most security proofs are very sensitive to small differences between the physical devices used by the protocol and the theoretical model used to describe them. Here we show that these gaps between theory and experiment can be simultaneously overcome by using a recently developed proof technique based on the uncertainty relation for smooth entropies.

  14. Microprocessor event analysis in parallel with Camac data acquisition

    Cords, D.; Eichler, R.; Riege, H.

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a Camac system (GEC-ELLIOTT System Crate) and shares the Camac access with a Nord-10S computer. Interfaces have been designed and tested for execution of Camac cycles, communication with the Nord-10S computer and DMA transfer from Camac to the MIPROC-16 memory. The system is used in the JADE data-acquisition system at PETRA, where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events and the result of various checks is appended to the event. In case of spurious triggers or clear beam gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (orig.)

  15. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K. (and others)

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. (c) 1999 The American Astronomical Society.
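
The core of difference imaging is to subtract a reference frame from each science frame and flag significant residuals. The toy below uses plain nested lists and omits the registration and PSF-matching convolution that real DIA pipelines perform; the frames and threshold are invented.

```python
# Toy difference-image analysis: subtract a reference frame from a
# science frame and flag pixels whose residual exceeds a threshold.
# Real DIA also registers the frames and convolves to a common PSF.

def difference_image(science, reference):
    return [[s - r for s, r in zip(srow, rrow)]
            for srow, rrow in zip(science, reference)]

def detect(diff, threshold):
    """Return (row, col) of pixels whose residual exceeds the threshold."""
    return [(i, j) for i, row in enumerate(diff)
            for j, v in enumerate(row) if v > threshold]

reference = [[10.0] * 5 for _ in range(5)]
science = [row[:] for row in reference]
science[2][3] += 40.0          # one star brightens, e.g. by microlensing

residual = difference_image(science, reference)
print(detect(residual, threshold=5.0))  # -> [(2, 3)]
```

Because constant stars cancel in the subtraction, only the variable source survives, which is why the method is insensitive to crowding and blending.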

  16. Microprocessor event analysis in parallel with CAMAC data acquisition

    Cords, D; Riege, H

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a CAMAC System (GEC-ELLIOTT System Crate) and shares the CAMAC access with a Nord-10S computer. Interfaces have been designed and tested for execution of CAMAC cycles, communication with the Nord-10S computer and DMA-transfer from CAMAC to the MIPROC-16 memory. The system is used in the JADE data-acquisition-system at PETRA where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events and the results of various checks are appended to the event. In case of spurious triggers or clear beam gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (5 refs).

  17. Numerical limit analysis of keyed shear joints in concrete structures

    Herfelt, Morten Andersen; Poulsen, Peter Noe; Hoang, Linh Cao

    2016-01-01

    This paper concerns the shear capacity of keyed joints, which are transversely reinforced with overlapping U-bar loops. It is known from experimental studies that the discontinuity of the transverse reinforcement affects the capacity as well as the failure mode; however, to the best knowledge...... theorem and uses the modified Mohr-Coulomb yield criterion, which is formulated for second-order cone programming. The model provides a statically admissible stress field as well as the failure mode. Twenty-four different test specimens are modelled and the calculations are compared to the experimental...

  18. Florbetaben PET in the Early Diagnosis of Alzheimer's Disease: A Discrete Event Simulation to Explore Its Potential Value and Key Data Gaps

    Guo, Shien; Getsios, Denis; Hernandez, Luis; Cho, Kelly; Lawler, Elizabeth; Altincatal, Arman; Lanes, Stephan; Blankenburg, Michael

    2012-01-01

    The growing understanding of the use of biomarkers in Alzheimer's disease (AD) may enable physicians to make more accurate and timely diagnoses. Florbetaben, a beta-amyloid tracer used with positron emission tomography (PET), is one of these diagnostic biomarkers. This analysis was undertaken to explore the potential value of florbetaben PET in the diagnosis of AD among patients with suspected dementia and to identify key data that are needed to further substantiate its value. A discrete event simulation was developed to conduct exploratory analyses from both US payer and societal perspectives. The model simulates the lifetime course of disease progression for individuals, evaluating the impact of their patient management from initial diagnostic work-up to final diagnosis. Model inputs were obtained from specific analyses of a large longitudinal dataset from the New England Veterans Healthcare System and supplemented with data from public data sources and assumptions. The analyses indicate that florbetaben PET has the potential to improve patient outcomes and reduce costs under certain scenarios. Key data on the use of florbetaben PET, such as its influence on time to confirmation of final diagnosis, treatment uptake, and treatment persistency, are unavailable and would be required to confirm its value. PMID:23326754

  19. Florbetaben PET in the Early Diagnosis of Alzheimer's Disease: A Discrete Event Simulation to Explore Its Potential Value and Key Data Gaps

    Shien Guo

    2012-01-01

    Full Text Available The growing understanding of the use of biomarkers in Alzheimer's disease (AD) may enable physicians to make more accurate and timely diagnoses. Florbetaben, a beta-amyloid tracer used with positron emission tomography (PET), is one of these diagnostic biomarkers. This analysis was undertaken to explore the potential value of florbetaben PET in the diagnosis of AD among patients with suspected dementia and to identify key data that are needed to further substantiate its value. A discrete event simulation was developed to conduct exploratory analyses from both US payer and societal perspectives. The model simulates the lifetime course of disease progression for individuals, evaluating the impact of their patient management from initial diagnostic work-up to final diagnosis. Model inputs were obtained from specific analyses of a large longitudinal dataset from the New England Veterans Healthcare System and supplemented with data from public data sources and assumptions. The analyses indicate that florbetaben PET has the potential to improve patient outcomes and reduce costs under certain scenarios. Key data on the use of florbetaben PET, such as its influence on time to confirmation of final diagnosis, treatment uptake, and treatment persistency, are unavailable and would be required to confirm its value.

  20. Safety analysis reports. Current status (third key report)

    1999-01-01

    A review of Ukrainian regulations and laws concerned with Nuclear power and radiation safety is presented with an overview of the requirements for the Safety Analysis Report Contents. Status of Safety Analysis Reports (SAR) is listed for each particular Ukrainian NPP including SAR development schedules. Organisational scheme of SAR development works includes: general technical co-ordination on Safety Analysis Report development; list of leading organisations and utilization of technical support within international projects

  1. Event history analysis and the cross-section

    Keiding, Niels

    2006-01-01

    Examples are given of problems in event history analysis, where several time origins (generating calendar time, age, disease duration, time on study, etc.) are considered simultaneously. The focus is on complex sampling patterns generated around a cross-section. A basic tool is the Lexis diagram....

  2. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-01-01

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility
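
A first-order way to bound the room overpressure from a rapid energy release is the constant-volume ideal-gas relation dP = (gamma - 1)·E/V. The sketch below uses this relation with invented numbers for the released energy and the bay free volume; the paper's actual calculations may have been more detailed.

```python
# First-order estimate of the quasi-static overpressure from a rapid
# energy release E into a closed room of free volume V filled with air,
# treated as an ideal gas at constant volume: dP = (gamma - 1) * E / V.
# E and V below are illustrative assumptions, not NIF data.

GAMMA_AIR = 1.4  # ratio of specific heats for air

def overpressure_pa(energy_j, volume_m3, gamma=GAMMA_AIR):
    return (gamma - 1.0) * energy_j / volume_m3

E = 5.0e6    # assumed energy released by a capacitor module failure, J
V = 2000.0   # assumed capacitor bay free volume, m^3
dp = overpressure_pa(E, V)
print(f"overpressure ~ {dp:.0f} Pa ({dp / 6895:.2f} psi)")
```

Such an estimate would then be compared against the rated pressure tolerance of the bay walls, which is the comparison the abstract describes.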

  3. Root Cause Analysis: Learning from Adverse Safety Events.

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. © RSNA, 2015.
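
One of the RCA tools named above, the Pareto chart, is just a frequency ranking of cause categories with cumulative percentages, used to isolate the "vital few" causes. The sketch below runs that tabulation over a hypothetical set of root-cause labels; the categories and counts are invented.

```python
# Pareto analysis over (hypothetical) root-cause categories: rank causes
# by frequency and compute the cumulative share, highlighting the few
# categories that account for most adverse events.
from collections import Counter

causes = (["communication failure"] * 9 + ["protocol not followed"] * 6 +
          ["equipment fault"] * 3 + ["staffing"] * 1 + ["other"] * 1)

counts = Counter(causes)
total = sum(counts.values())
cumulative, running = [], 0
for cause, n in counts.most_common():
    running += n
    cumulative.append((cause, n, 100.0 * running / total))

for cause, n, cum in cumulative:
    print(f"{cause:25s} {n:3d}  cum {cum:5.1f}%")
```

In this made-up dataset the top two categories already cover 75% of events, which is exactly the kind of prioritization signal an RCA committee would act on.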

  4. Physics analysis of the gang partial rod drive event

    Boman, C.; Frost, R.L.

    1992-08-01

    During the routine positioning of partial-length control rods in Gang 3 on the afternoon of Monday, July 27, 1992, the partial-length rods continued to drive into the reactor even after the operator released the controlling toggle switch. In response to this occurrence, the Safety Analysis and Engineering Services Group (SAEG) requested that the Applied Physics Group (APG) analyze the gang partial rod drive event. Although similar accident scenarios were considered in analysis for Chapter 15 of the Safety Analysis Report (SAR), APG and SAEG conferred and agreed that this particular type of gang partial-length rod motion event was not included in the SAR. This report details this analysis

  5. LOSP-initiated event tree analysis for BWR

    Watanabe, Norio; Kondo, Masaaki; Uno, Kiyotaka; Chigusa, Takeshi; Harami, Taikan

    1989-03-01

    As a preliminary study of 'Japanese Model Plant PSA', a LOSP (loss of off-site power)-initiated Event Tree Analysis for a Japanese typical BWR was carried out solely based on the open documents such as 'Safety Analysis Report'. The objectives of this analysis are as follows; - to delineate core-melt accident sequences initiated by LOSP, - to evaluate the importance of core-melt accident sequences in terms of occurrence frequency, and - to develop a foundation of plant information and analytical procedures for efficiently performing further 'Japanese Model Plant PSA'. This report describes the procedure and results of the LOSP-initiated Event Tree Analysis. In this analysis, two types of event trees, Functional Event Tree and Systemic Event Tree, were developed to delineate core-melt accident sequences and to quantify their frequencies. Front-line System Event Tree was prepared as well to provide core-melt sequence delineation for accident progression analysis of Level 2 PSA which will be followed in a future. Applying U.S. operational experience data such as component failure rates and a LOSP frequency, we obtained the following results; - The total frequency of core-melt accident sequences initiated by LOSP is estimated at 5 x 10^-4 per reactor-year. - The dominant sequences are 'Loss of Decay Heat Removal' and 'Loss of Emergency Electric Power Supply', which account for more than 90% of the total core-melt frequency. In this analysis, a higher value of 0.13/R·Y was used for the LOSP frequency than experiences in Japan and any recovery action was not considered. In fact, however, there has been no experience of LOSP event in Japanese nuclear power plants so far and it is also expected that offsite power and/or PCS would be recovered before core melt. Considering Japanese operating experience and recovery factors will reduce the total core-melt frequency to less than 10^-6 per reactor-year. (J.P.N.)
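
Event tree quantification of the kind described above reduces to multiplying the initiating-event frequency by the branch probabilities along each path. The two-question tree below mirrors the dominant sequences named in the abstract, but every probability is an invented placeholder, not plant data.

```python
# Minimal event-tree quantification sketch for a LOSP initiator:
# each sequence frequency = initiating frequency times the product of
# branch (success/failure) probabilities along its path.
# All numeric values are illustrative assumptions.

LOSP_FREQ = 0.13      # initiating-event frequency per reactor-year
P_EPS_FAIL = 2e-3     # emergency power supply fails (assumed)
P_DHR_FAIL = 4e-3     # decay heat removal fails, power available (assumed)

sequences = {
    "loss of emergency electric power": LOSP_FREQ * P_EPS_FAIL,
    "loss of decay heat removal": LOSP_FREQ * (1 - P_EPS_FAIL) * P_DHR_FAIL,
}
core_melt = sum(sequences.values())
for name, f in sequences.items():
    print(f"{name}: {f:.2e} /reactor-year")
print(f"total core-melt frequency: {core_melt:.2e} /reactor-year")
```

With these placeholder numbers the total lands near 10^-3 per reactor-year; crediting recovery of off-site power, as the abstract notes, would multiply each sequence by a non-recovery probability and pull the total down by orders of magnitude.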

  6. Evaluation of Fourier integral. Spectral analysis of seismic events

    Chitaru, Cristian; Enescu, Dumitru

    2003-01-01

    Spectral analysis of seismic events represents a method for the prediction of great earthquakes. The seismic signal is not a sinusoidal signal; it is therefore necessary to find a method for the best approximation of the real signal by sinusoidal components. The 'Quanterra' broadband station allows data access in numerical and/or graphical form. With the numerical form we can easily build a computer program (MSOFFICE-EXCEL) for spectral analysis. (authors)
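
Approximating a sampled signal by sinusoids is what the discrete Fourier transform does. The sketch below implements a naive DFT in pure Python (a spreadsheet or numpy.fft would be used in practice) and recovers the dominant frequency of a synthetic signal; the test signal is invented.

```python
# Naive discrete Fourier transform: amplitude of each frequency bin
# for a real sampled signal, used to find the dominant sinusoid.
import math

def dft_amplitudes(x):
    """Amplitude spectrum (bins 0..n//2) of a real signal x."""
    n = len(x)
    amps = []
    for k in range(n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        scale = 1.0 if k in (0, n // 2) else 2.0  # one-sided spectrum
        amps.append(scale * math.hypot(re, im) / n)
    return amps

n = 64
signal = [3.0 * math.sin(2 * math.pi * 5 * t / n) for t in range(n)]
amps = dft_amplitudes(signal)
print(max(range(len(amps)), key=amps.__getitem__))  # dominant bin -> 5
```

The amplitude at bin 5 comes out as 3.0, matching the synthetic sinusoid, which is the sense in which the DFT gives the "best approximation" of the signal by sinusoids.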

  7. Twitter data analysis: temporal and term frequency analysis with real-time event

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (www) has become a prominent and huge source of user-generated content and opinionative data. Among various social media, Twitter gained popularity as it offers a fast and effective way of sharing users’ perspectives on various critical and other issues in different domains, such as the ‘Political’, ‘Entertainment’ and ‘Business’ domains. As this data is generated at huge scale in the cloud, it has opened doors for researchers in the field of data science and analysis. Twitter also provides various APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collecting user profiles, friends and followers; 3) the Streaming API, which collects details like tweets, hashtags and geo-locations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically unfolding event. For this we focus on the ‘Entertainment’ domain, especially ‘Sports’, as IPL-T20 is the currently trending on-going event. We collect these numerous tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term-frequency analysis using different techniques, such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the temporal data of the event and ranking players or teams by popularity, which helps people understand key influencers on the social media platform.
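
The term-frequency step described above can be sketched without any Twitter credentials: the tweets below are made up stand-ins for documents that would really be pulled from MongoDB, and the tokenizer and stopword list are deliberately minimal.

```python
# Term-frequency sketch over a handful of made-up tweets; a real
# pipeline would stream JSON documents from the Twitter Streaming API
# into MongoDB and run this aggregation over the stored collection.
from collections import Counter
import re

tweets = [
    "What a finish! #IPL #cricket",
    "Great bowling in the #IPL tonight",
    "#cricket #IPL best over of the season",
]

stopwords = {"a", "the", "of", "in", "what"}
terms = Counter()
hashtags = Counter()
for tweet in tweets:
    for token in re.findall(r"#?\w+", tweet.lower()):
        if token.startswith("#"):
            hashtags[token] += 1
        elif token not in stopwords:
            terms[token] += 1

print(hashtags.most_common(2))  # -> [('#ipl', 3), ('#cricket', 2)]
```

Ranking hashtags or player names by count is exactly the popularity ranking the abstract describes; bucketing the same counts by tweet timestamp would give the time-series view.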

  8. Events

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  9. Top event prevention analysis - a deterministic use of PRA

    Blanchard, D.P.; Worrell, R.B.

    1995-01-01

    Risk importance measures are popular for many applications of probabilistic analysis. Inherent in the derivation of risk importance measures are implicit assumptions that those using these numerical results should be aware of in their decision making. These assumptions and potential limitations include the following: (1) The risk importance measures are derived for a single event at a time and are therefore valid only if all other event probabilities are unchanged at their current values. (2) The results for which risk importance measures are derived may not be complete for reasons such as truncation
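
The single-event-at-a-time caveat above is visible directly in how the two most common importance measures are computed: each perturbs one basic-event probability while holding all others fixed. The toy model below (top-event probability pA·pB + pC under a rare-event approximation) and its probabilities are invented for illustration.

```python
# Two common risk importance measures on a toy risk model where the
# top-event probability is R(pA, pB, pC) = pA*pB + pC (rare-event
# approximation). Fussell-Vesely: fractional decrease in R when event i
# never occurs. RAW: factor increase in R when event i is assumed to
# occur. Both vary ONE probability at a time -- the implicit assumption
# the abstract warns about.

def risk(pA, pB, pC):
    return pA * pB + pC

base = dict(pA=1e-2, pB=2e-2, pC=1e-4)
R0 = risk(**base)

for name in base:
    R_zero = risk(**{**base, name: 0.0})   # event i never occurs
    R_one = risk(**{**base, name: 1.0})    # event i certain to occur
    fv = (R0 - R_zero) / R0
    raw = R_one / R0
    print(f"{name}: FV={fv:.3f}  RAW={raw:.1f}")
```

If two probabilities changed together (say pA and pB both degraded), neither measure would capture the joint effect, which is why decisions based on these numbers need the stated assumptions in view.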

  10. Static Analysis for Event-Based XML Processing

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.

  11. Using discriminant analysis as a nucleation event classification method

    S. Mikkonen

    2006-01-01

    Full Text Available More than three years of measurements of aerosol size distribution and of different gas and meteorological parameters made in the Po Valley, Italy, were analysed for this study to examine which meteorological and trace gas variables affect the emergence of nucleation events. As the analysis method, we used discriminant analysis with a non-parametric Epanechnikov kernel, a non-parametric density estimation method. The best classification result for our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a preventing effect on new particle formation, whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events, but because of the great number of missing observations we had to exclude them from the final analysis.
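
Kernel discriminant analysis of this kind estimates each class's density non-parametrically and assigns a day to the class with the larger prior-weighted density. The one-dimensional sketch below uses the Epanechnikov kernel K(u) = 0.75·(1 - u²) on |u| ≤ 1; the training values (loosely styled as relative-humidity readings) and the bandwidth are invented.

```python
# Non-parametric discriminant analysis in one dimension: estimate each
# class's density with an Epanechnikov kernel and classify by the larger
# prior-weighted density. Data and bandwidth are made up for illustration.

def epanechnikov(u):
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def kde(x, sample, h):
    """Kernel density estimate at x with bandwidth h."""
    return sum(epanechnikov((x - xi) / h) for xi in sample) / (len(sample) * h)

event_days = [40.0, 42.0, 45.0, 47.0, 50.0]      # e.g. RH (%) on event days
nonevent_days = [70.0, 75.0, 78.0, 80.0, 85.0]   # RH (%) on non-event days
h = 8.0
n_total = len(event_days) + len(nonevent_days)

def classify(x):
    p_event = kde(x, event_days, h) * (len(event_days) / n_total)
    p_non = kde(x, nonevent_days, h) * (len(nonevent_days) / n_total)
    return "event" if p_event > p_non else "non-event"

print(classify(44.0))  # -> event
print(classify(79.0))  # -> non-event
```

The low-RH day falls inside the event-day density and the high-RH day inside the non-event density, consistent with the abstract's finding that high RH suppresses new particle formation; the real study did this in several dimensions at once.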

  12. External events analysis in PSA studies for Czech NPPs

    Holy, J.; Hustak, S.; Kolar, L.; Jaros, M.; Hladky, M.; Mlady, O.

    2014-01-01

    The purpose of the paper is to summarize the current status of natural external hazards analysis in the PSA projects maintained in the Czech Republic for both Czech NPPs - Dukovany and Temelin. The focus of the presentation is put upon the basic milestones in the external event analysis effort - identification of external hazards important for the Czech NPP sites, screening out of irrelevant hazards, modeling of plant response to the initiating events, including the basic activities regarding vulnerability and fragility analysis (supported with on-site analysis), quantification of accident sequences, interpretation of results and development of measures decreasing external event risk. The following external hazards, which have been addressed during the last several years in PSA projects for Czech NPPs, are discussed in the paper: 1) seismicity, 2) extremely low temperature, 3) extremely high temperature, 4) extreme wind, 5) extreme precipitation (water, snow), 6) transport of dangerous substances (as an example of a man-made hazard with some differences identified in comparison with natural hazards), 7) other hazards, which are not considered very important for Czech NPPs and were screened out in the initial phase of the analysis, but are known as potential problem areas abroad. The paper is a result of a coordinated effort with participation of experts and staff from the engineering support organization UJV Rez, a.s. and the NPPs located in the Czech Republic - Dukovany and Temelin. (authors)

  13. Transcriptome analysis elucidates key developmental components of bryozoan lophophore development

    Wong, Yue Him

    2014-10-10

    The most recent phylogenomic study suggested that Bryozoa (Ectoprocta), Brachiopoda, and Phoronida are monophyletic, implying that the lophophore of bryozoans, phoronids and brachiopods is a synapomorphy. Understanding the molecular mechanisms of the lophophore development of the Lophophorata clade can therefore provide us with new insight into the formation of the diverse morphological traits in metazoans. In the present study, we profiled the transcriptome of the Bryozoan (Ectoproct) Bugula neritina during the swimming larval stage (SW) and the early (4 h) and late (24 h) metamorphic stages using the Illumina HiSeq2000 platform. Various genes that function in development, the immune response and neurogenesis showed differential expression levels during metamorphosis. In situ hybridization of 23 genes that participate in the Wnt, BMP, Notch, and Hedgehog signaling pathways revealed their regulatory roles in the development of the lophophore and the ancestrula digestive tract. Our findings support the hypothesis that developmental precursors of the lophophore and the ancestrula digestive tract are pre-patterned by the differential expression of key developmental genes according to their fate. This study provides a foundation to better understand the developmental divergence and/or convergence among developmental precursors of the lophophore of bryozoans, brachiopods and phoronids.

  14. Analysis of system and of course of events

    Hoertner, H.; Kersting, E.J.; Puetter, B.M.

    1986-01-01

    The analysis of the system and of the course of events is used to determine the frequency of core melt-out accidents and to describe the safety-related boundary conditions of appropriate accidents. The lecture is concerned with the effect of system changes in the reference plant and the effect of triggering events not assessed in detail or not sufficiently assessed in detail in phase A of the German Risk Study on the frequency of core melt-out accidents, the minimum requirements for system functions for controlling triggering events, i.e. to prevent core melt-out accidents, the reliability data important for reliability investigations and frequency assessments. (orig./DG) [de

  15. Application and Use of PSA-based Event Analysis in Belgium

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and the use of the PSAEA guidelines (PSA-based Event Analysis). In 2000, risk-based precursor analysis has increasingly become a part of the AVN process of feedback of operating experience, and constitutes in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the- art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions allowed to identify - and in some cases also to quantify - several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. During 2000, PSAEA has increasingly become a part of the AVN process of feedback of operating experience. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events) and resulted in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding for instance event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  16. Fumonisin exposure in women linked to inhibition of an enzyme that is a key event in farm and laboratory animal diseases.

    Fumonisin B1 (FB1) is a toxic chemical produced by molds. The molds that produce fumonisin are common in corn. Consumption of contaminated corn by farm animals has been shown to be the cause of animal disease. The proximate cause (key event) in the induction of diseases in animals is inhibition of t...

  17. EVNTRE, Code System for Event Progression Analysis for PRA

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions for the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases of time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. 
Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
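
The Monte Carlo sampling mode described above can be sketched compactly: draw each branch-point split fraction from an input distribution, propagate it through a small tree, and collect the distribution of an outcome frequency. The two-question tree, the Beta distributions, and the initiating frequency below are arbitrary assumptions, not EVNTRE inputs.

```python
# EVNTRE-style Monte Carlo sketch: sample branch-point split fractions
# from input distributions, propagate through a tiny two-question event
# tree, and summarize the resulting outcome-frequency distribution.
# All distributions and frequencies are invented for illustration.
import random

random.seed(1)                 # reproducible demonstration
INIT_FREQ = 1e-3               # assumed initiating frequency
samples = []
for _ in range(10_000):
    p_fail_1 = random.betavariate(2, 50)   # question 1 failure fraction
    p_fail_2 = random.betavariate(2, 20)   # question 2 failure fraction
    samples.append(INIT_FREQ * p_fail_1 * p_fail_2)

samples.sort()
mean = sum(samples) / len(samples)
p95 = samples[int(0.95 * len(samples))]
print(f"worst-outcome frequency: mean {mean:.2e}, 95th pct {p95:.2e}")
```

A post-processor in the spirit of PSTEVNT would then bin these sampled pathway frequencies by outcome attributes and tabulate summary statistics.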

  18. Incident sequence analysis; event trees, methods and graphical symbols

    1980-11-01

    When analyzing incident sequences, unwanted events resulting from a certain cause are looked for. Graphical symbols and explanations of graphical representations are presented. The method applies to the analysis of incident sequences in all types of facilities. By means of the incident sequence diagram, incident sequences, i.e. the logical and chronological course of repercussions initiated by the failure of a component or by an operating error, can be presented and analyzed simply and clearly

  19. Analysis of operation events for HFETR emergency diesel generator set

    Li Zhiqiang; Ji Xifang; Deng Hong

    2015-01-01

    By statistical analysis of the historical failure data of the emergency diesel generator set, the specific mode, the attributes, and the direct and root origins of each failure are reviewed and summarized. Considering the current status of the emergency diesel generator set, preventive measures and solutions in terms of operation, handling and maintenance are proposed, and potential events for the emergency diesel generator set are analyzed. (authors)

  20. Practical guidance for statistical analysis of operational event data

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies
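
For independent failure modes, the moment formulas mentioned above are simple: the combined mean is the sum of the per-mode means, and the variance of the combined estimate is the sum of the per-mode variances. The sketch below applies this to invented per-mode unavailability estimates.

```python
# Moment formulas for combining independent failure-mode estimates:
# mean of the sum = sum of means; variance of the sum = sum of
# variances (independence assumed). Per-mode numbers are illustrative.

modes = {
    "fails to start":      {"mean": 3e-3, "var": 4e-7},
    "fails to run":        {"mean": 1e-3, "var": 1e-7},
    "out for maintenance": {"mean": 5e-4, "var": 2e-8},
}

total_mean = sum(m["mean"] for m in modes.values())
total_var = sum(m["var"] for m in modes.values())
total_sd = total_var ** 0.5
print(f"combined unavailability: mean {total_mean:.2e}, sd {total_sd:.2e}")
```

Which failure modes belong in the combination is itself a modeling choice the report discusses; correlated modes would also need covariance terms, which this sketch omits.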

  1. Root cause analysis of critical events in neurosurgery, New South Wales.

    Perotti, Vanessa; Sheridan, Mark M P

    2015-09-01

    Adverse events reportedly occur in 5% to 10% of health care episodes. Not all adverse events are the result of error; they may arise from systemic faults in the delivery of health care. Catastrophic events are not only physically devastating to patients, but they also attract medical liability and increase health care costs. Root cause analysis (RCA) has become a key tool for health care services to understand those adverse events. This study is a review of all the RCA case reports involving neurosurgical patients in New South Wales between 2008 and 2013. The case reports and data were obtained from the Clinical Excellence Commission database. The data was then categorized by the root causes identified and the recommendations suggested by the RCA committees. Thirty-two case reports were identified in the RCA database. Breaches in policy account for the majority of root causes identified, for example, delays in transfer of patients or wrong-site surgery, which always involved poor adherence to correct patient and site identification procedures. The RCA committees' recommendations included education for staff, and improvements in rostering and procedural guidelines. RCAs have improved the patient safety profile; however, the RCA committees have no power to enforce any recommendation or ensure compliance. A single RCA may provide little learning beyond the unit and staff involved. However, through aggregation of RCA data and dissemination strategies, health care workers can learn from adverse events and prevent future events from occurring. © 2015 Royal Australasian College of Surgeons.

  2. Performance Analysis: Work Control Events Identified January - August 2010

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded

  3. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan human reliability analysis (IJS-HRA) and standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance

  4. Comparison of methods for dependency determination between human failure events within human reliability analysis

Cepin, M.

    2007-01-01

The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on dependency determination, comparing the Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  5. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

The impact of extreme events on crude oil markets is of great importance in crude oil price analysis because those events generally exert a strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)
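
    The fine-to-coarse decomposition idea in this record can be sketched in a few lines. The snippet below is a simplified stand-in for EMD (moving-average smoothing in place of true sifting with spline envelopes), applied to a synthetic price series with one step-like event; all window sizes and numbers are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def coarse_decompose(x, windows=(5, 21, 61)):
    """Split a series into fine-to-coarse components plus a residual trend.

    Moving-average smoothing stands in for EMD's sifting here; the window
    sizes are illustrative assumptions, not the paper's settings.
    """
    modes, residual = [], np.asarray(x, float)
    for w in windows:
        trend = np.convolve(residual, np.ones(w) / w, mode="same")
        modes.append(residual - trend)  # fast fluctuation removed at this scale
        residual = trend                # slower remainder passes to next level
    return modes, residual

# Synthetic "price" series with a step-like extreme event at t = 200.
rng = np.random.default_rng(0)
t = np.arange(400)
price = 20 + 0.01 * t + rng.normal(0, 0.5, t.size)
price[200:] += 8.0

modes, trend = coarse_decompose(price)
# The event's impact concentrates in the slow trend, not the fine noise modes:
impact = trend[250:300].mean() - trend[100:150].mean()
print(f"estimated event impact: {impact:.1f}")
```

    As in the record's finding, the event's contribution shows up in the slow components (here the residual trend), while the fine modes retain only noise.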

  6. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

The impact of extreme events on crude oil markets is of great importance in crude oil price analysis because those events generally exert a strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)

  7. Interactive analysis of human error factors in NPP operation events

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

Interactions of human error factors in NPP operation events are introduced, and 645 WANO operation event reports from 1999 to 2008 were analyzed, among which 432 were found to involve human errors. After classifying these errors by root causes or causal factors and applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors; forming good personnel work practices is systematic work that needs support in many respects. (2) Verbal communications, personnel work practices, man-machine interface, and written procedures and documents play great roles. They are four interacting factors that often come as a bundle: if improvements are to be made to one of them, synchronous measures are also necessary for the others. (3) Management direction and decision process, which are related to management, have a significant interaction with personnel factors. (authors)

  8. Detection of Abnormal Events via Optical Flow Feature Analysis

    Tian Wang

    2015-03-01

Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differences among abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.
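
    For readers unfamiliar with the descriptor named in this record, a minimal histogram of optical flow orientation can be sketched as follows; the bin count and magnitude threshold are illustrative assumptions, and the one-class SVM / kernel PCA classification stage is omitted.

```python
import numpy as np

def hofo(flow_u, flow_v, n_bins=8, mag_thresh=0.5):
    """Histogram of optical flow orientation over a frame's flow field.

    Orientations of sufficiently large flow vectors are binned over [0, 2*pi),
    weighted by magnitude, then L1-normalised. Bin count and threshold are
    illustrative assumptions, not the paper's exact settings.
    """
    mag = np.hypot(flow_u, flow_v)
    ang = np.arctan2(flow_v, flow_u) % (2 * np.pi)
    mask = mag > mag_thresh                       # drop near-static pixels
    hist, _ = np.histogram(ang[mask], bins=n_bins,
                           range=(0, 2 * np.pi), weights=mag[mask])
    total = hist.sum()
    return hist / total if total > 0 else hist

# A frame whose flow points uniformly rightward concentrates in bin 0.
u, v = np.ones((4, 4)), np.zeros((4, 4))
print(hofo(u, v)[0])  # -> 1.0
```

    A per-frame descriptor like this is what a one-class classifier would then score against the learned model of normal behavior.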

  9. Detection of Abnormal Events via Optical Flow Feature Analysis

    Wang, Tian; Snoussi, Hichem

    2015-01-01

In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differences among abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227

  10. Vulnerability analysis of a PWR to an external event

    Aruety, S.; Ilberg, D.; Hertz, Y.

    1980-01-01

The vulnerability of a Nuclear Power Plant (NPP) to external events is affected by several factors, such as: the degree of redundancy of the reactor systems, subsystems and components; the separation of systems provided in the general layout; the extent of the vulnerable area, i.e., the area which upon being affected by an external event will result in system failure; and the time required to repair or replace the systems, when allowed. The present study offers a methodology, using Probabilistic Safety Analysis, to evaluate the relative importance of the above parameters in reducing the vulnerability of reactor safety systems. Several safety systems of typical PWRs are analyzed as examples. It was found that the degree of redundancy and physical separation of the systems has the most prominent effect on the vulnerability of the NPP

  11. Analysis of manufacturing based on object oriented discrete event simulation

    Eirik Borgen

    1990-01-01

Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed, and this can be done before the changes are made, and without disturbing the real system. Unlike other tools for analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late at one machine has on the remaining machines in its route through the layout. It is these effects that cause production plans never to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages for business graphics and statistical functions is convenient in the result presentation phase.
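
    The event-list mechanism described for SIMMEK is the core of any discrete event simulator and fits in a short sketch. The toy model below simulates one machine with a FIFO queue, driven by a time-ordered heap as the event list; it is not SIMMEK itself, and all parameters are illustrative assumptions.

```python
import heapq
import random

def simulate(n_jobs=2000, mean_interarrival=1.0, mean_service=0.8, seed=1):
    """Toy discrete-event simulation of one machine with a FIFO queue.

    A time-ordered event list (heap) drives the simulation clock, echoing
    the event-list mechanism described for SIMMEK; returns the average
    queueing delay per job. All parameters are illustrative.
    """
    rng = random.Random(seed)
    events, t = [], 0.0
    for j in range(n_jobs):                      # schedule all arrivals
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, 0, "arrival", j))
    queue, machine_free, waits = [], True, {}
    seq = n_jobs                                 # unique tie-breaker for the heap
    while events:
        now, _, kind, j = heapq.heappop(events)
        if kind == "arrival":
            queue.append((now, j))
        else:                                    # departure frees the machine
            machine_free = True
        if machine_free and queue:
            arrived, k = queue.pop(0)
            waits[k] = now - arrived             # queueing delay for job k
            machine_free = False
            seq += 1
            heapq.heappush(events,
                           (now + rng.expovariate(1.0 / mean_service),
                            seq, "departure", k))
    return sum(waits.values()) / len(waits)

avg_wait = simulate()
print(f"average queueing delay: {avg_wait:.2f}")
```

    Because uncertainty in arrival and service times is sampled rather than averaged away, jobs queue up even though the machine is only 80% utilised on average, which is exactly the kind of effect the abstract says simulation captures and static tools miss.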

  12. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
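
    The inlier/outlier segmentation in this record can be illustrated with a much-simplified sketch: instead of the paper's statistical models per subsequence, plain mean/std features and a Gaussian affinity are used, and the dominant eigenvector of the affinity matrix supplies the background weighting. Window size, stride, and the affinity bandwidth are all illustrative assumptions.

```python
import numpy as np

def outlier_scores(series, win=20):
    """Rank fixed-length subsequences of a time series as background vs outlier.

    Windows are compared with a Gaussian affinity on a crude (mean, std)
    feature distance; the dominant eigenvector of the affinity matrix weights
    windows by how well they fit the common background, so a low score marks
    an outlier. A simplified stand-in for the paper's model-based affinities.
    """
    x = np.asarray(series, float)
    wins = np.array([x[i:i + win] for i in range(0, len(x) - win + 1, win)])
    feats = np.column_stack([wins.mean(1), wins.std(1)])
    d2 = ((feats[:, None, :] - feats[None, :, :]) ** 2).sum(-1)
    A = np.exp(-d2 / (d2.mean() + 1e-12))        # affinity matrix
    vals, vecs = np.linalg.eigh(A)
    v = np.abs(vecs[:, -1])                      # dominant eigenvector
    return v / v.max()                           # 1.0 = background-like

rng = np.random.default_rng(3)
sig = rng.normal(0, 1, 400)
sig[200:220] += 10.0                             # one "highlight" burst
scores = outlier_scores(sig)
print(int(np.argmin(scores)))                    # -> 10 (the burst window)
```

    Ranking windows by their departure from the dominant background, as the lowest scores do here, is the step that lets the framework produce summaries of any desired length.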

  13. Bisphosphonates and risk of cardiovascular events: a meta-analysis.

    Dae Hyun Kim

Full Text Available Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Absolute risks over 25-36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84-1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92-1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69-1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82-1.19]; I2 = 5.8%), and CV death (14 trials; 0.88 [0.72-1.07]; I2 = 0.0%) with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96-1.61]; I2 = 0.0%), not for oral bisphosphonates (26 trials; 1.02 [0.83-1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large reduction in fractures with bisphosphonates, changes in

  14. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents-food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework-the Key Events Dose-Response Framework (KEDRF)-for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  15. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others

    2003-07-01

This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). The purpose of Level I PSA is to delineate the accident sequences leading to core damage and to estimate their frequencies. It has been directly used for assessing and modifying system safety and reliability as a key, foundational part of PSA. Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report consists of six major procedural steps for Level I PSA: familiarization with the plant, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level I PSA for NPPs. A particular aim is to promote a standardized framework, terminology and form of documentation for PSAs. On the other hand, this report would be useful for managers or regulatory persons involved in risk-informed regulation, and also for conducting PSA for other industries.

  16. Analysis of warm convective rain events in Catalonia

    Ballart, D.; Figuerola, F.; Aran, M.; Rigo, T.

    2009-09-01

Between the end of September and November, events with high amounts of rainfall are quite common in Catalonia. The high sea surface temperature of the Mediterranean Sea near the Catalan coast is one of the most important factors that help the development of this type of storm. Some of these events have particular characteristics: elevated rain rates during short time periods, not very deep convection, and low lightning activity. Consequently, the use of remote sensing tools for surveillance is quite limited. The high rain efficiency is caused by internal mechanisms of the clouds and by the air mass in which the precipitation structure develops. As mentioned, the contribution of the sea to the air mass is very relevant, not only through the increase of large condensation nuclei but also through the high temperature of the low layers of the atmosphere, which allows clouds with 5 or 6 km of particles in the liquid phase; in fact, the freezing level in these clouds can be found near -15ºC. Due to these characteristics, this type of rain structure can produce large quantities of rainfall in a relatively brief period of time, and, if quasi-stationary, precipitation values at the surface can be very important. From the point of view of remote sensing tools, the nature of these clouds means that the tools and methodologies commonly used for the analysis of heavy rain events are not useful, for the following reasons: lightning is rarely observed, cloud-top temperatures are not cold enough to be enhanced in satellite imagery, and radar reflectivity values are lower than in other heavy rain cases. The third point to take into account is the vulnerability of the affected areas. A large percentage of the Catalan population lives in the coastal region. In the central coast of Catalonia, the urban areas are surrounded by a not very high mountain range with small basins and

  17. Analysis of mechanics of verbal manipulation with key words of social vocabulary exemplified in journalistic article

    Наталья Александровна Бубнова

    2012-03-01

Full Text Available The article analyses the mechanism of speech manipulation of readers' consciousness by means of socially marked key words forming four concept groups: power, nation, wealth, and poverty (based on the material of a journalistic article).

  18. Analysis of selected methods for the recovery of encrypted WEP key

    Wójtowicz, Sebastian; Belka, Radosław

    2014-11-01

This paper deals with selected WEP (Wired Equivalent Privacy) key decryption methods based on the aircrack-ng software embedded in the Backtrack operating system (a Linux distribution). The weakness of 64-bit (40-bit) and 128-bit (104-bit) keys encrypted with the RC4 cipher is shown. The research was carried out in different network environments. In this work we compared different types of keys to check how strong the RC4 stream cipher can be. The 40-bit and 104-bit WEP keys were tested on an IEEE 802.11 based wireless LAN using a laptop with a live-CD Linux operating system. A short analysis of key creation methods was performed to compare the amount of time necessary to decrypt random and non-random WEP keys.
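
    For context on why WEP keys fall to such attacks, here is a minimal RC4 (the stream cipher underlying WEP) in pure Python. The IV and key bytes are made-up examples, and this sketch performs no key recovery itself; it only shows that the keystream is a pure function of the IV-plus-secret key, the property the aircrack-ng family of attacks exploits.

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """Minimal RC4: key-scheduling algorithm (KSA) then keystream XOR (PRGA).

    Because encryption and decryption are the same XOR, reusing a key + IV
    pair reuses the keystream, one of WEP's core weaknesses.
    """
    S = list(range(256))
    j = 0
    for i in range(256):                         # KSA: key-dependent shuffle
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for byte in data:                            # PRGA: generate and XOR
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

# WEP-style keying: a public 24-bit IV prepended to a 40-bit secret
# (both values here are arbitrary examples).
iv, secret = b"\x01\x02\x03", b"\x0a" * 5
ct = rc4(iv + secret, b"attack at dawn")
print(rc4(iv + secret, ct))                      # -> b'attack at dawn'
```

    Since the IV travels in cleartext with each frame, an attacker who collects enough frames with related IVs can work back toward the fixed secret portion of the key.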

  19. Cadmium-induced immune abnormality is a key pathogenic event in human and rat models of preeclampsia.

    Zhang, Qiong; Huang, Yinping; Zhang, Keke; Huang, Yanjun; Yan, Yan; Wang, Fan; Wu, Jie; Wang, Xiao; Xu, Zhangye; Chen, Yongtao; Cheng, Xue; Li, Yong; Jiao, Jinyu; Ye, Duyun

    2016-11-01

With increased industrial development, cadmium is an increasingly important environmental pollutant. Studies have identified various adverse effects of cadmium on human beings. However, the relationships between cadmium pollution and the pathogenesis of preeclampsia remain elusive. The objective of this study is to explore the effects of cadmium on the immune system in preeclamptic patients and rats. The results showed that the cadmium levels in the peripheral blood of preeclamptic patients were significantly higher than those observed in normal pregnancy. Based on this, a novel rat model of preeclampsia was established by the intraperitoneal administration of cadmium chloride (CdCl2) (0.125 mg of Cd/kg body weight) on gestational days 9-14. Key features of preeclampsia, including hypertension, proteinuria, placental abnormalities and small foetal size, appeared in pregnant rats after the administration of low-dose CdCl2. Cadmium increased immunoglobulin production, mainly angiotensin II type 1-receptor-agonistic autoantibodies (AT1-AA), by increasing the expression of activation-induced cytosine deaminase (AID) in B cells. AID is critical for the maturation of antibody and autoantibody responses. In addition, AT1-AA, which has emerged recently as a potential pathogenic contributor to preeclampsia, was responsible for the deposition of complement component 5 (C5) in the kidneys of pregnant rats via angiotensin II type 1 receptor (AT1R) activation. C5a is a fragment of C5 that is released during C5 activation. Selectively interfering with C5a signalling with a complement C5a receptor-specific antagonist significantly attenuated hypertension and proteinuria in Cd-injected pregnant rats. Our results suggest that cadmium induces immune abnormalities that may be a key pathogenic contributor to preeclampsia, and provide new insights into treatment strategies for preeclampsia. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Analysis and Modelling of Taste and Odour Events in a Shallow Subtropical Reservoir

    Edoardo Bertone

    2016-08-01

Full Text Available Understanding and predicting Taste and Odour events is as difficult as it is critical for drinking water treatment plants. Following a number of events in recent years, a comprehensive statistical analysis of data from Lake Tingalpa (Queensland, Australia) was conducted. Historical manual sampling data, as well as data remotely collected by a vertical profiler, were collected; regression analysis and self-organising maps were then used to determine correlations between Taste and Odour compounds and potential input variables. Results showed that the predominant Taste and Odour compound was geosmin. Although one of the main predictors was the occurrence of cyanobacteria blooms, it was noticed that the cyanobacteria species was also critical. Additionally, water temperature, reservoir volume and oxidised nitrogen availability were key inputs determining the occurrence and magnitude of the geosmin peak events. Based on the results of the statistical analysis, a predictive regression model was developed to provide indications of the potential occurrence, and magnitude, of peaks in geosmin concentration. Additionally, it was found that the blue green algae probe of the lake’s vertical profiler has the potential to be used as one of the inputs for an automated geosmin early warning system.

  1. Analysis of the stability of events occurred in Laguna Verde

    Castillo D, R.; Ortiz V, J.; Calleros M, G.

    2005-01-01

New fuel designs for longer operating cycles have larger regions of uncertainty than the old fuels and can therefore undergo power oscillations when an event causes the reactor to operate at high power and low coolant flow. During reactor start-up, procedures are followed that prevent oscillations, ensuring stable reactor behavior. However, when the reactor is operating at nominal conditions and the recirculation pumps trip or are transferred to low speed, it cannot be guaranteed that the reactor will not present power oscillations on entering the restricted operating regions. Stability analysis methods commonly use neutronic noise signals, which must be stationary; but after a transient in which the recirculation pumps are typically lost, the signals do not have the required characteristics, so they are used with a certain level of uncertainty given the limited validity of the models. In this work the Prony method is used to determine reactor stability from transient signals, and it is compared with autoregressive models. Four events that occurred at the Laguna Verde power plant, with the reactor in the region of high power and low coolant flow, are analyzed, giving satisfactory results. (Author)
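
    The stability measure at stake in such analyses, the decay ratio, can be illustrated with a minimal autoregressive fit (a simple cousin of the Prony and AR methods the record compares). The AR(2) model, pole extraction, and test signal below are illustrative assumptions, not the plant's data or the authors' exact algorithm.

```python
import numpy as np

def decay_ratio(signal):
    """Estimate a decay ratio from an oscillatory signal via an AR(2) fit.

    Fits x[n] = a1*x[n-1] + a2*x[n-2] by least squares, takes the dominant
    complex pole pair r*exp(+/- i*theta), and returns r**(2*pi/theta): the
    factor by which oscillation amplitude changes per period (< 1 is stable).
    """
    x = np.asarray(signal, float)
    A = np.column_stack([x[1:-1], x[:-2]])
    a1, a2 = np.linalg.lstsq(A, x[2:], rcond=None)[0]
    poles = np.roots([1.0, -a1, -a2])
    r, theta = np.abs(poles[0]), abs(np.angle(poles[0]))
    return r ** (2 * np.pi / theta)

# Damped oscillation whose amplitude halves each period -> decay ratio ~0.5.
n = np.arange(500)
x = (0.5 ** (n / 50)) * np.cos(2 * np.pi * n / 50)
print(round(decay_ratio(x), 2))  # -> 0.5
```

    The practical difficulty the abstract points to is that after a pump trip the signal is non-stationary, so a single AR fit like this one carries extra uncertainty; Prony-type methods fit the damped exponentials directly to cope with transient data.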

  2. Liquefaction induced by modern earthquakes as a key to paleoseismicity: A case study of the 1988 Saguenay event

    Tuttle, M.; Cowie, P.; Wolf, L.

    1992-01-01

    Liquefaction features, including sand dikes, sills, and sand-filled craters, that formed at different distances from the epicenter of the 1988 (Mw 5.9) Saguenay earthquake are compared with one another and with older features. Modern liquefaction features decrease in size with increasing distance from the Saguenay epicenter. This relationship suggests that the size of liquefaction features may be used to determine source zones of past earthquakes and to estimate attenuation of seismic energy. Pre-1988 liquefaction features are cross-cut by the 1988 features. Although similar in morphology to the modern features, the pre-1988 features are more weathered and considerably larger in size. The larger pre-1988 features are located in the Ferland area, whereas the smallest pre-1988 feature occurs more than 37 km to the southwest. This spatial distribution of different size features suggests that an unidentified earthquake source zone (in addition to the one that generated the Saguenay earthquake) may exist in the Laurentide-Saguenay region. Structural relationships of the liquefaction features indicate that one, possibly two, earthquakes induced liquefaction in the region prior to 1988. The age of only one pre-1988 feature is well-constrained at 340 ± 70 radiocarbon years BP. If the 1663 earthquake was responsible for the formation of this feature, this event may have been centered in the Laurentide-Saguenay region rather than in the Charlevoix seismic zone

  3. Formal Analysis of BPMN Models Using Event-B

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.

  4. Root cause analysis for fire events at nuclear power plants

    1999-09-01

Fire hazard has been identified as a major contributor to a plant's operational safety risk. The international nuclear power community (regulators, operators, designers) has been studying and developing tools for defending against this hazard. Considerable advances have been achieved during the past two decades in design and regulatory requirements for fire safety, fire protection technology, and related analytical techniques. The IAEA endeavours to provide assistance to Member States in improving fire safety in nuclear power plants. A task was launched by the IAEA in 1993 with the purpose of developing guidelines and good practices, promoting advanced fire safety assessment techniques, exchanging state-of-the-art information, and providing engineering safety advisory services and training in the implementation of internationally accepted practices. This TECDOC addresses the systematic assessment of fire events using the root cause analysis methodology, which is recognized as an important element of fire safety assessment

  5. Differential Fault Analysis on CLEFIA with 128, 192, and 256-Bit Keys

    Takahashi, Junko; Fukunaga, Toshinori

    This paper describes a differential fault analysis (DFA) attack against CLEFIA. The proposed attack can be applied to CLEFIA with all supported keys: 128, 192, and 256-bit keys. DFA is a type of side-channel attack. This attack enables the recovery of secret keys by injecting faults into a secure device during its computation of the cryptographic algorithm and comparing the correct ciphertext with the faulty one. CLEFIA is a 128-bit blockcipher with 128, 192, and 256-bit keys developed by the Sony Corporation in 2007. CLEFIA employs a generalized Feistel structure with four data lines. We developed a new attack method that uses this characteristic structure of the CLEFIA algorithm. On the basis of the proposed attack, only 2 pairs of correct and faulty ciphertexts are needed to retrieve the 128-bit key, and 10.78 pairs on average are needed to retrieve the 192 and 256-bit keys. The proposed attack is more efficient than any previously reported. In order to verify the proposed attack and estimate the calculation time to recover the secret key, we conducted an attack simulation using a PC. The simulation results show that we can obtain each secret key within three minutes on average. This result shows that we can obtain the entire key within a feasible computational time.

  6. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    2016-03-24

    ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION (AFIT-ENV-MS-16-M-166, Erich W)

  7. Depletion of Key Meiotic Genes and Transcriptome-Wide Abiotic Stress Reprogramming Mark Early Preparatory Events Ahead of Apomeiotic Transition

    Jubin N Shah

    2016-10-01

    Full Text Available Molecular dissection of apomixis - an asexual reproductive mode - is anticipated to solve the enigma of loss of meiotic sex and to help fix elite agronomic traits. The Brassicaceae genus Boechera comprises both sexual and apomictic species, permitting comparative analyses of meiotic circumvention (apomeiosis) and parthenogenesis. Whereas previous studies reported local transcriptome changes during these events, it remained unclear whether global changes associated with the hybridization, polyploidy and environmental adaptation that arose during the evolution of Boechera might hint at (epi)genetic regulators of early development prior to apomictic initiation. To identify such signatures during vegetative stages, we compared seedling RNA-seq transcriptomes of an obligate triploid apomict and a diploid sexual, both isolated from a drought-prone habitat. Uncovered were several genes differentially expressed between sexual and apomictic seedlings, including homologues of the meiotic genes ASYNAPTIC 1 (ASY1) and MULTIPOLAR SPINDLE 1 (MPS1) that were down-regulated in apomicts. An intriguing class of apomict-specific deregulated genes included several NAC transcription factors, homologues of which are known to be transcriptionally reprogrammed during abiotic stress in other plants. Deregulation of both meiotic and stress-response genes during seedling stages might be important in preparation for meiotic circumvention, as a similar transcriptional alteration was discernible in apomeiotic floral buds too. Furthermore, we noted that the apomict showed better tolerance to osmotic stress in vitro than the sexual, in conjunction with significant upregulation of a subset of NAC genes. In support of the current model that DNA methylation epigenetically regulates stress, ploidy, hybridization and apomixis, we noted that ASY1, MPS1 and NAC019 homologues were deregulated in Boechera seedlings upon DNA demethylation, and ASY1 in particular seems to be repressed by

  8. Six key elements' analysis of FAC effective management in nuclear power plant

    Zhong Zhaojiang; Chen Hanming

    2010-01-01

    Corporate Commitment, Analysis, Operating Experience, Inspection, Training and Engineering Judgment, and Long-Term Strategy are the six key elements of effective FAC management in a nuclear power plant. Corporate commitment is the economic basis of FAC management and underpins the management system; analysis is the means of optimizing and refining the FAC programme; operating experience provides reference and complementary input; inspection is the basis for accumulating FAC data; training and engineering judgment provide technical complementarity and depth; and a long-term strategy is the key to successful FAC management. The six key elements supplement each other and together make up a complete system of effective FAC management. For present FAC management in our national nuclear power plants, the six key elements are the core and bring out the best in each other, serving to establish an effective FAC management system and prevent serious FAC occurrences. (authors)

  9. Climate Central World Weather Attribution (WWA) project: Real-time extreme weather event attribution analysis

    Haustein, Karsten; Otto, Friederike; Uhe, Peter; Allen, Myles; Cullen, Heidi

    2015-04-01

    Extreme weather detection and attribution analysis has emerged as a core theme in climate science over the last decade or so. By using a combination of observational data and climate models it is possible to identify the role of climate change in certain types of extreme weather events, such as sea level rise and its contribution to storm surges, extreme heat events, droughts, and heavy rainfall and flood events. These analyses are usually carried out after an extreme event has occurred, when reanalysis and observational data become available. The Climate Central WWA project will exploit the increasing forecast skill of seasonal forecast prediction systems such as the UK Met Office GloSea5 (Global seasonal forecasting system) ensemble forecasting method. This way, the current weather can be fed into climate models to simulate large ensembles of possible weather scenarios before an event has fully emerged. This effort runs along parallel and intersecting tracks of science and communications that involve research, message development and testing, staged socialization of attribution science with key audiences, and dissemination. The method we employ uses a very large ensemble of regional climate model simulations to run two different analyses: one representing the current climate as it was observed, and one representing the same events in the world that might have been without human-induced climate change. For the weather "as observed" experiment, the atmospheric model uses observed sea surface temperature (SST) data from GloSea5 (currently) and present-day atmospheric gas concentrations to simulate weather events that are possible given the observed climate conditions. The weather in the "world that might have been" experiments is obtained by removing the anthropogenic forcing from the observed SSTs, thereby simulating a counterfactual world without human activity. The anthropogenic forcing is obtained by comparing the CMIP5 historical and natural simulations.
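The comparison between the "as observed" and "world that might have been" ensembles boils down to an exceedance-probability ratio. A minimal sketch of that statistic, using synthetic ensembles and an illustrative threshold (none of these numbers come from the WWA system):

```python
# Core attribution statistic: estimate the probability of exceeding an
# event threshold in a factual and a counterfactual ensemble, then form
# the probability ratio PR and the fraction of attributable risk FAR.
# All numbers below are synthetic illustrations.
import random

random.seed(42)
threshold = 35.0  # e.g. a heat index; illustrative value

factual = [random.gauss(33.0, 1.5) for _ in range(10000)]         # with forcing
counterfactual = [random.gauss(32.0, 1.5) for _ in range(10000)]  # forcing removed

p1 = sum(x > threshold for x in factual) / len(factual)
p0 = sum(x > threshold for x in counterfactual) / len(counterfactual)

pr = p1 / p0            # how much more likely the event became
far = 1.0 - p0 / p1     # fraction of current risk attributable to forcing
print(f"P1={p1:.3f}  P0={p0:.3f}  PR={pr:.2f}  FAR={far:.2f}")
```

PR and FAR are the standard headline numbers of such analyses; in practice the exceedance probabilities come from the large regional-model ensembles described above rather than Gaussian toys.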

  10. Analysis of core damage frequency: Surry, Unit 1 internal events

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the accident sequence analysis of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 documents the risk of a selected group of nuclear power plants. The work performed and described here is an extensive revision of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The content and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.05E-5 per year, with a 95% upper bound of 1.34E-4 and a 5% lower bound of 6.8E-6 per year. Station blackout accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next dominant contributors were Loss of Coolant Accidents (LOCAs), whose sequences account for 15% of the core damage frequency. No other type of sequence accounts for more than 10% of the core damage frequency. 49 refs., 52 figs., 70 tabs

  11. Trend analysis of explosion events at overseas nuclear power plants

    Shimada, Hiroki

    2008-01-01

    We surveyed failures caused by disasters (e.g., severe storms, heavy rainfall, earthquakes, explosions and fires) which occurred during the 13 years from 1995 to 2007 at overseas nuclear power plants (NPPs), using the nuclear information database of the Institute of Nuclear Safety System, Incorporated (INSS). The results revealed that explosions were the second most frequent type of failure after fires. We conducted a trend analysis of these explosion events. The analysis by equipment, cause, and effect on the plant showed that the explosions occurred mainly at electrical facilities, and thus maintenance management of electrical facilities is essential for preventing explosions. In addition, it was shown that explosions at transformers and batteries, which have never occurred at Japan's NPPs, accounted for as much as 55% of all explosions. This suggests that the difference is attributable to differences in the maintenance methods for transformers (condition-based maintenance adopted by NPPs) and in the workforce organization for batteries (inspections performed by utilities' own maintenance workers at NPPs). (author)

  12. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and length of free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication related patient safety events and the models were compared. Well performing NLP models were generated to categorize medication related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
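As a rough illustration of the categorization task (not the paper's actual models, which were statistical NLP classifiers evaluated by ROC AUC), here is a toy keyword-scoring sketch over the four event types named above. The cue words per category are invented assumptions:

```python
# Toy stand-in for categorizing free-text safety event reports: score each
# category by counting assumed cue words. The real study used trained NLP
# models; this only illustrates the input/output shape of the task.
from collections import Counter

CATEGORY_CUES = {
    "pharmacy_delivery_delay": {"late", "delay", "delivery", "waiting"},
    "dispensing_error": {"wrong", "dose", "dispensed", "label"},
    "pyxis_discrepancy": {"pyxis", "count", "discrepancy", "cabinet"},
    "prescriber_error": {"order", "prescribed", "duplicate", "allergy"},
}

def classify(report: str) -> str:
    """Return the category whose cue words best match the report text."""
    tokens = Counter(report.lower().split())
    scores = {cat: sum(tokens[w] for w in cues)
              for cat, cues in CATEGORY_CUES.items()}
    return max(scores, key=scores.get)

print(classify("medication delivery was late and patient still waiting"))
# expected category: pharmacy_delivery_delay
```

A trained model replaces the hand-picked cues with weights learned from labeled reports, which is what allows the AUCs of 0.81 to 0.96 reported above.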

  13. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    Maria Cleofe Giorgino

    2017-11-01

    Full Text Available Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality of the information being disclosed. This article aims to address this gap by exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, the Integrated Report (IR). The distinguishing features of this tool are that it aims to represent the multidimensional impact of the organization's activity and that it assumes materiality as a guiding principle of report drafting. Adopting the event study methodology combined with a statistical significance test for categorical data, our results verify that an organization's release of an IR is able to produce a statistically significant impact on the related share prices. Moreover, the term "integrated" assigned to the reports plays a significant role in the impact on capital markets. Our findings have beneficial implications for both researchers and practitioners, adding new evidence on the usefulness of IR as a corporate disclosure tool and the effect of an organization's decision to disclose material information.
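The event-study methodology referenced above follows a standard recipe: fit a market model over an estimation window, then cumulate abnormal returns around the event date. A minimal sketch with synthetic daily returns (the window lengths and figures are illustrative, not from the study):

```python
# Event-study sketch: estimate r_stock = a + b * r_market by OLS over a
# pre-event window, then compute abnormal returns and their cumulative
# sum (CAR) over the event window. All returns below are synthetic.

def market_model(stock, market):
    """OLS fit of stock returns on market returns."""
    n = len(stock)
    mx = sum(market) / n
    my = sum(stock) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(market, stock))
    sxx = sum((x - mx) ** 2 for x in market)
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Estimation window (synthetic daily returns)
est_stock  = [0.001, -0.002, 0.003, 0.000, 0.002, -0.001, 0.001, 0.002]
est_market = [0.002, -0.001, 0.002, 0.001, 0.003, -0.002, 0.000, 0.001]
a, b = market_model(est_stock, est_market)

# Event window: days -1, 0, +1 around the report's release
event_stock  = [0.004, 0.006, 0.003]
event_market = [0.001, 0.000, 0.001]

abnormal = [rs - (a + b * rm) for rs, rm in zip(event_stock, event_market)]
car = sum(abnormal)  # cumulative abnormal return over the event window
print(f"alpha={a:.5f} beta={b:.3f} CAR={car:.4f}")
```

A significance test (the paper pairs the CARs with a categorical-data test) then decides whether the observed CAR is distinguishable from zero.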

  14. Pertussis outbreak in Polish shooters with adverse event analysis

    Monika Skrzypiec-Spring

    2017-04-01

    Full Text Available In addition to injuries, infections are the most common reason for giving up training altogether or reducing its volume and intensity, as well as for missed opportunities to participate in sports competitions. Nowadays, a slow but constant re-emergence of pertussis, especially among teenagers and young adults, including athletes, can be observed. This paper describes an outbreak of pertussis among professional Polish shooters, focusing on the transmission of Bordetella pertussis infection between members of the national team, its influence on performance capacity, and adverse event analysis. From 9 June 2015 to 31 July 2015, a total of 4 confirmed and suspected cases of pertussis were reported among members of the Polish Sport Shooting National Team, their relatives and acquaintances. Pertussis significantly decreased the exercise performance of the first athlete, a 35-year-old woman, interrupted her training, and finally resulted in failure to win a medal or quota place. Pertussis also significantly decreased the performance of the second athlete, a 25-year-old shooter. The other cases emerged in their families. Whooping cough is a real threat to athletes and should be prevented. Preventive measures include appropriate immunization, constant medical supervision, as well as early isolation, diagnostic tests and treatment of all infected sport team members. Regular administration of booster doses of the acellular pertussis vaccine (Tdap) every 5 years seems reasonable.

  15. Analysis of Multi Muon Events in the L3 Detector

    Schmitt, Volker

    2000-01-01

    The muon density distribution in air showers initiated by cosmic particles is sensitive to the chemical composition of cosmic rays. The density can be measured via the multiplicity distribution in a finite-size detector such as L3. With a shallow depth of 30 meters underground, the detector provides an excellent facility to measure a high muon rate while being shielded from the hadronic and electronic shower components. The subject of this thesis is the description of the L3 Cosmics experiment (L3+C), which has been taking data since May 1999, and the analysis of muon bundles in the large magnetic spectrometer of L3. The new cosmic trigger and readout system is briefly described. The influence of different primaries on the multiplicity distribution has been investigated using Monte Carlo event samples generated with the CORSIKA program. The simulation results showed that L3+C measures in the region of the "knee" of the primary spectrum of cosmic rays. A new pattern recognition has been developed and added to the reconstruction code, which ...

  16. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine and have contributed most to the relief of human misery and the exciting increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating mankind with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Based on worldwide vaccination, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices to human health, no vaccine is 100% safe for everyone because each person reacts to vaccinations differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  17. Civil protection and Damaging Hydrogeological Events: comparative analysis of the 2000 and 2015 events in Calabria (southern Italy

    O. Petrucci

    2017-11-01

    Full Text Available Calabria (southern Italy) is a flood-prone region, due to both its rough orography and the fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach to the comparative analysis of two events affecting the same area 15 years apart, by collecting all the qualitative and quantitative features useful to describe both rain and damage. The aim is to understand whether similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, which occurred between 8 and 10 September 2000, damaged 109 out of 409 municipalities of the region and killed 13 people in a campsite due to a flood. The second event, which occurred between 30 October and 1 November 2015, damaged 79 municipalities and killed one man due to a flood. The comparative analysis highlights that, although the exceptionality of the triggering daily rain was higher in the 2015 event, the damage caused by the 2000 event to both infrastructure and private property was higher, and it was strongly aggravated by the 13 flood victims. We conclude that, in the 2015 event, the management of the pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people endangered by landslides or floods, contributed to reducing the number of victims.

  18. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    Anderson, Dale [Los Alamos National Laboratory; Selby, Neil [AWE Blacknest

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event-screening hypothesis test (Fisher's and Tippett's tests). The standard error commonly used in the Ms:mb event-screening hypothesis test is not fully consistent with its physical basis. An improved standard error gives better agreement with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope ({beta} = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
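Fisher's method, named above as one way to combine single-phenomenology tests, is simple to state: under H0 the statistic X = -2 Σ ln p_i follows a chi-square distribution with 2k degrees of freedom. A sketch with illustrative p-values (the values are not from any real screening analysis):

```python
# Fisher's combined probability test: merge k independent p-values into
# one joint p-value. Uses the closed-form chi-square survival function
# available when the degrees of freedom are even (2k always is).
import math

def fisher_combine(pvalues):
    """Return Fisher's combined statistic and its chi-square d.o.f."""
    x = -2.0 * sum(math.log(p) for p in pvalues)
    return x, 2 * len(pvalues)

def chi2_sf_even(x, dof):
    """P(X > x) for chi-square with even dof: exp(-x/2) * sum (x/2)^i / i!."""
    k = dof // 2
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= (x / 2.0) / i
        total += term
    return math.exp(-x / 2.0) * total

# e.g. an Ms:mb screening p-value and a depth-based p-value (illustrative)
p_combined = chi2_sf_even(*fisher_combine([0.04, 0.10]))
print(f"combined p = {p_combined:.4f}")
```

Note that the combined p-value (about 0.026 here) is smaller than either input: two moderately discriminating phenomenologies reinforce each other, which is the motivation for joint screening.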

  19. Analysis of early initiating event(s) in radiation-induced thymic lymphomagenesis

    Muto, Masahiro; Ying Chen; Kubo, Eiko; Mita, Kazuei

    1996-01-01

    Since the T cell receptor rearrangement is a sequential process and unique to the progeny of each clone, we investigated the early initiating events in radiation-induced thymic lymphomagenesis by comparing the oncogenic alterations with the pattern of γ T cell receptor (TCR) rearrangements. We reported previously that after leukemogenic irradiation, preneoplastic cells developed, albeit infrequently, from thymic leukemia antigen-2 + (TL-2 + ) thymocytes. Limited numbers of TL-2 + cells from individual irradiated B10.Thy-1.1 mice were injected into B10.Thy-1.2 mice intrathymically, and the common genetic changes among the donor-type T cell lymphomas were investigated with regard to p53 gene and chromosome aberrations. The results indicated that some mutations in the p53 gene had taken place in these lymphomas, but there was no common mutation among the donor-type lymphomas from individual irradiated mice, suggesting that these mutations were late-occurring events in the process of oncogenesis. On the other hand, there were common chromosome aberrations or translocations such as trisomy 15, t(7F; 10C), t(1A; 13D) or t(6A; XB) among the donor-type lymphomas derived from half of the individual irradiated mice. This indicated that the aberrations/translocations, which occurred in single progenitor cells at the early T cell differentiation either just before or after γ T cell receptor rearrangements, might be important candidates for initiating events. In the donor-type lymphomas from the other half of the individual irradiated mice, microgenetic changes were suggested to be initial events and also might take place in single progenitor cells just before or right after γ TCR rearrangements. (author)

  20. Key Concept Identification: A Comprehensive Analysis of Frequency and Topical Graph-Based Approaches

    Muhammad Aman

    2018-05-01

    Full Text Available Automatic key concept extraction from text is a central challenge in information extraction, information retrieval and digital libraries, ontology learning, and text analysis. Statistical frequency and topical graph-based ranking are the two potentially powerful and leading kinds of unsupervised approaches devised to address the problem. To utilize the potential of these approaches and improve key concept identification, a comprehensive performance analysis of these approaches on datasets from different domains is needed. The objective of the study presented in this paper is to perform a comprehensive empirical analysis of selected frequency and topical graph-based algorithms for key concept extraction on three different datasets, in order to identify the major sources of error in these approaches. For the experimental analysis, we selected TF-IDF, KP-Miner and TopicRank. Three major sources of error, i.e., frequency errors, syntactic errors and semantic errors, and the factors that contribute to these errors are identified. Analysis of the results reveals that the performance of the selected approaches is significantly degraded by these errors. These findings can help us develop an intelligent solution for key concept extraction in the future.
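The frequency-based family compared in the study can be illustrated by TF-IDF itself (KP-Miner and TopicRank add candidate filtering and graph-based ranking on top). A minimal sketch over a toy corpus:

```python
# Minimal TF-IDF scoring sketch: rank a document's terms by
# term frequency weighted by inverse document frequency across a corpus.
# The three toy "documents" below are invented for illustration.
import math
from collections import Counter

docs = [
    "key concept extraction from text supports information retrieval",
    "graph based ranking ranks candidate concepts by topical graph",
    "frequency based statistics score terms by document frequency",
]

def tfidf_top_terms(doc_index, docs, k=3):
    """Return the k highest-scoring terms of one document."""
    tokens = [d.split() for d in docs]
    df = Counter(w for t in tokens for w in set(t))  # document frequency
    tf = Counter(tokens[doc_index])                  # term frequency
    n = len(docs)
    scores = {w: tf[w] * math.log(n / df[w]) for w in tf}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(tfidf_top_terms(0, docs))
```

The "frequency errors" identified in the paper are exactly the failure mode visible here: a term's score depends only on counts, so rare-but-unimportant words can outrank true key concepts.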

  1. Analysis of thermal fatigue events in light water reactors

    Okuda, Yasunori [Institute of Nuclear Safety System Inc., Seika, Kyoto (Japan)

    2000-09-01

    Thermal fatigue events, which may cause shutdown of nuclear power stations through through-wall cracks in pipes of the reactor coolant pressure boundary (RCPB), are reported by licensees in foreign countries as well as in Japan. In this paper, thermal fatigue events reported in the anomaly reports of light water reactors inside and outside of Japan are investigated. As a result, it is clarified that thermal fatigue events can be classified into seven patterns by their characteristics, and that the trend of event occurrence in PWRs (Pressurized Water Reactors) is more strongly correlated with operation hours than that in BWRs (Boiling Water Reactors). It is also concluded that precise identification of the locations where thermal fatigue occurs, and their monitoring, are important to prevent thermal fatigue events caused by aging or improper modification. (author)

  2. Transcriptome analysis reveals key differentially expressed genes involved in wheat grain development

    Yonglong Yu

    2016-04-01

    Full Text Available Wheat seed development is an important physiological process of seed maturation and directly affects wheat yield and quality. In this study, we performed dynamic transcriptome microarray analysis of an elite Chinese bread wheat cultivar (Jimai 20) during grain development using the GeneChip Wheat Genome Array. Grain morphology and scanning electron microscope observations showed that the period of 11–15 days post-anthesis (DPA) was a key stage for the synthesis and accumulation of seed starch. Genome-wide transcriptional profiling and significance analysis of microarrays revealed that the period from 11 to 15 DPA was more important than the 15–20 DPA stage for the synthesis and accumulation of nutritive reserves. Series test of cluster analysis of differential genes revealed five statistically significant gene expression profiles. Gene ontology annotation and enrichment analysis gave further information about differentially expressed genes, and MapMan analysis revealed expression changes within functional groups during seed development. Metabolic pathway network analysis showed that major and minor metabolic pathways regulate one another to ensure regular seed development and nutritive reserve accumulation. We performed gene co-expression network analysis to identify genes that play vital roles in seed development and identified several key genes involved in important metabolic pathways. The transcriptional expression of eight key genes involved in starch and protein synthesis and stress defense was further validated by qRT-PCR. Our results provide new insight into the molecular mechanisms of wheat seed development and the determinants of yield and quality.

  3. Internal event analysis of Laguna Verde Unit 1 Nuclear Power Plant. System Analysis

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the "Internal Event Analysis of Laguna Verde Unit 1 Nuclear Power Plant", CNSNS-TR-004, in five volumes. The reports are organized as follows: CNSNS-TR-004 Volume 1: Introduction and Methodology. CNSNS-TR-004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR-004 Volume 3: System Analysis. CNSNS-TR-004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR-004 Volume 5: Appendices A, B and C. This volume presents the results of the system analysis for the Laguna Verde Unit 1 Nuclear Power Plant. The system analysis involved the development of logical models for all the systems included in the accident sequence event tree headings, and for all the support systems required to operate the front-line systems. For the internal event analysis for Laguna Verde, 16 front-line systems and 5 support systems were included. Detailed fault trees were developed for most of the important systems. Simplified fault trees focusing on major faults were constructed for those systems that can be adequately represented using this kind of modeling. For those systems where fault tree models were not constructed, actual data were used to represent the dominant failures of the systems. The main failures included in the fault trees are hardware failures, test and maintenance unavailabilities, common cause failures, and human errors. The SETS and TEMAC codes were used to perform the qualitative and quantitative fault tree analyses. (Author)
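The fault-tree logic described above can be sketched in a few lines: basic-event probabilities are combined through AND/OR gates up to a top event. The gate structure and numbers below are illustrative assumptions, not taken from the Laguna Verde models:

```python
# Toy fault-tree evaluation: combine basic events (hardware failure,
# maintenance unavailability, human error) through OR gates per train,
# an AND gate across redundant trains, and an OR with a common-cause term.
# Independence is assumed; all probabilities are hypothetical.

def gate_or(*probs):
    """P(at least one input occurs), assuming independence."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def gate_and(*probs):
    """P(all inputs occur), assuming independence."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Basic events per train: hardware, test/maintenance outage, human error
pump_a = gate_or(1e-3, 5e-4, 2e-4)   # train A unavailable
pump_b = gate_or(1e-3, 5e-4, 2e-4)   # train B unavailable
common_cause = 1e-5                   # both trains fail from a shared cause

top = gate_or(gate_and(pump_a, pump_b), common_cause)
print(f"system unavailability = {top:.3e}")
```

Codes like SETS work symbolically on the same structure (minimal cut sets) rather than numerically gate by gate, but the quantification step reduces to this arithmetic.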

  4. Yucca Mountain Feature, Event, and Process (FEP) Analysis

    Freeze, G.

    2005-01-01

    A Total System Performance Assessment (TSPA) model was developed for the U.S. Department of Energy (DOE) Yucca Mountain Project (YMP) to help demonstrate compliance with applicable postclosure regulatory standards and support the License Application (LA). Two important precursors to the development of the TSPA model were (1) the identification and screening of features, events, and processes (FEPs) that might affect the Yucca Mountain disposal system (i.e., FEP analysis), and (2) the formation of scenarios from screened-in (included) FEPs to be evaluated in the TSPA model (i.e., scenario development). YMP FEP analysis and scenario development followed a five-step process: (1) Identify a comprehensive list of FEPs potentially relevant to the long-term performance of the disposal system. (2) Screen the FEPs using specified criteria to identify those FEPs that should be included in the TSPA analysis and those that can be excluded from the analysis. (3) Form scenarios from the screened-in (included) FEPs. (4) Screen the scenarios using the same criteria applied to the FEPs to identify any scenarios that can be excluded from the TSPA, as appropriate. (5) Specify the implementation of the scenarios in the computational modeling for the TSPA, and document the treatment of included FEPs. This paper describes the FEP analysis approach (Steps 1 and 2) for YMP, with a brief discussion of scenario formation (Step 3). Details of YMP scenario development (Steps 3 and 4) and TSPA modeling (Step 5) are beyond the scope of this paper. The identification and screening of the YMP FEPs was an iterative process based on site-specific information, design, and regulations. The process was iterative in the sense that there were multiple evaluation and feedback steps (e.g., separate preliminary, interim, and final analyses).
The initial YMP FEP list was compiled from an existing international list of FEPs from other radioactive waste disposal programs and was augmented by YMP site- and design

  5. Using Key Part-of-Speech Analysis to Examine Spoken Discourse by Taiwanese EFL Learners

    Lin, Yen-Liang

    2015-01-01

    This study reports on a corpus analysis of samples of spoken discourse between a group of British and Taiwanese adolescents, with the aim of exploring the statistically significant differences in the use of grammatical categories between the two groups of participants. The key word method extended to a part-of-speech level using the web-based…

  6. Analysis of Faraday Mirror in Auto-Compensating Quantum Key Distribution

    Wei Ke-Jin; Ma Hai-Qiang; Li Rui-Xue; Zhu Wu; Liu Hong-Wei; Zhang Yong; Jiao Rong-Zhen

    2015-01-01

    The ‘plug and play’ quantum key distribution system is the most stable and the earliest commercial system in the quantum communication field. The Jones matrix and Jones calculus are widely used in the analysis of this system and of its improved version, the auto-compensating quantum key distribution system. Unfortunately, the existing analysis has two drawbacks: only the auto-compensating process is analyzed, and existing treatments do not fully consider the laser phase affected by a Faraday mirror (FM). In this work, we present a detailed analysis, by Jones calculus, of the output of a light pulse transmitted in a plug and play quantum key distribution system that contains only an FM. A similar analysis is made of a home-made auto-compensating system which contains two FMs to compensate for environmental effects. More importantly, we show that theoretical and experimental results differ in the plug and play interferometric setup because the conventional Jones matrix of the FM neglects an additional phase π on the orthogonal polarization direction. To resolve this problem, we give a new Jones matrix for the FM based on coordinate rotation. This new Jones matrix not only resolves the above contradiction in the plug and play interferometric setup, but is also consistent with previous analyses of auto-compensating quantum key distribution. (paper)
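The effect of such a phase discrepancy can be illustrated with elementary Jones calculus. The two matrices below are assumptions for illustration (the conventional polarization-swap matrix and a variant carrying an extra π phase on one component), not the exact matrices derived in the paper:

```python
# Jones-calculus sketch: two candidate Faraday-mirror matrices that both
# swap H and V polarization, but differ by a pi phase on one component.
# On a basis state the difference is an unobservable global-looking sign;
# on a superposition state the outputs become orthogonal, which changes
# interference visibility. Matrices are illustrative assumptions.

FM_CONVENTIONAL = [[0, 1],
                   [1, 0]]    # swaps horizontal and vertical components
FM_WITH_PI_PHASE = [[0, -1],
                    [1, 0]]   # same swap, with an extra pi phase

def apply(m, v):
    """Apply a 2x2 Jones matrix to a Jones vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

h = [1, 0]                            # horizontal polarization
diag = [1 / 2 ** 0.5, 1 / 2 ** 0.5]   # 45-degree superposition

# Both matrices send H to (+/-)V: indistinguishable for a basis state...
print(apply(FM_CONVENTIONAL, h), apply(FM_WITH_PI_PHASE, h))

# ...but on the superposition the two outputs are orthogonal states.
out1 = apply(FM_CONVENTIONAL, diag)
out2 = apply(FM_WITH_PI_PHASE, diag)
overlap = abs(out1[0] * out2[0] + out1[1] * out2[1])
print(f"overlap = {overlap:.3f}")  # 0 means orthogonal
```

This is why the discrepancy only surfaces in the interferometric (plug and play) setup: interference compares superposed amplitudes, where the extra phase is no longer global.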

  7. Event-shape analysis: Sequential versus simultaneous multifragment emission

    Cebra, D.A.; Howden, S.; Karn, J.; Nadasen, A.; Ogilvie, C.A.; Vander Molen, A.; Westfall, G.D.; Wilson, W.K.; Winfield, J.S.; Norbeck, E.

    1990-01-01

    The Michigan State University 4π array has been used to select central-impact-parameter events from the reaction 40 Ar+ 51 V at incident energies from 35 to 85 MeV/nucleon. The event shape in momentum space is an observable which is shown to be sensitive to the dynamics of the fragmentation process. A comparison of the experimental event-shape distribution to sequential- and simultaneous-decay predictions suggests that a transition in the breakup process may have occurred. At 35 MeV/nucleon, a sequential-decay simulation reproduces the data. For the higher energies, the experimental distributions fall between the two contrasting predictions
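A standard way to quantify "event shape in momentum space" is the sphericity tensor built from fragment momenta; this is the generic construction, not necessarily the exact observable used in the MSU 4π analysis:

```python
# Event-shape sketch: build the normalized momentum tensor
# S_ab = sum_i p_ia * p_ib / sum_i |p_i|^2 over one event's fragments,
# then use its eigenvalues to characterize the shape.
import math

def momentum_tensor(momenta):
    s = [[0.0] * 3 for _ in range(3)]
    norm = sum(px * px + py * py + pz * pz for px, py, pz in momenta)
    for p in momenta:
        for a in range(3):
            for b in range(3):
                s[a][b] += p[a] * p[b] / norm
    return s

def eigenvalues_sym3(m):
    """Eigenvalues of a symmetric 3x3 matrix (trigonometric method)."""
    q = (m[0][0] + m[1][1] + m[2][2]) / 3.0
    p2 = sum((m[i][i] - q) ** 2 for i in range(3)) + \
         2.0 * (m[0][1] ** 2 + m[0][2] ** 2 + m[1][2] ** 2)
    p = math.sqrt(p2 / 6.0)
    if p < 1e-12:
        return [q, q, q]
    b = [[(m[i][j] - (q if i == j else 0.0)) / p for j in range(3)]
         for i in range(3)]
    detb = (b[0][0] * (b[1][1] * b[2][2] - b[1][2] * b[2][1])
            - b[0][1] * (b[1][0] * b[2][2] - b[1][2] * b[2][0])
            + b[0][2] * (b[1][0] * b[2][1] - b[1][1] * b[2][0]))
    phi = math.acos(max(-1.0, min(1.0, detb / 2.0))) / 3.0
    e1 = q + 2.0 * p * math.cos(phi)
    e3 = q + 2.0 * p * math.cos(phi + 2.0 * math.pi / 3.0)
    return sorted([e1, e3, 3.0 * q - e1 - e3], reverse=True)

# A jet-like (elongated) toy event: fragments mostly along the z axis
event = [(0.1, 0.0, 1.0), (-0.1, 0.1, 0.9), (0.0, -0.1, -1.1)]
lams = eigenvalues_sym3(momentum_tensor(event))
sphericity = 1.5 * (lams[1] + lams[2])  # 0 = rod-like, 1 = spherical
print(f"sphericity = {sphericity:.3f}")
```

Sequential decay tends to produce elongated (low-sphericity) events, while prompt multifragmentation populates more spherical shapes, which is what makes this distribution sensitive to the breakup dynamics.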

  8. Event tree analysis for the system of hybrid reactor

    Yang Yongwei; Qiu Lijian

    1993-01-01

    The application of probabilistic risk assessment for fusion-fission hybrid reactor is introduced. A hybrid reactor system has been analysed using event trees. According to the character of the conceptual design of Hefei Fusion-fission Experimental Hybrid Breeding Reactor, the probabilities of the event tree series induced by 4 typical initiating events were calculated. The results showed that the conceptual design is safe and reasonable. through this paper, the safety character of hybrid reactor system has been understood more deeply. Some suggestions valuable to safety design for hybrid reactor have been proposed

  9. Water resources and environmental input-output analysis and its key study issues: a review

    YANG, Z.; Xu, X.

    2013-12-01

    Used to study the material and energy flow in socioeconomic system, Input-Output Analysis(IOA) had been an effective analysis tool since its appearance. The research fields of Input-Output Analysis were increasingly expanded and studied in depth with the development of fundamental theory. In this paper, starting with introduction of theory development, the water resources input-output analysis and environmental input-output analysis had been specifically reviewed, and two key study issues mentioned as well. Input-Occupancy-Output Analysis and Grey Input-Output Analysis whose proposal and development were introduced firstly could be regard as the effective complements of traditional IOA theory. Because of the hypotheses of homogeneity, stability and proportionality, Input-Occupancy-Output Analysis and Grey Input-Output Analysis always had been restricted in practical application inevitably. In the applied study aspect, with investigation of abundant literatures, research of water resources input-output analysis and environmental input-output analysis had been comprehensively reviewed and analyzed. The regional water resources flow between different economic sectors had been systematically analyzed and stated, and several types of environmental input-output analysis models combined with other effective analysis tools concluded. In two perspectives in terms of external and inland aspect, the development of water resources and environmental input-output analysis model had been explained, and several typical study cases in recent years listed respectively. By the aid of sufficient literature analysis, the internal development tendency and study hotspot had also been summarized. In recent years, Chinese literatures reporting water resources consumption analysis and virtue water study had occupied a large share. Water resources consumption analysis had always been the emphasis of inland water resources IOA. Virtue water study had been considered as the new hotspot of

  10. Passage Key Inlet, Florida; CMS Modeling and Borrow Site Impact Analysis

    2016-06-01

    Impact Analysis by Kelly R. Legault and Sirisha Rayaprolu PURPOSE: This Coastal and Hydraulics Engineering Technical Note (CHETN) describes the...driven sediment transport at Passage Key Inlet. This analysis resulted in issuing a new Florida Department of Environmental Protection (FDEP) permit to...Funding for this study was provided by the USACE Regional Sediment Management (RSM) Program, a Navigation Research, Development, and Technology Portfolio

  11. Selfish memes: An update of Richard Dawkins’ bibliometric analysis of key papers in sociobiology

    Aaen-Stockdale, Craig

    2017-01-01

    This is an Open Access journal available from http://www.mdpi.com/ In the second edition of The Selfish Gene, Richard Dawkins included a short bibliometric analysis of key papers instrumental to the sociobiological revolution, the intention of which was to support his proposal that ideas spread within a population in an epidemiological manner. In his analysis, Dawkins primarily discussed the influence of an article by British evolutionary biologist William Donald Hamilton which had introdu...

  12. Nonstochastic Analysis of Manufacturing Systems Using Timed-Event Graphs

    Hulgaard, Henrik; Amon, Tod

    1996-01-01

    Using automated methods to analyze the temporal behavior ofmanufacturing systems has proven to be essential and quite beneficial.Popular methodologies include Queueing networks, Markov chains,simulation techniques, and discrete event systems (such as Petrinets). These methodologies are primarily...

  13. Analysis of the Steam Generator Tubes Rupture Initiating Event

    Trillo, A.; Minguez, E.; Munoz, R.; Melendez, E.; Sanchez-Perea, M.; Izquierd, J.M.

    1998-01-01

    In PSA studies, Event Tree-Fault Tree techniques are used to analyse to consequences associated with the evolution of an initiating event. The Event Tree is built in the sequence identification stage, following the expected behaviour of the plant in a qualitative way. Computer simulation of the sequences is performed mainly to determine the allowed time for operator actions, and do not play a central role in ET validation. The simulation of the sequence evolution can instead be performed by using standard tools, helping the analyst obtain a more realistic ET. Long existing methods and tools can be used to automatism the construction of the event tree associated to a given initiator. These methods automatically construct the ET by simulating the plant behaviour following the initiator, allowing some of the systems to fail during the sequence evolution. Then, the sequences with and without the failure are followed. The outcome of all this is a Dynamic Event Tree. The work described here is the application of one such method to the particular case of the SGTR initiating event. The DYLAM scheduler, designed at the Ispra (Italy) JRC of the European Communities, is used to automatically drive the simulation of all the sequences constituting the Event Tree. Similarly to the static Event Tree, each time a system is demanded, two branches are open: one corresponding to the success and the other to the failure of the system. Both branches are followed by the plant simulator until a new system is demanded, and the process repeats. The plant simulation modelling allows the treatment of degraded sequences that enter into the severe accident domain as well as of success sequences in which long-term cooling is started. (Author)

  14. Detecting failure events in buildings: a numerical and experimental analysis

    Heckman, V. M.; Kohler, M. D.; Heaton, T. H.

    2010-01-01

    A numerical method is used to investigate an approach for detecting the brittle fracture of welds associated with beam -column connections in instrumented buildings in real time through the use of time-reversed Green’s functions and wave propagation reciprocity. The approach makes use of a prerecorded catalog of Green’s functions for an instrumented building to detect failure events in the building during a later seismic event by screening continuous data for the presence of wavef...

  15. The analysis of a complex fire event using multispaceborne observations

    Andrei Simona

    2018-01-01

    Full Text Available This study documents a complex fire event that occurred on October 2016, in Middle East belligerent area. Two fire outbreaks were detected by different spacecraft monitoring instruments on board of TERRA, CALIPSO and AURA Earth Observation missions. Link with local weather conditions was examined using ERA Interim Reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of fires and the comparison with different observational data.

  16. The analysis of a complex fire event using multispaceborne observations

    Andrei, Simona; Carstea, Emil; Marmureanu, Luminita; Ene, Dragos; Binietoglou, Ioannis; Nicolae, Doina; Konsta, Dimitra; Amiridis, Vassilis; Proestakis, Emmanouil

    2018-04-01

    This study documents a complex fire event that occurred on October 2016, in Middle East belligerent area. Two fire outbreaks were detected by different spacecraft monitoring instruments on board of TERRA, CALIPSO and AURA Earth Observation missions. Link with local weather conditions was examined using ERA Interim Reanalysis and CAMS datasets. The detection of the event by multiple sensors enabled a detailed characterization of fires and the comparison with different observational data.

  17. Trend analysis of cables failure events at nuclear power plants

    Fushimi, Yasuyuki

    2007-01-01

    In this study, 152 failure events related with cables at overseas nuclear power plants are selected from Nuclear Information Database, which is owned by The Institute of Nuclear Safety System, and these events are analyzed in view of occurrence, causal factor, and so on. And 15 failure events related with cables at domestic nuclear power plants are selected from Nuclear Information Archives, which is owned by JANTI, and these events are analyzed by the same manner. As a result of comparing both trends, it is revealed following; 1) A cable insulator failure rate is lower at domestic nuclear power plants than at foreign ones. It is thought that a deterioration diagnosis is performed broadly in Japan. 2) Many buried cables failure events have been occupied a significant portion of cables failure events during work activity at overseas plants, however none has been occurred at domestic plants. It is thought that sufficient survey is conducted before excavating activity in Japan. 3) A domestic age related cables failure rate in service is lower than the overseas one and domestic improper maintenance rate is higher than the overseas one. Maintenance worker' a skill improvement is expected in order to reduce improper maintenance. (author)

  18. Preliminary Analysis of the Common Cause Failure Events for Domestic Nuclear Power Plants

    Kang, Daeil; Han, Sanghoon

    2007-01-01

    It is known that the common cause failure (CCF) events have a great effect on the safety and probabilistic safety assessment (PSA) results of nuclear power plants (NPPs). However, the domestic studies have been mainly focused on the analysis method and modeling of CCF events. Thus, the analysis of the CCF events for domestic NPPs were performed to establish a domestic database for the CCF events and to deliver them to the operation office of the international common cause failure data exchange (ICDE) project. This paper presents the analysis results of the CCF events for domestic nuclear power plants

  19. [Analysis on the key factors affecting the inheritance of the acupuncture learning].

    Li, Su-yun; Zhang, Li-jian; Gang, Wei-juan; Xu, Wen-bin; Xu, Qing-yan

    2010-06-01

    On the basis of systematicly reviewing the developmental history of acupuncture and moxibustion and profoundly understanding its academic connotations, the authors of the present article make a summary and analysis on the key factors influencing the development of acupuncturology. These key factors are (1) the emergence of "microacupuncture needle regulating-Qi" and the establishement of their corresponding theory system, (2) a large number of practitioners who inherited the learnings of acupuncturology generations by generations, and abundant medical classical works which recorded the valuable academic thoughts and clinical experience of the predecesors, (3) the application of acupuncture charts and manikins, and (4) modernizing changes of acupuncture learnings after introduction of western medicine to China. Just under the influence of these key factors, the acupuncture medicine separates itself from the level of the simple experience medicine, and has formed a set of special theory system and developed into a mature subject.

  20. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Fedosimova, Anastasiya; Gaitinov, Adigam; Grushevskaya, Ekaterina; Lebedev, Igor

    2017-06-01

    In this work the study on the peculiarities of multiparticle production in interactions of asymmetric nuclei to search for unusual features of such interactions, is performed. A research of long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles on the basis of analysis of individual interactions of nuclei of 197 Au at energy 10.7 AGeV with photoemulsion nuclei, is carried out. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC) and mixed type (MT) in pseudorapidity distribution of secondary particles, are selected by the Hurst method in accordance with Hurst curve behavior. These types have significantly different characteristics. At first, they have different fragmentation parameters. Events of LC type are processes of full destruction of the projectile nucleus, in which multicharge fragments are absent. In events of mixed type several multicharge fragments of projectile nucleus are discovered. Secondly, these two types have significantly different multiplicity distribution. The mean multiplicity of LC type events is significantly more than in mixed type events. On the basis of research of the dependence of multiplicity versus target-nuclei fragments number for events of various types it is revealed, that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to the central collisions of gold nuclei and nuclei of CNO-group, i.e. nuclei with strongly asymmetric volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target-nucleus and the disintegration of the projectile-nucleus on several multi-charged fragments.

  1. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Fedosimova Anastasiya

    2017-01-01

    Full Text Available In this work the study on the peculiarities of multiparticle production in interactions of asymmetric nuclei to search for unusual features of such interactions, is performed. A research of long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles on the basis of analysis of individual interactions of nuclei of 197 Au at energy 10.7 AGeV with photoemulsion nuclei, is carried out. Events with long-range multiparticle correlations (LC, short-range multiparticle correlations (SC and mixed type (MT in pseudorapidity distribution of secondary particles, are selected by the Hurst method in accordance with Hurst curve behavior. These types have significantly different characteristics. At first, they have different fragmentation parameters. Events of LC type are processes of full destruction of the projectile nucleus, in which multicharge fragments are absent. In events of mixed type several multicharge fragments of projectile nucleus are discovered. Secondly, these two types have significantly different multiplicity distribution. The mean multiplicity of LC type events is significantly more than in mixed type events. On the basis of research of the dependence of multiplicity versus target-nuclei fragments number for events of various types it is revealed, that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to the central collisions of gold nuclei and nuclei of CNO-group, i.e. nuclei with strongly asymmetric volume, nuclear mass, charge, etc. Such events are characterised by full destruction of the target-nucleus and the disintegration of the projectile-nucleus on several multi-charged fragments.

  2. Analysis of water hammer events in nuclear power plants

    Sato, Masahiro; Yanagi, Chihiro

    1999-01-01

    A water hammer issue in nuclear power plants was one of unresolved safety issues listed by the United States Nuclear Regulatory Commission and was regarded as resolved. But later on, the water hammer events are still experienced intermittently, while the number of the events is decreasing. We collected water hammer events of PWRs in Japan and the United States and relevant documents, analyzed them, and studied corrective actions taken by Japanese plants. As a result, it is confirmed that preventive measured in design, operation etc. have been already taken and that mitigation mechanisms against water hammer have also been considered. However, it is clarified that attention should be continuously paid to operation of valves and/or pumps, as the prevention of water hammer still relies on operation. (author)

  3. Initiating events in the safety probabilistic analysis of nuclear power plants

    Stasiulevicius, R.

    1989-01-01

    The importance of the initiating event in the probabilistic safety analysis of nuclear power plants are discussed and the basic procedures necessary for preparing reports, quantification and grouping of the events are described. The examples of initiating events with its occurence medium frequency, included those calculated for OCONEE reactor and Angra-1 reactor are presented. (E.G.)

  4. Event Sequence Analysis of the Air Intelligence Agency Information Operations Center Flight Operations

    Larsen, Glen

    1998-01-01

    This report applies Event Sequence Analysis, methodology adapted from aircraft mishap investigation, to an investigation of the performance of the Air Intelligence Agency's Information Operations Center (IOC...

  5. Identification and analysis of external event combinations for Hanhikivi 1PRA

    Helander, Juho [Fennovoima Oy, Helsinki (Finland)

    2017-03-15

    Fennovoima's nuclear power plant, Hanhikivi 1, Pyhäjoki, Finland, is currently in design phase, and its construction is scheduled to begin in 2018 and electricity production in 2024. The objective of this paper is to produce a preliminary list of safety-significant external event combinations including preliminary probability estimates, to be used in the probabilistic risk assessment of Hanhikivi 1 plant. Starting from the list of relevant single events, the relevant event combinations are identified based on seasonal variation, preconditions related to different events, and dependencies (fundamental and cascade type) between events. Using this method yields 30 relevant event combinations of two events for the Hanhikivi site. The preliminary probability of each combination is evaluated, and event combinations with extremely low probability are excluded from further analysis. Event combinations of three or more events are identified by adding possible events to the remaining combinations of two events. Finally, 10 relevant combinations of two events and three relevant combinations of three events remain. The results shall be considered preliminary and will be updated after evaluating more detailed effects of different events on plant safety.

  6. Adverse events with use of antiepileptic drugs: a prescription and event symmetry analysis

    Tsiropoulos, Ioannis; Andersen, Morten; Hallas, Jesper

    2009-01-01

    Database (OPED) for the period of 1 August 1990-31 December 2006, and diagnoses from the County Hospital register for the period of 1994-2006 to perform sequence symmetry analysis. The method assesses the distribution of disease entities and prescription of other drugs (ODs), before and after initiation...

  7. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    Mercurio, D.

    2011-01-01

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly based on Logic Trees. The evolution of these sequences is a result of the interactions between the crew and plant; in current PRA methodologies, simplified models of these complex interactions are used. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used for the simulation of crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' is made up of three operators whose behavior is guided by a set of rules-of-behavior (which represents the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing the crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups. These techniques are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied for the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions or context of operator actions, which can be used in an HRA, in the analysis of the reliability of the actions. By providing information on the evolution of system indications, dynamic of cues, crew timing in performing procedure steps, situation

  8. Analysis of Paks NPP Personnel Activity during Safety Related Event Sequences

    Bareith, A.; Hollo, Elod; Karsa, Z.; Nagy, S.

    1998-01-01

    Within the AGNES Project (Advanced Generic and New Evaluation of Safety) the Level-1 PSA model of the Paks NPP Unit 3 was developed in form of a detailed event tree/fault tree structure (53 initiating events, 580 event sequences, 6300 basic events are involved). This model gives a good basis for quantitative evaluation of potential consequences of actually occurred safety-related events, i.e. for precursor event studies. To make these studies possible and efficient, the current qualitative event analysis practice should be reviewed and a new additional quantitative analysis procedure and system should be developed and applied. The present paper gives an overview of the method outlined for both qualitative and quantitative analyses of the operator crew activity during off-normal situations. First, the operator performance experienced during past operational events is discussed. Sources of raw information, the qualitative evaluation process, the follow-up actions, as well as the documentation requirements are described. Second, the general concept of the proposed precursor event analysis is described. Types of modeled interactions and the considered performance influences are presented. The quantification of the potential consequences of the identified precursor events is based on the task-oriented, Level-1 PSA model of the plant unit. A precursor analysis system covering the evaluation of operator activities is now under development. Preliminary results gained during a case study evaluation of a past historical event are presented. (authors)

  9. Events in time: Basic analysis of Poisson data

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the member of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model when the rate of occurrence varies randomly. Examples and SAS programs are given

  10. Applications of heavy ion microprobe for single event effects analysis

    Reed, Robert A.; Vizkelethy, Gyorgy; Pellish, Jonathan A.; Sierawski, Brian; Warren, Kevin M.; Porter, Mark; Wilkinson, Jeff; Marshall, Paul W.; Niu, Guofu; Cressler, John D.; Schrimpf, Ronald D.; Tipton, Alan; Weller, Robert A.

    2007-01-01

    The motion of ionizing-radiation-induced rogue charge carriers in a semiconductor can create unwanted voltage and current conditions within a microelectronic circuit. If sufficient unwanted charge or current occurs on a sensitive node, a variety of single event effects (SEEs) can occur with consequences ranging from trivial to catastrophic. This paper describes the application of heavy ion microprobes to assist with calibration and validation of SEE modeling approaches

  11. Events in time: Basic analysis of Poisson data

    Engelhardt, M.E.

    1994-09-01

    The report presents basic statistical methods for analyzing Poisson data, such as the member of events in some period of time. It gives point estimates, confidence intervals, and Bayesian intervals for the rate of occurrence per unit of time. It shows how to compare subsets of the data, both graphically and by statistical tests, and how to look for trends in time. It presents a compound model when the rate of occurrence varies randomly. Examples and SAS programs are given.

  12. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.

  13. Resilience in carbonate production despite three coral bleaching events in 5 years on an inshore patch reef in the Florida Keys.

    Manzello, Derek P; Enochs, Ian C; Kolodziej, Graham; Carlton, Renée; Valentino, Lauren

    2018-01-01

    The persistence of coral reef frameworks requires that calcium carbonate (CaCO 3 ) production by corals and other calcifiers outpaces CaCO 3 loss via physical, chemical, and biological erosion. Coral bleaching causes declines in CaCO 3 production, but this varies with bleaching severity and the species impacted. We conducted census-based CaCO 3 budget surveys using the established ReefBudget approach at Cheeca Rocks, an inshore patch reef in the Florida Keys, annually from 2012 to 2016. This site experienced warm-water bleaching in 2011, 2014, and 2015. In 2017, we obtained cores of the dominant calcifying coral at this site, Orbicella faveolata , to understand how calcification rates were impacted by bleaching and how they affected the reef-wide CaCO 3 budget. Bleaching depressed O. faveolata growth and the decline of this one species led to an overestimation of mean (± std. error) reef-wide CaCO 3 production by + 0.68 (± 0.167) to + 1.11 (± 0.236) kg m -2  year -1 when using the static ReefBudget coral growth inputs. During non-bleaching years, the ReefBudget inputs slightly underestimated gross production by - 0.10 (± 0.022) to - 0.43 (± 0.100) kg m -2  year -1 . Carbonate production declined after the first year of back-to-back bleaching in 2014, but then increased after 2015 to values greater than the initial surveys in 2012. Cheeca Rocks is an outlier in the Caribbean and Florida Keys in terms of coral cover, carbonate production, and abundance of O. faveolata , which is threatened under the Endangered Species Act. Given the resilience of this site to repeated bleaching events, it may deserve special management attention.

  14. A proof-of-concept model for the identification of the key events in the infection process with specific reference to Pseudomonas aeruginosa in corneal infections

    Ilias Soumpasis

    2015-11-01

    Full Text Available Background: It is a common medical practice to characterise an infection based on the causative agent and to adopt therapeutic and prevention strategies targeting the agent itself. However, from an epidemiological perspective, exposure to a microbe can be harmless to a host as a result of low-level exposure or due to host immune response, with opportunistic infection only occurring as a result of changes in the host, pathogen, or surrounding environment. Methods: We have attempted to review systematically the key host, pathogen, and environmental factors that may significantly impact clinical outcomes of exposure to a pathogen, using Pseudomonas aeruginosa eye infection as a case study. Results and discussion: Extended contact lens wearing and compromised hygiene may predispose users to microbial keratitis, which can be a severe and vision-threatening infection. P. aeruginosa has a wide array of virulence-associated genes and sensing systems to initiate and maintain cell populations at the corneal surface and beyond. We have adapted the well-known concept of the epidemiological triangle in combination with the classic risk assessment framework (hazard identification, characterisation, and exposure to develop a conceptual pathway-based model that demonstrates the overlapping relationships between the host, the pathogen, and the environment; and to illustrate the key events in P. aeruginosa eye infection. Conclusion: This strategy differs from traditional approaches that consider potential risk factors in isolation, and hopefully will aid the identification of data and models to inform preventive and therapeutic measures in addition to risk assessment. Furthermore, this may facilitate the identification of knowledge gaps to direct research in areas of greatest impact to avert or mitigate adverse outcomes of infection.

  15. Common-Cause Failure Analysis in Event Assessment

    Rasmuson, D.M.; Kelly, D.L.

    2008-01-01

    This paper reviews the basic concepts of modeling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group

  16. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

    This work analyzes the thermal and mechanical effects arising in a power Diffusion Metal Oxide Semiconductor (DMOS) during a Single Event Burnout (SEB) process. For studying these effects we propose a more detailed simulation structure than the previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat generation region, with 5 W, 10 W, 50 W and 100 W for emulating the thermal phenomena occurring during SEB processes, avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  17. Fault trees based on past accidents. Factorial analysis of events

    Vaillant, M.

    1977-01-01

    The fault tree method is already useful as a qualitative step before any reliability calculation. Constructing the tree becomes even simpler when we only want to describe how the events happened. Unlike scenarios, which introduce several possibilities by means of the conjunction OR, here only the conjunction AND occurs, and it need not be written at all. This method is presented by INRS (1) for the study of industrial injuries; it may also be applied to material damage. (orig.)

  18. Analysis of loss of offsite power events reported in nuclear power plants

    Volkanovski, Andrija, E-mail: Andrija.VOLKANOVSKI@ec.europa.eu [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Ballesteros Avila, Antonio; Peinador Veira, Miguel [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Kančev, Duško [Kernkraftwerk Goesgen-Daeniken AG, CH-4658 Daeniken (Switzerland); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) gGmbH, Schwertnergasse 1, 50667 Köln (Germany); Stephan, Jean-Luc [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 – 92262 Fontenay-aux-Roses Cedex (France)

    2016-10-15

    Highlights: • Loss of offsite power events were identified in four databases. • Engineering analysis of relevant events was done. • The dominant root cause of LOOP is human failure. • Improved maintenance procedures can decrease the number of LOOP events. - Abstract: This paper presents the results of an analysis of loss of offsite power (LOOP) events in four databases of operational events. The screened databases include: the Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, the IAEA International Reporting System for Operating Experience (IRS) and the U.S. Licensee Event Reports (LER). In total, 228 relevant loss of offsite power events were identified in the IRSN database, 190 in the GRS database, 120 in the U.S. LER and 52 in the IRS database. Identified events were classified in predefined categories. The results show that the largest percentage of LOOP events occurred during the on-power operational mode and lasted for two minutes or more. Plant-centered events are the main contributor to LOOP events identified in the IRSN, GRS and IAEA IRS databases, while switchyard-centered events are the main contributor in the NRC LER database. The main type of failed equipment is switchyard failures in IRSN and IAEA IRS, main or secondary lines in NRC LER, and busbar failures in the GRS database. The dominant root causes of the LOOP events are human failures during test, inspection and maintenance, followed by human failures due to insufficient or wrong procedures. The largest number of LOOP events resulted in reactor trip followed by EDG start. Actions that can reduce the number of LOOP events and minimize their consequences for plant safety are identified and presented.

  19. Analysis of internal events for the Unit 1 of the Laguna Verde nuclear power station

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    This volume presents the results of the initiating event analysis and the event tree analysis for Unit 1 of the Laguna Verde nuclear power station. The initiating event analysis covers the identification of all internal events that disturb the normal operation of the plant and require mitigation; so-called external events are beyond the scope of this study. For the Laguna Verde plant, eight transient categories were identified, along with three categories of loss of coolant accidents (LOCA) inside the containment, a LOCA outside the primary containment, and vessel rupture. The event tree analysis involves developing the possible accident sequences for each category of initiating events. System event trees were constructed for the different types of LOCA and for all the transients. An event tree was constructed for the total loss of alternating current, which extends the event tree for the loss of offsite power transient, and a system event tree was also developed for anticipated transients without scram (ATWS). The accident sequence event trees include the evaluation of sequences with a vulnerable core, that is, sequences in which the core is adequately cooled but the residual heat removal systems have failed. To model this adequately, headings were added to the event tree so that sequences are developed up to the point where the core state is resolved. This process includes: determining the failure pressure of the primary containment; evaluating the environment generated in the reactor building as a result of containment failure or leakage; determining the location of components in the reactor building; and constructing Boolean expressions to estimate the failure of components exposed to a severe environment. (Author)

  20. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    2011-01-01

    organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  1. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    Lum Karl

    2011-03-01

    countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  2. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  3. Trend analysis of fire events at nuclear power plants

    Shimada, Hiroki

    2007-01-01

    We performed trend analyses to compare fire events occurring overseas (1995-2005) and in Japan (1966-2006). We decided to do this after extracting data on incidents (storms, heavy rain, tsunamis, fires, etc.) occurring at overseas nuclear power plants from the Events Occurred at Overseas Nuclear Power Plants records in the Nuclear Information Database at the Institute of Nuclear Safety System (INSS) and finding that fires were the most common of the incidents. The analyses compared the number of fires occurring domestically and overseas and examined their causes and their effects on the power plants. As a result, we found that electrical fires, caused by such things as current overheating and electric arcing, account for over one half of the domestic and overseas fire incidents, which indicates that maintenance management of electrical facilities is the most important aspect of fire prevention. Also, roughly the same number of operational fires occurred at domestic and overseas plants, judging from the figures for annual occurrences per unit. However, the overall number of fires per unit at domestic facilities is one fourth that of overseas facilities. We surmise that, while management of operations that utilize fire is comparable at overseas and domestic plants, this disparity results from differences in the way facility maintenance is carried out. (author)

  4. Cause analysis and preventives for human error events in Daya Bay NPP

    Huang Weigang; Zhang Li

    1998-01-01

    Daya Bay Nuclear Power Plant was put into commercial operation in 1994. Up to 1996, there were 368 human error events in the operating and maintenance areas, accounting for 39% of total events. These events occurred mainly during maintenance, testing, equipment isolation and system on-lining, in particular during refuelling and maintenance. The author analyses the root causes of human error events, which are mainly operator omission or error; procedure deficiency; procedure not followed; lack of training; communication failures; and inadequate work management. The preventive measures and treatment principles for human error events are also discussed, and several examples of their application are given. Finally, it is argued that the key to preventing human error events lies in coordination and management, the person in charge of the work, and good work habits of staff.

  5. Multivariate Volatility Impulse Response Analysis of GFC News Events

    D.E. Allen (David); M.J. McAleer (Michael); R.J. Powell (Robert); A.K. Singh (Abhay)

    2015-01-01

    This paper applies the Hafner and Herwartz (2006) (hereafter HH) approach to the analysis of multivariate GARCH models using volatility impulse response analysis. The data set features ten years of daily returns series for the New York Stock Exchange Index and the FTSE 100 index from the

  6. Multivariate Volatility Impulse Response Analysis of GFC News Events

    D.E. Allen (David); M.J. McAleer (Michael); R.J. Powell (Robert)

    2015-01-01

    This paper applies the Hafner and Herwartz (2006) (hereafter HH) approach to the analysis of multivariate GARCH models using volatility impulse response analysis. The data set features ten years of daily returns series for the New York Stock Exchange Index and the

  7. Marginal regression analysis of recurrent events with coarsened censoring times.

    Hu, X Joan; Rosychuk, Rhonda J

    2016-12-01

    Motivated by an ongoing pediatric mental health care (PMHC) study, this article presents weakly structured methods for analyzing doubly censored recurrent event data where only coarsened information on censoring is available. The study extracted administrative records of emergency department visits from provincial health administrative databases. The available information on each individual subject is limited to a subject-specific time window determined up to concealed data. To evaluate the time-dependent effects of exposures, we adapt local linear estimation with right-censored survival times under the Cox regression model with time-varying coefficients (cf. Cai and Sun, Scandinavian Journal of Statistics 2003, 30, 93-111). We establish the pointwise consistency and asymptotic normality of the regression parameter estimator, and examine its performance by simulation. The PMHC study illustrates the proposed approach throughout the article. © 2016, The International Biometric Society.

  8. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
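The record above stores wavelet coefficients as compact descriptors of parameter trends. As a rough, self-contained sketch of that idea (not the authors' MySQL/web implementation), the following flags abrupt changes in a trend using one-level Haar detail coefficients; the threshold factor, MAD-based scale and toy signal are all illustrative assumptions:

```python
import numpy as np

def haar_detail(signal):
    """One-level Haar wavelet transform: return the detail coefficients."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:                      # pad to even length
        s = np.append(s, s[-1])
    return (s[0::2] - s[1::2]) / np.sqrt(2.0)

def detect_events(signal, k=4.0):
    """Indices (in coefficient space) where the detail coefficient
    magnitude exceeds k robust standard deviations (MAD-based)."""
    d = haar_detail(signal)
    scale = 1.4826 * np.median(np.abs(d - np.median(d))) or 1e-12
    return np.flatnonzero(np.abs(d) > k * scale)

# A flat trend with one abrupt step: the step appears as a single
# large detail coefficient near the transition.
trend = np.concatenate([np.full(63, 80.0), np.full(65, 55.0)])
print(detect_events(trend))             # → [31]
```

On a noisy real trend the MAD scale is nonzero and `k` tunes sensitivity; multi-level decompositions, as in the record, would expose dynamics at several time scales.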

  9. Use of PSA for the analysis of operational events in nuclear power plants

    Hulsmans, M.

    2006-01-01

    An operational event is a safety-relevant incident that occurred in an industrial installation like a nuclear power plant (NPP). The probabilistic approach to event analysis focuses on the potential consequences of an operational event. Within its scope of application, it provides a quantitative assessment of the risk significance of this event (and similar events): it calculates the risk increase induced by the event. Such analyses may result in a more objective and a more accurate event severity measure than those provided by commonly used qualitative methods. Probabilistic event analysis complements the traditional event analysis approaches that are oriented towards the understanding of the (root) causes of an event. In practice, risk-based precursor analysis consists of the mapping of an operational event on a risk model of the installation, such as a probabilistic safety analysis (PSA) model. Precursor analyses result in an objective risk ranking of safety-significant events, called accident precursors. An unexpectedly high (or low) risk increase value is in itself already an important finding. This assessment also yields a lot of information on the structure of the risk, since the underlying dominant factors can easily be determined. Relevant 'what if' studies on similar events and conditions can be identified and performed (which is generally not considered in conventional event analysis), with the potential to yield even broader findings. The findings of such a structured assessment can be used for other purposes than merely risk ranking. The operational experience feedback process can be improved by helping to identify design measures and operational practices in order to prevent re-occurrence or in order to mitigate future consequences, and even to evaluate their expected effectiveness, contributing to the validation and prioritization of corrective measures. 
Confirmed and re-occurring precursors with correlated characteristics may point out opportunities

  10. Information analysis of iris biometrics for the needs of cryptology key extraction

    Adamović Saša

    2013-01-01

    Full Text Available The paper presents a rigorous analysis of iris biometric information for the synthesis of an optimized system for the extraction of a high-quality cryptographic key. Estimates of local entropy and mutual information identified the segments of the iris most suitable for this purpose. To optimize the parameters, corresponding wavelet transforms were applied so as to obtain the highest possible entropy and the lowest possible mutual information in the transform domain. This sets the framework for the synthesis of systems that extract truly random sequences from iris biometrics without compromising authentication properties. [Project of the Ministry of Science of the Republic of Serbia, no. TR32054 and no. III44006]
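To make the entropy estimates concrete, here is a minimal toy sketch (not the authors' wavelet-domain method) that scores windows of a binarized iris code by Shannon entropy; low-entropy windows would be rejected for key extraction. The window size and toy data are invented for the example:

```python
import numpy as np

def shannon_entropy(bits):
    """Shannon entropy (bits/symbol) of a binary sequence."""
    bits = np.asarray(bits)
    p1 = bits.mean()
    if p1 in (0.0, 1.0):
        return 0.0
    return float(-(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1)))

def local_entropy(bits, window=64):
    """Entropy of each non-overlapping window: low-entropy windows mark
    segments unsuitable for extracting random key material."""
    return np.array([shannon_entropy(bits[i:i + window])
                     for i in range(0, len(bits) - window + 1, window)])

rng = np.random.default_rng(7)
code = np.concatenate([rng.integers(0, 2, 256),   # random-looking segment
                       np.zeros(64, dtype=int)])  # degenerate segment
print(local_entropy(code))   # last window has entropy 0
```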

  11. Analysis of Malaysian Nuclear Agency Key Performance Indicator (KPI) 2005-2013

    Aisya Raihan Abdul Kadir; Hazmimi Kasim; Azlinda Aziz; Noriah Jamal

    2014-01-01

    Malaysia Nuclear Agency (Nuclear Malaysia) was established on 19 September 1972. Since its inception, Nuclear Malaysia has been entrusted with the responsibility of introducing and promoting nuclear science and technology for national development. After more than 40 years of operation, Nuclear Malaysia remains significant as an excellent organization for science, technology and innovation. An analysis of the key performance indicator (KPI) achievements in 2005-2013 serves as an indicator of the role of Nuclear Malaysia as a national research institution, established to promote, develop and encourage the application of nuclear technology. (author)

  12. Investigation and analysis of hydrogen ignition and explosion events in foreign nuclear power plants

    Okuda, Yasunori [Institute of Nuclear Safety System, Inc., Mihama, Fukui (Japan)]

    2002-09-01

    Reports about hydrogen ignition and explosion events in foreign nuclear power plants from 1980 to 2001 were investigated, and 31 events were identified. Analysis showed that they fell into two categories: (1) outer leakage ignition events and (2) inner accumulation ignition events. The dominant event for PWRs (pressurized water reactors) was outer leakage ignition in the main generator; for BWRs (boiling water reactors) it was inner accumulation ignition in the off-gas system. Outer leakage ignition resulted from work process failure involving an ignition source, operator error, or main generator hydrogen leakage. Inner accumulation ignition events were caused by equipment failure or insufficient monitoring. With careful preventive measures, the factors leading to these events could be eliminated. (author)

  13. [Analysis on the adverse events of cupping therapy in the application].

    Zhou, Xin; Ruan, Jing-wen; Xing, Bing-feng

    2014-10-01

    A deep analysis has been done of the cases of adverse events and common injuries of cupping therapy encountered in recent years, in terms of manipulation and the patient's constitution. The adverse events of cupping therapy are commonly caused by improper manipulation by medical practitioners, ignoring contraindications and the patient's constitution. Clinical practitioners should use cupping therapy cautiously, strictly follow the rules of standard manipulation and the medical core system, pay attention to the contraindications and take strict precautions against the occurrence of adverse events.

  14. Key factors of case management interventions for frequent users of healthcare services: a thematic analysis review.

    Hudon, Catherine; Chouinard, Maud-Christine; Lambert, Mireille; Diadiou, Fatoumata; Bouliane, Danielle; Beaudin, Jérémie

    2017-10-22

    The aim of this paper was to identify the key factors of case management (CM) interventions among frequent users of healthcare services found in empirical studies of effectiveness. Thematic analysis review of CM studies. We built on a previously published review that aimed to report the effectiveness of CM interventions for frequent users of healthcare services, using the Medline, Scopus and CINAHL databases covering the January 2004-December 2015 period, then updated to July 2017, with the keywords 'CM' and 'frequent use'. We extracted factors of successful (n=7) and unsuccessful (n=6) CM interventions and conducted a mixed thematic analysis to synthesise findings. Chaudoir's implementation of health innovations framework was used to organise results into four broad levels of factors: (1) environmental/organisational level, (2) practitioner level, (3) patient level and (4) programme level. Access to, and close partnerships with, healthcare providers and community services resources were key factors of successful CM interventions that should target patients with the greatest needs and promote frequent contacts with the healthcare team. The selection and training of the case manager was also an important factor to foster patient engagement in CM. Coordination of care, self-management support and assistance with care navigation were key CM activities. The main issues reported by unsuccessful CM interventions were problems with case finding or lack of care integration. CM interventions for frequent users of healthcare services should ensure adequate case finding processes, rigorous selection and training of the case manager, sufficient intensity of the intervention, as well as good care integration among all partners. Other studies could further evaluate the influence of contextual factors on intervention impacts. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted

  15. Analysis of hypoglycemic events using negative binomial models.

    Luo, Junxiang; Qu, Yongming

    2013-01-01

    Negative binomial regression is a standard model to analyze hypoglycemic events in diabetes clinical trials. Adjusting for baseline covariates could potentially increase the estimation efficiency of negative binomial regression. However, adjusting for covariates raises concerns about model misspecification, in which the negative binomial regression is not robust because of its requirement for strong model assumptions. In some literature, it was suggested to correct the standard error of the maximum likelihood estimator through introducing overdispersion, which can be estimated by the Deviance or Pearson Chi-square. We proposed to conduct the negative binomial regression using Sandwich estimation to calculate the covariance matrix of the parameter estimates together with Pearson overdispersion correction (denoted by NBSP). In this research, we compared several commonly used negative binomial model options with our proposed NBSP. Simulations and real data analyses showed that NBSP is the most robust to model misspecification, and the estimation efficiency will be improved by adjusting for baseline hypoglycemia. Copyright © 2013 John Wiley & Sons, Ltd.

  16. Analysis and design of VEK for extreme events - a challenge

    Woelfel, H.P.; Technische Univ. Darmstadt

    2006-01-01

    For the analysis and design of the VEK building, especially for design against earthquake and airplane crash, a 3D integral model had been developed, able to yield any global response quantities (displacements, accelerations, sectional forces, response spectra, global reinforcement) for any load actions from one mathematical model. Especially for airplane crash, a so-called dynamic design yields reinforcement quantities at every time step and so leads to a realistic and economic design. The advantages of the integral model were transferred to the design of the processing installation, where the structural analysis of steel structures, vessels and piping was dealt with in one integral mathematical model. (orig.)

  17. Using sensitivity analysis to identify key factors for the propagation of a plant epidemic.

    Rimbaud, Loup; Bruchou, Claude; Dallot, Sylvie; Pleydell, David R J; Jacquot, Emmanuel; Soubeyrand, Samuel; Thébaud, Gaël

    2018-01-01

    Identifying the key factors underlying the spread of a disease is an essential but challenging prerequisite to design management strategies. To tackle this issue, we propose an approach based on sensitivity analyses of a spatiotemporal stochastic model simulating the spread of a plant epidemic. This work is motivated by the spread of sharka, caused by plum pox virus , in a real landscape. We first carried out a broad-range sensitivity analysis, ignoring any prior information on six epidemiological parameters, to assess their intrinsic influence on model behaviour. A second analysis benefited from the available knowledge on sharka epidemiology and was thus restricted to more realistic values. The broad-range analysis revealed that the mean duration of the latent period is the most influential parameter of the model, whereas the sharka-specific analysis uncovered the strong impact of the connectivity of the first infected orchard. In addition to demonstrating the interest of sensitivity analyses for a stochastic model, this study highlights the impact of variation ranges of target parameters on the outcome of a sensitivity analysis. With regard to sharka management, our results suggest that sharka surveillance may benefit from paying closer attention to highly connected patches whose infection could trigger serious epidemics.
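As an illustration of the kind of sensitivity analysis described above (far simpler than the spatiotemporal sharka model), the following computes standardized regression coefficients (SRC), a cheap global sensitivity measure for near-linear responses, from Monte Carlo draws. The toy epidemic-style model and parameter ranges are invented for the example:

```python
import numpy as np

def src_sensitivity(sample_fn, model, n=2048, seed=1):
    """Standardized regression coefficients: regress the standardized
    model output on standardized parameter draws; the magnitude of each
    coefficient ranks that parameter's influence."""
    rng = np.random.default_rng(seed)
    X = sample_fn(rng, n)                     # (n, k) parameter draws
    y = np.array([model(x) for x in X])
    Xc = (X - X.mean(0)) / X.std(0)
    yc = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    return beta                               # one SRC per parameter

# Toy epidemic-style response: spread grows with transmission rate and
# infectious period, and shrinks with a longer latent period.
def toy_model(x):
    transmission, latent, infectious = x
    return transmission * infectious / (1.0 + latent)

def sample(rng, n):
    return np.column_stack([
        rng.uniform(0.1, 1.0, n),    # transmission rate
        rng.uniform(1.0, 30.0, n),   # latent period (days)
        rng.uniform(1.0, 30.0, n),   # infectious period (days)
    ])

print(np.round(src_sensitivity(sample, toy_model), 2))
```

For strongly non-linear or interacting parameters (as in the broad-range analysis of the abstract), variance-based indices such as Sobol indices are the more appropriate tool.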

  18. Proteomic analysis reveals heat shock protein 70 has a key role in polycythemia Vera.

    Gallardo, Miguel; Barrio, Santiago; Fernandez, Marisol; Paradela, Alberto; Arenas, Alicia; Toldos, Oscar; Ayala, Rosa; Albizua, Enriqueta; Jimenez, Ana; Redondo, Santiago; Garcia-Martin, Rosa Maria; Gilsanz, Florinda; Albar, Juan Pablo; Martinez-Lopez, Joaquin

    2013-11-19

    JAK-STAT signaling through the JAK2V617F mutation is central to the pathogenesis of myeloproliferative neoplasms (MPN). However, other events could precede the JAK2 mutation. The aim of this study is to analyze the phenotypic divergence between polycythemia vera (PV) and essential thrombocythemia (ET) to find novel therapeutic targets, using a proteomic and functional approach to identify alternative routes to JAK2 activation. 2D-DIGE and mass spectrometry of granulocyte protein from 20 MPN samples showed differential expression of HSP70 in PV and ET, in addition to 60 other proteins. Immunohistochemistry of 46 MPN bone marrow samples confirmed HSP70 expression: the median proportion of positive granulocytes was 80% in PV (SD 35%) vs. 23% in ET (SD 34.25%). In an ex vivo model, KNK437 was used to inhibit HSP70; it showed dose-dependent inhibition of cell growth and of burst-forming unit-erythroid (BFU-E) colonies in PV and ET, increased apoptosis in the erythroid lineage, and decreased pJAK2 signaling, as did a specific siRNA against HSP70. These data suggest a key role for HSP70 in proliferation and survival of the erythroid lineage in PV, and HSP70 may represent a potential therapeutic target in MPN, especially in PV.

  19. An analysis of fog events at Belgrade International Airport

    Veljović, Katarina; Vujović, Dragana; Lazić, Lazar; Vučković, Vladan

    2015-01-01

    A preliminary study of the occurrence of fog at Belgrade "Nikola Tesla" Airport was carried out using a statistical approach. The highest frequency of fog has occurred in the winter months of December and January and far exceeded the number of fog days in the spring and the beginning of autumn. The exceptionally foggy months, those having an extreme number of foggy days, occurred in January 1989 (18 days), December 1998 (18 days), February 2005 (17 days) and October 2001 (15 days). During the winter months (December, January and February) from 1990 to 2005 (16 years), fog occurred most frequently between 0600 and 1000 hours, and in the autumn, between 0500 and 0800 hours. In summer, fog occurred most frequently between 0300 and 0600 hours. During the 11-year period from 1995 to 2005, it was found that there was a 13 % chance for fog to occur on two consecutive days and a 5 % chance that it would occur 3 days in a row. In October 2001, the fog was observed over nine consecutive days. During the winter half year, 52.3 % of fog events observed at 0700 hours were in the presence of stratus clouds and 41.4 % were without the presence of low clouds. The 6-h cooling observed at the surface preceding the occurrence of fog between 0000 and 0700 hours ranged mainly from 1 to 4 °C. A new method was applied to assess the probability of fog occurrence based on complex fog criteria. It was found that the highest probability of fog occurrence (51.2 %) takes place in the cases in which the relative humidity is above 97 %, the dew-point depression is 0 °C, the cloud base is lower than 50 m and the wind is calm or weak 1 h before the onset of fog.

  20. Survival analysis using S: analysis of time-to-event data

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...
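The book works in S/R; as a flavour of the nonparametric procedures such a course covers, here is a self-contained Kaplan-Meier estimator in Python. The worked dataset is a textbook-style toy, not taken from the book:

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier survival estimate S(t) at each distinct event time.
    `event` is 1 for an observed event, 0 for right-censoring."""
    time = np.asarray(time, float)
    event = np.asarray(event, int)
    times = np.unique(time[event == 1])
    surv, s = [], 1.0
    for t in times:
        at_risk = np.sum(time >= t)              # still under observation at t
        deaths = np.sum((time == t) & (event == 1))
        s *= 1.0 - deaths / at_risk              # product-limit update
        surv.append(s)
    return times, np.array(surv)

# Times 1, 2+, 3, 4 (+ marks censoring):
# S(1) = 3/4, S(3) = 3/4 * 1/2 = 0.375, S(4) = 0.
t, s = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1])
print(t, s)
```

Note how the censored subject at time 2 leaves the risk set without triggering a survival drop, which is exactly what distinguishes time-to-event analysis from a naive event-count summary.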

  1. Organization of pulse-height analysis programs for high event rates

    Cohn, C E [Argonne National Lab., Ill. (USA)]

    1976-09-01

    The ability of a pulse-height analysis program to handle high event rates can be enhanced by organizing it so as to minimize the time spent in interrupt housekeeping. Specifically, the routine that services the data-ready interrupt from the ADC should test whether another event is ready before performing the interrupt return.

  2. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes...

  3. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models...

  4. Event Reconstruction and Analysis in the R3BRoot Framework

    Kresan, Dmytro; Al-Turany, Mohammad; Bertini, Denis; Karabowicz, Radoslaw; Manafov, Anar; Rybalchenko, Alexey; Uhlig, Florian

    2014-01-01

    The R3B experiment (Reaction studies with Relativistic Radioactive Beams) will be built within the future FAIR/GSI (Facility for Antiproton and Ion Research) in Darmstadt, Germany. The international R3B collaboration has a scientific program devoted to the physics of stable and radioactive beams at energies between 150 MeV and 1.5 GeV per nucleon. In preparation for the experiment, the R3BRoot software framework is under development; it delivers detector simulation, reconstruction and data analysis. The basic functionalities of the framework are handled by the FairRoot framework, which is also used by the other FAIR experiments (CBM, PANDA, ASYEOS, etc.), while the R3B detector specifics and reconstruction code are implemented inside R3BRoot. In this contribution, first results of data analysis from the detector prototype test in November 2012 will be reported; moreover, a comparison of the tracker performance against experimental data will be presented.

  5. Offline analysis of HEP events by ''dynamic perceptron'' neural network

    Perrone, A.L.; Basti, G.; Messi, R.; Pasqualucci, E.; Paoluzi, L.

    1997-01-01

    In this paper we start from a critical analysis of the fundamental problems of parallel computation in linear structures and of their extension to the partial solutions obtained with non-linear architectures. Then, we briefly present a new dynamic architecture able to overcome the limitations of the previous architectures through an automatic re-definition of the topology. This architecture is applied to real-time recognition of particle tracks in high-energy accelerators. (orig.)

  6. Corporate Disclosure, Materiality, and Integrated Report: An Event Study Analysis

    Maria Cleofe Giorgino; Enrico Supino; Federico Barnabè

    2017-01-01

    Within the extensive literature investigating the impacts of corporate disclosure in supporting the sustainable growth of an organization, few studies have included in the analysis the materiality issue referred to the information being disclosed. This article aims to address this gap, exploring the effect produced on capital markets by the publication of a recent corporate reporting tool, Integrated Report (IR). The features of this tool are that it aims to represent the multidimensional imp...

  7. Analysis of spectral data with rare events statistics

    Ilyushchenko, V.I.; Chernov, N.I.

    1990-01-01

    The case is considered of analyzing experimental data when the results of individual experimental runs cannot be summed due to large systematic errors. A statistical analysis of the hypothesis about persistent peaks in the spectra has been performed by means of the Neyman-Pearson test. The computations demonstrate that the confidence level for the hypothesis about the presence of a persistent peak in the spectrum is proportional to the square root of the number of independent experimental runs, K. 5 refs
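
The square-root-of-K scaling quoted above is the generic behaviour of combining independent run-level significances. A minimal sketch (using Stouffer's z-combination purely as an illustration; the paper's actual Neyman-Pearson construction is not reproduced here):

```python
import math

def stouffer_combine(z_scores):
    """Combine independent per-run significances: Z = (sum of z_i) / sqrt(K)."""
    return sum(z_scores) / math.sqrt(len(z_scores))

# If each of K independent runs shows the same modest peak significance z0,
# the combined significance grows like z0 * sqrt(K).
z0 = 1.5
for K in (1, 4, 16):
    print(K, stouffer_combine([z0] * K))   # 1.5, 3.0, 6.0
```

This is why runs that cannot be summed channel-by-channel can still jointly support a peak hypothesis: the evidence accumulates in the significances even when the spectra themselves cannot be added.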

  8. Analysis of 16 plasma vortex events in the geomagnetic tail

    Birn, J.; Hones, E.W. Jr.; Bame, S.J.; Russel, C.T.

    1985-01-01

    The analysis of 16 plasma vortex occurrences in the magnetotail plasma sheet of Hones et al. (1983) is extended. We used two- and three-dimensional plasma measurements and three-dimensional magnetic field measurements to study phase relations, energy propagation, and polarization properties. The results point toward an interpretation as a slow strongly damped MHD eigenmode which is generated by tailward traveling perturbations at the low-latitude interface between plasma sheet and magnetosheath

  9. Ultimate design load analysis of planetary gearbox bearings under extreme events

    Gallego Calderon, Juan Felipe; Natarajan, Anand; Cutululis, Nicolaos Antonio

    2017-01-01

    This paper investigates the impact of extreme events on the planet bearings of a 5 MW gearbox. The system is simulated using an aeroelastic tool, where the turbine structure is modeled, and MATLAB/Simulink, where the drivetrain (gearbox and generator) are modeled using a lumped-parameter approach....... Three extreme events are assessed: low-voltage ride through, emergency stop and normal stop. The analysis is focused on finding which event has the most negative impact on the bearing extreme radial loads. The two latter events are carried out following the guidelines of the International...

  10. ALFA detector, Background removal and analysis for elastic events

    Belaloui, Nazim

    2017-01-01

    I worked on the ALFA project, which aims to measure the total cross section in pp collisions as a function of t, the momentum transfer, by measuring the scattering angle of the protons. This measurement is done for all available energies, so far 7, 8 and 13 TeV. There are many analysis steps, and we have focused on enhancing the signal-to-noise ratio. First of all I familiarized myself with ROOT, worked on understanding the code used to access the data and on plotting histograms, and then on cutting off background.

  11. Integration of risk matrix and event tree analysis: a natural stone ...

    M Kemal Özfirat

    2017-09-27

    Sep 27, 2017 ... Different types of accidents may occur in natural stone facilities during movement, dimensioning, cutting ... are numerous risk analysis methods such as preliminary ..... machine type and maintenance (MM) event, block control.

  12. Political Shocks and Abnormal Returns During the Taiwan Crisis: An Event Study Analysis

    Steeves, Geoffrey

    2002-01-01

    .... Focusing on the 1996 Taiwan Crisis, by means of event study analysis, this paper attempts to determine the extent to which this political shock affected the Taiwanese, and surrounding Japanese stock markets...

  13. Mixed Methods Analysis of Medical Error Event Reports: A Report from the ASIPS Collaborative

    Harris, Daniel M; Westfall, John M; Fernald, Douglas H; Duclos, Christine W; West, David R; Niebauer, Linda; Marr, Linda; Quintela, Javan; Main, Deborah S

    2005-01-01

    .... This paper presents a mixed methods approach to analyzing narrative error event reports. Mixed methods studies integrate one or more qualitative and quantitative techniques for data collection and analysis...

  14. Key Elements of a Family Intervention for Schizophrenia: A Qualitative Analysis of an RCT.

    Grácio, Jaime; Gonçalves-Pereira, Manuel; Leff, Julian

    2018-03-01

    Schizophrenia is a complex biopsychosocial condition in which expressed emotion in family members is a robust predictor of relapse. Not surprisingly, family interventions are remarkably effective and thus recommended in current treatment guidelines. Their key elements seem to be common therapeutic factors, followed by education and coping skills training. However, few studies have explored these key elements and the process of the intervention itself. We conducted a qualitative and quantitative analysis of the records from a pioneering family intervention trial addressing expressed emotion, published by Leff and colleagues four decades ago. Records were coded into categories and the data were explored using descriptive statistics. This was complemented by a narrative evaluation using an inductive approach based on emotional markers and markers of change. The most used strategies in the intervention were addressing needs, followed by coping skills enhancement, advice, and emotional support. Dealing with overinvolvement and reframing were the next most frequent. Single-family home sessions seemed to augment the therapeutic work conducted in family groups. Overall the intervention seemed to promote cognitive and emotional change in the participants, and therapists were sensitive to the emotional trajectory of each subject. On the basis of our findings, we developed a longitudinal framework for better understanding the process of this treatment approach. © 2016 Family Process Institute.

  15. Analysis of the key enzymes of butyric and acetic acid fermentation in biogas reactors

    Gabris, Christina; Bengelsdorf, Frank R; Dürre, Peter

    2015-01-01

    This study aimed to investigate the mechanisms of acidogenesis, which is a key process during anaerobic digestion. To expose possible bottlenecks, specific activities of the key enzymes of acidification, such as acetate kinase (Ack, 0.23–0.99 U mg−1 protein), butyrate kinase (Buk, biogas reactor content from three different biogas reactors. Furthermore, the detection of Ack was successful via Western blot analysis. Quantification of the corresponding functional genes encoding Buk (buk) and But (but) was not feasible, although amplification was possible. Thus, phylogenetic trees were constructed based on the respective gene fragments. Four new clades of possible butyrate-producing bacteria were postulated, and bacteria of the genera Roseburia or Clostridium were identified. The low Buk activity was in contrast to the high specific But activity in the analysed samples. Butyrate formation via Buk activity barely occurs in the investigated biogas reactor. Specific enzyme activities (Ack, Buk and But) in samples drawn from three different biogas reactors correlated with ammonia and ammonium concentrations (NH3 and NH4+-N), and a negative dependency can be postulated. Thus, high concentrations of NH3 and NH4+-N may lead to a bottleneck in acidogenesis due to decreased specific acidogenic enzyme activities. PMID:26086956

  16. Ash fouling monitoring and key variables analysis for coal fired power plant boiler

    Shi Yuanhao

    2015-01-01

    Ash deposition on heat transfer surfaces is still a significant problem in coal-fired power plant utility boilers. The effective ways to deal with this problem are accurate on-line monitoring of ash fouling and soot-blowing. In this paper, an online ash fouling monitoring model based on a dynamic mass and energy balance method is developed, and a key variables analysis technique is introduced to study the internal behavior of the soot-blowing system. In this process, artificial neural networks (ANN) are used to optimize the boiler soot-blowing model, and the mean impact values method is utilized to determine a set of key variables. The validity of the models has been illustrated in a real case-study boiler, a 300 MW Chinese power station. The results on the same real plant data show that both models have good prediction accuracy, while the ANN model II has fewer input parameters. This work will be the basis of future development in order to control and optimize the soot-blowing of coal-fired power plant utility boilers.
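
The mean impact values (MIV) technique mentioned in this abstract ranks inputs by how much a small perturbation of each one shifts the trained model's output. A toy sketch (the `model` below is a hypothetical stand-in, not the paper's soot-blowing ANN):

```python
def mean_impact_value(model, X, j, delta=0.1):
    """MIV of feature j: perturb it by +/-delta (relative) and average
    the resulting difference in model output over all samples."""
    def perturbed(row, factor):
        r = list(row)
        r[j] *= factor
        return r
    diffs = [model(perturbed(x, 1 + delta)) - model(perturbed(x, 1 - delta))
             for x in X]
    return sum(diffs) / len(diffs)

# Hypothetical stand-in for a trained boiler model: output depends strongly
# on feature 0 and only weakly on feature 1.
model = lambda x: 5.0 * x[0] + 0.2 * x[1]
X = [[1.0, 1.0], [2.0, 3.0], [0.5, 2.0]]
mivs = [abs(mean_impact_value(model, X, j)) for j in range(2)]
# Feature 0 has the larger |MIV| and would be retained as a key variable.
```

Features whose |MIV| falls below a chosen threshold are dropped, which is how the paper arrives at a reduced input set for its ANN model.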

  17. JINR supercomputer of the module type for event parallel analysis

    Kolpakov, I.F.; Senner, A.E.; Smirnov, V.A.

    1987-01-01

    A model of a supercomputer performing 50 million operations per second is suggested. Its realization would allow one to solve JINR data analysis problems for large spectrometers (in particular, for the DELPHI collaboration). The suggested modular supercomputer is based on a commercially available 32-bit microprocessor with a processing rate of about 1 MFLOPS. The processors are combined by means of VME standard buses. A MicroVAX II host computer organizes the operation of the system. Data input and output are realized via the MicroVAX II computer peripherals. Users' software is based on FORTRAN-77. The supercomputer is connected to a JINR network port, and all JINR users get access to the suggested system

  18. Security Analysis of Measurement-Device-Independent Quantum Key Distribution in Collective-Rotation Noisy Environment

    Li, Na; Zhang, Yu; Wen, Shuang; Li, Lei-lei; Li, Jian

    2018-01-01

    Noise is a problem that communication channels cannot avoid. It is, thus, beneficial to analyze the security of MDI-QKD in a noisy environment. An analysis model for collective-rotation noise is introduced, and information theory methods are used to analyze the security of the protocol. The maximum amount of information that Eve can eavesdrop is 50%, and the eavesdropping can always be detected if the noise level ɛ ≤ 0.68. Therefore, the MDI-QKD protocol is secure as a quantum key distribution protocol. The maximum probability that the relay outputs successful results is 16% when eavesdropping is present. Moreover, the probability that the relay outputs successful results is higher with eavesdropping than without it. The paper validates that the MDI-QKD protocol has good robustness.

  19. An in silico analysis of the key genes involved in flavonoid biosynthesis in Citrus sinensis

    Adriano R. Lucheta

    2007-01-01

    Citrus species are known for their high content of phenolic compounds, including a wide range of flavonoids. In plants, these compounds are involved in protection against biotic and abiotic stresses, cell structure, UV protection, attraction of pollinators and seed dispersal. In humans, flavonoid consumption has been related to improving overall health and fighting some important diseases. The goals of this study were to identify expressed sequence tags (ESTs) in Citrus sinensis (L.) Osbeck corresponding to genes involved in general phenylpropanoid biosynthesis and the key genes involved in the main flavonoid pathways (flavanones, flavones, flavonols, leucoanthocyanidins, anthocyanins and isoflavonoids). A thorough analysis of all related putative genes from the Citrus EST (CitEST) database revealed several interesting aspects associated with these pathways and brought novel information with promising usefulness for both basic and biotechnological applications.

  20. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs Boson and super-symmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate with the ATLAS detector, the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and it is one of several analysis schemes for ATLAS physics user analysis. At the core of the EventView is the representative "view" of an event, which defines the contents of event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  1. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.
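
For orientation, the additive rate model referred to in this abstract is commonly written as follows (a sketch of the standard form; the paper's exact specification may differ):

```latex
% Additive rate model for a recurrent-event counting process N(t)
% with covariate vector Z (standard form, not necessarily the authors' exact model):
E\{\mathrm{d}N(t) \mid Z\} = \{\lambda_0(t) + \beta^{\top} Z\}\,\mathrm{d}t
% \lambda_0(t): unspecified baseline rate function;
% \beta: regression parameters, obtained from estimating equations that pool
% the continuously observed (recurrent-event) and discretely observed
% (panel-count) subjects.
```

Covariate effects enter additively on the rate scale rather than multiplicatively (as in proportional rate models), which is what allows a single set of estimating equations to accommodate the mixed observation schemes described above.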

  2. Identification of Key Pathways and Genes in Advanced Coronary Atherosclerosis Using Bioinformatics Analysis

    Xiaowen Tan

    2017-01-01

    Background. Coronary artery atherosclerosis is a chronic inflammatory disease. This study aimed to identify the key changes in gene expression between early and advanced carotid atherosclerotic plaque in humans. Methods. Gene expression dataset GSE28829 was downloaded from Gene Expression Omnibus (GEO), including 16 advanced and 13 early-stage atherosclerotic plaque samples from human carotid. Differentially expressed genes (DEGs) were analyzed. Results. 42,450 genes were obtained from the dataset. The top 100 up- and downregulated DEGs were listed. Functional enrichment analysis and Kyoto Encyclopedia of Genes and Genomes (KEGG) identification were performed. The results of functional and pathway enrichment analysis indicated that the immune system process played a critical role in the progression of carotid atherosclerotic plaque. Protein-protein interaction (PPI) networks were also constructed. The top 10 hub genes were identified from the PPI network and the top 6 modules were inferred. These genes were mainly involved in the chemokine signaling pathway, cell cycle, B cell receptor signaling pathway, focal adhesion, and regulation of the actin cytoskeleton. Conclusion. The present study indicated that analysis of DEGs leads to a deeper understanding of the molecular mechanisms of atherosclerosis development, and these genes might be used as molecular targets and diagnostic biomarkers for the treatment of atherosclerosis.
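
Hub-gene identification of the kind described above is, at its simplest, a degree ranking over the PPI graph. A toy sketch with hypothetical gene names (real analyses typically run such rankings with dedicated tools over curated interaction networks):

```python
from collections import Counter

# Hypothetical PPI edge list (gene-gene interactions).
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("D", "E")]

# Count how many interactions each gene participates in.
degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

# Hub genes = the most highly connected nodes.
hubs = [gene for gene, _ in degree.most_common(2)]
print(hubs[0])   # "A" (degree 3) ranks first
```

Module detection (the "top 6 modules" in the abstract) then looks for densely interconnected subgraphs among these nodes rather than at single-node degree.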

  3. THE CCAUV.A-K3 KEY COMPARISON OF PRESSURE RECIPROCITY CALIBRATION OF LS2P MICROPHONES: RESULTS AND ANALYSIS

    Cutanda Henríquez, Vicente; Rasmussen, Knud; Nielsen, Lars

    2006-01-01

    The CCAUV.A-K3 Key Comparison has involved 15 countries organized in two loops with two common laboratories, CENAM and DPLA. The measurements took place in 2003. This is the first CCAUV key comparison organized with more than one loop, and therefore the analysis of the results required a more ela...

  4. Partition of some key regulating services in terrestrial ecosystems: Meta-analysis and review

    Viglizzo, E.F., E-mail: evigliz@cpenet.com.ar [INTA, EEA Anguil, Grupo de Investigaciones en Gestión Ambiental (GIGA), Av. Spinetto 785, 6300 Santa Rosa, La Pampa (Argentina); INCITAP-CONICET, Ruta 35, km 335, 6300 Santa Rosa, La Pampa (Argentina); UNLPam, Facultad de Ciencias Exactas y Naturales, Av. Uruguay 151, 6300 Santa Rosa, La Pampa (Argentina); Jobbágy, E.G. [CONICET, Andes 950, 5700 San Luis, San Luis (Argentina); Grupo de Estudios Ambientales IMASL, Ejército de los, Andes 950, 5700 San Luis, San Luis (Argentina); Ricard, M.F. [INCITAP-CONICET, Ruta 35, km 335, 6300 Santa Rosa, La Pampa (Argentina); UNLPam, Facultad de Ciencias Exactas y Naturales, Av. Uruguay 151, 6300 Santa Rosa, La Pampa (Argentina); Paruelo, J.M. [Laboratorio de Análisis Regional y Teledetección, Departamento de Métodos Cuantitativos Sistemas de información, Facultad de Agronomía and IFEVA, Universidad de Buenos Aires and CONICET, Av. San Martín 4453, 1417 Buenos Aires (Argentina)

    2016-08-15

    Our knowledge about the functional foundations of ecosystem service (ES) provision is still limited and more research is needed to elucidate key functional mechanisms. Using a simplified eco-hydrological scheme, in this work we analyzed how land-use decisions modify the partition of some essential regulatory ES by altering basic relationships between biomass stocks and water flows. A comprehensive meta-analysis and review was conducted based on global, regional and local data from peer-reviewed publications. We analyzed five datasets comprising 1348 studies and 3948 records on precipitation (PPT), aboveground biomass (AGB), AGB change, evapotranspiration (ET), water yield (WY), WY change, runoff (R) and infiltration (I). The conceptual framework was focused on ES that are associated with the ecological functions (e.g., intermediate ES) of ET, WY, R and I. ES included soil protection, carbon sequestration, local climate regulation, water-flow regulation and water recharge. To address the problem of data normality, the analysis included both parametric and non-parametric regression analysis. Results demonstrate that PPT is a first-order biophysical factor that controls ES release at the broader scales. At decreasing scales, ES are partitioned as a result of PPT interactions with other biophysical and anthropogenic factors. At intermediate scales, land-use change interacts with PPT, modifying ES partition, as is the case of afforestation in dry regions, where ET and climate regulation may be enhanced at the expense of R and water-flow regulation. At smaller scales, site-specific conditions such as topography interact with PPT and AGB, displaying different ES partition formats. The probable implications of future land-use and climate change on some key ES production and partition are discussed. - Highlights: • The partition of regulatory services in ecosystems poses a major policy challenge. • We examined how partitions occur at the hydrosphere

  5. Partition of some key regulating services in terrestrial ecosystems: Meta-analysis and review

    Viglizzo, E.F.; Jobbágy, E.G.; Ricard, M.F.; Paruelo, J.M.

    2016-01-01

    Our knowledge about the functional foundations of ecosystem service (ES) provision is still limited and more research is needed to elucidate key functional mechanisms. Using a simplified eco-hydrological scheme, in this work we analyzed how land-use decisions modify the partition of some essential regulatory ES by altering basic relationships between biomass stocks and water flows. A comprehensive meta-analysis and review was conducted based on global, regional and local data from peer-reviewed publications. We analyzed five datasets comprising 1348 studies and 3948 records on precipitation (PPT), aboveground biomass (AGB), AGB change, evapotranspiration (ET), water yield (WY), WY change, runoff (R) and infiltration (I). The conceptual framework was focused on ES that are associated with the ecological functions (e.g., intermediate ES) of ET, WY, R and I. ES included soil protection, carbon sequestration, local climate regulation, water-flow regulation and water recharge. To address the problem of data normality, the analysis included both parametric and non-parametric regression analysis. Results demonstrate that PPT is a first-order biophysical factor that controls ES release at the broader scales. At decreasing scales, ES are partitioned as a result of PPT interactions with other biophysical and anthropogenic factors. At intermediate scales, land-use change interacts with PPT, modifying ES partition, as is the case of afforestation in dry regions, where ET and climate regulation may be enhanced at the expense of R and water-flow regulation. At smaller scales, site-specific conditions such as topography interact with PPT and AGB, displaying different ES partition formats. The probable implications of future land-use and climate change on some key ES production and partition are discussed. - Highlights: • The partition of regulatory services in ecosystems poses a major policy challenge. • We examined how partitions occur at the hydrosphere

  6. Preterm Versus Term Children: Analysis of Sedation/Anesthesia Adverse Events and Longitudinal Risk.

    Havidich, Jeana E; Beach, Michael; Dierdorf, Stephen F; Onega, Tracy; Suresh, Gautham; Cravero, Joseph P

    2016-03-01

    Preterm and former preterm children frequently require sedation/anesthesia for diagnostic and therapeutic procedures. Our objective was to determine the age at which children who are born preterm are no longer at increased risk for sedation/anesthesia adverse events. Our secondary objective was to describe the nature and incidence of adverse events. This is a prospective observational study of children receiving sedation/anesthesia for diagnostic and/or therapeutic procedures outside of the operating room by the Pediatric Sedation Research Consortium. A total of 57,227 patients 0 to 22 years of age were eligible for this study. All adverse events and descriptive terms were predefined. Logistic regression and locally weighted scatterplot regression were used for analysis. Preterm and former preterm children had higher adverse event rates (14.7% vs 8.5%) compared with children born at term. Our analysis revealed a biphasic pattern for the development of adverse sedation/anesthesia events. Airway and respiratory adverse events were most commonly reported. MRI scans were the most commonly performed procedures in both categories of patients. Patients born preterm are nearly twice as likely to develop sedation/anesthesia adverse events, and this risk continues up to 23 years of age. We recommend obtaining a birth history during the formulation of an anesthetic/sedation plan, with heightened awareness that preterm and former preterm children may be at increased risk. Further prospective studies focusing on the etiology and prevention of adverse events in former preterm patients are warranted. Copyright © 2016 by the American Academy of Pediatrics.
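
The "nearly twice as likely" summary can be checked directly from the rates reported in the abstract (a back-of-envelope calculation; the paper's logistic regression adjusts for far more than this):

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1.0 - p)

# Adverse-event rates reported in the abstract.
p_preterm, p_term = 0.147, 0.085

odds_ratio = odds(p_preterm) / odds(p_term)
relative_risk = p_preterm / p_term
# Both ratios come out between roughly 1.7 and 1.9, consistent with the
# "nearly twice as likely" summary in the conclusion.
print(round(odds_ratio, 2), round(relative_risk, 2))
```

Note that the odds ratio slightly exceeds the relative risk here, as it always does when the event rates are non-trivial; the two measures are often conflated when reporting "twice as likely".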

  7. Resilience Analysis of Key Update Strategies for Resource-Constrained Networks

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2011-01-01

    Severe resource limitations in certain types of networks lead to various open issues in security. Since such networks usually operate in unattended or hostile environments, revoking the cryptographic keys and establishing (also distributing) new keys – which we refer to as key update – is a criti...

  8. An analysis of post-event processing in social anxiety disorder.

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

    Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construction of repetitive self-focused thoughts that pertain to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  9. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

    One Main Circulation Pump (MCP) trip event is an anticipated transient with an expected frequency of approximately one event per year. There have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of a one-MCP trip event at the Ignalina NPP. According to international practice, transient analysis should consist of deterministic analysis employing best-estimate codes, plus uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. Uncertainty analysis of flow energy loss in different parts of the Main Circulation Circuit, initial conditions and code-selected models was performed. Such analysis allows one to estimate the influence of separate parameters on calculation results and to find the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for the further improvement of the model have been developed. (author)
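
The SUSA-style workflow sketched in the abstract — sample the uncertain inputs, run the model, then rank inputs by their influence on the output — can be illustrated in miniature (the model function below is a hypothetical stand-in, not RELAP5, and the parameter ranges are invented):

```python
import random

def min_loop_flow(flow_loss, coastdown):
    """Hypothetical stand-in for the code response: minimum coolant flow
    (percent of nominal) through the affected loop after one MCP trips."""
    return 100.0 - 30.0 * flow_loss - 10.0 * coastdown

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = lambda vs, m: sum((v - m) ** 2 for v in vs)
    return cov / (var(xs, mx) * var(ys, my)) ** 0.5

# Monte Carlo sampling of the uncertain inputs over assumed ranges.
random.seed(1)
samples = [(random.uniform(0.8, 1.2), random.uniform(0.9, 1.1))
           for _ in range(200)]
outputs = [min_loop_flow(a, b) for a, b in samples]

# Sensitivity ranking: correlation of each uncertain input with the output.
r_flow_loss = pearson([a for a, _ in samples], outputs)
r_coastdown = pearson([b for _, b in samples], outputs)
# |r_flow_loss| dominates, flagging flow energy loss as the key uncertainty.
```

SUSA adds proper tolerance-limit statistics (e.g. the Wilks formula for the number of runs) on top of this basic sample-run-correlate loop, but the ranking idea is the same.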

  10. ANLN functions as a key candidate gene in cervical cancer as determined by integrated bioinformatic analysis

    Xia L

    2018-04-01

    The DEGs PPI network complex contained 305 nodes and 4,962 edges, and 8 clusters were calculated according to k-core = 2. Among them, cluster 1, which had 65 nodes and 1,780 edges, had the highest score among these clusters. In the coexpression analysis, there were 86 hub genes from the Brown module that were chosen for further analysis. Sixty-one key genes were identified as the intersection of the Brown module of WGCNA and the DEGs. In survival analysis, only ANLN was a prognostic factor, and survival was significantly better in the low-expression ANLN group. Conclusion: Our study suggested that ANLN may be a potential tumor oncogene and could serve as a biomarker for predicting the prognosis of cervical cancer patients. Keywords: bioinformatics analysis, cervical cancer, WGCNA, ANLN

  11. A Key Gene, PLIN1, Can Affect Porcine Intramuscular Fat Content Based on Transcriptome Analysis

    Bojiang Li

    2018-04-01

    Intramuscular fat (IMF) content is an important indicator for meat quality evaluation. However, the key genes and molecular regulatory mechanisms affecting IMF deposition remain unclear. In the present study, we identified 75 differentially expressed genes (DEGs) between the higher (H) and lower (L) IMF content of pigs using transcriptome analysis, of which 27 were upregulated and 48 were downregulated. Notably, Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment analysis indicated that the DEG perilipin-1 (PLIN1) was significantly enriched in the fat metabolism-related peroxisome proliferator-activated receptor (PPAR) signaling pathway. Furthermore, we determined the expression patterns and functional role of porcine PLIN1. Our results indicate that PLIN1 was highly expressed in porcine adipose tissue, that its expression level was significantly higher in the H IMF content group than in the L IMF content group, and that expression increased during adipocyte differentiation. Additionally, our results confirm that PLIN1 knockdown decreases the triglyceride (TG) level and lipid droplet (LD) size in porcine adipocytes. Overall, our data identify novel candidate genes affecting IMF content and provide new insight into PLIN1 in porcine IMF deposition and adipocyte differentiation.

  12. A Key Gene, PLIN1, Can Affect Porcine Intramuscular Fat Content Based on Transcriptome Analysis.

    Li, Bojiang; Weng, Qiannan; Dong, Chao; Zhang, Zengkai; Li, Rongyang; Liu, Jingge; Jiang, Aiwen; Li, Qifa; Jia, Chao; Wu, Wangjun; Liu, Honglin

    2018-04-04

    Intramuscular fat (IMF) content is an important indicator for meat quality evaluation. However, the key genes and molecular regulatory mechanisms affecting IMF deposition remain unclear. In the present study, we identified 75 differentially expressed genes (DEGs) between the higher (H) and lower (L) IMF content of pigs using transcriptome analysis, of which 27 were upregulated and 48 were downregulated. Notably, Kyoto Encyclopedia of Genes and Genomes (KEGG) enrichment analysis indicated that the DEG perilipin-1 ( PLIN1 ) was significantly enriched in the fat metabolism-related peroxisome proliferator-activated receptor (PPAR) signaling pathway. Furthermore, we determined the expression patterns and functional role of porcine PLIN1. Our results indicate that PLIN1 was highly expressed in porcine adipose tissue, and its expression level was significantly higher in the H IMF content group when compared with the L IMF content group, and expression was increased during adipocyte differentiation. Additionally, our results confirm that PLIN1 knockdown decreases the triglyceride (TG) level and lipid droplet (LD) size in porcine adipocytes. Overall, our data identify novel candidate genes affecting IMF content and provide new insight into PLIN1 in porcine IMF deposition and adipocyte differentiation.

  13. Mapping the Zambian prison health system: An analysis of key structural determinants.

    Topp, Stephanie M; Moonga, Clement N; Luo, Nkandu; Kaingu, Michael; Chileshe, Chisela; Magwende, George; Henostroza, German

    2017-07-01

    Health and health service access in Zambian prisons are in a state of 'chronic emergency'. This study aimed to identify major structural barriers to strengthening the prison health systems. A case-based analysis drew on key informant interviews (n = 7), memos generated during workshops (n = 4), document review and investigator experience. Structural determinants were defined as national or macro-level contextual and material factors directly or indirectly influencing prison health services. The analysis revealed that despite a favourable legal framework, four major and intersecting structural factors undermined the Zambian prison health system. Lack of health financing was a central and underlying challenge. Weak health governance due to an undermanned prisons health directorate impeded planning, inter-sectoral coordination, and recruitment and retention of human resources for health. Outdated prison infrastructure simultaneously contributed to high rates of preventable disease related to overcrowding and lack of basic hygiene. These findings flag the need for policy and administrative reform to establish strong mechanisms for domestic prison health financing and enable proactive prison health governance, planning and coordination.

  14. Extensive Analysis of Worldwide Events Related to The Construction and Commissioning of Nuclear Power Plants: Lessons Learned and Recommendations

    Noel, M.; Zerger, B.; Vuorio, U.

    2011-01-01

    Lessons learnt from past experience are extensively used to improve the safety of nuclear power plants (NPPs) worldwide. Although the process of analyzing operational experience is now widespread and well developed, the need to establish a similar process for construction experience was highlighted by several countries embarking on construction of new NPPs and in some international forums, including the Working Group on the Regulation of New Reactors (WGRNR) of the OECD-NEA. In 2008, EU Member State Safety Authorities participating in the EU Clearinghouse on Operational Experience Feedback decided to launch a topical study on events related to pre-operational stages of NPPs. The aim of this topical study is to reduce the recurrence of events related to the construction, initial component manufacturing and commissioning of NPPs by identifying the main recurring and safety-significant issues. For this study, 1090 IRS event reports, 857 US Licensee Event Reports (LERs) and approximately 100 WGRNR reports were preselected based on keyword searches and screened. The screening period starts from the beginning of the databases' operation (in the 1980s as far as the IRS and LER databases are concerned) and ends in November 2009. After this initial screening, a total of 582 reports were found applicable (247 IRS reports, 309 LERs and 26 WGRNR reports). Events considered for this study were those initiated before the start of commercial operation and detected before, or even long after, commercial operation. The events have been classified into 3 main categories (construction, manufacturing and commissioning), and into further sub-categories (building structures, metallic liners, electrical components, anchors, I and C, penetrations and building seals, emergency diesel generators, pipes, valves, welds, pumps, etc.) in order to facilitate the detailed analysis, with the final objective to formulate both equipment specific

  15. Radionuclide data analysis in connection of DPRK event in May 2009

    Nikkinen, Mika; Becker, Andreas; Zähringer, Matthias; Polphong, Pornsri; Pires, Carla; Assef, Thierry; Han, Dongmei

    2010-05-01

    The seismic event detected in the DPRK on 25.5.2009 triggered a series of actions within the CTBTO/PTS to ensure its preparedness to detect any radionuclide emissions possibly linked with the event. Despite meticulous work to detect and verify, traces linked to the DPRK event were not found. After three weeks of high alert, the PTS returned to its normal operational routine. This case illuminates the importance of objectivity and a procedural approach in data evaluation. All the data coming from particulate and noble gas stations were evaluated daily, some of the samples even outside office hours and during weekends. Standard procedures were used to determine the network detection thresholds of the key (CTBT-relevant) radionuclides achieved across the DPRK event area and to assess the radionuclides typically occurring at IMS stations (background history). Noble gas systems sometimes record detections that are typical for the sites, due to legitimate activities unrelated to nuclear testing. Therefore, a set of hypotheses was used to test whether a detection was consistent with the event time and location through atmospheric transport modelling. The consistency of event timing and isotopic ratios was also used in the evaluation work. As a result it was concluded that if even 1/1000 of the noble gases from a nuclear detonation had leaked, the IMS system would have had no difficulty detecting it. This case also showed the importance of on-site inspections in verifying the nuclear traces of possible tests.

  16. Systematic Prioritization and Integrative Analysis of Copy Number Variations in Schizophrenia Reveal Key Schizophrenia Susceptibility Genes

    Luo, Xiongjian; Huang, Liang; Han, Leng; Luo, Zhenwu; Hu, Fang; Tieu, Roger; Gan, Lin

    2014-01-01

    Schizophrenia is a common mental disorder with high heritability and strong genetic heterogeneity. The common disease-common variant hypothesis predicts that schizophrenia is attributable in part to common genetic variants. However, recent studies have clearly demonstrated that copy number variations (CNVs) also play pivotal roles in schizophrenia susceptibility and explain a proportion of the missing heritability. Though numerous CNVs have been identified, many of the regions affected by CNVs show poor overlap among different studies, and it is not known whether the genes disrupted by CNVs contribute to the risk of schizophrenia. By using cumulative scoring, we systematically prioritized the genes affected by CNVs in schizophrenia. We identified 8 top genes that are frequently disrupted by CNVs, including NRXN1, CHRNA7, BCL9, CYFIP1, GJA8, NDE1, SNAP29, and GJA5. Integration of genes affected by CNVs with known schizophrenia susceptibility genes (from previous genetic linkage and association studies) reveals that many genes disrupted by CNVs are also associated with schizophrenia. Further protein-protein interaction (PPI) analysis indicates that protein products of genes affected by CNVs frequently interact with known schizophrenia-associated proteins. Finally, systematic integration of CNV prioritization data with genetic association and PPI data identifies key schizophrenia candidate genes. Our results provide a global overview of genes impacted by CNVs in schizophrenia and reveal a densely interconnected molecular network of de novo CNVs in schizophrenia. Though the prioritized top genes represent promising schizophrenia risk genes, further work with different prioritization methods and independent samples is needed to confirm these findings. Nevertheless, the identified key candidate genes may have important roles in the pathogenesis of schizophrenia, and further functional characterization of these genes may provide pivotal targets for future therapeutics and
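
    The cumulative scoring idea described above can be sketched in a few lines: a gene's priority is how often independent studies report it as disrupted. The study gene lists below are hypothetical; only the gene symbols come from the abstract.

```python
from collections import Counter

# Cumulative scoring in miniature: a gene's priority is the number of
# independent CNV studies in which it appears as disrupted.
# The three study lists are invented for illustration.
studies = [
    {"NRXN1", "CHRNA7", "BCL9"},
    {"NRXN1", "CYFIP1"},
    {"NRXN1", "CHRNA7", "NDE1"},
]

score = Counter(gene for study in studies for gene in study)
top = score.most_common(2)
print(top)   # → [('NRXN1', 3), ('CHRNA7', 2)]
```

    In the paper the score would additionally weight linkage, association and PPI evidence; the counting step shown here is only the aggregation skeleton.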

  17. Corrective action program at the Krsko NPP. Trending and analysis of minor events

    Bach, B.; Kavsek, D.

    2007-01-01

    Industry and on-site operating experience has shown that significant events, minor events and near misses all share something in common: latent weaknesses that result in failed barriers, and the same or similar (root) causes for those failures. These types of events differ only in their resulting consequences: minor events and near misses have no immediate or significant impact on plant safety or reliability. However, significant events are usually preceded by a number of such events and could be prevented from occurring if the root cause(s) of these precursor events could be identified and eliminated. It would therefore be poor management to leave minor events and near misses unreported and unanalysed. Reporting and analysing minor events allows detection of latent weaknesses that may indicate the need for improvement. The benefit of low-level event analysis is that deficiencies can be found in barriers that normally go unchallenged, where it may not be known that they would be ineffective in stopping a significant event. In addition, large numbers of minor events and near misses may increase the probability of occurrence of a significant event, which in itself is a sufficient reason for addressing these types of events. However, as it is often neither practical nor feasible to perform a detailed root cause determination for every minor event, trending and trend analysis are used to identify and correct the causes before they result in a significant event. Trending means monitoring changes in the frequency of occurrence of similar minor events. An adverse trend is an increase in the frequency of minor events sorted by commonality, such as common equipment failure, human factors, common or similar causal factors, activity, etc., or a worsening performance of processes being trended. The primary goal of any trending programme should be to identify an adverse trend early enough that the operating organization can initiate an investigation to help

  18. Computer-aided event tree analysis by the impact vector method

    Lima, J.E.P.

    1984-01-01

    In the development of the Probabilistic Risk Analysis of Angra 1, the 'large event tree/small fault tree' approach was adopted for the analysis of the plant behavior in an emergency situation. In this work, the event tree methodology is presented along with the adaptations which had to be made in order to attain a correct description of the safety system performances according to the selected analysis method. The problems appearing in the application of the methodology and their respective solutions are presented and discussed, with special emphasis on the impact vector technique. A description of the ETAP code ('Event Tree Analysis Program') developed for constructing and quantifying event trees is also given in this work. A preliminary version of the small-break LOCA analysis for Angra 1 is presented as an example of application of the methodology and of the code. It is shown that the use of the ETAP code significantly contributes to decreasing the time spent in event tree analyses, making the practical application of the analysis approach referred to above viable. (author) [pt
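
    The quantification step that a code like ETAP automates reduces, for each accident sequence, to multiplying the initiator frequency by the branch probabilities along the path. A minimal sketch, with an invented initiator frequency and invented system reliabilities (the system names HPI and RHR are illustrative, not taken from the Angra 1 study):

```python
from math import prod

# Sequence frequency = initiator frequency x product of branch probabilities.
INIT_FREQ = 1e-2                         # initiating events per year (invented)
SUCCESS = {"HPI": 0.99, "RHR": 0.995}    # branch success probabilities (invented)

def seq_freq(path):
    """path maps system name -> True (success branch) or False (failure branch)."""
    return INIT_FREQ * prod(SUCCESS[s] if ok else 1.0 - SUCCESS[s]
                            for s, ok in path.items())

cd = seq_freq({"HPI": False, "RHR": False})   # both mitigating systems fail
ok = seq_freq({"HPI": True, "RHR": True})     # fully successful sequence
print(f"core-damage sequence: {cd:.2e}/yr")   # → core-damage sequence: 5.00e-07/yr
```

    An impact-vector treatment would additionally track which support-system failures propagate to several branches at once; the path product above is only the base quantification rule.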

  19. Integrated bioinformatics analysis reveals key candidate genes and pathways in breast cancer.

    Wang, Yuzhi; Zhang, Yi; Huang, Qian; Li, Chengwen

    2018-04-19

    Breast cancer (BC) is the leading malignancy in women worldwide, yet relatively little is known about the genes and signaling pathways involved in BC tumorigenesis and progression. The present study aimed to elucidate potential key candidate genes and pathways in BC. Five gene expression profile data sets (GSE22035, GSE3744, GSE5764, GSE21422 and GSE26910) were downloaded from the Gene Expression Omnibus (GEO) database, which included data from 113 tumorous and 38 adjacent non‑tumorous tissue samples. Differentially expressed genes (DEGs) were identified using t‑tests in the limma R package. These DEGs were subsequently investigated by pathway enrichment analysis and a protein‑protein interaction (PPI) network was constructed. The most significant module from the PPI network was selected for pathway enrichment analysis. In total, 227 DEGs were identified, of which 82 were upregulated and 145 were downregulated. Pathway enrichment analysis results revealed that the upregulated DEGs were mainly enriched in 'cell division', the 'proteinaceous extracellular matrix (ECM)', 'ECM structural constituents' and 'ECM‑receptor interaction', whereas downregulated genes were mainly enriched in 'response to drugs', 'extracellular space', 'transcriptional activator activity' and the 'peroxisome proliferator‑activated receptor signaling pathway'. The PPI network contained 174 nodes and 1,257 edges. DNA topoisomerase 2‑a, baculoviral inhibitor of apoptosis repeat‑containing protein 5, cyclin‑dependent kinase 1, G2/mitotic‑specific cyclin‑B1 and kinetochore protein NDC80 homolog were identified as the top 5 hub genes. Furthermore, the genes in the most significant module were predominantly involved in 'mitotic nuclear division', 'mid‑body', 'protein binding' and 'cell cycle'. In conclusion, the DEGs, relative pathways and hub genes identified in the present study may aid in understanding of the molecular mechanisms underlying BC progression and provide
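
    The DEG-calling step described above (per-gene tests with fold-change and adjusted-p thresholds, as limma's pipeline reports) can be sketched on synthetic data; the matrix sizes, shift, and thresholds below are illustrative, not those of the study.

```python
import numpy as np
from scipy import stats

# Toy differential-expression screen: 100 genes x (6 tumour + 6 normal)
# samples; the first 10 genes are simulated as upregulated.
rng = np.random.default_rng(1)
tumour = rng.normal(0.0, 1.0, (100, 6))
tumour[:10] += 3.0
normal = rng.normal(0.0, 1.0, (100, 6))

t, p = stats.ttest_ind(tumour, normal, axis=1)   # per-gene two-sample t-test

# Benjamini-Hochberg false-discovery-rate adjustment.
order = np.argsort(p)
q = p[order] * len(p) / np.arange(1, len(p) + 1)
padj = np.empty_like(p)
padj[order] = np.minimum.accumulate(q[::-1])[::-1]

# Call DEGs with both an adjusted-p and a mean-difference threshold.
deg = np.where((padj < 0.05) & (np.abs(tumour.mean(1) - normal.mean(1)) > 1.0))[0]
print(f"{deg.size} DEGs called")
```

    The moderated t-statistics of limma shrink the per-gene variances before testing; the plain t-test here keeps the sketch self-contained.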

  20. Analysis of events with isolated leptons and missing transverse momentum in ep collisions at HERA

    Brandt, G.

    2007-02-07

    A study of events with isolated leptons and missing transverse momentum in ep collisions is presented. Within the Standard Model (SM) such topologies are expected mainly from production of real W bosons with subsequent leptonic decay. This thesis continues the analysis of such events done in the HERA-1 period, where an excess over the SM prediction was observed for events with high hadronic transverse momentum P_T^X > 25 GeV. New data from the HERA-2 period are added. The analysed data sample recorded in e^+p collisions corresponds to an integrated luminosity of 220 pb^-1, a factor of two more than in the HERA-1 analysis. The e^-p data correspond to 186 pb^-1, a factor of 13 more than in HERA-1. All three lepton generations (electrons, muons and tau leptons) are analysed. In the electron and muon channels a total of 53 events are observed in 406 pb^-1. This compares well to the SM expectation of 53.7±6.5 events, dominated by W production. However, a difference in the event rate is observed for different electron beam charges. In e^+p data the excess of events with P_T^X > 25 GeV is sustained, while the e^-p data agree with the SM. In the tau channel 18 events are observed in all HERA data, with 20±3 expected from the SM. The events are dominated by irreducible background from charged currents. The contribution from W production amounts to about 22%. One event with P_T^X > 25 GeV is observed, where 1.4±0.3 are expected from the SM. (orig.)

  1. Analysis of event tree with imprecise inputs by fuzzy set theory

    Ahn, Kwang Il; Chun, Moon Hyun

    1990-01-01

    Fuzzy set theory approach is proposed as a method to analyze event trees with imprecise or linguistic input variables such as 'likely' or 'improbable' instead of the numerical probability. In this paper, it is shown how the fuzzy set theory can be applied to the event tree analysis. The result of this study shows that the fuzzy set theory approach can be applied as an acceptable and effective tool for analysis of the event tree with fuzzy type of inputs. Comparisons of the fuzzy theory approach with the probabilistic approach of computing probabilities of final states of the event tree through subjective weighting factors and LHS technique show that the two approaches have common factors and give reasonable results
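
    The core of the fuzzy event tree idea above can be sketched with triangular fuzzy numbers (low, mode, high) standing in for linguistic probabilities. The mapping of words such as "likely" to numbers, and the component-wise product approximation, are illustrative assumptions, not the paper's exact operators.

```python
# Triangular fuzzy numbers as (low, mode, high) triples.
def fmul(a, b):
    """Component-wise approximation of the product of triangular fuzzy numbers."""
    return tuple(x * y for x, y in zip(a, b))

def fcomp(a):
    """Fuzzy complement 1 - a, with endpoints swapped so low <= high."""
    lo, mode, hi = a
    return (1 - hi, 1 - mode, 1 - lo)

# Hypothetical linguistic inputs mapped to triangular numbers.
P_FAIL_A = (0.01, 0.05, 0.10)   # "improbable" failure of system A
P_FAIL_B = (0.60, 0.75, 0.90)   # "likely" failure of system B

damage = fmul(P_FAIL_A, P_FAIL_B)                  # sequence: both systems fail
success = fmul(fcomp(P_FAIL_A), fcomp(P_FAIL_B))   # sequence: both succeed
print(tuple(round(x, 4) for x in damage))          # → (0.006, 0.0375, 0.09)
```

    The fuzzy end-state probability keeps the spread of the inputs, which is exactly what allows comparison with the subjective-weighting and LHS results mentioned in the abstract.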

  2. Analysis of Unmanned Aerial Vehicle (UAV) hyperspectral remote sensing monitoring key technology in coastal wetland

    Ma, Yi; Zhang, Jie; Zhang, Jingyu

    2016-01-01

    The coastal wetland, a transitional zone between terrestrial and marine ecosystems, is of great value for ecosystem services. Over the past three decades, the area of coastal wetland has been decreasing and its ecological function gradually degrading with the rapid development of the economy, which in turn restricts the sustainable economic and social development of China's coastal areas. Monitoring coastal wetlands, and mastering their distribution and dynamic change, is therefore a major national demand. The UAV, or unmanned aerial vehicle, is a new platform for remote sensing. Compared with traditional satellite and manned aerial remote sensing, it offers flexible deployment, freedom from cloud cover, strong initiative and low cost. Combined imaging and spectroscopy is a defining characteristic of hyperspectral remote sensing: at the same time as imaging, the spectral curve of each pixel is obtained, which is suitable for quantitative remote sensing, fine classification and target detection. Addressing this frontier of remote sensing monitoring technology and the demands of coastal wetland monitoring, this paper uses a UAV carrying a new hyperspectral imaging instrument to analyse the key technologies of monitoring coastal wetlands by UAV, on the basis of the current situation overseas and domestically and an analysis of development trends. According to the characteristics of airborne hyperspectral data from UAVs, that is, "three high and one many", the following key research topics are proposed: 1) atmospheric correction of UAV hyperspectral data over coastal wetlands under complex underlying surfaces and variable geometry; 2) the optimal observation scale of the UAV platform and scale-transformation methods when monitoring coastal wetland features; 3) classification and detection methods of typical features with high precision from multi scale

  3. Markov chains and semi-Markov models in time-to-event analysis.

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
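
    The multi-state idea described above can be made concrete with a discrete-time illness-death chain: an absorbing state represents death, and the state occupancy after n steps gives survival directly. All transition probabilities below are hypothetical.

```python
import numpy as np

# Three-state illness-death model: 0 = event-free, 1 = ill, 2 = dead (absorbing).
# Hypothetical one-year transition probabilities; rows sum to 1.
P = np.array([
    [0.85, 0.10, 0.05],
    [0.00, 0.70, 0.30],
    [0.00, 0.00, 1.00],
])

def state_occupancy(p0, P, n):
    """Distribution over states after n steps of the chain."""
    return p0 @ np.linalg.matrix_power(P, n)

p0 = np.array([1.0, 0.0, 0.0])        # cohort starts event-free
dist5 = state_occupancy(p0, P, 5)
survival5 = 1.0 - dist5[2]            # P(not yet absorbed by year 5)
print(f"5-year survival: {survival5:.3f}")   # → 5-year survival: 0.627
```

    Unlike a single Kaplan-Meier curve, the same matrix also yields the probability of being in the intermediate "ill" state at any time, which is how competing risks and recurrent transitions are accommodated.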

  4. Analysis of human error and organizational deficiency in events considering risk significance

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

    In this study, we analyzed human and organizational deficiencies in the trip events of Korean nuclear power plants. K-HPES items were used in human error analysis, and the organizational factors by Jacobs and Haber were used for organizational deficiency analysis. We proposed the use of CCDP as a risk measure to consider risk information in prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the process of analysis is necessary for effective enhancement of nuclear power plant safety by focusing on causes of human error and organizational deficiencies that are associated with significant risk
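
    The prioritization step proposed above, using CCDP as the risk measure, amounts to summing each causal factor's CCDP contributions over the events in which it appeared and ranking. The event records and CCDP values below are invented to illustrate the idea.

```python
# Rank causal factors by summed conditional core damage probability (CCDP)
# over the trip events in which they appeared (illustrative records).
events = [
    {"ccdp": 3e-5, "factors": ["procedure", "training"]},
    {"ccdp": 1e-6, "factors": ["communication"]},
    {"ccdp": 8e-5, "factors": ["procedure"]},
]

score = {}
for ev in events:
    for factor in ev["factors"]:
        score[factor] = score.get(factor, 0.0) + ev["ccdp"]

ranked = sorted(score.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0][0])   # → procedure
```

    Ranking by summed CCDP rather than by raw event counts is what lets a rarely occurring but risk-significant factor outrank a frequent but benign one.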

  5. Regression analysis of mixed recurrent-event and panel-count data.

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L

    2014-07-01

    In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
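
    The mixed-data setting can be illustrated in the simplest special case, a homogeneous Poisson process with a common rate: continuously monitored subjects contribute exact event counts over follow-up, panel subjects contribute counts per assessment interval, and both enter one pooled likelihood whose MLE is total events over total observed time. The rates, follow-up times and sample sizes below are simulated assumptions, far simpler than the paper's regression models.

```python
import numpy as np

rng = np.random.default_rng(2)
lam_true, T, n = 0.8, 10.0, 200

# Recurrent-event data: exact counts over [0, T] for continuously monitored subjects.
cont_counts = rng.poisson(lam_true * T, size=n)

# Panel-count data: only counts within assessment intervals are observed.
cuts = np.array([0.0, 3.0, 7.0, 10.0])
panel_counts = rng.poisson(lam_true * np.diff(cuts), size=(n, 3))

# Pooled Poisson likelihood: MLE of the rate = total events / total observed time.
events = cont_counts.sum() + panel_counts.sum()
exposure = n * T + n * T
lam_hat = events / exposure
print(f"estimated rate: {lam_hat:.3f} (true {lam_true})")
```

    With covariates and a nonparametric baseline mean function, the estimating-equation machinery of the paper replaces this closed form, but the pooling of the two observation schemes is the same.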

  6. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied on the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, error characteristics exhibit dependency on the flood type. Generally, timing of the event precipitation mass center and dispersion of the time series derived from satellite precipitation exhibits good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and the rain flood events with a high runoff coefficient. This event-based analysis of the satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
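
    The event-pair metrics named above (multiplicative bias as the systematic component, residual scatter as the random component, and the precipitation mass-centre timing) can be sketched on one invented event pair; the rain-rate values are placeholders, not data from the study.

```python
import numpy as np

# One matched event pair: reference gauge vs satellite hyetograph (mm/h, hourly).
ref = np.array([0.0, 2.0, 5.0, 9.0, 6.0, 3.0, 1.0, 0.0])
sat = np.array([0.0, 1.5, 3.8, 7.0, 5.1, 2.6, 0.8, 0.0])

bias = sat.sum() / ref.sum()              # systematic (multiplicative) error
random_err = np.std(sat / bias - ref)     # residual scatter after bias removal

t = np.arange(ref.size)                   # hours from event start
tc_ref = (t * ref).sum() / ref.sum()      # precipitation mass centre (timing)
tc_sat = (t * sat).sum() / sat.sum()
print(f"bias={bias:.2f}, timing shift={tc_sat - tc_ref:+.2f} h")
```

    Here the satellite underestimates the cumulative depth (bias < 1) while the mass-centre timing agrees closely, the qualitative pattern the abstract reports.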

  7. Making sense of root cause analysis investigations of surgery-related adverse events.

    Cassin, Bryce R; Barach, Paul R

    2012-02-01

    This article discusses the limitations of root cause analysis (RCA) for surgical adverse events. Making sense of adverse events involves an appreciation of the unique features in a problematic situation, which resist generalization to other contexts. The top priority of adverse event investigations must be to inform the design of systems that help clinicians to adapt and respond effectively in real time to undesirable combinations of design, performance, and circumstance. RCAs can create opportunities in the clinical workplace for clinicians to reflect on local barriers and identify enablers of safe and reliable outcomes. Copyright © 2012 Elsevier Inc. All rights reserved.

  8. Analysis of external flooding events occurred in foreign nuclear power plant sites

    Li Dan; Cai Hankun; Xiao Zhi; An Hongzhen; Mao Huan

    2013-01-01

    This paper screens and studies 17 external flooding events that occurred at foreign NPP sites, and analyses the characteristics of these events based on the source of the flooding, the impact on buildings, systems and equipment, as well as the threat to nuclear safety. Furthermore, based on the experiences and lessons learned from the Fukushima nuclear accident relating to external flooding and the countermeasures carried out around the world, some suggestions are proposed to improve the external flooding response capacity of Chinese NPPs. (authors)

  9. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  10. Analysis of double random phase encryption from a key-space perspective

    Monaghan, David S.; Situ, Guohai; Ryle, James; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    The main advantage of the double random phase encryption technique is its physical implementation; however, to allow us to analyse its behaviour we perform the encryption/decryption numerically. A typically strong encryption scheme will have an extremely large key-space, which makes the probable success of any brute-force attack on that algorithm minuscule. Traditionally, designers of optical image encryption systems only demonstrate how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. We analyse this algorithm from a key-space perspective. The key-space of an encryption algorithm can be defined as the set of possible keys that can be used to encode data using that algorithm. For a range of problem instances we plot the distribution of decryption errors in the key-space, indicating the lack of feasibility of a simple brute force attack.
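
    The numerical encryption/decryption described above can be sketched with two FFTs: a random phase mask in the input plane and a second in the Fourier plane. The image size, keys, and error metric (NMSE) below are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
img = rng.random((N, N))                  # plaintext intensity image

def encrypt(img, k1, k2):
    # DRPE: phase mask k1 in the input plane, k2 in the Fourier plane.
    return np.fft.fft2(img * np.exp(2j * np.pi * k1)) * np.exp(2j * np.pi * k2)

def decrypt(cipher, k1, k2):
    field = np.fft.ifft2(cipher * np.exp(-2j * np.pi * k2))
    return np.abs(field * np.exp(-2j * np.pi * k1))

k1, k2 = rng.random((N, N)), rng.random((N, N))
cipher = encrypt(img, k1, k2)

good = decrypt(cipher, k1, k2)                                  # correct keys
bad = decrypt(cipher, rng.random((N, N)), rng.random((N, N)))   # brute-force guess

nmse = lambda a, b: np.sum((a - b) ** 2) / np.sum(b ** 2)
print(f"correct-key NMSE {nmse(good, img):.1e}, wrong-key NMSE {nmse(bad, img):.2f}")
```

    Repeating the wrong-key decryption over many sampled keys gives the decryption-error distribution over the key-space that the paper plots: a random guess yields speckle-like output with large error, while only the correct phase masks recover the image.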

  11. Analysis and Verification of a Key Agreement Protocol over Cloud Computing Using Scyther Tool

    Hazem A Elbaz

    2015-01-01

    Most cloud computing authentication mechanisms use public key infrastructure (PKI). Hierarchical Identity Based Cryptography (HIBC) has several advantages that align well with the demands of cloud computing. The main objectives of cloud computing authentication protocols are security and efficiency. In this paper, we describe the Hierarchical Identity Based Authentication Key Agreement (HIB-AKA) protocol, providing a lightweight key management approach for cloud computing users. Then, we...

  12. Root-Cause Analysis of a Potentially Sentinel Transfusion Event: Lessons for Improvement of Patient Safety

    Ali Reza Jeddian

    2012-09-01

    Full Text Available Error prevention and patient safety in transfusion medicine are a serious concern. Errors can occur at any step of transfusion, and evaluation of their root causes can be helpful for preventive measures. Root cause analysis, as a structured and systematic approach, can be used to identify the underlying causes of adverse events. To specify system vulnerabilities and illustrate the potential of such an approach, we describe the root cause analysis of a case of transfusion error in an emergency ward that could have been fatal. After the event was reported, the details of the incident were elaborated through a review of records and interviews with the responsible personnel. Then, an expert panel meeting was held to define the event timeline and the care and service delivery problems, and to discuss their underlying causes, safeguards and preventive measures. Root cause analysis of the event demonstrated that certain defects of the system and the ensuing errors were the main causes of the event. It also pointed out systematic corrective actions. It can be concluded that health care organizations should endeavor to provide opportunities to discuss errors and adverse events and to introduce preventive measures, identifying areas where resources need to be allocated to improve patient safety.

  13. KEY ELEMENTS OF CHARACTERIZING SAVANNAH RIVER SITE HIGH LEVEL WASTE SLUDGE INSOLUBLES THROUGH SAMPLING AND ANALYSIS

    Reboul, S; Barbara Hamm, B

    2007-01-01

    Characterization of HLW is a prerequisite for effective planning of HLW disposition and site closure performance assessment activities. Adequate characterization typically requires application of a combination of data sources, including process knowledge, theoretical relationships, and real-waste analytical data. Consistently obtaining high quality real-waste analytical data is a challenge, particularly for HLW sludge insolubles, due to the inherent complexities associated with matrix heterogeneities, sampling access limitations, radiological constraints, analyte loss mechanisms, and analyte measurement interferences. Understanding how each of these complexities affects the analytical results is the first step to developing a sampling and analysis program that provides characterization data that are both meaningful and adequate. A summary of the key elements impacting SRS HLW sludge analytical data uncertainties is presented in this paper, along with guidelines for managing each of the impacts. The particular elements addressed include: (a) sample representativeness; (b) solid/liquid phase quantification effectiveness; (c) solids dissolution effectiveness; (d) analyte cross contamination, loss, and tracking; (e) dilution requirements; (f) interference removal; (g) analyte measurement technique; and (h) analytical detection limit constraints. A primary goal of understanding these elements is to provide a basis for quantifying total propagated data uncertainty

  14. Security analysis of an untrusted source for quantum key distribution: passive approach

    Zhao Yi; Qi Bing; Lo, H-K; Qian Li

    2010-01-01

    We present a passive approach to the security analysis of quantum key distribution (QKD) with an untrusted source. A complete proof of its unconditional security is also presented. This scheme has significant advantages in real-life implementations as it does not require fast optical switching or a quantum random number generator. The essential idea is to use a beam splitter to split each input pulse. We show that we can characterize the source using a cross-estimate technique without active routing of each pulse. We have derived analytical expressions for the passive estimation scheme. Moreover, using simulations, we have considered four real-life imperfections: additional loss introduced by the 'plug and play' structure, inefficiency of the intensity monitor, noise of the intensity monitor, and statistical fluctuation introduced by finite data size. Our simulation results show that the passive estimate of an untrusted source remains useful in practice, despite these four imperfections. Also, we have performed preliminary experiments, confirming the utility of our proposal in real-life applications. Our proposal makes it possible to implement the 'plug and play' QKD with the security guaranteed, while keeping the implementation practical.

  15. Analysis of the key issues in establishing the Russian-based high technological companies

    V. G. Zinov

    2016-01-01

    Full Text Available The article analyses the reasons that prevent a competitive, export-oriented, domestically developed Russian robotic surgical complex from being industrially prototyped and serially released for the domestic and global markets. The development was completed within the framework of a «priority project», which was also highlighted as significant in the forecast analysis of the scientific and technological development of Russia for the period up to 2030, in the road maps of the National Technological Initiative and in other strategic documents. The project was financed by nearly all development funds and institutes in Russia. A conclusion is drawn that the key reasons behind the significant increase in the time needed to bring a high-tech product to market are the absence of a target setting and a programme of coordinated actions from the committed ministries and governmental agencies, as well as the lack of a large high-tech company in Russia capable of producing the export-oriented products in industrial volumes and ensuring their global sales.

  16. Microarray analysis reveals key genes and pathways in Tetralogy of Fallot

    He, Yue-E; Qiu, Hui-Xian; Jiang, Jian-Bing; Wu, Rong-Zhou; Xiang, Ru-Lian; Zhang, Yuan-Hai

    2017-01-01

    The aim of the present study was to identify key genes that may be involved in the pathogenesis of Tetralogy of Fallot (TOF) using bioinformatics methods. The GSE26125 microarray dataset, which includes cardiovascular tissue samples derived from 16 children with TOF and five healthy age-matched control infants, was downloaded from the Gene Expression Omnibus database. Differential expression analysis was performed between TOF and control samples to identify differentially expressed genes (DEGs) using Student's t-test and the R/limma package, with a log2 fold-change of >2 and a false discovery rate of <0.01 set as thresholds. The biological functions of DEGs were analyzed using the ToppGene database. The ReactomeFIViz application was used to construct functional interaction (FI) networks, and the genes in each module were subjected to pathway enrichment analysis. The iRegulon plugin was used to identify transcription factors predicted to regulate the DEGs in the FI network, and the gene-transcription factor pairs were then visualized using Cytoscape software. A total of 878 DEGs were identified, including 848 upregulated genes and 30 downregulated genes. The gene FI network contained seven function modules, which were all comprised of upregulated genes. Genes enriched in Module 1 were enriched in the following three neurological disorder-associated signaling pathways: Parkinson's disease, Alzheimer's disease and Huntington's disease. Genes in Modules 0, 3 and 5 were dominantly enriched in pathways associated with ribosomes and protein translation. The X-box binding protein 1 transcription factor was demonstrated to be involved in the regulation of genes encoding the subunits of cytoplasmic and mitochondrial ribosomes, as well as genes involved in neurodegenerative disorders. Therefore, dysfunction of genes involved in signaling pathways associated with neurodegenerative disorders, ribosome function and protein translation may contribute to the pathogenesis of TOF.
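
    The two DEG thresholds used above (|log2 fold-change| > 2 and FDR < 0.01) can be sketched in plain Python. The Benjamini-Hochberg step below is a generic implementation and the per-gene statistics are invented for illustration; the study itself used the R/limma pipeline:

```python
def benjamini_hochberg(pvals):
    """Return Benjamini-Hochberg adjusted p-values (q-values) in input order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    prev = 1.0
    # Walk from the largest p-value down, enforcing monotonicity.
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        prev = min(prev, pvals[i] * m / rank)
        adjusted[i] = prev
    return adjusted

# Invented per-gene statistics: (gene, log2 fold-change, raw p-value).
genes = [("A", 3.1, 1e-6), ("B", 0.4, 2e-7), ("C", -2.5, 5e-4), ("D", 2.9, 0.2)]
qvals = benjamini_hochberg([p for _, _, p in genes])
degs = [name for (name, lfc, _), q in zip(genes, qvals)
        if abs(lfc) > 2 and q < 0.01]
print(degs)  # → ['A', 'C']
```

    Gene B is highly significant but has too small a fold-change, and gene D has a large fold-change without significance; only A and C pass both filters.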

  17. Relation of air mass history to nucleation events in Po Valley, Italy, using back trajectories analysis

    L. Sogacheva

    2007-01-01

    In this paper, we study the transport of air masses to San Pietro Capofiume (SPC) in Po Valley, Italy, by means of back trajectory analysis. Our main aim is to investigate whether air masses originate over different regions on nucleation event days and on nonevent days, during three years when nucleation events have been continuously recorded at SPC. The results indicate that nucleation events occur frequently in air masses arriving from Central Europe, whereas event frequency is much lower in air transported from southern directions and from the Atlantic Ocean. We also analyzed the behaviour of meteorological parameters during the 96 h transport to SPC, and found that, on average, event trajectories undergo stronger subsidence during the last 12 h before arrival at SPC than nonevent trajectories. This causes a reversal in the temperature and relative humidity (RH) differences between event and nonevent trajectories: between 96 and 12 h back time, temperature is lower and RH is higher for event than nonevent trajectories, and between 12 and 0 h vice versa. Boundary layer mixing is stronger along the event trajectories than along the nonevent trajectories. The absolute humidity (AH) is similar for the event and nonevent trajectories between about 96 h and about 60 h back time, but after that the AH of the event trajectories becomes lower due to stronger rain. We also studied the transport of SO2 to SPC, and conclude that although sources in Po Valley most probably dominate the measured concentrations, certain Central and Eastern European sources also make a substantial contribution.

  18. Key Process Uncertainties in Soil Carbon Dynamics: Comparing Multiple Model Structures and Observational Meta-analysis

    Sulman, B. N.; Moore, J.; Averill, C.; Abramoff, R. Z.; Bradford, M.; Classen, A. T.; Hartman, M. D.; Kivlin, S. N.; Luo, Y.; Mayes, M. A.; Morrison, E. W.; Riley, W. J.; Salazar, A.; Schimel, J.; Sridhar, B.; Tang, J.; Wang, G.; Wieder, W. R.

    2016-12-01

    Soil carbon (C) dynamics are crucial to understanding and predicting C cycle responses to global change and soil C modeling is a key tool for understanding these dynamics. While first order model structures have historically dominated this area, a recent proliferation of alternative model structures representing different assumptions about microbial activity and mineral protection is providing new opportunities to explore process uncertainties related to soil C dynamics. We conducted idealized simulations of soil C responses to warming and litter addition using models from five research groups that incorporated different sets of assumptions about processes governing soil C decomposition and stabilization. We conducted a meta-analysis of published warming and C addition experiments for comparison with simulations. Assumptions related to mineral protection and microbial dynamics drove strong differences among models. In response to C additions, some models predicted long-term C accumulation while others predicted transient increases that were counteracted by accelerating decomposition. In experimental manipulations, doubling litter addition did not change soil C stocks in studies spanning as long as two decades. This result agreed with simulations from models with strong microbial growth responses and limited mineral sorption capacity. In observations, warming initially drove soil C loss via increased CO2 production, but in some studies soil C rebounded and increased over decadal time scales. In contrast, all models predicted sustained C losses under warming. The disagreement with experimental results could be explained by physiological or community-level acclimation, or by warming-related changes in plant growth. In addition to the role of microbial activity, assumptions related to mineral sorption and protected C played a key role in driving long-term model responses. In general, simulations were similar in their initial responses to perturbations but diverged over
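
    The structural difference the abstract describes, first-order models whose stocks track inputs versus microbial models whose stocks are buffered against them, can be illustrated with a toy Euler integration. All parameters below are arbitrary assumptions chosen only to expose the contrast:

```python
def first_order_steady(inputs, k=0.05, s0=10.0, dt=0.1, steps=50_000):
    """dS/dt = I - k*S: steady-state stock scales linearly with inputs."""
    s = s0
    for _ in range(steps):
        s += dt * (inputs - k * s)
    return s

def microbial_steady(inputs, vmax=1.0, km=50.0, eps=0.5, d=0.02,
                     s0=10.0, m0=1.0, dt=0.1, steps=50_000):
    """dS/dt = I - Vmax*S*M/(Km+S); dM/dt = eps*uptake - d*M.
    Steady-state S is set by microbial turnover, not by inputs:
    the microbial pool M simply grows until it consumes the supply."""
    s, m = s0, m0
    for _ in range(steps):
        uptake = vmax * s * m / (km + s)
        s += dt * (inputs - uptake)
        m += dt * (eps * uptake - d * m)
    return s

print(first_order_steady(1.0), first_order_steady(2.0))  # stock doubles
print(microbial_steady(1.0), microbial_steady(2.0))      # stock barely moves
```

    Doubling litter inputs doubles the first-order stock but leaves the microbial-model stock essentially unchanged, which is the qualitative behaviour the litter-addition experiments favoured.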

  19. Analysis of internal events for the Unit 1 of the Laguna Verde Nuclear Power Station. Appendixes

    Huerta B, A.; Lopez M, R.

    1995-01-01

    This volume contains the appendices for the accident sequence analysis of internally initiated events for the Laguna Verde Unit 1 Nuclear Power Plant. Appendix A presents the comments raised by the Sandia National Laboratories technical staff as a result of their review of the Internal Event Analysis for the Laguna Verde Unit 1 Nuclear Power Plant. This review was performed during a joint Sandia/CNSNS multi-day meeting at the end of 1992. Also included is a brief evaluation of the applicability of these comments to the present study. Appendix B presents the printed fault tree models for each of the systems included and analyzed in the Internal Event Analysis for LVNPP. Appendix C presents the outputs of the TEMAC code, used for the quantification of the dominant accident sequences as well as for the final core damage evaluation. (Author)

  1. Sources of Error and the Statistical Formulation of M_S : m_b Seismic Event Screening Analysis

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H_0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted m_b), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with m_b greater than 3.5. The Rayleigh-wave magnitude (denoted M_S) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to m_b, earthquakes generally have a larger M_S magnitude than explosions. This article proposes a hypothesis test (screening analysis) using M_S and m_b that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the nuclear weapon test announced by the Democratic People's Republic of Korea in 2009 fails to reject the null hypothesis H_0: explosion characteristics.
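
    The screening test has the following schematic shape. The numbers here (the explosion-population mean of M_S - m_b, the two variance components, and the significance level) are invented placeholders, and the real IDC procedure is more elaborate; the point is only how a model-inadequacy term enters the standard error:

```python
import math

def ms_mb_screen(ms, mb, mu0=-0.64, sigma_path=0.2, sigma_model=0.15,
                 alpha=0.05):
    """One-sided test of H_0: explosion characteristics.

    Earthquakes tend to have larger M_S - m_b than explosions, so a large
    standardized score is evidence against H_0.  sigma_model inflates the
    standard error to account for physical-correction-model inadequacy.
    """
    se = math.hypot(sigma_path, sigma_model)   # combined standard error
    z = (ms - mb - mu0) / se
    p = 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # P(Z > z)
    return z, p, p < alpha  # True -> screened out as earthquake-like

# Explosion-like event: M_S well below m_b, H_0 is not rejected.
print(ms_mb_screen(ms=3.6, mb=4.5))
# Earthquake-like event: M_S comparable to m_b, H_0 is rejected.
print(ms_mb_screen(ms=4.6, mb=4.4))
```

    Leaving sigma_model out of `se` would overstate the confidence of the screen, which is exactly the inadequacy the abstract addresses.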

  2. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD events risk (0.91, 0.85 to 0.97 per 0.1 mEq/L; P_nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P_nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD events risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
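
    Pooling of relative risks like those above is typically done on the log scale with inverse-variance weights. Below is a minimal DerSimonian-Laird random-effects sketch with invented study data (the actual paper pooled 19 studies and additionally fitted dose-response models):

```python
import math

def dersimonian_laird(log_rrs, variances):
    """Pool log relative risks with DerSimonian-Laird random effects."""
    k = len(log_rrs)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    return math.exp(pooled), (math.exp(lo), math.exp(hi)), tau2

# Invented per-study log RRs (highest vs. lowest intake) and their variances.
log_rrs = [math.log(0.80), math.log(0.90), math.log(0.75), math.log(1.02)]
variances = [0.010, 0.020, 0.015, 0.030]
rr, ci, tau2 = dersimonian_laird(log_rrs, variances)
print(round(rr, 2), tuple(round(x, 2) for x in ci))  # pooled RR below 1
```

    When the heterogeneity statistic Q is below its degrees of freedom, tau2 is truncated to zero and the estimate collapses to the fixed-effect answer, as in this toy example.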

  3. Superposed epoch analysis of O+ auroral outflow during sawtooth events and substorms

    Nowrouzi, N.; Kistler, L. M.; Lund, E. J.; Cai, X.

    2017-12-01

    Sawtooth events are repeated injections of energetic particles at geosynchronous orbit. Studies have shown that 94% of sawtooth events occurred during magnetic storm times. The main factor that causes a sawtooth event is still an open question. Simulations have suggested that heavy ions like O+ may play a role in triggering the injections. One of the sources of the O+ in the Earth's magnetosphere is the nightside aurora. O+ ions coming from the nightside auroral region have direct access to the near-Earth magnetotail. A model (Brambles et al. 2013) for interplanetary coronal mass ejection driven sawtooth events found that nightside O+ outflow caused the subsequent teeth of the sawtooth event through a feedback mechanism. This work is a superposed epoch analysis to test whether the observed auroral outflow supports this model. Using FAST spacecraft data from 1997-2007, we examine the auroral O+ outflow as a function of time relative to an injection onset. We then determine whether the profile of the O+ outflow flux during sawtooth events differs from the outflow observed during isolated substorms. The auroral region boundaries are estimated using the method of Andersson et al. (2004). Subsequently, the O+ outflow flux inside these boundaries is calculated and binned as a function of superposed epoch time for substorms and sawtooth "teeth". In this way, we will determine whether sawtooth events do in fact have greater O+ outflow, and whether that outflow is predominantly from the nightside, as suggested by the model results.
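
    Superposed epoch analysis itself is simple to state: cut a window around each onset and average the windows sample-by-sample. A generic sketch with synthetic data (the onset times, bump amplitude, and noise level are invented):

```python
import random

def superposed_epoch(series, onsets, before, after):
    """Average a time series over windows aligned on epoch onsets.

    Returns the mean profile over epoch times [-before, after).
    Windows running off either end of the series are skipped.
    """
    width = before + after
    windows = [series[t - before:t + after]
               for t in onsets
               if t - before >= 0 and t + after <= len(series)]
    n = len(windows)
    return [sum(w[i] for w in windows) / n for i in range(width)]

# Synthetic outflow-flux series: background ~1.0, enhanced after each onset.
rng = random.Random(0)
onsets = [100, 300, 550, 800]
series = [1.0 + rng.gauss(0.0, 0.1) for _ in range(1000)]
for t in onsets:
    for dt in range(20):
        series[t + dt] += 2.0

profile = superposed_epoch(series, onsets, before=50, after=50)
print(profile[25], profile[55])  # ~1.0 well before onset, ~3.0 just after
```

    Comparing such mean profiles for sawtooth teeth versus isolated substorms is exactly the test described above.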

  4. Regression analysis of mixed panel count data with dependent terminal events.

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types described earlier, may occur, and furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrences of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to the childhood cancer study that motivated this work. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Analysis of adverse events occurred at overseas nuclear power plants in 2003

    Miyazaki, Takamasa; Sato, Masahiro; Takagawa, Kenichi; Fushimi, Yasuyuki; Shimada, Hiroki; Shimada, Yoshio

    2004-01-01

    The adverse events that have occurred at overseas nuclear power plants can be studied to provide an indication of how to improve the safety and reliability of nuclear power plants in Japan. The Institute of Nuclear Safety Systems (INSS) obtains information related to overseas adverse events and incidents and, by evaluating it, proposes improvements to prevent similar occurrences at Japanese PWR plants. In 2003, INSS obtained approximately 2800 pieces of information and, by evaluating them, proposed nine recommendations to Japanese utilities. This report summarizes the evaluation activity and the tendency analysis based on the individual events analyzed in 2003. The tendency analysis was undertaken on about 1600 analyzed events, from the viewpoints of mechanics, electrics, instrumentation and control, and operations, covering the causes, countermeasures, affected equipment, and the possible lessons learnt from overseas events. The report shows the overall tendency of overseas events and incidents, with the aim of improving the safety and reliability of domestic PWR plants. (author)

  6. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information”, requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not reportable events) identified at INL during the period of October 2012 through September 2013.

  7. Analysis of events occurred at overseas nuclear power plants in 2004

    Miyazaki, Takamasa; Nishioka, Hiromasa; Sato, Masahiro; Chiba, Gorou; Takagawa, Kenichi; Shimada, Hiroki

    2005-01-01

    The Institute of Nuclear Safety Systems (INSS) investigates information related to events and incidents occurring at overseas nuclear power plants, and, by evaluating it, proposes recommendations for the improvement of the safety and reliability of domestic PWR plants. Following the 2003 report, this report summarizes the evaluation activity and the tendency analysis based on about 2800 pieces of information obtained in 2004. The tendency analysis was undertaken on about 1700 analyzed events, from the viewpoints of mechanics, electrics, and operations, covering the causes, affected equipment, and so on. (author)

  8. Using high complexity analysis to probe the evolution of organic aerosol during pollution events in Beijing

    Hamilton, J.; Dixon, W.; Dunmore, R.; Squires, F. A.; Swift, S.; Lee, J. D.; Rickard, A. R.; Sun, Y.; Xu, W.

    2017-12-01

    There is increasing evidence that exposure to air pollution results in significant impacts on human health. In Beijing, home to over 20 million inhabitants, particulate matter levels are very high by international standards, with official estimates of an annual mean PM2.5 concentration in 2014 of 86 μg m-3, nearly 9 times higher than the WHO guideline. Changes in particle composition during pollution events provide key information on sources and can be used to inform strategies for pollution mitigation and health benefits. The organic fraction of PM is an extremely complex mixture reflecting the diversity of sources to the atmosphere. In this study we attempt to harness the chemical complexity of OA by developing an extensive database of over 700 mass spectra, built using literature data and source-specific tracers (e.g. diesel emission characterisation experiments and SOA generated in chamber simulations). Using a high throughput analysis method (15 min), involving UHPLC coupled to Orbitrap mass spectrometry, chromatograms are integrated, compared to the library, and a list of identified compounds produced. Purpose-built software based on R is used to automatically produce time series, alongside common aerosol metrics and data visualisation techniques, dramatically reducing analysis times. Offline measurements of organic aerosol composition were made as part of the Sources and Emissions of Air Pollutants in Beijing project, a collaborative program between leading UK and Chinese research groups. Rather than studying only a small number of 24 hr PM samples, we collected 250 filter samples at a range of different time resolutions, from 30 minutes to 12 hours, depending on the time of day and PM loadings. In total 643 species were identified based on their elemental formula and retention time, with species ranging from C2 to C22 and containing between 1 and 13 oxygens. A large fraction of the OA species observed were organosulfates and/or nitrates. Here we will present
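
    The library-matching step described here can be sketched as a lookup on (elemental formula, retention time) with a retention-time tolerance. The entries and the tolerance below are invented for illustration; the study's own database holds over 700 spectra:

```python
def match_library(peaks, library, rt_tol=0.2):
    """Match detected peaks to library entries by elemental formula and
    retention time (minutes); a hit needs both to agree within tolerance."""
    hits = []
    for formula, rt in peaks:
        for lib_formula, lib_rt, name in library:
            if formula == lib_formula and abs(rt - lib_rt) <= rt_tol:
                hits.append((formula, rt, name))
    return hits

# Invented library entries: (formula, retention time, identity).
library = [
    ("C7H8O4S", 4.1, "toluene-derived organosulfate"),
    ("C10H17NO7S", 7.8, "monoterpene nitrooxy-organosulfate"),
]
# Detected peaks: (formula, retention time).
peaks = [("C7H8O4S", 4.05), ("C10H17NO7S", 9.0), ("C5H8O3", 3.2)]
print(match_library(peaks, library))  # only the first peak matches
```

    The second peak shares a formula with a library entry but elutes far outside the tolerance, so it is not assigned; requiring both criteria is what keeps isomers from being misidentified.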

  9. Logistic Organization of Mass Events in the Light of SWOT Analysis - Case Study

    Joanna Woźniak

    2018-02-01

    Rzeszow Juwenalia is the largest free-entry student event in Subcarpathia and one of the best in Poland. On average, more than 25,000 people stay on the campus of Rzeszow University of Technology on each day of the event. Such an enormous undertaking requires developing a strategy that makes it possible to design and coordinate the event effectively. The principal objective of this paper is therefore to present the strengths and weaknesses of Rzeszow Juwenalia, and also to attempt to verify opportunities and threats related to the event. SWOT analysis was used to attain this objective, yielding results that make it possible to conduct a detailed assessment of the undertaking. The paper also presents proposals for improvement activities that may be implemented in the future.

  10. An economic cost analysis of emergency department key performance indicators in Ireland.

    Gannon, Brenda; Jones, Cheryl; McCabe, Aileen; O'Sullivan, Ronan; Wakai, Abel

    2017-06-01

    High quality data is fundamental to using key performance indicators (KPIs) for performance monitoring. However, the resources required to collect high quality data are often significant and should usually be targeted at high priority areas. As part of a study of 11 emergency department (ED) KPIs in Ireland, the primary objective of this study was to estimate the relative cost of collecting the additional minimum data set (MDS) elements for those 11 KPIs. An economic cost analysis focused on 12 EDs in the Republic of Ireland. The resource use data were obtained using two separate focus group interviews. The number of available MDS elements was obtained from a sample of 100 patient records per KPI per participating ED. Unit costs for all resource use were taken at the midpoint of the relevant staff salary scales. An ED would need to spend an estimated additional €3561 per month on average to capture all the MDS elements relevant to the 11 KPIs investigated. The additional cost ranges from 14.8 to 39.2%; this range is 13.9-32.3% for small EDs, whereas the range for medium EDs is 11.7-40%. Regional EDs have a higher additional estimated cost to capture all the relevant MDS elements (€3907), compared with urban EDs (€3353). The additional cost of data collection, contingent on that already collected, required to capture all the relevant MDS elements for the KPIs examined, ranges from 14.8 to 39.2% per KPI, with variation identified between regional and urban hospitals.

  11. Value of Earth Observations: Key principles and techniques of socioeconomic benefits analysis (Invited)

    Friedl, L.; Macauley, M.; Bernknopf, R.

    2013-12-01

    Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.

  12. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    Attrill, Gemma D. R.

    2010-01-01

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of purely an MHD wave interpretation for the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions and result from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.

  13. Analysis of events related to cracks and leaks in the reactor coolant pressure boundary

    Ballesteros, Antonio, E-mail: Antonio.Ballesteros-Avila@ec.europa.eu [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Sanda, Radian; Peinador, Miguel; Zerger, Benoit [JRC-IET: Institute for Energy and Transport of the Joint Research Centre of the European Commission, Postbus 2, NL-1755 ZG Petten (Netherlands); Negri, Patrice [IRSN: Institut de Radioprotection et de Sûreté Nucléaire (France); Wenke, Rainer [GRS: Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH (Germany)

    2014-08-15

    Highlights: • The important role of Operating Experience Feedback is emphasised. • Events relating to cracks and leaks in the reactor coolant pressure boundary are analysed. • A methodology for event investigation is described. • Some illustrative results of the analysis of events for specific components are presented. - Abstract: The presence of cracks and leaks in the reactor coolant pressure boundary may jeopardise the safe operation of nuclear power plants. Analysis of cracks and leaks related events is an important task for the prevention of their recurrence, which should be performed in the context of activities on Operating Experience Feedback. In response to this concern, the EU Clearinghouse operated by the JRC-IET supports and develops technical and scientific work to disseminate the lessons learned from past operating experience. In particular, concerning cracks and leaks, the studies carried out in collaboration with IRSN and GRS have allowed to identify the most sensitive areas to degradation in the plant primary system and to elaborate recommendations for upgrading the maintenance, ageing management and inspection programmes. An overview of the methodology used in the analysis of cracks and leaks related events is presented in this paper, together with the relevant results obtained in the study.

  14. Erectile dysfunction and cardiovascular events in diabetic men: a meta-analysis of observational studies.

    Tomohide Yamada

    BACKGROUND: Several studies have shown that erectile dysfunction (ED) influences the risk of cardiovascular events (CV events). However, a meta-analysis of the overall risk of CV events associated with ED in patients with diabetes has not been performed. METHODOLOGY/PRINCIPAL FINDINGS: We searched MEDLINE and the Cochrane Library for pertinent articles (including references) published between 1951 and April 22, 2012. English language reports of original observational cohort studies and cross-sectional studies were included. Pooled effect estimates were obtained by random effects meta-analysis. A total of 3,791 CV events were reported in 3 cohort studies and 9 cross-sectional studies (covering 22,586 subjects). Across the cohort studies, the overall odds ratio (OR) of diabetic men with ED versus those without ED was 1.74 (95% confidence interval [CI]: 1.34-2.27; P<0.05). Moreover, meta-regression analysis found no relationship between the method used to assess ED (questionnaire or interview), mean age, mean hemoglobin A(1c), mean body mass index, or mean duration of diabetes and the risk of CV events or CHD. In the cross-sectional studies, the OR of diabetic men with ED versus those without ED was 3.39 (95% CI: 2.58-4.44; P<0.001) for CV events (N = 9), 3.43 (95% CI: 2.46-4.77; P<0.001) for CHD (N = 7), and 2.63 (95% CI: 1.41-4.91; P = 0.002) for peripheral vascular disease (N = 5). CONCLUSION/SIGNIFICANCE: ED was associated with an increased risk of CV events in diabetic patients. Prevention and early detection of cardiovascular disease are important in the management of diabetes, especially in view of the rapid increase in its prevalence.

  15. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual

    2015-01-01

    Following an event at a nuclear installation, it is important to determine accurately its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self- Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  16. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual. Companion CD

    2015-01-01


  17. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth climate. Among the foremost results reported, network analyses of the synchronicity of extreme events as captured by the so-called event synchronization have been proposed to be powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme floods. Rooted in the analysis of spike train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically - or even dynamically - interrelated) or not without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in case of the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in case of climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctively different type of distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two
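
For illustration, the difference between the two event-series measures can be made concrete. The sketch below is a toy, symmetric variant of the event coincidence rate with an explicit tolerance window `delta_t`; it is not the authors' implementation, and the directional (precursor/trigger) variants used in the literature differ in detail.

```python
def event_coincidence_rate(a, b, delta_t):
    """Fraction of events in series `a` with at least one event in series
    `b` within +/- delta_t time units (toy symmetric variant)."""
    if not a:
        return 0.0
    hits = sum(1 for t in a if any(abs(s - t) <= delta_t for s in b))
    return hits / len(a)

# Hypothetical event days at two stations (e.g. days exceeding the 95th percentile)
station_a = [3, 10, 20, 31]
station_b = [4, 19, 50]
print(event_coincidence_rate(station_a, station_b, delta_t=1))  # -> 0.5
```

Unlike event synchronization, the window `delta_t` here is an explicit free parameter, which is precisely the trade-off the abstract discusses.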

  18. Analysis of convection-permitting simulations for capturing heavy rainfall events over Myanmar Region

    Acierto, R. A. E.; Kawasaki, A.

    2017-12-01

    Perennial flooding due to heavy rainfall events has strong impacts on society and the economy. With the increasing pressures of rapid development and potential climate change impacts, Myanmar is experiencing a rapid increase in disaster risk. Heavy rainfall hazard assessment is therefore key to quantifying such disaster risk under both current and future conditions. Downscaling using Regional Climate Models (RCMs) such as the Weather Research and Forecasting (WRF) model has been used extensively for assessing such heavy rainfall events. However, the use of convective parameterizations can introduce large errors in simulated rainfall. Convection-permitting simulations address this problem by increasing the resolution of RCMs to 4 km. This study focuses on heavy rainfall events during the six-year (2010-2015) wet season from May to September in Myanmar. The investigation primarily utilizes rain gauge observations for comparing downscaled heavy rainfall events at 4 km resolution, using ERA-Interim as boundary conditions and a 12 km-4 km one-way nesting method. The study aims to provide a basis for the production of high-resolution climate projections over Myanmar in order to contribute to flood hazard and risk assessment.

  19. The limiting events transient analysis by RETRAN02 and VIPRE01 for an ABWR

    Tsai Chiungwen; Shih Chunkuan; Wang Jongrong; Lin Haotzu; Jin Jiunan; Cheng Suchin

    2009-01-01

    This paper describes the transient analysis of the generator load rejection (LR) and One Turbine Control Valve Closure (OTCVC) events for the Lungmen nuclear power plant (LMNPP). According to the Critical Power Ratio (CPR) criterion, the Preliminary Safety Analysis Report (PSAR) concluded that LR and OTCVC are the first and second limiting events, respectively. In addition, since the fuel type has been changed from GE12 to GE14, it is necessary to re-analyze these two events for safety considerations. In this study, to quantify the impact on the reactor, the difference between the initial critical power ratio (ICPR) and the minimum critical power ratio (MCPR), i.e. ΔCPR, is calculated. The ΔCPRs of the LR and OTCVC events are calculated with the combination of the RETRAN02 and VIPRE01 codes. In the RETRAN02 calculation, a thermal-hydraulic model was prepared for the transient analysis. The data, including upper plenum pressure, core inlet flow, normalized power, and axial power shapes during the transient, are then passed to VIPRE01 for the ΔCPR calculation. In the VIPRE01 calculation, a hot channel model was built to simulate the hottest fuel bundle. Based on the thermal-hydraulic data from RETRAN02, the ΔCPRs are calculated by the VIPRE01 hot channel model. Additionally, different TCV control modes are considered to study the influence of different TCV closure curves on the transient behavior. Sensitivity studies covering different initial system pressures and different initial power/flow conditions are also included. Based on this analysis, the maximum ΔCPRs for LR and OTCVC are 0.162 and 0.191, respectively. According to the CPR criterion, this result shows that the impact of the OTCVC event is larger than that of the LR event. (author)
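
The ΔCPR figure of merit used above is simply the drop from the initial CPR to the minimum CPR reached during the transient; a toy sketch with hypothetical hot-channel values:

```python
def delta_cpr(cpr_trace):
    """ΔCPR = initial CPR minus the minimum CPR reached during the
    transient; cpr_trace is a time-ordered list of hot-channel CPR values."""
    icpr = cpr_trace[0]          # initial critical power ratio
    mcpr = min(cpr_trace)        # minimum critical power ratio over the event
    return icpr - mcpr

# Hypothetical hot-channel CPR history during a load-rejection transient
print(round(delta_cpr([1.35, 1.30, 1.21, 1.19, 1.27]), 3))  # -> 0.16
```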

  20. Participants of the "Grid: the Key to Scientific Collaboration", an outstanding UNESCO-ROSTE and CERN event sponsored by Hewlett Packard held on 28 and 29 September at CERN, Geneva.

    Maximilien Brice

    2005-01-01

    Based on the collaboration-fostering and research-enabling role of the grid, CERN and UNESCO are taking the opportunity to invite current and future grid participants, universities and research institutions to a grid event hosted by CERN in Geneva. Through presentations by key grid protagonists from CERN, the European Commission, the EGEE Grid, and the European research community, participants have been able to learn about the capabilities of the grid, opportunities to leverage their research work, and participation in international projects.

  1. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) mbH, Schwetnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)

    2014-07-01

    Highlights: • Analysis of operating experience related to emergency diesel generators events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper is aimed at studying the operating experience related to emergency diesel generators (EDGs) events at nuclear power plants collected from the past 20 years. Events related to EDGs failures and/or unavailability as well as all the supporting equipment are in the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the type of failures, attributes that contributed to the failure, failure modes potential or real, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper is tightly related to the performing of statistical analysis of the operating experience. For the purpose of this study EDG failure is defined as EDG failure to function on demand (i.e. fail to start, fail to run) or during testing, or an unavailability of an EDG, except of unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports were screened. The screening methodology applied for each of the four different databases is presented. Further on, analysis aimed at delineating the causes, root causes, contributing factors and consequences are performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  2. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    Kančev, Duško; Duchac, Alexander; Zerger, Benoit; Maqua, Michael; Wattrelos, Didier

    2014-01-01


  3. Time-to-event analysis of mastitis at first-lactation in Valle del Belice ewes

    Portolano, B.; Firlocchiaro, R.; Kaam, van J.B.C.H.M.; Riggio, V.; Maizon, D.O.

    2007-01-01

    A time-to-event study for mastitis at first-lactation in Valle del Belice ewes was conducted, using survival analysis with an animal model. The goals were to evaluate the effect of lambing season and level of milk production on the time from lambing to the day when a ewe experienced a test-day with
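
As a minimal illustration of time-to-event data with censoring (the study itself fits a survival model with an animal model, not this), a Kaplan-Meier survival curve can be sketched as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. events[i] is 1 if the event (e.g. first
    mastitis test-day) was observed at times[i], 0 if the record was censored.
    Toy version that assumes no tied event times."""
    pairs = sorted(zip(times, events))
    at_risk, s, curve = len(pairs), 1.0, []
    for t, observed in pairs:
        if observed:
            s *= (at_risk - 1) / at_risk   # survival drops only at observed events
        curve.append((t, s))
        at_risk -= 1
    return curve

# Hypothetical days from lambing to first mastitis test-day (0 = censored)
print(kaplan_meier([5, 8, 12], [1, 0, 1]))
```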

  4. Analysis of electrical penetration graph data: what to do with artificially terminated events?

    Observing the durations of hemipteran feeding behaviors via Electrical Penetration Graph (EPG) results in situations where the duration of the last behavior is not ended by the insect under observation, but by the experimenter. These are artificially terminated events. In data analysis, one must ch...

  5. Analysis of operational events by ATHEANA framework for human factor modelling

    Bedreaga, Luminita; Constntinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that current methods have not correctly represented the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods used in modelling the human factor have not taken into account human performance and reliability as observed in operational events. ATHEANA - A Technique for Human Error ANAlysis - is a new methodology for human reliability analysis that incorporates data from operational events as well as psychological models of human behaviour. The method introduces new elements such as unsafe actions and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred in different nuclear power plants during 1979-2002. The analysis of operational events consisted of: - identification of the unsafe actions; - classification of each unsafe action as an omission or a commission; - establishing the type of error corresponding to the unsafe action: slip, lapse, mistake or circumvention; - establishing the influence of performance shaping factors and some corrective actions. (authors)

  6. Propensity for Violence among Homeless and Runaway Adolescents: An Event History Analysis

    Crawford, Devan M.; Whitbeck, Les B.; Hoyt, Dan R.

    2011-01-01

    Little is known about the prevalence of violent behaviors among homeless and runaway adolescents or the specific behavioral factors that influence violent behaviors across time. In this longitudinal study of 300 homeless and runaway adolescents aged 16 to 19 at baseline, the authors use event history analysis to assess the factors associated with…

  7. The Analysis of the Properties of Super Solar Proton Events and the Associated Phenomena

    Cheng, L. B.; Le, G. M.; Lu, Y. P.; Chen, M. H.; Li, P.; Yin, Z. Q.

    2014-05-01

    The solar flare, the propagation speed of the shock driven by the coronal mass ejection (CME) from the Sun to the Earth, the source longitudes and Carrington longitudes, and the geomagnetic storms associated with each super solar proton event with a peak flux equal to or exceeding 10000 pfu have been investigated. The analysis results show that the source longitudes of super solar proton events ranged from E30° to W75°. The Carrington longitudes of the source regions of super solar proton events were distributed in two longitude bands, 130°-220° and 260°-320°, respectively. All super solar proton events were accompanied by major solar flares and fast CMEs. The average speeds of shocks propagating from the Sun to the Earth were greater than 1200 km/s. Eight super solar proton events were followed by major geomagnetic storms (Dst ≤ -100 nT). One super solar proton event was followed by a geomagnetic storm with Dst = -96 nT.

  8. Probabilistic safety analysis for fire events for the NPP Isar 2

    Schmaltz, H.; Hristodulidis, A.

    2007-01-01

    The 'Probabilistic Safety Analysis for Fire Events' (Fire-PSA KKI2) for NPP Isar 2 was performed in addition to the PSA for full power operation and considers all possible events that can be initiated by a fire. The aim of the plant-specific Fire-PSA was to perform a quantitative assessment of fire events during full power operation in line with the state of the art. Based on simplified assumptions regarding fire-induced failures, the influence of system and component failures on the frequency of core damage states was analysed. The Fire-PSA considers, on the one hand, events in which fire-induced equipment failures result in a SCRAM and, on the other hand, events that have no direct operational effects but in which the plant is shut down as a precautionary measure because of the fire-induced failure of safety-related installations. The latter events are considered because they may have a non-negligible influence on the frequency of core damage states if failures occur during the plant shutdown, owing to the reduced redundancy of safety-related systems. (orig.)

  9. Fuel element thermo-mechanical analysis during transient events using the FMS and FETMA codes

    Hernandez Lopez Hector; Hernandez Martinez Jose Luis; Ortiz Villafuerte Javier

    2005-01-01

    In the Instituto Nacional de Investigaciones Nucleares of Mexico, the Fuel Management System (FMS) software package has long been used to simulate the operation of a BWR nuclear power plant both in steady state and during transient events. To evaluate fuel element thermo-mechanical performance during transient events, an interface between the FMS codes and our own Fuel Element Thermo-Mechanical Analysis (FETMA) code is currently being developed and implemented. In this work, the results of the thermo-mechanical behavior of fuel rods in the hot channel during the simulation of transient events of a BWR nuclear power plant are shown. The transient events considered in this work are a load rejection and a feedwater control failure, which are among the most important events that can occur in a BWR. The results showed that conditions leading to fuel rod failure never appeared in either event. It is also shown that a load rejection transient is more demanding in terms of safety than a failure of the feedwater controller. (authors)

  10. Hazard analysis of typhoon-related external events using extreme value theory

    Kim, Yo Chan; Jang, Seung Cheol [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lim, Tae Jin [Dept. of Industrial Information Systems Engineering, Soongsil University, Seoul (Korea, Republic of)

    2015-02-15

    After the Fukushima accident, the importance of hazard analysis for extreme external events was raised. To analyze typhoon-induced hazards, which are among the most significant disasters in East Asian countries, a statistical analysis using extreme value theory, a method for estimating the annual exceedance frequency of a rare event, was conducted to estimate occurrence intervals and hazard levels. For the four meteorological variables, maximum wind speed, instantaneous wind speed, hourly precipitation, and daily precipitation, the parameters of predictive extreme value theory models were estimated. The 100-year return levels for each variable were predicted using the developed models and compared with previously reported values. Significant long-term climatic trends in wind speed and precipitation were also found. A fragility analysis should be conducted to ensure the safety of a nuclear power plant at the high levels of wind speed and precipitation that exceed the results of the previous analysis.
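
As a rough illustration of the return-level idea (the study fits full predictive extreme value models to the actual meteorological records; this sketch uses only a Gumbel fit by the method of moments on invented annual maxima):

```python
import math

def gumbel_return_level(annual_maxima, T):
    """T-year return level from a Gumbel fit by the method of moments.
    Illustrative only, not the study's estimation procedure."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi          # scale parameter
    mu = mean - 0.5772156649 * beta              # location (Euler-Mascheroni constant)
    return mu - beta * math.log(-math.log(1 - 1 / T))

# Hypothetical annual maxima (arbitrary units)
maxima = [80, 95, 110, 120, 100, 90, 105, 115]
print(round(gumbel_return_level(maxima, 100), 1))
```

The 100-year return level necessarily exceeds the 10-year one, which in turn exceeds the sample mean.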

  11. Screening Analysis of Criticality Features, Events, and Processes for License Application

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in "Configuration Generator Model", a component of the methodology from "Disposal Criticality Analysis Methodology Topical Report". The total probability per package of criticality is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be
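
The regulatory screening criterion quoted above translates, under a Poisson occurrence model, into an equivalent annual frequency of roughly 10⁻⁸ per year:

```python
import math

# Screening criterion of 10 CFR 63.114(d): at least one chance in 10,000
# of occurring over 10,000 years.
p_crit, horizon_years = 1e-4, 1e4

# Equivalent constant annual frequency under a Poisson occurrence model:
# p = 1 - exp(-lam * T)  =>  lam = -ln(1 - p) / T
lam = -math.log(1 - p_crit) / horizon_years
print(f"{lam:.3e} per year")  # -> 1.000e-08 per year
```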

  12. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactively query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
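
As a minimal stand-in for the clustering-based detector described above (not the authors' algorithm), even a simple distance-from-mean rule captures the basic idea of flagging outliers without prior knowledge of the data:

```python
def flag_outliers(values, k=3.0):
    """Return indices of points more than k standard deviations from the
    mean. A toy stand-in for the paper's clustering-based detector."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [i for i, v in enumerate(values) if abs(v - mean) > k * std]

# Hypothetical sensor readings with one anomalous value
data = [10.1, 9.8, 10.0, 10.2, 9.9, 25.0, 10.05]
print(flag_outliers(data, k=2.0))  # -> [5]
```

The paper's framework goes further by grouping such outliers in space and time into "events" and ranking them by user-defined interestingness.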

  13. Identification of fire modeling issues based on an analysis of real events from the OECD FIRE database

    Hermann, Dominik [Swiss Federal Nuclear Safety Inspectorate ENSI, Brugg (Switzerland)

    2017-03-15

    Precursor analysis is widely used in the nuclear industry to judge the significance of events relevant to safety. However, in the case of events that may damage equipment through effects that are not ordinary functional dependencies, the analysis may not always fully appreciate the potential for further evolution of the event. For fires, which are one class of such events, this paper discusses modelling challenges that need to be overcome when performing a probabilistic precursor analysis. The events analyzed are selected from the Organisation for Economic Co-operation and Development (OECD) Fire Incidents Records Exchange (FIRE) Database.

  14. Is the efficacy of antidepressants in panic disorder mediated by adverse events? A mediational analysis.

    Irene Bighelli

    It has been hypothesised that the perception of adverse events in placebo-controlled antidepressant clinical trials may induce patients to conclude that they have been randomized to the active arm of the trial, leading to the breaking of blind. This may enhance the expectancies for improvement and the therapeutic response. The main objective of this study is to test the hypothesis that the efficacy of antidepressants in panic disorder is mediated by the perception of adverse events. The present analysis is based on a systematic review of published and unpublished randomised trials comparing antidepressants with placebo for panic disorder. The Baron and Kenny approach was applied to investigate the mediational role of adverse events in the relationship between antidepressant treatment and efficacy. Fourteen placebo-controlled antidepressant trials were included in the analysis. We found that: (a) antidepressant treatment was significantly associated with better treatment response (ß = 0.127, 95% CI 0.04 to 0.21, p = 0.003); (b) antidepressant treatment was not associated with adverse events (ß = 0.094, 95% CI -0.05 to 0.24, p = 0.221); (c) adverse events were negatively associated with treatment response (ß = 0.035, 95% CI -0.06 to -0.05, p = 0.022). Finally, after adjustment for adverse events, the relationship between antidepressant treatment and treatment response remained statistically significant (ß = 0.122, 95% CI 0.01 to 0.23, p = 0.039). These findings do not support the hypothesis that the perception of adverse events in placebo-controlled antidepressant clinical trials may lead to the breaking of blind and to an artificial inflation of the efficacy measures. Based on these results, we argue that the moderate therapeutic effect of antidepressants in individuals with panic disorder is not an artefact, and therefore reflects a genuine effect that doctors can expect to replicate under real-world conditions.
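
The Baron and Kenny approach consists of a sequence of regressions; the sketch below runs the three steps on invented toy data (the step-3 mediator coefficient is obtained via the Frisch-Waugh residual trick to avoid a multiple-regression solver):

```python
def slope(x, y):
    """Least-squares slope of y on a single predictor x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def residuals(x, y):
    """Residuals of y after regressing it on x (with intercept)."""
    b = slope(x, y)
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return [yi - (my + b * (xi - mx)) for xi, yi in zip(x, y)]

# Invented toy data: treatment t, candidate mediator m (adverse events), outcome y
t = [0, 0, 0, 1, 1, 1, 0, 1]
m = [1, 0, 1, 2, 1, 2, 0, 3]
y = [2, 1, 2, 4, 3, 5, 1, 5]

c = slope(t, y)       # step 1: total effect of treatment on outcome
a = slope(t, m)       # step 2: effect of treatment on the mediator
b_med = slope(residuals(t, m), residuals(t, y))  # step 3: mediator effect, treatment partialled out
print(c, a, b_med)    # -> 2.75 1.5 1.0
```

Mediation is suggested only if all three paths are present and the direct effect shrinks after adjustment; in the trial data above, step 2 already fails (treatment was not associated with adverse events).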

  15. Device-independent secret-key-rate analysis for quantum repeaters

    Holz, Timo; Kampermann, Hermann; Bruß, Dagmar

    2018-01-01

    The device-independent approach to quantum key distribution (QKD) aims to establish a secret key between two or more parties with untrusted devices, potentially under full control of a quantum adversary. The performance of a QKD protocol can be quantified by the secret key rate, which can be lower bounded via the violation of an appropriate Bell inequality in a setup with untrusted devices. We study secret key rates in the device-independent scenario for different quantum repeater setups and compare them to their device-dependent analogs. The quantum repeater setups under consideration are the original protocol by Briegel et al. [Phys. Rev. Lett. 81, 5932 (1998), 10.1103/PhysRevLett.81.5932] and the hybrid quantum repeater protocol by van Loock et al. [Phys. Rev. Lett. 96, 240501 (2006), 10.1103/PhysRevLett.96.240501]. For a given repeater scheme and a given QKD protocol, the secret key rate depends on a variety of parameters, such as the gate quality or the detector efficiency. We systematically analyze the impact of these parameters and suggest optimized strategies.
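
For the CHSH-based device-independent setting, a widely used lower bound on the secret key rate under collective attacks (Acín et al., 2007) is r ≥ 1 − h(Q) − h((1 + √((S/2)² − 1))/2), with S the CHSH value, Q the quantum bit error rate, and h the binary entropy. The sketch below evaluates that bound; it is an illustration of the general idea, not the repeater-specific rates computed in the paper.

```python
import math

def h(p):
    """Binary entropy, with clamping against floating-point rounding."""
    p = min(max(p, 0.0), 1.0)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def di_key_rate(S, Q):
    """Collective-attack lower bound on the device-independent key rate
    for a CHSH-based protocol (Acin et al., 2007)."""
    return 1 - h(Q) - h((1 + math.sqrt((S / 2) ** 2 - 1)) / 2)

print(di_key_rate(2 * math.sqrt(2), 0.0))  # ideal devices -> 1.0
```

At the classical bound S = 2 the rate vanishes, so imperfect gates and detectors, which lower S and raise Q, directly eat into the key rate.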

  16. A multiprocessor system for the analysis of pictures of nuclear events

    Bacilieri, P; Matteuzzi, P; Sini, G P; Zanotti, U

    1979-01-01

    The pictures of nuclear events obtained from bubble chambers such as Gargamelle and BEBC at CERN, and others from Serpukhov, are geometrically processed at CNAF (Centro Nazionale Analisi Fotogrammi) in Bologna. The analysis system includes an Erasme table and a CRT flying-spot digitizer. The difficulties connected with the pictures of the four stereoscopic views of the bubble chambers are overcome by the choice of a strongly interactive system. (0 refs).

  17. Working group of experts on rare events in human error analysis and quantification

    Goodstein, L.P.

    1977-01-01

    In dealing with the reference problem of rare events in nuclear power plants, the group has concerned itself with the man-machine system and, in particular, with human error analysis and quantification. The Group was requested to review methods of human reliability prediction, to evaluate the extent to which such analyses can be formalized and to establish criteria to be met by task conditions and system design which would permit a systematic, formal analysis. Recommendations are given on the Fessenheim safety system

  18. Cryptographic analysis on the key space of optical phase encryption algorithm based on the design of discrete random phase mask

    Lin, Chao; Shen, Xueju; Li, Zengyan

    2013-07-01

    The key space of a phase encryption algorithm using a discrete random phase mask is investigated by numerical simulation in this paper. A random phase mask with a finite number of discrete phase levels is the core component in most practical optical encryption architectures. The key space analysis is based on the design criteria of the discrete random phase mask. The roles of the random amplitude mask and the random phase mask in an optical encryption system are identified from the perspective of confusion and diffusion. The properties of the discrete random phase mask in a practical double random phase encoding scheme working in both amplitude encoding (AE) and phase encoding (PE) modes are comparatively analyzed. The key space of the random phase encryption algorithm is evaluated considering both the encryption quality and the resistance to brute-force attack. A method for enlarging the key space of the phase encryption algorithm is also proposed to enhance the security of optical phase encryption techniques.
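
The key space discussed above grows exponentially with the number of mask pixels: a mask of N pixels, each taking one of L admissible phase levels, has L^N possible keys, i.e. N·log2(L) bits. A trivial sketch with hypothetical mask dimensions:

```python
import math

def key_space_bits(pixels, levels):
    """Key-space size of a discrete random phase mask, in bits:
    levels**pixels possible masks -> pixels * log2(levels) bits."""
    return pixels * math.log2(levels)

# Hypothetical 256x256 mask with 4 discrete phase levels
print(key_space_bits(256 * 256, 4))  # -> 131072.0 bits
```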

  19. Comprehensive Protein Interactome Analysis of a Key RNA Helicase: Detection of Novel Stress Granule Proteins

    Rebecca Bish

    2015-07-01

    DDX6 (p54/RCK) is a human RNA helicase with central roles in mRNA decay and translation repression. To help our understanding of how DDX6 performs these multiple functions, we conducted the first unbiased, large-scale study to map the DDX6-centric protein-protein interactome using immunoprecipitation and mass spectrometry. Using DDX6 as bait, we identify a high-confidence and high-quality set of protein interaction partners which are enriched for functions in RNA metabolism and ribosomal proteins. The screen is highly specific, maximizing the number of true positives, as demonstrated by the validation of 81% (47/58) of the RNA-independent interactors through known functions and interactions. Importantly, we minimize the number of indirect interaction partners through use of a nuclease-based digestion to eliminate RNA. We describe eleven new interactors, including proteins involved in splicing, which is an as-yet unknown role for DDX6. We validated and characterized in more detail the interaction of DDX6 with Nuclear fragile X mental retardation-interacting protein 2 (NUFIP2) and with two previously uncharacterized proteins, FAM195A and FAM195B (here referred to as granulin-1 and granulin-2, or GRAN1 and GRAN2). We show that NUFIP2, GRAN1, and GRAN2 are not P-body components, but re-localize to stress granules upon exposure to stress, suggesting a function in translation repression in the cellular stress response. Using a complementary analysis that resolved DDX6's multiple complex memberships, we further validated these interaction partners and the presence of splicing factors. As DDX6 also interacts with the E3 SUMO ligase TIF1β, we tested for and observed a significant enrichment of sumoylation amongst DDX6's interaction partners. Our results represent the most comprehensive screen for direct interaction partners of a key regulator of RNA life cycle and localization, highlighting new stress granule components and possible DDX6 functions

  20. Identification of homogeneous regions for rainfall regional frequency analysis considering typhoon event in South Korea

    Heo, J. H.; Ahn, H.; Kjeldsen, T. R.

    2017-12-01

    South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. However, traditionally, regional frequency analysis models did not consider this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events only occurring in some years (typhoon). The available annual maximum 24 hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum is from either (1) a typhoon event, or (2) a non-typhoon event. Then, a three-parameter GEV distribution was fitted to each sub-sample, along with a weighting parameter characterizing the proportion of historical events associated with typhoon events. Spatial patterns of model parameters were analyzed and showed that typhoon events are less commonly associated with annual maximum rainfall in the North-West part of the country (Seoul area), and more prevalent in the southern and eastern parts of the country, leading to the formation of two distinct typhoon regions: (1) North-West; and (2) Southern and Eastern. Using a leave-one-out procedure, a new regional frequency model was tested and compared to a more traditional index flood method. The results showed that the impact of typhoon on design events might previously have been underestimated in the Seoul area. This suggests that the use of the mixture model should be preferred where the typhoon phenomenon is less frequent, and thus can have a significant effect on the rainfall-frequency curve. This research was supported by a grant (2017-MPSS31
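The two-component step described in this record can be sketched in a few lines: fit a GEV to each sub-sample and weight the two fitted distributions by the proportion of typhoon years. All numbers and parameter values below are illustrative assumptions, not the study's data:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import genextreme

rng = np.random.default_rng(0)
# synthetic annual-maximum 24 h rainfall (mm); a real analysis would use observed records
typhoon_max = genextreme.rvs(c=-0.2, loc=180, scale=60, size=40, random_state=rng)
monsoon_max = genextreme.rvs(c=-0.1, loc=120, scale=40, size=60, random_state=rng)

# fit a three-parameter GEV to each sub-sample
c_t, loc_t, scale_t = genextreme.fit(typhoon_max)
c_m, loc_m, scale_m = genextreme.fit(monsoon_max)

# weighting parameter: proportion of years whose annual maximum came from a typhoon
p = len(typhoon_max) / (len(typhoon_max) + len(monsoon_max))

def mixture_cdf(x):
    """Non-exceedance probability under the two-component mixture."""
    return (p * genextreme.cdf(x, c_t, loc_t, scale_t)
            + (1 - p) * genextreme.cdf(x, c_m, loc_m, scale_m))

def design_rainfall(return_period):
    """Invert the mixture CDF numerically for a T-year design event."""
    target = 1.0 - 1.0 / return_period
    return brentq(lambda x: mixture_cdf(x) - target, 1.0, 5000.0)

print(round(design_rainfall(100), 1))  # 100-year design rainfall (mm)
```

Because the typhoon component dominates the upper tail, the mixture quantiles for long return periods sit above those of a single GEV fitted to the pooled sample.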

  1. Key trends of climate change in the ASEAN countries. The IPAT decomposition analysis 1980-2005

    Vehmas, J.; Luukkanen, J.; Kaivo-oja, J.; Panula-Ontto, J.; Allievi, F.

    2012-07-01

    has been widely recognized. Energy and climate policy planning requires in-depth analyses of current trends and structures of energy production systems and related emission flows. Possibilities to reduce greenhouse gas emissions depend critically on economic growth and on the development of energy efficiency in economywide production systems. The ASEAN Leaders have expressed their concern and commitment for ASEAN to play a proactive role in addressing climate change through their declarations to the 2007 Bali and 2009 Copenhagen UN Conferences on Climate Change. They view the protection of the environment and the sustainable use and management of natural resources as essential to the long-term economic growth and social development of countries in the region. The ASEAN Vision 2020 calls for 'a clean and green ASEAN' with fully established mechanisms to ensure the protection of the environment, sustainability of natural resources, and high quality of life of people in the region. ASEAN Leaders have noted that: 'We acknowledged the energy cooperation between ASEAN and Japan in promoting energy efficiency and conservation as well as new and renewable energy, and stressed the need for closer cooperation. The ASEAN Leaders welcomed Japan's efforts to create a low-carbon society. We appreciated Thailand's offer for the use of the Practical Energy Management Training Center in Thailand which was established with funding from Japan to other ASEAN Member States interested in energy conservation in factories.' Thus, a low-carbon society is a key energy policy target of the ASEAN countries. Our analysis in this e-book gives analytical background to this strategy. This e-book also indicates that the ASEAN countries face very different kinds of challenges in pursuing a low-carbon strategy. The e-book provides useful information for ASEAN energy policy formulation and implementation of the Bali Roadmap. This study presents a comparative analysis of the driving forces behind
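The IPAT decomposition named in this record's title rests on the identity I = P × A × T (impact = population × affluence × technology): in logarithms, the change in impact splits exactly into the three factor contributions. A minimal sketch with invented figures, not the study's data:

```python
import math

# hypothetical values for one ASEAN country, 1980 vs 2005 (illustrative only)
data = {
    1980: dict(pop=48e6, gdp_pc=800.0, co2_per_gdp=1.2e-3),   # t CO2 per USD
    2005: dict(pop=64e6, gdp_pc=2600.0, co2_per_gdp=0.9e-3),
}

def impact(d):
    # IPAT identity: I = P (population) * A (GDP per capita) * T (emissions per GDP)
    return d["pop"] * d["gdp_pc"] * d["co2_per_gdp"]

I0, I1 = impact(data[1980]), impact(data[2005])

# multiplicative decomposition: log(I1/I0) splits exactly into factor terms
ratios = {k: data[2005][k] / data[1980][k] for k in data[1980]}
log_contrib = {k: math.log(v) for k, v in ratios.items()}

total = math.log(I1 / I0)
shares = {k: v / total for k, v in log_contrib.items()}  # each factor's share of growth
print({k: round(v, 2) for k, v in shares.items()})
```

Here affluence growth drives emissions up while falling carbon intensity partly offsets it, the typical pattern such decompositions reveal.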

  2. An Initiating-Event Analysis for PSA of Hanul Units 3 and 4: Results and Insights

    Kim, Dong-San; Park, Jin Hee

    2015-01-01

    As a part of the PSA for the Hanul units 3 and 4, an initiating-event (IE) analysis was newly performed by considering the current state of knowledge and the requirements of the ASME/ANS probabilistic risk assessment (PRA) standard related to IE analysis. This paper describes the methods, results, and some insights from that IE analysis. In comparison with the previous IE analysis, this study performed a more systematic and detailed analysis to identify potential initiating events, and calculated the IE frequencies by using state-of-the-art methods and the latest data. As a result, several IE frequencies differ considerably from the previous frequencies, which can change the major accident sequences obtained from the quantification of the PSA model
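One widely used state-of-the-art estimator for IE frequencies of the kind such updates rely on is the Jeffreys-prior posterior mean (n + 0.5)/T for n events in T reactor-years. This is shown as a generic PRA technique with invented counts, not as the exact method of the Hanul 3/4 study:

```python
def jeffreys_ie_frequency(events, reactor_years):
    """Posterior mean of an initiating-event frequency under a Jeffreys prior
    (a common choice in PRA data analysis): (n + 0.5) / T."""
    return (events + 0.5) / reactor_years

# e.g. 2 loss-of-feedwater events observed over 35 reactor-years (illustrative)
print(round(jeffreys_ie_frequency(2, 35.0), 4))  # 0.0714 per reactor-year
```

The half-event offset keeps the estimate nonzero even for events never observed at the plant, which is why updated frequencies can differ noticeably from older point estimates.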

  3. Extended analysis of the Trojan-horse attack in quantum key distribution

    Vinay, Scott E.; Kok, Pieter

    2018-04-01

    The discrete-variable quantum key distribution protocols based on the 1984 protocol of Bennett and Brassard (BB84) are known to be secure against an eavesdropper, Eve, intercepting the flying qubits and performing any quantum operation on them. However, these protocols may still be vulnerable to side-channel attacks. We investigate the Trojan-horse side-channel attack where Eve sends her own state into Alice's apparatus and measures the reflected state to estimate the key. We prove that the separable coherent state is optimal for Eve among the class of multimode Gaussian attack states, even in the presence of thermal noise. We then provide a bound on the secret key rate in the case where Eve may use any separable state.

  4. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated on a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we used the concept of the rain event. In fact, the discrete and intermittent nature of rain processes makes some features inadequate when defined on a fixed duration. Long integration times (hour, day) mix rainy and clear-air periods in the same sample; short integration times (seconds, minutes) lead to noisy data with a great sensitivity to detector characteristics. Analysing the whole rain event, instead of individual samples of a fixed short duration, clarifies relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses the intra-event variability partly due to measurement uncertainties and focuses the analysis on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of a self-organizing map (SOM) is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space, in an unsupervised way, while preserving the initial space topology as much as possible. The obtained SOM provides the dependencies between variables and consequently allows redundant variables to be removed, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analyzed. This analysis shows clearly the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the Dry percentage in event (Dd%e) feature. This confirms
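The five retained features can be computed directly from an event's rain-rate series. A sketch assuming a 1-minute time step; the exact form of the order-0.5 absolute variation is an assumption here, not a definition taken from the paper:

```python
import numpy as np

def rain_event_features(rates, dt_min=1.0):
    """Five event-scale features from a rain-rate series (mm/h) sampled every
    dt_min minutes. The form of the order-0.5 variation is an assumption."""
    rates = np.asarray(rates, dtype=float)
    return dict(
        duration_min=len(rates) * dt_min,
        peak=rates.max(),
        depth_mm=rates.sum() * dt_min / 60.0,            # event accumulation
        std=rates.std(),
        var_05=np.mean(np.abs(np.diff(rates)) ** 0.5),   # assumed order-0.5 variation
    )

event = [0.5, 2.0, 8.0, 12.0, 6.0, 1.0, 0.2]  # one short synthetic event
f = rain_event_features(event)
print(round(f["depth_mm"], 3))  # 0.495
```

Working at the event scale like this makes each feature independent of the arbitrary choice of integration window criticized in the abstract.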

  5. Disruptive event uncertainties in a perturbation approach to nuclear waste repository risk analysis

    Harvey, T.F.

    1980-09-01

    A methodology is developed for incorporating a full range of the principal forecasting uncertainties into a risk analysis of a nuclear waste repository. The result of this methodology is a set of risk curves similar to those used by Rasmussen in WASH-1400. The set of curves is partially derived from a perturbation approach to analyze potential disruptive event sequences. Such a scheme could be useful in truncating the number of disruptive event scenarios and providing guidance to those establishing data-base development priorities.

  6. Analysis of transverse momentum and event shape in νN scattering

    Bosetti, P.C.; Graessler, H.; Lanske, D.; Schulte, R.; Schultze, K.; Simopoulou, E.; Vayaki, A.; Barnham, K.W.J.; Hamisi, F.; Miller, D.B.; Mobayyen, M.M.; Wainstein, S.; Aderholz, M.; Hantke, D.; Hoffmann, E.; Katz, U.F.; Kern, J.; Schmitz, N.; Wittek, W.; Albajar, C.; Batley, J.R.; Myatt, G.; Perkins, D.H.; Radojicic, D.; Renton, P.; Saitta, S.; Bullock, F.W.; Burke, S.

    1990-01-01

    The transverse momentum distributions of hadrons produced in neutrino-nucleon charged current interactions and their dependence on W are analysed in detail. It is found that the components of the transverse momentum in the event plane and normal to it increase with W at about the same rate throughout the available W range. A comparison with e+e− data is made. Studies of the energy flow and angular distributions in the events classified as planar do not show clear evidence for high energy, wide angle gluon radiation, in contrast to the conclusion of a previous analysis of similar neutrino data. (orig.)

  7. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for Analysis of External Events in the PSA of the Jose Cabrera NPP: In the Fire Analysis, a version of the COMPBRN3 CODE, modified by Empresarios Agrupados according to the guidelines of Appendix D of the NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained, applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The Flood Analysis methodology is based on the construction of event trees to represent flood propagation dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine temporary evolution of the flood area level, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the Fire and Flood Analyses a quantification methodology has been adopted, which consists of analysing the damages caused at each stage of growth or propagation and identifying, in the Internal Events models, the gates, basic events or headers to which safe failure (probability 1) due to damages is assigned. (Author)

  8. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
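The simulation schema this record describes (behaviour defined through time delays, and an event queue executed until it is emptied) is the classic discrete-event loop. A minimal sketch of that loop, not the patented tool itself:

```python
import heapq

class DiscreteEventSimulator:
    """Minimal event-queue simulator: events run in time order until the
    queue is empty, mirroring the schema described above."""
    def __init__(self):
        self._queue = []
        self._seq = 0           # tie-breaker so same-time events run FIFO
        self.now = 0.0
        self.log = []

    def schedule(self, delay, action, *args):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action, args))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action, args = heapq.heappop(self._queue)
            action(self, *args)

def pump_on(sim):
    sim.log.append((sim.now, "pump on"))
    sim.schedule(5.0, pump_off)     # continuous behaviour modelled as a time delay

def pump_off(sim):
    sim.log.append((sim.now, "pump off"))

sim = DiscreteEventSimulator()
sim.schedule(1.0, pump_on)
sim.run()
print(sim.log)  # [(1.0, 'pump on'), (6.0, 'pump off')]
```

Invocation statements map onto `schedule` calls and effect statements onto the actions; mode-dependent processes would simply be actions that reschedule themselves.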

  9. Geohazard assessment through the analysis of historical alluvial events in Southern Italy

    Esposito, Eliana; Violante, Crescenzo

    2015-04-01

    The risk associated with extreme water events, such as flash floods, results from a combination of overflow and landslide hazards. A multi-hazard approach has been utilized to analyze the 1773 flood, which occurred in conjunction with heavy rainfall, causing major damage in terms of lost lives and economic cost over an area of 200 km2, including both the coastal strip between Salerno and Maiori and the Apennine hinterland, Campania region - Southern Italy. This area has been affected by a total of 40 flood events over the last five centuries, 26 of which occurred between 1900 and 2000. Streamflow events have produced severe impacts on Cava de' Tirreni (SA) and its territory; in particular, four catastrophic floods in 1581, 1773, 1899 and 1954 caused a pervasive pattern of destruction. In the study area, rainstorm events typically occur in small and medium-sized fluvial systems, characterized by small catchment areas and high-elevation drainage basins, causing the detachment of large amounts of volcaniclastic and siliciclastic covers from the carbonate bedrock. The mobilization of these deposits (slope debris) mixed with rising floodwaters along the water paths can produce fast-moving streamflows of large proportion with significant hazardous implications (Violante et al., 2009). In this context, the study of the 1773 historical flood allows the detection and definition of those areas where catastrophic events repeatedly took place over time. Moreover, it improves the understanding of the phenomena themselves, including some key elements in the management of risk mitigation, such as the restoration of the damage suffered by the buildings and/or the environmental effects caused by the floods.

  10. Gait Event Detection in Real-World Environment for Long-Term Applications: Incorporating Domain Knowledge Into Time-Frequency Analysis.

    Khandelwal, Siddhartha; Wickstrom, Nicholas

    2016-12-01

    Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment; that involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.
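One simple way domain knowledge enters such a detector is as a bound on plausible step frequency: two consecutive steps cannot be arbitrarily close in time. A hedged sketch on synthetic accelerometer data (the sampling rate, thresholds, and signal below are invented, and this is not the paper's algorithm):

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100.0  # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
# synthetic vertical acceleration: ~1.8 Hz step frequency plus sensor noise
acc = np.sin(2 * np.pi * 1.8 * t) + 0.2 * np.random.default_rng(1).normal(size=t.size)

# domain knowledge: human step frequency stays roughly within 1.4-2.5 Hz,
# so two consecutive step peaks cannot be closer than 1/2.5 s
min_gap = int(fs / 2.5)
peaks, _ = find_peaks(acc, distance=min_gap, height=0.5)
step_times = peaks / fs
print(len(step_times))  # ~18 steps over 10 s at 1.8 Hz
```

The `distance` constraint is what rejects spurious noise peaks between genuine steps, which is exactly the kind of physiological prior the paper argues for.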

  11. Identifying Key Features of Student Performance in Educational Video Games and Simulations through Cluster Analysis

    Kerr, Deirdre; Chung, Gregory K. W. K.

    2012-01-01

    The assessment cycle of "evidence-centered design" (ECD) provides a framework for treating an educational video game or simulation as an assessment. One of the main steps in the assessment cycle of ECD is the identification of the key features of student performance. While this process is relatively simple for multiple choice tests, when…

  12. Analysis of 3gpp-MAC and two-key 3gpp-MAC

    Knudsen, Lars Ramkilde; Mitchell, C.J.

    2003-01-01

    Forgery and key-recovery attacks are described on the 3gpp-MAC scheme, proposed for inclusion in the 3gpp specification. Three main classes of attack are given, all of which operate whether or not truncation is applied to the MAC value. Attacks in the first class use a large number of 'chosen MAC...

  13. Internal event analysis for Laguna Verde Unit 1 Nuclear Power Plant. Accident sequence quantification and results

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1994-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the Internal Event Analysis for Laguna Verde Unit 1 Nuclear Power Plant, CNSNS-TR 004, in five volumes. The reports are organized as follows: CNSNS-TR 004 Volume 1: Introduction and Methodology. CNSNS-TR 004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR 004 Volume 3: System Analysis. CNSNS-TR 004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR 005 Volume 5: Appendices A, B and C. This volume presents the development of the dependent failure analysis, the treatment of the support system dependencies, the identification of the shared-components dependencies, and the treatment of common cause failure. The identification of the main human actions considered is also presented, along with the possible recovery actions included. The development of the data base, and the assumptions and limitations in the data base, are also described in this volume. The accident sequence quantification process and the resolution of the core vulnerable sequences are presented. In this volume, the source and treatment of uncertainties associated with failure rates, component unavailabilities, initiating event frequencies, and human error probabilities are also presented. Finally, the main results and conclusions of the Internal Event Analysis for the Laguna Verde Nuclear Power Plant are presented. The total core damage frequency calculated is 9.03×10⁻⁵ per year for internal events. The most dominant accident sequences found are the transients involving the loss of offsite power, the station blackout accidents, and the anticipated transients without SCRAM (ATWS). (Author)
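At their simplest, accident-sequence frequencies of the kind quantified here are products of an initiating-event frequency and the branch failure probabilities along an event-tree path. A sketch of one loss-of-offsite-power sequence with illustrative numbers only (not values from the Laguna Verde study):

```python
# minimal event-tree quantification: a sequence frequency is the product of
# an initiating-event frequency and the failure probabilities along the path
ie_loop = 2.0e-2          # loss of offsite power, per year (illustrative value)
p_diesels_fail = 1.0e-2   # emergency diesel generators fail to run
p_no_recovery = 5.0e-1    # offsite power not recovered in time

sequence_frequency = ie_loop * p_diesels_fail * p_no_recovery
print(f"{sequence_frequency:.1e}")  # 1.0e-04 per year
```

The total core damage frequency is then the sum of all such sequence frequencies, which is why revised IE frequencies propagate directly into the dominant-sequence ranking.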

  14. Analysis methodology for the post-trip return to power steam line break event

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-06-01

    An analysis of Steam Line Break (SLB) events which result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. The analysis methodology for post-trip RTP SLB is quite different from that for non-RTP SLB and is more difficult. Therefore, it is necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to a local heatup in the vicinity of the stuck CEA, and compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance: Departure from Nucleate Boiling Ratio (DNBR) and Linear Heat Generation Rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  15. Tree Mortality following Prescribed Fire and a Storm Surge Event in Slash Pine (Pinus elliottii var. densa) Forests in the Florida Keys, USA

    Jay P. Sah

    2010-01-01

    Full Text Available In fire-dependent forests, managers are interested in predicting the consequences of prescribed burning on postfire tree mortality. We examined the effects of prescribed fire on tree mortality in Florida Keys pine forests, using a factorial design with understory type, season, and year of burn as factors. We also used logistic regression to model the effects of burn season, fire severity, and tree dimensions on individual tree mortality. Despite limited statistical power due to problems in carrying out the full suite of planned experimental burns, associations with tree and fire variables were observed. Post-fire pine tree mortality was negatively correlated with tree size and positively correlated with char height and percent crown scorch. Unlike post-fire mortality, tree mortality associated with storm surge from Hurricane Wilma was greater in the large size classes. Due to their influence on population structure and fuel dynamics, the size-selective mortality patterns following fire and storm surge have practical importance for using fire as a management tool in Florida Keys pinelands in the future, particularly when the threats to their continued existence from tropical storms and sea level rise are expected to increase.

  16. Tree Mortality following Prescribed Fire and a Storm Surge Event in Slash Pine (Pinus elliottii var. densa) Forests in the Florida Keys, USA

    Sah, J.P.; Ross, M.S.; Ross, M.S.; Ogurcak, D.E.; Snyder, J.R.

    2010-01-01

    In fire-dependent forests, managers are interested in predicting the consequences of prescribed burning on postfire tree mortality. We examined the effects of prescribed fire on tree mortality in Florida Keys pine forests, using a factorial design with understory type, season, and year of burn as factors. We also used logistic regression to model the effects of burn season, fire severity, and tree dimensions on individual tree mortality. Despite limited statistical power due to problems in carrying out the full suite of planned experimental burns, associations with tree and fire variables were observed. Post-fire pine tree mortality was negatively correlated with tree size and positively correlated with char height and percent crown scorch. Unlike post-fire mortality, tree mortality associated with storm surge from Hurricane Wilma was greater in the large size classes. Due to their influence on population structure and fuel dynamics, the size-selective mortality patterns following fire and storm surge have practical importance for using fire as a management tool in Florida Keys pinelands in the future, particularly when the threats to their continued existence from tropical storms and sea level rise are expected to increase.

  17. Tree mortality following prescribed fire and a storm surge event in Slash Pine (pinus elliottii var. densa) forests in the Florida Keys, USA

    Sah, Jay P.; Ross, Michael S.; Snyder, James R.; Ogurcak, Danielle E.

    2010-01-01

    In fire-dependent forests, managers are interested in predicting the consequences of prescribed burning on postfire tree mortality. We examined the effects of prescribed fire on tree mortality in Florida Keys pine forests, using a factorial design with understory type, season, and year of burn as factors. We also used logistic regression to model the effects of burn season, fire severity, and tree dimensions on individual tree mortality. Despite limited statistical power due to problems in carrying out the full suite of planned experimental burns, associations with tree and fire variables were observed. Post-fire pine tree mortality was negatively correlated with tree size and positively correlated with char height and percent crown scorch. Unlike post-fire mortality, tree mortality associated with storm surge from Hurricane Wilma was greater in the large size classes. Due to their influence on population structure and fuel dynamics, the size-selective mortality patterns following fire and storm surge have practical importance for using fire as a management tool in Florida Keys pinelands in the future, particularly when the threats to their continued existence from tropical storms and sea level rise are expected to increase.
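The logistic-regression step used in these records can be sketched as follows. The covariates echo the study (tree size, char height, percent crown scorch), but the data are simulated so that the reported signs (mortality decreasing with size, increasing with fire severity) hold by construction:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 400
# hypothetical covariates echoing the study: tree size (DBH, cm), char height (m), % crown scorch
dbh = rng.uniform(5, 40, n)
char_h = rng.uniform(0, 4, n)
scorch = rng.uniform(0, 100, n)

# simulated mortality: less likely for big trees, more likely with severe fire
true_logit = 1.0 - 0.10 * dbh + 0.6 * char_h + 0.03 * scorch
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# standardize covariates, then fit by plain gradient ascent on the log-likelihood
feats = np.column_stack([dbh, char_h, scorch])
feats = (feats - feats.mean(axis=0)) / feats.std(axis=0)
X = np.column_stack([np.ones(n), feats])
w = np.zeros(4)
for _ in range(5000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.1 * X.T @ (y - p) / n

# fitted coefficient signs recover the assumed relationships
print(w[1] < 0, w[2] > 0, w[3] > 0)
```

A production analysis would use a statistics package and report standard errors; the hand-rolled fit here only serves to show how the sign pattern in the abstract maps onto model coefficients.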

  18. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes

    Smits, M.; Janssen, J.; Vet, de H.C.W.; Zwaan, L.; Timmermans, D.R.M.; Groenewegen, P.P.; Wagner, C.

    2009-01-01

    BACKGROUND: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  19. Analysis of unintended events in hospitals : inter-rater reliability of constructing causal trees and classifying root causes

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Groenewegen, P.P.; Timmermans, D.

    2009-01-01

    Background. Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the

  20. Analysis of unintended events in hospitals: inter-rater reliability of constructing causal trees and classifying root causes.

    Smits, M.; Janssen, J.; Vet, R. de; Zwaan, L.; Timmermans, D.; Groenewegen, P.; Wagner, C.

    2009-01-01

    Background: Root cause analysis is a method to examine causes of unintended events. PRISMA (Prevention and Recovery Information System for Monitoring and Analysis) is a root cause analysis tool. With PRISMA, events are described in causal trees and root causes are subsequently classified with the
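The causal-tree step that PRISMA prescribes, common to the three records above, can be represented with a small recursive structure: the tree's leaves are the root causes that are then classified. A sketch with a hypothetical event (the category labels are illustrative, not PRISMA's actual classification codes):

```python
from dataclasses import dataclass, field

@dataclass
class CauseNode:
    """Node in a PRISMA-style causal tree; category labels are illustrative."""
    description: str
    category: str = "unclassified"   # e.g. technical / organisational / human
    children: list = field(default_factory=list)

def root_causes(node):
    """The leaves of the causal tree are the root causes to be classified."""
    if not node.children:
        return [node]
    return [leaf for child in node.children for leaf in root_causes(child)]

top = CauseNode("wrong medication administered")
top.children = [
    CauseNode("look-alike packaging", "technical"),
    CauseNode("double check skipped",
              children=[CauseNode("understaffed night shift", "organisational")]),
]
print([n.description for n in root_causes(top)])
# ['look-alike packaging', 'understaffed night shift']
```

Inter-rater reliability studies like these then ask whether two analysts, given the same event, build trees whose leaves receive the same classifications.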

  1. Antipsychotics, glycemic disorders, and life-threatening diabetic events: a Bayesian data-mining analysis of the FDA adverse event reporting system (1968-2004).

    DuMouchel, William; Fram, David; Yang, Xionghu; Mahmoud, Ramy A; Grogg, Amy L; Engelhart, Luella; Ramaswamy, Krishnan

    2008-01-01

    This analysis compared diabetes-related adverse events associated with use of different antipsychotic agents. A disproportionality analysis of the US Food and Drug Administration (FDA) Adverse Event Reporting System (AERS) was performed. Data from the FDA postmarketing AERS database (1968 through first quarter 2004) were evaluated. Drugs studied included aripiprazole, clozapine, haloperidol, olanzapine, quetiapine, risperidone, and ziprasidone. Fourteen Medical Dictionary for Regulatory Activities (MedDRA) Primary Terms (MPTs) were chosen to identify diabetes-related adverse events; 3 groupings into higher-level descriptive categories were also studied. Three methods of measuring drug-event associations were used: proportional reporting ratio, the empirical Bayes data-mining algorithm known as the Multi-Item Gamma Poisson Shrinker, and logistic regression (LR) analysis. Quantitative measures of association strength, with corresponding confidence intervals, between drugs and specified adverse events were computed and graphed. Some of the LR analyses were repeated separately for reports from patients under and over 45 years of age. Differences in association strength were declared statistically significant if the corresponding 90% confidence intervals did not overlap. Association with various glycemic events differed for different drugs. On average, the rankings of association strength agreed with the following ordering: low association, ziprasidone, aripiprazole, haloperidol, and risperidone; medium association, quetiapine; and strong association, clozapine and olanzapine. The median rank correlation between the above ordering and the 17 sets of LR coefficients (1 set for each glycemic event) was 93%. Many of the disproportionality measures were significantly different across drugs, and ratios of disproportionality factors of 5 or more were frequently observed. There are consistent and substantial differences between atypical antipsychotic drugs in the
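Of the three association measures this record lists, the proportional reporting ratio has the simplest closed form, computed from the 2x2 table of spontaneous reports. Illustrative counts, not actual AERS figures:

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR from the 2x2 report table:
    a: drug of interest & event of interest    b: same drug, all other events
    c: all other drugs, event of interest      d: all other drugs, other events
    """
    return (a / (a + b)) / (c / (c + d))

# illustrative counts, not AERS figures
prr = proportional_reporting_ratio(120, 880, 300, 9700)
print(round(prr, 2))  # 4.0
```

A PRR well above 1 flags disproportionate reporting; shrinkage methods like the Multi-Item Gamma Poisson Shrinker dampen such ratios when the underlying counts are small.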

  2. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

    Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trials patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data was obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight different age groups to support cross-age group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest-incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years-old group was 31.41%, approximately 1.51 times higher (P=.04) than the young adult group (20-29 years) at 20.76%. The second-highest group is the 50-59 years-old group, with an incidence rate of 30.09%, also significantly higher than the reference group. The adverse event diversity also increased with patient age: clinical studies that recruited older patients (older than 40 years) were more likely to observe a diverse range of adverse events than studies of the reference age group.

  3. Performance Analysis of Hierarchical Group Key Management Integrated with Adaptive Intrusion Detection in Mobile ad hoc Networks

    2016-04-05

    Applications in wireless networks such as military battlefields, emergency response, mobile commerce, online gaming, and collaborative work are based on the... Keywords: mobile ad hoc networks; intrusion detection; group communication systems.
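A standard cost result for hierarchical group key management of the kind this report analyzes (not a result specific to the report): in a Logical Key Hierarchy, evicting one member requires replacing only the keys on its path to the root, roughly degree × log_degree(n) messages instead of re-distributing a shared key to every member. Sketch:

```python
def lkh_rekey_messages(group_size, degree=2):
    """Messages to rekey after one eviction in a Logical Key Hierarchy:
    every key on the evicted member's path to the root is replaced, and
    each replacement is sent to the sibling subtrees: ~degree * depth."""
    depth, capacity = 0, 1
    while capacity < group_size:   # smallest tree of this degree that fits the group
        capacity *= degree
        depth += 1
    return degree * depth

print(lkh_rekey_messages(1024))  # 20, versus ~1023 messages with one shared key
```

This logarithmic rekey cost is what makes hierarchical schemes attractive for mobile ad hoc groups with frequent membership changes.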

  4. Living with extreme weather events - perspectives from climatology, geomorphological analysis, chronicles and opinion polls

    Auer, I.; Kirchengast, A.; Proske, H.

    2009-09-01

    The ongoing climate change debate focuses more and more on changing extreme events. Information on past events can be derived from a number of sources, such as instrumental data, residual impacts in the landscape, and also chronicles and people's memories. A project called "A Tale of Two Valleys", within the framework of the research program "proVision", allowed us to study past extreme events in two inner-alpine valleys using the sources mentioned above. Instrumental climate time series provided information for the past 200 years; however, great attention had to be given to the homogeneity of the series. To derive homogenized time series of selected climate change indices, methods such as HOCLIS and Vincent have been applied. Trend analyses of climate change indices indicate increases or decreases in extreme events. Traces of major geomorphodynamic processes of the past (e.g., rockfalls, landslides, debris flows) that were triggered or affected by extreme weather events are still apparent in the landscape and could be evaluated by geomorphological analysis using remote sensing and field data. Regional chronicles provided additional knowledge and covered longer periods back in time; however, compared with meteorological time series they contain a high degree of subjectivity, and intermittent recording cannot be ruled out. Finally, questionnaires and oral history complemented our picture of past extreme weather events: people were affected differently and have different memories of these events. The joint analysis of these four data sources showed agreement to some extent, but also some reasonable differences: meteorological data are point measurements only, sometimes with too coarse a temporal resolution, and due to land-use changes and improved constructional measures the impact of an extreme meteorological event may be different today compared with earlier times.

  5. The logic of surveillance guidelines: an analysis of vaccine adverse event reports from an ontological perspective.

    Mélanie Courtot

    Full Text Available BACKGROUND: When increased rates of adverse events following immunization are detected, regulatory action can be taken by public health agencies. However, to be interpreted, reports of adverse events must be encoded in a consistent way. Regulatory agencies rely on guidelines to help determine the diagnosis of the adverse events. Manual application of these guidelines is expensive, time consuming, and open to logical errors. Representing these guidelines in a format amenable to automated processing can make this process more efficient. METHODS AND FINDINGS: Using the Brighton anaphylaxis case definition, we show that existing clinical guidelines used as standards in pharmacovigilance can be logically encoded using a formal representation such as the Adverse Event Reporting Ontology we developed. We validated the classification of vaccine adverse event reports using the ontology against existing rule-based systems and a manually curated subset of the Vaccine Adverse Event Reporting System. However, we encountered a number of critical issues in the formulation and application of the clinical guidelines. We report these issues and the steps being taken to address them in current surveillance systems and in the terminological standards in use. CONCLUSIONS: By standardizing and improving the reporting process, we were able to automate diagnosis confirmation. By allowing medical experts to prioritize reports, such a system can accelerate the identification of adverse reactions to vaccines and the response of regulatory agencies. This approach of combining ontology and semantic technologies can be used to improve other areas of vaccine adverse event report analysis and should inform both the design of clinical guidelines and how they are used in the future. AVAILABILITY: Sufficient material to reproduce our results is available, including documentation, ontology, code and datasets, at http://purl.obolibrary.org/obo/aero.

  6. Analysis of the Power oscillations event in Laguna Verde Nuclear Power Plant. Preliminary Report

    Gonzalez M, V.M.; Amador G, R.; Castillo, R.; Hernandez, J.L.

    1995-01-01

    The event that occurred at Unit 1 of the Laguna Verde Nuclear Power Plant on January 24, 1995, is analyzed using the Ramona 3B code. During this event, Unit 1 experienced power oscillations while operating just before the transfer of the recirculation pumps to high speed. The phenomenon was detected in time by the reactor operator, who shut down the reactor with a manual scram. The oscillations reached a maximum amplitude of 10.5% of nominal power, peak to peak, with a frequency of 0.5 Hz. Preliminary evaluations show that the event did not endanger the fuel integrity. The results of simulating the reactor core with the Ramona 3B code show that the code is capable of modeling the reactor oscillations. Nevertheless, a more detailed simulation of the event will be necessary in order to prove that the code can predict the onset of oscillations. An additional analysis will also be needed to identify the factors that influence reactor stability, in order to make recommendations and thus avoid the recurrence of this kind of event. (Author)

  7. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  8. A hydrological analysis of the 4 November 2011 event in Genoa

    F. Silvestro

    2012-09-01

    Full Text Available On 4 November 2011, a flash flood event hit the area of Genoa with dramatic consequences. Such an event represents, from the meteorological and hydrological perspective, a paradigm of flash floods in the Mediterranean environment.

    The hydro-meteorological probabilistic forecasting system for small and medium size catchments in use at the Civil Protection Centre of the Liguria region exhibited excellent performance for this event, predicting, 24–48 h in advance, the potential level of risk associated with the forecast. It greatly helped the decision makers in issuing a timely and correct alert.

    In this work we present the operational outputs of the system provided during the Liguria events and the post-event hydrological modelling analysis that has been carried out, also accounting for crowd-sourced information and data. We discuss the benefit of the implemented probabilistic systems for decision-making under uncertainty, highlighting how, in this case, the multi-catchment approach used for predicting floods in small basins has been crucial.

  9. Accuracy analysis of measurements on a stable power-law distributed series of events

    Matthews, J O; Hopcraft, K I; Jakeman, E; Siviour, G B

    2006-01-01

    We investigate how finite measurement time limits the accuracy with which the parameters of a stably distributed random series of events can be determined. The model process is generated by timing the emigration of individuals from a population that is subject to deaths and a particular choice of multiple immigration events. This leads to a scale-free discrete random process where customary measures, such as mean value and variance, do not exist. However, converting the number of events occurring in fixed time intervals to a 1-bit 'clipped' process allows the construction of well-behaved statistics that still retain vestiges of the original power-law and fluctuation properties. These statistics include the clipped mean and correlation function, from measurements of which both the power-law index of the distribution of events and the time constant of its fluctuations can be deduced. We report here a theoretical analysis of the accuracy of measurements of the mean of the clipped process. This indicates that, for a fixed experiment time, the error on measurements of the sample mean is minimized by an optimum choice of the number of samples. It is shown furthermore that this choice is sensitive to the power-law index and that the approach to Poisson statistics is dominated by rare events or 'outliers'. Our results are supported by numerical simulation
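The clipping idea above — reducing interval counts to one bit so that well-behaved sample statistics exist — can be sketched as follows; the power-law count generator is an illustrative stand-in, not the paper's death/multiple-immigration population model:

```python
import random

random.seed(1)

# Illustrative stand-in for a scale-free counting process: draw the number
# of events per fixed time interval with a power-law tail, P(count >= k)
# falling off like k**(-alpha), so the raw counts have no finite mean.
def heavy_tailed_count(alpha=1.5):
    u = random.random()
    return int(u ** (-1.0 / alpha)) - 1  # 0, 1, 2, ... with heavy tail

counts = [heavy_tailed_count() for _ in range(10_000)]

# The 1-bit "clipped" process: 1 if at least one event occurred in the
# interval, else 0.
clipped = [1 if c > 0 else 0 for c in counts]

# The clipped mean is a probability, hence finite and well behaved, even
# though the raw counts are scale-free and dominated by rare outliers.
clipped_mean = sum(clipped) / len(clipped)
print(f"clipped mean = {clipped_mean:.3f}")
```

Estimating the power-law index from statistics of the clipped process, as the paper does analytically, would build on exactly this kind of transformed series.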

  10. Security as the Key Factor in Contemporary Tourism: Specificities Identified Through the Analysis of Responders’ Attitudes

    Penić, Josipa; Kurečić, Petar

    2017-01-01

    The paper represents a product of mentor-graduate student cooperation, developed in the graduate study of Business Economics, major Tourism. Following the latest threatening events, and having in mind those yet to come, we can conclude that no country can benefit from the tourism industry if it does not at the same time develop its security system as an integral part of the standard tourist offer. Analyzing the trends in contemporary tourism, the safety and security issues became the decisive fa...

  11. Software failure events derivation and analysis by frame-based technique

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, comprising physical, logical, and cognitive frames, was adopted to perform digital I&C failure event derivation and analysis for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow; hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I&C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I&C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I&C software failure events. The case study of this research includes: (1) the software common-mode failure (CMF) analysis for the major digital control systems; and (2) postulated ABWR digital I&C software failure events derived from actual non-ABWR digital I&C software failure events reported to the LER system of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status were successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well...

  12. Finite-size analysis of continuous-variable measurement-device-independent quantum key distribution

    Zhang, Xueying; Zhang, Yichen; Zhao, Yijia; Wang, Xiangyu; Yu, Song; Guo, Hong

    2017-10-01

    We study the impact of the finite-size effect on the continuous-variable measurement-device-independent quantum key distribution (CV-MDI QKD) protocol, mainly considering the finite-size effect on the parameter estimation procedure. The central limit theorem and the maximum likelihood estimation theorem are used to estimate the parameters. We also analyze the relationship between the number of exchanged signals and the optimal modulation variance in the protocol. It is proved that when Charlie's position is close to Bob's, the CV-MDI QKD protocol has the farthest transmission distance in the finite-size scenario. Finally, we discuss the impact of finite-size effects related to the practical detection in the CV-MDI QKD protocol. The overall results indicate that the finite-size effect has a great influence on the secret-key rate of the CV-MDI QKD protocol and should not be ignored.

  13. Key Performance Indicators and Analysts' Earnings Forecast Accuracy: An Application of Content Analysis

    Alireza Dorestani; Zabihollah Rezaee

    2011-01-01

    We examine the association between the extent of change in key performance indicator (KPI) disclosures and the accuracy of forecasts made by analysts. KPIs are regarded as improving both the transparency and relevancy of public financial information. The results of using linear regression models show that, contrary to our prediction and the hypothesis of this paper, there is no significant association between the change in non-financial KPI disclosures and the accuracy of analysts' forecasts....

  14. Desynchronization Chaos Shift Keying Method Based on the Error Second Derivative and Its Security Analysis

    Čelikovský, Sergej; Lynnyk, Volodymyr

    2012-01-01

    Roč. 22, č. 9 (2012), 1250231-1-1250231-11 ISSN 0218-1274 R&D Projects: GA ČR(CZ) GAP103/12/1794 Institutional support: RVO:67985556 Keywords : Nonlinear system * desynchronization * chaos shift keying * generalized Lorenz system Subject RIV: BC - Control Systems Theory Impact factor: 0.921, year: 2012 http://library.utia.cas.cz/separaty/2012/TR/celikovsky-0381701.pdf

  15. Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness

    2016-06-01

    ...allows us to select a smaller security parameter). We implement our software tool in the Python programming language, leveraging the SymPy symbolic solver library, and we validate our tool using the Schnorr public-key...

  16. A tandem regression-outlier analysis of a ligand cellular system for key structural modifications around ligand binding.

    Lin, Ying-Ting

    2013-04-30

    A tandem technique of hard equipment is often used for the chemical analysis of a single cell, first to isolate and then to detect the wanted identities. The first part is the separation of wanted chemicals from the bulk of a cell; the second part is the actual detection of the important identities. To identify the key structural modifications around ligand binding, the present study aims to develop a cheminformatics counterpart of this tandem technique. A statistical regression and its outliers act as a computational technique for separation. A PPARγ (peroxisome proliferator-activated receptor gamma) agonist cellular system was subjected to such an investigation. Results show that this tandem regression-outlier analysis, or the prioritization of the context equations tagged with features of the outliers, is an effective regression technique of cheminformatics for detecting key structural modifications, as well as their tendency to impact ligand binding. The key structural modifications around ligand binding are effectively extracted or characterized out of the cellular reactions. This is because molecular binding is the paramount factor in such a ligand cellular system, and key structural modifications around ligand binding are expected to create outliers. Therefore, such outliers can be captured by this tandem regression-outlier analysis.

  17. Review article: loss of the calcium-sensing receptor in colonic epithelium is a key event in the pathogenesis of colon cancer.

    Rogers, Ailín C

    2012-03-01

    The calcium-sensing receptor (CaSR) is expressed abundantly in normal colonic epithelium and lost in colon cancer, but its exact role on a molecular level and within the carcinogenesis pathway is yet to be described. Epidemiologic studies show that inadequate dietary calcium predisposes to colon cancer; this may be due to the ability of calcium to bind and upregulate the CaSR. Loss of CaSR expression does not seem to be an early event in carcinogenesis; indeed it is associated with late stage, poorly differentiated, chemo-resistant tumors. Induction of CaSR expression in neoplastic colonocytes arrests tumor progression and deems tumors more sensitive to chemotherapy; hence CaSR may be an important target in colon cancer treatment. The CaSR has a complex role in colon cancer; however, more investigation is required on a molecular level to clarify its exact function in carcinogenesis. This review describes the mechanisms by which the CaSR is currently implicated in colon cancer and identifies areas where further study is needed.

  18. Cryogenic dark matter search (CDMS II): Application of neural networks and wavelets to event analysis

    Attisha, Michael J. [Brown U.

    2006-01-01

    The Cryogenic Dark Matter Search (CDMS) experiment is designed to search for dark matter in the form of Weakly Interacting Massive Particles (WIMPs) via their elastic scattering interactions with nuclei. This dissertation presents the CDMS detector technology and the commissioning of two towers of detectors at the deep underground site in Soudan, Minnesota. CDMS detectors comprise crystals of Ge and Si at temperatures of 20 mK, which provide ~keV energy resolution and the ability to perform particle identification on an event-by-event basis. Event identification is performed via a two-fold interaction signature: an ionization response and an athermal phonon response. Photons and charged particles result in electron recoils in the crystal, while neutrons and WIMPs result in nuclear recoils. Since the ionization response is quenched by a factor of ~3 (~2) in Ge (Si) for nuclear recoils compared to electron recoils, the relative amplitude of the two detector responses allows discrimination between recoil types. The primary source of background events in CDMS arises from electron recoils in the outer 50 µm of the detector surface, which have a reduced ionization response. We develop a quantitative model of this 'dead layer' effect and successfully apply the model to Monte Carlo simulation of CDMS calibration data. Analysis of data from the two-tower run of March-August 2004 is performed, resulting in the world's most sensitive limits on the spin-independent WIMP-nucleon cross-section, with a 90% C.L. upper limit of 1.6 × 10^-43 cm^2 on Ge for a 60 GeV WIMP. An approach to performing surface event discrimination using neural networks and wavelets is developed. A Bayesian methodology for classifying surface events using neural networks is found to provide an optimized method based on minimization of the expected dark matter limit. The discrete wavelet analysis of CDMS phonon pulses improves surface event discrimination in conjunction with the neural

  19. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Narulita, Ida; Ningrum, Widya

    2018-02-01

    Indonesia is very vulnerable to flood disasters because it has high rainfall throughout the year. Flood is categorized as the most important hazard because it causes social, economic, and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets, to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) that occurred before flood disasters in areas with monsoonal, equatorial, and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall data at 0.05° spatial resolution and daily temporal resolution, the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  20. Brief communication: Post-event analysis of loss of life due to hurricane Harvey

    Jonkman, Sebastiaan N.; Godfroy, Maartje; Sebastian, Antonia; Kolen, Bas

    2018-01-01

    An analysis was made of the loss of life directly caused by hurricane Harvey. Information was collected for 70 fatalities that occurred directly due to the event. Most of the fatalities occurred in the greater Houston area, which was most severely affected by extreme rainfall and heavy flooding. The majority of fatalities in this area were recovered outside the designated 100- and 500-year flood zones. Most fatalities occurred due to drowning (81 %), particularly in and around vehicles...

  1. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
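As a loose illustration of treating heartbeats as an event series rather than a continuous signal, the sketch below bins synthetic, respiration-modulated beat times into a 0/1 series and computes a zero-lag correlation against the respiration signal; both the synthetic data and the simple correlation measure are assumptions for illustration, not the authors' cross-spectral technique:

```python
import math

dt = 0.1                                   # bin width [s]
t = [i * dt for i in range(600)]           # 60 s of data
resp = [math.sin(2 * math.pi * 0.25 * ti) for ti in t]  # 0.25 Hz breathing

# Synthetic heartbeat times at ~1 Hz, with the interbeat interval
# modulated by the respiration phase (illustrative toy model).
beat_times, tau = [], 0.0
while tau < 60.0:
    beat_times.append(tau)
    tau += 1.0 + 0.1 * math.sin(2 * math.pi * 0.25 * tau)

# Event-like representation: a 0/1 series marking bins containing a beat.
events = [0] * len(t)
for b in beat_times:
    events[min(int(b / dt), len(t) - 1)] = 1

def corr(a, b):
    """Normalized zero-lag cross-correlation of two equal-length series."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den

print(f"zero-lag correlation: {corr(events, resp):+.3f}")
```

A full cross-spectral analysis would Fourier-transform such series and inspect coherence by frequency; the point here is only the event-to-series conversion step.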

  2. SHAREHOLDERS VALUE AND CATASTROPHE BONDS. AN EVENT STUDY ANALYSIS AT EUROPEAN LEVEL

    Constantin, Laura-Gabriela; Cernat-Gruici, Bogdan; Lupu, Radu; Nadotti Loris, Lino Maria

    2015-01-01

    Considering that E.U.-based (re)insurance companies are increasingly active within the alternative risk transfer market segment, the aim of the present paper is to emphasize the impact of issuing cat bonds on shareholders' value, highlighting the competitive advantages of the analysed (re)insurance companies as they pursue the consolidation of their resilience in a turbulent economic environment. Eminently an applied piece of research, the analysis employs an event study methodology w...

  3. FINANCIAL MARKET REACTIONS TO INTERNATIONAL MERGERS & ACQUISITIONS IN THE BREWING INDUSTRY: AN EVENT STUDY ANALYSIS

    Heyder, Matthias; Ebneth, Oliver; Theuvsen, Ludwig

    2008-01-01

    Cross-border acquisitions have been the growing trend in recent years in the world brewing industry, giving brewers the opportunity to enhance their degree of internationalization and market share remarkably. This study employs event study analysis to examine 31 mergers and acquisitions among leading European brewing groups. Differences regarding financial market reactions can be determined within the European peer group. Managerial implications as well as future research propositions conclud...
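A market-model event study of the kind used in such analyses can be sketched as follows; all return figures and window lengths are invented for illustration and are not the study's sample of 31 deals:

```python
# Event-study sketch: fit a market model on an estimation window, then
# cumulate abnormal returns over the event window. Numbers are made up.

def ols(x, y):
    """Ordinary least squares slope and intercept for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))
    return beta, my - beta * mx

# Estimation window: daily market and acquirer returns (illustrative).
market_est = [0.001, -0.002, 0.003, 0.000, 0.002, -0.001, 0.001, 0.002]
stock_est  = [0.002, -0.001, 0.004, 0.001, 0.003, -0.002, 0.001, 0.003]
beta, alpha = ols(market_est, stock_est)

# Event window around the announcement: the abnormal return is the actual
# return minus the market-model prediction; CAR is their sum.
market_evt = [0.002, -0.001, 0.001]
stock_evt  = [0.010,  0.004, 0.002]        # announcement-day jump
abnormal = [s - (alpha + beta * m) for s, m in zip(stock_evt, market_evt)]
car = sum(abnormal)
print(f"beta = {beta:.2f}, CAR = {car:.4f}")
```

In an actual study, CARs are averaged across deals and tested for significance against zero; this sketch shows only the single-firm computation.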

  4. Neural network approach in multichannel auditory event-related potential analysis.

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.

  5. Advanced reactor passive system reliability demonstration analysis for an external event

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2017-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event

  6. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    Matthew Bucknor

    2017-03-01

    Full Text Available Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  7. Advanced reactor passive system reliability demonstration analysis for an external event

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)

    2017-03-15

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  8. RELAP5/MOD 3.3 analysis of Reactor Coolant Pump Trip event at NPP Krsko

    Bencik, V.; Debrecin, N.; Foretic, D.

    2003-01-01

    In this paper, the results of the RELAP5/MOD 3.3 analysis of the Reactor Coolant Pump (RCP) trip event at NPP Krsko are presented. The event was initiated by an operator action aimed at preventing damage to the RCP 2 bearing. The action consisted of a power reduction, lasting 50 minutes, followed by a reactor trip and a subsequent RCP 2 trip when the reactor power had been reduced to 28%. Two minutes after the reactor trip, the Main Steam Isolation Valves (MSIVs) were closed and the steam dump flow was shut off. On the secondary side, the Steam Generator (SG) pressure rose until SG 1 Safety Valve (SV) 1 opened. A realistic RELAP5/MOD 3.3 analysis was performed in order to model the particular plant behavior caused by the operator actions. The comparison of the RELAP5/MOD 3.3 results with the measurements for the power reduction transient shows small differences in the major parameters (nuclear power, average temperature, secondary pressure). The main trends and physical phenomena following the RCP trip event were well reproduced in the analysis. The parameters that have the major influence on the transient results have been identified. The influence of the SG 1 relief and safety valves on the transient results was investigated more closely. (author)

  9. The January 2001, El Salvador event: a multi-data analysis

    Vallee, M.; Bouchon, M.; Schwartz, S. Y.

    2001-12-01

    On January 13, 2001, a large normal-faulting event (Mw = 7.6) occurred 100 kilometers off the Salvadoran coast (Central America) with a centroid depth of about 50 km. The size of this event is surprising given the classical idea that such events should be much weaker than thrust events in subduction zones. We analysed this earthquake with different types of data: because teleseismic waves are the only data offering good azimuthal coverage, we first built a kinematic source model with P and SH waves provided by the IRIS-GEOSCOPE networks. The ambiguity between the 30° plane (dipping toward the Pacific Ocean) and the 60° plane (dipping toward Central America) led us to carry out a parallel analysis of the two possible planes. We used a simple point-source modelling to define the main characteristics of the event and then used an extended source to retrieve the kinematic features of the rupture. For the two possible planes, this analysis reveals a downdip and northwest rupture propagation, but the difference of fit remains subtle even when using the extended source. In a second part we confronted our models for the two planes with other seismological data: (1) regional data, (2) surface wave data through an Empirical Green Function given by a similar but much weaker earthquake which occurred in July 1996, and lastly (3) near-field data provided by Universidad Centroamericana (UCA) and Centro de Investigationes Geotecnicas (CIG). Regional data do not discriminate between the two planes either, but surface waves and especially near-field data confirm that the fault plane is the steeper one, dipping toward Central America. Moreover, the slight directivity toward the north is confirmed by surface waves.

  10. Analysis on Outcome of 3537 Patients with Coronary Artery Disease: Integrative Medicine for Cardiovascular Events

    Zhu-ye Gao

    2013-01-01

    Aims. To investigate the treatment of hospitalized patients with coronary artery disease (CAD) and the prognostic factors in Beijing, China. Materials and Methods. A multicenter prospective study was conducted through an integrative clinical and research platform at 12 hospitals in Beijing, China. The clinical information of 3537 hospitalized patients with CAD was collected from September 2009 to May 2011, and the efficacy of secondary prevention during one-year follow-up was evaluated. In addition, a logistic regression analysis was performed to identify factors with an independent impact on the prognosis. Results. The average age of all patients was 64.88 ± 11.97 years, and 65.42% were male. Medication use was as follows: antiplatelet drugs, 91.97%; statins, 83.66%; β-receptor blockers, 72.55%; ACEI/ARB, 58.92%; and revascularization (including PCI and CABG), 40.29%. The overall incidence of cardiovascular events was 13.26% (469/3537). The logistic stepwise regression analysis showed that heart failure (OR, 3.707; 95% CI, 2.756–4.986), age ≥ 65 years (OR, 2.007; 95% CI, 1.587–2.53), and myocardial infarction (OR, 1.649; 95% CI, 1.322–2.057) were independent risk factors for cardiovascular events occurring during the one-year follow-up. Integrative medicine (IM) therapy showed a beneficial tendency toward decreasing the incidence of cardiovascular events, although no statistical significance was found (OR, 0.797; 95% CI, 0.613–1.036). Conclusions. Heart failure, age ≥ 65 years, and myocardial infarction were associated with an increased incidence of cardiovascular events, and treatment with IM showed a tendency toward decreasing the incidence of cardiovascular events.
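    Odds ratios with Wald confidence intervals of the kind reported above can be computed directly from a 2x2 exposure-by-event table. A minimal sketch (the counts below are invented for illustration and are not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with event, b = exposed without,
    c = unexposed with event, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only (not the study's data).
or_, lo, hi = odds_ratio_ci(120, 380, 60, 440)
print(f"OR = {or_:.3f}, 95% CI = {lo:.3f}-{hi:.3f}")
```

A stepwise multivariable model, as used in the study, additionally adjusts each OR for the other covariates; the single-table computation above shows only the unadjusted case.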

  11. STOP-EVENT-RELATED POTENTIALS FROM INTRACRANIAL ELECTRODES REVEAL A KEY ROLE OF PREMOTOR AND MOTOR CORTICES IN STOPPING ONGOING MOVEMENTS

    Maurizio eMattia

    2012-06-01

    In humans, the ability to withhold manual motor responses seems to rely on a right-lateralized frontal–basal ganglia–thalamic network, including the pre-supplementary motor area and the inferior frontal gyrus. These areas should drive the subthalamic nuclei to implement movement inhibition via the hyperdirect pathway. The output of this network is expected to influence those cortical areas underlying limb movement preparation and initiation, i.e. premotor (PMA) and primary motor (M1) cortices. Electroencephalographic (EEG) studies have shown an enhancement of the N200/P300 complex in the event-related potentials (ERPs) when a planned reaching movement is successfully stopped after the presentation of an infrequent stop-signal. PMA and M1 have been suggested as possible neural sources of this ERP complex but, due to the limited spatial resolution of scalp EEG, it is not yet clear which cortical areas contribute to its generation. To elucidate the role of motor cortices, we recorded epicortical ERPs from the lateral surface of the fronto-temporal lobes of five pharmacoresistant epileptic patients performing a reaching version of the countermanding task while undergoing presurgical monitoring. We consistently found a stereotyped ERP complex at the single-trial level when a movement was successfully cancelled. These ERPs were selectively expressed in M1, PMA and Brodmann's area (BA) 9, and their onsets preceded the end of the stop process, suggesting a causal involvement in this executive function. Such ERPs also occurred in unsuccessful-stop trials, that is, when subjects moved despite the occurrence of a stop-signal, mostly when they had long reaction times. These findings support the hypothesis that motor cortices are the final target of the inhibitory command elaborated by the frontal–basal ganglia–thalamic network.

  12. CYP2F2-generated metabolites, not styrene oxide, are a key event mediating the mode of action of styrene-induced mouse lung tumors.

    Cruzan, G; Bus, J; Hotchkiss, J; Harkema, J; Banton, M; Sarang, S

    2012-02-01

    Styrene induces lung tumors in mice but not in rats. Although metabolism of styrene to 7,8-styrene oxide (SO) by CYP2E1 has been suggested as a mediator of styrene toxicity, lung toxicity is not attenuated in CYP2E1 knockout mice. However, styrene and/or SO metabolism by mouse lung Clara cell-localized CYP2F2 to ring-oxidized cytotoxic metabolite(s) has been postulated as a key metabolic gateway responsible for both lung toxicity and possible tumorigenicity. To test this hypothesis, the lung toxicity of styrene and SO was evaluated in C57BL/6 (WT) and CYP2F2⁻/⁻ knockout mice treated with styrene (400 mg/kg/day, gavage, or 200 or 400 mg/kg/day, ip) or S- or R-SO (200 mg/kg/day, ip) for 5 days. Styrene-treated WT mice displayed significant necrosis and exfoliation of Clara cells, and the cumulative BrdU-labeling index of S-phase cells was markedly increased in terminal bronchioles of WT mice exposed to styrene or S- or R-SO. In contrast, Clara and terminal bronchiole cell toxicity was not observed in CYP2F2⁻/⁻ mice exposed to either styrene or SO. This study clearly demonstrates that the mouse lung toxicity of both styrene and SO is critically dependent on metabolism by CYP2F2. Importantly, the human isoform of CYP2F, CYP2F1, is expressed at much lower levels and likely does not catalyze significant styrene metabolism, supporting the hypothesis that styrene-induced mouse lung tumors may not quantitatively, or possibly qualitatively, predict lung tumor potential in humans. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. Probabilistic Dynamics for Integrated Analysis of Accident Sequences considering Uncertain Events

    Robertas Alzbutas

    2015-01-01

    Analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. In the probabilistic safety assessment currently in use, however, this separation is an issue: the lack of treatment of dynamic interactions between the physical processes on the one hand and random events on the other limits the assessment. In general, there are many mathematical modelling theories which can be used separately or in an integrated way to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach of non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test case simulation in view of various methods' applications for severe accident scenario simulation and uncertainty analysis. For this, and for wider analysis of accident sequences, the initial test case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and the representation of their uncertainty. The developed approach to accident modelling and analysis can be efficiently used to estimate the reliability of hybrid systems and at the same time to analyse, and possibly decrease, the uncertainty of this estimate.

  14. Security analysis on some experimental quantum key distribution systems with imperfect optical and electrical devices

    Liang, Lin-Mei; Sun, Shi-Hai; Jiang, Mu-Sheng; Li, Chun-Yan

    2014-10-01

    In general, quantum key distribution (QKD) has been proved unconditionally secure for perfect devices, owing to the quantum uncertainty principle, the quantum no-cloning theorem, and the quantum non-dividing principle, which means that a quantum cannot be divided further. However, the practical optical and electrical devices used in a system are imperfect, which can be exploited by an eavesdropper to partially or totally spy on the secret key shared between the legitimate parties. In this article, we first briefly review recent international work on quantum hacking of experimental QKD systems with imperfect devices, and then present our own recent hacking work in detail, including the passive Faraday-mirror attack, the partially random phase attack, the wavelength-selected photon-number-splitting attack, the frequency shift attack, and the single-photon-detector attack. These quantum attacks are a reminder that the security of practical QKD systems with imperfect devices needs to be improved, either by adding countermeasures or by adopting an altogether different protocol, such as the measurement-device-independent protocol, which avoids quantum hacking of imperfect measurement devices [Lo, et al., Phys. Rev. Lett., 2012, 108: 130503].

  15. Detector-device-independent quantum key distribution: Security analysis and fast implementation

    Boaron, Alberto; Korzh, Boris; Boso, Gianluca; Martin, Anthony; Zbinden, Hugo; Houlmann, Raphael; Lim, Charles Ci Wen

    2016-01-01

    One of the most pressing issues in quantum key distribution (QKD) is the problem of detector side-channel attacks. To overcome this problem, researchers proposed an elegant “time-reversal” QKD protocol called measurement-device-independent QKD (MDI-QKD), which is based on time-reversed entanglement swapping. However, MDI-QKD is more challenging to implement than standard point-to-point QKD. Recently, an intermediary QKD protocol called detector-device-independent QKD (DDI-QKD) has been proposed to overcome the drawbacks of MDI-QKD, with the hope that it would eventually lead to a more efficient detector side-channel-free QKD system. Here, we analyze the security of DDI-QKD and elucidate its security assumptions. We find that DDI-QKD is not equivalent to MDI-QKD, but its security can be demonstrated with reasonable assumptions. On the more practical side, we consider the feasibility of DDI-QKD and present a fast experimental demonstration (clocked at 625 MHz), capable of secret key exchange over distances of more than 90 km.

  16. A matter of definition--key elements identified in a discourse analysis of definitions of palliative care.

    Pastrana, T; Jünger, S; Ostgathe, C; Elsner, F; Radbruch, L

    2008-04-01

    For more than 30 years, the term "palliative care" has been used. From the outset, the term has undergone a series of transformations in its definitions and, consequently, in its tasks and goals. There remains a lack of consensus on a definition. The aim of this article is to analyse the definitions of palliative care in the specialist literature and to identify the key elements of palliative care using discourse analysis, a qualitative methodology. The literature search focused on definitions of the terms 'palliative medicine' and 'palliative care' on the World Wide Web and in medical reference books in English and German. A total of 37 English and 26 German definitions were identified and analysed. Our study confirmed the lack of a consistent meaning of the investigated terms, reflecting on-going discussion about the nature of the field among palliative care practitioners. Several common key elements were identified. Four main categories emerged from the discourse analysis of the definition of palliative care: target groups, structure, tasks and expertise. In addition, the theoretical principles and goals of palliative care were discussed and found to be key elements, with relief and prevention of suffering and improvement of quality of life as the main goals. The identified key elements can contribute to the definition of the concept of 'palliative care'. Our study confirms the importance of semantic and ethical influences on palliative care that should be considered in future research on semantics in different languages.

  17. Procedure proposed for performance of a probabilistic safety analysis for the event of "Air plane crash"

    Hoffmann, H.H.

    1998-01-01

    A procedures guide for a probabilistic safety analysis of the external event 'air plane crash' has been prepared. The method is based on analyses done within the framework of PSA for German NPPs as well as on international documents. Both crashes of military air planes and crashes of commercial air planes contribute to the plant risk. For the determination of the plant-related crash rate, the air traffic is divided into three different categories: the landing and takeoff phase; the air-lane traffic and waiting-loop traffic; and the free air traffic. The air planes are in turn divided into different types and weight classes. (orig./GL)
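    The decomposition of the plant-related crash rate by traffic category and aircraft class can be sketched as a simple sum of rate-times-area contributions (all rates, areas, and category names below are invented for illustration, not taken from the guide):

```python
def plant_crash_frequency(categories):
    """Plant-specific crash frequency (per year), summed over traffic
    categories and aircraft classes; all numbers are illustrative only."""
    total = 0.0
    for cat in categories:
        # crashes per km^2 per year for this category/class, times the
        # plant's effective target area in km^2
        total += cat["crash_rate_per_km2_yr"] * cat["effective_area_km2"]
    return total

categories = [
    {"name": "landing/takeoff, military",
     "crash_rate_per_km2_yr": 1e-8, "effective_area_km2": 0.02},
    {"name": "air-lane traffic, commercial",
     "crash_rate_per_km2_yr": 5e-10, "effective_area_km2": 0.02},
    {"name": "free traffic, light aircraft",
     "crash_rate_per_km2_yr": 2e-9, "effective_area_km2": 0.02},
]
freq = plant_crash_frequency(categories)
print(f"plant crash frequency: {freq:.2e} per year")
```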

  18. Analysis of an ordinary bedload transport event in a mountain torrent (Rio Vanti, Verona, Italy)

    Pastorello, Roberta; D'Agostino, Vincenzo

    2016-04-01

    The correct simulation of the sediment-transport response of mountain torrents, both for extreme and for ordinary flood events, is a fundamental step to understand the process, but also to drive proper decisions on the protection works. The objective of this research contribution is to reconstruct the 'ordinary' flood event, with the associated sediment-graph, of a flood that on the 14th of October, 2014 caused the formation of a little debris cone (about 200-210 m3) at the junction between the 'Rio Vanti' torrent catchment and the 'Selva di Progno' torrent (Veneto Region, Prealps, Verona, Italy). To this purpose, it is important to notice that a great part of the equations developed for the computation of the bedload transport capacity, like for example that of Schoklitsch (1962) or Smart and Jaeggi (1983), are focused on extraordinary events heavily affecting the river-bed armour. These formulas do not provide reliable results if used on events, like the one under analysis, not too far from bankfull conditions. The Rio Vanti event was characterized by a total rainfall depth of 36.2 mm and a back-calculated peak discharge of 6.12 m3/s with a return period of 1-2 years. The classical equations to assess the sediment transport capacity overestimate the total volume of the event by several orders of magnitude. As a consequence, the following experimental bedload transport equation (D'Agostino and Lenzi, 1999), valid for ordinary flood events, has been applied (q: unit water discharge; qc: unit discharge at bedload transport initiation; qs: unit bedload rate; S: thalweg slope): qs = 0.04 (q - qc) S^(3/2). In particular, starting from the real rainfall data, the hydrograph and the sediment-graph have been reconstructed. Then, comparing the total volume calculated via the above-cited equation to the real volume estimated using DoD techniques on a post-event photogrammetric survey, a very satisfactory agreement has been obtained. The result further supports the thesis
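    The D'Agostino and Lenzi (1999) relation qs = 0.04 (q - qc) S^(3/2) lends itself to a direct numerical sketch, integrating the unit bedload rate over a hydrograph to obtain an event volume (the hydrograph values, threshold discharge, and slope below are invented for illustration, not the Rio Vanti data):

```python
def unit_bedload_rate(q, qc, slope):
    """D'Agostino and Lenzi (1999) relation for ordinary flood events:
    qs = 0.04 * (q - qc) * S**1.5, with q and qc as unit discharges."""
    if q <= qc:
        return 0.0  # no transport below the initiation threshold
    return 0.04 * (q - qc) * slope ** 1.5

# Illustrative unit-discharge hydrograph (m3/s per m width), sampled every 60 s.
hydrograph = [0.2, 0.6, 1.2, 1.8, 1.4, 0.9, 0.5, 0.3]
dt = 60.0          # s
qc, S = 0.4, 0.15  # assumed threshold discharge and thalweg slope

# Event bedload volume per unit width: sum of rate * timestep over the hydrograph.
volume = sum(unit_bedload_rate(q, qc, S) * dt for q in hydrograph)
print(f"bedload volume per unit width: {volume:.3f} m3/m")
```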

  19. Cost analysis of adverse events associated with non-small cell lung cancer management in France

    Chouaid C

    2017-07-01

    , anemia (€5,752 per event), dehydration (€5,207 per event) and anorexia (€4,349 per event). Costs were mostly driven by hospitalization costs. Conclusion: Among the AEs identified, a majority appeared to have an important economic impact, with a management cost of at least €2,000 per event, mainly driven by hospitalization costs. This study may be of interest for economic evaluations of new interventions in NSCLC. Keywords: non-small cell lung cancer, adverse events, cost analysis, chemotherapy, immunotherapy

  20. Analysis and modeling of a hail event consequences on a building portfolio

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

    North-West Switzerland was affected by a severe hailstorm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 cantons (out of 26) with public insurance. The aim of this project is to benefit from the collected insurance data to better understand and estimate the risk of such an event. In a first step, a simple hail event simulator, which had been developed for a previous hail episode, is modified. The geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians, instead of 1D Gaussians on profiles as was the case in the previous version. The tool is then tested on this new event in order to establish its ability to give a fast damage estimation based on the radar image and the buildings' value and location. In a further step, the geometrical properties are used to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, it is expected that the orientation of the roof influences the aging of the material by controlling the frequency and amplitude of thaw-freeze cycles, thereby changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures. This information is then compared with insurance claims.
A last part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the
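    Representing a storm footprint as a superposition of 2D Gaussians, as in the simulator described above, can be sketched as follows (the component parameters below are invented for illustration, not fitted to the 2011 radar image):

```python
import math

def gaussian2d(x, y, x0, y0, sx, sy, amp):
    """Single elliptical 2D Gaussian intensity component."""
    return amp * math.exp(-(((x - x0) ** 2) / (2 * sx ** 2)
                            + ((y - y0) ** 2) / (2 * sy ** 2)))

def storm_intensity(x, y, components):
    # Storm footprint modeled as a superposition of 2D Gaussians.
    return sum(gaussian2d(x, y, *c) for c in components)

# Each component: (x0, y0, sigma_x, sigma_y, peak intensity) -- illustrative values.
components = [
    (10.0, 5.0, 4.0, 2.0, 30.0),
    (18.0, 7.0, 3.0, 3.0, 20.0),
]
peak = storm_intensity(10.0, 5.0, components)
print(f"intensity at first component center: {peak:.2f}")
```

Evaluating this field at each insured building's coordinates, then mapping intensity to damage through a vulnerability curve, gives the fast damage estimate the record describes.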

  1. Shock events and flood risk management: a media analysis of the institutional long-term effects of flood events in the Netherlands and Poland

    Maria Kaufmann

    2016-12-01

    Flood events that have proven to create shock waves in society, which we will call shock events, can open windows of opportunity that allow different actor groups to introduce new ideas. Shock events, however, can also strengthen the status quo. We take flood events as our object of study. Whereas others focus mainly on the immediate impact and disaster management, we focus on the long-term impact on, and resilience of, flood risk governance arrangements. Over the last 25 years, both the Netherlands and Poland have suffered several flood-related events. These triggered strategic and institutional changes, but to different degrees. In a comparative analysis, these endogenous processes, i.e., the importance of the framing of the flood event, its exploitation by different actor groups, and the extent to which arrangements actually change, are examined. In line with previous research, our analysis revealed that shock events test the capacity to resist and bounce back and provide opportunities for adapting and learning. They "open up" institutional arrangements and make them more susceptible to change, increasing the opportunity for adaptation. In this way they can facilitate a shift toward different degrees of resilience, i.e., by adjusting the current strategic approach or by moving toward another strategic approach. The direction of change is influenced by the actors and the frames they introduce, and their ability to increase the resonance of the frame. The persistence of change seems to be influenced by the evolution of the initial management approach, the availability of resources, or the willingness to allocate resources.

  2. Prehospital Interventions During Mass-Casualty Events in Afghanistan: A Case Analysis.

    Schauer, Steven G; April, Michael D; Simon, Erica; Maddry, Joseph K; Carter, Robert; Delorenzo, Robert A

    2017-08-01

    Mass-casualty (MASCAL) events are known to occur in the combat setting. There are very limited data at this time from the Joint Theater (Iraq and Afghanistan) wars specific to MASCAL events. The purpose of this report was to provide preliminary data for the development of prehospital planning and guidelines. Cases were identified using the Department of Defense (DoD; Virginia USA) Trauma Registry (DoDTR) and the Prehospital Trauma Registry (PHTR). These cases were identified as part of a research study evaluating Tactical Combat Casualty Care (TCCC) guidelines. Cases that were designated as, or associated with, denoted MASCAL events were included. Fifty subjects were identified during the course of this project. Explosives were the most common cause of injuries. There was a wide range of vital signs. Tourniquet placement and pressure dressings were the most common interventions, followed by analgesia administration. Oral transmucosal fentanyl citrate (OTFC) was the most common parenteral analgesic drug administered. Most were evacuated as "routine." Follow-up data were available for 36 of the subjects, and 97% were discharged alive. The most common prehospital interventions were tourniquet and pressure dressing hemorrhage control, along with pain medication administration. Larger data sets are needed to guide development of MASCAL in-theater clinical practice guidelines. Schauer SG, April MD, Simon E, Maddry JK, Carter R III, Delorenzo RA. Prehospital interventions during mass-casualty events in Afghanistan: a case analysis. Prehosp Disaster Med. 2017;32(4):465-468.

  3. Psychiatric adverse events during treatment with brodalumab: Analysis of psoriasis clinical trials.

    Lebwohl, Mark G; Papp, Kim A; Marangell, Lauren B; Koo, John; Blauvelt, Andrew; Gooderham, Melinda; Wu, Jashin J; Rastogi, Shipra; Harris, Susan; Pillai, Radhakrishnan; Israel, Robert J

    2018-01-01

    Individuals with psoriasis are at increased risk for psychiatric comorbidities, including suicidal ideation and behavior (SIB). To distinguish between the underlying risk and potential for treatment-induced psychiatric adverse events in patients with psoriasis being treated with brodalumab, a fully human anti-interleukin 17 receptor A monoclonal antibody. Data were evaluated from a placebo-controlled, phase 2 clinical trial; the open-label, long-term extension of the phase 2 clinical trial; and three phase 3, randomized, double-blind, controlled clinical trials (AMAGINE-1, AMAGINE-2, and AMAGINE-3) and their open-label, long-term extensions of patients with moderate-to-severe psoriasis. The analysis included 4464 patients with 9161.8 patient-years of brodalumab exposure. The follow-up time-adjusted incidence rates of SIB events were comparable between the brodalumab and ustekinumab groups throughout the 52-week controlled phases (0.20 vs 0.60 per 100 patient-years). In the brodalumab group, 4 completed suicides were reported, 1 of which was later adjudicated as indeterminate; all patients had underlying psychiatric disorders or stressors. There was no comparator arm past week 52. Controlled study periods were not powered to detect differences in rare events such as suicide. Comparison with controls and the timing of events do not indicate a causal relationship between SIB and brodalumab treatment. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.

  4. Revisiting Slow Slip Events Occurrence in Boso Peninsula, Japan, Combining GPS Data and Repeating Earthquakes Analysis

    Gardonio, B.; Marsan, D.; Socquet, A.; Bouchon, M.; Jara, J.; Sun, Q.; Cotte, N.; Campillo, M.

    2018-02-01

    Slow slip events (SSEs) regularly occur near the Boso Peninsula, central Japan. Their recurrence interval decreased from 6.4 to 2.2 years between 1996 and 2014. It is important to better constrain the slip history of this area, especially as models show that the recurrence intervals could become shorter prior to the occurrence of a large interplate earthquake nearby. We analyze the seismic waveforms of more than 2,900 events (M ≥ 1.0) taking place in the Boso Peninsula, Japan, from 1 April 2004 to 4 November 2015, calculating the correlation and the coherence between each pair of events in order to define groups of repeating earthquakes. The cumulative number of repeating earthquakes suggests the existence of two slow slip events that have escaped detection so far. Small transient displacements observed in the time series of nearby GPS stations confirm these results. The detection scheme coupling repeating-earthquake and GPS analysis allows the detection of small SSEs that were not seen before by classical methods. This work brings new information on the diversity of SSEs and demonstrates that the SSEs in the Boso area present a more complex history than previously considered.
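    Grouping repeating earthquakes by waveform similarity, as in the detection scheme above, rests on the normalized cross-correlation between event pairs. A minimal zero-lag sketch (the synthetic traces and the 0.95 threshold below are illustrative assumptions, not the study's actual criteria):

```python
import math

def normalized_correlation(a, b):
    """Zero-lag normalized cross-correlation between two equal-length traces."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

def are_repeaters(a, b, threshold=0.95):
    # Two events are grouped as repeaters when their waveforms are nearly identical.
    return normalized_correlation(a, b) >= threshold

# Synthetic traces: a damped sinusoid, a scaled copy, and an unrelated signal.
wave1 = [math.sin(0.3 * i) * math.exp(-0.05 * i) for i in range(100)]
wave2 = [1.2 * s for s in wave1]                  # same source, different amplitude
wave3 = [math.sin(0.5 * i) for i in range(100)]   # different event
```

A full implementation would also scan over time lags and, per the record, complement correlation with spectral coherence.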

  5. Sensitivity Analysis of Per-Protocol Time-to-Event Treatment Efficacy in Randomized Clinical Trials

    Gilbert, Peter B.; Shepherd, Bryan E.; Hudgens, Michael G.

    2013-01-01

    Summary. Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis uses the same method employed for the intention-to-treat analysis (e.g., standard survival analysis) applied to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may mislead about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where (a) the primary objective compares survival curves, (b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and (c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials, including the recent RV144 trial, exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144. PMID:24187408

  6. Urbanization and fertility: an event-history analysis of coastal Ghana.

    White, Michael J; Muhidin, Salut; Andrzejewski, Catherine; Tagoe, Eva; Knight, Rodney; Reed, Holly

    2008-11-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relation to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field.
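    A discrete-time event-history analysis of this kind is typically set up by expanding each woman's calendar record into person-period rows, one per year at risk, with time-varying covariates such as urban residence. A minimal sketch of that expansion (the field names and records below are invented for illustration, not the Ghana survey's layout):

```python
def person_period_rows(women):
    """Expand one record per woman into person-period rows suitable for a
    discrete-time hazard (e.g., logistic) model. Field names are illustrative."""
    rows = []
    for w in women:
        for year in range(w["years_observed"]):
            rows.append({
                "id": w["id"],
                "year": year,
                "urban": w["urban_by_year"][year],       # time-varying covariate
                "birth": 1 if year == w["birth_year"] else 0,  # event indicator
            })
    return rows

# Illustrative records: woman 1 moves to town in year 2 and gives birth in year 3.
women = [
    {"id": 1, "years_observed": 4, "urban_by_year": [0, 0, 1, 1], "birth_year": 3},
    {"id": 2, "years_observed": 3, "urban_by_year": [1, 1, 1], "birth_year": None},
]
rows = person_period_rows(women)
```

A logistic regression of `birth` on `urban` and the other covariates over these rows then yields the discrete-time hazard estimates the record reports.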

  7. Development of time dependent safety analysis code for plasma anomaly events in fusion reactors

    Honda, Takuro; Okazaki, Takashi; Bartels, H.W.; Uckan, N.A.; Seki, Yasushi.

    1997-01-01

    A safety analysis code, SAFALY, has been developed to analyze plasma anomaly events in fusion reactors, e.g., a loss of plasma control. The code is a hybrid code comprising zero-dimensional plasma dynamics and a one-dimensional thermal analysis of in-vessel components. The code evaluates the time evolution of plasma parameters and the temperature distributions of in-vessel components. As the plasma-safety interface model, we proposed a robust plasma physics model taking into account updated data for safety assessment. For example, physics safety guidelines for the beta limit, the density limit, the H-L mode confinement transition threshold power, etc., are provided in the model. The in-vessel components are divided into twenty temperature regions in the poloidal direction, taking into account radiative heat transfer between the surfaces of the regions. The code can also describe the coolant behavior under hydraulic accidents, using results from a hydraulics code, and can treat vaporization (sublimation) from plasma-facing components (PFCs). Furthermore, the code includes a model of impurity transport from PFCs using a transport probability and a time delay. Quantitative analysis based on the model is possible for a scenario of plasma passive shutdown. We examined the suitability of the code as a safety analysis code for plasma anomaly events in fusion reactors, and we expect it to contribute to the safety analysis of the International Thermonuclear Experimental Reactor (ITER). (author)

  8. On Event/Time Triggered and Distributed Analysis of a WSN System for Event Detection, Using Fuzzy Logic

    Sofia Maria Dima

    2016-01-01

    Event detection in realistic WSN environments is a critical research domain, and environmental monitoring comprises one of its most pronounced applications. Although efforts related to environmental applications have been presented in the current literature, there is a significant lack of investigation of the performance of such systems when applied in wireless environments. Aiming at addressing this shortage, in this paper an advanced multimodal approach based on fuzzy logic is followed. The proposed fuzzy inference system (FIS) is implemented on TelosB motes and evaluates the probability of fire detection while aiming at power conservation. In addition to a straightforward centralized approach, a distributed implementation of the above FIS is also proposed, aiming at network congestion reduction while optimally distributing the energy consumption among network nodes so as to maximize network lifetime. Moreover, this work proposes an event-based execution of the aforementioned FIS, aiming to further reduce the computational as well as the communication cost compared to a periodic, time-triggered FIS execution. As a final contribution, performance metrics acquired from all the proposed FIS implementation techniques are thoroughly compared and analyzed with respect to critical network conditions, aiming to offer a realistic evaluation and thus the extraction of objective conclusions.
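    A fuzzy inference system for event detection of the kind described above combines membership functions with min/max rule evaluation. A minimal Mamdani-style sketch (the membership bounds and rules below are invented, not the paper's TelosB rule base):

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fire_risk(temp_c, smoke_ppm):
    # Rule strength = min over antecedents; overall risk = max over rules.
    temp_high = tri(temp_c, 40.0, 70.0, 100.0)
    temp_med = tri(temp_c, 20.0, 40.0, 60.0)
    smoke_high = tri(smoke_ppm, 100.0, 300.0, 500.0)
    rule1 = min(temp_high, smoke_high)        # high temp AND high smoke -> fire
    rule2 = 0.5 * min(temp_med, smoke_high)   # medium temp AND high smoke -> possible fire
    return max(rule1, rule2)
```

In the event-based variant described in the record, such an evaluation would run only when a sensed input crosses a membership boundary, rather than on a fixed timer.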

  9. The Key Factors Analysis of Palisades Temperature in Deep Open-pit Mine

    Wang, Yuan; Du, Cuifeng; Jin, Wenbo; Wang, Puyu

    2018-01-01

    In order to study the key factors governing the palisades temperature field of a deep open-pit mine in the natural environment, the influence of natural factors on palisades temperature was analysed based on the principles of heat transfer. Four typical locations with different exposures to solar radiation were selected for the field test. The results show that solar radiation, atmospheric temperature, and wind speed are the three main factors affecting palisades temperature, and that direct sunlight plays the leading role. Because of the blocking effect, the shady slope of the palisades receives direct sunlight only for a short period, so its temperature varies over a smaller range; the sunny slope is exposed to solar radiation for a long time, so its temperature varies over a larger range, following a pattern similar to the air temperature.

  10. Practical security analysis of continuous-variable quantum key distribution with jitter in clock synchronization

    Xie, Cailang; Guo, Ying; Liao, Qin; Zhao, Wei; Huang, Duan; Zhang, Ling; Zeng, Guihua

    2018-03-01

    How to narrow the gap of security between theory and practice has been a notoriously urgent problem in quantum cryptography. Here, we analyze and provide experimental evidence of the clock jitter effect on a practical continuous-variable quantum key distribution (CV-QKD) system. Clock jitter is a random noise that is always present in the clock synchronization of a practical CV-QKD system; it may compromise system security through its impact on data sampling and parameter estimation. In particular, the practical security of CV-QKD with different levels of clock jitter against collective attack is analyzed theoretically for different repetition frequencies; the numerical simulations indicate that clock jitter has more impact in a high-speed scenario. Furthermore, a simplified experiment is designed to investigate the influence of the clock jitter.
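
    The sampling-jitter effect described above can be illustrated with a toy numerical experiment (not the paper's model): sample a sinusoid at instants perturbed by Gaussian timing jitter and measure the excess noise relative to ideal sampling. The frequencies and jitter magnitudes below are arbitrary; the point is only that the noise power grows with both signal frequency and jitter.

```python
import math
import random

def jitter_noise_power(f, fs, n, sigma_t, seed=1):
    # Mean squared error between samples of sin(2*pi*f*t) taken at jittered
    # instants t + N(0, sigma_t) versus the ideal instants t = k/fs.
    rng = random.Random(seed)
    total = 0.0
    for k in range(n):
        t = k / fs
        tj = t + rng.gauss(0.0, sigma_t)
        e = math.sin(2 * math.pi * f * tj) - math.sin(2 * math.pi * f * t)
        total += e * e
    return total / n
```

With the same random seed, increasing sigma_t strictly increases the excess noise, mirroring why higher repetition rates (larger f relative to the jitter) are more affected.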

  11. Particle-scale Analysis of Key Technologies on Cut-and-over Tunnel in Slope Engineering

    J.H. Yang

    2014-08-01

    Full Text Available When a shallow tunnel is constructed on sloping terrain in the mountains, there are potential risks such as landslide induced by cutting the slope, and non-compacted backfill material placed during construction of the cut-and-cover tunnel. In order to address these problems, based on a practical engineering case, optimized construction plans for the cut-and-cover tunnel were analyzed with the particle flow code (PFC), the key parts of the open-cut construction were identified, and anti-slide pile countermeasures were proposed. Furthermore, the grouting reinforcement process for the non-compacted backfill material around the shallow tunnel was simulated with PFC, and the variation characteristics of the porosity and grouting pressure were revealed. The results are of great value to similar engineering projects.

  12. Characterization of a Flood Event through a Sediment Analysis: The Tescio River Case Study

    Silvia Di Francesco

    2016-07-01

    Full Text Available This paper presents the hydrological analysis and grain size characteristics of fluvial sediments in a river basin and their combination to characterize a flood event. The overall objective of the research is the development of a practical methodology based on experimental surveys to reconstruct the hydraulic history of ungauged river reaches on the basis of the modifications detected on the riverbed during the dry season. The grain size analysis of fluvial deposits usually requires great technical and economical efforts and traditional sieving based on physical sampling is not appropriate to adequately represent the spatial distribution of sediments in a wide area of a riverbed with a reasonable number of samples. The use of photographic sampling techniques, on the other hand, allows for the quick and effective determination of the grain size distribution, through the use of a digital camera and specific graphical algorithms in large river stretches. A photographic sampling is employed to characterize the riverbed in a 3 km ungauged reach of the Tescio River, a tributary of the Chiascio River, located in central Italy, representative of many rivers in the same geographical area. To this end, the particle size distribution is reconstructed through the analysis of digital pictures of the sediments taken on the riverbed in dry conditions. The sampling has been performed after a flood event of known duration, which allows for the identification of the removal of the armor in one section along the river reach under investigation. The volume and composition of the eroded sediments made it possible to calculate the average flow rate associated with the flood event which caused the erosion, by means of the sediment transport laws and the hydrological analysis of the river basin. A hydraulic analysis of the river stretch under investigation was employed to verify the validity of the proposed procedure.

  13. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive surveillance or intensified passive surveillance from 2007 to 2012, a descriptive analysis was performed, and national and regional rates of YFV-associated neurotropic and neurological autoimmune disease, together with reporting rate ratios and their respective 95% confidence intervals, were calculated for first-time vaccinees stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND yet analyzed. Rates are similar to other recent studies, but in this study the age group from 5 to 9 years had the highest risk. Since neurological adverse events generally have a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of the risk of infection by yellow fever virus. Copyright © 2014 Elsevier Ltd. All rights reserved.
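
    Reporting rates of the kind quoted above (events per 100,000 doses) can be reproduced with a short helper. The 95% interval here uses a simple normal approximation to the Poisson count, which is an assumption of this sketch, not necessarily the study's method.

```python
import math

def rate_per_100k(events, doses):
    # Point estimate and approximate 95% CI for an adverse event rate
    # per 100,000 doses, treating the event count as Poisson.
    rate = events / doses * 1e5
    se = math.sqrt(events) / doses * 1e5  # SE of the count, rescaled
    return rate, max(rate - 1.96 * se, 0.0), rate + 1.96 * se
```

For small counts an exact Poisson interval would be preferable; the normal approximation is shown only for brevity.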

  14. Analysis of Loss-of-Offsite-Power Events 1997-2015

    Johnson, Nancy Ellen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.

  15. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica, for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (e.g., VERY HIGH, HIGH, MODERATE) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
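
    As a generic illustration of the event tree arithmetic (not the paper's actual trees or probabilities), an end-branch probability is the product of the conditional branch probabilities along its path, and the raw risk estimate can then be mapped onto a linguistic scale. The class boundaries below are hypothetical.

```python
def path_probability(branches):
    # Probability of one end branch: product of the conditional
    # probabilities at each node along the path.
    p = 1.0
    for prob in branches:
        p *= prob
    return p

def risk_class(p):
    # Hypothetical conversion of annual individual risk to linguistic classes
    if p >= 1e-3:
        return "VERY HIGH"
    if p >= 1e-4:
        return "HIGH"
    if p >= 1e-5:
        return "MODERATE"
    if p >= 1e-6:
        return "LOW"
    return "VERY LOW"
```

For example, an eruption probability of 0.1, a collapse-given-eruption probability of 0.5, and a fatality-given-flow probability of 0.02 combine to an individual risk of 10^-3 per year.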

  16. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that their further development is necessary to continue improving society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outdoor parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. To segment moving objects, only regions where movement is occurring are analyzed, ignoring the influence of pixels from motionless regions. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
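
    A minimal sketch of dynamic background subtraction in the spirit described above (not the authors' implementation): an exponential running average lets the background model adapt to gradual illumination changes, and pixels deviating from it by more than a threshold are flagged as moving. Frames are flattened to 1-D lists here for brevity; the learning rate and threshold are hypothetical.

```python
def update_background(bg, frame, alpha=0.05):
    # Exponential running average: the background slowly tracks the scene,
    # absorbing gradual illumination change without absorbing fast motion.
    return [(1 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def moving_mask(bg, frame, thresh=25):
    # Pixels whose deviation from the background exceeds the threshold
    # are candidate foreground (moving-object) pixels.
    return [abs(f - b) > thresh for b, f in zip(bg, frame)]
```

In a full system, the mask would then be restricted to regions confirmed by the sparse-frame flow detection before segmentation and tracking.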

  17. Analysis of mutual events of Galilean satellites observed from VBO during 2014-2015

    Vasundhara, R.; Selvakumar, G.; Anbazhagan, P.

    2017-06-01

    Results of analysis of 23 events of the 2014-2015 mutual event series from the Vainu Bappu Observatory are presented. Our intensity distribution model for the eclipsed/occulted satellite is based on the criterion that it simulates a rotational light curve that matches the ground-based light curve. Dichotomy in the scattering characteristics of the leading and trailing sides explains the basic shape of the rotational light curves of Europa, Ganymede and Callisto. In the case of Io, the albedo map (courtesy United States Geological Survey) along with global values of scattering parameters works well. Mean values of residuals in (O - C) along and perpendicular to the track are found to be -3.3 and -3.4 mas, respectively, compared to 'L2' theory for the seven 2E1/2O1 events. The corresponding rms values are 8.7 and 7.8 mas, respectively. For the five 1E3/1O3 events, the along and perpendicular to the track mean residuals are 5.6 and 3.2 mas, respectively. The corresponding rms residuals are 6.8 and 10.5 mas, respectively. We compare the results using the chosen model (Model 1) with a uniform but limb-darkened disc (Model 2). The residuals with Model 2 of the 2E1/2O1 and 1E3/1O3 events indicate a bias along the satellite track. The extent and direction of bias are consistent with the shift of the light centre from the geometric centre. Results using Model 1, which intrinsically takes into account the intensity distribution, show no such bias.

  18. Assessing molecular initiating events (MIEs), key events (KEs) and modulating factors (MFs) for styrene responses in mouse lungs using whole genome gene expression profiling following 1-day and multi-week exposures.

    Andersen, Melvin E; Cruzan, George; Black, Michael B; Pendse, Salil N; Dodd, Darol; Bus, James S; Sarang, Satinder S; Banton, Marcy I; Waites, Robbie; McMullen, Patrick D

    2017-11-15

    Styrene increased lung tumors in mice at chronic inhalation exposures of 20 ppm and greater. MIEs, KEs and MFs were examined using gene expression in three strains of male mice (the parental C57BL/6 strain, a CYP2F2(-/-) knock out and a CYP2F2(-/-) transgenic containing human CYP2F1, 2A13 and 2B6). Exposures were for 1 day and 1, 4 and 26 weeks. After 1-day exposures at 1, 5, 10, 20, 40 and 120 ppm, significant increases in differentially expressed genes (DEGs) occurred only in parental strain lungs, where there was already an increase in DEGs at 5 ppm and then many thousands of DEGs by 120 ppm. Enrichment for 1-day and 1-week exposures included cell cycle, mitotic M-M/G1 phases, DNA-synthesis and metabolism of lipids and lipoproteins pathways. The numbers of DEGs decreased steadily over time with no DEGs meeting both statistical significance and fold-change criteria at 26 weeks. At 4 and 26 weeks, some key transcription factors (TFs) - Nr1d1, Nr1d2, Dbp, Tef, Hlf, Per3, Per2 and Bhlhe40 - were upregulated (|FC|>1.5), while others - Npas, Arntl, Nfil3, Nr4a1, Nr4a2, and Nr4a3 - were down-regulated. At all times, consistent changes in gene expression only occurred in the parental strain. Our results support a MIE for styrene of direct mitogenicity from mouse-specific CYP2F2-mediated metabolites activating Nr4a signaling. Longer-term MFs include down-regulation of Nr4a genes and shifts in both circadian clock TFs and other TFs, linking the circadian clock to cellular metabolism. We found no gene expression changes indicative of cytotoxicity or activation of p53-mediated DNA-damage pathways. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Corrective interpersonal experience in psychodrama group therapy: a comprehensive process analysis of significant therapeutic events.

    McVea, Charmaine S; Gow, Kathryn; Lowe, Roger

    2011-07-01

    This study investigated the process of resolving painful emotional experience during psychodrama group therapy, by examining significant therapeutic events within seven psychodrama enactments. A comprehensive process analysis of four resolved and three not-resolved cases identified five meta-processes which were linked to in-session resolution. One was a readiness to engage in the therapeutic process, which was influenced by client characteristics and the client's experience of the group; and four were therapeutic events: (1) re-experiencing with insight; (2) activating resourcefulness; (3) social atom repair with emotional release; and (4) integration. A corrective interpersonal experience (social atom repair) healed the sense of fragmentation and interpersonal disconnection associated with unresolved emotional pain, and emotional release was therapeutically helpful when located within the enactment of this new role relationship. Protagonists who experienced resolution reported important improvements in interpersonal functioning and sense of self which they attributed to this experience.

  20. Exploitation of a component event data bank for common cause failure analysis

    Games, A.M.; Amendola, A.; Martin, P.

    1985-01-01

    Investigations into using the European Reliability Data System Component Event Data Bank for common cause failure analysis have been carried out. Starting from early exercises where data were analyzed without computer aid, different types of linked multiple failures have been identified. A classification system is proposed based on this experience. It defines a multiple failure event space wherein each category defines causal, modal, temporal and structural links between failures. It is shown that a search algorithm which incorporates the specific interrogative procedures of the data bank can be developed in conjunction with this classification system. It is concluded that the classification scheme and the search algorithm are useful organizational tools in the field of common cause failure studies. However, it is also suggested that the use of the term common cause failure should be avoided, since it embodies too many different types of linked multiple failures.

  1. Turning a Private Story into a Public Event. Frame Analysis of Scandals in Television Performance

    Olga Galanova

    2012-07-01

    Full Text Available It does not suffice to treat scandals only as supra-individual discourses on the macro level of social communication. Rather, we have to develop concrete methodical principles for describing the practice of doing scandal in particular media. In this paper we look at these practices from a micro-sociological perspective and analyze how, and through which concrete actions, an event is staged as a scandal. Practices of scandal build a special frame of media communication which allows television producers to solve certain "communicative problems." Based on a detailed analysis of a video recording of a television show, we exemplify how a private case turns into a public event by means of scandal-framing. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120398

  2. Analysis of Data from a Series of Events by a Geometric Process Model

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

    Geometric process was first introduced by Lam [10, 11]. A stochastic process {Xi, i = 1, 2, ...} is called a geometric process (GP) if, for some a > 0, {a^(i-1) Xi, i = 1, 2, ...} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best of these four models for analyzing data from a series of events.
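
    The defining property above suggests a simple estimator for the ratio a (one of several possible nonparametric approaches, not necessarily the one in the paper): if {a^(i-1) Xi} is a renewal process, then E[log Xi] is roughly linear in i with slope -log a, so an ordinary least-squares fit of log Xi against i recovers a.

```python
import math

def estimate_gp_ratio(xs):
    # Least-squares slope of log(X_i) against i; for a geometric process
    # with ratio a, the slope is approximately -log(a), so a = exp(-slope).
    n = len(xs)
    ys = [math.log(x) for x in xs]
    ibar = (n + 1) / 2  # mean of the indices 1..n
    ybar = sum(ys) / n
    num = sum((i - ibar) * (y - ybar) for i, y in zip(range(1, n + 1), ys))
    den = sum((i - ibar) ** 2 for i in range(1, n + 1))
    return math.exp(-num / den)
```

A ratio a > 1 corresponds to stochastically decreasing inter-event times (deterioration), a < 1 to improvement, and a = 1 reduces the GP to an ordinary renewal process.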

  3. Benchmark analysis of three main circulation pump sequential trip event at Ignalina NPP

    Uspuras, E.; Kaliatka, A.; Urbonas, R.

    2001-01-01

    The Ignalina Nuclear Power Plant is a twin-unit plant with two RBMK-1500 reactors. The primary circuit consists of two symmetrical loops. Eight Main Circulation Pumps (MCPs) at the Ignalina NPP are employed for forced circulation of coolant water through the reactor core. The MCPs are arranged in groups of four pumps per loop (three for normal operation and one on standby). This paper presents the benchmark analysis of a sequential trip of three main circulation pumps at RBMK-1500 using the RELAP5 code. During this event, all three operating MCPs in one circulation loop at Ignalina NPP Unit 2 tripped one after another because of inadvertent activation of the fire protection system. The comparison of calculated and measured parameters allowed us to establish realistic thermal-hydraulic characteristics of different main circulation circuit components and to verify the model of the drum separator pressure and water level controllers. (author)

  4. A sequential threshold cure model for genetic analysis of time-to-event data

    Ødegård, J; Madsen, Per; Labouriau, Rodrigo S.

    2011-01-01

    In analysis of time-to-event data, classical survival models ignore the presence of potential nonsusceptible (cured) individuals, which, if present, will invalidate the inference procedures. Existence of nonsusceptible individuals is particularly relevant under challenge testing with specific pathogens, which is a common procedure in aquaculture breeding schemes. A cure model is a survival model accounting for a fraction of nonsusceptible individuals in the population. This study proposes a mixed cure model for time-to-event data, measured as sequential binary records. In a simulation study, survival data were generated through 2 underlying traits: susceptibility and endurance (risk of dying per time-unit), associated with 2 sets of underlying liabilities. Despite considerable phenotypic confounding, the proposed model was largely able to distinguish the 2 traits. Furthermore, if selection...

  5. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.
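
    The core extraction step mentioned above, averaging time-locked epochs after baseline correction, can be sketched as follows (a minimal illustration of one step, not the tutorial's full signal-processing pipeline):

```python
def average_erp(eeg, events, pre, post):
    # Cut a window [t - pre, t + post) around each event index, subtract the
    # mean of the pre-stimulus samples (baseline correction), then average
    # the epochs sample-by-sample so non-time-locked activity cancels out.
    epochs = []
    for t in events:
        seg = eeg[t - pre: t + post]
        base = sum(seg[:pre]) / pre
        epochs.append([s - base for s in seg])
    return [sum(col) / len(epochs) for col in zip(*epochs)]
```

Real pipelines add filtering, artifact rejection, and re-referencing before this averaging step; those are omitted here for brevity.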

  6. Analysis of a potential meteorite-dropping event over the south of Spain in 2007

    Madiedo, J. M.; Trigo-Rodríguez, J. M.

    2008-09-01

    the case of Puerto Lápice, there are no pictures or videos of the June 29, 2007 bolide, and only some images of the distorted train taken several minutes later are available. A fourth potential meteorite-dropping bolide was directly recorded by SPMN video cameras on March 25, 2007. We were lucky enough to have this event near the zenith of two SPMN stations, exhibiting all its magnificence (Fig. 2). We focus here on the preliminary analysis of this event, which was observed over an

  7. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), in which detailed data on the appliances used in a house are obtained by analyzing changes in voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point at which a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
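
    The edge-detection idea, finding step changes in the aggregate power signal that mark turn-on and turn-off transients, can be sketched as follows. This is a simplified illustration; the paper's transient detection algorithm is more elaborate, and the threshold here is hypothetical.

```python
def detect_edges(power, thresh=50):
    # Scan successive power readings (watts) and report (index, delta)
    # wherever the step change exceeds the threshold: a positive delta
    # suggests an appliance turning on, a negative delta one turning off.
    events = []
    for i in range(1, len(power)):
        d = power[i] - power[i - 1]
        if abs(d) >= thresh:
            events.append((i, d))
    return events
```

Each detected transient window would then be passed to the feature-extraction and SVM stages for appliance classification.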

  8. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  9. Consensus building for interlaboratory studies, key comparisons, and meta-analysis

    Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza

    2017-06-01

    Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian
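
    The DerSimonian-Laird procedure named at the end of the abstract can be sketched directly: a method-of-moments estimate of the between-laboratory variance (the "dark uncertainty") is added to each laboratory's stated variance before inverse-variance weighting. Variable names below are illustrative.

```python
def dersimonian_laird(x, u):
    # x: measured values from k laboratories; u: their stated standard
    # uncertainties. Returns (consensus, its standard uncertainty, tau2),
    # where tau2 is the moment estimate of between-lab variance.
    k = len(x)
    w = [1.0 / ui ** 2 for ui in u]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    q = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Re-weight with the inflated variances u_i^2 + tau2
    wstar = [1.0 / (ui ** 2 + tau2) for ui in u]
    mu = sum(wi * xi for wi, xi in zip(wstar, x)) / sum(wstar)
    se = (1.0 / sum(wstar)) ** 0.5
    return mu, se, tau2
```

When the measured values are consistent with their stated uncertainties, tau2 collapses to zero and the procedure reduces to plain inverse-variance weighting; over-dispersed data yield tau2 > 0 and a wider consensus uncertainty.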

  10. Quantitative transcription dynamic analysis reveals candidate genes and key regulators for ethanol tolerance in Saccharomyces cerevisiae

    Ma Menggen

    2010-06-01

    Full Text Available Abstract Background Derived from our lignocellulosic conversion inhibitor-tolerant yeast, we generated an ethanol-tolerant strain Saccharomyces cerevisiae NRRL Y-50316 by enforced evolutionary adaptation. Using a newly developed robust mRNA reference and a master equation unifying gene expression data analyses, we investigated comparative quantitative transcription dynamics of 175 genes selected from previous studies for an ethanol-tolerant yeast and its closely related parental strain. Results A highly fitted master equation was established and applied for quantitative gene expression analyses using pathway-based qRT-PCR array assays. The ethanol-tolerant Y-50316 displayed significantly enriched background of mRNA abundance for at least 35 genes without ethanol challenge compared with its parental strain Y-50049. Under the ethanol challenge, the tolerant Y-50316 responded in consistent expressions over time for numerous genes belonging to groups of heat shock proteins, trehalose metabolism, glycolysis, pentose phosphate pathway, fatty acid metabolism, amino acid biosynthesis, pleiotropic drug resistance gene family and transcription factors. The parental strain showed repressed expressions for many genes and was unable to withstand the ethanol stress and establish a viable culture and fermentation. The distinct expression dynamics between the two strains and their close association with cell growth, viability and ethanol fermentation profiles distinguished the tolerance-response from the stress-response in yeast under the ethanol challenge. At least 82 genes were identified as candidate and key genes for ethanol-tolerance and subsequent fermentation under the stress. Among which, 36 genes were newly recognized by the present study. Most of the ethanol-tolerance candidate genes were found to share protein binding motifs of transcription factors Msn4p/Msn2p, Yap1p, Hsf1p and Pdr1p/Pdr3p. 
Conclusion Enriched background of transcription abundance

  11. Simplified containment event tree analysis for the Sequoyah Ice Condenser containment

    Galyean, W.J.; Schroeder, J.A.; Pafford, D.J.

    1990-12-01

    An evaluation of a Pressurized Water Reactor (PWR) ice condenser containment was performed. In this evaluation, simplified containment event trees (SCETs) were developed that utilized the vast storehouse of information generated by the NRC's Draft NUREG-1150 effort. Specifically, the computer programs and data files produced by the NUREG-1150 analysis of Sequoyah were used to electronically generate SCETs, as opposed to the NUREG-1150 accident progression event trees (APETs). This simplification was performed to allow graphic depiction of the SCETs in typical event tree format, which facilitates their understanding and use. SCETs were developed for five of the seven plant damage state groups (PDSGs) identified by the NUREG-1150 analyses: both short- and long-term station blackout sequences (SBOs), transients, loss-of-coolant accidents (LOCAs), and anticipated transient without scram (ATWS). Steam generator tube rupture (SGTR) and event-V PDSGs were not analyzed because of their containment-bypass nature. After being benchmarked against the APETs, in terms of containment failure mode and risk, the SCETs were used to evaluate a number of potential containment modifications. The modifications were examined for their potential to mitigate or prevent containment failure from hydrogen burns or direct impingement on the containment by the core (both factors identified as significant contributors to risk in the NUREG-1150 Sequoyah analysis). However, because of the relatively low baseline risk postulated for Sequoyah (i.e., 12 person-rems per reactor year), none of the potential modifications appear to be cost effective. 15 refs., 10 figs., 17 tabs.

  12. Is English the key to access the wonders of the modern world? A Critical Discourse Analysis

    Carmen Helena Guerrero

    2010-01-01

    Full Text Available The spread of English in the world today is not only the result of colonizing campaigns (Canagarajah, 1999, 2005; Pennycook, 1994a, 1998a, 2000; Phillipson, 1992, 2000) but also of the compliance of the governments associated with the "expanding circle" (Kachru, 1986). Colombia is a good example of this phenomenon, because its national government is implementing a National Bilingualism Project (PNB) in which there is an explicit interest in the promotion of English over all other languages spoken in the country. This article is a critical discourse analysis of the handbook that sets the standards for competences in English. The analysis of data follows Fairclough's textual analysis and shows that the authors of the handbook perpetuate mainstream concepts about the symbolic power of English as the one and only necessary tool for academic and economic success.

  13. Inhibitions of mTORC1 and 4EBP-1 are key events orchestrated by Rottlerin in SK-Mel-28 cell killing.

    Daveri, E; Maellaro, E; Valacchi, G; Ietta, F; Muscettola, M; Maioli, E

    2016-09-28

    Earlier studies demonstrated that Rottlerin exerts a time- and dose-dependent antiproliferative effect on SK-Mel-28 melanoma cells during 24 h of treatment, but cytotoxicity due to cell death began only after a 48 h exposure. In the current study, in order to identify the type of cell death in this cell line, which is notoriously refractory to most anticancer therapies, and to clarify the underlying mechanisms of this delayed outcome, we searched for apoptotic, necrotic/necroptotic and autophagic traits in Rottlerin-exposed cells. Although SK-Mel-28 cells are both apoptosis and autophagy competent, Western blotting analysis, caspase activity assay, nuclear imaging and the effects of autophagy, apoptosis and necroptosis inhibitors indicated that Rottlerin cytotoxicity was due to none of the aforementioned death mechanisms. Nevertheless, in growth-arrested cells, death did occur after prolonged treatment and most likely ensued from the observed blockage of protein synthesis, which reached levels expected to be incompatible with cell survival. From a mechanistic point of view, we ascribed this effect to the documented inhibition of mTORC1 activity; mTORC1 inhibition on the one hand led to a non-lethal, indeed protective, autophagic response but, on the other hand, caused a near complete arrest of protein synthesis. Interestingly, no cytotoxicity was found towards normal skin fibroblasts, which were only mildly growth arrested by the drug. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Statistical Analysis of Solar Events Associated with SSC over Year of Solar Maximum during Cycle 23: 1. Identification of Related Sun-Earth Events

    Grison, B.; Bocchialini, K.; Menvielle, M.; Chambodut, A.; Cornilleau-Wehrlin, N.; Fontaine, D.; Marchaudon, A.; Pick, M.; Pitout, F.; Schmieder, B.; Regnier, S.; Zouganelis, Y.

    2017-12-01

    Taking the 32 sudden storm commencements (SSC) listed by the Observatori de l'Ebre / ISGI over the year 2002 (maximal solar activity) as a starting point, we performed a statistical analysis of the related solar sources, solar wind signatures, and terrestrial responses. For each event, we characterized and identified, as far as possible, (i) the sources on the Sun (Coronal Mass Ejections -CME-), with the help of a series of criteria detailed hereafter (velocities, drag coefficient, radio waves, polarity), as well as (ii) the structure and properties in the interplanetary medium, at L1, of the event associated with the SSC: magnetic clouds -MC-, non-MC interplanetary coronal mass ejections -ICME-, co-rotating/stream interaction regions -SIR/CIR-, shocks only, and unclear events that we call "miscellaneous" events. The categorization of the events at L1 is based on published catalogues. For each potential CME/L1 event association we compare the velocity observed at L1 with the one observed at the Sun and the estimated ballistic velocity. Observations of radio emissions (Type II, Type IV detected from the ground and/or by WIND) associated with the CMEs make the solar source more probable. We also compare the polarity of the magnetic clouds with the hemisphere of the solar source. The drag coefficient (estimated with the drag-based model) is calculated for each potential association and compared to the expected range of values. We identified a solar source for 26 SSC-related events. 12 of these 26 associations match all criteria. We finally discuss the difficulty of performing such associations.

  15. Analysis of landslide overgrowing rates at Vaskiny Dachi key site, Central Yamal, Russia

    Khomutov, A.

    2009-04-01

    An estimation of overgrowing of landslide-affected slopes by vegetation at three main landslide elements: shear surface, landslide body and "frontal zone" at Vaskiny Dachi key site is presented. Vaskiny Dachi key site is located in the watershed of Se-Yakha and Mordy-Yakha rivers on Central Yamal, Russia. The area is represented by highly-dissected alluvial-lacustrine-marine plains and terraces. The closest to Vaskiny Dachi climate station is Marresale, about 90 km southwest of Vaskiny Dachi, at the Kara sea coast. The weather here is probably somewhat cooler than at Vaskiny Dachi. The average annual (summer) air temperature at Marresale is -8.3° C (4.3° C) ("Russia's Weather" Server). To estimate vegetation cover dynamics on cryogenic landslides at "Vaskiny Dachi", data published by O.Rebristaya and others (1995) were used. Their observations were done in 1991-1993, and were supplemented by further field observations (Leibman et al., 2000, Khomutov & Leibman 2007) and by field and remote sensing observations in 2008. An estimation of vegetation cover dynamics on cryogenic landslides at "Vaskiny Dachi" leads to the following results. Immediately after landsliding in 1989, landslide shear surface was bare without any vegetation, landslide body had initial vegetation, and "frontal zone" was under liquefied sediment masses. "Frontal zone" formed in front of a landslide body, appears as a result of damming of drainage routes by a landslide body with flooding of the shear surface "upstream" of the landslide body, formation of a sedge-cottongrass meadow there, and swamping downstream (Khomutov & Leibman 2007). By 1993, landslide shear surface got overgrown by species subordinate in surrounding initial landscapes (Alopecurus alpinus, Festuca ovina, Calamagrostis neglecta, Poa alpigena ssp. Alpigena, etc.). 
The landslide body was covered by initial communities in a depressed state: the vitality of Salix polaris and Vaccinium vitis-idaea was reduced, moss cover had died off, and overgrown

  16. Developmental finite element analysis of cichlid pharyngeal jaws: Quantifying the generation of a key innovation

    Müller, Gerd B.

    2018-01-01

    Advances in imaging and modeling facilitate the calculation of biomechanical forces in biological specimens. These factors play a significant role during ontogenetic development of cichlid pharyngeal jaws, a key innovation responsible for one of the most prolific species diversifications in recent times. MicroCT imaging of radiopaque-stained vertebrate embryos was used to accurately capture the spatial relationships of the pharyngeal jaw apparatus in two cichlid species (Haplochromis elegans and Amatitlania nigrofasciata) for the purpose of creating a time series of developmental stages using finite element models, which can be used to assess the effects of biomechanical forces present in a system at multiple points of its ontogeny. Changes in muscle vector orientations, bite forces, force on the neurocranium where cartilage originates, and stress on upper pharyngeal jaws are analyzed in a comparative context. In addition, microCT scanning revealed the presence of previously unreported cement glands in A. nigrofasciata. The data obtained provide an underrepresented dimension of information on physical forces present in developmental processes and assist in interpreting the role of developmental dynamics in evolution. PMID:29320528

  17. Developmental finite element analysis of cichlid pharyngeal jaws: Quantifying the generation of a key innovation.

    Tim Peterson

    Full Text Available Advances in imaging and modeling facilitate the calculation of biomechanical forces in biological specimens. These factors play a significant role during ontogenetic development of cichlid pharyngeal jaws, a key innovation responsible for one of the most prolific species diversifications in recent times. MicroCT imaging of radiopaque-stained vertebrate embryos was used to accurately capture the spatial relationships of the pharyngeal jaw apparatus in two cichlid species (Haplochromis elegans and Amatitlania nigrofasciata) for the purpose of creating a time series of developmental stages using finite element models, which can be used to assess the effects of biomechanical forces present in a system at multiple points of its ontogeny. Changes in muscle vector orientations, bite forces, force on the neurocranium where cartilage originates, and stress on upper pharyngeal jaws are analyzed in a comparative context. In addition, microCT scanning revealed the presence of previously unreported cement glands in A. nigrofasciata. The data obtained provide an underrepresented dimension of information on physical forces present in developmental processes and assist in interpreting the role of developmental dynamics in evolution.

  18. Implementation and Analysis Audio Steganography Used Parity Coding for Symmetric Cryptography Key Delivery

    Afany Zeinata Firdaus

    2013-12-01

    Full Text Available In today's era of communication, online data transactions are increasing, and ever more information is accessible for both upload and download, so a capable security system is required. Blowfish cryptography combined with audio steganography is one way to secure data so that it cannot be accessed by unauthorized parties. In this study, an audio steganography technique is implemented using the parity coding method, which is used to deliver the Blowfish cryptographic key in an Android-based e-commerce application. The results show that the average computation time of the insertion (embedding) stage of the secret message is shorter than the average computation time of the extraction stage. The tests also show that the more characters are embedded, the greater the received noise: the highest SNR, 11.9905 dB, was obtained when 506 characters were inserted, while the lowest SNR, 5.6897 dB, was obtained when 2006 characters were inserted. Keywords: audio steganography, parity coding, embedding, extracting, Blowfish cryptography.
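Parity coding, as used in the study above, hides one message bit per group of cover samples: if the XOR parity of the group's least-significant bits already equals the bit, nothing changes; otherwise a single LSB in the group is flipped. A minimal Python sketch under that scheme (the sample values and group size are illustrative, not the study's audio data):

```python
def embed_parity(samples, bits, group_size=4):
    """Hide one message bit per group of cover samples by forcing the
    group's LSB parity to equal the bit (flip one LSB if it does not)."""
    assert len(bits) * group_size <= len(samples)
    out = list(samples)
    for i, bit in enumerate(bits):
        group = out[i * group_size:(i + 1) * group_size]
        parity = sum(s & 1 for s in group) % 2
        if parity != bit:
            out[i * group_size] ^= 1  # one-LSB change: nearly inaudible
    return out

def extract_parity(samples, n_bits, group_size=4):
    """Recover the hidden bits by re-reading each group's LSB parity."""
    return [sum(s & 1 for s in samples[i * group_size:(i + 1) * group_size]) % 2
            for i in range(n_bits)]

cover = [100, 57, 200, 33, 14, 91, 8, 250]   # toy 8-bit "audio" samples
stego = embed_parity(cover, [1, 0])
```

Because at most one sample per group changes by one quantization step, the distortion (and hence SNR loss) grows with the number of embedded bits, consistent with the SNR trend the abstract reports.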

  19. Aroma extraction dilution analysis of Sauternes wines. Key role of polyfunctional thiols.

    Bailly, Sabine; Jerkovic, Vesna; Marchand-Brynaert, Jacqueline; Collin, Sonia

    2006-09-20

    The aim of the present work was to investigate Sauternes wine aromas. In all wine extracts, polyfunctional thiols were revealed to have a huge impact. A very strong bacon-petroleum odor emerged at RI = 845 from a CP-Sil5-CB column. Two thiols proved to participate in this perception: 3-methyl-3-sulfanylbutanal and 2-methylfuran-3-thiol. A strong synergetic effect was evidenced between the two compounds. The former, never mentioned before in wines, and not found in the musts of this study, is most probably synthesized during fermentation. 3-Methylbut-2-ene-1-thiol, 3-sulfanylpropyl acetate, 3-sulfanylhexan-1-ol, and 3-sulfanylheptanal also contribute to the global aromas of Sauternes wines. Among other key odorants, the presence of a varietal aroma (alpha-terpineol), sotolon, fermentation alcohols (3-methylbutan-1-ol and 2-phenylethanol) and esters (ethyl butyrate, ethyl hexanoate, and ethyl isovalerate), carbonyls (trans-non-2-enal and beta-damascenone), and wood flavors (guaiacol, vanillin, eugenol, beta-methyl-gamma-octalactone, and Furaneol) is worth stressing.

  20. Safety culture: analysis of the causal relationships between its key dimensions.

    Fernández-Muñiz, Beatriz; Montes-Peón, José Manuel; Vázquez-Ordás, Camilo José

    2007-01-01

    Several fields are showing increasing interest in safety culture as a means of reducing accidents in the workplace. The literature shows that safety culture is a multidimensional concept. However, considerable confusion surrounds this concept, about which little consensus has been reached. This study proposes a model for a positive safety culture and tests this on a sample of 455 Spanish companies, using the structural equation modeling statistical technique. Results show the important role of managers in the promotion of employees' safe behavior, both directly, through their attitudes and behaviors, and indirectly, by developing a safety management system. This paper identifies the key dimensions of safety culture. In addition, a measurement scale for the safety management system is validated. This will assist organizations in defining areas where they need to progress if they wish to improve their safety. Also, we stress that managers need to be wholly committed to and personally involved in safety activities, thereby conveying the importance the firm attaches to these issues.

  1. Transcriptomic analysis reveals key genes related to betalain biosynthesis in pulp coloration of Hylocereus polyrhizus

    Hua eQingzhu

    2016-01-01

    Full Text Available Betalains have high nutritional value and bioactivities. Red pulp pitaya (Hylocereus polyrhizus) is the only fruit containing abundant betalains available to consumers. However, no information is available about genes involved in betalain biosynthesis in H. polyrhizus. Herein, two cDNA libraries of pitaya pulps at two different coloration stages (white and red pulp stages) of Guanhuahong (H. polyrhizus) were constructed. A total of about 12 Gb of raw RNA-Seq data was generated and de novo assembled into 122,677 transcripts with an average length of 1,183 bp and an N50 value of 2,008 bp. Approximately 99.99% of all transcripts were annotated based on seven public databases. A total of 8,871 transcripts were significantly regulated. Thirty-three candidate transcripts related to betalain biosynthesis were obtained from the transcriptome data. Transcripts encoding enzymes involved in betalain biosynthesis were analyzed using RT-qPCR across the pulp coloration stages of H. polyrhizus (7-1) and H. undatus (132-4). Nine key transcripts of betalain biosynthesis were identified. They were assigned to four kinds of genes in the betalain biosynthetic pathway, including tyrosinase, 4,5-DOPA dioxygenase extradiol, cytochrome P450 and glucosyltransferase. Ultimately, a preliminary betalain biosynthetic pathway for pitaya was proposed based on betalain analyses and gene expression profiles.
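The N50 statistic quoted for the assembly is the contig length at which the cumulative length of contigs, taken longest first, first reaches half of the total assembled bases. A small Python sketch with toy contig lengths (not the actual pitaya transcript set):

```python
def n50(lengths):
    """Length L such that contigs of length >= L contain at least half
    of the total assembled bases (contigs taken longest first)."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if 2 * running >= total:
            return length

# Toy contig lengths in bp, for illustration only.
value = n50([2, 2, 2, 3, 3, 5, 7])
```

Unlike the mean, N50 is weighted by bases rather than by contigs, which is why it is the standard summary of assembly contiguity.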

  2. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state-of-the-practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods were built upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena

  3. Embarrassment as a key to understanding cultural differences. Basic principles of cultural analysis

    Bouchet, Dominique

    1995-01-01

    I introduce here the principles I use in my investigation of intercultural marketing and management. I explain how I discovered them, and show how they spring from a theoretical understanding of the dynamic of cultural differences. One of the basic methodological principles for my analysis...

  4. Keys of Japanese Prosody and Didactical-Technical Analysis of OJAD (Online Japanese Accent Dictionary)

    Delgado Algarra, Emilio José

    2016-01-01

    Most of the studies focusing on the teaching of foreign languages indicate that little attention is paid to prosodic features in both didactic materials and teaching-learning processes (Martinsen, Avord and Tanner, 2014). In this context and throughout this article, an analysis of the didactical and technical dimensions of OJAD (Japanese Accent…

  5. Establishing a Common Vocabulary of Key Concepts for the Effective Implementation of Applied Behavior Analysis

    Cihon, Traci M.; Cihon, Joseph H.; Bedient, Guy M.

    2016-01-01

    The technical language of behavior analysis is arguably necessary to share ideas and research with precision among each other. However, it can hinder effective implementation of behavior analytic techniques when it prevents clear communication between the supervising behavior analyst and behavior technicians. The present paper provides a case…

  6. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the risk of haze days due to biomass burning in Southeast Asia on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, based on PM10 concentration, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths were recorded for natural mortality, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (Odds Ratio (OR) = 1.41; 95% Confidence Interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had both immediate and delayed effects on mortality.
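The odds ratios above come from a conditional analysis of the case-crossover design; as a simplified illustration of the quantity being estimated, a crude odds ratio and its Wald 95% confidence interval can be computed from a 2x2 table. The counts below are hypothetical, not the Klang Valley data, and the crude table omits the within-subject matching of the actual design:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table (a, b = exposed/unexposed cases;
    c, d = exposed/unexposed controls) with a Wald 95% CI on the log scale."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts for illustration only.
or_, lower, upper = odds_ratio_ci(40, 30, 60, 70)
```

An interval that straddles 1 (as here) would not reach significance, which is why the abstract reports lower CI bounds just above 1 for its significant lags.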

  7. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had less than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials which adjusted for center using a method of analysis which requires a large number of events per center, 6% had less than 1 event per center-treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios. Copyright © 2015 Elsevier Inc. All rights reserved.
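The review's headline statistic, the median (and interquartile range) number of events per center/treatment-arm combination, can be computed directly from per-cell event counts. A minimal Python sketch; the trial cells below are hypothetical, and the IQR uses a simple median-of-halves rule (one of several common conventions):

```python
from statistics import median

def events_summary(cell_counts):
    """Median and a median-of-halves interquartile range of the number
    of events per center/treatment-arm combination."""
    counts = sorted(cell_counts.values())
    n = len(counts)
    q1 = median(counts[: n // 2])
    q3 = median(counts[(n + 1) // 2:])
    return median(counts), (q1, q3)

# Hypothetical trial: events observed in each (center, arm) cell.
cells = {("A", "treat"): 1, ("A", "ctrl"): 2, ("B", "treat"): 3,
         ("B", "ctrl"): 8, ("C", "treat"): 0, ("C", "ctrl"): 12}
med, iqr = events_summary(cells)
```

Summaries like this make the sparsity problem visible: a median of 2-3 events per cell is far below what fixed-effects or stratified Cox adjustment needs to perform well.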

  8. An analysis on boron dilution events during SBLOCA for the KNGR

    Kim, Young In; Hwang, Young Dong; Park, Jong Kuen; Chung, Young Jong; Sim, Suk Gu

    1999-02-01

    An analysis of boron dilution events during a small break loss-of-coolant accident (LOCA) for the Korea Next Generation Reactor (KNGR) was performed using the computational fluid dynamics (CFD) code FLUENT. The maximum size of the water slug was determined based on the source of the unborated water slug and the possible flow paths. An axisymmetric CFD model was applied for a conservative scoping analysis of unborated water slug mixing with the recirculation water of the reactor system following a small break LOCA, assuming one Reactor Coolant Pump (RCP) restart. The computational grid was determined through a sensitivity study on grid size, selecting the grid that yields the most conservative results, and a preliminary calculation of boron mixing was performed using this grid. (Author). 17 refs., 3 tabs., 26 figs

  9. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report is directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs

  10. From spatially variable streamflow to distributed hydrological models: Analysis of key modeling decisions

    Fenicia, Fabrizio; Kavetski, Dmitri; Savenije, Hubert H. G.; Pfister, Laurent

    2016-02-01

    This paper explores the development and application of distributed hydrological models, focusing on the key decisions of how to discretize the landscape, which model structures to use in each landscape element, and how to link model parameters across multiple landscape elements. The case study considers the Attert catchment in Luxembourg—a 300 km2 mesoscale catchment with 10 nested subcatchments that exhibit clearly different streamflow dynamics. The research questions are investigated using conceptual models applied at hydrologic response unit (HRU) scales (1-4 HRUs) on 6 hourly time steps. Multiple model structures are hypothesized and implemented using the SUPERFLEX framework. Following calibration, space/time model transferability is tested using a split-sample approach, with evaluation criteria including streamflow prediction error metrics and hydrological signatures. Our results suggest that: (1) models using geology-based HRUs are more robust and capture the spatial variability of streamflow time series and signatures better than models using topography-based HRUs; this finding supports the hypothesis that, in the Attert, geology exerts a stronger control than topography on streamflow generation, (2) streamflow dynamics of different HRUs can be represented using distinct and remarkably simple model structures, which can be interpreted in terms of the perceived dominant hydrologic processes in each geology type, and (3) the same maximum root zone storage can be used across the three dominant geological units with no loss in model transferability; this finding suggests that the partitioning of water between streamflow and evaporation in the study area is largely independent of geology and can be used to improve model parsimony. The modeling methodology introduced in this study is general and can be used to advance our broader understanding and prediction of hydrological behavior, including the landscape characteristics that control hydrologic response, the

  11. Key Microbiota Identification Using Functional Gene Analysis during Pepper (Piper nigrum L.) Peeling.

    Zhang, Jiachao; Hu, Qisong; Xu, Chuanbiao; Liu, Sixin; Li, Congfa

    2016-01-01

    Pepper pericarp microbiota plays an important role in the pepper peeling process for the production of white pepper. We collected pepper samples at different peeling time points from Hainan Province, China, and used a metagenomic approach to identify changes in the pericarp microbiota based on functional gene analysis. UniFrac distance-based principal coordinates analysis revealed significant changes in the pericarp microbiota structure during peeling, which were attributed to increases in bacteria from the genera Selenomonas and Prevotella. We identified 28 core operational taxonomic units at each time point, mainly belonging to Selenomonas, Prevotella, Megasphaera, Anaerovibrio, and Clostridium genera. The results were confirmed by quantitative polymerase chain reaction. At the functional level, we observed significant increases in microbial features related to acetyl xylan esterase and pectinesterase for pericarp degradation during peeling. These findings offer a new insight into biodegradation for pepper peeling and will promote the development of the white pepper industry.

  12. Sensitivity Analysis of Wind Plant Performance to Key Turbine Design Parameters: A Systems Engineering Approach; Preprint

    Dykes, K.; Ning, A.; King, R.; Graf, P.; Scott, G.; Veers, P.

    2014-02-01

    This paper introduces the development of a new software framework for research, design, and development of wind energy systems which is meant to 1) represent a full wind plant including all physical and nonphysical assets and associated costs up to the point of grid interconnection, 2) allow use of interchangeable models of varying fidelity for different aspects of the system, and 3) support system level multidisciplinary analyses and optimizations. This paper describes the design of the overall software capability and applies it to a global sensitivity analysis of wind turbine and plant performance and cost. The analysis was performed using three different model configurations involving different levels of fidelity, which illustrate how increasing fidelity can preserve important system interactions that build up to overall system performance and cost. Analyses were performed for a reference wind plant based on the National Renewable Energy Laboratory's 5-MW reference turbine at a mid-Atlantic offshore location within the United States.

  13. HOURLY STABILITY ANALYSIS AS THE KEY PARAMETER OF LEAN MANUFACTURING AND LOGISTICS

    Petr Besta

    2015-12-01

    Full Text Available Lean manufacturing is one of the basic philosophies originating in the automotive industry. It was originally based on a number of elementary principles and methods, and companies from other industrial areas have gradually been trying to apply these principles as well, which has led to the incorporation of other tools from various areas into this concept. The fundamental techniques of lean manufacturing include the hourly stability (output) analysis. This method can be applied in a wide variety of manufacturing fields. The aim is a worker who works at a stable pace, not one who works rapidly and with large fluctuations: speed and sudden changes mean inaccuracy, poor quality and problems for manufacturing companies. As part of the research, an hourly stability analysis was also carried out in a company manufacturing components for a variety of global car manufacturers. The objective of this article is to evaluate the research on hourly stability for the selected workplaces.
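Hourly stability can be quantified, for example, by the coefficient of variation of a worker's hourly outputs; the metric choice and the numbers below are illustrative, not taken from the study:

```python
from statistics import mean, pstdev

def hourly_stability(hourly_output):
    """Coefficient of variation of hourly output: a lower value indicates
    the steady, stable pace that hourly stability analysis looks for."""
    return pstdev(hourly_output) / mean(hourly_output)

# Hypothetical piece counts per hour for two workers over one shift.
steady = [50, 52, 49, 51, 50, 50, 49, 51]
erratic = [70, 30, 65, 35, 60, 40, 66, 36]
```

Note that both workers here average the same total output; only the fluctuation differs, which is exactly the distinction the method is designed to surface.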

  14. Establishing a Common Vocabulary of Key Concepts for the Effective Implementation of Applied Behavior Analysis

    Traci M. CIHON; Joseph H. CIHON; Guy M. BEDIENT

    2016-01-01

    The technical language of behavior analysis is arguably necessary to share ideas and research with precision among each other. However, it can hinder effective implementation of behavior analytic techniques when it prevents clear communication between the supervising behavior analyst and behavior technicians. The present paper provides a case example of the development of a shared vocabulary, using plain English when possible, among supervisors and supervisees at a large public school distric...

  15. Hydroacoustic monitoring of a salt cavity: an analysis of precursory events of the collapse

    Lebert, F.; Bernardie, S.; Mainsant, G.

    2011-09-01

    One of the main features of "post mining" research relates to available methods for monitoring mine-degradation processes that could directly threaten surface infrastructures. In this respect, GISOS, a French scientific interest group, is investigating techniques for monitoring the eventual collapse of underground cavities. One of the methods under investigation was monitoring the stability of a salt cavity by recording microseismic precursor signals that may indicate the onset of rock failure. The data were recorded in a salt mine in Lorraine (France) while monitoring the controlled collapse of 2 000 000 m3 of rocks surrounding a cavity at 130 m depth. The monitoring in the 30 Hz to 3 kHz frequency range highlights the occurrence of high-energy events during periods of macroscopic movement, once the layers had ruptured; they appear to be the consequence of post-rupture rock movements related to the intense deformation of the cavity roof. Moreover, the analysis shows the presence of some interesting precursory signals before the cavity collapsed. They occurred a few hours before the failure phases, when the rocks were being weakened and damaged, and originated from the damage and breaking process, as micro-cracks appear and then coalesce. From these results we expect that deeper signal analysis and statistical analysis of the complete event time distribution (several million files) will allow us to finalize a complete typology of the signal families and their relation to the evolution of the cavity over the five years of monitoring.

  16. Automatic Single Event Effects Sensitivity Analysis of a 13-Bit Successive Approximation ADC

    Márquez, F.; Muñoz, F.; Palomo, F. R.; Sanz, L.; López-Morillo, E.; Aguirre, M. A.; Jiménez, A.

    2015-08-01

    This paper presents the Analog Fault Tolerant University of Seville Debugging System (AFTU), a tool to evaluate the Single-Event Effect (SEE) sensitivity of analog/mixed-signal microelectronic circuits at transistor level. As analog cells can behave in an unpredictable way when a particle strikes their critical areas, designers need a software tool that allows an automatic and exhaustive analysis of Single-Event Effect influence. AFTU takes the test-bench SPECTRE design, emulates radiation conditions and automatically evaluates vulnerabilities using user-defined heuristics. To illustrate the utility of the tool, the SEE sensitivity of a 13-bit Successive Approximation Analog-to-Digital Converter (ADC) was analysed. This circuit was selected not only because it was designed for space applications, but also because a manual SEE sensitivity analysis would be too time-consuming. After a user-defined test campaign, it was detected that some voltage transients propagated to a node where a parasitic diode was activated, affecting the offset cancellation and therefore the resolution of the whole ADC. A simple modification of the scheme solved the problem, as was verified with another automatic SEE sensitivity analysis.
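The structure of such a campaign, inject a transient at each candidate node, simulate, and apply a user-defined pass/fail heuristic, can be sketched in a few lines. The toy Python model below is only an illustration of that loop; AFTU itself drives SPECTRE netlists, and the node names, stand-in simulator and tolerance here are invented.

```python
def see_campaign(nodes, simulate, heuristic):
    """Exhaustive single-event-transient campaign: inject a fault at each
    node in turn, run the simulator, and collect the nodes that the
    user-defined heuristic marks as vulnerable."""
    return [node for node in nodes if heuristic(simulate(node))]

# Toy stand-in for a circuit simulation: only the hypothetical
# 'offset_cap' node propagates a transient to the output waveform.
def simulate(node):
    nominal = [1.0] * 8
    if node == "offset_cap":   # parasitic-diode-like upset at this node
        nominal[3] = 1.6
    return nominal

# Heuristic: flag any output sample deviating from Vref beyond tolerance.
def heuristic(waveform, vref=1.0, tol=0.25):
    return any(abs(v - vref) > tol for v in waveform)

print(see_campaign(["comparator", "dac_mid", "offset_cap"],
                   simulate, heuristic))
```

The value of automating this is exactly what the abstract argues: the per-node loop is trivial, but running it exhaustively over every sensitive node of a 13-bit SAR ADC by hand would be prohibitively slow.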

  17. Analysis of area events as part of probabilistic safety assessment for Romanian TRIGA SSR 14 MW reactor

    Mladin, D.; Stefan, I.

    2005-01-01

    International experience has shown that external events can be an important contributor to plant/reactor risk. For this reason such events have to be included in PSA studies. In the context of PSA for nuclear facilities, external events are defined as events originating from outside the plant, but with the potential to create an initiating event at the plant. To support plant safety assessment, PSA can be used to identify vulnerable features of the plant and to suggest modifications in order to mitigate the impact of external events or the production of initiating events. For that purpose, probabilistic assessment of area events concerning fire and flooding risk and impact is necessary. Because its power level is relatively large among research reactors, the safety analysis of the Romanian 14 MW TRIGA benefits from an ongoing PSA project. In this context, treatment of external events should be considered. The specific tasks proposed for the complete evaluation of area-event analysis are: identify the rooms important for facility safety; determine a relative area-event risk index for these rooms and a relative area-event impact index if the event occurs; evaluate the rooms' specific area-event frequency; determine the rooms' contribution to reactor hazard-state frequencies; and analyze power-supply and room dependencies of safety components (such as pumps and motor-operated valves). The fire-risk analysis methodology is based on Berry's method [1]. This approach provides a systematic procedure to derive a relative index for different rooms. The factors which affect the fire probability are: personnel presence in the room, number and type of ignition sources, type and area of combustibles, fuel available in the room, fuel location, and ventilation. The flooding-risk analysis is based on the amount of piping in the room. For accurate information regarding piping, a facility walk-through is necessary. In case of flooding risk
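A relative fire-risk index of the kind described can be sketched as a weighted sum of per-room factor ratings, normalised so the riskiest room scores 1.0. The Python sketch below is only illustrative: the weights, rating scale and room data are invented, and Berry's method defines its own factor scales and combination rules.

```python
# Hypothetical factor weights; Berry's method prescribes its own scales.
WEIGHTS = {
    "personnel_presence": 1.0,
    "ignition_sources": 2.0,
    "combustible_area": 1.5,
    "fuel_load": 2.0,
    "ventilation": 0.5,
}

def relative_fire_risk(rooms):
    """Score each room as a weighted sum of its factor ratings (0-10),
    then normalise so the highest-risk room scores 1.0."""
    raw = {name: sum(WEIGHTS[f] * rating for f, rating in factors.items())
           for name, factors in rooms.items()}
    top = max(raw.values())
    return {name: round(score / top, 2) for name, score in raw.items()}

rooms = {
    "control_room": {"personnel_presence": 8, "ignition_sources": 3,
                     "combustible_area": 2, "fuel_load": 1, "ventilation": 5},
    "cable_vault":  {"personnel_presence": 1, "ignition_sources": 5,
                     "combustible_area": 7, "fuel_load": 8, "ventilation": 2},
}
print(relative_fire_risk(rooms))
```

The output ranks rooms relative to each other rather than estimating an absolute fire frequency, which matches the stated goal of identifying which rooms deserve detailed treatment.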

  18. COMPARABLE ANALYSIS REGARDING KEY MACROECONOMIC INDICATORS ON MOLDOVA’S WAY TOWARDS EUROPEAN INTEGRATION

    Valentina GANCIUCOV

    2015-07-01

    Full Text Available As Moldova aims to join the European Union, this article analyzes the country's current situation. The article gives a comparative analysis of Moldova's basic parameters against those of European Union member states in order to define ways for the country to develop in that direction. Since 1994, relations between Moldova and the European Union have developed on an upward trajectory. The dialogue between the two sides officially started that year with the signing of the Partnership and Cooperation Agreement (PCA), which entered into force in 1998 and provided the basis for cooperation with the EU in the political, commercial, economic, legal and cultural fields. EU-Moldova relations advanced to a higher level in 2009, when the country joined the Eastern Partnership, an instrument of European policy that favored the signing on 29 May 2013 of the Association Agreement, the document which replaced the previous PCA and is currently the most important element of the legal framework of the Moldova-EU dialogue. But beyond the respective treaties signed individually between the EU and states that intend to join the European community, there are a number of fundamental requirements (criteria) which condition the process of European integration of a state with declared intentions of accession. The aim of the research is to analyze to what extent the Moldovan economy meets the requirements of economic alignment with EU standards, by carrying out a comparative analysis of the main relevant macroeconomic indicators. Research methodology: the analysis-synthesis method, the comparison method and others were used. Results of the analysis: part of the criteria analyzed converge with EU requirements, while the most relevant indicators regarding standards of living show reserves for future improvement, such as the average wage, the lending rate, the exchange rate of the Moldovan Leu against the major international

  19. An analysis of potential costs of adverse events based on Drug Programs in Poland. Pulmonology focus

    Szkultecka-Debek Monika

    2014-06-01

    Full Text Available The project was performed within the Polish Society for Pharmacoeconomics (PTFE). The objective was to estimate the potential costs of treating the side effects which may theoretically occur as a result of treatment of selected diseases. We analyzed the Drug Programs financed by the National Health Fund in Poland in 2012 and, for the first analysis, selected those Programs in which the same medicinal products were used. We based the adverse-event selection on the Summary of Product Characteristics of the chosen products. We extracted all the potential adverse events defined as frequent and very frequent, grouping them according to therapeutic areas. This paper relates to the results in the pulmonology area. The events described as very common had an incidence of ≥ 1/10, and the common ones ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. On the basis of the collected data we allocated the direct costs incurred by the public payer. We used the costs valid in December 2013. The paper presents the estimated costs of treating the side effects related to the pulmonology disease area. Taking into account the costs incurred by the NHF and by the patient separately, we calculated the total spending and the percentage of each component cost in detail. The treatment of adverse drug reactions generates a significant cost incurred by both the public payer and the patient.
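The final aggregation step, summing payer and patient costs per adverse event and expressing each event's share of total spending, is simple arithmetic and can be sketched as follows. The event names and cost figures below are hypothetical placeholders, not values from the study.

```python
def cost_breakdown(components):
    """Sum payer and patient costs per adverse event and report each
    event's percentage share of total spending."""
    total = sum(payer + patient for payer, patient in components.values())
    shares = {name: round(100 * (payer + patient) / total, 1)
              for name, (payer, patient) in components.items()}
    return total, shares

# Hypothetical per-event treatment costs as (payer, patient) pairs in PLN:
events = {"dyspnoea": (420.0, 80.0),
          "cough": (150.0, 50.0),
          "bronchitis": (900.0, 100.0)}
total, shares = cost_breakdown(events)
print(total, shares)
```

Keeping payer and patient costs as separate inputs mirrors the paper's approach of reporting the two perspectives separately before combining them into total spending.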

  20. Preliminary analysis of beam trip and beam jump events in an ADS prototype

    D'Angelo, A.; Bianchini, G.; Carta, M.

    2001-01-01

    A core dynamics analysis of some typical current-transient events has been carried out on an 80 MW energy amplifier prototype (EAP) fuelled by mixed oxides and cooled by lead-bismuth. Fuel and coolant temperature trends during recovered beam-trip and beam-jump events have been preliminarily investigated. Beam-trip results show that the drop in temperature of the core-outlet coolant would be reduced considerably if the beam intensity could be recovered within a few seconds. Due to the low power density in the EAP fuel, the beam jump from 50% of nominal power evolves benignly. The worst conceivable current transient, a beam jump with a cold reactor, mainly depends on the coolant flow conditions. In the EAP design, the primary-loop coolant flow is assured by natural convection and is enhanced by a particular system of cover-gas injection into the bottom part of the riser. If this system of coolant-flow enhancement is assumed to be functioning, even the beam-jump-with-cold-reactor event evolves without severe consequences. (authors)