WorldWideScience

Sample records for statistical failure analysis

  1. Uncertainty analysis with statistically correlated failure data

    International Nuclear Information System (INIS)

    Modarres, M.; Dezfuli, H.; Roush, M.L.

    1987-01-01

    Likelihood of occurrence of the top event of a fault tree, or of sequences of an event tree, is estimated from the failure probabilities of the components that constitute the events of the fault/event tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. At present most fault tree calculations are based on uncorrelated component failure data. This chapter describes a methodology for assessing the probability intervals for the top event failure probability of fault trees, or the frequency of occurrence of event tree sequences, when event failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method is presented through Taylor series expansion, which provides an alternative to the normally used Monte Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment-matching technique is used to obtain the probability distribution function of the top event through fitting the Johnson S_B distribution. The computer program CORRELATE was developed to perform the calculations necessary for the implementation of the method. (author)
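
    The second-order moment method described above can be sketched numerically. Below is a minimal Python sketch assuming an invented three-component fault tree and invented means/covariances; it is not the CORRELATE program, and the Johnson S_B fitting step is omitted.

      import numpy as np

      # Hypothetical fault tree: top = (A AND B) OR C; the A-B covariance
      # stands in for statistically correlated failure data.
      def top_event(p):
          pa, pb, pc = p
          return 1.0 - (1.0 - pa * pb) * (1.0 - pc)

      mu = np.array([1e-3, 2e-3, 5e-4])           # component failure prob. means (assumed)
      cov = np.array([[1.0e-8, 4.0e-9, 0.0],      # assumed covariances; the
                      [4.0e-9, 4.0e-8, 0.0],      # off-diagonals encode correlation
                      [0.0,    0.0,    1.0e-8]])

      # Numerical gradient and Hessian of the top-event function at the means
      eps = 1e-6
      n = len(mu)
      grad = np.zeros(n)
      hess = np.zeros((n, n))
      for i in range(n):
          ei = np.eye(n)[i] * eps
          grad[i] = (top_event(mu + ei) - top_event(mu - ei)) / (2 * eps)
          for j in range(n):
              ej = np.eye(n)[j] * eps
              hess[i, j] = (top_event(mu + ei + ej) - top_event(mu + ei - ej)
                            - top_event(mu - ei + ej) + top_event(mu - ei - ej)) / (4 * eps**2)

      # Second-order mean and first-order variance by Taylor expansion;
      # correlations enter through the off-diagonal covariance terms.
      mean_top = top_event(mu) + 0.5 * np.trace(hess @ cov)
      var_top = grad @ cov @ grad
      print(mean_top, np.sqrt(var_top))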

  2. The Statistical Analysis of Failure Time Data

    CERN Document Server

    Kalbfleisch, John D

    2011-01-01

    Contains additional discussion and examples on left truncation as well as material on more general censoring and truncation patterns. Introduces the martingale and counting process formulation in a new chapter. Develops multivariate failure time data in a separate chapter and extends the material on Markov and semi-Markov formulations. Presents new examples and applications of data analysis.

  3. Statistical Analysis Of Failure Strength Of Material Using Weibull Distribution

    International Nuclear Information System (INIS)

    Entin Hartini; Mike Susmikanti; Antonius Sitompul

    2008-01-01

    In the evaluation of ceramic and glass material strength, a statistical approach is necessary. The strength of ceramics and glasses depends on the specimen size and on the size distribution of flaws in the material. The distribution of strength for ductile materials is narrow and close to a Gaussian distribution, while the strength of brittle materials such as ceramics and glasses follows a Weibull distribution. The Weibull distribution is an indicator of the failure of material strength resulting from a distribution of flaw sizes. In this paper, the cumulative probability of material strength to failure, the cumulative probability of failure versus fracture stress, and the cumulative probability of reliability of the material were calculated. Statistical criteria calculations supporting the strength analysis of silicon nitride material were done utilizing MATLAB. (author)
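
    As an illustration of the Weibull treatment described above, the following sketch fits a two-parameter Weibull distribution to invented fracture-stress data and evaluates the cumulative failure probability and reliability; it is a stand-in for the MATLAB calculations, not the authors' code.

      import numpy as np
      from scipy import stats

      # Invented fracture-stress data (MPa) for a brittle ceramic specimen set
      stress = np.array([412, 438, 459, 471, 480, 495, 503, 512, 530, 551], float)

      # Two-parameter Weibull fit (location fixed at zero)
      m, loc, s0 = stats.weibull_min.fit(stress, floc=0)
      print(f"Weibull modulus m = {m:.1f}, characteristic strength = {s0:.0f} MPa")

      # Cumulative probability of failure and reliability versus applied stress
      for sigma in (350, 400, 450, 500, 550):
          pf = stats.weibull_min.cdf(sigma, m, loc=0, scale=s0)
          print(f"{sigma} MPa: Pf = {pf:.3f}, R = {1 - pf:.3f}")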

  4. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical analysis and probabilistic risk assessment (PRA) are discussed in terms of a categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
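
    The abstract does not name the specific test statistics used; as a hedged illustration of checking trend significance in rare failure events, the sketch below applies the standard Laplace trend test to invented event times.

      import numpy as np

      # Invented failure times (years) within a 10-year observation window
      t = np.array([0.8, 2.1, 3.9, 4.2, 5.5, 6.9, 7.4, 8.8])
      T = 10.0
      n = len(t)

      # Laplace statistic: approximately standard normal under a constant failure rate
      u = (t.mean() - T / 2) / (T * np.sqrt(1.0 / (12 * n)))
      print(f"u = {u:.2f}  (|u| > 1.96 suggests a significant trend at the 5% level)")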

  5. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) mbH, Schwertnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)

    2014-07-01

    Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four extensive operating experience databases screened. • Important insights and conclusions delineated on the basis of the operating experience. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events involving failures and/or unavailability of EDGs, as well as all of their supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures and the attributes that contributed to them, identify potential or real failure modes, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study is closely tied to a statistical analysis of the operating experience. For the purpose of this study, an EDG failure is defined as a failure of an EDG to function on demand (i.e. failure to start, failure to run) or during testing, or an unavailability of an EDG, except for unavailability due to regular maintenance. The Gesellschaft für Anlagen- und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied to each of the four databases is presented. Analyses aimed at delineating the causes, root causes, contributing factors and consequences were then performed. A statistical analysis was performed of the chronology of events, the types of failures, the operational circumstances of failure detection and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  6. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    International Nuclear Information System (INIS)

    Kančev, Duško; Duchac, Alexander; Zerger, Benoit; Maqua, Michael; Wattrelos, Didier

    2014-01-01

    Highlights: • Analysis of operating experience related to emergency diesel generator events at NPPs. • Four extensive operating experience databases screened. • Important insights and conclusions delineated on the basis of the operating experience. - Abstract: This paper studies the operating experience related to emergency diesel generator (EDG) events at nuclear power plants collected over the past 20 years. Events involving failures and/or unavailability of EDGs, as well as all of their supporting equipment, are the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the types of failures and the attributes that contributed to them, identify potential or real failure modes, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study is closely tied to a statistical analysis of the operating experience. For the purpose of this study, an EDG failure is defined as a failure of an EDG to function on demand (i.e. failure to start, failure to run) or during testing, or an unavailability of an EDG, except for unavailability due to regular maintenance. The Gesellschaft für Anlagen- und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports, were screened. The screening methodology applied to each of the four databases is presented. Analyses aimed at delineating the causes, root causes, contributing factors and consequences were then performed. A statistical analysis was performed of the chronology of events, the types of failures, the operational circumstances of failure detection and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  7. Statistical analysis of nuclear power plant pump failure rate variability: some preliminary results

    International Nuclear Information System (INIS)

    Martz, H.F.; Whiteman, D.E.

    1984-02-01

    In-Plant Reliability Data System (IPRDS) pump failure data on over 60 selected pumps in four nuclear power plants are statistically analyzed using the Failure Rate Analysis Code (FRAC). A major purpose of the analysis is to determine which environmental, system, and operating factors adequately explain the variability in the failure data. Catastrophic, degraded, and incipient failure severity categories are considered for both demand-related and time-dependent failures. For catastrophic demand-related pump failures, the variability is explained by the following factors listed in their order of importance: system application, pump driver, operating mode, reactor type, pump type, and unidentified plant-specific influences. Quantitative failure rate adjustments are provided for the effects of these factors. In the case of catastrophic time-dependent pump failures, the failure rate variability is explained by three factors: reactor type, pump driver, and unidentified plant-specific influences. Finally, point and confidence interval failure rate estimates are provided for each selected pump by considering the influential factors. Both types of estimates represent an improvement over the estimates computed exclusively from the data on each pump

  8. Beyond reliability, multi-state failure analysis of satellite subsystems: A statistical approach

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Reliability is widely recognized as a critical design attribute for space systems. In recent articles, we conducted nonparametric analyses and Weibull fits of satellite and satellite subsystems reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we extend our investigation of failures of satellites and satellite subsystems beyond the binary concept of reliability to the analysis of their anomalies and multi-state failures. In reliability analysis, the system or subsystem under study is considered to be either in an operational or failed state; multi-state failure analysis introduces 'degraded states' or partial failures, and thus provides more insights through finer resolution into the degradation behavior of an item and its progression towards complete failure. The database used for the statistical analysis in the present work identifies five states for each satellite subsystem: three degraded states, one fully operational state, and one failed state (complete failure). Because our dataset is right-censored, we calculate the nonparametric probability of transitioning between states for each satellite subsystem with the Kaplan-Meier estimator, and we derive confidence intervals for each probability of transitioning between states. We then conduct parametric Weibull fits of these probabilities using the Maximum Likelihood Estimation (MLE) approach. After validating the results, we compare the reliability versus multi-state failure analyses of three satellite subsystems: the thruster/fuel; the telemetry, tracking, and control (TTC); and the gyro/sensor/reaction wheel subsystems. The results are particularly revealing of the insights that can be gleaned from multi-state failure analysis and the deficiencies, or blind spots, of the traditional reliability analysis. In addition to the specific results provided here, which should prove particularly useful to the space industry, this work highlights the importance
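
    A minimal Kaplan-Meier sketch for right-censored transition times follows; the data are invented stand-ins for the satellite subsystem dataset, and the Weibull MLE fitting step is omitted.

      import numpy as np

      # Invented times to a state transition (years); 0 marks right-censored units
      times    = np.array([0.5, 1.2, 1.2, 2.0, 3.1, 4.0, 4.0, 5.5, 6.0, 7.5])
      observed = np.array([1,   1,   0,   1,   0,   1,   1,   0,   1,   0])

      S = 1.0
      for ti in np.unique(times):
          di = observed[times == ti].sum()   # transitions observed at ti
          ni = (times >= ti).sum()           # units still at risk just before ti
          if di:
              S *= 1.0 - di / ni             # Kaplan-Meier product-limit update
              print(f"t = {ti:4.1f}  S(t) = {S:.3f}")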

  9. Analysis of Statistical Distributions Used for Modeling Reliability and Failure Rate of Temperature Alarm Circuit

    International Nuclear Information System (INIS)

    El-Shanshoury, G.I.

    2011-01-01

    Several statistical distributions are used to model various reliability and maintainability parameters. The applied distribution depends on the nature of the data being analyzed. The present paper deals with the analysis of some statistical distributions used in reliability in order to identify the best-fitting distribution. The calculations rely on circuit quantity parameters obtained using the Relex 2009 computer program. The statistical analysis of ten different distributions indicated that the Weibull distribution gives the best fit for modeling the reliability of the data set of the Temperature Alarm Circuit (TAC). However, the Exponential distribution is found to be the best fit for modeling the failure rate
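
    The abstract does not state the goodness-of-fit criterion used; the sketch below ranks candidate distributions on synthetic data by AIC and the Kolmogorov-Smirnov statistic, one common way to reach a best-fit choice.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      ttf = rng.weibull(1.8, 200) * 1000.0    # synthetic times to failure (h)

      candidates = {"weibull": stats.weibull_min, "exponential": stats.expon,
                    "lognormal": stats.lognorm, "gamma": stats.gamma}
      for name, dist in candidates.items():
          params = dist.fit(ttf, floc=0)            # location fixed at zero
          k = len(params) - 1                       # free parameters (loc is fixed)
          aic = 2 * k - 2 * np.sum(dist.logpdf(ttf, *params))
          ks = stats.kstest(ttf, dist.cdf, args=params).statistic
          print(f"{name:12s} AIC = {aic:9.1f}  KS = {ks:.3f}")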

  10. Statistical analysis of human maintenance failures of a nuclear power plant

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-01-01

    In this paper, a statistical study of faults caused by maintenance activities is presented. The objective of the study was to draw conclusions on the unplanned effects of maintenance on nuclear power plant safety and system availability. More than 4400 maintenance history reports from the years 1992-1994 of the Olkiluoto BWR nuclear power plant (NPP) were analysed together with the maintenance personnel. The human-action-induced faults were classified, e.g., according to their multiplicity and effects. This paper presents and discusses the results of a statistical analysis of the data. Instrumentation and electrical components are especially prone to human failures. Many human failures were found in safety-related systems. Similarly, several failures remained latent from outages to power operation. The safety significance was generally small. Modifications are an important source of multiple human failures. Plant maintenance data is a good source of human reliability data and should be used more in the future. (orig.)

  11. A statistical analysis of pellet-clad interaction failures in water reactor fuel

    International Nuclear Information System (INIS)

    McDonald, S.G.; Fardo, R.D.; Sipush, P.J.; Kaiser, R.S.

    1981-01-01

    The primary objective of the statistical analysis was to develop a mathematical function that would predict PCI fuel rod failures as a function of the imposed operating conditions. Linear discriminant analysis of data from both test and commercial reactors was performed. The initial data base used encompassed 713 data points (117 failures and 596 non-failures) representing a wide variety of water-cooled reactor fuel (PWR, BWR, CANDU, and SGHWR). When applied on a best-estimate basis, the resulting function simultaneously predicts approximately 80 percent of both the failure and non-failure data correctly. One of the most significant predictions of the analysis is that relatively large changes in power can be tolerated when the pre-ramp irradiation power is low, but only small changes in power can be tolerated when the pre-ramp irradiation power is high. However, it is also predicted that fuel rods irradiated at low power will fail at lower final powers than those irradiated at high powers. Other results of the analysis are that fuel rods with high clad operating temperatures can withstand larger power increases than fuel rods with low clad operating temperatures, and that burnup has only a minimal effect on PCI performance after levels of approximately 10000 MWD/MTU have been exceeded. These trends in PCI performance and the operating parameters selected are believed to be consistent with mechanistic considerations. Published PCI data indicate that BWR fuel usually operates at higher local powers and changes in power, lower clad temperatures, and higher local ramp rates than PWR fuel
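
    A sketch of the linear discriminant analysis step follows, on an invented miniature of such a database (the real analysis used 713 points, and the actual feature definitions are not reproduced here).

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Invented features: pre-ramp power, final power (kW/m), clad temperature (C)
      X = np.array([[15, 35, 320], [25, 45, 340], [10, 42, 310], [30, 38, 350],
                    [20, 50, 330], [12, 28, 315], [18, 48, 345], [28, 33, 355]], float)
      y = np.array([0, 0, 1, 0, 1, 0, 1, 0])   # 1 = PCI failure (invented labels)

      lda = LinearDiscriminantAnalysis().fit(X, y)
      print("discriminant coefficients:", lda.coef_)
      print("in-sample predictions:   ", lda.predict(X))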

  12. Uncertainty analysis of reactor safety systems with statistically correlated failure data

    International Nuclear Information System (INIS)

    Dezfuli, H.; Modarres, M.

    1985-01-01

    The probability of occurrence of the top event of a fault tree is estimated from the failure probabilities of the components that constitute the fault tree. Component failure probabilities are subject to statistical uncertainties. In addition, there are cases where the failure data are statistically correlated. Most fault tree evaluations have so far been based on uncorrelated component failure data. The subject of this paper is the description of a method of assessing the probability intervals for the top event failure probability of fault trees when component failure data are statistically correlated. To estimate the mean and variance of the top event, a second-order system moment method is presented through Taylor series expansion, which provides an alternative to the normally used Monte-Carlo method. For cases where component failure probabilities are statistically correlated, the Taylor expansion terms are treated properly. A moment matching technique is used to obtain the probability distribution function of the top event through fitting a Johnson S_B distribution. The computer program CORRELATE was developed to perform the calculations necessary for the implementation of the method. The CORRELATE code is very efficient and consumes minimal computer time, primarily because it does not employ the time-consuming Monte-Carlo method. (author)

  13. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1998-08-01

    The objective of this work is to develop basic data for examining the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. The work consists of a statistical trend analysis of the valve failure probability in the failure-to-open/close mode as a function of time since installation and time since the last open/close action, based on field data of operating and failure experience. Both time-dependent and time-independent terms were considered in the failure probability. The linear aging model was modified and applied to the time-dependent term; in this model there are two contributions, with failure rates proportional to time since installation and to time since the last open/close demand. Because of their sufficient statistical population, motor-operated valves (MOVs) in sodium systems were selected for analysis from the CORDS database, which contains operating data and failure data of components in fast reactors and sodium test facilities. From these data, the functional parameters were statistically estimated to quantify the valve failure probability in the failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)

  14. Statistical analysis of failure time in stress corrosion cracking of fuel tube in light water reactor

    International Nuclear Information System (INIS)

    Hirao, Keiichi; Yamane, Toshimi; Minamino, Yoritoshi

    1991-01-01

    This report shows how the stress corrosion cracking life of fuel cladding tubes is evaluated by applying statistical techniques to lives measured by a few testing methods. The statistical distribution of the limiting values of constant-load stress corrosion cracking life, the statistical analysis based on a probabilistic interpretation of constant-load stress corrosion cracking life, and the statistical analysis of stress corrosion cracking life obtained by the slow strain rate test (SSRT) method are described. (K.I.)

  15. A COCAP program for the statistical analysis of common cause failure parameters

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Baehyeuk; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of). Dept. of Nuclear Engineering

    2016-03-15

    Probabilistic Safety Assessment (PSA) based applications and regulations are becoming more important in the field of nuclear energy. According to the results of PSA in Korea, common cause failure (CCF) is one of the significant factors affecting the CDF (Core Damage Frequency), since it defeats the redundancy of NPPs. The purpose of this study is to develop a COCAP (Common Cause Failure parameter Analysis for PSA) program for the accurate use of the alpha factor model parameter data provided by other countries and for obtaining indigenous CCF data for NPPs in Korea through Bayesian updating.
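
    For context, the alpha factor model named above turns event ratios into common cause failure probabilities. The sketch below uses the standard non-staggered-testing form with invented alpha values, not COCAP's data, and omits the Bayesian updating step.

      from math import comb

      m = 3                                  # redundancy (number of trains), assumed
      alpha = {1: 0.95, 2: 0.03, 3: 0.02}    # invented alpha factors
      Qt = 1.0e-3                            # total failure probability per train

      alpha_t = sum(k * a for k, a in alpha.items())
      for k in range(1, m + 1):
          # probability that a specific group of k trains fails together
          Qk = k * alpha[k] * Qt / (comb(m - 1, k - 1) * alpha_t)
          print(f"Q_{k} = {Qk:.2e}")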

  16. An analysis of distribution transformer failure using the statistical package for the social sciences (SPSS) software

    Directory of Open Access Journals (Sweden)

    María Gabriela Mago Ramos

    2012-05-01

    A methodology was developed for analysing faults in distribution transformers using the statistical package for the social sciences (SPSS); it consisted of organising and creating a database of failed equipment, incorporating such data into the processing programme and converting all the information into numerical variables to be processed, thereby obtaining descriptive statistics and enabling factor and discriminant analysis. The research was based on information provided by companies in areas served by Corpoelec (Valencia, Venezuela) and Codensa (Bogotá, Colombia).

  17. A statistical analysis on failure-to-open/close probability of pneumatic valves in sodium cooling systems

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1999-11-01

    The objective of this study is to develop fundamental data for examining the efficiency of preventive maintenance and surveillance tests from the standpoint of failure probability. As a major standby component, the pneumatic valve in sodium cooling systems was selected. A statistical analysis was made of the trend of the valve failure-to-open/close (FTOC) probability depending on the number of demands ('n'), the time since installation ('t') and the standby time since the last open/close action ('T'). The analysis is based on the field data of operating and failure experiences stored in the Component Reliability Database and Statistical Analysis System for LMFBRs (CORDS). In the analysis, the FTOC probability ('P') was expressed as follows: P = 1 - exp{-C - En - F/n - λT - aT(t - T/2) - AT²/2}. The functional parameters 'C', 'E', 'F', 'λ', 'a' and 'A' were estimated with the maximum likelihood estimation method. As a result, the FTOC probability is well approximated by the failure probability derived from the failure rate under the assumption of a Poisson distribution only when the valve cycle (i.e. the open-close-open cycle) exceeds about 100 days. When the valve cycle is shorter than about 100 days, the FTOC probability can be adequately estimated with the parameter model proposed in this study. The results obtained from this study may make it possible to derive an adequate frequency of surveillance testing for a given target FTOC probability. (author)
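
    The quoted model can be transcribed directly; the parameter values in the sketch below are placeholders, since the paper estimates them by maximum likelihood from the CORDS field data.

      import numpy as np

      def ftoc_probability(n, t, T, C=1e-4, E=1e-5, F=1e-4, lam=1e-5, a=1e-8, A=1e-9):
          """P = 1 - exp{-C - E*n - F/n - lam*T - a*T*(t - T/2) - A*T^2/2}"""
          expo = C + E * n + F / n + lam * T + a * T * (t - T / 2) + A * T**2 / 2
          return 1.0 - np.exp(-expo)

      # FTOC probability after 10 demands and 5 years since installation,
      # as a function of the standby time T (days) since the last action
      for T in (10, 50, 100, 200):
          print(T, round(ftoc_probability(n=10, t=5 * 365, T=T), 6))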

  18. Electric propulsion reliability: Statistical analysis of on-orbit anomalies and comparative analysis of electric versus chemical propulsion failure rates

    Science.gov (United States)

    Saleh, Joseph Homer; Geng, Fan; Ku, Michelle; Walker, Mitchell L. R.

    2017-10-01

    With a few hundred spacecraft launched to date with electric propulsion (EP), it is possible to conduct an epidemiological study of EP's on orbit reliability. The first objective of the present work was to undertake such a study and analyze EP's track record of on orbit anomalies and failures by different covariates. The second objective was to provide a comparative analysis of EP's failure rates with those of chemical propulsion. Satellite operators, manufacturers, and insurers will make reliability- and risk-informed decisions regarding the adoption and promotion of EP on board spacecraft. This work provides evidence-based support for such decisions. After a thorough data collection, 162 EP-equipped satellites launched between January 1997 and December 2015 were included in our dataset for analysis. Several statistical analyses were conducted, at the aggregate level and then with the data stratified by severity of the anomaly, by orbit type, and by EP technology. Mean Time To Anomaly (MTTA) and the distribution of the time to (minor/major) anomaly were investigated, as well as anomaly rates. The important findings in this work include the following: (1) Post-2005, EP's reliability has outperformed that of chemical propulsion; (2) Hall thrusters have robustly outperformed chemical propulsion, and they maintain a small but shrinking reliability advantage over gridded ion engines. Other results were also provided, for example the differentials in MTTA of minor and major anomalies for gridded ion engines and Hall thrusters. It was shown that: (3) Hall thrusters exhibit minor anomalies very early on orbit, which might be indicative of infant anomalies, and thus would benefit from better ground testing and acceptance procedures; (4) Strong evidence exists that EP anomalies (onset and likelihood) and orbit type are dependent, a dependence likely mediated by either the space environment or differences in thrusters duty cycles; (5) Gridded ion thrusters exhibit both

  19. Quantification of Solar Cell Failure Signatures Based on Statistical Analysis of Electroluminescence Images

    DEFF Research Database (Denmark)

    Spataru, Sergiu; Parikh, Harsh; Hacke, Peter

    2017-01-01

    We demonstrate a method to quantify the extent of solar cell cracks, shunting, or damaged cell interconnects, present in crystalline silicon photovoltaic (PV) modules by statistical analysis of the electroluminescence (EL) intensity distributions of individual cells within the module. From the EL intensity distributions (ELID) of each cell, we calculated summary statistics such as standard deviation, median, skewness and kurtosis, and analyzed how they correlate with the magnitude of the solar cell degradation. We found that the dispersion of the ELID increases with the size and severity
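
    A sketch of the per-cell ELID statistics follows; the synthetic arrays stand in for cropped EL images of individual cells.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # Synthetic "cells": increasing spread mimics increasing degradation
      cells = [rng.normal(0.8, 0.05 + 0.05 * k, (120, 120)) for k in range(4)]

      for k, img in enumerate(cells):
          v = img.ravel()
          print(f"cell {k}: std = {v.std():.3f}  median = {np.median(v):.3f}  "
                f"skew = {stats.skew(v):.2f}  kurtosis = {stats.kurtosis(v):.2f}")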

  20. Quantification of solar cell failure signatures based on statistical analysis of electroluminescence images

    DEFF Research Database (Denmark)

    Spataru, Sergiu; Parikh, Harsh; Benatto, Gisele Alves dos Reis

    2017-01-01

    We propose a method to identify and quantify the extent of solar cell cracks, shunting, or damaged cell interconnects, present in crystalline silicon photovoltaic (PV) modules by statistical analysis of the electroluminescence (EL) intensity distributions of individual cells within the module. From the EL intensity distributions (ELID) of each cell, we calculated summary statistics such as standard deviation, median, skewness and kurtosis, and analyzed how they correlate with the type of the solar cell degradation. We found that the dispersion of the ELID increases with the size and severity

  1. Statistical analysis of operating efficiency and failures of a medical linear accelerator for ten years

    International Nuclear Information System (INIS)

    Ju, Sang Gyu; Huh, Seung Jae; Han, Young Yih

    2005-01-01

    To improve the management of a medical linear accelerator, the records of operational failures of a Varian CL2100C over a ten-year period were retrospectively analyzed. The failures were classified according to the functional subunits involved, with each class rated into one of three levels depending on the operational conditions. The relationships between the failure rate and the working ratio, and between the failure rate and the outside temperature, were investigated. In addition, the average lifetime of the main parts and the operating efficiency over the last 4 years were analyzed. Among the recorded failures (587 in total), the most frequent were observed in the parts related to the collimation system, including the monitor chamber, which accounted for 20% of all failures. With regard to the operational conditions, second-level failures, which temporarily interrupted treatments, were the most frequent. Third-level failures, which interrupted treatment for more than several hours, were mostly caused by the accelerating subunit. The number of failures increased with the number of treatments and the operating time. The average lifetimes of the klystron and thyratron became shorter as the working ratio increased, and were 42% and 83% of the expected values, respectively. The operating efficiency was maintained at 95% or higher, but this value slightly decreased over time. There was no significant correlation between the number of failures and the outside temperature. Maintaining detailed records of equipment problems and failures over a long period can provide good knowledge of equipment function as well as the capability of predicting future failures. More rigorous equipment maintenance is required for old medical linear accelerators for the advanced avoidance of serious failure and to improve the quality of patient treatment

  2. Failure Analysis

    International Nuclear Information System (INIS)

    Iorio, A.F.; Crespi, J.C.

    1987-01-01

    After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy water reactor refuelling machine failed. The gear box was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and that several gear teeth were broken at the root. Motion was transmitted through a speed-reducing device with controlled adjustable times in order to produce a proper fit of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis in order to recommend the proper solution to prevent further failures. (Author)

  3. Statistical analysis of fuel failures in large break loss-of-coolant accident (LBLOCA) in EPR type nuclear power plant

    International Nuclear Information System (INIS)

    Arkoma, Asko; Hänninen, Markku; Rantamäki, Karin; Kurki, Joona; Hämäläinen, Anitta

    2015-01-01

    Highlights: • The number of failing fuel rods in a LB-LOCA in an EPR is evaluated. • 59 scenarios are simulated with the system code APROS. • 1000 rods per scenario are simulated with the fuel performance code FRAPTRAN-GENFLO. • All the rods in the reactor are simulated in the worst scenario. • Results suggest that the regulations set by the Finnish safety authority are met. - Abstract: In this paper, the number of failing fuel rods in a large break loss-of-coolant accident (LB-LOCA) in EPR-type nuclear power plant is evaluated using statistical methods. For this purpose, a statistical fuel failure analysis procedure has been developed. The developed method utilizes the results of nonparametric statistics, the Wilks’ formula in particular, and is based on the selection and variation of parameters that are important in accident conditions. The accident scenario is simulated with the coupled fuel performance – thermal hydraulics code FRAPTRAN-GENFLO using various parameter values and thermal hydraulic and power history boundary conditions between the simulations. The number of global scenarios is 59 (given by the Wilks’ formula), and 1000 rods are simulated in each scenario. The boundary conditions are obtained from a new statistical version of the system code APROS. As a result, in the worst global scenario, 1.2% of the simulated rods failed, and it can be concluded that the Finnish safety regulations are hereby met (max. 10% of the rods allowed to fail)
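
    The 59 scenarios quoted above follow from the first-order, one-sided Wilks' formula at the 95%/95% level, which can be checked directly:

      # Smallest n such that the largest of n runs bounds the 95th
      # percentile with 95% confidence: 1 - 0.95**n >= 0.95
      gamma, beta = 0.95, 0.95
      n = 1
      while 1 - gamma**n < beta:
          n += 1
      print(n)   # -> 59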

  4. Statistical analysis of fuel failures in large break loss-of-coolant accident (LBLOCA) in EPR type nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Arkoma, Asko, E-mail: asko.arkoma@vtt.fi; Hänninen, Markku; Rantamäki, Karin; Kurki, Joona; Hämäläinen, Anitta

    2015-04-15

    Highlights: • The number of failing fuel rods in a LB-LOCA in an EPR is evaluated. • 59 scenarios are simulated with the system code APROS. • 1000 rods per scenario are simulated with the fuel performance code FRAPTRAN-GENFLO. • All the rods in the reactor are simulated in the worst scenario. • Results suggest that the regulations set by the Finnish safety authority are met. - Abstract: In this paper, the number of failing fuel rods in a large break loss-of-coolant accident (LB-LOCA) in EPR-type nuclear power plant is evaluated using statistical methods. For this purpose, a statistical fuel failure analysis procedure has been developed. The developed method utilizes the results of nonparametric statistics, the Wilks’ formula in particular, and is based on the selection and variation of parameters that are important in accident conditions. The accident scenario is simulated with the coupled fuel performance – thermal hydraulics code FRAPTRAN-GENFLO using various parameter values and thermal hydraulic and power history boundary conditions between the simulations. The number of global scenarios is 59 (given by the Wilks’ formula), and 1000 rods are simulated in each scenario. The boundary conditions are obtained from a new statistical version of the system code APROS. As a result, in the worst global scenario, 1.2% of the simulated rods failed, and it can be concluded that the Finnish safety regulations are hereby met (max. 10% of the rods allowed to fail)

  5. Data and Statistics: Heart Failure

    Science.gov (United States)


  6. The statistical analysis of failure of a MEVATRON77 DX67 linear accelerator over a ten year period

    International Nuclear Information System (INIS)

    Aoyama, Hideki; Inamura, Keiji; Tahara, Seiji; Uno, Hirofumi; Kadohisa, Shigefumi; Azuma, Yoshiharu; Nakagiri, Yoshitada; Hiraki, Yoshio

    2003-01-01

    A linear accelerator (linac) plays a leading role in radiation therapy. A linac consists of complicated main parts and systems, and highly accurate operational procedures must be maintained. Operational failure occurs for various reasons. In this report, the failure occurrences of one linac over a ten-year period were recorded and analyzed. The subject model was a MEVATRON77 DX67 (Siemens, Inc.). The failure rate for each system, the classification of the contents of failure, the operating situation at the time of failure, and the average service life of the main parts were totaled. Moreover, the relation between the number of therapies that patients received (operating efficiency) and the failure rate, and the relation between environment (temperature and humidity) and the failure rate attributed to other systems, were analyzed. In this report, irradiation interruption was also counted, together with situations where treatment was unable to begin, in the total number of failure cases. The cases of failure were classified into three kinds: irradiation possible, irradiation capacity decreased, and irradiation impossible. Consequently, the total number of failure cases for ten years and eight months was 1,036, and the number of cases/rate of each kind were irradiation possible: 49/4.7%, irradiation capacity decreased: 919/88.7%, and irradiation impossible: 68/6.6%. In the classification according to system, the acceleration section accounted for 59.0% and the pulse section 23.2% of the total number of failure cases. Every year, an operating efficiency of 95% or higher was maintained. The average lives of a thyratron, a klystron, and the radio frequency (RF) driver were 4,886 hours, 17,383 hours, and 5,924 hours, respectively. Moreover, although the relation between the number of therapies performed (or operating time) and the number of failures for each main machine part was examined, no clear association between them was observed

  7. The statistical analysis of failure of a MEVATRON77 DX67 linear accelerator over a ten year period

    CERN Document Server

    Aoyama, H; Tahara, S; Uno, H; Kadohisa, S; Azuma, Y; Nakagiri, Y; Hiraki, Y

    2003-01-01

    A linear accelerator (linac) plays a leading role in radiation therapy. A linac consists of complicated main parts and systems, and highly accurate operational procedures must be maintained. Operational failure occurs for various reasons. In this report, the failure occurrences of one linac over a ten-year period were recorded and analyzed. The subject model was a MEVATRON77 DX67 (Siemens, Inc.). The failure rate for each system, the classification of the contents of failure, the operating situation at the time of failure, and the average service life of the main parts were totaled. Moreover, the relation between the number of therapies that patients received (operating efficiency) and the failure rate, and the relation between environment (temperature and humidity) and the failure rate attributed to other systems, were analyzed. In this report, irradiation interruption was also included with situations where treatment was unable to begin in total for the number o...

  8. Failure rate analysis using GLIMMIX

    International Nuclear Information System (INIS)

    Moore, L.M.; Hemphill, G.M.; Martz, H.F.

    1998-01-01

    This paper illustrates use of a recently developed SAS macro, GLIMMIX, for implementing an analysis suggested by Wolfinger and O'Connell (1993) in modeling failure count data with random as well as fixed factor effects. Interest in this software tool arose from consideration of modernizing the Failure Rate Analysis Code (FRAC), developed at Los Alamos National Laboratory in the early 1980's by Martz, Beckman and McInteer (1982). FRAC is a FORTRAN program developed to analyze Poisson distributed failure count data as a log-linear model, possibly with random as well as fixed effects. These statistical modeling assumptions are a special case of generalized linear mixed models, identified as GLMM in the current statistics literature. In the nearly 15 years since FRAC was developed, there have been considerable advances in computing capability, statistical methodology and available statistical software tools allowing worthwhile consideration of the tasks of modernizing FRAC. In this paper, the approaches to GLMM estimation implemented in GLIMMIX and in FRAC are described and a comparison of results for the two approaches is made with data on catastrophic time-dependent pump failures from a report by Martz and Whiteman (1984). Additionally, statistical and graphical model diagnostics are suggested and illustrated with the GLIMMIX analysis results
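
    The fixed-effects part of such a Poisson log-linear failure-count model can be sketched with statsmodels; the pump counts and exposure hours below are invented, and the random-effects part that GLIMMIX adds is omitted.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      df = pd.DataFrame({
          "failures": [3, 1, 7, 2, 5, 0],                      # invented counts
          "hours":    [8760, 4380, 17520, 8760, 13140, 4380],  # exposure times
          "system":   ["CVCS", "CVCS", "RHR", "RHR", "SW", "SW"],
      })
      # log(exposure) enters as an offset, so coefficients act on failure rates
      fit = smf.glm("failures ~ system", data=df, family=sm.families.Poisson(),
                    offset=np.log(df["hours"])).fit()
      print(fit.params)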

  9. Isogeometric failure analysis

    NARCIS (Netherlands)

    Verhoosel, C.V.; Scott, M.A.; Borden, M.J.; Borst, de R.; Hughes, T.J.R.; Mueller-Hoeppe, D.; Loehnert, S.; Reese, S.

    2011-01-01

    Isogeometric analysis is a versatile tool for failure analysis. On the one hand, the excellent control over the inter-element continuity conditions enables a natural incorporation of continuum constitutive relations that incorporate higher-order strain gradients, as in gradient plasticity or damage.

  10. Statistical data analysis handbook

    National Research Council Canada - National Science Library

    Wall, Francis J

    1986-01-01

    It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...

  11. Statistical characteristics of serious network failures in Japan

    International Nuclear Information System (INIS)

    Uchida, Masato

    2014-01-01

    Due to significant environmental changes in the telecommunications market, network failures affect socioeconomic activities more than ever before. However, the health of public networks at a national level has not been investigated in detail. In this paper, we investigate the statistical characteristics of interval, duration, and the number of users affected for serious network failures, which are defined as network failures that last for more than two hours and affect more than 30,000 users, that occurred in Japan during Japanese fiscal years 2008–2012 (April 2008–March 2013). The results show that (i) the interval follows a Poisson process, (ii) the duration follows a Pareto distribution, (iii) the number of users affected follows a piecewise Pareto distribution, (iv) the product of duration and the number of users affected roughly follow a distribution that can be derived from a convolution of two distributions of duration and the number of users affected, and (v) the relationship between duration and the number of users affected differs from service to service. - Highlights: • The statistical characteristics of serious network failures in Japan are analyzed. • The analysis is based on public information that is available at the moment. • The interval follows a Poisson process. • The duration follows a Pareto distribution. • The number of users affected follows a piecewise Pareto distribution
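
    Given the 2-hour threshold in the definition above, a Pareto fit to the durations has a closed-form maximum-likelihood estimate, sketched here on invented data.

      import numpy as np

      durations = np.array([2.1, 2.4, 3.0, 3.5, 4.8, 6.0, 9.5, 14.0, 30.0])  # hours, invented
      xm = 2.0                                    # lower bound from the 2-hour cutoff
      alpha_hat = len(durations) / np.sum(np.log(durations / xm))
      print(f"Pareto tail index alpha = {alpha_hat:.2f}")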

  12. Cube or block. Statistical analysis, historical review, failure mode and behaviour; Cubo o bloque. Ajuste estadistico, analisis historico, modo de fallo y comportamiento

    Energy Technology Data Exchange (ETDEWEB)

    Negro, V.; Varela, O.; Campo, J. M. del; Lopez Gutierrez, J. S.

    2010-07-01

    Many different concrete shapes have been developed as armour units for rubble-mound breakwaters. Nearly all are of mass concrete construction and can be classified as randomly placed or regular-pattern placed. The majority of artificial armour units are placed in two layers and are massive; they are intended to function in a similar way to natural rock (cubes, blocks, antifer cubes, ...). More complex armour units were designed to achieve greater stability by obtaining a high degree of interlock (dolosse, accropode, Xbloc, core-loc, ...). Finally, the third group comprises the regular-pattern placed units with a greater percentage of voids, giving a stronger dissipation of the heat of cement hydration (cob, shed, hollow cubes, ...). This research deals with the comparison between two massive concrete units, cubes and blocks, and the analysis of their geometry, porosity, construction process and failure mode. The first stage is the statistical analysis, whose scope is based on the historical record of Spanish breakwaters with main layers of cubes and blocks (Ministry of Public Works, General Directorate of Ports, 1988). (Author) 9 refs.

  13. Retrospective analysis of 56 edentulous dental arches restored with 344 single-stage implants using an immediate loading fixed provisional protocol: statistical predictors of implant failure.

    Science.gov (United States)

    Kinsel, Richard P; Liss, Mindy

    2007-01-01

    The purpose of this retrospective study was to evaluate the effects of implant dimensions, surface treatment, location in the dental arch, numbers of supporting implant abutments, surgical technique, and generally recognized risk factors on the survival of a series of single-stage Straumann dental implants placed into edentulous arches using an immediate loading protocol. Each patient received between 4 and 18 implants in one or both dental arches. Periapical radiographs were obtained over a 2- to 10-year follow-up period to evaluate crestal bone loss following insertion of the definitive metal-ceramic fixed prostheses. Univariate tests for failure rates as a function of age (<60 or ≥60 years), gender, smoking, bone grafting, dental arch, surface type, anterior versus posterior location, number of implants per arch, and surgical technique were made using Fisher exact tests. The Cochran-Armitage test for trend was used to evaluate the presence of a linear trend in failure rates regarding implant length and implant diameter. Logistic regression modeling was used to determine which, if any, of the aforementioned factors would predict patient and implant failure. A significance criterion of P = .05 was utilized. Data were collected for 344 single-stage implants placed into 56 edentulous arches (39 maxillae and 17 mandibles) of 43 patients and immediately loaded with a 1-piece provisional fixed prosthesis. A total of 16 implants failed to successfully integrate, for a survival rate of 95.3%. Increased rates of failure were associated with reduced implant length, placement in the posterior region of the jaw, increased implant diameter, and surface treatment. Implant length emerged as the sole significant predictor of implant failure. In this retrospective analysis of 56 consecutively treated edentulous arches with multiple single-stage dental implants loaded immediately, reduced implant length was the sole significant predictor of failure.
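
    A sketch of the univariate Fisher exact test step named above follows; the 2x2 counts are invented, not the study's data.

      from scipy import stats

      # Rows: short (<10 mm) vs longer implants; columns: failed vs survived
      table = [[6, 54], [10, 274]]       # invented counts
      odds, p = stats.fisher_exact(table)
      print(f"odds ratio = {odds:.2f}, p = {p:.3f}")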

  14. Beginning statistics with data analysis

    CERN Document Server

    Mosteller, Frederick; Rourke, Robert EK

    2013-01-01

    This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.

  15. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  16. Lower head failure analysis

    International Nuclear Information System (INIS)

    Rempe, J.L.; Thinnes, G.L.; Allison, C.M.; Cronenberg, A.W.

    1991-01-01

    The US Nuclear Regulatory Commission is sponsoring a lower vessel head research program to investigate plausible modes of reactor vessel failure in order to determine (a) which modes have the greatest likelihood of occurrence during a severe accident and (b) the range of core debris and accident conditions that lead to these failures. This paper presents the methodology and preliminary results of an investigation of reactor designs and thermodynamic conditions using analytic closed-form approximations to assess the important governing parameters in non-dimensional form. Preliminary results illustrate the importance of vessel and tube geometrical parameters, material properties, and external boundary conditions on predicting vessel failure. Thermal analyses indicate that steady-state temperature distributions will occur in the vessel within several hours, although the exact time is dependent upon vessel thickness. In-vessel tube failure is governed by the tube-to-debris mass ratio within the lower head, where most penetrations are predicted to fail if surrounded by molten debris. Melt penetration distance is dependent upon the effective flow diameter of the tube. Molten debris is predicted to penetrate through tubes with a larger effective flow diameter, such as a boiling water reactor (BWR) drain nozzle. Ex-vessel tube failure for depressurized reactor vessels is predicted to be more likely for a BWR drain nozzle penetration because of its larger effective diameter. At high pressures (between ∼0.1 MPa and ∼12 MPa) ex-vessel tube rupture becomes a dominant failure mechanism, although tube ejection dominates control rod guide tube failure at lower temperatures. However, tube ejection and tube rupture predictions are sensitive to the vessel and tube radial gap size and material coefficients of thermal expansion

  17. Approaches to statistical analysis of repeated echocardiographic measurements after myocardial infarction and its relation to heart failure: Application of a random-effects model

    NARCIS (Netherlands)

    de Kam, PJ; Voors, AA; Brouwer, J; van Gilst, WH

    Background: Extensive left ventricular (LV) dilatation after myocardial infarction (MI) is associated with increased heart failure risk. Aims: To investigate whether the power to demonstrate the relation between LV dilatation and heart failure depends on the method applied to predict LV dilatation

  18. Some calculations of the failure statistics of coated fuel particles

    International Nuclear Information System (INIS)

    Martin, D.G.; Hobbs, J.E.

    1977-03-01

    Statistical variations of coated fuel particle parameters were considered in stress model calculations and the resulting particle failure fraction versus burn-up evaluated. Variations in the following parameters were considered simultaneously: kernel diameter and porosity, thickness of the buffer, seal, silicon carbide and inner and outer pyrocarbon layers, which were all assumed to be normally distributed, and the silicon carbide fracture stress which was assumed to follow a Weibull distribution. Two methods, based respectively on random sampling and convolution of the variations were employed and applied to particles manufactured by Dragon Project and RFL Springfields. Convolution calculations proved the more satisfactory. In the present calculations variations in the silicon carbide fracture stress caused the greatest spread in burn-up for a given change in failure fraction; kernel porosity is the next most important parameter. (author)
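
    A Monte Carlo sketch in the spirit of the random-sampling method described above follows; the layer statistics, the Weibull strength parameters and especially the stand-in stress expression are all assumptions, since the report's stress model is not reproduced here.

      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000
      kernel_d  = rng.normal(500, 20, N)          # kernel diameter, um (assumed)
      buffer_t  = rng.normal(95, 10, N)           # buffer thickness, um (assumed)
      sic_t     = rng.normal(35, 3, N)            # SiC thickness, um (assumed)
      strength  = 600.0 * rng.weibull(6.0, N)     # SiC fracture stress, MPa (Weibull)

      # Stand-in stress expression: thinner layers -> higher SiC stress (illustrative)
      stress = 1.2e6 / (sic_t * buffer_t) * (kernel_d / 500.0)
      print(f"failure fraction = {np.mean(stress > strength):.2e}")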

  19. Lessons learned from failure analysis

    International Nuclear Information System (INIS)

    Le May, I.

    2006-01-01

    Failure analysis can be a very useful tool to designers and operators of plant and equipment. It is not simply something that is done for lawyers and insurance companies, but a tool from which lessons can be learned and by means of which the 'breed' can be improved. In this presentation, several failure investigations that have contributed to understanding are presented. Specifically, the following cases are discussed: 1) A fire at a refinery that occurred in a desulphurization unit. 2) The failure of a pipeline before it was even put into operation. 3) Failures in locomotive axles that took place during winter operation. The refinery fire was initially blamed on defective Type 321 seamless stainless steel tubing, but there were conflicting views between the 'experts' involved as to the mechanism of failure, and the writer was called upon to make an in-depth study. This showed that a variety of failure mechanisms were involved, including high temperature fracture, environmentally-induced cracking and possible manufacturing defects. The unraveling of the failure sequence is described and illustrated. The failure of an oil transmission line was discovered when the line was pressure tested some months after it had been installed and before it was put into service. Repairs were made and failure occurred in another place upon the next pressure test being conducted. After several more repairs had been made, the line was abandoned and a lawsuit was commenced on the basis that the steel was defective. An investigation disclosed that the material was sensitive to embrittlement, and the causes of this were determined. As a result, changes were made in the microstructural control of the product to avoid similar problems in future. A series of axle failures occurred in diesel electric locomotives during winter. An investigation was made to determine the nature of the failures, which were not by classical fatigue, nor did they correspond to published illustrations of Cu

  20. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against

  1. Statistical Analysis and validation

    NARCIS (Netherlands)

    Hoefsloot, H.C.J.; Horvatovich, P.; Bischoff, R.

    2013-01-01

    In this chapter guidelines are given for the selection of a few biomarker candidates from a large number of compounds with a relative low number of samples. The main concepts concerning the statistical validation of the search for biomarkers are discussed. These complicated methods and concepts are

  2. Statistical data analysis

    International Nuclear Information System (INIS)

    Hahn, A.A.

    1994-11-01

    The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques

  3. Statistical Analysis Plan

    DEFF Research Database (Denmark)

    Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi

    2015-01-01

    This is the analysis plan for the multicentre randomised control study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected....

  4. Failure of the statistical hypothesis for compound nucleus decay

    International Nuclear Information System (INIS)

    Chrien, R.E.

    1976-01-01

    In the past five years, conclusive evidence has accumulated that channel correlations are important in resonance reactions. Experiments showing the failure of the statistical hypothesis for compound nucleus decay are described. The emphasis is on the radiative neutron capture reaction, where much detailed work has been done. A short summary of the theory of the (n,γ) reaction is presented; it is demonstrated that this reaction is a sensitive probe of the wave functions for highly excited nuclear states. Various experimental techniques using reactor and accelerator-based neutron sources are presented. The experiments have shown that both resonant and non-resonant reactions can show single particle effects where the external part of configuration space is dominant. In the non-resonant case, hard sphere capture is important; on resonances, valence particle motion and the contributions of 1 and 3 quasi-particle doorway states make up a significant fraction of the radiative width

  5. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations

  6. Regularized Statistical Analysis of Anatomy

    DEFF Research Database (Denmark)

    Sjöstrand, Karl

    2007-01-01

    This thesis presents the application and development of regularized methods for the statistical analysis of anatomical structures. Focus is on structure-function relationships in the human brain, such as the connection between early onset of Alzheimer’s disease and shape changes of the corpus...... and mind. Statistics represents a quintessential part of such investigations as they are preluded by a clinical hypothesis that must be verified based on observed data. The massive amounts of image data produced in each examination pose an important and interesting statistical challenge...... efficient algorithms which make the analysis of large data sets feasible, and gives examples of applications....

  7. Failure analysis: Status and future trends

    International Nuclear Information System (INIS)

    Anderson, R.E.; Soden, J.M.; Henderson, C.L.

    1995-01-01

    Failure analysis is a critical element in the integrated circuit manufacturing industry. This paper reviews the changing role of failure analysis and describes major techniques employed in the industry today. Several advanced failure analysis techniques that meet the challenges imposed by advancements in integrated circuit technology are described and their applications are discussed. Future trends in failure analysis needed to keep pace with the continuing advancements in integrated circuit technology are anticipated

  8. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    Cooper, S.E.; Lofgren, E.V.; Samanta, P.K.; Wong Seemeng

    1993-01-01

    A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses to dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piecepart failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified with a different dependent failure definition which uses a component failure mechanism categorization scheme in this study. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk-significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)

  9. Signal analysis for failure detection

    International Nuclear Information System (INIS)

    Parpaglione, M.C.; Perez, L.V.; Rubio, D.A.; Czibener, D.; D'Attellis, C.E.; Brudny, P.I.; Ruzzante, J.E.

    1994-01-01

    Several methods for the analysis of acoustic emission signals are presented. They are mainly oriented to the detection of changes in noisy signals and the characterization of higher-amplitude discrete pulses or bursts. The aim was to relate changes and events with failure, crack or wear in materials, the final goal being to obtain automatic means of detecting such changes and/or events. Performance evaluation was made using both simulated and laboratory test signals. The methods presented are the following: 1. Application of the Hopfield Neural Network (NN) model for classifying faults in pipes and detecting wear of a bearing. 2. Application of the Kohonen and Back Propagation Neural Network models for the same problem. 3. Application of Kalman filtering to determine the time occurrence of bursts. 4. Application of a bank of Kalman filters (KF) for failure detection in pipes. 5. Study of the amplitude distribution of signals for detecting changes in their shape. 6. Application of the entropy distance to measure differences between signals. (author). 10 refs, 11 figs
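
    As a concrete illustration of items 3 and 4 above, the sketch below (my illustration, not the authors' code) flags burst onsets in a synthetic acoustic-emission trace by thresholding the normalized innovations of a scalar Kalman filter; the signal model, noise levels, and 4-sigma threshold are all assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic acoustic-emission record: background noise plus two bursts.
    n = 1000
    signal = rng.normal(0.0, 0.1, n)
    signal[300:320] += 1.5
    signal[700:715] += 2.0

    x, p = 0.0, 1.0        # state estimate (baseline level) and its variance
    q, r = 1e-5, 0.1 ** 2  # assumed process and measurement noise
    threshold = 4.0        # flag innovations beyond 4 sigma

    onsets = []
    for k, z in enumerate(signal):
        p = p + q                          # predict
        s = p + r                          # innovation variance
        innovation = z - x
        if abs(innovation) / np.sqrt(s) > threshold:
            onsets.append(k)               # burst candidate; skip the update
            continue
        gain = p / s                       # measurement update
        x = x + gain * innovation
        p = (1.0 - gain) * p

    print("first flagged samples:", onsets[:5])
    ```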

  10. Constructing Ontology for Knowledge Sharing of Materials Failure Analysis

    Directory of Open Access Journals (Sweden)

    Peng Shi

    2014-01-01

    Materials failure indicates a fault with materials or components during their performance. To avoid the recurrence of similar failures, materials failure analysis is executed to investigate the reasons for the failure and to propose improved strategies. The whole procedure needs sufficient domain knowledge and also produces valuable new knowledge. However, the information about the materials failure analysis is usually retained by the domain expert, and sharing it is technically difficult. This phenomenon may seriously reduce the efficiency and decrease the veracity of the failure analysis. To solve this problem, this paper adopts ontology, a novel technology from the Semantic Web, as a tool for knowledge representation and sharing and describes the construction of the ontology to obtain information concerning the failure analysis, application area, materials, and failure cases. The ontology-represented information is machine-understandable and can be easily shared through the Internet. At the same time, failure case intelligent retrieval, advanced statistics, and even automatic reasoning can be accomplished based on ontology-represented knowledge. Obviously this can promote the knowledge sharing of materials service safety and improve the efficiency of failure analysis. The case of a nuclear power plant area is presented to show the details and benefits of this method.
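
    A hedged sketch of the core idea: one failure case encoded as machine-readable triples with the Python rdflib package. The namespace, class names, and properties below are invented for illustration; they are not the paper's actual ontology.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF, RDFS

    FA = Namespace("http://example.org/failure-analysis#")  # hypothetical namespace
    g = Graph()
    g.bind("fa", FA)

    # Domain classes: material, failure mechanism, and failure case.
    for cls in ("Material", "FailureMechanism", "FailureCase"):
        g.add((FA[cls], RDF.type, RDFS.Class))

    # One machine-readable failure case, e.g. from a nuclear power plant area.
    g.add((FA.Case001, RDF.type, FA.FailureCase))
    g.add((FA.Case001, FA.involvesMaterial, FA.StainlessSteel304))
    g.add((FA.StainlessSteel304, RDF.type, FA.Material))
    g.add((FA.Case001, FA.hasMechanism, FA.StressCorrosionCracking))
    g.add((FA.StressCorrosionCracking, RDF.type, FA.FailureMechanism))
    g.add((FA.Case001, RDFS.comment, Literal("Cracking at the weld HAZ of a steam pipe.")))

    # Retrieval across shared cases: all cases with a given failure mechanism.
    query = "SELECT ?case WHERE { ?case fa:hasMechanism fa:StressCorrosionCracking . }"
    for row in g.query(query, initNs={"fa": FA}):
        print(row.case)
    ```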

  11. Establishment, maintenance and application of failure statistics as a basis for availability optimization

    International Nuclear Information System (INIS)

    Poll, H.

    1989-01-01

    The purpose of failure statistics is to obtain hints on weak points due to operation and design. The present failure statistics of Rheinisch-Westfaelisches Elektrizitaetswerk (RWE) are based on events that reduce the availability of power station units. If damage or trouble occurs with a unit, data are recorded in order to calculate the unavailability and to describe the occurrence, the extent, and the removal of the damage. Following a survey of the most important data, a short explanation is given of the updating of the failure statistics, and some problems of this job are mentioned. Finally some examples are given of how failure statistics can be used for analyses. (orig.) [de

  12. Two-Sample Statistics for Testing the Equality of Survival Functions Against Improper Semi-parametric Accelerated Failure Time Alternatives: An Application to the Analysis of a Breast Cancer Clinical Trial

    Science.gov (United States)

    Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry

    2010-01-01

    This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests. PMID:15293627

  13. Two-sample statistics for testing the equality of survival functions against improper semi-parametric accelerated failure time alternatives: an application to the analysis of a breast cancer clinical trial.

    Science.gov (United States)

    Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry

    2004-06-01

    This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests.
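
    For orientation, the sketch below computes the classical two-sample log-rank statistic on synthetic data; it is the simplest member of the family discussed in these two records, a partial likelihood score statistic from a Cox model. The paper's short- and long-term tests reweight the same observed-minus-expected terms, and nothing below reproduces the authors' exact statistics.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic two-arm trial: exponential failure times, independent censoring.
    t0, t1 = rng.exponential(10, 80), rng.exponential(14, 80)
    times = np.concatenate([t0, t1])
    group = np.concatenate([np.zeros(80), np.ones(80)])
    cens = rng.exponential(25, 160)
    events = (times <= cens).astype(int)   # 1 = failure observed
    times = np.minimum(times, cens)

    score, var = 0.0, 0.0
    for t in np.unique(times[events == 1]):
        at_risk = times >= t
        d = int(np.sum((times == t) & (events == 1)))             # failures at t
        d1 = int(np.sum((times == t) & (events == 1) & (group == 1)))
        n, n1 = int(at_risk.sum()), int(at_risk[group == 1].sum())
        score += d1 - d * n1 / n                                   # observed - expected
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)

    print(f"log-rank chi-square = {score**2 / var:.2f}")
    ```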

  14. Bayesian Inference in Statistical Analysis

    CERN Document Server

    Box, George E P

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Rob

  15. Analysis of failures in concrete containments

    International Nuclear Information System (INIS)

    Moreno-Gonzalez, A.

    1989-09-01

    The function of the Containment, in an accident event, is to avoid the release of radioactive substances into the surroundings. Containment failure, therefore, is defined as the appearance of leak paths to the external environment. These leak paths may appear either as a result of loss of leaktightness due to degradation of design conditions or as a result of structural failure with containment material break. This document is a survey of the state of the art of Containment Failure Analysis. It gives a detailed description of all failure mechanisms, indicating all the possible failure modes and their causes, ranging from failure resulting from degradation of the materials to structural failure and liner break failure. Following the description of failure modes, possible failure criteria are identified, with special emphasis on structural failure criteria. These criteria have been obtained not only from existing codes but also from the latest experimental results. A chapter has been dedicated exclusively to failure criteria in conventional structures, for the purpose of evaluating the possibility of application to the case of containment. As the structural behaviour of the containment building is very complex, it is not possible to define failure through a single parameter. It is therefore advisable to define a methodology for containment failure analysis which could be applied to a particular containment. This methodology should include prevailing load and material conditions together with the behaviour of complex conditions such as the liner-anchorage-cracked concrete interaction

  16. Failure analysis of buried tanks

    International Nuclear Information System (INIS)

    Watkins, R.K.

    1994-01-01

    Failure of a buried tank can be hazardous. Failure may be a leak through which product is lost from the tank, but also through which contamination can occur. Failures are epidemic -- because buried tanks are out of sight, but also because designers of buried tanks have adopted analyses developed for pressure tanks. So why do pressure tanks fail when they are buried? Most failures of buried tanks are really soil failures. Soil compresses, or slips, or liquefies. Soil is not only a load, it is a support without which the tank deforms. A high water table adds to the load on the tank. It also reduces the strength of the soil. Based on tests, structural analyses are proposed for empty tanks buried in soils of various quality, with the water table at various levels, and with internal vacuum. Failure may be collapse of the tank. Such a collapse is a sudden, audible inversion of the cylinder when the sidefill soil slips. Failure may be flotation. Failure may be a leak. Most leaks are fractures in the welds in overlap seams at flat spots. Flat spots are caused by a hard bedding or a heavy surface wheel load. Because the tank wall is double thick at the overlap, shearing stress in the weld is increased. Other weld failures occur when an end plate shears down past a cylinder, or when the tank is supported only at its ends like a beam. These, and other, failures can be analyzed with justifiable accuracy using basic principles of mechanics of materials. 10 figs

  17. The statistical analysis of anisotropies

    International Nuclear Information System (INIS)

    Webster, A.

    1977-01-01

    One of the many uses to which a radio survey may be put is an analysis of the distribution of the radio sources on the celestial sphere to find out whether they are bunched into clusters or lie in preferred regions of space. There are many methods of testing for clustering in point processes and, since they are not all equally good, this contribution is presented as a brief guide to what seems to be the best of them. The radio sources certainly do not show very strong clustering and may well be entirely unclustered, so if a statistical method is to be useful it must be both powerful and flexible. A statistic is powerful in this context if it can efficiently distinguish a weakly clustered distribution of sources from an unclustered one, and it is flexible if it can be applied in a way which avoids mistaking defects in the survey for true peculiarities in the distribution of sources. The paper divides clustering statistics into two classes: number density statistics and log N/log S statistics. (Auth.)

  18. Statistical analysis of environmental data

    International Nuclear Information System (INIS)

    Beauchamp, J.J.; Bowman, K.O.; Miller, F.L. Jr.

    1975-10-01

    This report summarizes the analyses of data obtained by the Radiological Hygiene Branch of the Tennessee Valley Authority from samples taken around the Browns Ferry Nuclear Plant located in Northern Alabama. The data collection was begun in 1968, and a wide variety of types of samples have been gathered on a regular basis. The statistical analysis of environmental data involving very low levels of radioactivity is discussed. Applications of computer calculations for data processing are described

  19. Failure analysis of boiler tube

    International Nuclear Information System (INIS)

    Mehmood, K.; Siddiqui, A.R.

    2007-01-01

    Boiler tubes are energy conversion components where heat energy is used to convert water into high-pressure superheated steam, which is then delivered to a turbine for electric power generation in thermal power plants or to run plant and machinery in a process or manufacturing industry. It was reported that one of the tubes of a fire-tube boiler used in a local industry had leakage after the formation of pits at the external surface of the tube. The inner side of the fire tube carried hot flue gases at a pressure of 10 kg/cm² and a temperature of 225 °C. The outside of the tube was surrounded by feed water. The purpose of this study was to determine the cause of the pits that developed at the external surface of the failed boiler tube sample. In the present work boiler tube samples of steel grade ASTM A161/ASTM A192 were analyzed using metallographic analysis, chemical analysis, and mechanical testing. It was concluded that the appearance of defects on the boiler tube sample indicates a cavitation-type corrosion failure. Cavitation damage superficially resembles pitting, but the surface appears considerably rougher and has many closely spaced pits. (author)

  20. VDEW statistic of failures and damage 1972. VDEW Stoerungs- und Schadensstatistik 1972

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-01

    Results of the VDEW's statistics on failures and damage concerning the high-voltage network of the FRG and West Berlin in 1972 are presented. The tables, columns, and standard charts published in this brochure were elaborated by the VDEW working group 'Failures and damage statistics' under the leadership of Dipl.-Ing. H. Reisner.

  1. The interaction of NDE and failure analysis

    International Nuclear Information System (INIS)

    Nichols, R.W.

    1988-01-01

    This paper deals with the use of Non-Destructive Examination (NDE) and failure analysis for the assessment of structural integrity. It appears that failure analysis makes it possible to know whether NDE is required or not, and can help to direct NDE into the most useful directions by identifying the areas where it is most important that defects are absent. It also appears that failure analysis can help the operator to decide which NDE method is best suited to the component studied and provides detailed specifications for this NDE method. The interaction between failure analysis and NDE is then described. (TEC)

  2. The interaction of NDE and failure analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, R W

    1988-12-31

    This paper deals with the use of Non-Destructive Examination (NDE) and failure analysis for the assessment of structural integrity. It appears that failure analysis makes it possible to know whether NDE is required or not, and can help to direct NDE into the most useful directions by identifying the areas where it is most important that defects are absent. It also appears that failure analysis can help the operator to decide which NDE method is best suited to the component studied and provides detailed specifications for this NDE method. The interaction between failure analysis and NDE is then described. (TEC).

  3. Data needs for common cause failure analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Paula, H.M.; Rasmuson, D.; Whitehead, D.

    1990-01-01

    The procedures guide for common cause failure analysis published jointly by USNRC and EPRI requires a detailed historical event analysis. Recent work on the further development of the cause-defense picture of common cause failures introduced in that guide identified the information that is necessary to perform the detailed analysis in an objective manner. This paper summarizes these information needs

  4. Statistical considerations on safety analysis

    International Nuclear Information System (INIS)

    Pal, L.; Makai, M.

    2004-01-01

    The authors have investigated the statistical methods applied to the safety analysis of nuclear reactors and arrived at alarming conclusions. A series of calculations with the generally appreciated safety code ATHLET was carried out to ascertain the stability of the results against input uncertainties in a simple experimental situation. Scrutinizing those calculations, we came to the conclusion that the ATHLET results may exhibit chaotic behavior. A further conclusion is that the technological limits are incorrectly set when the output variables are correlated. Another formerly unnoticed conclusion of the previous ATHLET calculations is that certain innocent-looking parameters (like the wall roughness factor, the number of bubbles per unit volume, or the number of droplets per unit volume) can considerably influence such output parameters as water levels. The authors are concerned with the statistical foundation of present-day safety analysis practices and can only hope that their own misjudgment will be dispelled. Until then, the authors suggest applying correct statistical methods in safety analysis even if it makes the analysis more expensive. It would be desirable to continue exploring the role of internal parameters (wall roughness factor, steam-water surface in thermal-hydraulics codes, homogenization methods in neutronics codes) in system safety codes and to study their effects on the analysis. In the validation and verification process of a code one carries out a series of computations. The input data are not precisely determined, because measured data have errors and calculated data are often obtained from a more or less accurate model. Some users of large codes are content with comparing the nominal output obtained from the nominal input, whereas all the possible inputs should be taken into account when judging safety. At the same time, any statement concerning safety must be aleatory, and its merit can be judged only when the probability is known with which the...
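
    One standard example of the statistically correct treatment the authors call for is Wilks' nonparametric tolerance-limit formula for sizing uncertainty run sets; the sketch below (my illustration, not from the record) computes the minimum number of code runs for a one-sided, first-order 95/95 statement.

    ```python
    import math

    def wilks_n(coverage=0.95, confidence=0.95):
        """Smallest n such that the largest of n runs bounds the `coverage`
        quantile of the output with probability `confidence` (first order,
        one sided): solve 1 - coverage**n >= confidence for n."""
        return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

    print(wilks_n())            # 59 code runs for a 95%/95% statement
    print(wilks_n(0.95, 0.99))  # 90 runs for 95% coverage at 99% confidence
    ```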

  5. Probabilistic analysis of "common mode failures"

    International Nuclear Information System (INIS)

    Easterling, R.G.

    1978-01-01

    Common mode failure is a topic of considerable interest in reliability and safety analyses of nuclear reactors. Common mode failures are often discussed in terms of examples: two systems fail simultaneously due to an external event such as an earthquake; two components in redundant channels fail because of a common manufacturing defect; two systems fail because a component common to both fails; the failure of one system increases the stress on other systems and they fail. The common thread running through these is a dependence of some sort--statistical or physical--among multiple failure events. However, the nature of the dependence is not the same in all these examples. An attempt is made to model situations, such as the above examples, which have been termed "common mode failures." In doing so, it is found that standard probability concepts and terms, such as statistically dependent and independent events, and conditional and unconditional probabilities, suffice. Thus, it is proposed that the term "common mode failures" be dropped, at least from technical discussions of these problems. A corollary is that the complementary term, "random failures," should also be dropped. The mathematical model presented may not cover all situations which have been termed "common mode failures," but provides insight into the difficulty of obtaining estimates of the probabilities of these events

  6. Statistical analysis of JET disruptions

    International Nuclear Information System (INIS)

    Tanga, A.; Johnson, M.F.

    1991-07-01

    In the operation of JET, and of any tokamak, many discharges are terminated by a major disruption. The disruptive termination of a discharge is usually an unwanted event which may cause damage to the structure of the vessel. In a reactor, disruptions are potentially a very serious problem, hence the importance of studying them and devising methods to avoid them. Statistical information has been collected about the disruptions which have occurred at JET over a long span of operations. The analysis is focused on the operational aspects of the disruptions rather than on the underlying physics. (Author)

  7. Statistical Analysis of Protein Ensembles

    Science.gov (United States)

    Máté, Gabriell; Heermann, Dieter

    2014-04-01

    As 3D protein-configuration data is piling up, there is an ever-increasing need for well-defined, mathematically rigorous analysis approaches, especially since the vast majority of the currently available methods rely heavily on heuristics. We propose an analysis framework which stems from topology, the field of mathematics which studies properties preserved under continuous deformations. First, we calculate a barcode representation of the molecules employing computational topology algorithms. Bars in this barcode represent different topological features. Molecules are compared through their barcodes by statistically determining the difference in the set of their topological features. As a proof-of-principle application, we analyze a dataset compiled of ensembles of different proteins, obtained from the Ensemble Protein Database. We demonstrate that our approach correctly detects the different protein groupings.
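
    A minimal sketch of the barcode pipeline described above, assuming the third-party packages ripser and persim; the random point clouds merely stand in for atomic coordinates, and the choice of H1 (loop) features and the bottleneck distance are illustrative, not the authors' exact statistical comparison.

    ```python
    import numpy as np
    from ripser import ripser       # Vietoris-Rips persistence
    from persim import bottleneck   # distance between persistence diagrams

    rng = np.random.default_rng(2)
    conf_a = rng.normal(size=(60, 3))                # stand-in "configuration A"
    conf_b = conf_a + rng.normal(0.0, 0.3, (60, 3))  # perturbed configuration B

    # Barcodes (as persistence diagrams); dgms[1] holds the loop features.
    dgm_a = ripser(conf_a)["dgms"][1]
    dgm_b = ripser(conf_b)["dgms"][1]

    # Difference between the two sets of topological features; in the paper's
    # spirit, a permutation test over an ensemble would assess significance.
    print("bottleneck distance:", bottleneck(dgm_a, dgm_b))
    ```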

  8. Statistical evaluations concerning the failure behaviour of formed parts with superheated steam flow. Pt. 1

    International Nuclear Information System (INIS)

    Oude-Hengel, H.H.; Vorwerk, K.; Heuser, F.W.; Boesebeck, K.

    1976-01-01

    Statistical evaluations concerning the failure behaviour of formed parts with superheated-steam flow were carried out using data from VdTUEV inventory and failure statistics. Due to the great number of results, the findings will be published in two volumes. This first part will describe and classify the stock of data and will make preliminary quantitative statements on failure behaviour. More differentiated statements are made possible by including the operation time and the number of start-ups per failed part. On the basis of time-constant failure rates some materials-specific statements are given. (orig./ORU) [de
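
    As a hedged numerical illustration of the kind of "time-constant failure rate" statement such statistics support, the sketch below computes a Poisson point estimate and a chi-square confidence interval; the failure count and cumulative exposure are invented, not taken from the VdTUEV data.

    ```python
    from scipy.stats import chi2

    failures = 14    # failed formed parts observed (invented)
    hours = 2.6e6    # cumulative operating hours of the population (invented)

    lam = failures / hours  # maximum-likelihood constant failure rate
    lo = chi2.ppf(0.05, 2 * failures) / (2 * hours)       # 90% two-sided interval
    hi = chi2.ppf(0.95, 2 * failures + 2) / (2 * hours)
    print(f"lambda = {lam:.2e}/h, 90% CI [{lo:.2e}, {hi:.2e}]")
    ```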

  9. Statistical data analysis using SAS intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  10. Failure analysis and failure prevention in electric power systems

    International Nuclear Information System (INIS)

    Rau, C.A. Jr.; Becker, D.G.; Besuner, P.M.; Cipolla, R.C.; Egan, G.R.; Gupta, P.; Johnson, D.P.; Omry, U.; Tetelman, A.S.; Rettig, T.W.; Peters, D.C.

    1977-01-01

    New methods have been developed and applied to better quantify and increase the reliability, safety, and availability of electric power plants. Present and potential problem areas have been identified both by development of an improved computerized data base of malfunctions in nuclear power plants and by detailed metallurgical and mechanical failure analyses of selected problems. Significant advances in the accuracy and speed of structural analyses have been made through development and application of the boundary integral equation and influence function methods of stress and fracture mechanics analyses. The currently specified flaw evaluation procedures of the ASME Boiler and Pressure Vessel Code have been computerized. Results obtained from these procedures for evaluation of specific in-service inspection indications have been compared with results obtained utilizing the improved analytical methods. Mathematical methods have also been developed to describe and analyze the statistical variations in materials properties and in component loading, and uncertainties in the flaw size that might be passed by quality assurance systems. These new methods have been combined to develop accurate failure rate predictions based upon probabilistic fracture mechanics. Improved failure prevention strategies have been formulated by combining probabilistic fracture mechanics and cost optimization techniques. The approach has been demonstrated by optimizing the nondestructive inspection level with regard to both reliability and cost. (Auth.)
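
    The probabilistic fracture mechanics idea lends itself to a short Monte Carlo sketch (not the authors' method or code): sample flaw size and toughness from assumed scatter, apply K_I = Y·σ·√(πa), and count exceedances of the toughness. All distributions and constants below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    a = rng.lognormal(np.log(5e-3), 0.8, n)  # flaw depth [m], scattered
    kic = rng.normal(80.0, 10.0, n)          # fracture toughness [MPa*sqrt(m)]
    stress, Y = 300.0, 1.12                  # applied stress [MPa], geometry factor

    k_applied = Y * stress * np.sqrt(np.pi * a)
    p_fail = np.mean(k_applied > kic)        # fraction of sampled flaws that fail
    print(f"estimated failure probability: {p_fail:.3e}")
    ```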

  11. Corrosion induced failure analysis of subsea pipelines

    International Nuclear Information System (INIS)

    Yang, Yongsheng; Khan, Faisal; Thodi, Premkumar; Abbassi, Rouzbeh

    2017-01-01

    Pipeline corrosion is one of the main causes of subsea pipeline failure. It is necessary to monitor and analyze pipeline condition to effectively predict likely failure. This paper presents an approach to analyze the observed abnormal events to assess the condition of subsea pipelines. First, it focuses on establishing a systematic corrosion failure model by Bow-Tie (BT) analysis, and subsequently the BT model is mapped into a Bayesian Network (BN) model. The BN model facilitates the modelling of the interdependency of identified corrosion causes, as well as the updating of failure probabilities upon the arrival of new information. Furthermore, an Object-Oriented Bayesian Network (OOBN) has been developed to better structure the network and to provide an efficient updating algorithm. Based on this OOBN model, probability updating and probability adaptation are performed at regular intervals to estimate the failure probabilities due to corrosion and the potential consequences. This results in an interval-based condition assessment of subsea pipelines subjected to corrosion. The estimated failure probabilities would help prioritize actions to prevent and control failures. Practical application of the developed model is demonstrated using a case study. - Highlights: • A Bow-Tie (BT) based corrosion failure model linking causation with the potential losses. • A novel Object-Oriented Bayesian Network (OOBN) based corrosion failure risk model. • Probability of failure updating and adaptation with respect to time using OOBN model. • Application of the proposed model to develop and test strategies to minimize failure risk.
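
    A minimal sketch of the Bow-Tie-to-Bayesian-Network mapping, assuming the Python pgmpy package; the structure is a toy two-cause chain and every probability is invented, so this illustrates probability updating on new evidence, not the paper's OOBN model.

    ```python
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # States: 0 = ok, 1 = failed/present. Structure and numbers are invented.
    model = BayesianNetwork([("CoatingDamage", "Corrosion"),
                             ("CPFailure", "Corrosion"),
                             ("Corrosion", "PipeFailure")])

    cpd_cd = TabularCPD("CoatingDamage", 2, [[0.90], [0.10]])
    cpd_cp = TabularCPD("CPFailure", 2, [[0.95], [0.05]])
    cpd_c = TabularCPD("Corrosion", 2,
                       [[0.99, 0.80, 0.70, 0.20],   # P(Corrosion = 0 | parents)
                        [0.01, 0.20, 0.30, 0.80]],
                       evidence=["CoatingDamage", "CPFailure"], evidence_card=[2, 2])
    cpd_f = TabularCPD("PipeFailure", 2, [[0.999, 0.90], [0.001, 0.10]],
                       evidence=["Corrosion"], evidence_card=[2])

    model.add_cpds(cpd_cd, cpd_cp, cpd_c, cpd_f)
    assert model.check_model()

    # Updating on new inspection evidence: cathodic protection found failed.
    infer = VariableElimination(model)
    print(infer.query(["PipeFailure"], evidence={"CPFailure": 1}))
    ```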

  12. Imaging mass spectrometry statistical analysis.

    Science.gov (United States)

    Jones, Emrys A; Deininger, Sören-Oliver; Hogendoorn, Pancras C W; Deelder, André M; McDonnell, Liam A

    2012-08-30

    Imaging mass spectrometry is increasingly used to identify new candidate biomarkers. This clinical application of imaging mass spectrometry is highly multidisciplinary: expertise in mass spectrometry is necessary to acquire high quality data, histology is required to accurately label the origin of each pixel's mass spectrum, disease biology is necessary to understand the potential meaning of the imaging mass spectrometry results, and statistics to assess the confidence of any findings. Imaging mass spectrometry data analysis is further complicated because of the unique nature of the data (within the mass spectrometry field); several of the assumptions implicit in the analysis of LC-MS/profiling datasets are not applicable to imaging. The very large size of imaging datasets and the reporting of many data analysis routines, combined with inadequate training and a lack of accessible reviews, have exacerbated this problem. In this paper we provide an accessible review of the nature of imaging data and the different strategies by which the data may be analyzed. Particular attention is paid to the assumptions of the data analysis routines to ensure that the reader is apprised of their correct usage in imaging mass spectrometry research. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Parametric statistical change point analysis

    CERN Document Server

    Chen, Jie

    2000-01-01

    This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts, and different methodologies are used, namely the likelihood ratio criterion and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study

  14. Common cause failure analysis methodology for complex systems

    International Nuclear Information System (INIS)

    Wagner, D.P.; Cate, C.L.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complex system reliability analysis. This paper extends existing methods of computer aided common cause failure analysis by allowing analysis of the complex systems often encountered in practice. The methods presented here aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  15. Preliminary failure mode and effect analysis

    International Nuclear Information System (INIS)

    Addison, J.V.

    1972-01-01

    A preliminary Failure Mode and Effect Analysis (FMEA) was made on the overall 5 kWe system. A general discussion of the system and failure effects is given in addition to the tabulated FMEA and a primary block diagram of the system. (U.S.)

  16. PV System Component Fault and Failure Compilation and Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Geoffrey Taylor; Lavrova, Olga; Gooding, Renee Lynne

    2018-02-01

    This report describes data collection and analysis of solar photovoltaic (PV) equipment events, which consist of faults and failures that occur during the normal operation of a distributed PV system or PV power plant. We present summary statistics from locations where maintenance data is being collected at various intervals, as well as reliability statistics gathered from that data, consisting of fault/failure distributions and repair distributions for a wide range of PV equipment types.

  17. Light water reactor lower head failure analysis

    International Nuclear Information System (INIS)

    Rempe, J.L.; Chavez, S.A.; Thinnes, G.L.

    1993-10-01

    This document presents the results from a US Nuclear Regulatory Commission-sponsored research program to investigate the mode and timing of vessel lower head failure. Major objectives of the analysis were to identify plausible failure mechanisms and to develop a method for determining which failure mode would occur first in different light water reactor designs and accident conditions. Failure mechanisms, such as tube ejection, tube rupture, global vessel failure, and localized vessel creep rupture, were studied. Newly developed models and existing models were applied to predict which failure mechanism would occur first in various severe accident scenarios. So that a broader range of conditions could be considered simultaneously, calculations relied heavily on models with closed-form or simplified numerical solution techniques. Finite element techniques were employed for analytical model verification and examining more detailed phenomena. High-temperature creep and tensile data were obtained for predicting vessel and penetration structural response

  18. Light water reactor lower head failure analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rempe, J.L.; Chavez, S.A.; Thinnes, G.L. [EG and G Idaho, Inc., Idaho Falls, ID (United States)] [and others]

    1993-10-01

    This document presents the results from a US Nuclear Regulatory Commission-sponsored research program to investigate the mode and timing of vessel lower head failure. Major objectives of the analysis were to identify plausible failure mechanisms and to develop a method for determining which failure mode would occur first in different light water reactor designs and accident conditions. Failure mechanisms, such as tube ejection, tube rupture, global vessel failure, and localized vessel creep rupture, were studied. Newly developed models and existing models were applied to predict which failure mechanism would occur first in various severe accident scenarios. So that a broader range of conditions could be considered simultaneously, calculations relied heavily on models with closed-form or simplified numerical solution techniques. Finite element techniques were employed for analytical model verification and examining more detailed phenomena. High-temperature creep and tensile data were obtained for predicting vessel and penetration structural response.

  19. Failure and damage analysis of advanced materials

    CERN Document Server

    Sadowski, Tomasz

    2015-01-01

    The papers in this volume present basic concepts and new developments in failure and damage analysis with a focus on advanced materials such as composites, laminates, sandwiches and foams, and also new metallic materials. Starting from some mathematical foundations (limit surfaces, symmetry considerations, invariants), new experimental results and their analysis are shown. Finally, new concepts for failure prediction and analysis are introduced and discussed, as well as new methods of failure and damage prediction for advanced metallic and non-metallic materials. Based on experimental results, the traditional methods are revised.

  20. Failure analysis of superconducting bearings

    Energy Technology Data Exchange (ETDEWEB)

    Rastogi, Amit; Campbell, A M; Coombs, T A [Department of Engineering, University of Cambridge, Cambridge CB2 1PZ (United Kingdom)

    2006-06-01

    The dynamics of superconductor bearings in a cryogenic failure scenario have been analyzed. As the superconductor warms up, the rotor goes through multiple resonance frequencies, begins to slow down and finally touches down when the superconductor goes through its transition temperature. The bearing can be modelled as a system of springs with axial, radial and cross stiffness. These springs go through various resonant modes as the temperature of the superconductor begins to rise. We have presented possible explanations for such behaviour.

  1. Analysis of dependent failures in risk assessment and reliability evaluation

    International Nuclear Information System (INIS)

    Fleming, K.N.; Mosleh, A.; Kelley, A.P. Jr. (Gas-Cooled Reactors Associates, La Jolla, CA)

    1983-01-01

    The ability to estimate the risk of potential reactor accidents is largely determined by the ability to analyze statistically dependent multiple failures. The importance of dependent failures has been indicated in recent probabilistic risk assessment (PRA) studies as well as in reports of reactor operating experiences. This article highlights the importance of several different types of dependent failures from the perspective of the risk and reliability analyst and provides references to the methods and data available for their analysis. In addition to describing the current state of the art, some recent advances, pitfalls, misconceptions, and limitations of some approaches to dependent failure analysis are addressed. A summary is included of the discourse on this subject, which is presented in the Institute of Electrical and Electronics Engineers/American Nuclear Society PRA Procedures Guide
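
    A hedged numerical illustration of why dependent failures dominate the risk of redundant systems: the widely used beta-factor common cause model applied to a one-out-of-two standby system. The rate, beta value, and mission time below are invented.

    ```python
    lam = 1e-5    # total failure rate of one train [1/h] (invented)
    beta = 0.05   # fraction of failures that strike both trains together
    t = 720.0     # mission time [h]

    q = lam * t                       # single-train failure probability (rare-event approx.)
    q_indep = ((1 - beta) * q) ** 2   # both trains fail independently
    q_ccf = beta * q                  # both fail from one shared cause
    print(f"independent only: {q_indep:.2e}; with common cause: {q_indep + q_ccf:.2e}")
    ```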

  2. Prediction of failure enthalpy and reliability of irradiated fuel rod under reactivity-initiated accidents by means of statistical approach

    International Nuclear Information System (INIS)

    Nam, Cheol; Choi, Byeong Kwon; Jeong, Yong Hwan; Jung, Youn Ho

    2001-01-01

    During the last decade, the failure behavior of high-burnup fuel rods under RIA has been an extensive concern since observations of fuel rod failures at low enthalpy. Great importance is placed on the failure prediction of fuel rods from the standpoint of licensing criteria and safety in extending burnup achievement. To address the issue, a statistics-based methodology is introduced to predict the failure probability of irradiated fuel rods. Based on RIA simulation results in the literature, a failure enthalpy correlation for irradiated fuel rods is constructed as a function of oxide thickness, fuel burnup, and pulse width. From the failure enthalpy correlation, a single damage parameter, the equivalent enthalpy, is defined to reflect the effects of the three primary factors as well as peak fuel enthalpy. Moreover, the failure distribution function with equivalent enthalpy is derived by applying a two-parameter Weibull statistical model. Using these equations, a sensitivity analysis is carried out to estimate the effects of burnup, corrosion, peak fuel enthalpy, pulse width, and the cladding materials used
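
    The two-parameter Weibull failure model described above reduces to a one-line formula; the sketch below evaluates it for a few equivalent-enthalpy values. The scale and shape parameters are invented placeholders, since the paper's fitted constants are not reproduced in this record.

    ```python
    import math

    def failure_probability(h_eq, eta=150.0, beta=4.0):
        """Two-parameter Weibull failure distribution over the equivalent
        enthalpy h_eq [cal/g]: F = 1 - exp(-(h_eq/eta)**beta). The scale
        eta and shape beta are invented placeholders."""
        return 1.0 - math.exp(-((h_eq / eta) ** beta))

    for h in (60, 100, 140, 180):
        print(f"h_eq = {h:3d} cal/g -> P(failure) = {failure_probability(h):.3f}")
    ```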

  3. Statistically qualified neuro-analytic failure detection method and system

    Science.gov (United States)

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve the development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic model modification of the deterministic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
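
    The validation step mentioned above, a sequential probability ratio test on model residuals, can be sketched in a few lines; the Gaussian mean-shift form, noise levels, and error rates below are assumptions for illustration, not the patented implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    sigma, mu1 = 1.0, 1.5        # residual std under H0; assumed shift under H1
    alpha, beta = 0.01, 0.01     # target false-alarm and missed-alarm rates
    A = np.log((1 - beta) / alpha)   # upper (fault) decision boundary
    B = np.log(beta / (1 - alpha))   # lower (healthy) decision boundary

    residuals = np.concatenate([rng.normal(0.0, sigma, 200),    # healthy phase
                                rng.normal(mu1, sigma, 200)])   # degraded phase

    llr, alarm = 0.0, None
    for k, r in enumerate(residuals):
        llr += (mu1 / sigma**2) * (r - mu1 / 2.0)  # Gaussian mean-shift LLR term
        if llr >= A:
            alarm = k                              # enough evidence for a fault
            break
        if llr <= B:
            llr = 0.0                              # accept "healthy", restart test

    print("fault declared at sample:", alarm)
    ```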

  4. Statistical analysis and data management

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This report provides an overview of the history of the WIPP Biology Program. The recommendations of the American Institute of Biological Sciences (AIBS) for the WIPP biology program are summarized. The data sets available for statistical analyses and problems associated with these data sets are also summarized. Biological studies base maps are presented. A statistical model is presented to evaluate any correlation between climatological data and small mammal captures. No statistically significant relationship between the variance in small mammal captures on Dr. Gennaro's 90m x 90m grid and precipitation records from the Duval Potash Mine was found

  5. Statistical analysis of management data

    CERN Document Server

    Gatignon, Hubert

    2013-01-01

    This book offers a comprehensive approach to multivariate statistical analyses. It provides theoretical knowledge of the concepts underlying the most important multivariate techniques and an overview of actual applications.

  6. Failure analysis for WWER-fuel elements

    International Nuclear Information System (INIS)

    Boehmert, J.; Huettig, W.

    1986-10-01

    If the fuel defect rate proves significantly high, failure analysis has to be performed in order to trace down the defect causes, to implement corrective actions, and to take measures for failure prevention. Such analyses are laborious and very skill-demanding technical tasks, which require excellently developed examination methods and devices and a rich stock of experience in evaluating features of damage. For this reason the present work specifies the procedure of failure analyses in detail. Moreover, prerequisites and experimental equipment for the investigation of WWER-type fuel elements are described. (author)

  7. Failure analysis of medical Linac (LMR-15)

    International Nuclear Information System (INIS)

    Kato, Kiyotaka; Nakamura, Katsumi; Ogihara, Kiyoshi; Takahashi, Katsuhiko; Sato, Kazuhisa.

    1994-01-01

    In August 1978, a Linac (LMR-15, Z4 Toshiba) was installed at our hospital, and it was in use for 12 years, up to September 1990. Recently, we compiled working and failure records on this apparatus covering the 12-year period, for the purpose of analyzing them on the basis of reliability engineering. The results revealed an operation rate of 97.85% on average, a mean time between failures (MTBF) rising from 40-70 hours at the beginning of its working life to 280 hours in the two years before renewal, and practically satisfactory values of mean life for life-limited parts such as the magnetron, thyratron and electron gun; the above respective values proved to be above those reported in other literature. On the other hand, we classified the contents of the failures in the apparatus by the system in which they occurred and determined the number of failures and the temperatures and humidities at the times of failure, to examine the correlation between the working environment and failure. The results indicated that changes in humidity governed failures in the dosimetric system, especially the monitoring chamber, and we could back up the strength of the above correlation with a correlation coefficient of 0.84. (author)

  8. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    The optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based analysis method for the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the reduction of the application failure probability achieved by different backup strategies can be compared, so that the different requirements of different clients can be satisfied according to their respective application failure probabilities. In an optical grid, when an application described by a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the failure probability requirement and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
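
    A hedged sketch of the task-based view: if an application is a set of tasks that must all succeed, its failure probability follows from the per-task probabilities, and a backup strategy that replicates a task lowers that task's effective failure probability. All task names and numbers below are invented.

    ```python
    # Per-task failure probabilities for a toy three-task application (invented).
    task_fail = {"stage_in": 0.010, "compute": 0.030, "stage_out": 0.010}

    def app_failure(p_tasks):
        """Application fails if any task fails (tasks assumed independent)."""
        ok = 1.0
        for p in p_tasks.values():
            ok *= 1.0 - p
        return 1.0 - ok

    print(f"no backup:          {app_failure(task_fail):.4f}")

    # Backup strategy: replicate "compute" on a second resource; the task
    # then fails only if both replicas fail.
    replicated = dict(task_fail, compute=task_fail["compute"] ** 2)
    print(f"compute replicated: {app_failure(replicated):.4f}")
    ```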

  9. Statistics Analysis Measures Painting of Cooling Tower

    Directory of Open Access Journals (Sweden)

    A. Zacharopoulou

    2013-01-01

    This study refers to the cooling tower of Megalopolis (constructed in 1975) and its protection from a corrosive environment. The maintenance of the cooling tower took place in 2008. The cooling tower was badly damaged by corrosion of the reinforcement. Parabolic cooling towers (at electrical power plants) are a typical example of construction exposed to an especially aggressive environment. The protection of cooling towers is usually achieved through organic coatings. Because of the different environmental impacts on the internal and external sides of the cooling tower, different paint application systems are required. The present study refers to the damage caused by the corrosion process. The corrosive environments, the application of the painting, the quality control process, the measurements and statistical analysis, and the results are discussed in this study. In the quality control process the following measurements were taken into consideration: (1) examination of the adhesion with the cross-cut test, (2) examination of the film thickness, and (3) control of the pull-off resistance for concrete substrates and paintings. Finally, this study refers to the correlations of the measurements, the analysis of failures in relation to the quality of repair, and the rehabilitation of the cooling tower. This study also made a first attempt to apply specific corrosion inhibitors in such a large structure.

  10. Failure analysis on a ruptured petrochemical pipe

    Energy Technology Data Exchange (ETDEWEB)

    Harun, Mohd [Industrial Technology Division, Malaysian Nuclear Agency, Ministry of Science, Technology and Innovation Malaysia, Bangi, Kajang, Selangor (Malaysia); Shamsudin, Shaiful Rizam; Kamardin, A. [Univ. Malaysia Perlis, Jejawi, Arau (Malaysia). School of Materials Engineering

    2010-08-15

    The failure took place on a welded elbow pipe which exhibited a catastrophic transverse rupture. The failure was located in the HAZ region of the weld, parallel to the welding path. Branching cracks were detected at the edge of the rupture area. Deposits of corrosion products were also spotted. The optical microscope analysis showed the presence of transgranular failures which were related to stress corrosion cracking (SCC) and were predominantly caused by the welding residual stress. The significant difference in hardness between the welded area and the pipe confirmed the findings. Moreover, the failure was also caused by the low Mo content of the stainless steel pipe, which was detected by means of a spark emission spectrometer. (orig.)

  11. Statistical evaluation of failures and repairs of the V-1 measuring and control system

    International Nuclear Information System (INIS)

    Laurinec, R.; Korec, J.; Mitosinka, J.; Zarnovican, V.

    1984-01-01

    A failure record card system was introduced for evaluating the reliability of the measurement and control equipment of the V-1 nuclear power plant. The SPU-800 microcomputer system is used for recording data on magnetic tape and their transmission to the central data processing department. The data are used for evaluating the reliability of components and circuits and a selection is made of the most failure-prone components, and the causes of failures are evaluated as are failure identification, repair and causes of outages. The system provides monthly, annual and total assessment data since the system was commissioned. The results of the statistical evaluation of failures are used for planning preventive maintenance and for determining optimal repair intervals. (E.S.)

  12. Failure Analysis Of Industrial Boiler Pipe

    International Nuclear Information System (INIS)

    Natsir, Muhammad; Soedardjo, B.; Arhatari, Dewi; Andryansyah; Haryanto, Mudi; Triyadi, Ari

    2000-01-01

    Failure analysis of an industrial boiler pipe has been performed. The tested pipe material was carbon steel SA 178 Grade A, according to specification data taken from a fertilizer company. The steps in the analysis were: collection of background operation and material specification data, visual inspection, dye penetrant testing, radiography testing, chemical composition testing, hardness testing, and metallography. The tests and analysis showed that the pipe failure was caused by erosion, and that the welding showed porosity and incomplete penetration. The main cause of the pipe failure is erosion due to cavitation, which decreases the pipe thickness. A break in the pipe can occur because of this decreasing thickness. To anticipate this problem, the pipe will be replaced with a new pipe

  13. Failure analysis of real-time systems

    International Nuclear Information System (INIS)

    Jalashgar, A.; Stoelen, K.

    1998-01-01

    This paper highlights essential aspects of real-time software systems that are strongly related to failures and their course of propagation. The significant influence of means-oriented and goal-oriented system views on the description, understanding, and analysis of those aspects is elaborated. The importance of performing failure analysis prior to the reliability analysis of real-time systems is equally addressed. Problems of software reliability growth models that take the properties of such systems into account are discussed. Finally, the paper presents a preliminary study of a goal-oriented approach to modelling the static and dynamic characteristics of real-time systems, so that the corresponding analysis can be based on a more descriptive and informative picture of failures, their effects, and the possibility of their occurrence. (author)

  14. A Statistical Analysis of Cryptocurrencies

    Directory of Open Access Journals (Sweden)

    Stephen Chan

    2017-05-01

    We analyze statistical properties of the largest cryptocurrencies (determined by market capitalization), of which Bitcoin is the most prominent example. We characterize their exchange rates versus the U.S. Dollar by fitting parametric distributions to them. It is shown that returns are clearly non-normal; however, no single distribution fits well jointly to all the cryptocurrencies analysed. We find that for the most popular currencies, such as Bitcoin and Litecoin, the generalized hyperbolic distribution gives the best fit, while for the smaller cryptocurrencies the normal inverse Gaussian distribution, generalized t distribution, and Laplace distribution give good fits. The results are important for investment and risk management purposes.

  15. A streamlined failure mode and effects analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Eric C., E-mail: eford@uw.edu; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, MD 21287 (United States)

    2014-06-15

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.

  16. A streamlined failure mode and effects analysis

    International Nuclear Information System (INIS)

    Ford, Eric C.; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-01-01

    Purpose: Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. Methods: FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Results: Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Conclusions: Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed

  17. A streamlined failure mode and effects analysis.

    Science.gov (United States)

    Ford, Eric C; Smith, Koren; Terezakis, Stephanie; Croog, Victoria; Gollamudi, Smitha; Gage, Irene; Keck, Jordie; DeWeese, Theodore; Sibley, Greg

    2014-06-01

    Explore the feasibility and impact of a streamlined failure mode and effects analysis (FMEA) using a structured process that is designed to minimize staff effort. FMEA for the external beam process was conducted at an affiliate radiation oncology center that treats approximately 60 patients per day. A structured FMEA process was developed which included clearly defined roles and goals for each phase. A core group of seven people was identified and a facilitator was chosen to lead the effort. Failure modes were identified and scored according to the FMEA formalism. A risk priority number, RPN, was calculated and used to rank failure modes. Failure modes with RPN > 150 received safety improvement interventions. Staff effort was carefully tracked throughout the project. Fifty-two failure modes were identified, 22 collected during meetings, and 30 from take-home worksheets. The four top-ranked failure modes were: delay in film check, missing pacemaker protocol/consent, critical structures not contoured, and pregnant patient simulated without the team's knowledge of the pregnancy. These four failure modes had RPN > 150 and received safety interventions. The FMEA was completed in one month in four 1-h meetings. A total of 55 staff hours were required and, additionally, 20 h by the facilitator. Streamlined FMEA provides a means of accomplishing a relatively large-scale analysis with modest effort. One potential value of FMEA is that it potentially provides a means of measuring the impact of quality improvement efforts through a reduction in risk scores. Future study of this possibility is needed.
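
    The RPN scoring step described above is straightforward to express in code. Below is a minimal sketch of the ranking logic, assuming the usual 1-10 scales for severity, occurrence, and detectability; the scores (and the third failure mode) are illustrative placeholders, not the study's data.

    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        name: str
        severity: int       # 1 (negligible) .. 10 (catastrophic)
        occurrence: int     # 1 (rare) .. 10 (frequent)
        detectability: int  # 1 (always caught) .. 10 (never caught)

        @property
        def rpn(self) -> int:
            # risk priority number: product of the three scores
            return self.severity * self.occurrence * self.detectability

    THRESHOLD = 150  # intervention cutoff used in the study

    modes = [
        FailureMode("delay in film check", 6, 7, 5),
        FailureMode("critical structures not contoured", 8, 4, 6),
        FailureMode("wrong accessory mounted", 5, 3, 4),
    ]

    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        action = "intervene" if m.rpn > THRESHOLD else "monitor"
        print(f"{m.name:40s} RPN={m.rpn:4d} -> {action}")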

  18. Failure diagnosis and fault tree analysis

    International Nuclear Information System (INIS)

    Weber, G.

    1982-07-01

    In this report a methodology of failure diagnosis for complex systems is presented. Systems which can be represented by fault trees are considered. The methodology is based on switching algebra, failure diagnosis of digital circuits, and fault tree analysis. Relations between these disciplines are shown; they arise from the Boolean algebra and Boolean functions used throughout. On this basis it is shown that techniques of failure diagnosis and fault tree analysis are useful to solve the following problems: (1) an efficient search for all failed components when the system has failed; (2) an efficient search for all states which are close to a system failure while the system is still operating. The first technique improves availability, the second reliability and safety. For these problems, the relation to methods of failure diagnosis for combinational circuits is required. Moreover, the techniques are demonstrated for a number of systems which can be represented by fault trees. (orig./RW) [de
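
    The Boolean-algebra basis of such a methodology can be illustrated with a small sketch (not the report's algorithm): minimal cut sets of a fault tree are obtained by expanding OR gates as unions of cut sets, AND gates as combinations, and then applying the absorption law to discard non-minimal sets. The example tree is hypothetical.

    from itertools import product

    def cut_sets(gate):
        # gate is ("basic", name) or ("AND"/"OR", subgate, subgate, ...)
        kind = gate[0]
        if kind == "basic":
            return {frozenset([gate[1]])}
        child = [cut_sets(g) for g in gate[1:]]
        if kind == "OR":   # OR gate: union of the children's cut sets
            sets = set().union(*child)
        else:              # AND gate: one cut set from each child, merged
            sets = {frozenset().union(*combo) for combo in product(*child)}
        # absorption law: a set containing another cut set is not minimal
        return {s for s in sets if not any(t < s for t in sets)}

    # hypothetical tree: TOP = (A AND B) OR (A AND (B OR C))
    tree = ("OR",
            ("AND", ("basic", "A"), ("basic", "B")),
            ("AND", ("basic", "A"), ("OR", ("basic", "B"), ("basic", "C"))))
    print(cut_sets(tree))  # minimal cut sets {A, B} and {A, C}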

  19. Corrosion failure analysis as related to prevention of corrosion failures

    International Nuclear Information System (INIS)

    Suss, H.

    1977-10-01

    The factors and conditions which have contributed to many corrosion-related service failures are discussed, based on a review of actual case histories. The anti-corrosion devices that were developed as a result of these failure analyses are reviewed, and the method which must be adopted and used to take advantage of the available corrosion prevention techniques is discussed.

  20. Micromechanics Based Failure Analysis of Heterogeneous Materials

    Science.gov (United States)

    Sertse, Hamsasew M.

    are performed for both brittle failure/high cycle fatigue (HCF) with negligible plastic strain and ductile failure/low cycle fatigue (LCF) with large plastic strain. The proposed approach is incorporated in SwiftComp and used to predict the initial failure envelope, stress-strain curves for various loading conditions, and fatigue life of heterogeneous materials. The combined effects of strain hardening and progressive fatigue damage on the effective properties of heterogeneous materials are also studied. The capability of the current approach is validated using several representative examples of heterogeneous materials, including binary composites, continuous fiber-reinforced composites, particle-reinforced composites, discontinuous fiber-reinforced composites, and woven composites. The predictions of MSG are also compared with the predictions obtained using various micromechanics approaches such as the Generalized Method of Cells (GMC), Mori-Tanaka (MT), Double Inclusions (DI), and Representative Volume Element (RVE) analysis (referred to in this document as 3-dimensional finite element analysis (3D FEA)). This study demonstrates that a micromechanics-based failure analysis has great potential to rigorously and more accurately analyze the initiation and progression of damage in heterogeneous materials. However, this approach requires material properties specific to damage analysis, which need to be independently calibrated for each constituent.

  1. Sequentially linear analysis for simulating brittle failure

    NARCIS (Netherlands)

    van de Graaf, A.V.

    2017-01-01

    The numerical simulation of brittle failure at structural level with nonlinear finite element analysis (NLFEA) remains a challenge due to robustness issues. We attribute these problems to the dimensions of real-world structures combined with softening behavior and negative tangent stiffness at

  2. Failure analysis on a chemical waste pipe

    International Nuclear Information System (INIS)

    Ambler, J.R.

    1985-01-01

    A failure analysis of a chemical waste pipe illustrates how nuclear technology can spin off metallurgical consultant services. The pipe, made of zirconium alloy (Zr-2.5 wt percent Nb, UNS 60705), had cracked in several places, all at butt welds. A combination of fractography and metallography indicated delayed hydride cracking.

  3. Failure analysis of fractured dental zirconia implants.

    Science.gov (United States)

    Gahlert, M; Burtscher, D; Grunert, I; Kniha, H; Steinhauser, E

    2012-03-01

    The purpose of the present study was the macroscopic and microscopic failure analysis of fractured zirconia dental implants. Thirteen fractured one-piece zirconia implants (Z-Look3) out of 170 inserted implants with an average in situ period of 36.75±5.34 months (range from 20 to 56 months, median 38 months) were prepared for macroscopic and microscopic (scanning electron microscopy [SEM]) failure analysis. These 170 implants were inserted in 79 patients. The patient histories were compared with fracture incidences to identify the reasons for the failure of the implants. Twelve of these fractured implants had a diameter of 3.25 mm and one implant had a diameter of 4 mm. All fractured implants were located in the anterior side of the maxilla and mandibula. The patient with the fracture of the 4 mm diameter implant was adversely affected by strong bruxism. By failure analysis (SEM), it could be demonstrated that in all cases, mechanical overloading caused the fracture of the implants. Inhomogeneities and internal defects of the ceramic material could be excluded, but notches and scratches due to sandblasting of the surface led to local stress concentrations that led to the mentioned mechanical overloading by bending loads. The present study identified a fracture rate of nearly 10% within a follow-up period of 36.75 months after prosthetic loading. Ninety-two per cent of the fractured implants were so-called diameter reduced implants (diameter 3.25 mm). These diameter reduced implants cannot be recommended for further clinical use. Improvement of the ceramic material and modification of the implant geometry has to be carried out to reduce the failure rate of small-sized ceramic implants. Nevertheless, due to the lack of appropriate laboratory testing, only clinical studies will demonstrate clearly whether and how far the failure rate can be reduced. © 2011 John Wiley & Sons A/S.

  4. Statistical Power in Meta-Analysis

    Science.gov (United States)

    Liu, Jin

    2015-01-01

    Statistical power is important in a meta-analysis study, although few studies have examined the performance of simulated power in meta-analysis. The purpose of this study is to inform researchers about statistical power estimation on two sample mean difference test under different situations: (1) the discrepancy between the analytical power and…

  5. Statistical methods for astronomical data analysis

    CERN Document Server

    Chattopadhyay, Asis Kumar

    2014-01-01

    This book introduces “Astrostatistics” as a subject in its own right with rewarding examples, including work by the authors with galaxy and Gamma Ray Burst data to engage the reader. This includes a comprehensive blending of Astrophysics and Statistics. The first chapter’s coverage of preliminary concepts and terminologies for astronomical phenomenon will appeal to both Statistics and Astrophysics readers as helpful context. Statistics concepts covered in the book provide a methodological framework. A unique feature is the inclusion of different possible sources of astronomical data, as well as software packages for converting the raw data into appropriate forms for data analysis. Readers can then use the appropriate statistical packages for their particular data analysis needs. The ideas of statistical inference discussed in the book help readers determine how to apply statistical tests. The authors cover different applications of statistical techniques already developed or specifically introduced for ...

  6. Comparative analysis of positive and negative attitudes toward statistics

    Science.gov (United States)

    Ghulami, Hassan Rahnaward; Ab Hamid, Mohd Rashid; Zakaria, Roslinazairimah

    2015-02-01

    Many statistics lecturers and statistics education researchers are interested in their students' attitudes toward statistics during the statistics course. A positive attitude toward statistics is vital because it encourages students to take an interest in the course and to master its core content. Students who have negative attitudes toward statistics, by contrast, may feel depressed, especially in group assignments, are at risk of failure, are often highly emotional, and cannot move forward. Therefore, this study investigates students' attitudes toward learning statistics. Six latent constructs were used to measure students' attitudes toward learning statistics: affect, cognitive competence, value, difficulty, interest, and effort. The questionnaire was adopted and adapted from the reliable and validated Survey of Attitudes Toward Statistics (SATS) instrument. The study was conducted among undergraduate engineering students at Universiti Malaysia Pahang (UMP). The respondents were students from different faculties who were taking the applied statistics course. The analysis found the questionnaire acceptable, and the relationships among the constructs were proposed and investigated. Students showed full effort to master the statistics course, found the course enjoyable, were confident in their intellectual capacity, and held more positive than negative attitudes toward learning statistics. In conclusion, positive attitudes were mostly exhibited in the affect, cognitive competence, value, interest, and effort constructs, while negative attitudes were mostly exhibited in the difficulty construct.

  7. Dissimilar weld failure analysis and development program

    International Nuclear Information System (INIS)

    Holko, K.H.; Li, C.C.

    1982-01-01

    The problem of dissimilar weld cracking and failure is examined. This problem occurs in boiler superheater and reheater sections as well as main steam piping. Typically, a dissimilar weld joins low-alloy steel tubing such as Fe-2-1/4 Cr-1Mo to stainless steel tubing such as 321H and 304H. Cracking and failure occur in the low-alloy steel heat-affected zone very close to the weld interface. The 309 stainless steel filler previously used has been replaced with nickel-base fillers such as Inconel 132, Inconel 182, and Incoweld A. This change has extended the time to cracking and failure, but has not solved the problem. To illustrate and define the problem, the metallography of damaged and failed dissimilar welds is described. Results of mechanical tests of dissimilar welds removed from service are presented, and factors believed to be influential in causing damage and failure are discussed. In addition, the importance of dissimilar weldment service history is demonstrated, and the Dissimilar Weld Failure Analysis and Development Program is described. 15 figures

  8. Statistical analysis with Excel for dummies

    CERN Document Server

    Schmuller, Joseph

    2013-01-01

    Take the mystery out of statistical terms and put Excel to work! If you need to create and interpret statistics in business or classroom settings, this easy-to-use guide is just what you need. It shows you how to use Excel's powerful tools for statistical analysis, even if you've never taken a course in statistics. Learn the meaning of terms like mean and median, margin of error, standard deviation, and permutations, and discover how to interpret the statistics of everyday life. You'll learn to use Excel formulas, charts, PivotTables, and other tools to make sense of everything fro

  9. Combinatorial analysis of systems with competing failures subject to failure isolation and propagation effects

    International Nuclear Information System (INIS)

    Xing Liudong; Levitin, Gregory

    2010-01-01

    This paper considers the reliability analysis of binary-state systems, subject to propagated failures with global effect, and failure isolation phenomena. Propagated failures with global effect are common-cause failures originated from a component of a system/subsystem causing the failure of the entire system/subsystem. Failure isolation occurs when the failure of one component (referred to as a trigger component) causes other components (referred to as dependent components) within the same system to become isolated from the system. On the one hand, failure isolation makes the isolated dependent components unusable; on the other hand, it prevents the propagation of failures originated from those dependent components. However, the failure isolation effect does not exist if failures originated in the dependent components already propagate globally before the trigger component fails. In other words, there exists a competition in the time domain between the failure of the trigger component that causes failure isolation and propagated failures originated from the dependent components. This paper presents a combinatorial method for the reliability analysis of systems subject to such competing propagated failures and failure isolation effect. Based on the total probability theorem, the proposed method is analytical, exact, and has no limitation on the type of time-to-failure distributions for the system components. An illustrative example is given to demonstrate the basics and advantages of the proposed method.
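
    The time-domain competition described above can be made concrete with a small sketch. Assuming exponential times to failure (an illustration only; the paper's method allows arbitrary distributions), the probability that a propagated failure from a dependent component (rate lam_d) occurs before the trigger component fails (rate lam_t) and within mission time t has a closed form, which a Monte Carlo run confirms. The rates are hypothetical.

    import math
    import random

    lam_t, lam_d, t = 1e-3, 5e-4, 1000.0  # hypothetical rates and mission time

    # analytical: P(T_d < T_t and T_d < t), i.e., propagation wins the race
    p_analytic = lam_d / (lam_d + lam_t) * (1 - math.exp(-(lam_d + lam_t) * t))

    # Monte Carlo check of the same competition
    n, hits = 200_000, 0
    for _ in range(n):
        T_t = random.expovariate(lam_t)  # trigger failure (causes isolation)
        T_d = random.expovariate(lam_d)  # propagated failure, global effect
        if T_d < T_t and T_d < t:        # propagation precedes isolation
            hits += 1

    print(f"analytic={p_analytic:.4f}  simulated={hits / n:.4f}")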

  10. Failure analysis of a Francis turbine runner

    Energy Technology Data Exchange (ETDEWEB)

    Frunzaverde, D; Campian, V [Research Center in Hydraulics, Automation and Heat Transfer, 'Eftimie Murgu' University of Resita, P-ta Traian Vuia 1-4, RO-320085, Resita (Romania); Muntean, S [Centre of Advanced Research in Engineering Sciences, Romanian Academy - Timisoara Branch, Bv. Mihai Viteazu 24, RO-300223, Timisoara (Romania); Marginean, G [University of Applied Sciences Gelsenkirchen, Neidenburger Str. 10, 45877 Gelsenkirchen (Germany); Marsavina, L [Department of Strength, 'Politehnica' University of Timisoara, Bv. Mihai Viteazu 1, RO-300222, Timisoara (Romania); Terzi, R; Serban, V, E-mail: gabriela.marginean@fh-gelsenkirchen.d, E-mail: d.frunzaverde@uem.r [Ramnicu Valcea Subsidiary, S.C. Hidroelectrica S.A., Str. Decebal 11, RO-240255, Ramnicu Valcea (Romania)

    2010-08-15

    The variable demand on the energy market requires great flexibility in operating hydraulic turbines. Therefore, turbines are frequently operated over an extended range of regimes. Francis turbines operating at partial load present pressure fluctuations due to the vortex rope in the draft tube cone. This phenomenon generates strong vibrations and noise that may produce failures of the mechanical elements of the machine. This paper presents the failure analysis of a broken Francis turbine runner blade. The failure appeared some months after welding repair work performed in situ on fatigue cracks initiated near the trailing edge at the junction with the crown, where stress concentration occurs. In order to determine the causes that led to the fracture of the runner blade, metallographic investigations on a sample obtained from the blade were carried out. The metallographic investigations included macroscopic and microscopic examinations, both performed with light and scanning electron microscopy, as well as EDX analyses. These investigations led to the conclusion that the cracking of the blade was caused by fatigue, initiated by the surface unevenness of the welding seam. The failure was accelerated by hydrogen embrittlement of the filler material, which appeared as a consequence of improper welding conditions. In addition to the metallographic investigations, numerical computations with finite element analysis were performed in order to evaluate the deformation and stress distribution on the blade.

  11. Failure analysis of a Francis turbine runner

    International Nuclear Information System (INIS)

    Frunzaverde, D; Campian, V; Muntean, S; Marginean, G; Marsavina, L; Terzi, R; Serban, V

    2010-01-01

    The variable demand on the energy market requires great flexibility in operating hydraulic turbines. Therefore, turbines are frequently operated over an extended range of regimes. Francis turbines operating at partial load present pressure fluctuations due to the vortex rope in the draft tube cone. This phenomenon generates strong vibrations and noise that may produce failures of the mechanical elements of the machine. This paper presents the failure analysis of a broken Francis turbine runner blade. The failure appeared some months after welding repair work performed in situ on fatigue cracks initiated near the trailing edge at the junction with the crown, where stress concentration occurs. In order to determine the causes that led to the fracture of the runner blade, metallographic investigations on a sample obtained from the blade were carried out. The metallographic investigations included macroscopic and microscopic examinations, both performed with light and scanning electron microscopy, as well as EDX analyses. These investigations led to the conclusion that the cracking of the blade was caused by fatigue, initiated by the surface unevenness of the welding seam. The failure was accelerated by hydrogen embrittlement of the filler material, which appeared as a consequence of improper welding conditions. In addition to the metallographic investigations, numerical computations with finite element analysis were performed in order to evaluate the deformation and stress distribution on the blade.

  12. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis
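
    Event counts collected in this way are commonly treated as Poisson data. As a simple illustration (not taken from the report), the sketch below turns a hypothetical event count and exposure time into a rate estimate with an exact two-sided 90% confidence interval based on chi-squared quantiles.

    from scipy.stats import chi2

    n = 4        # observed events (hypothetical)
    T = 2.5e4    # exposure, e.g., component-hours (hypothetical)

    rate = n / T
    lower = chi2.ppf(0.05, 2 * n) / (2 * T)        # exact Poisson bounds
    upper = chi2.ppf(0.95, 2 * (n + 1)) / (2 * T)
    print(f"rate = {rate:.2e}/h, 90% CI = ({lower:.2e}, {upper:.2e})")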

  13. System reliability analysis using dominant failure modes identified by selective searching technique

    International Nuclear Information System (INIS)

    Kim, Dong-Seok; Ok, Seung-Yong; Song, Junho; Koh, Hyun-Moo

    2013-01-01

    The failure of a redundant structural system is often described by innumerable system failure modes such as combinations or sequences of local failures. An efficient approach is proposed to identify dominant failure modes in the space of random variables, and then perform system reliability analysis to compute the system failure probability. To identify dominant failure modes in the decreasing order of their contributions to the system failure probability, a new simulation-based selective searching technique is developed using a genetic algorithm. The system failure probability is computed by a multi-scale matrix-based system reliability (MSR) method. Lower-scale MSR analyses evaluate the probabilities of the identified failure modes and their statistical dependence. A higher-scale MSR analysis evaluates the system failure probability based on the results of the lower-scale analyses. Three illustrative examples demonstrate the efficiency and accuracy of the approach through comparison with existing methods and Monte Carlo simulations. The results show that the proposed method skillfully identifies the dominant failure modes, including those neglected by existing approaches. The multi-scale MSR method accurately evaluates the system failure probability with statistical dependence fully considered. The decoupling between the failure mode identification and the system reliability evaluation allows for effective applications to larger structural systems
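
    A toy version of the setting (not the multi-scale MSR method or the genetic-algorithm search itself) helps fix ideas: each dominant failure mode is a limit-state function g_i(X) <= 0 of standard normal variables, the system fails if any identified mode occurs, and the statistical dependence between modes enters through the shared variables. The limit states below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.standard_normal((1_000_000, 2))

    g1 = 3.0 - X[:, 0]                         # failure mode 1
    g2 = 3.5 - 0.8 * X[:, 0] - 0.6 * X[:, 1]   # mode 2, correlated with mode 1
    fail1, fail2 = g1 <= 0, g2 <= 0
    system_fail = fail1 | fail2                # series system of the two modes

    print(f"P(mode 1) = {fail1.mean():.2e}")
    print(f"P(mode 2) = {fail2.mean():.2e}")
    print(f"P(system) = {system_fail.mean():.2e}  (< sum, due to dependence)")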

  14. Methods for dependency estimation and system unavailability evaluation based on failure data statistics

    International Nuclear Information System (INIS)

    Azarm, M.A.; Hsu, F.; Martinez-Guridi, G.; Vesely, W.E.

    1993-07-01

    This report introduces a new perspective on the basic concept of dependent failures, where the definition of dependency is based on clustering in the failure times of similar components. This perspective has two significant implications: first, it relaxes the conventional assumption that dependent failures must be simultaneous and result from a severe shock; second, it allows the analyst to use all the failures in a time continuum to estimate the potential for multiple failures in a window of time (e.g., a test interval), therefore arriving at a more accurate value for system unavailability. In addition, the models developed here provide a method for plant-specific analysis of dependency, reflecting the plant-specific maintenance practices that reduce or increase the contribution of dependent failures to system unavailability. The proposed methodology can be used for screening analysis of failure data to estimate the fraction of dependent failures among the failures. In addition, the proposed method can evaluate the impact of the observed dependency on system unavailability and plant risk. The formulations derived in this report have undergone various levels of validation through computer simulation studies and pilot applications. The pilot applications of these methodologies showed that the contribution of dependent failures of diesel generators was negligible in one plant and quite significant in another. They also showed that in the plant with a significant contribution of dependency to Emergency Power System (EPS) unavailability, the contribution changed with time. Similar findings were reported for the Containment Fan Cooler breakers. Drawing such conclusions about system performance would not have been possible with any other reported dependency methodology.
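
    The clustering notion of dependency lends itself to a simple screening sketch (a simplification, not the report's estimators): failures of similar components whose times fall within a chosen window of each other are flagged as potentially dependent, which yields a crude dependent-failure fraction. Times and window are hypothetical.

    failure_times = sorted([12.1, 86.0, 86.4, 150.2, 310.7, 311.0, 500.3])  # days
    window = 1.0  # failures closer than this are treated as clustered

    clustered = set()
    for i in range(len(failure_times) - 1):
        if failure_times[i + 1] - failure_times[i] <= window:
            clustered.update((i, i + 1))

    fraction = len(clustered) / len(failure_times)
    print(f"{len(clustered)} of {len(failure_times)} failures clustered "
          f"-> dependent fraction ~ {fraction:.2f}")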

  15. Statistical analysis of absorptive laser damage in dielectric thin films

    International Nuclear Information System (INIS)

    Budgor, A.B.; Luria-Budgor, K.F.

    1978-01-01

    The Weibull distribution arises as an example of the theory of extreme events. It is commonly used to fit statistical data arising in the failure analysis of electrical components and in DC breakdown of materials. This distribution is employed to analyze time-to-damage and intensity-to-damage statistics obtained when irradiating thin-film-coated samples of SiO2, ZrO2, and Al2O3 with tightly focused laser beams. The data used were furnished by Milam. The fit to the data is excellent, and least-squares correlation coefficients greater than 0.9 are often obtained.
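
    The Weibull fit itself is easy to reproduce. A minimal sketch, with hypothetical time-to-damage data standing in for Milam's: with median-rank plotting positions, ln(-ln(1-F)) is linear in ln(t), so the shape and scale parameters follow from a least-squares line, and the squared correlation coefficient measures the quality of the fit.

    import math
    import statistics

    t = sorted([23.0, 41.0, 55.0, 62.0, 78.0, 96.0, 121.0])  # time to damage
    n = len(t)
    x = [math.log(ti) for ti in t]
    # median-rank plotting positions F_i = (i - 0.3) / (n + 0.4), i = 1..n
    y = [math.log(-math.log(1 - (i + 0.7) / (n + 0.4))) for i in range(n)]

    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)

    beta = sxy / sxx                # Weibull shape parameter
    eta = math.exp(mx - my / beta)  # Weibull scale parameter
    r2 = sxy ** 2 / (sxx * syy)     # squared correlation coefficient
    print(f"beta = {beta:.2f}, eta = {eta:.1f}, r^2 = {r2:.3f}")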

  16. Importance analysis for the systems with common cause failures

    International Nuclear Information System (INIS)

    Pan Zhijie; Nonaka, Yasuo

    1995-01-01

    This paper extends the importance analysis technique to the research field of common cause failures to evaluate the structure importance, probability importance, and β-importance for the systems with common cause failures. These importance measures would help reliability analysts to limit the common cause failure analysis framework and find efficient defence strategies against common cause failures

  17. 14 CFR 417.224 - Probability of failure analysis.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 false Probability of failure analysis. 417.224..., DEPARTMENT OF TRANSPORTATION, LICENSING, LAUNCH SAFETY, Flight Safety Analysis, § 417.224 Probability of failure..., must account for launch vehicle failure probability in a consistent manner. A launch vehicle failure...

  18. Explorations in Statistics: The Analysis of Change

    Science.gov (United States)

    Curran-Everett, Douglas; Williams, Calvin L.

    2015-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This tenth installment of "Explorations in Statistics" explores the analysis of a potential change in some physiological response. As researchers, we often express absolute change as percent change so we can…

  19. Recognition and Analysis of Corrosion Failure Mechanisms

    Directory of Open Access Journals (Sweden)

    Steven Suess

    2006-02-01

    Full Text Available Corrosion has a vast impact on the global and domestic economy, and currently incurs losses of nearly $300 billion annually to the U.S. economy alone. Because of the huge impact of corrosion, it is imperative to have a systematic approach to recognizing and mitigating corrosion problems as soon as possible after they become apparent. A proper failure analysis includes collection of pertinent background data and service history, followed by visual inspection, photographic documentation, material evaluation, data review and conclusion procurement. In analyzing corrosion failures, one must recognize the wide range of common corrosion mechanisms. The features of any corrosion failure give strong clues as to the most likely cause of the corrosion. This article details a proven approach to properly determining the root cause of a failure, and includes pictographic illustrations of the most common corrosion mechanisms, including general corrosion, pitting, galvanic corrosion, dealloying, crevice corrosion, microbiologically-influenced corrosion (MIC), corrosion fatigue, stress corrosion cracking (SCC), intergranular corrosion, fretting, erosion corrosion and hydrogen damage.

  20. Recognition and Analysis of Corrosion Failure Mechanisms

    OpenAIRE

    Steven Suess

    2006-01-01

    Corrosion has a vast impact on the global and domestic economy, and currently incurs losses of nearly $300 billion annually to the U.S. economy alone. Because of the huge impact of corrosion, it is imperative to have a systematic approach to recognizing and mitigating corrosion problems as soon as possible after they become apparent. A proper failure analysis includes collection of pertinent background data and service history, followed by visual inspection, photographic documentation, materi...

  1. The application of Petri nets to failure analysis

    International Nuclear Information System (INIS)

    Liu, T.S.; Chiou, S.B.

    1997-01-01

    Unlike the technique of fault tree analysis, which has been widely applied to system failure analysis in reliability engineering, this study presents a Petri net approach to failure analysis. It is essentially a graphical method for describing relations between conditions and events. The use of Petri nets in failure analysis makes it possible to replace logic gate functions in fault trees, efficiently obtain minimal cut sets, and absorb models. It is demonstrated that for failure analysis Petri nets are more efficient than fault trees. In addition, this study devises an alternative, namely a trapezoidal graph method, to account for failure scenarios. Examples validate this novel method in dealing with failure analysis.

  2. Sensor Failure Detection of FASSIP System using Principal Component Analysis

    Science.gov (United States)

    Sudarno; Juarsa, Mulya; Santosa, Kussigit; Deswandri; Sunaryo, Geni Rina

    2018-02-01

    In the nuclear reactor accident at Fukushima Daiichi in Japan, the damage to the core and pressure vessel was caused by the failure of its active cooling system (the diesel generator was inundated by the tsunami). Thus, research on passive cooling systems for nuclear power plants is performed to improve the safety aspects of nuclear reactors. The FASSIP system (Passive System Simulation Facility) is an installation used to study the characteristics of passive cooling systems at nuclear power plants. The accuracy of the FASSIP system's sensor measurements is essential, because it is the basis for determining the characteristics of a passive cooling system. In this research, a sensor failure detection method for the FASSIP system is developed, so that indications of sensor failure can be detected early. The method used is Principal Component Analysis (PCA) to reduce the dimension of the sensor data, with the Squared Prediction Error (SPE) and Hotelling statistic criteria for detecting indications of sensor failure. The results show that the PCA method is capable of detecting the occurrence of a failure at any sensor.
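
    A minimal numpy sketch of this detection scheme (not the authors' code): a PCA model is fitted on normal-operation data, and each new sample is scored with the squared prediction error in the residual space and Hotelling's T^2 in the model space; values above control limits indicate a sensor fault. The data here are synthetic.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 8))       # normal-operation data, 8 sensors
    mu, sd = X.mean(0), X.std(0)
    X = (X - mu) / sd                   # standardize

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    k = 3                               # retained principal components
    P = Vt[:k].T                        # loading matrix
    lam = s[:k] ** 2 / (len(X) - 1)     # variances of retained components

    def spe(x):   # squared prediction error (residual space)
        r = x - P @ (P.T @ x)
        return float(r @ r)

    def t2(x):    # Hotelling's T^2 (model space)
        score = P.T @ x
        return float(score @ (score / lam))

    x_new = rng.normal(size=8)
    x_new[2] += 6.0                     # inject a fault on sensor 3
    x_new = (x_new - mu) / sd           # same scaling as the training data
    print(f"SPE = {spe(x_new):.2f}, T^2 = {t2(x_new):.2f}")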

  3. Statistical shape analysis with applications in R

    CERN Document Server

    Dryden, Ian L

    2016-01-01

    A thoroughly revised and updated edition of this introduction to modern statistical methods for shape analysis Shape analysis is an important tool in the many disciplines where objects are compared using geometrical features. Examples include comparing brain shape in schizophrenia; investigating protein molecules in bioinformatics; and describing growth of organisms in biology. This book is a significant update of the highly-regarded `Statistical Shape Analysis’ by the same authors. The new edition lays the foundations of landmark shape analysis, including geometrical concepts and statistical techniques, and extends to include analysis of curves, surfaces, images and other types of object data. Key definitions and concepts are discussed throughout, and the relative merits of different approaches are presented. The authors have included substantial new material on recent statistical developments and offer numerous examples throughout the text. Concepts are introduced in an accessible manner, while reta...

  4. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  5. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  6. Electrical failure analysis for root-cause determination

    International Nuclear Information System (INIS)

    Riddle, J.

    1990-01-01

    This paper outlines a practical failure analysis sequence. Several technical definitions are required. A failure is defined as a component that was operating in a system where the system malfunctioned and the replacement of the device restored system functionality. The failure mode is the malfunctioning behavior of the device. The failure mechanism is the underlying cause or source of the failure mode. The failure mechanism is the root cause of the failure mode. The failure analysis procedure needs to be adequately refined to result in the determination of the cause of failure to the degree that corrective action or design changes will prevent recurrence of the failure mode or mechanism. An example of a root-cause determination analysis performed for a nuclear power industry customer serves to illustrate the analysis methodology

  7. Challenges in Resolution for IC Failure Analysis

    Science.gov (United States)

    Martinez, Nick

    1999-10-01

    Resolution is becoming more and more of a challenge in the world of Failure Analysis in integrated circuits. This is a result of the ongoing size reduction in microelectronics. Determining the cause of a failure depends upon being able to find the responsible defect. The time it takes to locate a given defect is extremely important so that proper corrective actions can be taken. The limits of current microscopy tools are being pushed. With sub-micron feature sizes and even smaller killing defects, optical microscopes are becoming obsolete. With scanning electron microscopy (SEM), the resolution is high but the voltage involved can make these small defects transparent due to the large mean-free path of incident electrons. In this presentation, I will give an overview of the use of inspection methods in Failure Analysis and show example studies of my work as an Intern student at Texas Instruments. 1. Work at Texas Instruments, Stafford, TX, was supported by TI. 2. Work at Texas Tech University, was supported by NSF Grant DMR9705498.

  8. ANALYSIS OF RELIABILITY OF NONRECTORABLE REDUNDANT POWER SYSTEMS TAKING INTO ACCOUNT COMMON FAILURES

    Directory of Open Access Journals (Sweden)

    V. A. Anischenko

    2014-01-01

    Full Text Available A reliability analysis of non-restorable redundant power systems of industrial plants and other consumers of electric energy was carried out. The main attention was paid to the influence of common failures, i.e., failures of all elements of the system due to one shared cause. The main possible causes of common failures are noted. Two main reliability indicators of non-restorable systems are considered: mean time to failure and mean probability of no-failure operation. Failures were modeled by dividing the investigated system into two subsystems connected in series, one representing independent failures and the other representing common failures. With single and common failures modeled jointly, the resulting failure intensity is the sum of incompatible components: the intensity of statistically independent failures and the intensity of common failures of elements and of the system as a whole. The influence of common element failures on the mean time to failure of the system is shown. A scale of preference of systems is built according to the criterion of maximum mean time to failure, depending on the proportion of common failures. It is noted that such common failures do not influence the scale of preference, but change the time intervals that determine the moments of system failures, excluding some systems from comparison. Two problems of conditional optimization of the choice of systems' redundancy, taking into account reliability and cost, are discussed. The first problem is solved by the criterion of minimum system cost while providing a given mean probability of no-failure operation; the second by the criterion of maximum mean probability of no-failure operation under a system cost constraint.
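
    The split of the failure intensity into independent and common parts is close in spirit to a beta-factor model. A minimal sketch under that assumption (hypothetical numbers, not the paper's formulation): for a non-restorable 1-out-of-2 redundant system, a fraction beta of the total element failure rate lam acts as a common failure that takes out both channels at once, while the rest acts independently on each channel.

    import math

    lam, t = 1e-4, 5000.0  # per-hour element failure rate, mission time (hours)

    def reliability(beta: float) -> float:
        no_common = math.exp(-beta * lam * t)           # no common failure by t
        one_down = 1 - math.exp(-(1 - beta) * lam * t)  # one channel failed
        return no_common * (1 - one_down ** 2)          # at least one survives

    for beta in (0.0, 0.05, 0.10, 0.20):
        print(f"beta = {beta:.2f}  R(t) = {reliability(beta):.4f}")

    Even a modest common-failure share visibly erodes the benefit of redundancy, which is the effect behind the scale of preference discussed above.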

  9. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

    Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed in the JSNS (Japan Spallation Neutron Source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking the loading conditions and materials degradation into account. The loads considered on the target vessel were the static stresses due to thermal expansion and static pre-pressure on the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradations were deduced based on published experimental data. As a result, it was quantitatively confirmed that the failure probability over the design lifetime is very low for the safety hull, about 10^-11, meaning that it will hardly fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which suffers high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from a failed area at the beam window can be adequately kept in the space between the safety hull and the mercury vessel by using mercury-leakage sensors. (author)

  10. Classification, (big) data analysis and statistical learning

    CERN Document Server

    Conversano, Claudio; Vichi, Maurizio

    2018-01-01

    This edited book focuses on the latest developments in classification, statistical learning, data analysis and related areas of data science, including statistical analysis of large datasets, big data analytics, time series clustering, integration of data from different sources, as well as social networks. It covers both methodological aspects as well as applications to a wide range of areas such as economics, marketing, education, social sciences, medicine, environmental sciences and the pharmaceutical industry. In addition, it describes the basic features of the software behind the data analysis results, and provides links to the corresponding codes and data sets where necessary. This book is intended for researchers and practitioners who are interested in the latest developments and applications in the field. The peer-reviewed contributions were presented at the 10th Scientific Meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in Santa Margherita di Pul...

  11. Statistical hot spot analysis of reactor cores

    International Nuclear Information System (INIS)

    Schaefer, H.

    1974-05-01

    This report is an introduction to statistical hot spot analysis. After the definition of the term 'hot spot', a statistical analysis is outlined. The mathematical method is presented; in particular, the formula for the probability of no hot spots in a reactor core is evaluated. A discussion of the boundary conditions of a statistical hot spot analysis is given (technological limits, nominal situation, uncertainties). The application of the hot spot analysis to the linear power of pellets and the temperature rise in cooling channels is demonstrated with respect to the test zone of KNK II. Basic values, such as the probability of no hot spots, hot spot potential, expected hot spot diagram, and cumulative distribution function of hot spots, are discussed. It is shown that the risk of hot channels can be dispersed equally over all subassemblies by an adequate choice of the nominal temperature distribution in the core.
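
    The central quantity, the probability of no hot spots, can be sketched numerically under simple assumptions (independent channels and a normal uncertainty about each nominal temperature; all numbers below are hypothetical, not KNK II data):

    import math

    def phi(z: float) -> float:
        # standard normal cumulative distribution function
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))

    limit = 700.0                    # temperature limit, deg C
    nominal = [640.0, 655.0, 668.0]  # nominal channel temperatures
    sigma = 12.0                     # one-sigma uncertainty per channel

    p_no_hot_spot = math.prod(phi((limit - m) / sigma) for m in nominal)
    print(f"P(no hot spots) = {p_no_hot_spot:.4f}")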

  12. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2013-01-01

    Statistics and Analysis of Scientific Data covers the foundations of probability theory and statistics, and a number of numerical and analytical methods that are essential for the present-day analyst of scientific data. Topics covered include probability theory, distribution functions of statistics, fits to two-dimensional datasets and parameter estimation, Monte Carlo methods and Markov chains. Equal attention is paid to the theory and its practical application, and results from classic experiments in various fields are used to illustrate the importance of statistics in the analysis of scientific data. The main pedagogical method is a theory-then-application approach, where emphasis is placed first on a sound understanding of the underlying theory of a topic, which becomes the basis for an efficient and proactive use of the material for practical applications. The level is appropriate for undergraduates and beginning graduate students, and as a reference for the experienced researcher. Basic calculus is us...

  13. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW. It is based on R, a well known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW accessible data sets, so you may run Rweb on your own data. Rweb can provide a four window statistical computing environment (code input, text output, graphical output, and error information through browsers that support Javascript. There is also a set of point and click modules under development for use in introductory statistics courses.

  14. Semiclassical analysis, Witten Laplacians, and statistical mechanics

    CERN Document Server

    Helffer, Bernard

    2002-01-01

    This important book explains how the technique of Witten Laplacians may be useful in statistical mechanics. It considers the problem of analyzing the decay of correlations, after presenting its origin in statistical mechanics. In addition, it compares the Witten Laplacian approach with other techniques, such as the transfer matrix approach and its semiclassical analysis. The author concludes by providing a complete proof of the uniform Log-Sobolev inequality. Contents: Witten Laplacians Approach; Problems in Statistical Mechanics with Discrete Spins; Laplace Integrals and Transfer Operators; S

  15. A statistical approach to plasma profile analysis

    International Nuclear Information System (INIS)

    Kardaun, O.J.W.F.; McCarthy, P.J.; Lackner, K.; Riedel, K.S.

    1990-05-01

    A general statistical approach to the parameterisation and analysis of tokamak profiles is presented. The modelling of the profile dependence on both the radius and the plasma parameters is discussed, and pertinent, classical as well as robust, methods of estimation are reviewed. Special attention is given to statistical tests for discriminating between the various models, and to the construction of confidence intervals for the parameterised profiles and the associated global quantities. The statistical approach is shown to provide a rigorous approach to the empirical testing of plasma profile invariance. (orig.)

  16. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. It differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use several statistics programs together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  17. Foundation of statistical energy analysis in vibroacoustics

    CERN Document Server

    Le Bot, A

    2015-01-01

    This title deals with the statistical theory of sound and vibration. The foundation of statistical energy analysis is presented in great detail. In the modal approach, an introduction to random vibration with application to complex systems having a large number of modes is provided. For the wave approach, the phenomena of propagation, group speed, and energy transport are extensively discussed. Particular emphasis is given to the emergence of diffuse field, the central concept of the theory.

  18. Timing analysis of PWR fuel pin failures

    International Nuclear Information System (INIS)

    Jones, K.R.; Wade, N.L.; Katsma, K.R.; Siefken, L.J.; Straka, M.

    1992-09-01

    Research has been conducted to develop and demonstrate a methodology for calculation of the time interval between receipt of the containment isolation signals and the first fuel pin failure for loss-of-coolant accidents (LOCAs). Demonstration calculations were performed for a Babcock and Wilcox (B&W) design (Oconee) and a Westinghouse (W) four-loop design (Seabrook). Sensitivity studies were performed to assess the impacts of fuel pin burnup, axial peaking factor, break size, emergency core cooling system availability, and main coolant pump trip on these times. The analysis was performed using the following codes: FRAPCON-2, for the calculation of steady-state fuel behavior; SCDAP/RELAP5/MOD3 and TRAC-PF1/MOD1, for the calculation of the transient thermal-hydraulic conditions in the reactor system; and FRAP-T6, for the calculation of transient fuel behavior. In addition to the calculation of fuel pin failure timing, this analysis provides a comparison of the predicted results of SCDAP/RELAP5/MOD3 and TRAC-PF1/MOD1 for large-break LOCA analysis. Using SCDAP/RELAP5/MOD3 thermal-hydraulic data, the shortest time intervals calculated between initiation of containment isolation and fuel pin failure are 10.4 seconds and 19.1 seconds for the B&W and W plants, respectively. Using data generated by TRAC-PF1/MOD1, the shortest intervals are 10.3 seconds and 29.1 seconds for the B&W and W plants, respectively. These intervals are for a double-ended, offset-shear, cold leg break, using the technical specification maximum peaking factor and applied to fuel with maximum design burnup. Using peaking factors commensurate with actual burnups would result in longer intervals for both reactor designs. This document also contains appendices A through J of this report.

  19. A Statistical Toolkit for Data Analysis

    International Nuclear Information System (INIS)

    Donadio, S.; Guatelli, S.; Mascialino, B.; Pfeiffer, A.; Pia, M.G.; Ribon, A.; Viarengo, P.

    2006-01-01

    The present project aims to develop an open-source and object-oriented software Toolkit for statistical data analysis. Its statistical testing component contains a variety of Goodness-of-Fit tests, from Chi-squared to Kolmogorov-Smirnov, to lesser-known but generally much more powerful tests such as Anderson-Darling, Goodman, Fisz-Cramer-von Mises, Kuiper, and Tiku. Thanks to the component-based design and the usage of the standard abstract interfaces for data analysis, this tool can be used by other data analysis systems or integrated in experimental software frameworks. The Toolkit has been released and is downloadable from the web. In this paper we describe the statistical details of the algorithms, the computational features of the Toolkit, and the code validation.
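
    The Toolkit itself is a separate piece of software; as a quick way to experiment with the same families of tests, comparable goodness-of-fit tests are available in Python's scipy.stats, as in this sketch with synthetic data:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    sample = rng.normal(loc=0.0, scale=1.0, size=200)

    ks = stats.kstest(sample, "norm")            # Kolmogorov-Smirnov
    ad = stats.anderson(sample, dist="norm")     # Anderson-Darling
    cvm = stats.cramervonmises(sample, "norm")   # Cramer-von Mises
    print(f"KS:  D = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
    print(f"AD:  A^2 = {ad.statistic:.3f}, 5% crit = {ad.critical_values[2]:.3f}")
    print(f"CvM: W = {cvm.statistic:.3f}, p = {cvm.pvalue:.3f}")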

  20. Analysis of statistical misconception in terms of statistical reasoning

    Science.gov (United States)

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skill is needed by everyone to face the globalization era, because every person has to be able to manage and use information from all over the world, which can be obtained easily. Statistical reasoning skill is the ability to collect, group, process, interpret, and draw conclusions from information. Developing this skill can be done through various levels of education. However, the skill remains low because many people, students included, assume that statistics is just the ability to count and use formulas. Students also still have negative attitudes toward courses related to research. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The observation was done by analyzing the results of a misconception test and a statistical reasoning skill test, and by observing the effect of students' misconceptions on statistical reasoning skill. The sample of this research was 32 students of the mathematics education department who had taken the descriptive statistics course. The mean value of the misconception test was 49.7 with standard deviation 10.6, whereas the mean value of the statistical reasoning skill test was 51.8 with standard deviation 8.5. If the minimum value for standard achievement of course competence is 65, the students' mean values are lower than the standard. The results of the misconception study emphasized which subtopics should be considered. Based on the assessment, students' misconceptions occur in: (1) writing mathematical sentences and symbols correctly; (2) understanding basic definitions; (3) determining the concept to be used in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: (1) data; (2) representation; (3) statistical format; (4) probability; (5) samples; and (6) association.

  1. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    Science.gov (United States)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraphs models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modelling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be
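
    The propagation idea behind FEAT can be sketched compactly. The following is a hypothetical model with simple OR-propagation only (failure of any predecessor fails the successor); FEAT's digraphs are richer and also encode redundancy, but the reachability computation below is the core of tracing failure effects.

    from collections import deque

    # hypothetical digraph: edge u -> v means "failure of u causes failure of v"
    digraph = {
        "power_bus": ["pump_A", "pump_B", "controller"],
        "pump_A": ["coolant_flow"],
        "pump_B": ["coolant_flow"],
        "coolant_flow": ["core_temp_high"],
        "controller": [],
        "core_temp_high": [],
    }

    def effects(failed):
        # breadth-first search: everything reachable from the failed set
        seen, queue = set(failed), deque(failed)
        while queue:
            node = queue.popleft()
            for nxt in digraph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    print(sorted(effects({"power_bus"})))  # everything downstream of the bus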

  2. Statistical analysis of network data with R

    CERN Document Server

    Kolaczyk, Eric D

    2014-01-01

    Networks have permeated everyday life through everyday realities like the Internet, social networks, and viral marketing. As such, network analysis is an important growth area in the quantitative sciences, with roots in social network analysis going back to the 1930s and graph theory going back centuries. Measurement and analysis are integral components of network research. As a result, statistical methods play a critical role in network analysis. This book is the first of its kind in network research. It can be used as a stand-alone resource in which multiple R packages are used to illustrate how to conduct a wide range of network analyses, from basic manipulation and visualization, to summary and characterization, to modeling of network data. The central package is igraph, which provides extensive capabilities for studying network graphs in R. This text builds on Eric D. Kolaczyk’s book Statistical Analysis of Network Data (Springer, 2009).

  3. Statistical models for competing risk analysis

    International Nuclear Information System (INIS)

    Sather, H.N.

    1976-08-01

    Research results are presented on three new models with potential applications to competing risks problems. One section covers the basic statistical relationships underlying the subsequent competing risks model development. Another discusses the problem of comparing cause-specific risk structure, by competing risks theory, in two homogeneous populations, P1 and P2. Weibull models, which allow more generality than the Berkson and Elveback models, are studied for the effect of time on the hazard function. The use of concomitant information for modeling single-risk survival is extended to the multiple-failure-mode domain of competing risks. The model used to illustrate this methodology is a life table model which has constant hazards within pre-designated intervals of the time scale. Two parametric models for bivariate dependent competing risks, which provide interesting alternatives, are proposed and examined.
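
    The life table model mentioned above, with constant hazards within pre-designated intervals, has a simple closed-form survival function: the cumulative hazard is a sum of rate-times-duration pieces. A sketch with hypothetical interval boundaries and hazards:

    import math

    breaks = [0.0, 1.0, 3.0, 5.0]       # interval boundaries (years)
    hazards = [0.02, 0.05, 0.10, 0.20]  # constant hazard in each interval;
                                        # the last interval extends to infinity

    def survival(t: float) -> float:
        cum, lower = 0.0, breaks[0]
        for i, h in enumerate(hazards):
            upper = breaks[i + 1] if i + 1 < len(breaks) else float("inf")
            cum += h * max(0.0, min(t, upper) - lower)
            if t <= upper:
                break
            lower = upper
        return math.exp(-cum)

    for t in (0.5, 2.0, 4.0, 6.0):
        print(f"S({t}) = {survival(t):.3f}")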

  4. Statistics and analysis of scientific data

    CERN Document Server

    Bonamente, Massimiliano

    2017-01-01

    The revised second edition of this textbook provides the reader with a solid foundation in probability theory and statistics as applied to the physical sciences, engineering and related fields. It covers a broad range of numerical and analytical methods that are essential for the correct analysis of scientific data, including probability theory, distribution functions of statistics, fits to two-dimensional data and parameter estimation, Monte Carlo methods and Markov chains. Features new to this edition include: • a discussion of statistical techniques employed in business science, such as multiple regression analysis of multivariate datasets. • a new chapter on the various measures of the mean including logarithmic averages. • new chapters on systematic errors and intrinsic scatter, and on the fitting of data with bivariate errors. • a new case study and additional worked examples. • mathematical derivations and theoretical background material have been appropriately marked,to improve the readabili...

  5. Statistical analysis on extreme wave height

    Digital Repository Service at National Institute of Oceanography (India)

    Teena, N.V.; SanilKumar, V.; Sudheesh, K.; Sajeev, R.

    … WAFO (2000) – a MATLAB toolbox for analysis of random waves and loads, Lund University, Sweden, http://www.maths.lth.se/matstat/wafo/. Table 1: Statistical results of data and fitted distribution for cumulative distribution …

  6. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.

  7. The fuzzy approach to statistical analysis

    NARCIS (Netherlands)

    Coppi, Renato; Gil, Maria A.; Kiers, Henk A. L.

    2006-01-01

    For the last decades, research studies have been developed in which a coalition of Fuzzy Sets Theory and Statistics has been established with different purposes. These namely are: (i) to introduce new data analysis problems in which the objective involves either fuzzy relationships or fuzzy terms; …

  8. Plasma data analysis using statistical analysis system

    International Nuclear Information System (INIS)

    Yoshida, Z.; Iwata, Y.; Fukuda, Y.; Inoue, N.

    1987-01-01

    Multivariate factor analysis has been applied to a plasma data base of REPUTE-1. The characteristics of the reverse field pinch plasma in REPUTE-1 are shown to be explained by four independent parameters which are described in the report. The well known scaling laws F_χ ∝ I_p, T_e ∝ I_p, and τ_E ∝ N_e are also confirmed. 4 refs., 8 figs., 1 tab

  9. Statistical analysis of metallicity in spiral galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Galeotti, P [Consiglio Nazionale delle Ricerche, Turin (Italy). Lab. di Cosmo-Geofisica; Turin Univ. (Italy). Ist. di Fisica Generale)

    1981-04-01

    A principal component analysis of metallicity and other integral properties of 33 spiral galaxies is presented; the parameters involved are: morphological type, diameter, luminosity and metallicity. From the statistical analysis it is concluded that the sample has only two significant dimensions, and additional tests, involving different parameters, show similar results. Thus it seems that only type and luminosity are independent variables, with the other integral properties of spiral galaxies correlated with them.
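
    The dimensional check described can be reproduced in outline with any PCA routine. The sketch below uses scikit-learn on synthetic stand-in data (the 33-galaxy sample itself is not reproduced), constructed so that only two dimensions carry most of the variance.

```python
# PCA dimensionality check on synthetic stand-in data for four integral
# properties; all numbers are invented for illustration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n = 33
morph_type = rng.normal(size=n)
luminosity = rng.normal(size=n)
# Make diameter and metallicity depend on type and luminosity, mimicking
# a sample whose effective dimensionality is two.
diameter = 0.8 * luminosity + 0.1 * rng.normal(size=n)
metallicity = 0.6 * morph_type + 0.4 * luminosity + 0.1 * rng.normal(size=n)

X = np.column_stack([morph_type, diameter, luminosity, metallicity])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize before PCA
pca = PCA().fit(X)
print(pca.explained_variance_ratio_)        # variance concentrates in 2 PCs
```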

  10. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  11. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.

  12. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise examines the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. The author argues that historic and current HRA has failed to inform policy makers who make decisions based on the risk that humans contribute to systems performance. He argues instead for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  13. Reliability analysis for the creep rupture mode of failure

    International Nuclear Information System (INIS)

    Vaidyanathan, S.

    1975-01-01

    An analytical study has been carried out to relate the factors of safety employed in the design of a component to the probability of failure in the thermal creep rupture mode. The analysis considers the statistical variations in the operating temperature, stress and rupture time, and applies the life fraction damage criterion as the indicator of failure. Typical results for solution annealed type 304 stainless steel material for the temperature and stress variations expected in an LMFBR environment have been obtained. The analytical problem was solved by considering the joint distribution of the independent variables and deriving the distribution for the function associated with the probability of failure by integrating over proper regions as dictated by the deterministic design rule. This leads to a triple integral for the final probability of failure where the coefficients of variation associated with the temperature, stress and rupture time distributions can be specified by the user. The derivation is general, and can be used for time-varying stress histories and the case of irradiated material where the rupture time varies with accumulated fluence. Example calculations applied to solution annealed type 304 stainless steel material have been carried out for an assumed coefficient of variation of 2% for temperature and 6% for stress. The results show that the probability of failure associated with the time-dependent stress intensity limits specified in the ASME Boiler and Pressure Vessel Section III Code Case 1592 is less than 5×10⁻⁸. Rupture under thermal creep conditions is a highly complicated phenomenon. It is believed that the present study will help in quantifying the reliability to be expected with deterministic design factors of safety.
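
    A Monte Carlo stand-in for the triple-integral calculation is sketched below: failure is declared when the life fraction t_service/t_rupture exceeds one. The rupture-time law and every numerical value are assumptions chosen for illustration, not the report's data.

```python
# Monte Carlo stand-in for the triple-integral failure probability:
# failure when the life fraction t_service / t_rupture(T, sigma) >= 1.
# The rupture-time law and all numbers below are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
T = rng.normal(810.0, 0.02 * 810.0, n)        # temperature (K), 2% CoV
sigma = rng.normal(100.0, 0.06 * 100.0, n)    # stress (MPa), 6% CoV

# Hypothetical median rupture time, decreasing with temperature and stress,
# with lognormal scatter standing in for rupture-time variability.
t_median = 3.0e5 * np.exp(-0.02 * (T - 810.0)) * (100.0 / sigma) ** 5
t_rupture = t_median * rng.lognormal(0.0, 0.10, n)

t_service = 1.0e5                              # assumed hours at load
damage = t_service / t_rupture                 # life-fraction criterion
print("estimated P(failure):", np.mean(damage >= 1.0))
```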

  14. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2003-01-01

    Statistical analyses are performed for material strength parameters from a large number of specimens of structural timber. Non-parametric statistical analysis and fits have been investigated for the following distribution types: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull...... fits to the data available, especially if tail fits are used, whereas the Lognormal distribution generally gives a poor fit and larger coefficients of variation, especially if tail fits are used. The implications on the reliability level of typical structural elements and on partial safety factors...... for timber are investigated....
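
    The fitting step can be sketched with scipy: the code below fits 2- and 3-parameter Weibull and lognormal distributions to simulated strength values (not the actual timber data) and compares goodness of fit with a Kolmogorov-Smirnov statistic.

```python
# Fit candidate strength distributions to simulated data and compare fits.
# The sample below is synthetic, not the structural-timber data set.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
strengths = 30.0 * rng.weibull(4.0, 500) + 10.0   # fake strengths (MPa)

w2 = stats.weibull_min.fit(strengths, floc=0.0)   # 2-parameter Weibull
w3 = stats.weibull_min.fit(strengths)             # 3-parameter Weibull
lgn = stats.lognorm.fit(strengths, floc=0.0)      # lognormal

for name, dist, params in [("Weibull-2p", stats.weibull_min, w2),
                           ("Weibull-3p", stats.weibull_min, w3),
                           ("Lognormal", stats.lognorm, lgn)]:
    ks = stats.kstest(strengths, dist.cdf, args=params)
    print(f"{name:10s} params={np.round(params, 3)} KS p={ks.pvalue:.3f}")
```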

  15. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George …

  16. Statistical analysis and planning of multihundred-watt impact tests

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Waterman, M.S.

    1977-10-01

    Modular multihundred-watt (MHW) radioisotope thermoelectric generators (RTGs) are used as a power source for spacecraft. Due to possible environmental contamination by radioactive materials, numerous tests are required to determine and verify the safety of the RTG. There are results available from 27 fueled MHW impact tests regarding hoop failure, fingerprint failure, and fuel failure. Data from the 27 tests are statistically analyzed for relationships that exist between the test design variables and the failure types. Next, these relationships are used to develop a statistical procedure for planning and conducting either future MHW impact tests or similar tests on other RTG fuel sources. Finally, some conclusions are given.
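
    One simple way to relate test design variables to binary failure outcomes, in the spirit of the analysis described, is logistic regression. The sketch below uses synthetic data; the variable names, sample size, and planted effect are invented, not the MHW results.

```python
# Synthetic illustration of relating test design variables to a binary
# failure outcome via logistic regression. Variable names, sample size,
# and the planted effect are invented -- this is not the MHW test data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 200
impact_velocity = rng.uniform(40.0, 90.0, n)      # m/s (hypothetical)
fuel_temperature = rng.uniform(400.0, 1200.0, n)  # deg C (hypothetical)
X = np.column_stack([impact_velocity, fuel_temperature])

# Plant a velocity effect: failures become likely at high impact velocity.
p_fail = 1.0 / (1.0 + np.exp(-0.15 * (impact_velocity - 65.0)))
y = (rng.random(n) < p_fail).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
```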

  17. Service reliability assessment using failure mode and effect analysis ...

    African Journals Online (AJOL)


    … Statistical Process Control (Teng and Ho, 1996) … modelling the interaction between the impact of internal service failure and … Design error proofing: development of automated error-proofing information systems, Proceedings of …

  18. Developments in statistical analysis in quantitative genetics

    DEFF Research Database (Denmark)

    Sorensen, Daniel

    2009-01-01

    A remarkable research impetus has taken place in statistical genetics since the last World Conference. This has been stimulated by breakthroughs in molecular genetics, automated data-recording devices and computer-intensive statistical methods. The latter were revolutionized by the bootstrap and by Markov chain Monte Carlo (McMC). In this overview a number of specific areas are chosen to illustrate the enormous flexibility that McMC has provided for fitting models and exploring features of data that were previously inaccessible. The selected areas are inferences of the trajectories over time of genetic means and variances, models for the analysis of categorical and count data, the statistical genetics of a model postulating that environmental variance is partly under genetic control, and a short discussion of models that incorporate massive genetic marker information. We provide an overview…

  19. Statistical Analysis of Big Data on Pharmacogenomics

    Science.gov (United States)

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrices for understanding correlation structure, inverse covariance matrices for network modeling, large-scale simultaneous tests for selecting significantly differentially expressed genes and proteins and genetic markers for complex diseases, and high-dimensional variable selection for identifying important molecules for understanding molecular mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big Data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905

  20. Statistical analysis of next generation sequencing data

    CERN Document Server

    Nettleton, Dan

    2014-01-01

    Next Generation Sequencing (NGS) is the latest high throughput technology to revolutionize genomic research. NGS generates massive genomic datasets that play a key role in the big data phenomenon that surrounds us today. To extract signals from high-dimensional NGS data and make valid statistical inferences and predictions, novel data analytic and statistical techniques are needed. This book contains 20 chapters written by prominent statisticians working with NGS data. The topics range from basic preprocessing and analysis with NGS data to more complex genomic applications such as copy number variation and isoform expression detection. Research statisticians who want to learn about this growing and exciting area will find this book useful. In addition, many chapters from this book could be included in graduate-level classes in statistical bioinformatics for training future biostatisticians who will be expected to deal with genomic data in basic biomedical research, genomic clinical trials and personalized med...

  1. Metallized Film Capacitor Lifetime Evaluation and Failure Mode Analysis

    CERN Document Server

    Gallay, R.

    2015-06-15

    One of the main concerns for power electronic engineers regarding capacitors is to predict their remaining lifetime in order to anticipate costly failures or system unavailability. This may be achieved using a Weibull statistical law combined with acceleration factors for the temperature, the voltage, and the humidity. This paper discusses the different capacitor failure modes and their effects and consequences.
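
    A hedged sketch of such a lifetime model is given below: a Weibull survival law whose scale parameter is shortened by temperature, voltage, and humidity acceleration factors. The factor forms (a 10-degree doubling rule and inverse power laws) and every constant are placeholder assumptions, not values from the paper.

```python
# Weibull lifetime shortened by assumed acceleration factors; every
# constant below is a placeholder, not a value from the paper.
import numpy as np

def acceleration_factor(T_use, T_ref=70.0, V_use=450.0, V_ref=400.0,
                        rh_use=60.0, rh_ref=50.0):
    af_temp = 2.0 ** ((T_use - T_ref) / 10.0)  # "doubles every 10 C" rule
    af_volt = (V_use / V_ref) ** 7.0            # inverse power law (voltage)
    af_hum = (rh_use / rh_ref) ** 3.0           # inverse power law (humidity)
    return af_temp * af_volt * af_hum

def reliability(t_hours, eta_ref=200_000.0, beta=2.5, **conditions):
    """Weibull survival probability under accelerated conditions."""
    eta = eta_ref / acceleration_factor(**conditions)
    return np.exp(-(t_hours / eta) ** beta)

print(reliability(50_000.0, T_use=85.0))  # survival probability at 50 kh
```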

  2. Robust statistics and geochemical data analysis

    International Nuclear Information System (INIS)

    Di, Z.

    1987-01-01

    The advantages of robust procedures over ordinary least-squares procedures in geochemical data analysis are demonstrated using NURE data from the Hot Springs Quadrangle, South Dakota, USA. Robust principal components analysis with 5% multivariate trimming successfully guarded the analysis against perturbations by outliers and increased the number of interpretable factors. Regression with SINE estimates significantly increased the goodness-of-fit of the regression and improved the correspondence of delineated anomalies with known uranium prospects. Because of the ubiquitous existence of outliers in geochemical data, robust statistical procedures are suggested as routine procedures to replace ordinary least-squares procedures.

  3. Use of fuel failure correlations in accident analysis

    International Nuclear Information System (INIS)

    O'Dell, L.D.; Baars, R.E.; Waltar, A.E.

    1975-05-01

    The MELT-III code for analysis of a Transient Overpower (TOP) accident in an LMFBR is briefly described, including failure criteria currently applied in the code. Preliminary results of calculations exploring failure patterns in time and space in the reactor core are reported and compared for the two empirical fuel failure correlations employed in the code. (U.S.)

  4. BACFIRE, Minimal Cut Sets Common Cause Failure Fault Tree Analysis

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1983-01-01

    1 - Description of problem or function: BACFIRE, designed to aid in common cause failure analysis, searches among the basic events of a minimal cut set of the system logic model for common potential causes of failure. A potential cause of failure is called a qualitative failure characteristic. The algorithm searches the qualitative failure characteristics (that are part of the program input) of the basic events contained in a set to find those characteristics common to all basic events. This search is repeated for all cut sets input to the program. Common cause failure analysis is thereby performed without inclusion of secondary failures in the system logic model. By using BACFIRE, a common cause failure analysis can be added to an existing system safety and reliability analysis. 2 - Method of solution: BACFIRE searches the qualitative failure characteristics of the basic events contained in the fault tree minimal cut set to find those characteristics common to all basic events by either of two criteria. The first criterion can be met if all the basic events in a minimal cut set are associated by a condition which alone may increase the probability of multiple component malfunction. The second criterion is met if all the basic events in a minimal cut set are susceptible to the same secondary failure cause and are located in the same domain for that cause of secondary failure. 3 - Restrictions on the complexity of the problem - Maxima of: 1001 secondary failure maps, 101 basic events, 10 cut sets
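
    The first search criterion amounts to intersecting the characteristic sets of the events in each cut set. A minimal sketch, with invented event names and characteristics rather than BACFIRE's actual input format:

```python
# Common-cause screening of minimal cut sets: report the failure
# characteristics shared by every basic event in each cut set.
# All event names and characteristics are hypothetical.
characteristics = {
    "valve_1_fails": {"room_101", "vibration", "humidity"},
    "valve_2_fails": {"room_101", "vibration"},
    "pump_1_fails":  {"room_202", "vibration"},
}
cut_sets = [{"valve_1_fails", "valve_2_fails"},
            {"valve_1_fails", "pump_1_fails"}]

for cs in cut_sets:
    common = set.intersection(*(characteristics[e] for e in cs))
    print(sorted(cs), "-> common cause candidates:", sorted(common) or "none")
```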

  5. Analysis of photon statistics with Silicon Photomultiplier

    International Nuclear Information System (INIS)

    D'Ascenzo, N.; Saveliev, V.; Wang, L.; Xie, Q.

    2015-01-01

    The Silicon Photomultiplier (SiPM) is a novel silicon-based photodetector which represents the modern perspective of low photon flux detection. The aim of this paper is to provide an introduction to the statistical analysis methods needed to understand and quantitatively estimate the features of the SiPM response to a coherent source of light.

  6. Vapor Pressure Data Analysis and Statistics

    Science.gov (United States)

    2016-12-01

    … The A (or a) value is directly related to vapor pressure and will be greater for high vapor pressure materials … where n is the number of data points, Yi is the natural logarithm of the i-th experimental vapor pressure value, and Xi is the … (Vapor Pressure Data Analysis and Statistics, ECBC-TR-1422, Ann Brozena, Research and Technology Directorate)

  7. Statistical analysis of brake squeal noise

    Science.gov (United States)

    Oberst, S.; Lai, J. C. S.

    2011-06-01

    Despite substantial research efforts applied to the prediction of brake squeal noise since the early 20th century, the mechanisms behind its generation are still not fully understood. Squealing brakes are of significant concern to the automobile industry, mainly because of the costs associated with warranty claims. In order to remedy the problems inherent in designing quieter brakes and, therefore, to understand the mechanisms, a design of experiments study, using a noise dynamometer, was performed by a brake system manufacturer to determine the influence of geometrical parameters (namely, the number and location of slots) of brake pads on brake squeal noise. The experimental results were evaluated with a noise index and ranked for warm and cold brake stops. These data are analysed here using statistical descriptors based on population distributions, and a correlation analysis, to gain greater insight into the functional dependency between the time-averaged friction coefficient as the input and the peak sound pressure level data as the output quantity. The correlation analysis between the time-averaged friction coefficient and peak sound pressure data is performed by applying a semblance analysis and a joint recurrence quantification analysis. Linear measures are compared with complexity measures (nonlinear) based on statistics from the underlying joint recurrence plots. Results show that linear measures cannot be used to rank the noise performance of the four test pad configurations. On the other hand, the ranking of the noise performance of the test pad configurations based on the noise index agrees with that based on nonlinear measures: the higher the nonlinearity between the time-averaged friction coefficient and peak sound pressure, the worse the squeal. These results highlight the nonlinear character of brake squeal and indicate the potential of using nonlinear statistical analysis tools to analyse disc brake squeal.

  8. Failure propagation tests and analysis at PNC

    International Nuclear Information System (INIS)

    Tanabe, H.; Miyake, O.; Daigo, Y.; Sato, M.

    1984-01-01

    Failure propagation tests have been conducted using the Large Leak Sodium Water Reaction Test Rig (SWAT-1) and the Steam Generator Safety Test Facility (SWAT-3) at PNC in order to establish the safety design of the LMFBR prototype Monju steam generators. Test objectives are to provide data for selecting a design basis leak (DBL), data on the time history of failure propagations, data on the mechanism of the failures, and data on re-use of tubes in the steam generators that have suffered leaks. Eighteen fundamental tests have been performed in an intermediate leak region using the SWAT-1 test rig, and ten failure propagation tests have been conducted in the region from a small leak to a large leak using the SWAT-3 test facility. From the test results it was concluded that a dominant mechanism was tube wastage, and it took more than one minute until each failure propagation occurred. Also, the total leak rate in full sequence simulation tests including a water dump was far less than that of one double-ended-guillotine (DEG) failure. Using such experimental data, a computer code, LEAP (Leak Enlargement and Propagation), has been developed for the purpose of estimating the possible maximum leak rate due to failure propagation. This paper describes the results of the failure propagation tests and the model structure and validation studies of the LEAP code. (author)

  9. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical …

  10. Early failure analysis of machining centers: a case study

    International Nuclear Information System (INIS)

    Wang Yiqiang; Jia Yazhou; Jiang Weiwei

    2001-01-01

    To eliminate early failures and improve reliability, nine ex-factory machining centers are traced under field conditions in workshops, and their early failure information throughout the ex-factory run-in test is collected. A field early failure database is constructed from the collected field early failure data and their codification. Early failure mode and effects analysis is performed to indicate the weak subsystems of a machining center, i.e. the troublemakers. The distribution of the time between early failures is analyzed, and the optimal ex-factory run-in test time for a machining center, one that sufficiently exposes early failures at minimum cost, is discussed. Suggestions are proposed on how to arrange the ex-factory run-in test and what actions to take to reduce early failures of machining centers.

  11. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.

  12. Statistical Analysis of Zebrafish Locomotor Response.

    Science.gov (United States)

    Liu, Yiwen; Carmer, Robert; Zhang, Gaonan; Venkatraman, Prahatha; Brown, Skye Ashton; Pang, Chi-Pui; Zhang, Mingzhi; Ma, Ping; Leung, Yuk Fai

    2015-01-01

    Zebrafish larvae display rich locomotor behaviour upon external stimulation. The movement can be simultaneously tracked from many larvae arranged in multi-well plates. The resulting time-series locomotor data have been used to reveal new insights into neurobiology and pharmacology. However, the data are of large scale, and the corresponding locomotor behavior is affected by multiple factors. These issues pose a statistical challenge for comparing larval activities. To address this gap, this study has analyzed a visually-driven locomotor behaviour named the visual motor response (VMR) by the Hotelling's T-squared test. This test is congruent with comparing locomotor profiles from a time period. Different wild-type (WT) strains were compared using the test, which shows that they responded differently to light change at different developmental stages. The performance of this test was evaluated by a power analysis, which shows that the test was sensitive for detecting differences between experimental groups with sample numbers that were commonly used in various studies. In addition, this study investigated the effects of various factors that might affect the VMR by multivariate analysis of variance (MANOVA). The results indicate that the larval activity was generally affected by stage, light stimulus, their interaction, and location in the plate. Nonetheless, different factors affected larval activity differently over time, as indicated by a dynamical analysis of the activity at each second. Intriguingly, this analysis also shows that biological and technical repeats had negligible effect on larval activity. This finding is consistent with that from the Hotelling's T-squared test, and suggests that experimental repeats can be combined to enhance statistical power. Together, these investigations have established a statistical framework for analyzing VMR data, a framework that should be generally applicable to other locomotor data with similar structure.
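
    The Hotelling's T-squared test used here is easy to state directly. The sketch below implements the standard two-sample version from its definition (it is not the paper's code) and applies it to synthetic activity profiles.

```python
# Two-sample Hotelling's T-squared test implemented from its definition;
# the activity-profile data below are synthetic stand-ins.
import numpy as np
from scipy import stats

def hotelling_t2(X, Y):
    n1, p = X.shape
    n2 = Y.shape[0]
    d = X.mean(axis=0) - Y.mean(axis=0)
    S = ((n1 - 1) * np.cov(X, rowvar=False) +
         (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)  # pooled cov
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(S, d)
    f = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))          # F statistic
    pval = stats.f.sf(f, p, n1 + n2 - p - 1)
    return t2, pval

rng = np.random.default_rng(5)
strain_a = rng.normal(0.0, 1.0, (24, 5))   # 24 larvae x 5 time bins
strain_b = rng.normal(0.3, 1.0, (24, 5))   # second strain, shifted mean
print(hotelling_t2(strain_a, strain_b))
```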

  13. Importance of competing risks in the analysis of anti-epileptic drug failure

    Directory of Open Access Journals (Sweden)

    Sander Josemir W

    2007-03-01

    Background: Retention time (time to treatment failure) is a commonly used outcome in antiepileptic drug (AED) studies. Methods: Two datasets are used to demonstrate the issues in a competing risks analysis of AEDs. First, data collection and follow-up considerations are discussed with reference to information from 15 monotherapy trials. Recommendations for improved data collection and cumulative incidence analysis are then illustrated using the SANAD trial dataset. The results are compared to the more common approach using standard survival analysis methods. Results: A non-significant difference in overall treatment failure time between gabapentin and topiramate (logrank test statistic = 0.01, 1 degree of freedom, p-value = 0.91) masked highly significant differences in opposite directions, with gabapentin resulting in fewer withdrawals due to side effects (Gray's test statistic = 11.60, 1 degree of freedom, p = 0.0007) but more due to poor seizure control (Gray's test statistic = 14.47, 1 degree of freedom, p-value = 0.0001). The significant difference in overall treatment failure time between lamotrigine and carbamazepine (logrank test statistic = 5.6, 1 degree of freedom, p-value = 0.018) was due entirely to a significant benefit of lamotrigine in terms of side effects (Gray's test statistic = 10.27, 1 degree of freedom, p = 0.001). Conclusion: Treatment failure time can be measured reliably, but care is needed to collect sufficient information on reasons for drug withdrawal to allow a competing risks analysis. Important differences between the profiles of AEDs may be missed unless appropriate statistical methods are used to fully investigate treatment failure time. Cumulative incidence analysis allows comparison of the probability of failure between two AEDs and is likely to be a more powerful approach than logrank analysis for most comparisons of standard and new anti-epileptic drugs.

  14. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an approach as an overall framework for the estimation of the failure probability of pipelines based on: the results of the deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects and failure data (involved into the model via application of Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of integrated failure probability is a combination of several different analyses allowing us to obtain: the critical crack's length and depth, the failure probability of the defected zone thickness, dependency of the failure probability on the age of the natural gas transmission pipeline. A model's uncertainty analysis and uncertainty propagation analysis are performed, as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanic analysis of the pipe with crack. • Stress evaluation of the pipe with critical crack. • Deterministic-probabilistic structural integrity analysis of gas pipeline. • Integrated estimation of pipeline failure probability by Bayesian method.
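
    The Bayesian ingredient can be illustrated with the simplest conjugate choice, a gamma-Poisson update of a failure rate from observed failure counts; the prior and data values below are placeholders, not the Lithuanian network figures, and the paper's actual model need not take this form.

```python
# Conjugate gamma-Poisson update of a pipeline failure rate; prior and
# observation values are placeholders, not the network's figures.
from scipy import stats

a0, b0 = 2.0, 1000.0      # gamma prior: shape, rate (prior mean 2e-3 /km-yr)
failures = 3              # assumed observed failure count
exposure = 2500.0         # assumed km-years of operation

a_post = a0 + failures
b_post = b0 + exposure
posterior = stats.gamma(a_post, scale=1.0 / b_post)
print("posterior mean rate:", posterior.mean())
print("90% credible interval:", posterior.interval(0.90))
```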

  15. Failure analysis of prestressed concrete beam under impact loading

    International Nuclear Information System (INIS)

    Ishikawa, N.; Sonoda, Y.; Kobayashi, N.

    1993-01-01

    This paper presents a failure analysis of prestressed concrete (PC) beam under impact loading. At first, the failure analysis of PC beam section is performed by using the discrete section element method in order to obtain the dynamic bending moment-curvature relation. Secondary, the failure analysis of PC beam is performed by using the rigid panel-spring model. Finally, the numerical calculation is executed and is compared with the experimental results. It is found that this approach can simulate well the experiments at the local and overall failure of the PC beam as well as the impact load and the displacement-time relations. (author)

  16. On the Statistical Validation of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Rosane Riera Freire

    2007-06-01

    Technical analysis, or charting, aims at visually identifying geometrical patterns in price charts in order to anticipate price "trends". In this paper we revisit the issue of technical analysis validation, which has been tackled in the literature without taking care of (i) the presence of heterogeneity and (ii) statistical dependence in the analyzed data - various agglutinated return time series from distinct financial securities. The main purpose here is to address the first cited problem by suggesting a validation methodology that also "homogenizes" the securities according to the finite-dimensional probability distribution of their return series. The general steps go through the identification of the stochastic processes for the securities' returns, the clustering of similar securities and, finally, the identification of the presence, or absence, of informational content obtained from those price patterns. We illustrate the proposed methodology with a real-data exercise including several securities of the global market. Our investigation shows that there is statistically significant informational content in two out of three common patterns usually found through technical analysis, namely: triangle, rectangle and head-and-shoulders.

  17. Statistical trend analysis methods for temporal phenomena

    International Nuclear Information System (INIS)

    Lehtinen, E.; Pulkkinen, U.; Poern, K.

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods
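
    As an illustration of the non-parametric option, the Mann (Mann-Kendall) trend statistic can be computed from first principles; the sketch below uses the no-ties variance approximation and invented yearly failure counts.

```python
# Mann-Kendall trend test from first principles (no-ties variance
# approximation); the yearly failure counts are illustrative only.
import numpy as np
from scipy import stats

def mann_kendall(x):
    x = np.asarray(x)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0      # variance ignoring ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, 2.0 * stats.norm.sf(abs(z))          # two-sided p-value

yearly_failure_counts = [5, 4, 6, 3, 4, 2, 3, 1, 2, 1]
print(mann_kendall(yearly_failure_counts))         # negative S: downward trend
```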

  18. Statistical trend analysis methods for temporal phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Lehtinen, E.; Pulkkinen, U. [VTT Automation, (Finland); Poern, K. [Poern Consulting, Nykoeping (Sweden)

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods. 14 refs, 10 figs.

  19. Structures for common-cause failure analysis

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1981-01-01

    Common-cause failure methodology and terminology have been reviewed and structured to provide a systematical basis for addressing and developing models and methods for quantification. The structure is based on (1) a specific set of definitions, (2) categories based on the way faults are attributable to a common cause, and (3) classes based on the time of entry and the time of elimination of the faults. The failure events are then characterized by their likelihood or frequency and the average residence time. The structure provides a basis for selecting computational models, collecting and evaluating data and assessing the importance of various failure types, and for developing effective defences against common-cause failure. The relationships of this and several other structures are described

  20. Statistical analysis of field data for aircraft warranties

    Science.gov (United States)

    Lakey, Mary J.

    Air Force and Navy maintenance data collection systems were researched to determine their scientific applicability to the warranty process. New and unique algorithms were developed to extract failure distributions, which were then used to characterize how selected families of equipment typically fail. Families of similar equipment were identified in terms of function, technology and failure patterns. Statistical analyses and applications such as goodness-of-fit tests, maximum likelihood estimation and derivation of confidence intervals for the probability density function parameters were applied to characterize the distributions and their failure patterns. Statistical and reliability theory, with relevance to equipment design and operational failures, were also determining factors in characterizing the failure patterns of the equipment families. Inferences about the families with relevance to warranty needs were then made.
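
    The estimation steps mentioned (maximum likelihood and confidence intervals) take a particularly simple form for an exponential failure model. A sketch with made-up field data, using the standard chi-square interval for time-truncated testing:

```python
# MLE and chi-square confidence interval for an exponential failure rate;
# the unit-hours and failure count are made-up field data.
from scipy import stats

failures = 12
unit_hours = 3.4e6
lam_mle = failures / unit_hours                 # MLE of the failure rate

alpha = 0.10                                    # 90% two-sided interval
lam_lo = stats.chi2.ppf(alpha / 2, 2 * failures) / (2 * unit_hours)
lam_hi = stats.chi2.ppf(1 - alpha / 2, 2 * failures + 2) / (2 * unit_hours)
print(f"lambda = {lam_mle:.3e}/h, 90% CI: ({lam_lo:.3e}, {lam_hi:.3e})")
```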

  1. Statistical analysis of solar proton events

    Directory of Open Access Journals (Sweden)

    V. Kurt

    2004-06-01

    A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm²·s·sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. It is outlined that 231 of these proton events are flare related and only 22 of them are not associated with Hα flares. It is also noteworthy that 42 of these events are registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with west flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  2. Recent advances in statistical energy analysis

    Science.gov (United States)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a function of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. Conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  3. STATISTICS, Program System for Statistical Analysis of Experimental Data

    International Nuclear Information System (INIS)

    Helmreich, F.

    1991-01-01

    1 - Description of problem or function: The package is composed of 83 routines, the most important of which are the following: BINDTR: Binomial distribution; HYPDTR: Hypergeometric distribution; POIDTR: Poisson distribution; GAMDTR: Gamma distribution; BETADTR: Beta-1 and Beta-2 distributions; NORDTR: Normal distribution; CHIDTR: Chi-square distribution; STUDTR : Distribution of 'Student's T'; FISDTR: Distribution of F; EXPDTR: Exponential distribution; WEIDTR: Weibull distribution; FRAKTIL: Calculation of the fractiles of the normal, chi-square, Student's, and F distributions; VARVGL: Test for equality of variance for several sample observations; ANPAST: Kolmogorov-Smirnov test and chi-square test of goodness of fit; MULIRE: Multiple linear regression analysis for a dependent variable and a set of independent variables; STPRG: Performs a stepwise multiple linear regression analysis for a dependent variable and a set of independent variables. At each step, the variable entered into the regression equation is the one which has the greatest amount of variance between it and the dependent variable. Any independent variable can be forced into or deleted from the regression equation, irrespective of its contribution to the equation. LTEST: Tests the hypotheses of linearity of the data. SPRANK: Calculates the Spearman rank correlation coefficient. 2 - Method of solution: VARVGL: The Bartlett's Test, the Cochran's Test and the Hartley's Test are performed in the program. MULIRE: The Gauss-Jordan method is used in the solution of the normal equations. STPRG: The abbreviated Doolittle method is used to (1) determine variables to enter into the regression, and (2) complete regression coefficient calculation. 3 - Restrictions on the complexity of the problem: VARVGL: The Hartley's Test is only performed if the sample observations are all of the same size

  4. Failure mode and effects analysis: an empirical comparison of failure mode scoring procedures.

    Science.gov (United States)

    Ashley, Laura; Armitage, Gerry

    2010-12-01

    To empirically compare 2 different commonly used failure mode and effects analysis (FMEA) scoring procedures with respect to their resultant failure mode scores and prioritization: a mathematical procedure, where scores are assigned independently by FMEA team members and averaged, and a consensus procedure, where scores are agreed on by the FMEA team via discussion. A multidisciplinary team undertook a Healthcare FMEA of chemotherapy administration. This included mapping the chemotherapy process, identifying and scoring failure modes (potential errors) for each process step, and generating remedial strategies to counteract them. Failure modes were scored using both an independent mathematical procedure and a team consensus procedure. Almost three-fifths of the 30 failure modes generated were scored differently by the 2 procedures, and for just more than one-third of cases, the score discrepancy was substantial. Using the Healthcare FMEA prioritization cutoff score, almost twice as many failure modes were prioritized by the consensus procedure than by the mathematical procedure. This is the first study to empirically demonstrate that different FMEA scoring procedures can score and prioritize failure modes differently. It found considerable variability in individual team members' opinions on scores, which highlights the subjective and qualitative nature of failure mode scoring. A consensus scoring procedure may be most appropriate for FMEA as it allows variability in individuals' scores and rationales to become apparent and to be discussed and resolved by the team. It may also yield team learning and communication benefits unlikely to result from a mathematical procedure.
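
    The arithmetic behind the two procedures is easy to demonstrate on one failure mode. The toy comparison below uses risk priority numbers (severity × occurrence × detectability) with invented scores; the study's actual HFMEA scoring scale may differ.

```python
# Toy comparison of FMEA scoring procedures for one failure mode.
# Mathematical: average the raters' individual RPNs; consensus: one
# agreed score per dimension. All numbers are invented.
import numpy as np

# raters x (severity, occurrence, detectability)
raters = np.array([[8, 3, 4],
                   [6, 4, 4],
                   [9, 2, 5],
                   [7, 3, 3]])

mathematical_rpn = np.prod(raters, axis=1).mean()  # mean of individual RPNs
consensus_scores = np.array([8, 3, 5])             # agreed after discussion
consensus_rpn = np.prod(consensus_scores)

print(f"mathematical: {mathematical_rpn:.1f}, consensus: {consensus_rpn}")
```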

  5. Statistical analysis of tourism destination competitiveness

    Directory of Open Access Journals (Sweden)

    Attilio Gardini

    2013-05-01

    The growing relevance of the tourism industry for modern advanced economies has increased the interest among researchers and policy makers in the statistical analysis of destination competitiveness. In this paper we outline a new model of destination competitiveness based on sound theoretical grounds and we develop a statistical test of the model on sample data based on Italian tourist destination decisions and choices. Our model focuses on the tourism decision process, which starts from the demand schedule for holidays and ends with the choice of a specific holiday destination. The demand schedule is a function of individual preferences and of destination positioning, while the final decision is a function of the initial demand schedule and the information concerning services for accommodation and recreation in the selected destinations. Moreover, we extend previous studies that focused on image or attributes (such as climate and scenery) by paying more attention to the services for accommodation and recreation in the holiday destinations. We test the proposed model using empirical data collected from a sample of 1,200 Italian tourists interviewed in 2007 (October-December). Data analysis shows that the selection probability for a destination included in the consideration set is not proportional to its share of inclusion, because the share of inclusion is determined by the brand image, while the selection of the effective holiday destination is influenced by the real supply conditions. The analysis of Italian tourists' preferences underlines the existence of a latent demand for foreign holidays, which points to a risk of market-share reduction for the Italian tourism system in the global market. We also find a snowball effect which helps the most popular destinations, mainly in the northern Italian regions.

  6. Competing failure analysis in phased-mission systems with functional dependence in one of phases

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Levitin, Gregory

    2012-01-01

    This paper proposes an algorithm for the reliability analysis of non-repairable phased-mission systems (PMS) subject to competing failure propagation and isolation effects. A failure originating from a system component which causes extensive damage to other system components is a propagated failure. When the propagated failure affects all the system components, causing the entire system failure, a propagated failure with global effect (PFGE) is said to occur. However, the failure propagation can be isolated in systems subject to functional dependence (FDEP) behavior, where the failure of a component (referred to as trigger component) causes some other components (referred to as dependent components) to become inaccessible or unusable (isolated from the system), and thus further failures from these dependent components have no effect on the system failure behavior. On the other hand, if any PFGE from dependent components occurs before the trigger failure, the failure propagation effect takes place, causing the overall system failure. In summary, there are two distinct consequences of a PFGE due to the competition between the failure isolation and failure propagation effects in the time domain. Existing works on such competing failures focus only on single-phase systems. However, many real-world systems are phased-mission systems (PMS), which involve multiple, consecutive and non-overlapping phases of operations or tasks. Consideration of competing failures for PMS is a challenging and difficult task because PMS exhibit dynamics in the system configuration and component behavior as well as statistical dependencies across phases for a given component. This paper proposes a combinatorial method to address the competing failure effects in the reliability analysis of binary non-repairable PMS. The proposed method is verified using a Markov-based method through a numerical example. Different from the Markov-based approach that is limited to exponential distribution, the
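
    The time-domain competition at the heart of this analysis, whether the trigger fails before any PFGE from its dependent components, can be sketched with a small simulation; the exponential distributions and rates below are illustrative assumptions, not the paper's method.

```python
# Does failure isolation (trigger fails first) beat failure propagation
# (some PFGE occurs first)? Illustrative exponential assumptions only.
import numpy as np

rng = np.random.default_rng(6)
n = 500_000
t_trigger = rng.exponential(1000.0, n)                 # trigger failure time
t_pfge = rng.exponential(5000.0, (n, 3)).min(axis=1)   # earliest PFGE among
                                                       # 3 dependent components
print("P(isolation wins) ~", np.mean(t_trigger < t_pfge))
# Analytic check for exponentials: lam_trigger / (lam_trigger + 3*lam_pfge)
print("analytic:", (1 / 1000) / (1 / 1000 + 3 / 5000))
```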

  7. Failure mode analysis using state variables derived from fault trees with application

    International Nuclear Information System (INIS)

    Bartholomew, R.J.

    1982-01-01

    Fault Tree Analysis (FTA) is used extensively to assess both the qualitative and quantitative reliability of engineered nuclear power systems employing many subsystems and components. FTA is very useful, but the method is limited by its inability to account for failure mode rate-of-change interdependencies (coupling) of statistically independent failure modes. The state variable approach (using FTA-derived failure modes as states) overcomes these difficulties and is applied to the determination of the lifetime distribution function for a heat pipe-thermoelectric nuclear power subsystem. Analyses are made using both Monte Carlo and deterministic methods and compared with a Markov model of the same subsystem

  8. Multivariate statistical analysis of wildfires in Portugal

    Science.gov (United States)

    Costa, Ricardo; Caramelo, Liliana; Pereira, Mário

    2013-04-01

    Several studies demonstrate that wildfires in Portugal present high temporal and spatial variability as well as cluster behavior (Pereira et al., 2005, 2011). This study aims to contribute to the characterization of the fire regime in Portugal with the multivariate statistical analysis of the time series of number of fires and area burned in Portugal during the 1980 - 2009 period. The data used in the analysis is an extended version of the Rural Fire Portuguese Database (PRFD) (Pereira et al, 2011), provided by the National Forest Authority (Autoridade Florestal Nacional, AFN), the Portuguese Forest Service, which includes information for more than 500,000 fire records. There are many multiple advanced techniques for examining the relationships among multiple time series at the same time (e.g., canonical correlation analysis, principal components analysis, factor analysis, path analysis, multiple analyses of variance, clustering systems). This study compares and discusses the results obtained with these different techniques. Pereira, M.G., Trigo, R.M., DaCamara, C.C., Pereira, J.M.C., Leite, S.M., 2005: "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology. 129, 11-25. Pereira, M. G., Malamud, B. D., Trigo, R. M., and Alves, P. I.: The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, doi:10.5194/nhess-11-3343-2011, 2011 This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692, the project FLAIR (PTDC/AAC-AMB/104702/2008) and the EU 7th Framework Program through FUME (contract number 243888).

  9. Failure analysis of stainless steel femur fixation plate.

    Science.gov (United States)

    Hussain, P B; Mohammad, M

    2004-05-01

    A failure analysis was performed to investigate the failure of a femur fixation plate that had previously been fixed to the femur of a girl. Radiography, metallography, fractography and mechanical testing were conducted in this study. The results show that the failure was due to the formation of notches on the femur plate. These notches acted as stress raisers from which cracks started to propagate; fracture finally occurred and the plate failed.

  10. Nuclear reactor component populations, reliability data bases, and their relationship to failure rate estimation and uncertainty analysis

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.

    1981-12-01

    Probabilistic risk analyses are used to assess the risks inherent in the operation of existing and proposed nuclear power reactors. In performing such risk analyses the failure rates of various components which are used in a variety of reactor systems must be estimated. These failure rate estimates serve as input to fault trees and event trees used in the analyses. Component failure rate estimation is often based on relevant field failure data from different reliability data sources such as LERs, NPRDS, and the In-Plant Data Program. Various statistical data analysis and estimation methods have been proposed over the years to provide the required estimates of the component failure rates. This report discusses the basis and extent to which statistical methods can be used to obtain component failure rate estimates. The report is expository in nature and focuses on the general philosophical basis for such statistical methods. Various terms and concepts are defined and illustrated by means of numerous simple examples

  11. VALIDATION OF SPRING OPERATED PRESSURE RELIEF VALVE TIME TO FAILURE AND THE IMPORTANCE OF STATISTICALLY SUPPORTED MAINTENANCE INTERVALS

    Energy Technology Data Exchange (ETDEWEB)

    Gross, R; Stephen Harris, S

    2009-02-18

    The Savannah River Site operates a Relief Valve Repair Shop certified by the National Board of Pressure Vessel Inspectors to NB-23, The National Board Inspection Code. Local maintenance forces perform inspection, testing, and repair of approximately 1200 spring-operated relief valves (SORV) each year as the valves are cycled in from the field. The Site now has over 7000 certified test records in the Computerized Maintenance Management System (CMMS); a summary of that data is presented in this paper. In previous papers, several statistical techniques were used to investigate failure on demand and failure rates including a quantal response method for predicting the failure probability as a function of time in service. The non-conservative failure mode for SORV is commonly termed 'stuck shut', defined by industry as the valve opening at greater than or equal to 1.5 times the cold set pressure. Actual time to failure is typically not known, only that failure occurred some time since the last proof test (censored data). This paper attempts to validate the assumptions underlying the statistical lifetime prediction results using Monte Carlo simulation. It employs an aging model for lift pressure as a function of set pressure, valve manufacturer, and a time-related aging effect. This paper attempts to answer two questions: (1) what is the predicted failure rate over the chosen maintenance/inspection interval; and (2) do we understand aging sufficiently well to estimate risk when basing proof test intervals on proof test results?

  12. A statistical analysis of electrical cerebral activity

    International Nuclear Information System (INIS)

    Bassant, Marie-Helene

    1971-01-01

    The aim of this work was to study the statistical properties of the amplitude of the electroencephalographic signal. The experimental method is described (implantation of electrodes, acquisition and treatment of data). The program of the mathematical analysis is given (calculation of probability density functions, study of stationarity) and the validity of the tests discussed. The results concerned ten rabbits. Strips of EEG were sampled during 40 s at very short intervals (500 μs). The probability density functions established for different brain structures (especially the dorsal hippocampus) and areas were compared during sleep, arousal and visual stimulus. Using a χ² test, it was found that the Gaussian distribution assumption was rejected in 96.7 per cent of the cases. For a given physiological state, there was no mathematical reason to reject the assumption of stationarity (in 96 per cent of the cases). (author) [fr]

  13. Statistical analysis of ultrasonic measurements in concrete

    Science.gov (United States)

    Chiang, Chih-Hung; Chen, Po-Chih

    2002-05-01

    Stress wave techniques such as measurements of ultrasonic pulse velocity are often used to evaluate concrete quality in structures. For proper interpretation of measurement results, the dependence of pulse transit time on the average acoustic impedance and the material homogeneity along the sound path need to be examined. Semi-direct measurement of pulse velocity can be more convenient than through-transmission measurement, since it is not necessary to access both sides of concrete floors or walls. A novel measurement scheme is proposed and verified based on statistical analysis. It is shown that semi-direct measurements are very effective for gathering a large amount of pulse velocity data from concrete reference specimens. The variability of the measurements is comparable with that reported by the American Concrete Institute using either break-off or pullout tests.

  14. Failure analysis of tubes with wastages

    International Nuclear Information System (INIS)

    Prachuktam, S.; Reich, M.; Rajan, J.

    1979-01-01

    A finite element method for large strain calculation using the constitutive relation due to Hill was developed. This constitutive relation links the co-rotational rate of the Kirchhoff stress to the deformation rate tensor, which leads to a symmetric structural stiffness. The method is used to calculate failure pressures of degraded tubes.

  15. Numerical Analysis of Solids at Failure

    Science.gov (United States)

    2011-08-20

    …vector fields, leading to singular distributions of the fluid contents (fluid accumulation and drainage along the failure surface)… (Final Report, FA9550-08-1-0410; Civil and Environmental Engineering, University of Illinois at Urbana-Champaign.)

  16. Failure Analysis of Storage Data Magnetic Systems

    Directory of Open Access Journals (Sweden)

    Ortiz–Prado A.

    2010-10-01

    Full Text Available This paper presents conclusions about corrosion mechanisms in magnetic data storage systems (hard disks), drawn from the inspection of 198 units that had been in service in nine climatic regions characteristic of Mexico. The results make it possible to identify trends in the failure modes and the factors that affect them. In turn, the study analyzed both the causes that led to mechanical failure and those due to deterioration by atmospheric corrosion. On the basis of the field sampling results, it is demonstrated that hard disk failure is fundamentally mechanical. Deterioration by environmental effects was found in read-write heads, integrated circuits, printed circuit boards and in some of the electronic components of the device's controller card, but not on the magnetic storage surfaces. Therefore, corrosion of the disk surface can be discarded as the main kind of failure due to environmental deterioration. To avoid problems in the magnetic data storage system it is necessary to ensure sealing of the system.

  17. Lecture notes: meantime to failure analysis

    International Nuclear Information System (INIS)

    Hanlen, R.C.

    1976-01-01

    A method is presented which strengthens the Quality Assurance Engineer's place in management decision making by providing a working parameter on which to base sound engineering and management decisions. The theory used in reliability engineering to determine the mean time to failure of a component or system is reviewed. The method presented derives the probability density function for the parameter of the exponential distribution. The exponential distribution is commonly used by industry to determine the reliability of a component or system when the failure rate is assumed to be constant. Some examples using N Reactor performance data are given. Specifically: the ball system data, with 4.9 × 10⁶ unit hours of service and 7 individual failures, indicate a demonstrated 98.8 percent reliability at a 95 percent confidence level for a 12-month mission period, and the diesel starts data, with 7.2 × 10⁵ unit hours of service and 1 failure, indicate a demonstrated 94.4 percent reliability at a 95 percent confidence level for a 12-month mission period.
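
    Demonstrated-reliability figures of this kind follow from standard exponential-model confidence bounds. A generic sketch is shown below, using the common chi-square relation for time-truncated data; the lecture notes' exact convention may differ, so the printed values need not match the quoted percentages.

```python
import math
from scipy.stats import chi2

def demonstrated_reliability(unit_hours, failures, mission_hours,
                             confidence=0.95):
    """Upper confidence bound on a constant failure rate and the
    corresponding demonstrated mission reliability.  Uses the common
    relation lambda_U = chi2(conf, 2r + 2) / (2T) for time-truncated
    data; treat this as a generic sketch, not the notes' derivation."""
    lam_upper = chi2.ppf(confidence, 2 * failures + 2) / (2 * unit_hours)
    return math.exp(-lam_upper * mission_hours)

# N Reactor style examples (service figures from the abstract; 12-month
# mission taken as 8760 hours):
print(demonstrated_reliability(4.9e6, 7, 8760))  # ball system
print(demonstrated_reliability(7.2e5, 1, 8760))  # diesel starts
```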

  18. Reliability analysis based on the losses from failures.

    Science.gov (United States)

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the...
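
    The linear-combination property of the expected loss given failure is easy to state in code. The failure modes, conditional probabilities and loss figures below are invented for illustration, not taken from the paper.

```python
# Expected loss given failure over mutually exclusive failure modes:
# E[loss | failure] = sum_k P(mode k initiates failure | failure) * E[loss_k].
# All numbers here are illustrative placeholders.
modes = {
    # mode:             (conditional probability, expected loss if it occurs)
    "seal leak":        (0.60,  20_000.0),
    "bearing seizure":  (0.30, 120_000.0),
    "shaft fracture":   (0.10, 450_000.0),
}

expected_loss_given_failure = sum(p * loss for p, loss in modes.values())
print(f"E[loss | failure] = {expected_loss_given_failure:,.0f}")
```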

  19. An analysis of human maintenance failures of a nuclear power plant

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-01-01

    In this report, a study of faults caused by maintenance activities is presented. The objective of the study was to draw conclusions on the unplanned effects of maintenance on nuclear power plant safety and system availability. More than 4400 maintenance history reports from the years 1992-1994 of the Olkiluoto BWR nuclear power plant (NPP) were analysed together with the maintenance personnel. The human-action-induced faults were classified, e.g., according to their multiplicity and effects. This paper presents and discusses the results of a statistical analysis of the data. Instrumentation and electrical components appeared to be especially prone to human failures. Many human failures were found in safety-related systems. Several failures also remained latent from outages to power operation. However, the safety significance of the failures was generally small. Modifications were an important source of multiple human failures. Plant maintenance data are a good source of human reliability data and should be used more in the future. (orig.)

  20. Comparison of stress-based and strain-based creep failure criteria for severe accident analysis

    International Nuclear Information System (INIS)

    Chavez, S.A.; Kelly, D.L.; Witt, R.J.; Stirn, D.P.

    1995-01-01

    We conducted a parametric analysis of stress-based and strain-based creep failure criteria to determine whether there is a significant difference between the two criteria for SA533B vessel steel under severe accident conditions. Parametric variables included debris composition, system pressure, and creep strain histories derived from different testing programs and mathematically fit, with and without tertiary creep. Results indicate significant differences between the two criteria. The stress gradient plays an important role in determining which criterion will predict failure first. Creep failure was not very sensitive to the different creep strain histories, except near the transition temperature of the vessel steel (900 K to 1000 K). Statistical analyses of creep failure data from four independent sources indicate that these data may be pooled, with a spline point at 1000 K. We found the Manson-Haferd parameter to have better failure-predictive capability than the Larson-Miller parameter for the data studied. (orig.)
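
    For reference, the two time-temperature parameters compared in the paper have simple closed forms. The sketch below uses the conventional Larson-Miller constant C = 20 and placeholder Manson-Haferd convergence-point constants; none of these values are taken from the paper's SA533B fits.

```python
import numpy as np

def larson_miller(T_kelvin, t_rupture_h, C=20.0):
    """Larson-Miller parameter, P = T (C + log10 t_r); C = 20 is the
    conventional default, not a value fitted in this paper."""
    return T_kelvin * (C + np.log10(t_rupture_h))

def manson_haferd(T_kelvin, t_rupture_h, T_a=311.0, log_t_a=18.0):
    """Manson-Haferd parameter, P = (log10 t_r - log10 t_a) / (T - T_a).
    The convergence-point constants (T_a, log_t_a) are material fits;
    the values here are placeholders."""
    return (np.log10(t_rupture_h) - log_t_a) / (T_kelvin - T_a)

T = np.array([900.0, 950.0, 1000.0])   # K, near the SA533B transition range
t = np.array([2.0e4, 5.0e3, 8.0e2])    # h, illustrative rupture lives
print(larson_miller(T, t))
print(manson_haferd(T, t))
```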

  1. Statistical Analysis of Data for Timber Strengths

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Hoffmeyer, P.

    Statistical analyses are performed for material strength parameters from approximately 6700 specimens of structural timber. Non-parametric statistical analyses and fits to the following distribution types have been investigated: Normal, Lognormal, 2-parameter Weibull and 3-parameter Weibull...
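
    A sketch of such a fitting exercise with scipy is shown below, using simulated stand-in data for the roughly 6700 timber strength values; the log-likelihoods give a crude basis for comparing the four candidate distributions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
strength = rng.weibull(4.0, size=6700) * 40.0 + 5.0  # toy stand-in, MPa

candidates = {
    "normal":          (stats.norm,        {}),
    "lognormal":       (stats.lognorm,     {}),
    "2-param Weibull": (stats.weibull_min, {"floc": 0.0}),  # location fixed at 0
    "3-param Weibull": (stats.weibull_min, {}),             # location free
}

for name, (dist, fixed) in candidates.items():
    params = dist.fit(strength, **fixed)          # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(strength, *params))
    print(f"{name:16s} log-likelihood = {loglik:.1f}")
```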

  2. Computer aided approach to qualitative and quantitative common cause failure analysis for complex systems

    International Nuclear Information System (INIS)

    Cate, C.L.; Wagner, D.P.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complete system reliability analysis. Existing methods of computer aided common cause failure analysis are extended by allowing analysis of the complex systems often encountered in practice. The methods aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  3. Role of scanning electron microscope (SEM) in metal failure analysis

    International Nuclear Information System (INIS)

    Shaiful Rizam Shamsudin; Hafizal Yazid; Mohd Harun; Siti Selina Abd Hamid; Nadira Kamarudin; Zaiton Selamat; Mohd Shariff Sattar; Muhamad Jalil

    2005-01-01

    Scanning electron microscope (SEM) is a scientific instrument that uses a beam of highly energetic electrons to examine the surface and phase distribution of specimens on a micro scale through live imaging of secondary electron (SE) and back-scattered electron (BSE) images. One of the main activities of the SEM Laboratory at MINT is failure analysis of metal parts and components. The capability of SEM is excellent for determining the root cause of metal failures such as ductile or brittle failure, stress corrosion, fatigue and other types of failure. Most of our customers requesting failure analysis are from local petrochemical plants, manufacturers of automotive components, pipeline maintenance personnel and engineers involved in the development of metal parts and components. This paper discusses some of the technical concepts of failure analysis associated with SEM. (Author)

  4. On two methods of statistical image analysis

    NARCIS (Netherlands)

    Missimer, J; Knorr, U; Maguire, RP; Herzog, H; Seitz, RJ; Tellman, L; Leenders, K.L.

    1999-01-01

    The computerized brain atlas (CBA) and statistical parametric mapping (SPM) are two procedures for voxel-based statistical evaluation of PET activation studies. Each includes spatial standardization of image volumes, computation of a statistic, and evaluation of its significance. In addition,

  5. X-framework: Space system failure analysis framework

    Science.gov (United States)

    Newman, John Steven

    Space program and space systems failures result in financial losses in the multi-hundred-million-dollar range every year. In addition to financial loss, space system failures may also represent the loss of opportunity, loss of critical scientific, commercial and/or national defense capabilities, as well as loss of public confidence. The need exists to improve learning and expand the scope of lessons documented and offered to the space industry project team. One of the barriers to incorporating lessons learned is the way in which space system failures are documented. Multiple classes of space system failure information are identified, ranging from "sound bite" summaries in space insurance compendia, to articles in journals, lengthy data-oriented (what happened) reports, and in some rare cases, reports that treat not only the what, but also the why. In addition there are periodically published "corporate crisis" reports, typically issued after multiple or highly visible failures, that explore management roles in the failure, often within a politically oriented context. Given the general lack of consistency, it is clear that a good multi-level space system/program failure framework with analytical and predictive capability is needed. This research effort set out to develop such a model. The X-Framework (x-fw) is proposed as an innovative forensic failure analysis approach, providing a multi-level understanding of the space system failure event, beginning with the proximate cause, extending to the directly related work or operational processes, and upward through successive management layers. The x-fw focus is on capability and control at the process level and examines: (1) management accountability and control, (2) resource and requirement allocation, and (3) planning, analysis, and risk management at each level of management. The x-fw model provides an innovative failure analysis approach for acquiring a multi-level perspective, direct and indirect causation of...

  6. A quantitative method for Failure Mode and Effects Analysis

    NARCIS (Netherlands)

    Braaksma, Anne Johannes Jan; Meesters, A.J.; Klingenberg, W.; Hicks, C.

    2012-01-01

    Failure Mode and Effects Analysis (FMEA) is commonly used for designing maintenance routines by analysing potential failures, predicting their effect and facilitating preventive action. It is used to make decisions on operational and capital expenditure. The literature has reported that despite its

  7. Application of descriptive statistics in analysis of experimental data

    OpenAIRE

    Mirilović Milorad; Pejin Ivana

    2008-01-01

    Statistics today represent a group of scientific methods for the quantitative and qualitative investigation of variations in mass appearances. In fact, statistics present a group of methods that are used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...

  8. Analysis of dependent failures in the ORNL precursor study

    International Nuclear Information System (INIS)

    Ballard, G.M.

    1985-01-01

    The study of dependent failures (or common cause/mode failures) in the safety assessment of potentially hazardous plant is one of the significant areas of uncertainty in performing probabilistic safety studies. One major reason for this uncertainty is that data on dependent failures is apparently not readily available in sufficient quantity to assist in the development and validation of models. The incident reports that were compiled for the ORNL study on Precursors to Severe Core Damage Accidents (NUREG/CR-2497) provide an opportunity to look at the importance of dependent failures in the most significant incidents of recent reactor operations, to look at the success of probabilistic risk assessment (PRA) methods in accounting for the contribution of dependent failures, and to look at the dependent failure incidents with the aim of identifying the most significant problem areas. In this paper an analysis has been made of the incidents compiled in NUREG/CR-2497 and events involving multiple failures which were not independent have been identified. From this analysis it is clear that dependent failures are a very significant contributor to the precursor incidents. The method of enumeration of accident frequency used in NUREG-2497 can be shown to take account of dependent failures and this may be a significant factor contributing to the apparent difference between the precursor accident frequency and typical PRA frequencies

  9. Statistical analysis in MSW collection performance assessment.

    Science.gov (United States)

    Teixeira, Carlos Afonso; Avelino, Catarina; Ferreira, Fátima; Bentes, Isabel

    2014-09-01

    The increase in Municipal Solid Waste (MSW) generated over recent years forces waste managers to pursue more effective collection schemes that are technically viable, environmentally effective and economically sustainable. The assessment of MSW services using performance indicators plays a crucial role in improving service quality. In this work, we focus on the relevance of regular system monitoring as a service assessment tool. In particular, we select and test a core set of MSW collection performance indicators (effective collection distance, effective collection time and effective fuel consumption) that highlights collection system strengths and weaknesses and supports pro-active management decision-making and strategic planning. A statistical analysis was conducted with data collected in the mixed collection system of Oporto Municipality, Portugal, during one year, one week per month. This analysis provides an operational assessment of the collection circuits and supports effective short-term municipal collection strategies at the level of, e.g., collection frequency, timetables, and type of containers. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Improved methods for dependent failure analysis in PSA

    International Nuclear Information System (INIS)

    Ballard, G.M.; Games, A.M.

    1988-01-01

    The basic design principle used in ensuring the safe operation of nuclear power plant is defence in depth. This normally takes the form of redundant equipment and systems which provide protection even if a number of equipment failures occur. Such redundancy is particularly effective in ensuring that multiple, independent equipment failures with the potential for jeopardising reactor safety will be rare events. However, the achievement of high reliability has served to highlight the potentially dominant role of multiple, dependent failures of equipment and systems. Analysis of reactor operating experience has shown that dependent failure events are the major contributors to safety system failures and reactor incidents and accidents. In parallel, PSA studies have shown that the results of a safety analysis are sensitive to assumptions made about the dependent failure (CCF) probability for safety systems. Thus a Westinghouse analysis showed that increasing system dependent failure probabilities by a factor of 5 led to a factor of 4 increase in core... This paper particularly addresses the engineering concepts underlying dependent failure assessment, touching briefly on aspects of data. It is specifically not the intent of our work to develop a new mathematical model of CCF but to aid the use of existing models.

  11. Transit safety & security statistics & analysis 2002 annual report (formerly SAMIS)

    Science.gov (United States)

    2004-12-01

    The Transit Safety & Security Statistics & Analysis 2002 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  12. Transit safety & security statistics & analysis 2003 annual report (formerly SAMIS)

    Science.gov (United States)

    2005-12-01

    The Transit Safety & Security Statistics & Analysis 2003 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  13. Statistical analysis of long term spatial and temporal trends of ...

    Indian Academy of Sciences (India)

    Statistical analysis of long term spatial and temporal trends of temperature ... CGCM3; HadCM3; modified Mann–Kendall test; statistical analysis; Sutlej basin. ... Water Resources Systems Division, National Institute of Hydrology, Roorkee 247 ...

  14. Failure analysis of a repairable system: The case study of a cam-driven reciprocating pump

    Science.gov (United States)

    Dudenhoeffer, Donald D.

    1994-09-01

    This thesis supplies a statistical and economic tool for analysis of the failure characteristics of one typical piece of equipment under evaluation: a cam-driven reciprocating pump used in the submarine's distillation system. Comprehensive statistical techniques and parametric modeling are employed to identify and quantify pump failure characteristics. Specific areas of attention include: the derivation of an optimal maximum replacement interval based on costs, an evaluation of the mission reliability for the pump as a function of pump age, and a calculation of the expected times between failures. The purpose of this analysis is to evaluate current maintenance practices of time-based replacement and examine the consequences of different replacement intervals in terms of costs and mission reliability. Tradeoffs exist between cost savings and system reliability that must be fully understood prior to making any policy decisions.
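
    The optimal maximum replacement interval mentioned above is usually obtained from the classic age-replacement cost-rate model: minimize the expected cost per cycle divided by the expected cycle length. The sketch below uses a hypothetical Weibull life distribution and relative costs, not the thesis' pump data.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

# Age-replacement sketch: Weibull life with shape > 1 means wear-out, so
# preventive replacement can pay off.  All parameters are hypothetical.
life = stats.weibull_min(c=2.5, scale=8000.0)   # pump life in hours (assumed)
c_preventive, c_failure = 1.0, 10.0             # relative replacement costs

def cost_rate(T):
    """Long-run cost per hour for replacement at age T or at failure."""
    expected_cycle_cost = c_preventive * life.sf(T) + c_failure * life.cdf(T)
    expected_cycle_length, _ = quad(life.sf, 0.0, T)
    return expected_cycle_cost / expected_cycle_length

res = minimize_scalar(cost_rate, bounds=(100.0, 30_000.0), method="bounded")
print(f"optimal replacement interval ~ {res.x:,.0f} h "
      f"(cost rate {res.fun:.2e} per h)")
```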

  15. Overview and statistical failure analyses of the electrical insulation system for the SSC long dipole magnets from an industrialization point of view

    International Nuclear Information System (INIS)

    Roach, J.F.

    1992-01-01

    The electrical insulation system of the SSC long dipole magnets is reviewed and potential dielectric failure modes are discussed. Electrical insulation fabrication and assembly issues with respect to rate-production manufacturability are addressed. The automation required for rate assembly of electrical insulation components will require critical online visual and dielectric screening tests to ensure production quality. Storage and assembly areas must be designed to prevent foreign particles from becoming entrapped in the insulation during critical coil winding, molding, and collaring operations. All hand assembly procedures involving dielectrics must be performed with rigorous attention to their impact on insulation integrity. Individual dipole magnets must have a sufficiently low probability of electrical insulation failure under all normal and fault-mode voltage conditions such that the series of magnets in the SSC rings has an acceptable Mean Time Between Failures (MTBF) with respect to dielectric failure events. Statistical models appropriate for large electrical system breakdown failure analysis are applied to the SSC magnet rings. The MTBF of the SSC system is related to the failure database for individual dipole magnet samples.
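
    The system-level MTBF argument can be illustrated with the simplest series model: with independent, exponentially distributed dielectric failures, the ring failure rate scales with the number of magnets. The magnet count and per-magnet rate below are assumptions, not the paper's figures.

```python
# Series-system sketch: a ring of n magnets fails when any magnet's
# insulation fails, so (with independence) the rates add.
n_magnets = 8_000        # order-of-magnitude magnet count (assumed)
lam_single = 1.0e-5      # dielectric failures per magnet-year (hypothetical)

lam_system = n_magnets * lam_single
print(f"single-magnet MTBF : {1 / lam_single:,.0f} years")
print(f"ring MTBF          : {1 / lam_system:,.1f} years")
```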

  16. Statistical approach to partial equilibrium analysis

    Science.gov (United States)

    Wang, Yougui; Stanley, H. E.

    2009-04-01

    A statistical approach to market equilibrium and efficiency analysis is proposed in this paper. One factor that governs the exchange decisions of traders in a market, named willingness price, is highlighted and constitutes the whole theory. The supply and demand functions are formulated as the distributions of corresponding willing exchange over the willingness price. The laws of supply and demand can be derived directly from these distributions. The characteristics of excess demand function are analyzed and the necessary conditions for the existence and uniqueness of equilibrium point of the market are specified. The rationing rates of buyers and sellers are introduced to describe the ratio of realized exchange to willing exchange, and their dependence on the market price is studied in the cases of shortage and surplus. The realized market surplus, which is the criterion of market efficiency, can be written as a function of the distributions of willing exchange and the rationing rates. With this approach we can strictly prove that a market is efficient in the state of equilibrium.

  17. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  18. Fuzzy logic prioritization of failures in a system failure mode, effects and criticality analysis

    International Nuclear Information System (INIS)

    Bowles, John B.; Pelaez, C.E.

    1995-01-01

    This paper describes a new technique, based on fuzzy logic, for prioritizing failures for corrective actions in a Failure Mode, Effects and Criticality Analysis (FMECA). As in a traditional criticality analysis, the assessment is based on the severity, frequency of occurrence, and detectability of an item failure. However, these parameters are here represented as members of a fuzzy set, combined by matching them against rules in a rule base, evaluated with min-max inferencing, and then defuzzified to assess the riskiness of the failure. This approach resolves some of the problems in traditional methods of evaluation and it has several advantages compared to strictly numerical methods: 1) it allows the analyst to evaluate the risk associated with item failure modes directly using the linguistic terms that are employed in making the criticality assessment; 2) ambiguous, qualitative, or imprecise information, as well as quantitative data, can be used in the assessment and they are handled in a consistent manner; and 3) it gives a more flexible structure for combining the severity, occurrence, and detectability parameters. Two fuzzy logic based approaches for assessing criticality are presented. The first is based on the numerical rankings used in a conventional Risk Priority Number (RPN) calculation and uses crisp inputs gathered from the user or extracted from a reliability analysis. The second, which can be used early in the design process when less detailed information is available, allows fuzzy inputs and also illustrates the direct use of the linguistic rankings defined for the RPN calculations
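
    A minimal sketch of the second, fuzzy-input style of assessment is shown below: triangular membership functions on a 0-10 universe, a tiny rule base, Mamdani min-max inference, and centroid defuzzification. The membership shapes and rules are illustrative, not those of the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

x = np.linspace(0.0, 10.0, 201)        # common universe for all variables
low  = lambda v: tri(v, -4.0, 1.0, 5.0)
med  = lambda v: tri(v,  2.0, 5.0, 8.0)
high = lambda v: tri(v,  5.0, 9.0, 14.0)

def fuzzy_risk(severity, occurrence, detectability):
    # Rule 1: IF severity high AND occurrence high THEN risk high.
    w1 = min(high(severity), high(occurrence))
    # Rule 2: IF detectability low (failure easily detected) THEN risk low.
    w2 = low(detectability)
    # Rule 3: IF severity medium THEN risk medium.
    w3 = med(severity)
    # Mamdani min-max inference: clip each output set, take the union.
    aggregated = np.maximum.reduce([np.minimum(w1, high(x)),
                                    np.minimum(w2, low(x)),
                                    np.minimum(w3, med(x))])
    # Centroid defuzzification (guarded against an empty output set).
    return np.sum(x * aggregated) / max(np.sum(aggregated), 1e-12)

print(f"risk ranking ~ {fuzzy_risk(9, 8, 6):.2f} on a 0-10 scale")
```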

  19. Robustness Analysis of Real Network Topologies Under Multiple Failure Scenarios

    DEFF Research Database (Denmark)

    Manzano, M.; Marzo, J. L.; Calle, E.

    2012-01-01

    Nowadays the ubiquity of telecommunication networks, which underpin and fulfill key aspects of modern day living, is taken for granted. Significant large-scale failures have occurred in recent years affecting telecommunication networks. Traditionally, network robustness analysis has been focused on topological characteristics. Recent approaches also consider the services supported by such networks. In this paper we carry out a robustness analysis of five real backbone telecommunication networks under defined multiple failure scenarios, taking into account the consequences of the loss of established connections. Results show which networks are more robust in response to a specific type of failure.
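
    A toy version of such a robustness study can be run with networkx: remove a growing fraction of nodes, randomly or highest-degree first, and track the surviving giant-component fraction. The generated graph below merely stands in for the real backbone topologies, and established connections are not modeled.

```python
import random
import networkx as nx

def attack(G, fraction, targeted=False, seed=0):
    """Remove a fraction of nodes and return the giant-component share
    of the original node count."""
    H = G.copy()
    n0 = H.number_of_nodes()
    k = int(fraction * n0)
    if targeted:
        # Highest-degree nodes first (a crude targeted-failure scenario).
        victims = [n for n, _ in sorted(H.degree, key=lambda nd: nd[1],
                                        reverse=True)[:k]]
    else:
        victims = random.Random(seed).sample(list(H.nodes), k)
    H.remove_nodes_from(victims)
    giant = max(len(c) for c in nx.connected_components(H))
    return giant / n0

G = nx.barabasi_albert_graph(200, 2, seed=1)   # stand-in backbone network
for f in (0.05, 0.10, 0.20):
    print(f"fail {f:.0%}: random -> {attack(G, f):.2f}, "
          f"targeted -> {attack(G, f, targeted=True):.2f}")
```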

  20. Failure Modes and Effects Analysis (FMEA): A Bibliography

    Science.gov (United States)

    2000-01-01

    Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.

  1. Failure analysis of the boiler water-wall tube

    OpenAIRE

    S.W. Liu; W.Z. Wang; C.J. Liu

    2017-01-01

    Failure analysis of the boiler water-wall tube is presented in this work. In order to examine the causes of failure, various techniques including visual inspection, chemical analysis, optical microscopy, scanning electron microscopy and energy dispersive spectroscopy were carried out. Tube wall thickness measurements were performed on the ruptured tube. The fire-facing side of the tube was observed to have experienced significant wall thinning. The composition of the matrix material of the tu...

  2. Failure analysis of PB-1 (EBTS Be/Cu mockup)

    International Nuclear Information System (INIS)

    Odegard, B.C. Jr.; Cadden, C.H.

    1996-11-01

    Failure analysis was done on PB-1 (a series of Be tiles joined to a Cu alloy) following a tile failure during a high-heat-flux experiment in the EBTS (electron beam test system). This heat flux load simulated ambient conditions inside ITER; the Be tiles were bonded to the Cu alloy using low-temperature diffusion bonding, which is being considered for fabricating plasma-facing components in ITER. Results showed differences between the EBTS failure and a failure during a room-temperature tensile test. The latter occurred at the Cu-Be interface in an intermetallic phase formed by reaction of the two metals at the bonding temperature. Fracture strengths measured by these tests were over 300 MPa. The high-heat-flux specimens failed at the Cu-Cu diffusion bond. Fracture morphology in both cases was a mixed mode of dimple rupture and transgranular cleavage. Several explanations for this difference in failure mechanism are suggested.

  3. Statistical analysis of angular correlation measurements

    International Nuclear Information System (INIS)

    Oliveira, R.A.A.M. de.

    1986-01-01

    Obtaining the multipole mixing ratio, δ, of γ transitions in angular correlation measurements is a statistical problem characterized by the small number of angles at which the observation is made and by the limited counting statistics. The nonexistence of a sufficient statistic for the estimator of δ is shown. Three different estimators for δ were constructed and their properties of consistency, bias and efficiency were tested. Tests were also performed on experimental results obtained in γ-γ directional correlation measurements. (Author) [pt]

  4. Surface Properties of TNOs: Preliminary Statistical Analysis

    Science.gov (United States)

    Antonieta Barucci, Maria; Fornasier, S.; Alvarez-Cantal, A.; de Bergh, C.; Merlin, F.; DeMeo, F.; Dumas, C.

    2009-09-01

    An overview of the surface properties of trans-Neptunian objects, based on the latest results obtained during the Large Program performed at ESO-VLT (2007-2008), is presented. Simultaneous high-quality visible and near-infrared spectroscopy and photometry were carried out on 40 objects with various dynamical properties, using FORS1 (V), ISAAC (J) and SINFONI (H+K bands) mounted respectively on the UT2, UT1 and UT4 VLT-ESO telescopes (Cerro Paranal, Chile). For spectroscopy we computed the spectral slope for each object and searched for possible rotational inhomogeneities. A few objects show features in their visible spectra, such as Eris, whose spectral bands are displaced with respect to pure methane ice. We identify new faint absorption features on 10199 Chariklo and 42355 Typhon, possibly due to the presence of aqueously altered materials. The H+K band spectroscopy was performed with the new instrument SINFONI, a 3D integral field spectrometer. While some objects show no diagnostic spectral bands, others reveal surface deposits of ices of H2O, CH3OH, CH4, and N2. To investigate the surface properties of these bodies, a radiative transfer model has been applied to interpret the entire 0.4-2.4 micron spectral region. The diversity of the spectra suggests that these objects represent a substantial range of bulk compositions. These different surface compositions can be diagnostic of original compositional diversity, interior sources and/or different evolution with different physical processes affecting the surfaces. A statistical analysis is in progress to investigate the correlation of the TNOs' surface properties with size and dynamical properties.

  5. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann-Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
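
    A sketch of the running-statistic idea is shown below: compare the window before each time point with the window after it, and convert U to an approximate standard normal Z using the no-ties moments of U. The article's exact windowing and normalization may differ.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(series, window=30):
    """Running Mann-Whitney Z: at each index, compare the `window` points
    before it with the `window` points after it."""
    x = np.asarray(series, dtype=float)
    n = window
    mu_u = n * n / 2.0                                 # E[U] without ties
    sigma_u = np.sqrt(n * n * (2 * n + 1) / 12.0)      # SD[U] without ties
    z = np.full(x.size, np.nan)
    for i in range(n, x.size - n):
        u, _ = mannwhitneyu(x[i - n:i], x[i:i + n], alternative="two-sided")
        z[i] = (u - mu_u) / sigma_u
    return z

# Toy series with a step change at index 150:
t = np.arange(300)
y = np.where(t < 150, 0.0, 1.0) + np.random.default_rng(2).normal(0, 1, 300)
z = running_mw_z(y)
print(f"max |Z| = {np.nanmax(np.abs(z)):.2f} "
      f"near index {np.nanargmax(np.abs(z))}")
```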

  6. FRAC (failure rate analysis code): a computer program for analysis of variance of failure rates. An application user's guide

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.; McInteer, C.R.

    1982-03-01

    Probabilistic risk assessments (PRAs) require estimates of the failure rates of various components whose failure modes appear in the event and fault trees used to quantify accident sequences. Several reliability data bases have been designed to provide the reliability data needed to construct these estimates. In the nuclear industry, the Nuclear Plant Reliability Data System (NPRDS) and the In-Plant Reliability Data System (IRPDS), among others, were designed for this purpose. An important characteristic of such data bases is the selection and identification of numerous factors used to classify each component that is reported and the subsequent failures of each component. However, the presence of such factors often complicates the analysis of reliability data, in the sense that it is inappropriate to group (that is, pool) data for those combinations of factors that yield significantly different failure rate values. These types of data can be analyzed by analysis of variance. FRAC (Failure Rate Analysis Code) is a computer code that performs an analysis of variance of failure rates. In addition, FRAC provides failure rate estimates.

  7. COMPARATIVE STATISTICAL ANALYSIS OF GENOTYPES’ COMBINING

    Directory of Open Access Journals (Sweden)

    V. Z. Stetsyuk

    2015-05-01

    The paper describes the creation of a desktop software suite for statistical calculations on a physician's personal computer. Modern methods and tools for the development of information systems used to create the program are described.

  8. Root cause of failure analysis and the system engineer

    International Nuclear Information System (INIS)

    Coppock, M.S.; Hartwig, A.W.

    1990-01-01

    In an industry where ever-increasing emphasis is placed on root-cause-of-failure determination, it is imperative that a successful nuclear utility have an effective means of identifying failures and performing the necessary analyses. The current Institute of Nuclear Power Operations (INPO) good practice, OE-907, root-cause analysis, references methodology that helps determine breakdowns in procedures, programs, or design, but gives very little guidance on how or when to perform component root-cause-of-failure analyses. The system engineers of nuclear utilities are considered the focal point for their respective systems and are required by most programs to investigate component failures. The problem the system engineer faces in determining a component root cause of failure lies in acquiring the data necessary to identify the need for the analysis and in having the techniques and equipment available to perform it. The system engineers at the Palo Verde nuclear generating station routinely perform detailed component root-cause-of-failure analyses. The Palo Verde program provides the system engineers with the information necessary to identify when a component root-cause-of-failure analysis is required. Palo Verde also has the necessary equipment on-site to perform the analyses.

  9. Failure analysis of a helicopter's main rotor bearing

    International Nuclear Information System (INIS)

    Shahzad, M.; Qureshi, A.H.; Waqas, H.; Hussain, N.; Ali, N.

    2011-01-01

    The results presented here report some of the findings of a detailed failure analysis carried out on a main rotor hub assembly which showed symptoms of burning and mechanical damage. The analysis suggests environmental degradation of the grease, which caused pitting on the bearing balls. The consequent inefficient lubrication raised the temperature, which led to smearing of the cage material (brass) on the bearing balls and ultimately caused the failure. The analysis is supported by microstructural studies, thermal analysis and micro-hardness testing performed on the affected main rotor bearing parts. (author)

  10. Failure mode analysis of a PCRV. Influence of some hypothesis

    International Nuclear Information System (INIS)

    Zimmermann, T.; Saugy, B.; Rebora, B.

    1975-01-01

    This paper is concerned with the most recent developments and results obtained using a mathematical model for the non-linear analysis of massive reinforced and prestressed concrete structures developed by the IPEN at the Swiss Federal Institute of Technology in Lausanne. The method is based on three-dimensional isoparametric finite elements. A linear solution is adapted step by step to the idealized behavior laws of the materials up to the failure of the structure. The laws proposed here for the non-linear behavior of concrete and steel have been described elsewhere, but a simple extension to time-dependent behavior is presented. A numerical algorithm for the superposition of creep deformations is also proposed, the basic creep law being assumed to satisfy a power expression. Time-dependent failure is discussed. The analysis of a PCRV of a helium-cooled fast reactor is then performed and the influence of the liner on the failure mode is analyzed. The failure analysis under increasing internal pressure is being run at present, and the influence of an eventual pressure in the cracks is being investigated. The paper aims mainly to demonstrate the accuracy of a failure analysis by three-dimensional finite elements and to compare it with a model test, in particular when complete deformation and failure tests of the materials are available. The proposed model has already been extensively tested on simple structures and has proved to be useful for the analysis of different simplifying hypotheses.

  11. Debugging Nondeterministic Failures in Linux Programs through Replay Analysis

    Directory of Open Access Journals (Sweden)

    Shakaiba Majeed

    2018-01-01

    Full Text Available Reproducing a failure is the first and most important step in debugging because it enables us to understand the failure and track down its source. However, many programs are susceptible to nondeterministic failures that are hard to reproduce, which makes debugging extremely difficult. We first address the reproducibility problem by proposing an OS-level replay system for a uniprocessor environment that can capture and replay the nondeterministic events needed to reproduce a failure in Linux interactive and event-based programs. We then present an analysis method, called replay analysis, based on the proposed record and replay system to diagnose concurrency bugs in such programs. The replay analysis method uses a combination of static analysis, dynamic tracing during replay, and delta debugging to identify failure-inducing memory access patterns that lead to concurrency failure. The experimental results show that the presented record and replay system has low recording overhead and hence can be safely used in production systems to catch rarely occurring bugs. We also present a few concurrency bug case studies from real-world applications to prove the effectiveness of the proposed bug diagnosis framework.

  12. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa; Malin, Jane T.

    2013-01-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library of common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  13. Failure Modes and Effects Analysis (FMEA) Assistant Tool Feasibility Study

    Science.gov (United States)

    Flores, Melissa D.; Malin, Jane T.; Fleming, Land D.

    2013-09-01

    An effort to determine the feasibility of a software tool to assist in Failure Modes and Effects Analysis (FMEA) has been completed. This new and unique approach to FMEA uses model based systems engineering concepts to recommend failure modes, causes, and effects to the user after they have made several selections from pick lists about a component's functions and inputs/outputs. Recommendations are made based on a library using common failure modes identified over the course of several major human spaceflight programs. However, the tool could be adapted for use in a wide range of applications from NASA to the energy industry.

  14. Statistical Analysis of Research Data | Center for Cancer Research

    Science.gov (United States)

    Recent advances in cancer biology have resulted in the need for increased statistical analysis of research data. The Statistical Analysis of Research Data (SARD) course will be held on April 5-6, 2018 from 9 a.m.-5 p.m. at the National Institutes of Health's Natcher Conference Center, Balcony C on the Bethesda Campus. SARD is designed to provide an overview on the general principles of statistical analysis of research data.  The first day will feature univariate data analysis, including descriptive statistics, probability distributions, one- and two-sample inferential statistics.

  15. Creep failure analysis of butt welded tubes

    International Nuclear Information System (INIS)

    Browne, R.J.; Parker, J.D.; Walters, D.J.

    1981-01-01

    As part of a major research programme to investigate the influence of butt welds on the life expectancy of tubular components, a series of internal-pressure stress-rupture tests has been carried out. Thick-walled 1/2Cr 1/2Mo 1/4V tube specimens were welded with mild steel, 1Cr 1/2Mo steel, 2 1/4Cr 1Mo steel or nominally matching 1/2Cr 1/2Mo 1/4V steel to give a wide range of weld metal creep strengths relative to the parent tube. The weldments were tested at 565°C at two values of internal pressure, and gave failure lives of up to 44,000 hours. Finite element techniques were used to determine the stationary-state stress distribution in the weldment, which was represented by a three-material model. Significant stress redistribution was indicated, and these results enabled the position and orientation of cracking and the rupture life to be predicted. The theoretical and experimental results have been used to highlight the limitations of current design methods, which are based on the application of the mean diameter hoop stress to the parent material stress rupture data. (author)

  16. Advanced approaches to failure mode and effect analysis (FMEA applications

    Directory of Open Access Journals (Sweden)

    D. Vykydal

    2015-10-01

    Full Text Available The present paper explores advanced approaches to the FMEA method (Failure Mode and Effect Analysis) which take into account the costs associated with the occurrence of failures during the manufacture of a product. Different approaches are demonstrated using an example FMEA application to the production of drawn wire. Their purpose is to determine risk levels while taking account of the above-mentioned costs. Finally, the resulting priority levels are compared for developing actions mitigating the risks.

  17. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach...

  18. Failure analysis for ultrasound machines in a radiology department after implementation of predictive maintenance method

    Directory of Open Access Journals (Sweden)

    Greg Chu

    2018-01-01

    Full Text Available Objective: The objective of the study was to perform quantitative failure and fault analysis of the diagnostic ultrasound (US) scanners in a radiology department after the implementation of the predictive maintenance (PdM) method; to study the reduction trend of machine failure; to understand machine operating parameters affecting the failure; and to further optimize the method to maximize the machines' clinical service time. Materials and Methods: The PdM method has been implemented on the 5 US machines since 2013. Log books were used to record machine failures and their root causes together with the time spent on repair, all of which were retrieved, categorized, and analyzed for the period between 2013 and 2016. Results: There were a total of 108 cases of failure in these 5 US machines during the 4-year study period. The average number of failures per month for all these machines was 2.4. Failure analysis showed that there were 33 cases (30.5%) due to software, 44 cases (40.7%) due to hardware, and 31 cases (28.7%) due to the US probe. There was a statistically significant negative correlation between the time spent on regular quality assurance (QA) by hospital physicists and the time spent on faulty parts replacement over the study period (P = 0.007). However, there was no statistically significant correlation between regular QA time and total yearly breakdown cases (P = 0.12), although a decreasing trend has been observed in the yearly total breakdowns. Conclusion: There has been a significant improvement in the failure of US machines attributed to the concerted effort of sonographers and physicists in our department to practice the PdM method, in that system component repair time has been reduced and a decreasing trend in the number of system breakdowns has been observed.

  19. Failure analysis of vise jaw holders for hacksaw machine

    Directory of Open Access Journals (Sweden)

    Essam Ali Al-Bahkali

    2018-01-01

    Full Text Available Failure analysis of mechanical components has been investigated in many studies in the last few years. Failure analysis and prevention are important functions in all engineering disciplines. Materials engineers often take the lead role in the analysis of failures, where a component or product fails in service or where failure occurs during manufacturing or production processing. In any case, one must determine the cause of the failure to prevent future occurrences and/or to improve the performance of the device, component or structure. For example, the vise jaw holders of hacksaws can break due to accidental heavy loads or machine misuse. The parts that break are the stationary and movable vise jaw holders and the connecting power screw between the holders. To investigate the failure of these components, a three-dimensional finite element model for stress analysis was developed. First, the analysis identified the broken components of the hacksaw machine. In addition, the materials of the broken parts were identified, a CAD model was built, and the hacksaw mechanism was analyzed to determine the applied loads on the broken parts accurately. After analyzing the model using Abaqus CAE software, the results showed that the locations of high stress coincided with the failure locations in the original broken parts. Furthermore, the power screw was subjected to a high load, which deformed it. Also, the stationary vise jaw holder was broken by impact because it was not touched by the power screw until the movable vise jaw holder broke. A conclusion is drawn from the failure analysis and a way to improve the design of the broken parts is suggested.

  20. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    NARCIS (Netherlands)

    Sözer, Hasan; Tekinerdogan, B.; Aksit, Mehmet; de Lemos, Rogerio; Gacek, Cristina

    2007-01-01

    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.

  1. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  2. Statistical analysis of hydrodynamic cavitation events

    Science.gov (United States)

    Gimenez, G.; Sommer, R.

    1980-10-01

    The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency is distributed according to a normal law whose parameters do not evolve in time.

  3. Statistical analysis of lineaments of Goa, India

    Digital Repository Service at National Institute of Oceanography (India)

    Iyer, S.D.; Banerjee, G.; Wagle, B.G.

    statistically to obtain the nonlinear pattern in the form of a cosine wave. Three distinct peaks were found at azimuths of 40-45 degrees, 90-95 degrees and 140-145 degrees, which have peak values of 5.85, 6.80 respectively. These three peaks are correlated...

  4. On statistical analysis of compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2006-01-01

    Roč. 35, 2-3 (2006), s. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords : counting process * compound process * hazard function * Cox -model Subject RIV: BB - Applied Statistics, Operational Research

  5. Statistical Analysis Of Reconnaissance Geochemical Data From ...

    African Journals Online (AJOL)

    , Co, Mo, Hg, Sb, Tl, Sc, Cr, Ni, La, W, V, U, Th, Bi, Sr and Ga in 56 stream sediment samples collected from Orle drainage system were subjected to univariate and multivariate statistical analyses. The univariate methods used include ...

  6. A STATISTICAL ANALYSIS OF LARYNGEAL MALIGNANCIES AT OUR INSTITUTION

    Directory of Open Access Journals (Sweden)

    Bharathi Mohan Mathan

    2017-03-01

    Full Text Available BACKGROUND Malignancies of the larynx are an increasing global burden, accounting for approximately 2-5% of all malignancies, with an incidence of 3.6/100,000 for men and 1.3/100,000 for women and a male-to-female ratio of 4:1. Smoking and alcohol are major established risk factors. More than 90-95% of all laryngeal malignancies are of squamous cell type. The three main subsites of laryngeal malignancies are the glottis, supraglottis and subglottis. Improved surgical techniques and advanced chemoradiotherapy have increased the overall 5-year survival rate. The present study is a statistical analysis of laryngeal malignancies at our institution over a period of one year, analysing the pattern of distribution, aetiology, sites and subsites, and causes of recurrence. MATERIALS AND METHODS Based on the statistical data available in the institution for the one-year period from January 2016 to December 2016, all laryngeal malignancies were analysed with respect to demographic pattern, age, gender, site, subsite, aetiology, staging, treatment received and probable cause of treatment failure. Patients were followed up for a 12-month period during the study. RESULTS The total number of cases studied was 27 (twenty-seven): 23 male and 4 female, a male-to-female ratio of 5.7:1. The most common age group was above 60 years, the most common site was the supraglottis, the most common type was moderately-differentiated squamous cell carcinoma, and the most common causes of relapse or recurrence were advanced stage of disease and poor differentiation. CONCLUSION The commonest age of occurrence at the end of the study was above 60 years and the male-to-female ratio was 5.7:1, which is slightly above the international standards. The most common site was the supraglottis, not the glottis. The relapse and recurrence rates are higher compared with international standards.

  7. Statistical analysis of medical data using SAS

    CERN Document Server

    Der, Geoff

    2005-01-01

    An Introduction to SAS; Describing and Summarizing Data; Basic Inference; Scatterplots, Correlation: Simple Regression and Smoothing; Analysis of Variance and Covariance; Multiple Regression; Logistic Regression; The Generalized Linear Model; Generalized Additive Models; Nonlinear Regression Models; The Analysis of Longitudinal Data I; The Analysis of Longitudinal Data II: Models for Normal Response Variables; The Analysis of Longitudinal Data III: Non-Normal Response; Survival Analysis; Analysis of Multivariate Data: Principal Components and Cluster Analysis; References

  8. Failure and Redemption of Statistical and Nonstatistical Rate Theories in the Hydroboration of Alkenes.

    Science.gov (United States)

    Bailey, Johnathan O; Singleton, Daniel A

    2017-11-08

    Our previous work found that canonical forms of transition state theory incorrectly predict the regioselectivity of the hydroboration of propene with BH3 in solution. In response, it has been suggested that alternative statistical and nonstatistical rate theories can adequately account for the selectivity. This paper uses a combination of experimental and theoretical studies to critically evaluate the ability of these rate theories, as well as dynamic trajectories and newly developed localized statistical models, to predict quantitative selectivities and qualitative trends in hydroborations on a broader scale. The hydroboration of a series of terminally substituted alkenes with BH3 was examined experimentally, and a classically unexpected trend is that the selectivity increases as the alkyl chain is lengthened far from the reactive centers. Conventional and variational transition state theories can predict neither the selectivities nor the trends. The canonical competitive nonstatistical model makes somewhat better predictions for some alkenes but fails to predict trends, and it performs poorly with an alkene chosen to test a specific prediction of the model. Added nonstatistical corrections to this model make the predictions worse. Parametrized Rice-Ramsperger-Kassel-Marcus (RRKM)-master equation calculations correctly predict the direction of the trend in selectivity versus alkene size but overpredict its magnitude, and the selectivity with large alkenes remains unpredictable with any parametrization. Trajectory studies in explicit solvent can predict selectivities without parametrization but are impractical for predicting small changes in selectivity. From a lifetime and energy analysis of the trajectories, "localized RRKM-ME" and "competitive localized noncanonical" rate models are suggested as steps toward a general model. These provide the best predictions of the experimental observations and insight into the selectivities.

  9. Analysis of risk factors for cluster behavior of dental implant failures.

    Science.gov (United States)

    Chrcanovic, Bruno Ramos; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann

    2017-08-01

    Some studies have indicated that implant failures are commonly concentrated in a few patients. The aim was to identify and analyze cluster behavior of dental implant failures among subjects of a retrospective study. This retrospective study included only patients receiving at least three implants. Patients presenting at least three implant failures were classified as presenting cluster behavior. Univariate and multivariate logistic regression models and generalized estimating equations analysis evaluated the effect of explanatory variables on the cluster behavior. There were 1406 patients with three or more implants (8337 implants, 592 failures). Sixty-seven (4.77%) patients presented cluster behavior, accounting for 56.8% of all implant failures. The intake of antidepressants and bruxism were identified as potential negative factors exerting a statistically significant influence on cluster behavior at the patient level. The negative factors at the implant level were turned implants, short implants, poor bone quality, age of the patient, the intake of medicaments to reduce gastric acid production, smoking, and bruxism. A cluster pattern among patients with implant failure is highly probable. Factors of interest as predictors for implant failures could be a number of systemic and local factors, although a direct causal relationship cannot be ascertained. © 2017 Wiley Periodicals, Inc.
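
    A generalized estimating equations analysis of this kind can be sketched with statsmodels, treating patients as clusters of correlated implant outcomes. The simulated data frame and covariate names (bruxism, short_implant) are placeholders, not the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulate patient-clustered implant data: each patient contributes
# several implants, and some covariates vary at the patient level.
rng = np.random.default_rng(3)
n_patients, per_patient = 300, 4
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_patients), per_patient),
    "bruxism": np.repeat(rng.integers(0, 2, n_patients), per_patient),
    "short_implant": rng.integers(0, 2, n_patients * per_patient),
})
logit = -2.5 + 1.0 * df["bruxism"] + 0.6 * df["short_implant"]
df["failed"] = (rng.random(len(df)) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# GEE logistic model with an exchangeable within-patient correlation.
model = smf.gee("failed ~ bruxism + short_implant", groups="patient",
                data=df, family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```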

  10. A Big Data Analysis Approach for Rail Failure Risk Assessment.

    Science.gov (United States)

    Jamshidi, Ali; Faghih-Roohi, Shahrzad; Hajizadeh, Siamak; Núñez, Alfredo; Babuska, Robert; Dollevoet, Rolf; Li, Zili; De Schutter, Bart

    2017-08-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could have a considerable impact not only on train delays and maintenance costs but also on the safety of passengers. In this article, the aim is to assess the risk of a rail failure by analyzing a type of rail surface defect called squats that are detected automatically among the huge number of records from video cameras. We propose an image processing approach for automatic detection of squats, especially severe types that are prone to rail breaks. We measure the visual length of the squats and use these measurements to model the failure risk. For the assessment of the rail failure risk, we estimate the probability of rail failure based on the growth of squats. Moreover, we perform severity and crack growth analyses to consider the impact of rail traffic loads on defects in three different growth scenarios. The failure risk estimations are provided for several samples of squats with different crack growth lengths on a busy rail track of the Dutch railway network. The results illustrate the practicality and efficiency of the proposed approach. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  11. Process Equipment Failure Mode Analysis in a Chemical Industry

    Directory of Open Access Journals (Sweden)

    J. Nasl Seraji

    2008-04-01

    Background and aims: Prevention of potential accidents and safety promotion in chemical processes requires systematic safety management. The main objective of this study was the analysis of the failure modes and effects of important process equipment components in the process of separating H2S and CO2 from extracted natural gas. Methods: This study was done in the sweetening unit of an Iranian gas refinery. Failure Mode and Effect Analysis (FMEA) was used to identify process equipment failures. Results: In total, 30 failures were identified and evaluated using FMEA. Breaking of the P-1 blower blades and tight movement of the sour-gas pressure control valve bearing had the maximum Risk Priority Numbers (RPN); P-1 body corrosion and an increasing lower-side plug angle of the rich DEA level control valve in tower 1 had the minimum calculated RPNs. Conclusion: A reliable documentation system for recording equipment failures and incidents would preserve the basic information needed for later safety assessments. Also, the probability of failures and their effects could be minimized by conducting preventive maintenance.

  12. Launch Vehicle Failure Dynamics and Abort Triggering Analysis

    Science.gov (United States)

    Hanson, John M.; Hill, Ashely D.; Beard, Bernard B.

    2011-01-01

    Launch vehicle ascent is a time of high risk for an on-board crew. There are many types of failures that can kill the crew if the crew is still on-board when the failure becomes catastrophic. For some failure scenarios, there is plenty of time for the crew to be warned and to depart, whereas in others there is insufficient time for the crew to escape. There is a large fraction of possible failures for which time is of the essence and a successful abort is possible if detection and action happen quickly enough. This paper focuses on abort determination based primarily on data already available from the GN&C system. This work is the result of failure analysis efforts performed during the Ares I launch vehicle development program. Derivation of attitude and attitude rate abort triggers to ensure that abort occurs as quickly as possible when needed, but that false positives are avoided, forms a major portion of the paper. Some of the potential failure modes requiring use of these triggers are described, along with analysis used to determine the success rate of getting the crew off prior to vehicle demise.

  13. Fundamentals of statistical experimental design and analysis

    CERN Document Server

    Easterling, Robert G

    2015-01-01

    Professionals in all areas - business; government; the physical, life, and social sciences; engineering; medicine, etc. - benefit from using statistical experimental design to better understand their worlds and then use that understanding to improve the products, processes, and programs they are responsible for. This book aims to provide the practitioners of tomorrow with a memorable, easy to read, engaging guide to statistics and experimental design. This book uses examples, drawn from a variety of established texts, and embeds them in a business or scientific context, seasoned with a dash of humor, to emphasize the issues and ideas that led to the experiment and the what-do-we-do-next? steps after the experiment. Graphical data displays are emphasized as means of discovery and communication and formulas are minimized, with a focus on interpreting the results that software produce. The role of subject-matter knowledge, and passion, is also illustrated. The examples do not require specialized knowledge, and t...

  14. Common misconceptions about data analysis and statistics.

    Science.gov (United States)

    Motulsky, Harvey J

    2014-11-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: 1. P-Hacking. This is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want. 2. Overemphasis on P values rather than on the actual size of the observed effect. 3. Overuse of statistical hypothesis testing, and being seduced by the word "significant". 4. Overreliance on standard errors, which are often misunderstood.
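    A minimal simulation makes the first mistake concrete. The Python sketch below (not from the paper; both groups are drawn from the same null distribution, and the stopping rule is hypothetical) shows how "reanalyzing with additional replicates until significant" inflates the nominal 5% false-positive rate:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def p_hack_once(start_n=10, max_n=100, alpha=0.05):
    """Add 5 observations per group at a time and stop as soon as p < alpha
    (the 'reanalyze with additional replicates' mistake described above).
    Both groups come from the SAME distribution, so any 'effect' is false."""
    a = list(rng.normal(size=start_n))
    b = list(rng.normal(size=start_n))
    while len(a) <= max_n:
        if stats.ttest_ind(a, b).pvalue < alpha:
            return True                       # declared 'significant'
        a.extend(rng.normal(size=5))
        b.extend(rng.normal(size=5))
    return False

trials = 2000
fp = sum(p_hack_once() for _ in range(trials)) / trials
print(f"False-positive rate with optional stopping: {fp:.2%} (nominal 5%)")
```

    Running this typically yields a false-positive rate several times the nominal 5%, which is the point the authors make in prose.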

  15. Common misconceptions about data analysis and statistics.

    Science.gov (United States)

    Motulsky, Harvey J

    2015-02-01

    Ideally, any experienced investigator with the right tools should be able to reproduce a finding published in a peer-reviewed biomedical science journal. In fact, the reproducibility of a large percentage of published findings has been questioned. Undoubtedly, there are many reasons for this, but one reason may be that investigators fool themselves due to a poor understanding of statistical concepts. In particular, investigators often make these mistakes: (1) P-Hacking. This is when you reanalyze a data set in many different ways, or perhaps reanalyze with additional replicates, until you get the result you want. (2) Overemphasis on P values rather than on the actual size of the observed effect. (3) Overuse of statistical hypothesis testing, and being seduced by the word "significant". (4) Overreliance on standard errors, which are often misunderstood.

  16. Statistical analysis of radioactivity in the environment

    International Nuclear Information System (INIS)

    Barnes, M.G.; Giacomini, J.J.

    1980-05-01

    The pattern of radioactivity in surface soils of Area 5 of the Nevada Test Site is analyzed statistically by means of kriging. The 1962 event code-named Smallboy affected the greatest proportion of the area sampled, but some of the area was also affected by a number of other events. The data for this study were collected on a regular grid to take advantage of the efficiency of grid sampling.

  17. Critical analysis of adsorption data statistically

    Science.gov (United States)

    Kaushal, Achla; Singh, S. K.

    2017-10-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, paired t test and chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of calculated and tabulated values of t and χ² showed the results to be in favour of the data collected from the experiment, and this has been shown on probability charts. The K value obtained for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725, both for mango leaf powder.

  18. Analysis of reactor trips involving balance-of-plant failures

    International Nuclear Information System (INIS)

    Seth, S.; Skinner, L.; Ettlinger, L.; Lay, R.

    1986-01-01

    The relatively high frequency of plant transients leading to reactor trips at nuclear power plants in the US is of economic and safety concern to the industry. A majority of such transients is due to failures in the balance-of-plant (BOP) systems. As a part of a study conducted for the US Nuclear Regulatory Commission, Mitre has carried out a further analysis of the BOP failures associated with reactor trips. The major objectives of the analysis were to examine plant-to-plant variations in BOP-related trips, to understand the causes of failures, and to determine the extent of any associated safety system challenges. The analysis was based on the Licensee Event Reports submitted on all commercial light water reactors during the 2-yr period, 1984-1985

  19. Common pitfalls in statistical analysis: "P" values, statistical significance and confidence intervals

    Directory of Open Access Journals (Sweden)

    Priya Ranganathan

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the 'P' value, explain the importance of 'confidence intervals' and clarify the importance of including both values in a paper.

  20. Common pitfalls in statistical analysis: “P” values, statistical significance and confidence intervals

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2015-01-01

    In the second part of a series on pitfalls in statistical analysis, we look at various ways in which a statistically significant study result can be expressed. We debunk some of the myths regarding the ‘P’ value, explain the importance of ‘confidence intervals’ and clarify the importance of including both values in a paper PMID:25878958

  1. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.

  2. The analysis of failure data in the presence of critical and degraded failures

    International Nuclear Information System (INIS)

    Haugen, Knut; Hokstad, Per; Sandtorv, Helge

    1997-01-01

    Reported failures are often classified into severity classes, e.g., as critical or degraded. The critical failures correspond to loss of function(s) and are those of main concern. The rate of critical failures is usually estimated by the number of observed critical failures divided by the exposure time, thus ignoring the observed degraded failures. In the present paper failure data are analyzed applying an alternative estimate for the critical failure rate, which also takes the number of observed degraded failures into account. The model includes two alternative failure mechanisms, one being of the shock type, immediately leading to a critical failure, the other resulting in a gradual deterioration, leading to a degraded failure before the critical failure occurs. Failure data on safety valves from the OREDA (Offshore REliability DAta) database are analyzed using this model. The estimate for the critical failure rate is obtained and compared with the standard estimate.

  3. Practical guidance for statistical analysis of operational event data

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies

  4. Statistical learning methods in high-energy and astrophysics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, J. [Forschungszentrum Juelich GmbH, Zentrallabor fuer Elektronik, 52425 Juelich (Germany) and Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)]. E-mail: zimmerm@mppmu.mpg.de; Kiesling, C. [Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Munich (Germany)

    2004-11-21

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application.

  5. Statistical learning methods in high-energy and astrophysics analysis

    International Nuclear Information System (INIS)

    Zimmermann, J.; Kiesling, C.

    2004-01-01

    We discuss several popular statistical learning methods used in high-energy- and astro-physics analysis. After a short motivation for statistical learning we present the most popular algorithms and discuss several examples from current research in particle- and astro-physics. The statistical learning methods are compared with each other and with standard methods for the respective application

  6. Statistical analysis of partial reduced width distributions

    International Nuclear Information System (INIS)

    Tran Quoc Thuong.

    1973-01-01

    The aim of this study was to develop rigorous methods for analysing experimental event distributions according to a χ² law and to check whether the number of degrees of freedom ν is compatible with the value 1 for the reduced neutron width distribution. Two statistical methods were used (the maximum-likelihood method and the method of moments); it was shown, in a few particular cases, that ν is compatible with 1. The difference between ν and 1, if it exists, should not exceed 3%. These results confirm the validity of the compound nucleus model.

  7. Statistical analysis of random duration times

    International Nuclear Information System (INIS)

    Engelhardt, M.E.

    1996-04-01

    This report presents basic statistical methods for analyzing data obtained by observing random time durations. It gives nonparametric estimates of the cumulative distribution function, the reliability function and the cumulative hazard function. These results can be applied with either complete or censored data. Several models which are commonly used with time data are discussed, along with methods for model checking and goodness-of-fit testing. Maximum likelihood estimates and confidence limits are given for the various models considered. Some results for situations with repeated durations, such as repairable systems, are also discussed.
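    As a concrete illustration of the nonparametric estimates mentioned above, the following Python sketch (invented toy data) computes the Kaplan-Meier estimate of the reliability (survival) function and the Nelson-Aalen estimate of the cumulative hazard from right-censored durations:

```python
# Durations with right-censoring: (time, event); event=1 means the duration
# ended in failure, event=0 means the observation was censored.
data = [(5, 1), (8, 0), (12, 1), (12, 1), (20, 0), (23, 1), (30, 0)]

event_times = sorted({t for t, e in data if e == 1})

S, H = 1.0, 0.0   # Kaplan-Meier survival, Nelson-Aalen cumulative hazard
for t in event_times:
    n = sum(1 for ti, _ in data if ti >= t)             # still at risk at t
    d = sum(1 for ti, e in data if ti == t and e == 1)  # failures at t
    S *= 1 - d / n
    H += d / n
    print(f"t={t:3d}  at_risk={n}  failures={d}  S(t)={S:.3f}  H(t)={H:.3f}")
```

    Censored observations contribute to the risk sets without triggering steps, which is exactly how censored data enter these estimators.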

  8. Statistical analysis of random pulse trains

    International Nuclear Information System (INIS)

    Da Costa, G.

    1977-02-01

    Some experimental and theoretical results concerning the statistical properties of optical beams formed by a finite number of independent pulses are presented. The considered waves (corresponding to each pulse) present important spatial variations of the illumination distribution in a cross-section of the beam, due to the time-varying random refractive index distribution in the active medium. Some examples of this kind of emission are: (a) Free-running ruby laser emission; (b) Mode-locked pulse trains; (c) Randomly excited nonlinear media

  9. Statistical analysis of dragline monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Mirabediny, H.; Baafi, E.Y. [University of Tehran, Tehran (Iran)

    1998-07-01

    Dragline monitoring systems are normally the best tool used to collect data on the machine performance and operational parameters of a dragline operation. This paper discusses results of a time study using data from a dragline monitoring system captured over a four month period. Statistical summaries of the time study in terms of average values, standard deviation and frequency distributions showed that the mode of operation and the geological conditions have a significant influence on the dragline performance parameters. 6 refs., 14 figs., 3 tabs.

  10. Failure Propagation Modeling and Analysis via System Interfaces

    Directory of Open Access Journals (Sweden)

    Lin Zhao

    2016-01-01

    Safety-critical systems must be shown to be acceptably safe to deploy and use in their operational environment. One of the key concerns of developing safety-critical systems is to understand how the system behaves in the presence of failures, regardless of whether that failure is triggered by the external environment or caused by internal errors. Safety assessment at the early stages of system development involves analysis of potential failures and their consequences. Increasingly, for complex systems, model-based safety assessment is becoming more widely used. In this paper we propose an approach for safety analysis based on system interface models. By extending interaction models on the system interface level with failure modes, as well as relevant portions of the physical system to be controlled, automated support could be provided for much of the failure analysis. We focus on fault modeling and on how to compute minimal cut sets. In particular, we explore a state space reconstruction strategy and a bounded searching technique to reduce the number of states that need to be analyzed, which markedly improves the efficiency of the cut-set search algorithm.
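    The paper's cut-set machinery is more elaborate, but the underlying idea is easy to show. The Python sketch below (hypothetical gate and event names; naive expansion rather than the paper's bounded search) derives the minimal cut sets of a small fault tree by recursive expansion and absorption:

```python
from itertools import product

# A basic event is a string; a gate is a tuple ("AND"|"OR", [children]).
def cut_sets(node):
    if isinstance(node, str):                 # basic event
        return [frozenset([node])]
    kind, children = node
    child_sets = [cut_sets(c) for c in children]
    if kind == "OR":                          # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    # AND gate: combine one cut set from each child (cartesian product)
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimize(sets):
    # Absorption: drop any cut set that contains another (keep minimal ones)
    minimal = []
    for s in sorted(set(sets), key=len):
        if not any(m <= s for m in minimal):
            minimal.append(s)
    return minimal

top = ("OR", ["E1", ("AND", ["E2", ("OR", ["E1", "E3"])])])
print(minimize(cut_sets(top)))   # -> [{'E1'}, {'E2', 'E3'}]
```

    This brute-force expansion grows combinatorially, which is precisely why the authors pursue state-space reduction and bounded search.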

  11. Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.

    Science.gov (United States)

    Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V

    2018-04-01

    A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.

  12. Failure modes and effects analysis of fusion magnet systems

    International Nuclear Information System (INIS)

    Zimmermann, M.; Kazimi, M.S.; Siu, N.O.; Thome, R.J.

    1988-12-01

    A failure modes and consequence analysis of fusion magnet systems is an important contributor towards enhancing the design by improving the reliability and reducing the risk associated with the operation of magnet systems. In the first part of this study, a failure mode analysis of a superconducting magnet system is performed. Building on the functional breakdown and the fault tree analysis of the Toroidal Field (TF) coils of the Next European Torus (NET), several subsystem levels are added and an overview of potential sources of failures in a magnet system is provided. The failure analysis is extended to the Poloidal Field (PF) magnet system. Furthermore, an extensive analysis of interactions within the fusion device caused by the operation of the PF magnets is presented in the form of an Interaction Matrix. A number of these interactions may have significant consequences for the TF magnet system, particularly interactions triggered by electrical failures in the PF magnet system. In the second part of this study, two basic categories of electrical failures in the PF magnet system are examined: short circuits between the terminals of external PF coils, and faults with a constant voltage applied at external PF coil terminals. An electromagnetic model of the Compact Ignition Tokamak (CIT) is used to examine the mechanical load conditions for the PF and the TF coils resulting from these fault scenarios. It is found that shorts do not pose large threats to the PF coils. Also, the type of plasma disruption has little impact on the net forces on the PF and the TF coils. 39 refs., 30 figs., 12 tabs

  13. Development of component failure data for seismic risk analysis

    International Nuclear Information System (INIS)

    Fray, R.R.; Moulia, T.A.

    1981-01-01

    This paper describes the quantification and utilization of seismic failure data used in the Diablo Canyon Seismic Risk Study. A single-variable representation, using peak horizontal ground acceleration to characterize earthquake severity, was employed. A multiple-variable representation would allow direct consideration of vertical accelerations and the spectral nature of earthquakes, but would have added such complexity that the study would not have been feasible. Vertical accelerations and spectral nature were indirectly considered because component failure data were derived from design analyses, qualification tests and engineering judgment that did include such considerations. Two types of functions were used to describe component failure probabilities. Ramp functions were used for components, such as piping and structures, qualified by stress analysis. 'Anchor points' for ramp functions were selected by assuming a zero probability of failure at code allowable stress levels and unity probability of failure at ultimate stress levels. The accelerations corresponding to allowable and ultimate stress levels were determined by conservatively assuming a linear relationship between seismic stress and ground acceleration. Step functions were used for components, such as mechanical and electrical equipment, qualified by testing. Anchor points for step functions were selected by assuming a unity probability of failure above the qualification acceleration. (orig./HP)
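    A minimal Python sketch of the two fragility shapes described above, with invented anchor accelerations, might look like this:

```python
import numpy as np

# Hypothetical anchor accelerations (in g) for illustration only.
A_ALLOW, A_ULT, A_QUAL = 0.3, 0.9, 0.5

def ramp_failure_prob(a):
    """Ramp function for analysis-qualified components: P = 0 at the
    acceleration corresponding to code-allowable stress, P = 1 at the
    acceleration corresponding to ultimate stress, linear in between."""
    return np.clip((a - A_ALLOW) / (A_ULT - A_ALLOW), 0.0, 1.0)

def step_failure_prob(a):
    """Step function for test-qualified components: P jumps to 1 above
    the qualification acceleration."""
    return np.where(a > A_QUAL, 1.0, 0.0)

a = np.linspace(0.0, 1.2, 7)
print(ramp_failure_prob(a))
print(step_failure_prob(a))
```

    The linear stress-to-acceleration mapping mentioned in the abstract is what lets both anchor points be stated directly in acceleration units.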

  14. CONFIDENCE LEVELS AND/VS. STATISTICAL HYPOTHESIS TESTING IN STATISTICAL ANALYSIS. CASE STUDY

    Directory of Open Access Journals (Sweden)

    ILEANA BRUDIU

    2009-05-01

    Parameters estimated with confidence intervals and tests of statistical hypotheses are used in statistical analysis to draw conclusions about a population from a sample extracted from it. The case study presented in this paper aims to highlight the importance of the sample size and how it is reflected in the results obtained when using confidence intervals and hypothesis testing. Whereas statistical hypothesis testing only gives a 'yes' or 'no' answer to a question, statistical estimation using confidence intervals provides more information than a test statistic: it shows the high degree of uncertainty arising from small samples and qualifies findings that are 'marginally significant' or 'almost significant' (p very close to 0.05).

  15. Failure analysis of the boiler water-wall tube

    Directory of Open Access Journals (Sweden)

    S.W. Liu

    2017-10-01

    Failure analysis of the boiler water-wall tube is presented in this work. In order to examine the causes of failure, various techniques including visual inspection, chemical analysis, optical microscopy, scanning electron microscopy and energy dispersive spectroscopy were carried out. Tube wall thickness measurements were performed on the ruptured tube. The fire-facing side of the tube was observed to have experienced significant wall thinning. The composition of the matrix material of the tube meets the requirements of the relevant standards. Microscopic examinations showed that the spheroidization of pearlite is not very obvious. The failure mechanism is identified as a result of the significant localized wall thinning of the boiler water-wall tube due to oxidation.

  16. Reliability Analysis of Fatigue Failure of Cast Components for Wind Turbines

    Directory of Open Access Journals (Sweden)

    Hesam Mirzaei Rafsanjani

    2015-04-01

    Fatigue failure is one of the main failure modes for wind turbine drivetrain components made of cast iron. The wind turbine drivetrain consists of a variety of heavily loaded components, like the main shaft, the main bearings, the gearbox and the generator. The failure of each component will lead to substantial economic losses such as the cost of lost energy production and the cost of repairs. During the design lifetime, the drivetrain components are exposed to variable loads from winds and waves and other sources of loads that are uncertain and have to be modeled as stochastic variables. The types of loads are different for offshore and onshore wind turbines. Moreover, uncertainties about the fatigue strength play an important role in modeling and assessment of the reliability of the components. In this paper, a generic stochastic model for fatigue failure of cast iron components based on fatigue test data and a limit state equation for fatigue failure based on the SN-curve approach and Miner's rule are presented. The statistical analysis of the fatigue data is performed using the Maximum Likelihood Method, which also gives an estimate of the statistical uncertainties. Finally, illustrative examples are presented with reliability analyses depending on various stochastic models and partial safety factors.
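    The SN-curve/Miner's-rule limit state lends itself to a compact numerical illustration. The Python sketch below (hypothetical SN parameters and load spectrum, deterministic for brevity, whereas the paper treats these quantities as stochastic) accumulates Miner damage over a load spectrum:

```python
import numpy as np

# Hypothetical SN-curve parameters: log10(N) = LOG_K - M * log10(S).
LOG_K, M = 12.0, 3.0

def cycles_to_failure(stress_range):
    """Number of cycles N to failure at a given stress range S (MPa)."""
    return 10.0 ** (LOG_K - M * np.log10(stress_range))

# Invented load spectrum: (stress range in MPa, applied cycles n_i) pairs.
spectrum = [(80.0, 2e6), (120.0, 5e5), (200.0, 1e4)]

# Miner's rule: damage D = sum(n_i / N_i); failure is predicted when D >= 1.
damage = sum(n / cycles_to_failure(s) for s, n in spectrum)
print(f"Miner damage sum D = {damage:.3f} -> "
      f"{'failure predicted' if damage >= 1 else 'survives'}")
```

    In the reliability setting of the paper, the SN parameters and the damage sum at failure become random variables, and the limit state is written as their difference.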

  17. Statistical analysis of the ASME KIc database

    International Nuclear Information System (INIS)

    Sokolov, M.A.

    1998-01-01

    The American Society of Mechanical Engineers (ASME) K_Ic curve is a function of test temperature (T) normalized to a reference nil-ductility temperature, RT_NDT, namely, T - RT_NDT. It was constructed as the lower boundary to the available K_Ic database. Being a lower bound to the unique but limited database, the ASME K_Ic curve concept does not address probability. However, a continuing evolution of fracture mechanics advances has led to employment of the Weibull distribution function to model the scatter of fracture toughness values in the transition range. The Weibull statistic/master curve approach was applied to analyze the current ASME K_Ic database. It is shown that the Weibull distribution function models the scatter in K_Ic data from different materials very well, while the temperature dependence is described by the master curve. Probabilistic-based tolerance-bound curves are suggested to describe lower-bound K_Ic values.
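    The abstract does not reproduce the formulas; in the standard master-curve formulation (as later codified, e.g., in ASTM E1921), the toughness scatter follows a three-parameter Weibull distribution with a fixed shape of 4, and the median toughness follows an exponential temperature dependence:

```latex
P_f = 1 - \exp\!\left[-\left(\frac{K_{Jc} - K_{\min}}{K_0 - K_{\min}}\right)^{4}\right],
\qquad K_{\min} \approx 20~\mathrm{MPa\sqrt{m}},
\qquad
K_{Jc(\mathrm{med})} = 30 + 70\,\exp\!\left[0.019\,(T - T_0)\right]~\mathrm{MPa\sqrt{m}},
```

    where T_0 is the reference temperature at which the median toughness of a 1T specimen is 100 MPa√m. Whether the paper uses exactly these constants is not stated in the abstract; the equations are given here only to show the form of the Weibull/master-curve approach.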

  18. Statistical analysis of earthquake ground motion parameters

    International Nuclear Information System (INIS)

    1979-12-01

    Several earthquake ground response parameters that define the strength, duration, and frequency content of the motions are investigated using regression analyses techniques; these techniques incorporate statistical significance testing to establish the terms in the regression equations. The parameters investigated are the peak acceleration, velocity, and displacement; Arias intensity; spectrum intensity; bracketed duration; Trifunac-Brady duration; and response spectral amplitudes. The study provides insight into how these parameters are affected by magnitude, epicentral distance, local site conditions, direction of motion (i.e., whether horizontal or vertical), and earthquake event type. The results are presented in a form so as to facilitate their use in the development of seismic input criteria for nuclear plants and other major structures. They are also compared with results from prior investigations that have been used in the past in the criteria development for such facilities

  19. Failure modes and effects analysis (FMEA) for Gamma Knife radiosurgery.

    Science.gov (United States)

    Xu, Andy Yuanguang; Bhatnagar, Jagdish; Bednarz, Greg; Flickinger, John; Arai, Yoshio; Vacsulka, Jonet; Feng, Wenzheng; Monaco, Edward; Niranjan, Ajay; Lunsford, L Dade; Huq, M Saiful

    2017-11-01

    Gamma Knife radiosurgery is a highly precise and accurate treatment technique for treating brain diseases with low risk of serious error that nevertheless could potentially be reduced. We applied the AAPM Task Group 100 recommended failure modes and effects analysis (FMEA) tool to develop a risk-based quality management program for Gamma Knife radiosurgery. A team consisting of medical physicists, radiation oncologists, neurosurgeons, radiation safety officers, nurses, operating room technologists, and schedulers at our institution and an external physicist expert on Gamma Knife was formed for the FMEA study. A process tree and a failure mode table were created for the Gamma Knife radiosurgery procedures using the Leksell Gamma Knife Perfexion and 4C units. Three scores for the probability of occurrence (O), the severity (S), and the probability of no detection for failure mode (D) were assigned to each failure mode by 8 professionals on a scale from 1 to 10. An overall risk priority number (RPN) for each failure mode was then calculated from the averaged O, S, and D scores. The coefficient of variation for each O, S, or D score was also calculated. The failure modes identified were prioritized in terms of both the RPN scores and the severity scores. The established process tree for Gamma Knife radiosurgery consists of 10 subprocesses and 53 steps, including a subprocess for frame placement and 11 steps that are directly related to the frame-based nature of the Gamma Knife radiosurgery. Out of the 86 failure modes identified, 40 Gamma Knife specific failure modes were caused by the potential for inappropriate use of the radiosurgery head frame, the imaging fiducial boxes, the Gamma Knife helmets and plugs, the skull definition tools as well as other features of the GammaPlan treatment planning system. The other 46 failure modes are associated with the registration, imaging, image transfer, contouring processes that are common for all external beam radiation therapy

  20. Rooting out causes in failure analysis; Risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Keith, Graeme

    2010-07-01

    The Deepwater Horizon disaster was a terrible reminder of the consequences of equipment failure on facilities operating in challenging environments. Thankfully, catastrophes on the scale of the Deepwater Horizon are rare, but equipment failure is a daily occurrence on installations around the globe. The consequences range from short unexpected downtime to a total stop on production, from a brief burst of flaring to lasting environmental damage, and from the momentary discomfiture of a worker to incapacity or death. (Author)

  1. Statistical power analysis for the behavioral sciences

    National Research Council Canada - National Science Library

    Cohen, Jacob

    1988-01-01

    .... A chapter has been added for power analysis in set correlation and multivariate methods (Chapter 10). Set correlation is a realization of the multivariate general linear model, and incorporates the standard multivariate methods...

  2. Statistical power analysis for the behavioral sciences

    National Research Council Canada - National Science Library

    Cohen, Jacob

    1988-01-01

    ... offers a unifying framework and some new data-analytic possibilities. 2. A new chapter (Chapter 11) considers some general topics in power analysis in more integrted form than is possible in the earlier...

  3. Statistical methods for categorical data analysis

    CERN Document Server

    Powers, Daniel

    2008-01-01

    This book provides a comprehensive introduction to methods and models for categorical data analysis and their applications in social science research. Companion website also available, at https://webspace.utexas.edu/dpowers/www/

  4. SU-E-T-205: MLC Predictive Maintenance Using Statistical Process Control Analysis.

    Science.gov (United States)

    Able, C; Hampton, C; Baydush, A; Bright, M

    2012-06-01

    MLC failure increases accelerator downtime and negatively affects the clinic treatment delivery schedule. This study investigates the use of Statistical Process Control (SPC), a modern quality control methodology, to retrospectively evaluate MLC performance data, thereby predicting the impending failure of individual MLC leaves. SPC, a methodology which detects exceptional variability in a process, was used to analyze MLC leaf velocity data. An MLC velocity test is performed weekly on all leaves during morning QA. The leaves sweep 15 cm across the radiation field with the gantry pointing down. The leaf speed is analyzed from the generated dynalog file using quality assurance software. MLC leaf speeds in which a known motor failure occurred (8) and those in which no motor replacement was performed (11) were retrospectively evaluated for a 71-week period. SPC individual and moving range (I/MR) charts were used in the analysis. The I/MR chart limits were calculated using the first twenty weeks of data and set at 3 standard deviations from the mean. The MLCs in which a motor failure occurred followed two general trends: (a) no data indicating a change in leaf speed prior to failure (5 of 8) and (b) a series of data points exceeding the limit prior to motor failure (3 of 8). I/MR charts for a high percentage (8 of 11) of the non-replaced MLC motors indicated that only a single point exceeded the limit. These single point excesses were deemed false positives. SPC analysis using MLC performance data may be helpful in detecting a significant percentage of impending failures of MLC motors. The ability to detect MLC failure may depend on the method of failure (i.e. gradual or catastrophic). Further study is needed to determine if increasing the sampling frequency could increase reliability. The project was supported by a grant from Varian Medical Systems, Inc. © 2012 American Association of Physicists in Medicine.
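    A minimal sketch of such an individuals chart in Python (synthetic leaf-speed data; sigma is estimated from the average moving range with the usual d2 = 1.128 constant, a standard I/MR convention the abstract does not spell out):

```python
import numpy as np

rng = np.random.default_rng(0)
speed = rng.normal(2.5, 0.02, size=71)   # weekly leaf speed (cm/s), synthetic
baseline = speed[:20]                    # first 20 weeks set the limits

# Individuals chart: mean +/- 3 sigma, with sigma estimated from the
# average moving range (MR-bar / d2, where d2 = 1.128 for subgroups of 2).
mr = np.abs(np.diff(baseline))
sigma = mr.mean() / 1.128
center = baseline.mean()
ucl, lcl = center + 3 * sigma, center - 3 * sigma

flagged = np.flatnonzero((speed > ucl) | (speed < lcl))
print(f"I-chart limits: [{lcl:.3f}, {ucl:.3f}] cm/s")
print("weeks flagged as out of control:", flagged)
```

    In the study's terms, a run of flagged weeks (rather than a single excursion) is the pattern that preceded some of the motor failures.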

  5. Reliability test and failure analysis of high power LED packages

    International Nuclear Information System (INIS)

    Chen Zhaohui; Zhang Qin; Wang Kai; Luo Xiaobing; Liu Sheng

    2011-01-01

    A new type of application specific light emitting diode (LED) package (ASLP) with a freeform polycarbonate lens for street lighting is developed, whose manufacturing processes are compatible with a typical LED packaging process. The reliability test methods and failure criteria from different vendors are reviewed and compared. It is found that the test methods and failure criteria are quite different. Rapid reliability assessment standards are urgently needed for the LED industry. 85 °C/85% RH testing with 700 mA was used on our LED modules together with those of three other vendors for 1000 h; our modules showed no visible degradation in optical performance, while the modules of two other vendors showed significant degradation. Failure analysis methods such as C-SAM, nano X-ray CT and optical microscopy are used for the LED packages. Failure mechanisms such as delaminations and cracks are detected in the LED packages after the accelerated reliability testing. The finite element simulation method is helpful for the failure analysis and the design for reliability of LED packaging. One example shows that a module currently used in industry is vulnerable and may not easily pass harsh thermal cycle testing. (semiconductor devices)

  6. Failure criterion of concrete type material and punching failure analysis of thick mortar plate

    International Nuclear Information System (INIS)

    Ohno, T.; Kuroiwa, M.; Irobe, M.

    1979-01-01

    In this paper a failure surface for concrete-type materials is proposed and its validity for structural analysis is examined. The study is an introductory part of the evaluation of the ultimate strength of reinforced and prestressed concrete structures in reactor technology. The failure surface is expressed in a linear form in terms of the octahedral normal and shear stresses. The coefficient of the latter stress is given by a trigonometric series in the threefold angle of similarity. Hence, its meridians are multilinear and the traces of its deviatoric sections are smooth curves having a periodicity of 2π/3 around the space diagonal in principal stress space. The mathematical expression of the surface has an arbitrary number of parameters, so that material test results are well reflected. To confirm the effectiveness of the proposed failure criterion, experiment and numerical analysis by the finite element method on the punching failure of a thick mortar plate in axial symmetry are compared. In the numerical procedure the yield surface of the material is assumed to exist mainly in the compression region, since a brittle cleavage or elastic fracture occurs in concrete-type material under stress states involving tension, while a ductile or plastic fracture occurs under compressive stress states. (orig.)
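    The abstract does not write the surface out; a minimal sketch of the functional form it describes (linear in the octahedral stresses σ_oct and τ_oct, with the shear coefficient expanded in the threefold angle of similarity θ; the symbols a_n, b_n and c_0 are hypothetical parameter names) could read:

```latex
\sigma_{\mathrm{oct}} + g(\theta)\,\tau_{\mathrm{oct}} = c_0 ,
\qquad
g(\theta) = \sum_{n=0}^{N} \left( a_n \cos 3n\theta + b_n \sin 3n\theta \right),
```

    The series in 3nθ makes g periodic with period 2π/3, matching the threefold symmetry of the deviatoric sections noted in the abstract, and truncating at larger N supplies the "arbitrary number of parameters" available for fitting material test results.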

  7. Universal avalanche statistics and triggering close to failure in a mean-field model of rheological fracture

    Science.gov (United States)

    Baró, Jordi; Davidsen, Jörn

    2018-03-01

    The hypothesis of critical failure relates the presence of an ultimate stability point in the structural constitutive equation of materials to a divergence of characteristic scales in the microscopic dynamics responsible for deformation. Avalanche models involving critical failure have determined common universality classes for stick-slip processes and fracture. However, not all empirical failure processes exhibit the trademarks of criticality. The rheological properties of materials introduce dissipation, usually reproduced in conceptual models as a hardening of the coarse grained elements of the system. Here, we investigate the effects of transient hardening on (i) the activity rate and (ii) the statistical properties of avalanches. We find the explicit representation of transient hardening in the presence of generalized viscoelasticity and solve the corresponding mean-field model of fracture. In the quasistatic limit, the accelerated energy release is invariant with respect to rheology and the avalanche propagation can be reinterpreted in terms of a stochastic counting process. A single universality class can be defined from such analogy, and all statistical properties depend only on the distance to criticality. We also prove that interevent correlations emerge due to the hardening—even in the quasistatic limit—that can be interpreted as "aftershocks" and "foreshocks."

  8. Statistical Modelling of Wind Profiles - Data Analysis and Modelling

    DEFF Research Database (Denmark)

    Jónsson, Tryggvi; Pinson, Pierre

    The aim of the analysis presented in this document is to investigate whether statistical models can be used to make very short-term predictions of wind profiles.

  9. Sensitivity analysis of ranked data: from order statistics to quantiles

    NARCIS (Netherlands)

    Heidergott, B.F.; Volk-Makarewicz, W.

    2015-01-01

    In this paper we provide the mathematical theory for sensitivity analysis of order statistics of continuous random variables, where the sensitivity is with respect to a distributional parameter. Sensitivity analysis of order statistics over a finite number of observations is discussed before

  10. Statistics

    CERN Document Server

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  11. Evaluating wood failure in plywood shear by optical image analysis

    Science.gov (United States)

    Charles W. McMillin

    1984-01-01

    This exploratory study evaluates the potential of using an automatic image analysis method to measure percent wood failure in plywood shear specimens. The results suggest that this method may be as accurate as the visual method in tracking long-term gluebond quality. With further refinement, the method could lead to automated equipment replacing the subjective visual...

  12. Failure analysis of multiple delaminated composite plates due

    Indian Academy of Sciences (India)

    The present work aims at the first ply failure analysis of laminated composite plates with arbitrarily located multiple delaminations subjected to transverse static load as well as impact. The theoretical formulation is based on a simple multiple delamination model. Conventional first order shear deformation is assumed using ...

  13. A pragmatic approach to estimate alpha factors for common cause failure analysis

    International Nuclear Information System (INIS)

    Hassija, Varun; Senthil Kumar, C.; Velusamy, K.

    2014-01-01

    Highlights: • Estimation of coefficients in the alpha factor model for common cause analysis. • A derivation of plant-specific alpha factors is demonstrated. • We examine the sensitivity of the common cause contribution to total system failure. • We compare the beta factor and alpha factor models for various redundant configurations. • The use of alpha factors is preferable, especially for large redundant systems. - Abstract: Most modern technological systems are deployed with high redundancy but still fail mainly on account of common cause failures (CCF). Various models such as Beta Factor, Multiple Greek Letter, Binomial Failure Rate and Alpha Factor exist for estimating the risk from common cause failures. Amongst these, the alpha factor model is considered most suitable for highly redundant systems, as it arrives at common cause failure probabilities from a set of ratios of failures and the total component failure probability Q_T. In the present study, the alpha factor model is applied to the assessment of CCF of safety systems deployed at two nuclear power plants. A method to overcome the difficulties in estimating the coefficients, viz. the alpha factors, in the model, the importance of deriving plant-specific alpha factors, and the sensitivity of the common cause contribution to the total system failure probability with respect to the hazard imposed by various CCF events are highlighted. An approach described in NUREG/CR-5500 is extended in this study to provide more explicit guidance for a statistical approach to derive plant-specific coefficients for CCF analysis, especially for highly redundant systems. The procedure is expected to aid regulators in independent safety assessment.
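    The abstract does not reproduce the model equations; in the standard alpha-factor formulation (as given, for example, in the NUREG common-cause guidance), the probability of a basic event failing a specific set of k of the m components in a common cause group is

```latex
Q_k^{(m)} = \frac{k}{\binom{m-1}{k-1}}\,\frac{\alpha_k}{\alpha_t}\,Q_T ,
\qquad
\alpha_t = \sum_{k=1}^{m} k\,\alpha_k ,
```

    where α_k is the fraction of failure events that involve k components and Q_T is the total failure probability of one component. This is the non-staggered-testing form; whether the paper uses exactly this variant is not stated in the abstract.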

  14. Risk analysis of geothermal power plants using Failure Modes and Effects Analysis (FMEA) technique

    International Nuclear Information System (INIS)

    Feili, Hamid Reza; Akar, Navid; Lotfizadeh, Hossein; Bairampour, Mohammad; Nasiri, Sina

    2013-01-01

    Highlights: • Using Failure Modes and Effects Analysis (FMEA) to find potential failures in geothermal power plants. • We considered 5 major parts of geothermal power plants for risk analysis. • The Risk Priority Number (RPN) is calculated for all failure modes. • Corrective actions are recommended to eliminate or decrease the risk of failure modes. - Abstract: Renewable energy plays a key role in the transition toward a low carbon economy and the provision of a secure supply of energy. Geothermal energy is a versatile source as a form of renewable energy that meets popular demand. Since geothermal power plants (GPPs) face various failures, a team engineering technique for eliminating or decreasing potential failures is much needed. Because no specific published record of an FMEA applied to GPPs with common failure modes has been found to date, this paper considers the utilization of Failure Modes and Effects Analysis (FMEA) as a convenient technique for determining, classifying and analyzing common failures in typical GPPs. As a result, an appropriate risk scoring of the occurrence, detection and severity of failure modes and the computation of the Risk Priority Number (RPN) for detecting high-potential failures are achieved. To improve the accuracy and the ability to analyze the process, the XFMEA software is utilized. Moreover, 5 major parts of a GPP are studied to propose a suitable approach for developing GPPs and increasing reliability by recommending corrective actions for each failure mode.
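    The RPN bookkeeping itself is simple enough to sketch. The following Python fragment (invented failure modes and scores, purely illustrative) ranks failure modes by RPN = occurrence × severity × detection:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    occurrence: int   # O, 1-10
    severity: int     # S, 1-10
    detection: int    # D, 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        # Conventional FMEA risk priority number
        return self.occurrence * self.severity * self.detection

# Hypothetical GPP failure modes for illustration; scores are invented.
modes = [
    FailureMode("turbine blade erosion", 6, 7, 4),
    FailureMode("well casing corrosion", 3, 9, 7),
    FailureMode("condenser fouling", 7, 4, 3),
]

for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.name:25s} RPN = {m.rpn}")
```

    In practice the O, S and D scores come from team consensus, which is why tools such as XFMEA focus on managing the scoring workshops rather than the arithmetic.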

  15. Statistical analysis of disruptions in JET

    International Nuclear Information System (INIS)

    De Vries, P.C.; Johnson, M.F.; Segui, I.

    2009-01-01

    The disruption rate (the percentage of discharges that disrupt) in JET was found to drop steadily over the years. Recent campaigns (2005-2007) show a yearly averaged disruption rate of only 6%, while from 1991 to 1995 it was often higher than 20%. Besides the disruption rate, the so-called disruptivity, or the likelihood of a disruption depending on the plasma parameters, has been determined. The disruptivity of plasmas was found to be significantly higher close to the three main operational boundaries for tokamaks: the low-q, high density and β-limits. The frequency at which JET operated close to the density limit increased sixfold over the last decade; however, only a small reduction in disruptivity was found. Similarly, the disruptivity close to the low-q and β-limits was found to be unchanged. The most significant reduction in disruptivity was found far from the operational boundaries, leading to the conclusion that the improved disruption rate is due to a better technical capability of operating JET, rather than safer operation close to the physics limits. The statistics showed that a simple protection system was able to mitigate the forces of a large fraction of disruptions, although ameliorating the heat flux has so far proved more difficult.

  16. Failure analysis of carbide fuels under transient overpower (TOP) conditions

    International Nuclear Information System (INIS)

    Nguyen, D.H.

    1980-06-01

    The failure of carbide fuels in the Fast Test Reactor (FTR) under Transient Overpower (TOP) conditions has been examined. The Beginning-of-Cycle Four (BOC-4) all-oxide base case, at a $0.50/s ramp rate, was selected as the reference case. A coupling between the advanced fuel performance code UNCLE-T and the HCDA code MELT-IIIA was necessary for the analysis. UNCLE-T was used to determine cladding failure and fuel preconditioning, which served as initial conditions for the MELT-IIIA calculations. MELT-IIIA determined the time of molten fuel ejection from the fuel pin.

  17. Bruxism and dental implant failures: a multilevel mixed effects parametric survival analysis approach.

    Science.gov (United States)

    Chrcanovic, B R; Kisch, J; Albrektsson, T; Wennerberg, A

    2016-11-01

    Recent studies have suggested that the insertion of dental implants in patients diagnosed with bruxism negatively affects the implant failure rates. The aim of the present study was to investigate the association between bruxism and the risk of dental implant failure. This retrospective study is based on 2670 patients who received 10 096 implants at one specialist clinic. Implant- and patient-related data were collected. Descriptive statistics were used to describe the patients and implants. Multilevel mixed effects parametric survival analysis was used to test the association between bruxism and the risk of implant failure, adjusting for several potential confounders. Criteria from a recent international consensus (Lobbezoo et al., J Oral Rehabil, 40, 2013, 2) and from the International Classification of Sleep Disorders (International classification of sleep disorders, revised: diagnostic and coding manual, American Academy of Sleep Medicine, Chicago, 2014) were used to define and diagnose the condition. The number of implants with information available for all variables totalled 3549, placed in 994 patients, with 179 implants reported as failures. The implant failure rates were 13.0% (24/185) for bruxers and 4.6% (155/3364) for non-bruxers. Bruxism was a statistically significant risk factor for implant failure (HR 3.396; 95% CI 1.314, 8.777; P = 0.012), as were implant length, implant diameter, implant surface, bone quantity D in relation to quantity A, bone quality 4 in relation to quality 1 (Lekholm and Zarb classification), smoking and the intake of proton pump inhibitors. It is suggested that bruxism may be associated with an increased risk of dental implant failure. © 2016 John Wiley & Sons Ltd.
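    As a rough illustration of a parametric survival analysis of this kind, the Python sketch below fits a Weibull accelerated-failure-time model with the lifelines package to synthetic data (the columns `time`, `failed` and `bruxism` are hypothetical); the multilevel/frailty structure used in the paper is beyond this sketch:

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(1)
n = 500
bruxism = rng.integers(0, 2, n)
# Synthetic implant survival times (months): bruxers fail earlier on average.
t = rng.weibull(1.5, n) * np.where(bruxism == 1, 60, 120)
observed = t < 96                         # administrative censoring at 96 months
df = pd.DataFrame({"time": np.minimum(t, 96),
                   "failed": observed.astype(int),
                   "bruxism": bruxism})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="time", event_col="failed")
aft.print_summary()   # bruxism coefficient indicates shorter time to failure
```

    A mixed-effects (frailty) extension would add a patient-level random effect to capture the clustering of implants within patients reported in the study.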

  18. Failure mode and effects analysis outputs: are they valid?

    Science.gov (United States)

    Shebl, Nada Atef; Franklin, Bryony Dean; Barber, Nick

    2012-06-10

    Failure Mode and Effects Analysis (FMEA) is a prospective risk assessment tool that has been widely used within the aerospace and automotive industries and has been utilised within healthcare since the early 1990s. The aim of this study was to explore the validity of FMEA outputs within a hospital setting in the United Kingdom. Two multidisciplinary teams each conducted an FMEA for the use of vancomycin and gentamicin. Four different validity tests were conducted: Face validity: by comparing the FMEA participants' mapped processes with observational work. Content validity: by presenting the FMEA findings to other healthcare professionals. Criterion validity: by comparing the FMEA findings with data reported on the trust's incident report database. Construct validity: by exploring the relevant mathematical theories involved in calculating the FMEA risk priority number. Face validity was positive as the researcher documented the same processes of care as mapped by the FMEA participants. However, other healthcare professionals identified potential failures missed by the FMEA teams. Furthermore, the FMEA groups failed to include failures related to omitted doses; yet these were the failures most commonly reported in the trust's incident database. Calculating the RPN by multiplying severity, probability and detectability scores was deemed invalid because it is based on calculations that breach the mathematical properties of the scales used. There are significant methodological challenges in validating FMEA. It is a useful tool to aid multidisciplinary groups in mapping and understanding a process of care; however, the results of our study cast doubt on its validity. FMEA teams are likely to need different sources of information, besides their personal experience and knowledge, to identify potential failures. As for FMEA's methodology for scoring failures, there were discrepancies between the teams' estimates and similar incidents reported on the trust's incident

  19. Common Cause Failure Analysis for the Digital Plant Protection System

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Jang, Seung Cheol

    2005-01-01

    Safety-critical systems such as nuclear power plants adopt a multiple-redundancy design in order to reduce the risk from single component failures. The digitalized safety-signal generation system is also designed on the basis of a multiple-redundancy strategy that consists of more redundant components. The level of redundant design of digital systems is usually higher than that of conventional mechanical systems. This higher redundancy clearly reduces the risk from single failures of components, but raises the importance of common cause failure (CCF) analysis. This research aims to develop a practical and realistic method for modeling CCF in digital safety-critical systems. We propose a simple and practical framework for assessing the CCF probability of digital equipment. A higher level of redundancy makes CCF analysis difficult because it results in an impractically large number of CCF events in the fault tree model when conventional CCF modeling methods are used. We apply the simplified alpha-factor (SAF) method to the digital system CCF analysis. A precedent study has shown that the SAF method is simple yet quite realistic when system success criteria are considered carefully. The first step in using the SAF method is the analysis of the target system to determine the function failure cases; that is, the success criteria of the system are derived from the target system's function and configuration. Based on this analysis, we can calculate the probability of a single CCF event which represents the CCF events resulting in system failure. In addition to the application of the SAF method, in order to accommodate other characteristics of digital technology, we develop a simple concept and several equations for practical use.

  20. Trend and pattern analysis of failures of main feedwater system components in United States commercial nuclear power plants

    International Nuclear Information System (INIS)

    Gentillon, C.D.; Meachum, T.R.; Brady, B.M.

    1987-01-01

    The goal of the trend and pattern analysis of MFW (main feedwater) component failure data is to identify component attributes that are associated with relatively high incidences of failure. Manufacturer, valve type, and pump rotational speed are examples of component attributes under study; in addition, the pattern of failures among NPP units is studied. A series of statistical methods is applied to identify trends and patterns in failures and trends in occurrences in time with regard to these component attributes or variables. This process is followed by an engineering evaluation of the statistical results. In the remainder of this paper, the characteristics of the NPRDS that facilitate its use in reliability and risk studies are highlighted, the analysis methods are briefly described, and the lessons learned thus far for improving MFW system availability and reliability are summarized (orig./GL)

  1. PACC information management code for common cause failures analysis

    International Nuclear Information System (INIS)

    Ortega Prieto, P.; Garcia Gay, J.; Mira McWilliams, J.

    1987-01-01

    The purpose of this paper is to present the PACC code, which, through an adequate data management, makes the task of computerized common-mode failure analysis easier. PACC processes and generates information in order to carry out the corresponding qualitative analysis, by means of the boolean technique of transformation of variables, and the quantitative analysis either using one of several parametric methods or a direct data-base. As far as the qualitative analysis is concerned, the code creates several functional forms for the transformation equations according to the user's choice. These equations are subsequently processed by boolean manipulation codes, such as SETS. The quantitative calculations of the code can be carried out in two different ways: either starting from a common cause data-base, or through parametric methods, such as the Binomial Failure Rate Method, the Basic Parameters Method or the Multiple Greek Letter Method, among others. (orig.)

  2. TU-AB-BRD-02: Failure Modes and Effects Analysis

    International Nuclear Information System (INIS)

    Huq, M.

    2015-01-01

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has adopted these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) Learn how to ...

  3. TU-AB-BRD-02: Failure Modes and Effects Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huq, M. [University of Pittsburgh Medical Center (United States)

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has adopted these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) Learn how to ...
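
A minimal sketch of the risk priority number (RPN) ranking at the heart of FMEA, as described above: each failure mode is scored for severity, occurrence, and detectability (here on 1-10 scales), and the product ranks the modes. The failure modes and scores are hypothetical.

```python
# Rank hypothetical failure modes by RPN = severity x occurrence x
# detectability, highest risk first.
failure_modes = [
    # (description, severity, occurrence, detectability)
    ("wrong patient plan selected",      9, 2, 6),
    ("MLC leaf position error",          7, 3, 4),
    ("incorrect dose prescription unit", 10, 1, 5),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3],
                reverse=True)
for desc, s, o, d in ranked:
    print(f"RPN={s * o * d:4d}  {desc}")
```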

  4. Analysis of valve failures from the NUCLARR data base

    International Nuclear Information System (INIS)

    Moore, L.M.

    1997-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) contains data on component failures with categorical and qualifying information, such as component design, normal operating state, system application, and safety grade, that is important to the development of risk-based component surveillance testing requirements. This report presents descriptions and results of analyses of valve component failure data and covariate information available in the document Nuclear Computerized Library for Assessing Reactor Reliability Data Manual, Part 3: Hardware Component Failure Data (NUCLARR Data Manual). Although there are substantial records on valve performance, specific values are missing for many categories of the corresponding descriptors and qualifying information, which limits the data available for analysis of covariate effects. This report presents cross tabulations by different covariate categories and limited modeling of covariate effects for data subsets with substantive non-missing covariate information.
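
As a sketch of the cross-tabulation step mentioned above, the following tallies failure records by two covariate categories with pandas; the column names and records are hypothetical stand-ins for NUCLARR fields.

```python
# Count failure records by valve type and failure mode.
import pandas as pd

records = pd.DataFrame({
    "valve_type": ["gate", "globe", "gate", "check", "globe", "gate"],
    "failure_mode": ["leak", "fail to open", "leak", "fail to close",
                     "leak", "fail to open"],
})

print(pd.crosstab(records["valve_type"], records["failure_mode"]))
```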

  5. Photovoltaic module reliability improvement through application testing and failure analysis

    Science.gov (United States)

    Dumas, L. N.; Shumka, A.

    1982-01-01

    During the first four years of the U.S. Department of Energy (DOE) National Photovoltaic Program, the Jet Propulsion Laboratory Low-Cost Solar Array (LSA) Project purchased about 400 kW of photovoltaic modules for tests and experiments. In order to identify, report, and analyze test and operational problems with the Block Procurement modules, a problem/failure reporting and analysis system was implemented by the LSA Project, with the main purpose of providing manufacturers with the feedback from test and field experience needed to improve product performance and reliability. A description of the more significant types of failures is presented, taking into account interconnects, cracked cells, dielectric breakdown, delamination, and corrosion. Current design practices and reliability evaluations are also discussed. The evaluation indicates that current module designs incorporate damage-resistant and fault-tolerant features which address the field failure mechanisms observed to date.

  6. Risk assessment of the emergency processes: Healthcare failure mode and effect analysis.

    Science.gov (United States)

    Taleghani, Yasamin Molavi; Rezaei, Fatemeh; Sheikhbardsiri, Hojat

    2016-01-01

    Ensuring patient safety is the first vital step in improving the quality of care, and the emergency ward is known as a high-risk area of health care. The present study was conducted to evaluate selected high-risk processes of the emergency surgery department of the Qaem treatment-educational center in Mashhad, using failure mode and effects analysis for health care. In this combined study (qualitative action research and quantitative cross-sectional), the failure modes and effects of 5 high-risk procedures of the emergency surgery department were identified and analyzed according to Healthcare Failure Mode and Effects Analysis (HFMEA). The failure modes were classified using the nursing errors in clinical management model (NECM), the effective causes of error were classified using the Eindhoven model, and improvement strategies were determined using the theory of inventive problem solving. Descriptive statistics (total scores) were used to analyze the quantitative data; content analysis and agreement among the members were used to analyze the qualitative data. In the 5 processes selected by a rating-based voting method, 23 steps, 61 sub-processes, and 217 potential failure modes were identified by HFMEA. Twenty-five failure modes (11.5%) were detected as high-risk errors and transferred to the decision tree. The most and the fewest failure modes fell into the categories of care errors (54.7%) and knowledge and skill errors (9.5%), respectively. Also, 29.4% of preventive measures were in the category of human resource management strategy. "Revision and re-engineering of processes", "continuous monitoring of the works", "preparation and revision of operating procedures and policies", "developing the criteria for evaluating the performance of the personnel", "designing a suitable educational content for needs of employee", "training patients", "reducing the workload and power shortage", "improving team
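
A minimal sketch of the HFMEA hazard-scoring step described above, assuming the usual VA-style worksheet in which hazard score = severity (1-4) x probability (1-4) and modes scoring at or above 8 are passed to the decision tree; the failure modes are hypothetical.

```python
# Flag hypothetical emergency-department failure modes for the HFMEA
# decision tree when hazard score = severity * probability >= threshold.
THRESHOLD = 8

failure_modes = [
    ("patient identity not verified at triage", 4, 3),
    ("delayed transfer to operating room",      3, 2),
    ("incomplete surgical checklist",           2, 2),
]

for desc, severity, probability in failure_modes:
    score = severity * probability
    flag = "-> decision tree" if score >= THRESHOLD else ""
    print(f"{score:2d}  {desc} {flag}")
```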

  7. An analysis of UK wind farm statistics

    International Nuclear Information System (INIS)

    Milborrow, D.J.

    1995-01-01

    An analysis of key data for 22 completed wind projects shows that 134 MW of plant cost Pound 152 million, giving an average cost of Pound 1136/kW. The energy generation potential of these wind farms is around 360 GWh, derived from sites with wind speeds between 6.2 and 8.8 m/s. Relationships between wind speed, energy production and cost were examined, and it was found that costs increased with wind speed due to the difficulties of access in hilly regions. It also appears that project costs have fallen with time, and wind energy prices have fallen much faster than electricity prices. (Author)

  8. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight into their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  9. Progressive Damage and Failure Analysis of Composite Laminates

    Science.gov (United States)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance, and material property tailorability. To fully exploit the capability of composites, the load-carrying capacity of the parts made of them must be known. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them a hard problem to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon and component level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon level tests to fully characterize the behavior, making the entire testing process very time consuming and costly. The alternative is to use virtual testing tools which can predict the complex failure mechanisms accurately, reducing the cost to the associated computational expense and yielding significant savings. Some of the most desired features in a virtual testing tool are: (1) Accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must match that observed in experiments, and a tool has to be assessed based on the mechanisms it can capture. (2) Computational efficiency: the greatest advantages of a virtual tool are the savings in time and money, so computational efficiency is one of the most needed features. (3) Applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic, and fatigue conditions, and a good virtual testing tool should be able to make good predictions for all of them. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis
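
The thesis's particular failure criteria are not given in this abstract; as an illustration of the ply-level checks that progressive damage analyses typically evaluate, the following sketch computes the plane-stress Hashin indices for fiber and matrix tensile failure, with hypothetical strengths and stresses.

```python
# Plane-stress Hashin tensile failure indices for a unidirectional ply;
# an index >= 1 signals failure of that mode.
def hashin_tension(s11, s22, s12, Xt, Yt, S12):
    """Return (fiber_index, matrix_index) for tensile stress states."""
    fiber = (s11 / Xt) ** 2 + (s12 / S12) ** 2 if s11 >= 0 else 0.0
    matrix = (s22 / Yt) ** 2 + (s12 / S12) ** 2 if s22 >= 0 else 0.0
    return fiber, matrix

# Hypothetical carbon/epoxy ply strengths (MPa) and a trial stress state:
print(hashin_tension(s11=1200.0, s22=30.0, s12=40.0,
                     Xt=1500.0, Yt=40.0, S12=70.0))
```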

  10. Statistical cluster analysis and diagnosis of nuclear system level performance

    International Nuclear Information System (INIS)

    Teichmann, T.; Levine, M.M.; Samanta, P.K.; Kato, W.Y.

    1985-01-01

    The complexity of individual nuclear power plants and the importance of maintaining reliable and safe operations make it desirable to complement the deterministic analyses of these plants by corresponding statistical surveys and diagnoses. Based on such investigations, one can then explore, statistically, the anticipation, prevention, and, when necessary, the control of such failures and malfunctions. This paper, and the accompanying one by Samanta et al., describe some of the initial steps in exploring the feasibility of setting up such a program on an integrated and global (industry-wide) basis. The conceptual statistical and data framework was originally outlined in BNL/NUREG-51609, NUREG/CR-3026, and the present work aims at showing how some important elements might be implemented in a practical way (albeit using hypothetical or simulated data).

  11. A statistical analysis of UK financial networks

    Science.gov (United States)

    Chu, J.; Nadarajah, S.

    2017-04-01

    In recent years, with a growing interest in big or large datasets, there has been a rise in the application of large graphs and networks to financial big data. Much of this research has focused on the construction and analysis of the network structure of stock markets, based on the relationships between stock prices. Motivated by Boginski et al. (2005), who studied the characteristics of a network structure of the US stock market, we construct network graphs of the UK stock market using the same method. We fit four distributions to the degree density of the vertices from these graphs, the Pareto I, Fréchet, lognormal, and generalised Pareto distributions, and assess the goodness of fit. Our results show that the degree density of the complements of the market graphs, constructed using a negative threshold value close to zero, can be fitted well with the Fréchet and lognormal distributions.
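
A sketch of the distribution-fitting step described above: fit lognormal and Fréchet (inverse Weibull) models to a degree sample and compare Kolmogorov-Smirnov goodness of fit. The degree data here are synthetic; the paper's market-graph construction is not reproduced.

```python
# Fit two candidate distributions to (synthetic) vertex-degree data and
# compare their Kolmogorov-Smirnov statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
degrees = rng.lognormal(mean=2.0, sigma=0.7, size=500)  # stand-in data

for name, dist in [("lognormal", stats.lognorm),
                   ("Frechet", stats.invweibull)]:
    params = dist.fit(degrees)
    ks = stats.kstest(degrees, dist.cdf, args=params)
    print(f"{name:10s} KS = {ks.statistic:.4f}, p = {ks.pvalue:.3f}")
```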

  12. A multivariate statistical methodology for detection of degradation and failure trends using nuclear power plant operational data

    International Nuclear Information System (INIS)

    Samanta, P.K.; Teichmann, T.

    1990-01-01

    In this paper, a multivariate statistical method is presented and demonstrated as a means for analyzing nuclear power plant transients (or events) and safety system performance for detection of malfunctions and degradations within the course of the event, based on operational data. The study provides the methodology and illustrative examples based on data gathered from simulation of nuclear power plant transients (due to the lack of easily accessible operational data). Such an approach, once fully developed, can be used to detect failure trends and patterns and so can lead to prevention of conditions with serious safety implications.

  13. Statistics

    Science.gov (United States)

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  14. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    Science.gov (United States)

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  15. Tests and analysis on steam generator tube failure propagation

    International Nuclear Information System (INIS)

    Tanabe, Hiromi

    1990-01-01

    The understanding of leak enlargement and failure propagation behavior is essential to selecting a design basis leak (DBL) for LMFBR steam generators. Therefore, various series of experiments, such as self-enlargement tests, target wastage tests, and failure propagation tests, were conducted over a wide range of leak rates using the SWAT test facilities at PNC/OEC. In the large leak tests in particular, the potential for overheating failure was investigated under a prototypical steam cooling condition inside the target tubes. In the small leak tests, differences in wastage resistance among several tube materials, such as 9-chromium steels, were clarified. As an analytical approach, a computer code, LEAP (Leak Enlargement and Propagation), was developed on the basis of all of these experimental results. The code was used to validate the previously selected DBL of the steam generator of the prototype reactor, Monju. This approach proved successful in spite of some over-conservatism in the analysis. Moreover, LEAP clarified the effectiveness of a rapid steam dump and an enhanced leak detection system. Improvement of the code toward a more realistic analysis is desired, however, to lessen the DBL for a future large plant, and re-evaluation of the experimental data, such as the size of secondary failures, is under way. (author). 4 refs, 8 figs, 1 tab

  16. Method for statistical data analysis of multivariate observations

    CERN Document Server

    Gnanadesikan, R

    1997-01-01

    A practical guide for multivariate statistical techniques--now updated and revised. In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of interest.

  17. Statistical evaluation of diagnostic performance topics in ROC analysis

    CERN Document Server

    Zou, Kelly H; Bandos, Andriy I; Ohno-Machado, Lucila; Rockette, Howard E

    2016-01-01

    Statistical evaluation of diagnostic performance in general and Receiver Operating Characteristic (ROC) analysis in particular are important for assessing the performance of medical tests and statistical classifiers, as well as for evaluating predictive models or algorithms. This book presents innovative approaches in ROC analysis, which are relevant to a wide variety of applications, including medical imaging, cancer research, epidemiology, and bioinformatics. Statistical Evaluation of Diagnostic Performance: Topics in ROC Analysis covers areas including monotone-transformation techniques in parametric ROC analysis, ROC methods for combined and pooled biomarkers, Bayesian hierarchical transformation models, sequential designs and inferences in the ROC setting, predictive modeling, multireader ROC analysis, and free-response ROC (FROC) methodology. The book is suitable for graduate-level students and researchers in statistics, biostatistics, epidemiology, public health, biomedical engineering, radiology, medi...

  18. Online Statistical Modeling (Regression Analysis) for Independent Responses

    Science.gov (United States)

    Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus

    2017-06-01

    Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Statistical models have been developed in various directions to handle diverse types of data and complex relationships. A rich variety of advanced and recent statistical models is available in open source software (one of them being R). However, these advanced models are not very friendly to novice R users, since they are driven by programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical models are readily available, accessible, and applicable on the web. We previously built an interface in the form of e-tutorials for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM, and generalized additive models for location, scale and shape/GAMLSS). In this research we unified them into a data analysis workflow, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible in our online Virtual Statistics Laboratory. The web interface makes statistical models easier to apply and easier to compare, in order to find the most appropriate model for the data.
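
The platform described above is built on R and Shiny; to keep this document's examples in one language, the following sketch fits one of the model classes the interface exposes, a Poisson GLM, in Python with statsmodels on synthetic data.

```python
# Fit a Poisson GLM (log link) to synthetic count data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.uniform(0, 2, size=100)
y = rng.poisson(np.exp(0.5 + 0.8 * x))   # true intercept 0.5, slope 0.8

X = sm.add_constant(x)
model = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(model.params)                      # estimated intercept and slope
```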

  19. Statistical Analysis and Modelling of Olkiluoto Structures

    International Nuclear Information System (INIS)

    Hellae, P.; Vaittinen, T.; Saksa, P.; Nummela, J.

    2004-11-01

    Posiva Oy is carrying out investigations for the disposal of spent nuclear fuel at the Olkiluoto site in SW Finland. The investigations have focused on the central part of the island. The layout design of the entire repository requires characterization of notably larger areas and must rely, at least at the current stage, on borehole information from a rather sparse network and on geophysical soundings providing information outside and between the holes. In this work, the structural data according to the current version of the Olkiluoto bedrock model are analyzed. The bedrock model relies heavily on the borehole data, although results of the seismic surveys and, for example, pumping tests are used in determining the orientation and continuation of the structures. The analysis concentrates in particular on questions related to the frequency and size of the structures. The structures observed in the boreholes are mainly dipping gently to the southeast. About 9% of the sampled length belongs to structures; the proportion is higher in the upper parts of the rock. The number of fracture and crushed zones seems not to depend greatly on depth, whereas the hydraulic features concentrate in the depth range above -100 m. Below level -300 m, hydraulic conductivity occurs in connection with fractured zones. The hydraulic features especially, but also the fracture and crushed zones, often occur in groups. The frequency of structures (area of structures per total volume) is estimated to be of the order of 1/100 m. The size of the local structures was estimated by calculating the intersection of each zone with the nearest borehole in which the zone was not detected. Stochastic models using the Fracman software by Golder Associates were generated based on the bedrock model data complemented with the magnetic ground survey data. The seismic surveys (from boreholes KR5, KR13, KR14, and KR19) were used as alternative input data. The generated models were tested by

  20. A statistical model for prediction of fuel element failure using the Markov process and entropy minimax principles

    International Nuclear Information System (INIS)

    Choi, K.Y.; Yoon, Y.K.; Chang, S.H.

    1991-01-01

    This paper reports on a new statistical fuel failure model developed to take into account the effects of damaging environmental conditions and the overall operating history of the fuel elements. The degradation of material properties and damage resistance of the fuel cladding is mainly caused by the combined effects of accumulated dynamic stresses, neutron irradiation, and chemical and stress corrosion at operating temperature. Since the degradation of material properties due to these effects can be considered a stochastic process, a dynamic reliability function is derived based on the Markov process. Four damage parameters, namely dynamic stresses, magnitude of the power increase from the preceding power level, ramp rate, and fatigue cycles, are used to build this model. The dynamic reliability function and damage parameters are used to obtain effective damage parameters. The entropy maximization principle is used to generate a probability density function of the effective damage parameters, and the entropy minimization principle is applied to determine weighting factors for amalgamation of the failure probabilities due to the respective failure modes. In this way, the effects of operating history, damaging environmental conditions, and damage sequence are taken into account.
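
The paper's exact formulation is not reproduced in this record; as a rough sketch under assumed functional forms, the following evaluates a dynamic reliability function R(t) = exp(-∫ λ(τ) dτ) with a hazard rate that grows with an accumulated-damage variable. Both the forms and the constants are hypothetical.

```python
# Dynamic reliability with a damage-dependent hazard rate, integrated
# numerically by the trapezoidal rule.
import numpy as np

t = np.linspace(0.0, 1000.0, 2001)        # operating time (h)
damage = 1e-3 * t + 5e-7 * t**2           # assumed accumulated damage
lam = 1e-5 * np.exp(2.0 * damage)         # assumed damage-dependent hazard

cum_hazard = np.concatenate(([0.0], np.cumsum(
    0.5 * (lam[1:] + lam[:-1]) * np.diff(t))))
reliability = np.exp(-cum_hazard)
print(f"R(1000 h) = {reliability[-1]:.4f}")
```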

  1. Explorations in Statistics: The Analysis of Ratios and Normalized Data

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of "Explorations in Statistics" explores the analysis of ratios and normalized--or standardized--data. As researchers, we compute a ratio--a numerator divided by a denominator--to compute a…

  2. Analysis of thrips distribution: application of spatial statistics and Kriging

    Science.gov (United States)

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
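
As a sketch of the kriging prediction described above, the following implements ordinary kriging at a single target location under an assumed exponential covariance model; the coordinates, counts, sill, and range parameters are hypothetical.

```python
# Ordinary kriging at one target point with an exponential covariance.
import numpy as np

def ordinary_kriging(coords, values, target, sill=1.0, rng_param=10.0):
    cov = lambda h: sill * np.exp(-h / rng_param)
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    # Ordinary kriging system with a Lagrange multiplier enforcing
    # unbiasedness (weights sum to 1):
    A = np.ones((n + 1, n + 1)); A[:n, :n] = cov(d); A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(coords - target, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ values

coords = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 6.0], [8.0, 8.0]])
counts = np.array([12.0, 9.0, 15.0, 4.0])   # hypothetical thrips counts
pred = ordinary_kriging(coords, counts, np.array([3.0, 3.0]))
print(f"Predicted count: {pred:.2f}")
```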

  3. Failure Analysis of a UAV Flight Control System Using Markov Analysis

    African Journals Online (AJOL)

    Failure analysis of a flight control system proposed for the Air Force Institute of Technology (AFIT) Unmanned Aerial Vehicle (UAV) was studied using Markov Analysis (MA). It was perceived that understanding the number of failure states and the probability of being in those states is of paramount importance in order to ...
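
A sketch of the Markov analysis the abstract refers to: a three-state model (operational, degraded, failed) whose state probabilities at mission time t follow P(t) = P(0)·exp(Qt); the transition rates are hypothetical, not AFIT's.

```python
# Continuous-time Markov model of a flight control system, solved with
# the matrix exponential of the generator Q (rows sum to zero).
import numpy as np
from scipy.linalg import expm

lam_od, lam_of, lam_df = 1e-3, 1e-4, 5e-3   # assumed rates (per hour)
Q = np.array([
    [-(lam_od + lam_of), lam_od,  lam_of],  # operational
    [0.0,               -lam_df,  lam_df],  # degraded
    [0.0,                0.0,     0.0],     # failed (absorbing)
])

p0 = np.array([1.0, 0.0, 0.0])              # start fully operational
for t in (10.0, 100.0, 500.0):              # mission times (h)
    p = p0 @ expm(Q * t)
    print(f"t={t:5.0f} h  P(state) = {np.round(p, 4)}")
```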

  4. Statistics

    International Nuclear Information System (INIS)

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics published in the Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time series over a longer period (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004). The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossil fuel use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord Pool power exchange, Total energy consumption by source and CO2 emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  5. Failure characteristics analysis and fault diagnosis for liquid rocket engines

    CERN Document Server

    Zhang, Wei

    2016-01-01

    This book concentrates on the subject of health monitoring technology for the Liquid Rocket Engine (LRE), including its failure analysis, fault diagnosis, and fault prediction. Since no similar work has been published, the failure pattern and mechanism analysis of the LRE at the system level should be of particular interest to readers. Furthermore, the application cases used to validate the efficacy of the fault diagnosis and prediction methods of the LRE differ from those found elsewhere. Readers can learn the system-level modeling, analysis, and testing methods of the LRE system as well as the corresponding fault diagnosis and prediction methods. This book will benefit researchers and students who are pursuing aerospace technology, fault detection, diagnostics, and corresponding applications.

  6. Failure mode and effects analysis: too little for too much?

    Science.gov (United States)

    Dean Franklin, Bryony; Shebl, Nada Atef; Barber, Nick

    2012-07-01

    Failure mode and effects analysis (FMEA) is a structured prospective risk assessment method that is widely used within healthcare. FMEA involves a multidisciplinary team mapping out a high-risk process of care, identifying the failures that can occur, and then characterising each of these in terms of probability of occurrence, severity of effects and detectability, to give a risk priority number used to identify the failures most in need of attention. One might assume that such a widely used tool would have an established evidence base. This paper considers whether or not this is the case, examining the evidence for the reliability and validity of its outputs, the mathematical principles behind the calculation of a risk priority number, and the variation in how it is used in practice. We also consider the likely advantages of this approach, together with the disadvantages in terms of the healthcare professionals' time involved. We conclude that although FMEA is popular and many published studies have reported its use within healthcare, there is little evidence to support its use for the quantitative prioritisation of process failures. It lacks both reliability and validity, and is very time consuming. We would not recommend its use as a quantitative technique to prioritise, promote or study patient safety interventions. However, the stage of FMEA involving multidisciplinary process mapping seems valuable, and work is now needed to identify the best way of converting this into plans for action.

  7. Failure Mode and Effect Analysis of Subsea Multiphase Pump Equipment

    Directory of Open Access Journals (Sweden)

    Oluwatoyin Shobowale Kafayat

    2014-07-01

    As oil and gas reserves are increasingly found in deep and harsh environments with challenging reservoir and field conditions, subsea multiphase pumping has found its way into providing solutions to these issues. Failure issues still plague the industry, and with the current practice of information hiding, these issues become even more difficult to tackle. Although some joint industry projects exist, they are accessible only to their members, and there remains a need for a clear understanding of these equipment groups so as to know which issues deserve attention. A failure mode and effect analysis (FMEA) is a potential first aid in understanding these equipment groups. A survey questionnaire/interview based on the literature review was conducted with an oil and gas operating company and an equipment manufacturer. The results indicate that this equipment group is similar to its onshore counterpart, the difference being the robustness built into the equipment's internal subsystems for subsea applications. From the manufacturer's perspective, helico-axial multiphase pumps have a mean time to failure of more than 10 years, while twin-screw and electrical submersible pumps still struggle with a mean time to failure of less than 5 years.

  8. Seismic analysis for translational failure of landfills with retaining walls.

    Science.gov (United States)

    Feng, Shi-Jin; Gao, Li-Ya

    2010-11-01

    In seismic impact zones, seismic force can be a major triggering mechanism for translational failures of landfills. The scope of this paper is to develop a three-part wedge method for seismic analysis of translational failures of landfills with retaining walls, from which an approximate solution for the factor of safety can be calculated. Unlike previous conventional limit equilibrium methods, the new method is capable of revealing the effects of both the solid waste shear strength and the retaining wall on the translational failures of landfills during an earthquake. Parameter studies of the developed method show that the factor of safety decreases with increasing seismic coefficient, while it increases quickly with increasing minimum friction angle beneath the waste mass for various horizontal seismic coefficients. Increasing the minimum friction angle beneath the waste mass appears to be more effective than any other parameter for increasing the factor of safety under the considered conditions. Thus, selecting liner materials with a higher friction angle will considerably reduce the potential for translational failures of landfills during an earthquake. The factor of safety gradually increases with the height of the retaining wall for various horizontal seismic coefficients; a higher retaining wall is beneficial to the seismic stability of the landfill, and simply ignoring the retaining wall leads to serious underestimation of the factor of safety. An approximate solution for the yield acceleration coefficient of the landfill is also presented based on the developed method.
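
The paper's three-part wedge formulation is not reproduced here; as a simpler illustration of how a horizontal seismic coefficient k_h degrades stability, the following computes the pseudo-static factor of safety for a single waste block sliding on a planar liner, with hypothetical inputs.

```python
# Pseudo-static factor of safety of a block on a planar slip surface:
# FS = resisting forces / driving forces, with a horizontal seismic load.
from math import radians, sin, cos, tan

def pseudo_static_fs(weight, beta_deg, phi_deg, k_h,
                     cohesion=0.0, length=0.0):
    b, phi = radians(beta_deg), radians(phi_deg)
    resisting = (cohesion * length
                 + (weight * cos(b) - k_h * weight * sin(b)) * tan(phi))
    driving = weight * sin(b) + k_h * weight * cos(b)
    return resisting / driving

for k_h in (0.0, 0.1, 0.2, 0.3):
    fs = pseudo_static_fs(weight=5000.0, beta_deg=14.0,
                          phi_deg=22.0, k_h=k_h)
    print(f"k_h={k_h:.1f}  FS={fs:.2f}")   # FS drops as k_h grows
```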

  9. Failure analysis of boiler tubes in lakhra coal power plant

    International Nuclear Information System (INIS)

    Shah, A.; Baluch, M.M.; Ali, A.

    2010-01-01

    The present work deals with the failure analysis of a boiler tube in the Lakhra fluidized bed combustion power station. Initially, visual inspection was used to examine the fractured surface. Detailed microstructural investigations of the burst boiler tube were then carried out using light optical microscopy and scanning electron microscopy. Hardness tests were also performed; a 50 percent decrease in hardness between the intact portion of the tube and the area adjacent to the failure was measured, in good agreement with the measured wall thicknesses of the burst boiler tube, i.e. 4 mm in the unaffected portion and 2 mm at the ruptured area. It was concluded that the major cause of the boiler tube failure is erosion of material, which occurs as coal particles strike the surface of the tube. Since the boiler temperature is not maintained uniformly, variations in boiler temperature can also affect the material and could be another reason for the failure of the tube. (author)

  10. Statistical analysis of the count and profitability of air conditioners.

    Science.gov (United States)

    Rady, El Houssainy A; Mohamed, Salah M; Abd Elmegaly, Alaa A

    2018-08-01

    This article presents a statistical analysis of the number and profitability of air conditioners in an Egyptian company. The Kruskal-Wallis test was used to check whether the distribution is the same across the categories of each categorical variable.
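
A sketch of the Kruskal-Wallis check mentioned above, testing whether profitability follows the same distribution across the levels of a categorical variable; the samples are synthetic stand-ins.

```python
# Kruskal-Wallis H-test across three hypothetical product categories.
from scipy import stats

profit_model_a = [12.1, 14.3, 11.8, 15.0, 13.2]
profit_model_b = [10.4, 11.1, 12.0, 10.9, 11.7]
profit_model_c = [13.5, 12.8, 14.1, 13.9, 12.2]

h, p = stats.kruskal(profit_model_a, profit_model_b, profit_model_c)
print(f"H = {h:.3f}, p = {p:.4f}")  # small p: distributions differ
```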

  11. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    Science.gov (United States)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-06-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  12. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    International Nuclear Information System (INIS)

    Glascock, M. D.; Neff, H.; Vaughn, K. J.

    2004-01-01

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  13. Application of Ontology Technology in Health Statistic Data Analysis.

    Science.gov (United States)

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research purpose: to establish a health management ontology for the analysis of health statistics data. Proposed methods: this paper established a health management ontology based on an analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and the object properties and data properties were defined to establish the structure of these classes. Through ontology instantiation, multi-source heterogeneous data can be integrated, enabling administrators to gain an overall understanding and analysis of the health statistics data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source, heterogeneous health system management data and for enhancement of management efficiency.

  14. Instrumental Neutron Activation Analysis and Multivariate Statistics for Pottery Provenance

    Energy Technology Data Exchange (ETDEWEB)

    Glascock, M. D.; Neff, H. [University of Missouri, Research Reactor Center (United States); Vaughn, K. J. [Pacific Lutheran University, Department of Anthropology (United States)

    2004-06-15

    The application of instrumental neutron activation analysis and multivariate statistics to archaeological studies of ceramics and clays is described. A small pottery data set from the Nasca culture in southern Peru is presented for illustration.

  15. Statistics

    International Nuclear Information System (INIS)

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  16. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  17. Statistics

    International Nuclear Information System (INIS)

    1999-01-01

    For the years 1998 and 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail in the publication Energiatilastot - Energy Statistics, issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO2 emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  18. Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers

    Science.gov (United States)

    Keiffer, Greggory L.; Lane, Forrest C.

    2016-01-01

    Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…
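
As a sketch of the matching approach the paper introduces, the following estimates propensity scores with a logistic regression and matches each treated unit to the nearest-score control; the data are synthetic, not the paper's heuristic example.

```python
# Propensity-score matching: fit P(treatment | covariates), then match
# each treated unit to the control with the closest score.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
X = rng.normal(size=(n, 2))                       # covariates
treated = (X[:, 0] + rng.normal(size=n) > 0).astype(int)
y = 2.0 * treated + X[:, 1] + rng.normal(size=n)  # outcome, true effect 2

scores = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

t_idx = np.where(treated == 1)[0]
c_idx = np.where(treated == 0)[0]
# Nearest-neighbor match (with replacement) on the propensity score:
matches = c_idx[np.abs(scores[c_idx][None, :]
                       - scores[t_idx][:, None]).argmin(axis=1)]
att = (y[t_idx] - y[matches]).mean()
print(f"Matched estimate of treatment effect: {att:.2f}")
```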

  19. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerful...

  20. Failure rate modeling using fault tree analysis and Bayesian network: DEMO pulsed operation turbine study case

    Energy Technology Data Exchange (ETDEWEB)

    Dongiovanni, Danilo Nicola, E-mail: danilo.dongiovanni@enea.it [ENEA, Nuclear Fusion and Safety Technologies Department, via Enrico Fermi 45, Frascati 00040 (Italy); Iesmantas, Tomas [LEI, Breslaujos str. 3 Kaunas (Lithuania)

    2016-11-01

    Highlights: • RAMI (Reliability, Availability, Maintainability and Inspectability) assessment of the secondary heat transfer loop for a DEMO nuclear fusion plant. • Definition of a fault tree for a nuclear steam turbine operated in pulsed mode. • Turbine failure rate models updated by means of a Bayesian network reflecting the fault tree analysis in the considered scenario. • Sensitivity analysis on system availability performance. - Abstract: Availability will play an important role in the success of the Demonstration Power Plant (DEMO) from an economic and safety perspective. Availability performance is commonly assessed by Reliability Availability Maintainability Inspectability (RAMI) analysis, which relies strongly on the accurate definition of system components' failure modes (FM) and failure rates (FR). Little component experience is available in fusion applications, therefore requiring the adaptation of literature FR to fusion plant operating conditions, which may differ in several aspects. As a possible solution to this problem, a new methodology to extrapolate/estimate component failure rates under different operating conditions is presented. The DEMO balance-of-plant nuclear steam turbine operated in pulsed mode is considered as the study case. The methodology starts from the definition of a fault tree taking into account failure modes possibly enhanced by pulsed operation. The fault tree is then translated into a Bayesian network. A statistical model for the turbine system failure rate in terms of subcomponents' FR is hence obtained, allowing for sensitivity analyses on the structured mixture of literature and unknown FR data, for which plausible value intervals are investigated to assess their impact on the whole turbine system FR. Finally, the impact of the resulting turbine system FR on plant availability is assessed exploiting a Reliability Block Diagram (RBD) model for a typical secondary cooling system implementing a Rankine cycle. Mean inherent availability

  1. Failure rate modeling using fault tree analysis and Bayesian network: DEMO pulsed operation turbine study case

    International Nuclear Information System (INIS)

    Dongiovanni, Danilo Nicola; Iesmantas, Tomas

    2016-01-01

    Highlights: • RAMI (Reliability, Availability, Maintainability and Inspectability) assessment of the secondary heat transfer loop for a DEMO nuclear fusion plant. • Definition of a fault tree for a nuclear steam turbine operated in pulsed mode. • Turbine failure rate models updated by means of a Bayesian network reflecting the fault tree analysis in the considered scenario. • Sensitivity analysis on system availability performance. - Abstract: Availability will play an important role in the success of the Demonstration Power Plant (DEMO) from an economic and safety perspective. Availability performance is commonly assessed by Reliability Availability Maintainability Inspectability (RAMI) analysis, which relies strongly on the accurate definition of system components' failure modes (FM) and failure rates (FR). Little component experience is available in fusion applications, therefore requiring the adaptation of literature FR to fusion plant operating conditions, which may differ in several aspects. As a possible solution to this problem, a new methodology to extrapolate/estimate component failure rates under different operating conditions is presented. The DEMO balance-of-plant nuclear steam turbine operated in pulsed mode is considered as the study case. The methodology starts from the definition of a fault tree taking into account failure modes possibly enhanced by pulsed operation. The fault tree is then translated into a Bayesian network. A statistical model for the turbine system failure rate in terms of subcomponents' FR is hence obtained, allowing for sensitivity analyses on the structured mixture of literature and unknown FR data, for which plausible value intervals are investigated to assess their impact on the whole turbine system FR. Finally, the impact of the resulting turbine system FR on plant availability is assessed exploiting a Reliability Block Diagram (RBD) model for a typical secondary cooling system implementing a Rankine cycle. Mean inherent availability
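
As a sketch of the fault-tree step shared by the two records above, the following combines basic-event probabilities through AND/OR gates (independence assumed) into a top-event probability; the events and numbers are hypothetical, not the DEMO study's.

```python
# Evaluate a tiny fault tree: top event fires on any mechanical failure
# (OR) or on joint failure of both redundant controllers (AND).
from functools import reduce

def and_gate(probs):                 # all inputs must fail
    return reduce(lambda a, b: a * b, probs, 1.0)

def or_gate(probs):                  # any input failing suffices
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

p_bearing = 2e-3        # assumed per-demand failure probabilities
p_seal = 1e-3
p_control_a = 5e-4
p_control_b = 5e-4

p_top = or_gate([p_bearing, p_seal,
                 and_gate([p_control_a, p_control_b])])
print(f"Top event probability = {p_top:.3e}")
```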

  2. A Statistical Evaluation of Rules for Biochemical Failure After Radiotherapy in Men Treated for Prostate Cancer

    International Nuclear Information System (INIS)

    Bellera, Carine A.; Hanley, James A.; Joseph, Lawrence; Albertsen, Peter C.

    2009-01-01

    Purpose: The 'PSA nadir + 2 rule,' defined as any rise of 2 ng/ml above the current prostate-specific antigen (PSA) nadir, has replaced the American Society for Therapeutic Radiology and Oncology (ASTRO) rule, defined as three consecutive PSA rises, to indicate biochemical failure (BF) after radiotherapy in patients treated for prostate cancer. We propose an original approach to evaluating BF rules, based on the PSA doubling time (PSAdt) as the gold standard rule and on a simulation process allowing us to evaluate the BF rules under multiple settings (different frequency and duration of follow-up, different PSAdt). Methods and Materials: We relied on a retrospective, population-based cohort of individuals identified by the Connecticut Tumor Registry and treated for localized prostate cancer with radiotherapy. We estimated the 470 underlying true PSA trajectories, including the PSAdt, using a Bayesian hierarchical changepoint model. Next, we simulated realistic, sophisticated data sets that accurately reflect the systematic and random variations observed in PSA series. We estimated the sensitivity and specificity by comparing the simulated PSA series to the underlying true PSAdt. Results: For follow-up of more than 3 years, the specificity of the PSA nadir + 2 rule was systematically greater than that of the ASTRO criterion. In a few settings, the nadir + 2 rule had a lower sensitivity than the ASTRO. The PSA nadir + 2 rule appeared less dependent on the frequency and duration of follow-up than the ASTRO. Conclusions: Our results provide some refinements to earlier findings, as the BF rules were evaluated according to various parameters. In most settings, the PSA nadir + 2 rule outperforms the ASTRO criterion.
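
A sketch of the two biochemical-failure definitions compared above, applied to a hypothetical post-radiotherapy PSA series (ng/ml): the ASTRO rule flags three consecutive rises, and the nadir + 2 rule flags any value at least 2 ng/ml above the running nadir.

```python
def astro_failure(psa):
    """ASTRO rule: three consecutive PSA rises."""
    rises = 0
    for prev, cur in zip(psa, psa[1:]):
        rises = rises + 1 if cur > prev else 0
        if rises == 3:
            return True
    return False

def nadir_plus_2_failure(psa):
    """Nadir + 2 rule: any value >= current nadir + 2 ng/ml."""
    nadir = psa[0]
    for value in psa[1:]:
        if value >= nadir + 2.0:
            return True
        nadir = min(nadir, value)    # nadir tracks the lowest value so far
    return False

series = [4.1, 2.0, 1.1, 0.8, 1.2, 1.6, 2.1, 3.0]
print(astro_failure(series), nadir_plus_2_failure(series))  # True True
```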

  3. Association of sleep bruxism with ceramic restoration failure: A systematic review and meta-analysis.

    Science.gov (United States)

    de Souza Melo, Gilberto; Batistella, Elis Ângela; Bertazzo-Silveira, Eduardo; Simek Vega Gonçalves, Thais Marques; Mendes de Souza, Beatriz Dulcineia; Porporatti, André Luís; Flores-Mir, Carlos; De Luca Canto, Graziela

    2018-03-01

    Ceramic restorations are popular because of their excellent optical properties. However, failures are still a major concern, and dentists are confronted with the following question: is sleep bruxism (SB) associated with an increased frequency of ceramic restoration failures? The purpose of this systematic review and meta-analysis was to assess whether the presence of SB is associated with increased ceramic restoration failure. Observational studies and clinical trials that evaluated the short- and long-term survival rates of ceramic restorations in participants with SB were selected. Sleep bruxism diagnostic criteria had to include at least 1 of the following: questionnaire, clinical evaluation, or polysomnography. Seven databases, in addition to 3 non-peer-reviewed literature databases, were searched. The risk of bias was assessed using the meta-analysis of statistics assessment and review instrument (MAStARI) checklist. Eight studies were included in the qualitative synthesis, but only 5 in the meta-analysis. Three studies were categorized as moderate risk of bias and 5 as high risk. Clinical and methodological heterogeneity across studies was considered high. An increased hazard ratio (HR=7.74; 95% confidence interval [CI]=2.50 to 23.95) and odds ratio (OR=2.52; 95% CI=1.24 to 5.12) were observed when considering only anterior ceramic veneers. Nevertheless, limited data from the meta-analysis and the restricted number of included studies suggested that differences in the overall odds of failure for SB and other types of ceramic restorations neither favored nor disfavored any association (OR=1.10; 95% CI=0.43 to 2.8). The overall quality of evidence was considered very low according to the GRADE criteria. Within the limitations of this systematic review, the overall result of the meta-analysis did not favor an association between SB and increased odds of failure for ceramic restorations.

  4. Failure mode and effects analysis outputs: are they valid?

    Directory of Open Access Journals (Sweden)

    Shebl Nada

    2012-06-01

    Background: Failure Mode and Effects Analysis (FMEA) is a prospective risk assessment tool that has been widely used within the aerospace and automotive industries and has been utilised within healthcare since the early 1990s. The aim of this study was to explore the validity of FMEA outputs within a hospital setting in the United Kingdom. Methods: Two multidisciplinary teams each conducted an FMEA for the use of vancomycin and gentamicin. Four different validity tests were conducted: face validity, by comparing the FMEA participants' mapped processes with observational work; content validity, by presenting the FMEA findings to other healthcare professionals; criterion validity, by comparing the FMEA findings with data reported on the trust's incident report database; and construct validity, by exploring the relevant mathematical theories involved in calculating the FMEA risk priority number (RPN). Results: Face validity was positive, as the researcher documented the same processes of care as mapped by the FMEA participants. However, other healthcare professionals identified potential failures missed by the FMEA teams. Furthermore, the FMEA groups failed to include failures related to omitted doses, yet these were the failures most commonly reported in the trust's incident database. Calculating the RPN by multiplying severity, probability, and detectability scores was deemed invalid because it is based on calculations that breach the mathematical properties of the scales used. Conclusion: There are significant methodological challenges in validating FMEA. It is a useful tool to aid multidisciplinary groups in mapping and understanding a process of care; however, the results of our study cast doubt on its validity. FMEA teams are likely to need different sources of information, besides their personal experience and knowledge, to identify potential failures. As for FMEA's methodology for scoring failures, there were discrepancies

  5. Statistics

    International Nuclear Information System (INIS)

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in GDP, energy consumption and electricity consumption; carbon dioxide emissions from fossil fuel use; coal consumption; consumption of natural gas; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices in heat production; fuel prices in electricity production; price of electricity by type of consumer; average monthly spot prices at the Nord Pool power exchange; total energy consumption by source and CO2 emissions; supply and total consumption of electricity (GWh); energy imports by country of origin in January-June 2003; energy exports by recipient country in January-June 2003; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; price of natural gas by type of consumer; price of electricity by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and excise taxes and precautionary stock fees and oil pollution fees on energy products

  6. Statistics

    International Nuclear Information System (INIS)

    2004-01-01

    For the years 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown on the inside back cover of the Review. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in GDP, energy consumption and electricity consumption; carbon dioxide emissions from fossil fuel use; coal consumption; consumption of natural gas; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices in heat production; fuel prices in electricity production; price of electricity by type of consumer; average monthly spot prices at the Nord Pool power exchange; total energy consumption by source and CO2 emissions; supplies and total consumption of electricity (GWh); energy imports by country of origin in January-March 2004; energy exports by recipient country in January-March 2004; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; price of natural gas by type of consumer; price of electricity by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and excise taxes and precautionary stock fees and oil pollution fees

  7. Statistics

    International Nuclear Information System (INIS)

    2000-01-01

    For the years 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review also include historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside back cover of the Review shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after the tables and figures. The figures present: changes in the volume of GNP and energy consumption; changes in the volume of GNP and electricity consumption; coal consumption; natural gas consumption; peat consumption; domestic oil deliveries; import prices of oil; consumer prices of principal oil products; fuel prices for heat production; fuel prices for electricity production; carbon dioxide emissions; total energy consumption by source and CO2 emissions; electricity supply; energy imports by country of origin in January-June 2000; energy exports by recipient country in January-June 2000; consumer prices of liquid fuels; consumer prices of hard coal, natural gas and indigenous fuels; average electricity price by type of consumer; price of district heating by type of consumer; excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources; and energy taxes and precautionary stock fees on oil products

  8. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article tries to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables, and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.

  9. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited to a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from a basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discovery. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...
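
    To give a flavour of the descriptive statistics the book covers, here is a minimal sketch in plain Python with made-up data (the book itself works on the Java platform via Jython):

        import statistics

        data = [4.1, 4.7, 5.0, 5.3, 4.9, 5.6, 4.4]  # made-up measurements
        print(statistics.mean(data))    # central tendency
        print(statistics.median(data))
        print(statistics.stdev(data))   # sample standard deviation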

  10. Analysis of room transfer function and reverberant signal statistics

    DEFF Research Database (Denmark)

    Georganti, Eleftheria; Mourjopoulos, John; Jacobsen, Finn

    2008-01-01

    For some time now, statistical analysis has been a valuable tool in analyzing room transfer functions (RTFs). This work examines existing statistical time-frequency models and techniques for RTF analysis (e.g., Schroeder's stochastic model and the standard deviation over frequency bands for the RTF magnitude and phase). RTF fractional-octave smoothing, as with 1/3-octave analysis, may lead to RTF simplifications that can be useful for several audio applications, such as room compensation, room modeling and auralisation purposes. The aim of this work is to identify the relationship between the optimal response ... and the corresponding ratio of the direct and reverberant signal. In addition, this work examines the statistical quantities for speech and audio signals prior to their reproduction within rooms and when recorded in rooms. Histograms and other statistical distributions are used to compare RTF minima of typical...

  11. Statistical analysis of dynamic parameters of the core

    International Nuclear Information System (INIS)

    Ionov, V.S.

    2007-01-01

    Transients of various types were investigated for the cores of zero-power critical facilities at RRC KI and at NPPs. The dynamic parameters of the neutron transients were explored by statistical analysis tools. The transients have sufficient duration, a few channels for chamber currents and reactivity, and some channels for technological parameters. From these values the inverse period, reactivity, neutron lifetime, reactivity coefficients and some reactivity effects are determined, and from them the values of the measured dynamic parameters were reconstructed as a result of the analysis. The mathematical means of statistical analysis used were: approximation (A), filtration (F), rejection (R), estimation of descriptive statistic parameters (DSP), correlation characteristics (KK), regression analysis (KP), prognosis (P) and statistical criteria (SC). The calculation procedures were implemented in MATLAB. The sources of methodical and statistical errors are presented: inadequacy of the model, precision of neutron-physical parameters, features of the registered processes, the mathematical model used in reactivity meters, the technique of processing the registered data, etc. Examples of the results of the statistical analysis are given. Problems of the validity of the methods used for the definition and certification of the values of statistical parameters and dynamic characteristics are considered (Authors)

  12. Failure of Emperion modular femoral stem with implant analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Stronach, MD, MS

    2016-03-01

    Modularity in total hip arthroplasty provides multiple benefits to the surgeon in restoring the appropriate alignment and position to a previously damaged hip joint. The vast majority of modern implants incorporate modularity into their design, with some implants having multiple modular interfaces. There is the potential for failure at modular junctions because of fretting and crevice corrosion in combination with mechanical loading. This case report details the failure of an Emperion (Smith and Nephew, Memphis, TN) femoral stem in a 67-year-old male patient 6 years after total hip replacement. Analysis of the implant revealed mechanically assisted crevice corrosion that likely accelerated fatigue crack initiation in the hip stem. The benefits of modularity come with the potential drawback of a combination of fretting and crevice corrosion at the modular junction, which may accelerate fatigue crack initiation and ultimately reduce the longevity of the hip.

  13. Prediction of hospital failure: a post-PPS analysis.

    Science.gov (United States)

    Gardiner, L R; Oswald, S L; Jahera, J S

    1996-01-01

    This study investigates the ability of discriminant analysis to provide accurate predictions of hospital failure. Using data from the period following the introduction of the Prospective Payment System, we developed discriminant functions for each of two hospital ownership categories: not-for-profit and proprietary. The resulting discriminant models contain six and seven variables, respectively. For each ownership category, the variables represent four major aspects of financial health (liquidity, leverage, profitability, and efficiency) plus county market share and length of stay. The proportion of closed hospitals misclassified as open one year before closure does not exceed 0.05 for either ownership type. Our results show that discriminant functions based on a small set of financial and nonfinancial variables provide the capability to predict hospital failure reliably for both not-for-profit and proprietary hospitals.
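
    A hedged sketch of the technique (not the authors' model or data): linear discriminant analysis over a handful of hypothetical hospital indicators, using scikit-learn.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Hypothetical records: liquidity, leverage, profitability, efficiency,
        # county market share, average length of stay; label 1 = closed hospital.
        X = np.array([
            [1.8, 0.4,  0.06, 0.9, 0.30, 5.2],
            [0.7, 0.9, -0.04, 0.6, 0.10, 8.1],
            [2.1, 0.3,  0.08, 1.0, 0.35, 4.8],
            [0.9, 0.8, -0.02, 0.7, 0.12, 7.5],
        ])
        y = np.array([0, 1, 0, 1])

        model = LinearDiscriminantAnalysis().fit(X, y)
        print(model.predict([[1.0, 0.7, 0.00, 0.8, 0.15, 6.9]]))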

  14. Failure mode and effects analysis on typical reactor trip system

    International Nuclear Information System (INIS)

    Eisawy, E.A.

    2010-01-01

    An updated failure mode and effects analysis (FMEA) has been performed on a typical reactor trip system. This upgrade helps to avoid system damage and, as a result, extends the system service life. It also provides for simplified maintenance and surveillance testing. The operating conditions under which the system is to carry out its function and the operational profile expected for the system have been determined. The results of the FMEA have been given in terms of the operating states of the subsystem. The results are given in the form of a table which is set up such that, for a given failure, one can read across it and determine which items remain operating in the system. From these data one can identify the number of components operating in the system that monitor whether the pressure exceeds the setpoint pressure.

  15. Failure analysis of electrolyte-supported solid oxide fuel cells

    Science.gov (United States)

    Fleischhauer, Felix; Tiefenauer, Andreas; Graule, Thomas; Danzer, Robert; Mai, Andreas; Kuebler, Jakob

    2014-07-01

    For solid oxide fuel cells (SOFCs) one key aspect is the structural integrity of the cell and hence its thermo-mechanical long-term behaviour. The present study investigates the failure mechanisms and the actual causes of fracture of electrolyte-supported SOFCs which were run using the current μ-CHP system of Hexis AG, Winterthur, Switzerland, under lab conditions or at customer sites for up to 40,000 h. In a first step, several operated stacks were dismantled for post-mortem inspection, followed by a fractographic evaluation of the failed cells. The respective findings are then set into a larger picture, including an analysis of the stresses acting on the cell, such as thermal and residual stresses, and measurements of the temperature-dependent electrolyte strength. For all investigated stacks, the mechanical failure of individual cells can be attributed to locally acting bending loads, which arise due to an inhomogeneous and uneven contact between the metallic interconnect and the cell.

  16. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets whose probability is readily computable, they enable the calculation of arbitrarily tight upper and lower bounds on the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation of changes in such a model with a practically insignificant amount of computational effort.
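
    The worst-case search described above can be sketched with off-the-shelf nonlinear optimization. The requirement function and bounds below are invented for illustration; the paper's actual formulation is more general.

        import numpy as np
        from scipy.optimize import minimize

        def g(d):
            """Hypothetical requirement function: the design fails when g(d) > 0."""
            return d[0] ** 2 + 0.5 * d[1] - 1.0

        # Worst-case combination: maximize g over a box-shaped uncertainty set,
        # i.e. minimize -g subject to bounds on each uncertain parameter.
        res = minimize(lambda d: -g(d), x0=np.array([0.5, 0.5]),
                       bounds=[(-1.0, 1.0), (-1.0, 1.0)])
        print("worst case:", res.x, "g =", g(res.x))
        # In practice one would restart from several x0 to guard against local optima.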

  17. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    Science.gov (United States)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets whose probability is readily computable, they enable the calculation of arbitrarily tight upper and lower bounds on the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets, while a sum-of-squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation of changes in such a model with a practically insignificant amount of computational effort.

  18. Dam failure analysis/calibration using NWS models on dam failure in Alton, New Hampshire

    International Nuclear Information System (INIS)

    Capone, E.J.

    1998-01-01

    The State of New Hampshire Water Resources Board, the United States Geological Survey, and private concerns have compiled data on the cause of the catastrophic failure of the Bergeron Dam in Alton, New Hampshire in March 1996. Data were collected on the cause of the breach, the breach parameters, the soil characteristics of the failed section, and the limits of downstream flooding. Dam-break modeling software was used to calibrate and verify the simulated flood wave caused by the Bergeron Dam breach. Several scenarios were modeled, using different degrees of detail concerning the topography and channel geometry of the affected areas. A sensitivity analysis of the important output parameters was completed. The relative importance of model parameters to the results was assessed against the background of observed historical events

  19. The study of Influencing Maintenance Factors on Failures of Two gypsum Kilns by Failure Modes and Effects Analysis (FMEA

    Directory of Open Access Journals (Sweden)

    Iraj Alimohammadi

    2014-06-01

    The growing use of technology and equipment in Iranian industries has made maintenance systems increasingly important. Using proper management techniques not only increases the performance of the production system but also reduces failures and costs. The aim of this study was to determine the quality of the maintenance system and the effects of its components on the failures of kilns in two gypsum production companies, using Failure Modes and Effects Analysis (FMEA). Furthermore, the costs of failures were studied. After studying the gypsum production steps in the factories, FMEA was conducted by determining the analysis scope, gathering information, listing the kilns' components and filling in the FMEA tables. The effects of failures on production, failure modes, failure rates, failure severities, and control measures were studied. The maintenance system was evaluated with a checklist of questions related to the system's components. The costs of failures were determined by referring to accounting records and interviewing the head of the accounting department. It was found that the overall quality of the maintenance system in company No. 1 was higher than in company No. 2, but because of the lower quality of No. 1's kiln design, its failures and their costs were greater. In addition, it was determined that repair costs for No. 2's kiln were about one third of No. 1's. The low-severity failures caused the highest costs in comparison with the moderate- and high-severity ones. The technical characteristics of the kilns appeared to be the most important factors in reducing failures and costs.

  20. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independen...
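
    A minimal sketch of the DOE-plus-regression workflow the abstract advocates (design and responses invented): a 2^3 full factorial design followed by a first-order regression metamodel.

        import numpy as np
        from itertools import product

        # 2^3 full factorial design in coded units, instead of changing
        # one factor at a time.
        design = np.array(list(product([-1.0, 1.0], repeat=3)))

        # One hypothetical simulation response per design point.
        y = np.array([10.1, 12.0, 9.8, 11.7, 13.2, 15.4, 12.9, 15.1])

        # Fit y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares.
        X = np.column_stack([np.ones(len(design)), design])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(coef)  # intercept and the three estimated main effects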

  1. Does Bruxism Contribute to Dental Implant Failure? A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Zhou, Yi; Gao, Jinxia; Luo, Le; Wang, Yining

    2016-04-01

    Bruxism has usually been considered a contraindication for oral implant treatment, but the causal relationship between bruxism and dental implant failure has remained controversial in the existing literature. This meta-analysis was performed to investigate the relationship between them. This review conducted an electronic systematic literature search in MEDLINE (PubMed) and EMBASE in November 2013 without time or language restrictions. Meanwhile, a hand search of all the relevant references of included studies was also conducted. Study information extraction and methodological quality assessments were accomplished by two reviewers independently. A discussion ensued if any disagreement occurred, and unresolved issues were solved by consulting a third reviewer. Methodological quality was assessed by using the Newcastle-Ottawa Scale tool. The odds ratio (OR) with 95% confidence interval (CI) was pooled to estimate the relative effect of bruxism on dental implant failures. A fixed-effects model was used initially; if the heterogeneity was high, a random-effects model was chosen for the meta-analysis. Statistical analyses were carried out by using Review Manager 5.1. In this meta-analysis review, extracted data were classified into two groups based on different units: the number of prostheses (group A) and the number of patients (group B). In group A, the total pooled OR of bruxers versus nonbruxers for all subgroups was 4.72 (95% CI: 2.66-8.36, p = .07). In group B, the total pooled OR of bruxers versus nonbruxers for all subgroups was 3.83 (95% CI: 2.12-6.94, p = .22). In contrast to nonbruxers, prostheses in bruxers had a higher failure rate. This suggests that bruxism is a contributing factor to the occurrence of technical/biological dental implant complications and plays a role in dental implant failure.
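
    The pooling step of such a meta-analysis can be sketched with the standard inverse-variance (fixed-effect) method; the three studies below are invented, not the review's data.

        import numpy as np

        # (OR, lower 95% CI, upper 95% CI) for three hypothetical studies.
        studies = [(3.2, 1.1, 9.4), (5.0, 1.8, 13.9), (2.4, 0.9, 6.5)]

        log_or = np.log([o for o, _, _ in studies])
        # Standard errors recovered from the CI width on the log scale.
        se = np.array([(np.log(u) - np.log(l)) / (2 * 1.96) for _, l, u in studies])

        w = 1.0 / se ** 2                      # inverse-variance weights
        pooled = np.sum(w * log_or) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print("pooled OR %.2f (95%% CI %.2f-%.2f)" % tuple(np.exp([pooled, lo, hi])))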

  2. Failure analysis of a UAV flight control system using Markov analysis

    African Journals Online (AJOL)

    eobe

    2016-01-01

    Fault Tree Analysis (FTA), Dependence Diagram Analysis (DDA) and Markov Analysis (MA) are the most widely used methods of probabilistic safety and reliability analysis for airborne systems [1]. Fault tree analysis is a backward failure-searching ...

  3. Statistical analysis of planktic foraminifera of the surface Continental ...

    African Journals Online (AJOL)

    Planktic foraminiferal assemblages recorded from selected samples obtained from shallow continental shelf sediments off southwestern Nigeria were subjected to statistical analysis. Principal Component Analysis (PCA) was used to determine variants of planktic parameters. Values obtained for these parameters were ...

  4. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic

  5. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is

  6. PRECISE - pregabalin in addition to usual care: Statistical analysis plan

    NARCIS (Netherlands)

    S. Mathieson (Stephanie); L. Billot (Laurent); C. Maher (Chris); A.J. McLachlan (Andrew J.); J. Latimer (Jane); B.W. Koes (Bart); M.J. Hancock (Mark J.); I. Harris (Ian); R.O. Day (Richard O.); J. Pik (Justin); S. Jan (Stephen); C.-W.C. Lin (Chung-Wei Christine)

    2016-01-01

    textabstractBackground: Sciatica is a severe, disabling condition that lacks high quality evidence for effective treatment strategies. This a priori statistical analysis plan describes the methodology of analysis for the PRECISE study. Methods/design: PRECISE is a prospectively registered, double

  7. A Divergence Statistics Extension to VTK for Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pebay, Philippe Pierre [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bennett, Janine Camille [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    This report follows the series of previous documents [PT08, BPRT09b, PT09, BPT09, PT10, PB13], in which we presented the parallel descriptive, correlative, multi-correlative, principal component analysis, contingency, k-means, order and auto-correlative statistics engines which we developed within the Visualization Tool Kit (VTK) as a scalable, parallel and versatile statistics package. We now report on a new engine which we developed for the calculation of divergence statistics, a concept which we hereafter explain and whose main goal is to quantify the discrepancy, in a statistical manner akin to measuring a distance, between an observed empirical distribution and a theoretical, "ideal" one. The ease of use of the new divergence statistics engine is illustrated by means of C++ code snippets. Although this new engine does not yet have a parallel implementation, it has already been applied to HPC performance analysis, of which we provide an example.
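
    One common divergence statistic of this kind, the Kullback-Leibler divergence between an observed histogram and a theoretical distribution, can be sketched in a few lines (Python rather than the engine's C++; the numbers are invented, and this is not necessarily the statistic the VTK engine implements):

        import numpy as np

        observed = np.array([0.22, 0.31, 0.27, 0.20])  # empirical frequencies
        ideal = np.full(4, 0.25)                       # theoretical "ideal"

        # Kullback-Leibler divergence D(observed || ideal): zero iff the two
        # distributions coincide, growing with the discrepancy between them.
        kl = np.sum(observed * np.log(observed / ideal))
        print(kl)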

  8. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  9. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

    Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means-comparison tests, and the LSD post-hoc multiple-comparison test are discussed. (author)
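
    Most of the tests named above are available off the shelf; a hedged sketch with synthetic survey scores (scipy.stats; the data are invented, and the LSD post-hoc test is omitted):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Hypothetical 1-5 Likert scores from three departments.
        a, b, c = (rng.integers(1, 6, 40) for _ in range(3))

        print(stats.kstest(stats.zscore(a), "norm"))  # Kolmogorov-Smirnov
        print(stats.ttest_ind(a, b))                  # Student's t-test
        print(stats.f_oneway(a, b, c))                # one-way ANOVA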

  10. Longitudinal data analysis a handbook of modern statistical methods

    CERN Document Server

    Fitzmaurice, Garrett; Verbeke, Geert; Molenberghs, Geert

    2008-01-01

    Although many books currently available describe statistical models and methods for analyzing longitudinal data, they do not highlight connections between various research threads in the statistical literature. Responding to this void, Longitudinal Data Analysis provides a clear, comprehensive, and unified overview of state-of-the-art theory and applications. It also focuses on the assorted challenges that arise in analyzing longitudinal data. After discussing historical aspects, leading researchers explore four broad themes: parametric modeling, nonparametric and semiparametric methods, joint

  11. Highly Robust Statistical Methods in Medical Image Analysis

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2012-01-01

    Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf

  12. Network similarity and statistical analysis of earthquake seismic data

    OpenAIRE

    Deyasi, Krishanu; Chakraborty, Abhijit; Banerjee, Anirban

    2016-01-01

    We study the structural similarity of earthquake networks constructed from seismic catalogs of different geographical regions. A hierarchical clustering of underlying undirected earthquake networks is shown using Jensen-Shannon divergence in graph spectra. The directed nature of links indicates that each earthquake network is strongly connected, which motivates us to study the directed version statistically. Our statistical analysis of each earthquake region identifies the hub regions. We cal...

  13. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and to exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%, mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems, with the following results: inadequate plexus anesthesia: stable process, but unacceptably high failure rate; difficult emergence: unstable process, because of quality improvement efforts; intubation difficulties: stable process, rate acceptable; medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
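
    A p-chart of the kind used in the study reduces to a few lines: the control limits are p-bar ± 3·sqrt(p-bar·(1−p-bar)/n) for each period's denominator n. The monthly figures below are invented, not the study's.

        import numpy as np

        n = np.array([520, 490, 505, 533, 498, 510])   # anesthetics per month
        events = np.array([96, 88, 101, 95, 93, 135])  # adverse events per month

        p = events / n
        p_bar = events.sum() / n.sum()                   # overall problem rate
        sigma = np.sqrt(p_bar * (1 - p_bar) / n)         # per-month std. error
        lcl, ucl = p_bar - 3 * sigma, p_bar + 3 * sigma  # 3-sigma control limits

        for month, (rate, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
            flag = "in control" if lo <= rate <= hi else "OUT OF CONTROL"
            print(f"month {month}: p={rate:.3f} limits=[{lo:.3f}, {hi:.3f}] {flag}")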

  14. Improving failure analysis efficiency by combining FTA and FMEA in a recursive manner

    NARCIS (Netherlands)

    Peeters, J.F.W.; Basten, R.J.I.; Tinga, Tiedo

    2018-01-01

    When designing a maintenance programme for a capital good, especially a new one, it is of key importance to accurately understand its failure behaviour. Failure mode and effects analysis (FMEA) and fault tree analysis (FTA) are two commonly used methods for failure analysis. FMEA is a bottom-up

  15. Improving failure analysis efficiency by combining FTA and FMEA in a recursive manner

    NARCIS (Netherlands)

    Peeters, J.F.W.; Basten, R.J.I.; Tinga, T.

    When designing a maintenance programme for a capital good, especially a new one, it is of key importance to accurately understand its failure behaviour. Failure mode and effects analysis (FMEA) and fault tree analysis (FTA) are two commonly used methods for failure analysis. FMEA is a bottom-up

  16. The distributed failure probability approach to dependent failure analysis, and its application

    International Nuclear Information System (INIS)

    Hughes, R.P.

    1989-01-01

    The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)
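
    The idea can be made concrete with a toy mixture over 'environments' (all numbers invented, not the paper's model): the same calculation that gives a component's average failure probability also shows how a shared environment induces dependence between nominally identical components.

        # Failure probability distributed over hypothetical environments.
        environments = [
            ("well maintained",   0.7, 1e-4),   # (label, weight, p_fail)
            ("poorly maintained", 0.2, 1e-3),
            ("harsh conditions",  0.1, 5e-3),
        ]

        p_one = sum(w * p for _, w, p in environments)  # single component

        # Two identical components sharing one environment fail together more
        # often than independence (p_one**2) would predict.
        p_both = sum(w * p ** 2 for _, w, p in environments)
        print(p_one, p_both, p_one ** 2)  # p_both exceeds p_one squared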

  17. Failure and Reliability Analysis for the Master Pump Shutdown System

    International Nuclear Information System (INIS)

    BEVINS, R.R.

    2000-01-01

    The Master Pump Shutdown System (MPSS) will be installed in the 200 Areas of the Hanford Site to monitor and control the transfer of liquid waste between tank farms and between the 200 West and 200 East areas through the Cross-Site Transfer Line. The safety function provided by the MPSS is to shut down any waste transfer process within or between tank farms if a waste leak should occur along the selected transfer route. The MPSS, which provides this Safety Class function, is composed of Programmable Logic Controllers (PLCs), interconnecting wires, relays, Human-Machine Interfaces (HMIs), and software. These components are defined as providing a Safety Class function and will be designated in this report as MPSS/PLC. Input signals to the MPSS/PLC are provided by leak detection systems from each of the tank farm leak detector locations along the waste transfer route. The combination of the MPSS/PLC, leak detection system, and transfer pump controller system will be referred to as MPSS/SYS. The components addressed in this analysis are associated with the MPSS/SYS. The purpose of this failure and reliability analysis is to address the following design issues of the Project Development Specification (PDS) for the MPSS/SYS (HNF 2000a): (1) the single component failure criterion, (2) system status upon loss of electrical power, (3) physical separation of Safety Class cables, (4) physical isolation of Safety Class wiring from general service wiring, and (5) meeting the MPSS/PLC Option 1b (RPP 1999) reliability estimate. The failure and reliability analysis examined the system on a component-level basis and identified any hardware or software elements that could fail and/or prevent the system from performing its intended safety function

  18. Application of micropolar plasticity to post failure analysis in geomechanics

    Science.gov (United States)

    Manzari, Majid T.

    2004-08-01

    A micropolar elastoplastic model for soils is formulated, and a series of finite element analyses is employed to demonstrate the use of a micropolar continuum in overcoming the numerical difficulties encountered in the application of the finite element method in a standard Cauchy-Boltzmann continuum. Three examples of failure analysis involving a deep excavation, a shallow foundation, and a retaining wall are presented. In all these cases, it is observed that the length scale introduced in the polar continuum regularizes the incremental boundary value problem and allows the numerical simulation to be continued until a clear collapse mechanism is achieved. The issue of grain size effect is also discussed.

  19. Augmenting health care failure modes and effects analysis with simulation

    DEFF Research Database (Denmark)

    Staub-Nielsen, Ditte Emilie; Dieckmann, Peter; Mohr, Marlene

    2014-01-01

    This study explores whether simulation plays a role in health care failure mode and effects analysis (HFMEA); it does this by evaluating whether additional data are found when a traditional HFMEA is augmented with simulation. Two multidisciplinary teams identified vulnerabilities in a process by brainstorming, followed by simulation. Two means of adding simulation were investigated, as follows: just simulating the process, and interrupting the simulation between substeps of the process. By adding simulation to a traditional HFMEA, both multidisciplinary teams identified additional data that were relevant...

  20. Local buckling failure analysis of high-strength pipelines

    Institute of Scientific and Technical Information of China (English)

    Yan Li; Jian Shuai; Zhong-Li Jin; Ya-Tong Zhao; Kui Xu

    2017-01-01

    Pipelines in geological disaster regions typically face the risk of local buckling failure because of their slender structure and complex loads. This paper aims to reveal the local buckling behavior of buried pipelines with a large diameter and high strength under different conditions, including pure bending and bending combined with internal pressure. A finite element analysis was built on previous data to study the local buckling behavior of pressurized and unpressurized pipes under bending conditions and their differences in local buckling failure modes. In the parametric analysis, a series of parameters, including pipe geometrical dimensions, pipe material properties and internal pressure, were selected to study their influence on the critical bending moment, critical compressive stress and critical compressive strain of pipes. In particular, the hardening exponent of the pipe material was introduced into the parametric analysis by using the Ramberg-Osgood constitutive model. Results showed that geometrical dimensions, material properties and internal pressure exert similar effects on the critical bending moment and critical compressive stress, but different, even opposite, effects on the critical compressive strain. Based on these analyses, more accurate design models for the critical bending moment and critical compressive stress have been proposed for high-strength pipelines under bending conditions, which provide theoretical methods for high-strength pipeline engineering.

  1. Failure analysis of axle shaft of a fork lift

    Directory of Open Access Journals (Sweden)

    Souvik Das

    2015-04-01

    An axle shaft of a fork lift failed in operation within 296 h of service. The shaft transmits torque from the differential to the wheel through a planetary gear arrangement. A section of the fractured axle shaft, made of induction-hardened steel, was analyzed to determine the root cause of the failure. Optical microscopy as well as field emission gun scanning electron microscopy (FEG-SEM) along with energy dispersive spectroscopy (EDS) were carried out to characterize the microstructure. The hardness profile throughout the cross-section was evaluated by micro-hardness measurements. Chemical analysis indicated that the shaft was made of 42CrMo4 steel grade as per specification. Microstructural analysis and the micro-hardness profile revealed that the shaft was improperly heat treated, resulting in a brittle case; the crack was found to initiate from the case in a brittle mode, in contrast to the ductile mode within the core. This behaviour was related to differences in microstructure, which was observed to be martensitic within the case, with a micro-hardness of 735 HV, and a non-homogeneous mixture of pearlite and ferrite within the core, with a hardness of 210 HV. The analysis suggests that the fracture initiated from the martensitic case in a brittle mode due to an improper heat treatment process (high hardness). Moreover, inclusions along the hot-working direction, i.e. the longitudinal axis, made the component more susceptible to failure.

  2. Analysis of Service Recovery Failure: From Minority Perspective

    OpenAIRE

    Yasemin Öcal Atınç

    2016-01-01

    We investigate service failures involving diverse customer groups with the aim of offering insightful proposals to managers for recovering from these failures. The previous literature provided insights into the perception of service failures by minorities and the challenge of recovery due to the racial implications of the failure; however, it failed to propose suggestions for managers so that they can either take corrective steps toward service failure recovery or prevent service fail...

  3. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.

  4. Statistical margin to DNB safety analysis approach for LOFT

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1982-01-01

    A method was developed and used for LOFT thermal safety analysis to estimate the statistical margin to DNB for the hot rod, and to base safety analysis on desired DNB probability limits. This method is an advanced approach using response surface analysis methods, a very efficient experimental design, and a 2nd-order response surface equation with a 2nd-order error propagation analysis to define the MDNBR probability density function. Calculations for limiting transients were used in the response surface analysis thereby including transient interactions and trip uncertainties in the MDNBR probability density

  5. The failure trace archive : enabling comparative analysis of failures in diverse distributed systems

    NARCIS (Netherlands)

    Kondo, D.; Javadi, B.; Iosup, A.; Epema, D.H.J.

    2010-01-01

    With the increasing functionality and complexity of distributed systems, resource failures are inevitable. While numerous models and algorithms for dealing with failures exist, the lack of public trace data sets and tools has prevented meaningful comparisons. To facilitate the design, validation,

  6. Data analysis using the Gnu R system for statistical computation

    Energy Technology Data Exchange (ETDEWEB)

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  7. A meta-analysis of the association between diabetic patients and AVF failure in dialysis.

    Science.gov (United States)

    Yan, Yan; Ye, Dan; Yang, Liu; Ye, Wen; Zhan, Dandan; Zhang, Li; Xiao, Jun; Zeng, Yan; Chen, Qinkai

    2018-11-01

    The most preferable vascular access for patients with end-stage renal failure needing hemodialysis is the native arteriovenous fistula (AVF), on account of its access longevity and its lower patient morbidity, hospitalization costs, risks of infection, and incidence of thrombotic complications. Meanwhile, according to the National Kidney Foundation (NKF)/Dialysis Outcomes Quality Initiative (DOQI) guidelines, AVF is used more than before. However, a significant percentage of AVFs fail to support dialysis therapy due to lack of adequate maturity. Among all factors, the presence of diabetes mellitus was shown by some authors to be one of the risk factors for the development of vascular access failure. Therefore, this review evaluates the current evidence concerning the correlation of diabetes and AVF failure. A search was conducted using MEDLINE, SCIENCE DIRECT, SPRINGER, WILEY-BLACKWELL, KARGER, EMbase, CNKI and WanFang Data from the establishment of the databases to January 2016. The analysis involved studies that contained subgroups of diabetic patients and compared their outcomes with those of non-diabetic adults. In total, 23 articles were retrieved and included in the review. The meta-analysis revealed a statistically significantly higher rate of AVF failure in diabetic patients compared with non-diabetic patients (OR = 1.682; 95% CI, 1.429-1.981; test of OR = 1: z = 6.25, p < .001). This review found an increased risk of AVF failure in diabetic patients. If confirmed by further prospective studies, preventive measures should be considered when planning AVF in diabetic patients.

  8. A κ-generalized statistical mechanics approach to income analysis

    International Nuclear Information System (INIS)

    Clementi, F; Gallegati, M; Kaniadakis, G

    2009-01-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low–middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful

  9. A κ-generalized statistical mechanics approach to income analysis

    Science.gov (United States)

    Clementi, F.; Gallegati, M.; Kaniadakis, G.

    2009-02-01

    This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
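
    A sketch of the distribution's tail behaviour, assuming the standard Kaniadakis κ-exponential exp_κ(x) = (sqrt(1+κ²x²)+κx)^(1/κ) and a survival function of the form S(x) = exp_κ(−βx^α); the parameter values are invented, not fitted:

        import numpy as np

        def exp_kappa(x, kappa):
            """Kaniadakis kappa-exponential; tends to exp(x) as kappa -> 0."""
            return (np.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

        def survival(x, alpha, beta, kappa):
            """kappa-generalized complementary CDF P(X > x) for income x."""
            return exp_kappa(-beta * x ** alpha, kappa)

        x = np.array([0.5, 1.0, 2.0, 5.0, 10.0])  # income, arbitrary units
        print(survival(x, alpha=2.0, beta=1.0, kappa=0.75))
        # The tail decays like a power law x**(-alpha/kappa): the Pareto regime.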

  10. Common pitfalls in statistical analysis: Linear regression analysis

    Directory of Open Access Journals (Sweden)

    Rakesh Aggarwal

    2017-01-01

    In a previous article in this series, we explained correlation analysis which describes the strength of relationship between two continuous variables. In this article, we deal with linear regression analysis which predicts the value of one continuous variable from another. We also discuss the assumptions and pitfalls associated with this analysis.

  11. A novel statistic for genome-wide interaction analysis.

    Directory of Open Access Journals (Sweden)

    Xuesen Wu

    2010-09-01

    Although great progress in genome-wide association studies (GWAS) has been made, the significant SNP associations identified by GWAS account for only a few percent of the genetic variance, leading many to question where and how we can find the missing heritability. There is increasing interest in genome-wide interaction analysis as a possible source of finding heritability unexplained by current GWAS. However, the existing statistics for testing interaction have low power for genome-wide interaction analysis. To meet the challenges raised by genome-wide interaction analysis, we have developed a novel statistic for testing interaction between two loci (either linked or unlinked). The null distribution and the type I error rates of the new statistic for testing interaction are validated using simulations. Extensive power studies show that the developed statistic has much higher power to detect interaction than classical logistic regression. The results identified 44 and 211 pairs of SNPs showing significant evidence of interactions with FDR<0.001 and 0.001 ... Genome-wide interaction analysis is a valuable tool for finding the remaining missing heritability unexplained by the current GWAS, and the developed novel statistic is able to search for significant interactions between SNPs across the genome. Real data analysis showed that the results of genome-wide interaction analysis can be replicated in two independent studies.

  12. Failure analysis and success analysis: roles in plant aging assessments

    International Nuclear Information System (INIS)

    Johnson, A.B. Jr.

    1985-06-01

    Component aging investigations are an important element in NRC's Nuclear Plant Aging Research (NPAR) strategy. Potential sources of components include plants undergoing decommissioning and commercial plants, both for in situ tests and for examination of equipment removed from service. Nuclear utilities currently have voluntary programs addressing aspects of equipment reliability, such as root cause analysis for safety-related equipment that malfunctions, and trending analysis to follow the course of both successful and abnormal equipment performance. Properly coordinated, the NPAR and utility programs offer an important approach to establishing the data base necessary for life extension of nuclear electrical generating plants

  13. Corrosion failure analysis of hearing aid battery-spring contacts

    DEFF Research Database (Denmark)

    Gudla, Visweswara Chakravarthy; Ambat, Rajan

    2017-01-01

    the susceptibility of these systems to galvanic corrosion. In this study, traditional behind the ear (BTE) hearing aid systems, which failed during service were analysed. Failure analysis was performed on the dome type battery-spring contact systems. The morphology of the contact areas was observed using scanning......Reliability of low power electrical contacts such as those in hearing aid battery-spring systems is a very critical aspect for the overall performance of the device. These systems are exposed to certain harsh environments like high humidity and elevated temperatures, and often in combination...... electron microscopy, and the compositional analysis of the corrosion products and contaminants was performed using energy dispersive X-ray spectroscopy. Wear track morphology was observed on the contact points, and the top coating on the dome was worn out exposing the substrate spring material...

  14. Failure Analysis of Nonvolatile Residue (NVR) Analyzer Model SP-1000

    Science.gov (United States)

    Potter, Joseph C.

    2011-01-01

    National Aeronautics and Space Administration (NASA) subcontractor Wiltech contacted the NASA Electrical Lab (NE-L) and requested a failure analysis of a Solvent Purity Meter, model SP-1000, produced by the VerTis Instrument Company. The meter, used to measure the contaminant level in a solvent to determine the relative contamination on spacecraft flight hardware and ground servicing equipment, had been inoperable and in storage for an unknown amount of time. NE-L was asked to troubleshoot the unit and determine what would be required to make it operational. Through general troubleshooting processes and a review of a unit in service at the time of analysis, the unit was found to be repairable but would need the replacement of multiple components.

  15. Statistical Compilation of the ICT Sector and Policy Analysis | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  16. Multivariate statistical analysis of major and trace element data for ...

    African Journals Online (AJOL)

    Multivariate statistical analysis of major and trace element data for niobium exploration in the peralkaline granites of the anorogenic ring-complex province of Nigeria. PO Ogunleye, EC Ike, I Garba. No abstract available. Journal of Mining and Geology Vol. 40(2) 2004: 107-117.

  17. Statistical Compilation of the ICT Sector and Policy Analysis | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Statistical Compilation of the ICT Sector and Policy Analysis. As the presence and influence of information and communication technologies (ICTs) continues to widen and deepen, so too does its impact on economic development. However, much work needs to be done before the linkages between economic development ...

  18. Statistical analysis of the BOIL program in RSYST-III

    International Nuclear Information System (INIS)

    Beck, W.; Hausch, H.J.

    1978-11-01

    The paper describes a statistical analysis in the RSYST-III program system. Using the example of the BOIL program, it is shown how the effects of inaccurate input data on the output data can be discovered. The existing possibilities of data generation, data handling, and data evaluation are outlined. (orig.) [de]

  19. Statistical analysis of thermal conductivity of nanofluid containing ...

    Indian Academy of Sciences (India)

    Thermal conductivity measurements of nanofluids were analysed via two-factor completely randomized design and comparison of data means is carried out with Duncan's multiple-range test. Statistical analysis of experimental data show that temperature and weight fraction have a reasonable impact on the thermal ...

  20. Multivariate statistical analysis of precipitation chemistry in Northwestern Spain

    International Nuclear Information System (INIS)

    Prada-Sanchez, J.M.; Garcia-Jurado, I.; Gonzalez-Manteiga, W.; Fiestras-Janeiro, M.G.; Espada-Rios, M.I.; Lucas-Dominguez, T.

    1993-01-01

    149 samples of rainwater were collected in the proximity of a power station in northwestern Spain at three rainwater monitoring stations. The resulting data are analyzed using multivariate statistical techniques. Firstly, the Principal Component Analysis shows that there are three main sources of pollution in the area (a marine source, a rural source and an acid source). The impact from pollution from these sources on the immediate environment of the stations is studied using Factorial Discriminant Analysis. 8 refs., 7 figs., 11 tabs

  1. Implementation and statistical analysis of Metropolis algorithm for SU(3)

    International Nuclear Information System (INIS)

    Katznelson, E.; Nobile, A.

    1984-12-01

    In this paper we study the statistical properties of an implementation of the Metropolis algorithm for SU(3) gauge theory. It is shown that the results have a normal distribution. We demonstrate that in this case error analysis can be carried out in a simple way, and we show that applying it to both the measurement strategy and the output data analysis has an important influence on the performance and reliability of the simulation. (author)

  2. Multivariate statistical analysis of precipitation chemistry in Northwestern Spain

    Energy Technology Data Exchange (ETDEWEB)

    Prada-Sanchez, J.M.; Garcia-Jurado, I.; Gonzalez-Manteiga, W.; Fiestras-Janeiro, M.G.; Espada-Rios, M.I.; Lucas-Dominguez, T. (University of Santiago, Santiago (Spain). Faculty of Mathematics, Dept. of Statistics and Operations Research)

    1993-07-01

    149 samples of rainwater were collected in the proximity of a power station in northwestern Spain at three rainwater monitoring stations. The resulting data are analyzed using multivariate statistical techniques. Firstly, the Principal Component Analysis shows that there are three main sources of pollution in the area (a marine source, a rural source and an acid source). The impact from pollution from these sources on the immediate environment of the stations is studied using Factorial Discriminant Analysis. 8 refs., 7 figs., 11 tabs.

  3. Reducing bias in the analysis of counting statistics data

    International Nuclear Information System (INIS)

    Hammersley, A.P.; Antoniadis, A.

    1997-01-01

    In the analysis of counting statistics data it is common practice to estimate the variance of the measured data points as the data points themselves. This practice introduces a bias into the results of further analysis which may be significant, and under certain circumstances lead to false conclusions. In the case of normal weighted least squares fitting this bias is quantified and methods to avoid it are proposed. (orig.)
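
    The bias described here is easy to reproduce numerically. Below is a minimal sketch (not from the paper) that fits a constant to Poisson counts, once weighting each point by the inverse of the measured value, as the criticized practice implies, and once without weights; the weighted estimate comes out systematically low.

```python
import numpy as np

rng = np.random.default_rng(1)
true_rate, n_points, n_trials = 25.0, 50, 2000

weighted, unweighted = [], []
for _ in range(n_trials):
    y = rng.poisson(true_rate, n_points).astype(float)
    y = np.clip(y, 1.0, None)            # avoid division by zero counts
    w = 1.0 / y                          # variance estimated as the data themselves
    weighted.append(np.sum(w * y) / np.sum(w))   # weighted LS fit of a constant
    unweighted.append(y.mean())                  # unweighted fit

print(f"true rate       : {true_rate}")
print(f"weighted mean   : {np.mean(weighted):.2f}   (systematically low)")
print(f"unweighted mean : {np.mean(unweighted):.2f}")
```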

  4. Fisher statistics for analysis of diffusion tensor directional information.

    Science.gov (United States)

    Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P

    2012-04-30

    A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample, and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; in the hilus, however, significant differences were detected, demonstrating the utility of this approach for statistical comparison of tissue structural orientation.
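
    As an illustration of the kind of computation involved, the following sketch estimates a mean direction vector, the resultant length R and the Fisher concentration parameter from unit vectors; the data and the large-kappa approximation used for kappa are assumptions, not the paper's implementation.

```python
import numpy as np

def fisher_mean_direction(vectors):
    """Mean direction, resultant length R and concentration kappa
    for unit vectors on the sphere (Fisher statistics)."""
    v = np.asarray(vectors, dtype=float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)  # force unit length
    resultant = v.sum(axis=0)
    R = np.linalg.norm(resultant)          # resultant length, 0 <= R <= n
    mean_dir = resultant / R
    n = len(v)
    kappa = (n - 1) / (n - R)              # large-kappa approximation
    return mean_dir, R, kappa

rng = np.random.default_rng(0)
# Hypothetical per-sample principal eigenvectors clustered about the z-axis
sample = np.array([0.0, 0.0, 1.0]) + 0.1 * rng.standard_normal((20, 3))
mean_dir, R, kappa = fisher_mean_direction(sample)
print(f"mean direction = {mean_dir}, R = {R:.2f}, kappa = {kappa:.1f}")
```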

  5. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  6. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  7. Failure analysis of storage tank component in LNG regasification unit using fault tree analysis method (FTA)

    Science.gov (United States)

    Mulyana, Cukup; Muhammad, Fajar; Saad, Aswad H.; Mariah; Riveli, Nowo

    2017-03-01

    The storage tank is the most critical component in an LNG regasification terminal. It carries a risk of failure and accidents which impact human health and the environment. Risk assessment is conducted to detect and reduce the risk of failure in the storage tank. The aim of this research is to determine and calculate the probability of failure in the regasification unit of LNG. In this case, the failure is caused by Boiling Liquid Expanding Vapor Explosion (BLEVE) and jet fire in the LNG storage tank component. The failure probability can be determined by using Fault Tree Analysis (FTA). In addition, the impact of the heat radiation generated is calculated. Fault trees for BLEVE and jet fire on the storage tank component were determined, yielding a failure probability of 5.63 × 10⁻¹⁹ for BLEVE and 9.57 × 10⁻³ for jet fire. The failure probability for jet fire is high enough that it needs to be reduced by customizing the PID scheme of the LNG regasification unit in pipeline number 1312 and unit 1. After customization, the failure probability was reduced to 4.22 × 10⁻⁶.
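
    For readers unfamiliar with FTA arithmetic, the sketch below shows how basic-event probabilities propagate through independent OR and AND gates to a top event; the event names and probabilities are hypothetical and not taken from this study.

```python
import math

def or_gate(probs):
    """P(at least one input event occurs), assuming independent events."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all input events occur), assuming independent events."""
    return math.prod(probs)

# Hypothetical basic-event probabilities for a jet-fire branch
p_pipe_leak = 1.0e-3    # pipeline leak
p_valve_fail = 5.0e-4   # valve seal failure
p_ignition = 1.0e-2     # ignition source present

# A jet fire requires (leak from either path) AND ignition
p_any_leak = or_gate([p_pipe_leak, p_valve_fail])
p_jet_fire = and_gate([p_any_leak, p_ignition])
print(f"P(jet fire) ~ {p_jet_fire:.3e}")
```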

  8. Standard guide for corrosion-related failure analysis

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2000-01-01

    1.1 This guide covers key issues to be considered when examining metallic failures when corrosion is suspected as either a major or minor causative factor. 1.2 Corrosion-related failures could include one or more of the following: change in surface appearance (for example, tarnish, rust, color change), pin hole leak, catastrophic structural failure (for example, collapse, explosive rupture, implosive rupture, cracking), weld failure, loss of electrical continuity, and loss of functionality (for example, seizure, galling, spalling, swelling). 1.3 Issues covered include overall failure site conditions, operating conditions at the time of failure, history of equipment and its operation, corrosion product sampling, environmental sampling, metallurgical and electrochemical factors, morphology (mode) of failure, and, by considering the preceding, deducing the cause(s) of corrosion failure. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibili...

  9. Machinery failure analysis and troubleshooting practical machinery management for process plants

    CERN Document Server

    Bloch, Heinz P

    2012-01-01

    Solve the machinery failure problems costing you time and money with this classic, comprehensive guide to analysis and troubleshooting  Provides detailed, complete and accurate information on anticipating risk of component failure and avoiding equipment downtime Includes numerous photographs of failed parts to ensure you are familiar with the visual evidence you need to recognize Covers proven approaches to failure definition and offers failure identification and analysis methods that can be applied to virtually all problem situations Demonstr

  10. Failure analysis of collector circuits associated with wind farms

    Directory of Open Access Journals (Sweden)

    Clifton Ashley P.

    2017-01-01

    Full Text Available Wind farm collector circuits generally comprise several wind turbine generators (WTGs). WTGs are connected in parallel to a substation, and this connection acts as the point of connection to the national electricity grid. The electrical load in these circuits is close to the ratings of the components (power cables and accessories). The objective of this paper is to identify cable joint failure paths and to develop an understanding of specific contributing factors. All findings presented were established from a literature review involving data analysis and discussion with industry experts working across the wind industry. Application of forces, inadequate workmanship, incorrect thermal resistance and other contributing factors all contribute to high conductor operating temperatures. High conductor operating temperatures highlight issues including insufficient environmental heat transfer due to the use of inadequate cable trenching materials. This in turn results in the imbalanced application of force experienced at the cable joint, as a direct result of frequent thermal expansion and contraction. For most cable joint failures, the root cause is insulation breakdown due to sustained deterioration of the cross-linked polyethylene insulation, a direct result of excessive operating temperatures.

  11. Survival analysis of heart failure patients: A case study.

    Directory of Open Access Journals (Sweden)

    Tanvir Ahmad

    Full Text Available This study was focused on survival analysis of heart failure patients who were admitted to Institute of Cardiology and Allied hospital Faisalabad-Pakistan during April-December (2015). All the patients were aged 40 years or above, having left ventricular systolic dysfunction, belonging to NYHA class III and IV. Cox regression was used to model mortality considering age, ejection fraction, serum creatinine, serum sodium, anemia, platelets, creatinine phosphokinase, blood pressure, gender, diabetes and smoking status as potentially contributing to mortality. A Kaplan Meier plot was used to study the general pattern of survival, which showed high intensity of mortality in the initial days and then a gradual increase up to the end of the study. Martingale residuals were used to assess the functional form of variables. Results were validated by computing the calibration slope and the discrimination ability of the model via bootstrapping. For graphical prediction of survival probability, a nomogram was constructed. Age, renal dysfunction, blood pressure, ejection fraction and anemia were found to be significant risk factors for mortality among heart failure patients.

  12. Survival analysis of heart failure patients: A case study.

    Science.gov (United States)

    Ahmad, Tanvir; Munir, Assia; Bhatti, Sajjad Haider; Aftab, Muhammad; Raza, Muhammad Ali

    2017-01-01

    This study was focused on survival analysis of heart failure patients who were admitted to Institute of Cardiology and Allied hospital Faisalabad-Pakistan during April-December (2015). All the patients were aged 40 years or above, having left ventricular systolic dysfunction, belonging to NYHA class III and IV. Cox regression was used to model mortality considering age, ejection fraction, serum creatinine, serum sodium, anemia, platelets, creatinine phosphokinase, blood pressure, gender, diabetes and smoking status as potentially contributing to mortality. A Kaplan Meier plot was used to study the general pattern of survival, which showed high intensity of mortality in the initial days and then a gradual increase up to the end of the study. Martingale residuals were used to assess the functional form of variables. Results were validated by computing the calibration slope and the discrimination ability of the model via bootstrapping. For graphical prediction of survival probability, a nomogram was constructed. Age, renal dysfunction, blood pressure, ejection fraction and anemia were found to be significant risk factors for mortality among heart failure patients.
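
    A minimal sketch of this style of analysis, using the lifelines Python package on fabricated data; the column names echo covariates mentioned in the abstract, but the values are invented, so this shows the workflow rather than the study's results.

```python
# Sketch only: pip install lifelines pandas
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "time":  [4, 6, 8, 10, 30, 60, 95, 110, 150, 200],   # follow-up (days)
    "event": [1, 1, 1, 0, 1, 0, 1, 0, 0, 1],             # 1 = death observed
    "age":   [65, 70, 58, 45, 80, 52, 75, 49, 61, 68],
    "ejection_fraction": [20, 25, 38, 60, 15, 55, 30, 62, 45, 35],
})

# Kaplan-Meier curve for the overall survival pattern
kmf = KaplanMeierFitter().fit(df["time"], df["event"])
print(kmf.survival_function_.head())

# Cox proportional hazards model for covariate effects on mortality
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.print_summary()
```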

  13. Failure analysis of re-bars during bending operations

    Directory of Open Access Journals (Sweden)

    Souvik Das

    2014-10-01

    Full Text Available Thermo-mechanically treated (TMT) rebar is a suitable material for reinforcing concrete structures on account of the similarity in thermal expansion, its ability to bond well with concrete and, above all, its ability to shoulder most of the tensile stress acting on the structure; the steel manufacturing industry has also successfully developed a corrosion-resistant variety of rebar for the construction industry. As TMT rebar is a finished product, proper control of the rolling parameters and the water box is needed to achieve adequate properties. The water box plays an important role in achieving the final structure and properties of the rebars: it is responsible for outer rim formation, which helps to achieve the yield strength of the material. The present paper highlights the failure investigation of a rebar that failed during bending operations. Fractography and microstructural analysis confirmed that the rebar sample failed in a brittle manner due to a through-hardened martensitic structure, which indicates an anomaly in the water box resulting in these premature failures.

  14. STATCAT, Statistical Analysis of Parametric and Non-Parametric Data

    International Nuclear Information System (INIS)

    David, Hugh

    1990-01-01

    1 - Description of program or function: A suite of 26 programs designed to facilitate the appropriate statistical analysis and data handling of parametric and non-parametric data, using classical and modern univariate and multivariate methods. 2 - Method of solution: Data is read entry by entry, using a choice of input formats, and the resultant data bank is checked for out-of-range, rare, extreme or missing data. The completed STATCAT data bank can be treated by a variety of descriptive and inferential statistical methods, and modified, using other standard programs as required

  15. Analytical and statistical analysis of elemental composition of lichens

    International Nuclear Information System (INIS)

    Calvelo, S.; Baccala, N.; Bubach, D.; Arribere, M.A.; Riberio Guevara, S.

    1997-01-01

    The elemental composition of lichens from remote southern South America regions has been studied with analytical and statistical techniques to determine whether the values obtained reflect species, growth forms or habitat characteristics. The enrichment factors are calculated, discriminated by species and collection site, and compared with data available in the literature. The elemental concentrations are standardized and compared for different species. The information was statistically processed; a cluster analysis was performed using the first three principal axes of the PCA, and the three groups formed are presented. Their relationship with the species, collection sites and the lichen growth forms is interpreted. (author)

  16. Detecting errors in micro and trace analysis by using statistics

    DEFF Research Database (Denmark)

    Heydorn, K.

    1993-01-01

    By assigning a standard deviation to each step in an analytical method it is possible to predict the standard deviation of each analytical result obtained by this method. If the actual variability of replicate analytical results agrees with the expected, the analytical method is said to be in statistical control. Significant deviations between analytical results from different laboratories reveal the presence of systematic errors, and agreement between different laboratories indicates the absence of systematic errors. This statistical approach, referred to as the analysis of precision, was applied...

  17. Multivariate statistical analysis of atom probe tomography data

    International Nuclear Information System (INIS)

    Parish, Chad M.; Miller, Michael K.

    2010-01-01

    The application of spectrum imaging multivariate statistical analysis methods, specifically principal component analysis (PCA), to atom probe tomography (APT) data has been investigated. The mathematical method of analysis is described and the results for two example datasets are analyzed and presented. The first dataset is from the analysis of a PM 2000 Fe-Cr-Al-Ti steel containing two different ultrafine precipitate populations. PCA properly describes the matrix and precipitate phases in a simple and intuitive manner. A second APT example is from the analysis of an irradiated reactor pressure vessel steel. Fine, nm-scale Cu-enriched precipitates having a core-shell structure were identified and qualitatively described by PCA. Advantages, disadvantages, and future prospects for implementing these data analysis methodologies for APT datasets, particularly with regard to quantitative analysis, are also discussed.

  18. Ayame/PAM-D apogee kick motor nozzle failure analysis

    Science.gov (United States)

    1981-01-01

    The failure of two communication satellites during their firing sequences is examined. The correlation/comparison of the circumstances of the Ayame incidents and the failure of the STAR 48 (DM-2) motor is reviewed. The massive nozzle failure of the AKM is examined to determine the impact on spacecraft performance. It is recommended that a closer watch be kept on systems techniques.

  19. Using Pre-Statistical Analysis to Streamline Monitoring Assessments

    International Nuclear Information System (INIS)

    Reed, J.K.

    1999-01-01

    A variety of statistical methods exist to aid evaluation of groundwater quality and subsequent decision making in regulatory programs. These methods are applied because of the large temporal and spatial extrapolations commonly applied to these data. In short, statistical conclusions often serve as a surrogate for knowledge. However, facilities with mature monitoring programs that have generated abundant data have inherently less uncertainty because of the sheer quantity of analytical results. In these cases, statistical tests can be less important, and 'expert' data analysis should assume an important screening role. The WSRC Environmental Protection Department, working with the General Separations Area BSRI Environmental Restoration project team, has developed a method for an Integrated Hydrogeological Analysis (IHA) of historical water quality data from the F and H Seepage Basins groundwater remediation project. The IHA combines common sense analytical techniques and a GIS presentation that force direct interactive evaluation of the data. The IHA can perform multiple data analysis tasks required by the RCRA permit. These include: (1) development of a groundwater quality baseline prior to remediation startup, (2) targeting of constituents for removal from the RCRA GWPS, (3) targeting of constituents for removal from the UIC permit, (4) targeting of constituents for reduced monitoring, (5) targeting of monitoring wells not producing representative samples, (6) reduction in statistical evaluation, and (7) identification of contamination from other facilities

  20. Clinical risk analysis with failure mode and effect analysis (FMEA) model in a dialysis unit.

    Science.gov (United States)

    Bonfant, Giovanna; Belfanti, Pietro; Paternoster, Giuseppe; Gabrielli, Danila; Gaiter, Alberto M; Manes, Massimo; Molino, Andrea; Pellu, Valentina; Ponzetti, Clemente; Farina, Massimo; Nebiolo, Pier E

    2010-01-01

    The aim of clinical risk management is to improve the quality of care provided by health care organizations and to assure patients' safety. Failure mode and effect analysis (FMEA) is a tool employed for clinical risk reduction. We applied FMEA to chronic hemodialysis outpatients. FMEA steps: (i) process study: we recorded phases and activities. (ii) Hazard analysis: we listed activity-related failure modes and their effects; described control measures; assigned severity, occurrence and detection scores for each failure mode; and calculated the risk priority numbers (RPNs) by multiplying the 3 scores. The total RPN is calculated by adding the single failure mode RPNs. (iii) Planning: we performed an RPN prioritization on a priority matrix taking into account the 3 scores, analyzed the causes of failure modes, made recommendations and planned new control measures. (iv) Monitoring: after failure mode elimination or reduction, we compared the resulting RPN with the previous one. Our failure modes with the highest RPNs came from communication and organization problems. Two tools have been created to improve information flow: "dialysis agenda" software and nursing datasheets. We scheduled nephrological examinations, and we changed both the medical and the nursing organization. The total RPN value decreased from 892 to 815 (8.6%) after reorganization. Employing FMEA, we worked on a few critical activities and reduced patients' clinical risk. A priority matrix also takes into account the weight of the control measures: we believe this evaluation is quick, because of simple priority selection, and that it decreases action times.
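
    The RPN bookkeeping described above is straightforward to script. The sketch below multiplies severity, occurrence and detection for each failure mode, ranks the results and sums the total RPN; the failure modes and scores are invented for illustration.

```python
# Minimal RPN bookkeeping for an FMEA worksheet; entries are illustrative only
failure_modes = [
    # (description, severity, occurrence, detection), each scored 1-10
    ("Dialysis prescription not communicated to nurse", 8, 5, 6),
    ("Wrong dialyzer mounted",                          9, 2, 3),
    ("Vascular access not inspected before session",    7, 4, 4),
]

# RPN = severity x occurrence x detection, for each failure mode
rpns = [(desc, s * o * d) for desc, s, o, d in failure_modes]
rpns.sort(key=lambda item: item[1], reverse=True)  # highest risk first

total_rpn = sum(rpn for _, rpn in rpns)
for desc, rpn in rpns:
    print(f"RPN {rpn:4d}  {desc}")
print(f"Total RPN: {total_rpn}")
```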

  1. Goal-oriented failure analysis - a systems analysis approach to hazard identification

    International Nuclear Information System (INIS)

    Reeves, A.B.; Davies, J.; Foster, J.; Wells, G.L.

    1990-01-01

    Goal-Oriented Failure Analysis, GOFA, is a methodology which is being developed to identify and analyse the potential failure modes of a hazardous plant or process. The technique will adopt a structured top-down approach, with a particular failure goal being systematically analysed. A systems analysis approach is used, with the analysis organised around a systems diagram of the plant or process under study. GOFA will also use checklists to supplement the analysis; these checklists will be prepared in advance of a group session and will help to guide the analysis, avoiding unnecessary time being spent on identifying obvious failure modes and reducing the chance of failing to identify certain hazards or failures. GOFA is being developed with the aim of providing a hazard identification methodology which is more efficient and stimulating than the conventional approach to HAZOP. The top-down approach should ensure that the analysis is more focused, and the use of a systems diagram will help to pull the analysis together at an early stage whilst also helping to structure the sessions in a more stimulating way than the conventional techniques. GOFA will be, essentially, an extension of the HAZOP methodology. GOFA is currently being computerised using a knowledge-based systems approach for implementation; the Goldworks II expert systems development tool is being used. (author)

  2. Data management and statistical analysis for environmental assessment

    International Nuclear Information System (INIS)

    Wendelberger, J.R.; McVittie, T.I.

    1995-01-01

    Data management and statistical analysis for environmental assessment are important issues on the interface of computer science and statistics. Data collection for environmental decision making can generate large quantities of various types of data. A database/GIS system developed is described which provides efficient data storage as well as visualization tools which may be integrated into the data analysis process. FIMAD is a living database and GIS system. The system has changed and developed over time to meet the needs of the Los Alamos National Laboratory Restoration Program. The system provides a repository for data which may be accessed by different individuals for different purposes. The database structure is driven by the large amount and varied types of data required for environmental assessment. The integration of the database with the GIS system provides the foundation for powerful visualization and analysis capabilities

  3. Explorations in statistics: the analysis of ratios and normalized data.

    Science.gov (United States)

    Curran-Everett, Douglas

    2013-09-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This ninth installment of Explorations in Statistics explores the analysis of ratios and normalized (or standardized) data. As researchers, we compute a ratio (a numerator divided by a denominator) to compute a proportion for some biological response or to derive some standardized variable. In each situation, we want to control for differences in the denominator when the thing we really care about is the numerator. But there is peril lurking in a ratio: only if the relationship between numerator and denominator is a straight line through the origin will the ratio be meaningful. If not, the ratio will misrepresent the true relationship between numerator and denominator. In contrast, regression techniques (these include analysis of covariance) are versatile: they can accommodate an analysis of the relationship between numerator and denominator when a ratio is useless.
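
    The warning about ratios can be checked with a simple regression, as in this illustrative sketch with fabricated data: when the fitted intercept is far from zero, the ratio y/x drifts with x and misrepresents the relationship.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Fabricated data: response y vs denominator x with a nonzero baseline,
# so the line does not pass through the origin
x = rng.uniform(20, 80, 40)
y = 5.0 + 0.8 * x + rng.normal(0.0, 2.0, 40)

res = stats.linregress(x, y)
print(f"slope = {res.slope:.2f}, intercept = {res.intercept:.2f} "
      "(a ratio is only meaningful if the intercept is ~0)")
# The mean ratio overstates the slope here because of the nonzero intercept
print(f"mean ratio y/x = {np.mean(y / x):.2f}")
```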

  4. Statistical analysis and interpolation of compositional data in materials science.

    Science.gov (United States)

    Pesenson, Misha Z; Suram, Santosh K; Gregoire, John M

    2015-02-09

    Compositional data are ubiquitous in chemistry and materials science: analysis of elements in multicomponent systems, combinatorial problems, etc., lead to data that are non-negative and sum to a constant (for example, atomic concentrations). The constant sum constraint restricts the sampling space to a simplex instead of the usual Euclidean space. Since statistical measures such as mean and standard deviation are defined for the Euclidean space, traditional correlation studies, multivariate analysis, and hypothesis testing may lead to erroneous dependencies and incorrect inferences when applied to compositional data. Furthermore, composition measurements that are used for data analytics may not include all of the elements contained in the material; that is, the measurements may be subcompositions of a higher-dimensional parent composition. Physically meaningful statistical analysis must yield results that are invariant under the number of composition elements, requiring the application of specialized statistical tools. We present specifics and subtleties of compositional data processing through discussion of illustrative examples. We introduce basic concepts, terminology, and methods required for the analysis of compositional data and utilize them for the spatial interpolation of composition in a sputtered thin film. The results demonstrate the importance of this mathematical framework for compositional data analysis (CDA) in the fields of materials science and chemistry.
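
    One of the specialized tools alluded to is the centered log-ratio (clr) transform, which maps the simplex into ordinary Euclidean space where familiar statistics (means, PCA, and so on) become meaningful. A minimal sketch with hypothetical atomic concentrations, not data from the paper:

```python
import numpy as np

def clr(compositions):
    """Centered log-ratio transform: divide each part by the row's
    geometric mean and take logs; rows then sum to zero."""
    x = np.asarray(compositions, dtype=float)
    x = x / x.sum(axis=1, keepdims=True)               # close to constant sum
    gmean = np.exp(np.log(x).mean(axis=1, keepdims=True))
    return np.log(x / gmean)

# Hypothetical atomic concentrations of a ternary thin-film library
comps = np.array([
    [0.60, 0.30, 0.10],
    [0.55, 0.35, 0.10],
    [0.40, 0.40, 0.20],
])
print(clr(comps))   # each row sums to zero in clr space
```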

  5. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  6. Failures to further developing orphan medicinal products after designation granted in Europe: an analysis of marketing authorisation failures and abandoned drugs.

    Science.gov (United States)

    Giannuzzi, Viviana; Landi, Annalisa; Bosone, Enrico; Giannuzzi, Floriana; Nicotri, Stefano; Torrent-Farnell, Josep; Bonifazi, Fedele; Felisi, Mariagrazia; Bonifazi, Donato; Ceci, Adriana

    2017-09-11

    The research and development process in the field of rare diseases is characterised by many well-known difficulties, and a large percentage of orphan medicinal products do not reach marketing approval. This work aims at identifying orphan medicinal products that failed the developmental process and investigating the reasons for, and possible factors influencing, failures. Drugs designated in Europe under Regulation (European Commission) 141/2000 in the period 2000-2012 were investigated in terms of the following failures: (1) marketing authorisation failures (refused or withdrawn) and (2) drugs abandoned by sponsors during development. Possible risk factors for failure were analysed using statistically validated methods. This study points out that 437 out of 788 designations are still under development, while 219 failed the developmental process. Among the latter, 34 failed the marketing authorisation process and 185 were abandoned during the developmental process. In the first group of drugs (marketing authorisation failures), 50% reached phase II, 47% reached phase III and 3% reached phase I, while in the second group (abandoned drugs), the majority of orphan medicinal products apparently never started the development process, since no data on 48.1% of them were published and 3.2% did not progress beyond the non-clinical stage. The reasons for failures of marketing authorisation were: efficacy/safety issues (26), insufficient data (12), quality issues (7), regulatory issues on trials (4) and commercial reasons (1). The main causes for abandoned drugs were efficacy/safety issues (reported in 54 cases), inactive companies (25.4%), change of company strategy (8.1%) and drug competition (10.8%). No information concerning reasons for failure was available for 23.2% of the analysed products. This analysis shows that failures occurred in 27.8% of all designations granted in Europe, the main reasons being safety and efficacy issues. Moreover, the stage of development

  7. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    Science.gov (United States)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  8. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    Science.gov (United States)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology-especially to those aspects with great impact on public policy-statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each will contain between four and eight articles. CORSSA now includes seven articles with an additional six in draft form along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  9. How Analysis Informs Regulation: Success and Failure of ...

    Science.gov (United States)

    How Analysis Informs Regulation: Success and Failure of Evolving Approaches to Polyfluoroalkyl Acid Contamination. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA mission to protect human health and the environment. The HEASD research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  10. Failure analysis of the fractured wires in sternal perichronal loops.

    Science.gov (United States)

    Chao, Jesús; Voces, Roberto; Peña, Carmen

    2011-10-01

    We report the failure analysis of sternal wires in two cases in which a perichronal fixation technique was used to close the sternotomy. Various characteristics of the retrieved wires were compared to those of unused wires of the same grade and same manufacturer and with surgical wire specifications. In both cases, wire fracture was unbranched and transgranular and proceeded by a high cycle fatigue process, apparently in the absence of corrosion. However, stress analysis indicates that the effective stress produced during strong coughing is lower than the yield strength. Our findings suggest that in order to reduce the risk of sternal dehiscence, the diameter of the wire used should be increased.

  11. Prestudy - Development of trend analysis of component failure

    International Nuclear Information System (INIS)

    Poern, K.

    1995-04-01

    The Bayesian trend analysis model that has been used for the computation of initiating event intensities (I-book) is based on the number of events that have occurred during consecutive time intervals. The model itself is a Poisson process with time-dependent intensity. For the analysis of ageing it is often more relevant to use times between failures for a given component as input, where by 'time' is meant a quantity that best characterizes the age of the component (calendar time, operating time, number of activations, etc). Therefore, it has been considered necessary to extend the model and the computer code to allow trend analysis of times between events, and also of several sequences of times between events. This report describes this model extension as well as an application to an introductory ageing analysis of centrifugal pumps defined in Table 5 of the T-book. The application in turn directs attention to the need for further development of both the trend model and the data base. Figs

  12. Observations in the statistical analysis of NBG-18 nuclear graphite strength tests

    International Nuclear Information System (INIS)

    Hindley, Michael P.; Mitchell, Mark N.; Blaine, Deborah C.; Groenwold, Albert A.

    2012-01-01

    Highlights: ► Statistical analysis of NBG-18 nuclear graphite strength tests. ► A Weibull distribution and a normal distribution are tested for all data. ► A bimodal distribution in the CS data is confirmed. ► The CS data set has the lowest variance. ► A combined data set is formed and has a Weibull distribution. - Abstract: The purpose of this paper is to report on the selection of a statistical distribution chosen to represent the experimental material strength of NBG-18 nuclear graphite. Three large sets of samples were tested during the material characterisation of the Pebble Bed Modular Reactor and Core Structure Ceramics materials. These sets of samples are tensile strength, flexural strength and compressive strength (CS) measurements. A relevant statistical fit is determined and the goodness of fit is evaluated for each data set. The data sets are also normalised for ease of comparison, and combined into one representative data set; the validity of this approach is demonstrated. A second failure mode distribution is found in the CS test data. Identifying this failure mode supports similar observations made in the past. The success of fitting the Weibull distribution through the normalised data sets allows us to improve the basis for the estimates of the variability. This could also imply that the variability in the graphite strength for the different strength measures is based on the same flaw distribution and is thus a property of the material.
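
    A sketch of the kind of fit described, using scipy on synthetic strength data; the shape and scale values below are placeholders, not NBG-18 results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Synthetic strength data (MPa); real NBG-18 values differ
strength = stats.weibull_min.rvs(c=8.0, scale=70.0, size=200, random_state=rng)

# Fit a two-parameter Weibull (location fixed at zero)
shape, loc, scale = stats.weibull_min.fit(strength, floc=0)
print(f"Weibull modulus (shape) = {shape:.2f}, "
      f"characteristic strength = {scale:.1f} MPa")

# Goodness of fit via a Kolmogorov-Smirnov test against the fitted distribution
ks = stats.kstest(strength, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```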

  13. Defining Business decline, failure and turnaround: A content analysis

    Directory of Open Access Journals (Sweden)

    Marius Pretorius

    2009-12-01

    Full Text Available In the past, researchers have often defined failure to suit their data. This has led to a lack of comparability in research outputs. The overriding objective of this paper is to propose a universal definition for the failure phenomenon. Clear definitions are a prerequisite for exploring major constructs, their relationship to failure and the context and processes involved. The study reports on the core definitions of the failure phenomenon and identifies core criteria for distinguishing between them. It places decline, failure and turnaround in perspective and highlights level of distress and turnaround as key moderating elements. It distinguishes the failure phenomenon from controversial synonyms such as closure, accidental bankruptcy and closure for alternative motives. Key words and phrases: business decline, failure, turnaround, level of distress

  14. Perceptual and statistical analysis of cardiac phase and amplitude images

    International Nuclear Information System (INIS)

    Houston, A.; Craig, A.

    1991-01-01

    A perceptual experiment was conducted using cardiac phase and amplitude images. Estimates of statistical parameters were derived from the images, and the diagnostic potential of human and statistical decisions was compared. Five methods were used to generate the images from 75 gated cardiac studies, 39 of which were classified as pathological. The images were presented to 12 observers experienced in nuclear medicine. The observers rated the images using a five-category scale based on their confidence that an abnormality was present. Circular and linear statistics were used to analyse the phase and amplitude image data, respectively. Estimates of the mean, standard deviation (SD), skewness, kurtosis and the first term of the spatial correlation function were evaluated in the region of the left ventricle. A receiver operating characteristic analysis was performed on both sets of data and the human and statistical decisions compared. For phase images, circular SD was shown to discriminate between normal and abnormal better than experienced observers, but no single statistic discriminated as well as the human observer for amplitude images. (orig.)

  15. Failure mode and effects analysis and fault tree analysis of surface image guided cranial radiosurgery.

    Science.gov (United States)

    Manger, Ryan P; Paxton, Adam B; Pawlicki, Todd; Kim, Gwe-Ya

    2015-05-01

    Surface image guided, Linac-based radiosurgery (SIG-RS) is a modern approach for delivering radiosurgery that utilizes optical stereoscopic imaging to monitor the surface of the patient during treatment in lieu of using a head frame for patient immobilization. Considering the novelty of the SIG-RS approach and the severity of errors associated with delivery of large doses per fraction, a risk assessment should be conducted to identify potential hazards, determine their causes, and formulate mitigation strategies. The purpose of this work is to investigate SIG-RS using the combined application of failure modes and effects analysis (FMEA) and fault tree analysis (FTA), report on the effort required to complete the analysis, and evaluate the use of FTA in conjunction with FMEA. A multidisciplinary team was assembled to conduct the FMEA on the SIG-RS process. A process map detailing the steps of the SIG-RS was created to guide the FMEA. Failure modes were determined for each step in the SIG-RS process, and risk priority numbers (RPNs) were estimated for each failure mode to facilitate risk stratification. The failure modes were ranked by RPN, and FTA was used to determine the root factors contributing to the riskiest failure modes. Using the FTA, mitigation strategies were formulated to address the root factors and reduce the risk of the process. The RPNs were re-estimated based on the mitigation strategies to determine the margin of risk reduction. The FMEA and FTAs for the top two failure modes required an effort of 36 person-hours (30 person-hours for the FMEA and 6 person-hours for two FTAs). The SIG-RS process consisted of 13 major subprocesses and 91 steps, which amounted to 167 failure modes. Of the 91 steps, 16 were directly related to surface imaging. Twenty-five failure modes resulted in a RPN of 100 or greater. Only one of these top 25 failure modes was specific to surface imaging. The riskiest surface imaging failure mode had an overall RPN-rank of eighth

  16. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Science.gov (United States)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project
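
    The Monte Carlo step can be sketched as follows: each driving parameter is sampled from an assumed distribution, a physics-based limit state is evaluated per sample, and the failure probability is the fraction of failing samples. The limit state and distributions below are illustrative assumptions, not the Ares I models.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Hypothetical limit state: failure occurs when stress exceeds strength.
# Each driving parameter is a random variable with an assumed distribution.
load = rng.normal(100.0, 15.0, n)                 # applied load (kN)
area = rng.normal(2.0, 0.05, n)                   # cross-section (cm^2)
strength = rng.lognormal(np.log(65.0), 0.08, n)   # material strength (kN/cm^2)

stress = load / area
p_fail = np.mean(stress > strength)

# Standard error of the estimate, to judge whether n samples are enough
se = np.sqrt(p_fail * (1.0 - p_fail) / n)
print(f"P(failure) ~ {p_fail:.2e} +/- {se:.1e}")
```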

  17. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    Science.gov (United States)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and components/chemicals used in artificial soil formulation is briefly explained. Both poly-Si mini-modules and a single cell mono-Si coupons were soiled and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled, and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si due to smaller size cells was eliminated on the mono-Si coupons with large cells to obtain highly repeatable measurements. This study indicates that the reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York was evaluated. Defect chart, degradation rates (both string and module levels) and safety map were generated using the field measured data. A statistical reliability tool, FMECA that uses Risk Priority Number (RPN) is used to determine the dominant failure or degradation modes in the strings and modules by means of ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly

  18. State analysis of BOP using statistical and heuristic methods

    International Nuclear Information System (INIS)

    Heo, Gyun Young; Chang, Soon Heung

    2003-01-01

    Under the deregulation environment, the performance enhancement of the BOP in nuclear power plants is being highlighted. To analyze the performance level of the BOP, we use the performance test procedures provided by an authorized institution such as ASME. However, plant investigation proved that the requirements of the performance test procedures for the reliability and quantity of sensors were difficult to satisfy. As a solution, a state analysis method, an expanded concept of signal validation, was proposed on the basis of statistical and heuristic approaches. The authors recommended a statistical linear regression model, built by analyzing correlations among BOP parameters, as the reference state analysis method. Its advantages are that its derivation is not heuristic, it is possible to calculate model uncertainty, and it is easy to apply to an actual plant. The error of the statistical linear regression model is below 3% under normal as well as abnormal system states. Additionally, a neural network model was recommended, since the statistical model cannot be applied to the validation of all of the sensors and is sensitive to outliers, i.e. signals located outside a statistical distribution. Because many sensors in the BOP need to be validated, wavelet analysis (WA) was applied as a pre-processor to reduce the input dimension and to enhance training accuracy. The outlier localization capability of WA enhanced the robustness of the neural network. The trained neural network restored the degraded signals to values within ±3% of the true signals

  19. Computerized statistical analysis with bootstrap method in nuclear medicine

    International Nuclear Information System (INIS)

    Zoccarato, O.; Sardina, M.; Zatta, G.; De Agostini, A.; Barbesti, S.; Mana, O.; Tarolo, G.L.

    1988-01-01

    Statistical analysis of data samples involves some hypotheses about the features of the data themselves. The accuracy of these hypotheses can influence the results of statistical inference. Among the new methods of computer-aided statistical analysis, the bootstrap method appears to be one of the most powerful, thanks to its ability to reproduce many artificial samples starting from a single original sample and because it works without hypotheses about the data distribution. The authors applied the bootstrap method to two typical situations in a nuclear medicine department. The determination of the normal range of serum ferritin, as assessed by radioimmunoassay and defined by the mean value ±2 standard deviations, starting from an experimental sample of small size, shows an unacceptable lower limit (plasma ferritin levels below zero). On the contrary, the results obtained by processing 5000 bootstrap samples give an interval of values (10.95 ng/ml - 72.87 ng/ml) corresponding to the normal ranges commonly reported. Moreover, the authors applied the bootstrap method in evaluating the possible error associated with the correlation coefficient determined between left ventricular ejection fraction (LVEF) values obtained by first pass radionuclide angiocardiography with 99mTc and 195mAu. The results obtained indicate a high degree of statistical correlation and give the range of r² values to be considered acceptable for this type of study
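
    A minimal sketch of the percentile-bootstrap idea with fabricated ferritin values; for brevity it is applied here to the sample mean rather than to the full normal range the authors computed. Unlike mean ± 2 SD, the percentile interval cannot extend below the observed data range.

```python
import numpy as np

rng = np.random.default_rng(4)
# Fabricated small serum-ferritin sample (ng/ml); the paper's data differ
ferritin = np.array([12.0, 18.5, 22.0, 30.1, 35.7, 41.9, 48.2, 55.0, 60.3, 66.8])

n_boot = 5000
boot_means = np.empty(n_boot)
for i in range(n_boot):
    # Resample with replacement from the single original sample
    resample = rng.choice(ferritin, size=ferritin.size, replace=True)
    boot_means[i] = resample.mean()

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"bootstrap 95% interval for the mean: {lo:.2f} - {hi:.2f} ng/ml")
```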

  20. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Heising, Carolyn D.

    1998-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach to plant maintenance and control, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar and R-charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are under control specification limits and four parameters are not in the state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (author)
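
    For reference, the sketch below computes Shewhart X-bar and R chart control limits for subgrouped data, using the standard chart constants for subgroups of five; the process data are simulated, not Ft. Calhoun measurements.

```python
import numpy as np

rng = np.random.default_rng(5)
# Simulated process data: 20 subgroups of 5 readings each
data = rng.normal(50.0, 2.0, size=(20, 5))

xbar = data.mean(axis=1)                      # subgroup means
ranges = data.max(axis=1) - data.min(axis=1)  # subgroup ranges
xbar_bar, r_bar = xbar.mean(), ranges.mean()

A2, D3, D4 = 0.577, 0.0, 2.114                # standard constants for n = 5

ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
print(f"X-bar chart: CL={xbar_bar:.2f}, UCL={ucl_x:.2f}, LCL={lcl_x:.2f}")
print(f"R chart:     CL={r_bar:.2f}, UCL={D4 * r_bar:.2f}, LCL={D3 * r_bar:.2f}")
print("out-of-control subgroups:", np.where((xbar > ucl_x) | (xbar < lcl_x))[0])
```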

  1. Statistical analysis of the Ft. Calhoun reactor coolant pump system

    International Nuclear Information System (INIS)

    Patel, Bimal; Heising, C.D.

    1997-01-01

    In engineering science, statistical quality control techniques have traditionally been applied to control manufacturing processes. An application to commercial nuclear power plant maintenance and control is presented that can greatly improve plant safety. As a demonstration of such an approach, a specific system is analyzed: the reactor coolant pumps (RCPs) of the Ft. Calhoun nuclear power plant. This research uses capability analysis, Shewhart X-bar, R charts, canonical correlation methods, and design of experiments to analyze the process for the state of statistical control. The results obtained show that six out of ten parameters are within control specification limits and four parameters are not in a state of statistical control. The analysis shows that statistical process control methods can be applied as an early warning system capable of identifying significant equipment problems well in advance of traditional control room alarm indicators. Such a system would provide operators with ample time to respond to possible emergency situations and thus improve plant safety and reliability. (Author)

  2. STATISTICAL ANALYSIS OF THE HEAVY NEUTRAL ATOMS MEASURED BY IBEX

    International Nuclear Information System (INIS)

    Park, Jeewoo; Kucharek, Harald; Möbius, Eberhard; Galli, André; Livadiotis, George; Fuselier, Steve A.; McComas, David J.

    2015-01-01

    We investigate the directional distribution of heavy neutral atoms in the heliosphere by using heavy neutral maps generated with the IBEX-Lo instrument over three years from 2009 to 2011. The interstellar neutral (ISN) O and Ne gas flow was found in the first-year heavy neutral map at 601 eV, and its flow direction and temperature were studied. However, due to the low counting statistics, researchers have not treated the full sky maps in detail. The main goal of this study is to evaluate the statistical significance of each pixel in the heavy neutral maps to get a better understanding of the directional distribution of heavy neutral atoms in the heliosphere. Here, we examine three statistical analysis methods: the signal-to-noise filter, the confidence limit method, and the cluster analysis method. These methods allow us to separate regions where the heavy neutral signal is statistically significant from the background. These methods also allow the consistent detection of heavy neutral atom structures. The main emission feature expands toward lower longitude and higher latitude from the observational peak of the ISN O and Ne gas flow. We call this emission the extended tail. It may be an imprint of the secondary oxygen atoms generated by charge exchange between ISN hydrogen atoms and oxygen ions in the outer heliosheath.

  3. Software for statistical data analysis used in Higgs searches

    International Nuclear Information System (INIS)

    Gumpert, Christian; Moneta, Lorenzo; Cranmer, Kyle; Kreiss, Sven; Verkerke, Wouter

    2014-01-01

    The analysis and interpretation of data collected by the Large Hadron Collider (LHC) requires advanced statistical tools in order to quantify the agreement between observation and theoretical models. RooStats is a project providing a statistical framework for data analysis with the focus on discoveries, confidence intervals and combination of different measurements in both Bayesian and frequentist approaches. It employs the RooFit data modelling language where mathematical concepts such as variables, (probability density) functions and integrals are represented as C++ objects. RooStats and RooFit rely on the persistency technology of the ROOT framework. The usage of a common data format enables the concept of digital publishing of complicated likelihood functions. The statistical tools have been developed in close collaboration with the LHC experiments to ensure their applicability to real-life use cases. Numerous physics results have been produced using the RooStats tools, with the discovery of the Higgs boson by the ATLAS and CMS experiments being certainly the most prominent among them. We will discuss tools currently used by LHC experiments to set exclusion limits, to derive confidence intervals and to estimate discovery significances based on frequentist statistics and the asymptotic behaviour of likelihood functions. Furthermore, new developments in RooStats and the performance optimisation necessary to cope with complex models depending on more than 1000 variables will be reviewed.
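
    As a standalone illustration of the asymptotic approach mentioned above (not the RooStats API itself), the sketch below computes the median discovery significance of a simple counting experiment using the familiar asymptotic approximation; the event counts are invented.

```python
import math

def asymptotic_significance(s: float, b: float) -> float:
    """Median discovery significance for a counting experiment with
    expected signal s over expected background b, using the asymptotic
    (Asimov) approximation to the profile likelihood ratio."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Example: 10 expected signal events over 50 expected background events
print(f"Z = {asymptotic_significance(10.0, 50.0):.2f} sigma")
```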

  4. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    Science.gov (United States)

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  5. A statistical test for outlier identification in data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Morteza Khodabin

    2010-09-01

    Full Text Available In the use of peer group data to assess individual, typical or best practice performance, the effective detection of outliers is critical for achieving useful results. In these "deterministic" frontier models, statistical theory is now mostly available. This paper deals with the statistical pared-sample method and its capability of detecting outliers in data envelopment analysis. In the presented method, each observation is deleted from the sample once and the resulting linear program is solved, leading to a distribution of efficiency estimates. Based on the achieved distribution, a pared test is designed to identify the potential outlier(s). We illustrate the method through a real data set. The method could be used in a first step, as an exploratory data analysis, before using any frontier estimation.

  6. Statistical analysis of the determinations of the Sun's Galactocentric distance

    Science.gov (United States)

    Malkin, Zinovy

    2013-02-01

    Based on several tens of R0 measurements made during the past two decades, several studies have been performed to derive the best estimate of R0. Some used just simple averaging to derive a result, whereas others provided comprehensive analyses of possible errors in published results. In either case, detailed statistical analyses of the data used were not performed. However, a computation of the best estimates of the Galactic rotation constants is not only an astronomical but also a metrological task. Here we perform an analysis of 53 R0 measurements (published in the past 20 years) to assess the consistency of the data. Our analysis shows that they are internally consistent. It is also shown that any trend in the R0 estimates from the last 20 years is statistically negligible, which renders the presence of a bandwagon effect doubtful. On the other hand, the formal errors in the published R0 estimates improve significantly with time.
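
    A sketch of the metrological core of such a study: an inverse-variance weighted mean of R0 values and a weighted linear trend versus publication year, on synthetic measurements (the real 53-measurement data set is not reproduced here).

```python
import numpy as np

# Hedged sketch: weighted mean and weighted trend of R0 measurements.
rng = np.random.default_rng(7)
years = np.arange(1993, 2013)
r0 = 8.2 + rng.normal(0, 0.35, years.size)    # kpc, illustrative values
err = rng.uniform(0.15, 0.6, years.size)      # formal errors, illustrative

w = 1.0 / err**2
r0_mean = np.sum(w * r0) / np.sum(w)
r0_mean_err = 1.0 / np.sqrt(np.sum(w))
print(f"weighted mean R0 = {r0_mean:.2f} +/- {r0_mean_err:.2f} kpc")

# Weighted least-squares slope; |slope| well below its error
# suggests no statistically significant trend.
W = np.diag(w)
A = np.column_stack([np.ones_like(years, dtype=float),
                     years - years.mean()])
cov = np.linalg.inv(A.T @ W @ A)
slope = (cov @ A.T @ W @ r0)[1]
print(f"trend = {slope:+.4f} kpc/yr (sigma = {np.sqrt(cov[1, 1]):.4f})")
```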

  7. Statistical analysis of first period of operation of FTU Tokamak

    International Nuclear Information System (INIS)

    Crisanti, F.; Apruzzese, G.; Frigione, D.; Kroegler, H.; Lovisetto, L.; Mazzitelli, G.; Podda, S.

    1996-09-01

    On the FTU Tokamak, plasma physics operations started on 20/4/90. The first plasma had a plasma current Ip=0.75 MA for about a second. The experimental phase lasted until 7/7/94, when a long shut-down began for installing the toroidal limiter on the inner side of the vacuum vessel. In these four years of operation, plasma experiments were successfully carried out, e.g. single and multiple pellet injection experiments; full current drive up to Ip=300 kA was obtained by using waves at the Lower Hybrid frequency; and analysis of ohmic plasma parameters with different materials (from low-Z silicon to high-Z tungsten) as the plasma-facing element was performed. In this work a statistical analysis of the full period of operation is presented. Moreover, a comparison with statistical data from other Tokamaks is attempted.

  8. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins observed, however, on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, focus is put on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels as well as methods for data preprocessing are covered.

  9. Statistical analysis of RHIC beam position monitors performance

    Science.gov (United States)

    Calaga, R.; Tomás, R.

    2004-04-01

    A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
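
    A minimal sketch of the SVD idea on synthetic turn-by-turn data: coherent betatron motion is captured by a couple of singular modes, so a BPM whose column is poorly reconstructed from those modes stands out as suspect. The tune, noise level and faulty channel below are invented for illustration.

```python
import numpy as np

# Hedged sketch: flag a malfunctioning BPM from turn-by-turn data
# using singular value decomposition.
rng = np.random.default_rng(3)
n_turns, n_bpm = 1024, 24

turns = np.arange(n_turns)
tune = 0.22                                   # hypothetical fractional tune
phases = rng.uniform(0, 2 * np.pi, n_bpm)     # BPM-to-BPM phase advances
data = np.sin(2 * np.pi * tune * turns[:, None] + phases[None, :])
data += 0.02 * rng.normal(size=data.shape)    # measurement noise
data[:, 7] = rng.normal(size=n_turns)         # BPM 7 returns pure noise

centered = data - data.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
k = 2                                         # a sine/cosine pair spans the motion
recon = U[:, :k] @ np.diag(S[:k]) @ Vt[:k]
resid = np.linalg.norm(centered - recon, axis=0)
print("suspect BPM index:", int(np.argmax(resid)))   # expect 7
```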

  10. Statistical analysis of RHIC beam position monitors performance

    Directory of Open Access Journals (Sweden)

    R. Calaga

    2004-04-01

    Full Text Available A detailed statistical analysis of beam position monitor (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.

  11. Analysis of Millstone Unit 1 system failure and maintenance data

    International Nuclear Information System (INIS)

    Bickel, J.H.; Beveridge, R.L.; Jain, N.K.; Owens, D.B.; Radder, J.A.

    1985-01-01

    As a result of a task force plan developed four years ago at Northeast Utilities, plant-specific probabilistic safety analysis models are being developed for all Northeast Utilities operating nuclear plants. An essential feature of these models is their reliance on plant-specific reliability information to the maximum extent possible. This assures that future design efforts and decisions on backfitting or procedure changes are made with full knowledge of existing plant reliability. The use of plant-specific reliability data assures that the impacts of problem components are given appropriate attention and that proper credit is given for those components which, because of plant-specific maintenance practices, have exhibited better-than-industry-average performance. A case study of a portion of the Millstone-1 cooling system demonstrates the differing results obtained by fault tree analysis and a reliability analysis using plant-specific failure data. When risk assessment techniques are being applied in resource allocation, the use of plant data clearly becomes essential for sound decision making.

  12. Common pitfalls in statistical analysis: Odds versus risk

    Science.gov (United States)

    Ranganathan, Priya; Aggarwal, Rakesh; Pramesh, C. S.

    2015-01-01

    In biomedical research, we are often interested in quantifying the relationship between an exposure and an outcome. “Odds” and “Risk” are the most common terms which are used as measures of association between variables. In this article, which is the fourth in the series of common pitfalls in statistical analysis, we explain the meaning of risk and odds and the difference between the two. PMID:26623395
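
    A small worked example of the distinction, with invented counts: as the outcome becomes common, the odds ratio drifts away from the risk ratio.

```python
# Hedged sketch: risk ratio versus odds ratio from a 2x2 table
# (counts are illustrative, not from the article).
exposed   = {"event": 30, "no_event": 70}
unexposed = {"event": 10, "no_event": 90}

risk_exp = exposed["event"] / (exposed["event"] + exposed["no_event"])
risk_une = unexposed["event"] / (unexposed["event"] + unexposed["no_event"])

risk_ratio = risk_exp / risk_une
odds_ratio = (exposed["event"] / exposed["no_event"]) / (
    unexposed["event"] / unexposed["no_event"])

print(f"risk ratio = {risk_ratio:.2f}")   # 3.00
print(f"odds ratio = {odds_ratio:.2f}")   # 3.86: diverges as events get common
```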

  13. Statistical and machine learning approaches for network analysis

    CERN Document Server

    Dehmer, Matthias

    2012-01-01

    Explore the multidisciplinary nature of complex networks through machine learning techniques Statistical and Machine Learning Approaches for Network Analysis provides an accessible framework for structurally analyzing graphs by bringing together known and novel approaches on graph classes and graph measures for classification. By providing different approaches based on experimental data, the book uniquely sets itself apart from the current literature by exploring the application of machine learning techniques to various types of complex networks. Comprised of chapters written by internation

  14. Statistical Challenges of Big Data Analysis in Medicine

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    2015-01-01

    Roč. 3, č. 1 (2015), s. 24-27 ISSN 1805-8698 R&D Projects: GA ČR GA13-23940S Grant - others:CESNET Development Fund(CZ) 494/2013 Institutional support: RVO:67985807 Keywords : big data * variable selection * classification * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research http://www.ijbh.org/ijbh2015-1.pdf

  15. Research and Development on Food Nutrition Statistical Analysis Software System

    OpenAIRE

    Du Li; Ke Yun

    2013-01-01

    Designing and developing a set of food nutrition component statistical analysis software can realize the automation of nutrition calculation, improve nutrition professionals' working efficiency and achieve the informatization of nutrition publicity and education. In the software development process, software engineering methods and database technology are used to calculate the human daily nutritional intake, and an intelligent system is used to evaluate the user's hea...

  16. Maximum Likelihood, Consistency and Data Envelopment Analysis: A Statistical Foundation

    OpenAIRE

    Rajiv D. Banker

    1993-01-01

    This paper provides a formal statistical basis for the efficiency evaluation techniques of data envelopment analysis (DEA). DEA estimators of the best practice monotone increasing and concave production function are shown to be also maximum likelihood estimators if the deviation of actual output from the efficient output is regarded as a stochastic variable with a monotone decreasing probability density function. While the best practice frontier estimator is biased below the theoretical front...

  17. Lifetime statistics of quantum chaos studied by a multiscale analysis

    KAUST Repository

    Di Falco, A.

    2012-04-30

    In a series of pump and probe experiments, we study the lifetime statistics of a quantum chaotic resonator when the number of open channels is greater than one. Our design embeds a stadium billiard into a two dimensional photonic crystal realized on a silicon-on-insulator substrate. We calculate resonances through a multiscale procedure that combines energy landscape analysis and wavelet transforms. Experimental data is found to follow the universal predictions arising from random matrix theory with an excellent level of agreement.

  18. Statistical Analysis of the Exchange Rate of Bitcoin

    Science.gov (United States)

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate. PMID:26222702
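
    A sketch of the fitting-and-ranking exercise on synthetic log-returns; the generalized hyperbolic distribution is not available in scipy, so a few common heavy-tailed candidates stand in here, ranked by AIC.

```python
import numpy as np
from scipy import stats

# Hedged sketch: fit candidate distributions to log-returns and rank
# them by AIC; the data are synthetic, not the Bitcoin series.
rng = np.random.default_rng(4)
log_returns = stats.t.rvs(df=3, scale=0.03, size=2000, random_state=rng)

for dist in (stats.norm, stats.t, stats.laplace):
    params = dist.fit(log_returns)
    loglik = np.sum(dist.logpdf(log_returns, *params))
    aic = 2 * len(params) - 2 * loglik
    print(f"{dist.name:8s} AIC = {aic:.1f}")
```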

  19. Statistical Analysis of the Exchange Rate of Bitcoin.

    Directory of Open Access Journals (Sweden)

    Jeffrey Chu

    Full Text Available Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.

  20. Statistical Analysis of the Exchange Rate of Bitcoin.

    Science.gov (United States)

    Chu, Jeffrey; Nadarajah, Saralees; Chan, Stephen

    2015-01-01

    Bitcoin, the first electronic payment system, is becoming a popular currency. We provide a statistical analysis of the log-returns of the exchange rate of Bitcoin versus the United States Dollar. Fifteen of the most popular parametric distributions in finance are fitted to the log-returns. The generalized hyperbolic distribution is shown to give the best fit. Predictions are given for future values of the exchange rate.

  1. Analysis of spectral data with rare events statistics

    International Nuclear Information System (INIS)

    Ilyushchenko, V.I.; Chernov, N.I.

    1990-01-01

    We consider the case of analyzing experimental data when the results of individual experimental runs cannot be summed due to large systematic errors. A statistical analysis of the hypothesis about persistent peaks in the spectra has been performed by means of the Neyman-Pearson test. The computations demonstrate that the confidence level for the hypothesis about the presence of a persistent peak in the spectrum is proportional to the square root of the number of independent experimental runs, K. 5 refs

  2. Australasian Resuscitation In Sepsis Evaluation trial statistical analysis plan.

    Science.gov (United States)

    Delaney, Anthony; Peake, Sandra L; Bellomo, Rinaldo; Cameron, Peter; Holdgate, Anna; Howe, Belinda; Higgins, Alisa; Presneill, Jeffrey; Webb, Steve

    2013-10-01

    The Australasian Resuscitation In Sepsis Evaluation (ARISE) study is an international, multicentre, randomised, controlled trial designed to evaluate the effectiveness of early goal-directed therapy compared with standard care for patients presenting to the ED with severe sepsis. In keeping with current practice, and taking into considerations aspects of trial design and reporting specific to non-pharmacologic interventions, this document outlines the principles and methods for analysing and reporting the trial results. The document is prepared prior to completion of recruitment into the ARISE study, without knowledge of the results of the interim analysis conducted by the data safety and monitoring committee and prior to completion of the two related international studies. The statistical analysis plan was designed by the ARISE chief investigators, and reviewed and approved by the ARISE steering committee. The data collected by the research team as specified in the study protocol, and detailed in the study case report form were reviewed. Information related to baseline characteristics, characteristics of delivery of the trial interventions, details of resuscitation and other related therapies, and other relevant data are described with appropriate comparisons between groups. The primary, secondary and tertiary outcomes for the study are defined, with description of the planned statistical analyses. A statistical analysis plan was developed, along with a trial profile, mock-up tables and figures. A plan for presenting baseline characteristics, microbiological and antibiotic therapy, details of the interventions, processes of care and concomitant therapies, along with adverse events are described. The primary, secondary and tertiary outcomes are described along with identification of subgroups to be analysed. A statistical analysis plan for the ARISE study has been developed, and is available in the public domain, prior to the completion of recruitment into the

  3. Precision Statistical Analysis of Images Based on Brightness Distribution

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2017-07-01

    Full Text Available The study of image content is an important topic through which reasonable and accurate image analyses are produced. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life; these media, crowded with images, have brought the research area of image analysis to prominence. In this paper, the implemented system proceeds through several steps to compute the statistical measures of standard deviation and mean for both color and grey images, and its last step compares the results obtained in the different cases of the test phase. The statistical parameters are implemented to characterize the content of an image and its texture. Standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean value via the implementation of the system. The major issue addressed in the work is brightness distribution, studied via statistical measures under different types of lighting.
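
    A minimal sketch of such brightness statistics on a synthetic array standing in for a loaded image: per-channel mean and standard deviation, plus a channel correlation.

```python
import numpy as np

# Hedged sketch: simple brightness-distribution descriptors of an
# image; a random array stands in for a loaded RGB image.
rng = np.random.default_rng(5)
image = rng.integers(0, 256, size=(480, 640, 3), dtype=np.uint8)

grey = image.mean(axis=2)                     # simple luminance proxy
print(f"grey mean/std: {grey.mean():.1f} / {grey.std():.1f}")
for c, name in enumerate(("R", "G", "B")):
    ch = image[..., c].astype(float)
    print(f"{name} mean/std: {ch.mean():.1f} / {ch.std():.1f}")

# Correlation between channels, as one way to study the
# intensity distribution
r = np.corrcoef(image[..., 0].ravel(), image[..., 1].ravel())[0, 1]
print("R-G correlation:", round(r, 3))
```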

  4. Propagated failure analysis for non-repairable systems considering both global and selective effects

    International Nuclear Information System (INIS)

    Wang Chaonan; Xing Liudong; Levitin, Gregory

    2012-01-01

    This paper proposes an algorithm for the reliability analysis of non-repairable binary systems subject to competing failure propagation and failure isolation events with both global and selective failure effects. A propagated failure that originates from a system component causes extensive damage to the rest of the system. Global effect happens when the propagated failure causes the entire system to fail; whereas selective effect happens when the propagated failure causes only failure of a subset of system components. In both cases, the failure propagation that originates from some system components (referred to as dependent components) can be isolated because of functional dependence between the dependent components and a component that prevents the failure propagation (the trigger component) when the failure of the trigger component happens before the occurrence of the propagated failure. Most existing studies focus on the analysis of propagated failures with global effect. However, in many cases, propagated failures affect only a subset of system components, not the entire system. Existing approaches for analyzing propagated failures with selective effect are limited to series-parallel systems. This paper proposes a combinatorial method for propagated failure analysis considering both global and selective effects as well as the competition with failure isolation in the time domain. The proposed method is not limited to series-parallel systems and has no limitation on the type of time-to-failure distributions for the system components. The method is verified using the Markov-based method. An example of computer memory systems is analyzed to demonstrate the application of the proposed method.

  5. Bayesian analysis of repairable systems showing a bounded failure intensity

    International Nuclear Information System (INIS)

    Guida, Maurizio; Pulcini, Gianpaolo

    2006-01-01

    The failure pattern of repairable mechanical equipment subject to deterioration phenomena sometimes shows a finite bound for the increasing failure intensity. A non-homogeneous Poisson process with bounded increasing failure intensity is then illustrated and its characteristics are discussed. A Bayesian procedure, based on prior information on model-free quantities, is developed in order to allow technical information on the failure process to be incorporated into the inferential procedure and to improve the inference accuracy. Posterior estimation of the model-free quantities and of other quantities of interest (such as the optimal replacement interval) is provided, as well as prediction of the waiting time to the next failure and of the number of failures in a future time interval. Finally, numerical examples are given to illustrate the proposed inferential procedure.
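
    To make the bounded-intensity notion concrete, the sketch below simulates a non-homogeneous Poisson process whose intensity rises toward a finite bound, using one plausible parametric form and Lewis-Shedler thinning; the intensity form and parameter values are illustrative, not those of the paper.

```python
import numpy as np

# Hedged sketch: simulate an NHPP with a bounded increasing intensity,
# lambda(t) = a * (1 - exp(-t / b)), via thinning against the bound a.
rng = np.random.default_rng(6)
a, b, horizon = 0.5, 100.0, 1000.0     # a = intensity bound (failures/time)

def intensity(t):
    return a * (1.0 - np.exp(-t / b))

events, t = [], 0.0
while True:
    t += rng.exponential(1.0 / a)          # candidate from bounding rate a
    if t > horizon:
        break
    if rng.uniform() < intensity(t) / a:   # accept with prob lambda(t)/a
        events.append(t)

expected = a * (horizon - b * (1.0 - np.exp(-horizon / b)))
print(f"{len(events)} failures; expected about {expected:.0f}")
```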

  6. Analysis of acetal toilet fill valve supply line nut failure

    Directory of Open Access Journals (Sweden)

    Anthony Timpanaro

    2017-10-01

    Full Text Available In recent years, there has been a rise in the number of product liability cases involving the failure of toilet water supply line acetal plastic nuts. These nuts can fail in service, causing water leaks that result in significant property and financial losses. This study examines three possible failure modes of acetal plastic toilet water supply nuts. The three failure modes tested were all due to overload failure of the acetal nut and are as follows: (1) overtightening of the supply line acetal nut, (2) supply line lateral pull, and (3) embrittled supply line lateral pull. Additionally, a “hand-tight” torque survey was conducted. The fracture surfaces and characteristics of these failure tests were examined with stereo microscopy and scanning electron microscopy (SEM). The failure modes were compared and contrasted to provide guidance in determining cause in these investigations.

  7. Observations on analysis, testing and failure of prestressed concrete containments

    International Nuclear Information System (INIS)

    Murray, D.W.

    1984-01-01

    The paper reviews the mechanics which indicate that a bursting failure with large energy release is the failure mechanism to be expected from ductile lined containment structures pressurized to failure. It reviews a study which shows that, because of leakage, this is not the case for unlined prestressed containments. It argues that current practice, since it does not specifically address the bursting failure problem for lined prestressed containments, is inadequate to ensure that this type of failure could not occur. It concludes that, in view of the inadequacy of the current state-of-the-art to predict leakage from lined structures, the logical remedy is to eliminate all possibility of bursting failure by making provision for venting of containments. (orig.)

  8. Failure mode and effects analysis: A community practice perspective.

    Science.gov (United States)

    Schuller, Bradley W; Burns, Angi; Ceilley, Elizabeth A; King, Alan; LeTourneau, Joan; Markovic, Alexander; Sterkel, Lynda; Taplin, Brigid; Wanner, Jennifer; Albert, Jeffrey M

    2017-11-01

    To report our early experiences with failure mode and effects analysis (FMEA) in a community practice setting. The FMEA facilitator received extensive training at the AAPM Summer School. Early efforts focused on department education and emphasized the need for process evaluation in the context of high profile radiation therapy accidents. A multidisciplinary team was assembled with representation from each of the major department disciplines. Stereotactic radiosurgery (SRS) was identified as the most appropriate treatment technique for the first FMEA evaluation, as it is largely self-contained and has the potential to produce high impact failure modes. Process mapping was completed using breakout sessions, and then compiled into a simple electronic format. Weekly sessions were used to complete the FMEA evaluation. Risk priority number (RPN) values > 100 or severity scores of 9 or 10 were considered high risk. The overall time commitment was also tracked. The final SRS process map contained 15 major process steps and 183 subprocess steps. Splitting the process map into individual assignments was a successful strategy for our group. The process map was designed to contain enough detail such that another radiation oncology team would be able to perform our procedures. Continuous facilitator involvement helped maintain consistent scoring during FMEA. Practice changes were made responding to the highest RPN scores, and new resulting RPN scores were below our high-risk threshold. The estimated person-hour equivalent for project completion was 258 hr. This report provides important details on the initial steps we took to complete our first FMEA, providing guidance for community practices seeking to incorporate this process into their quality assurance (QA) program. Determining the feasibility of implementing complex QA processes into different practice settings will take on increasing significance as the field of radiation oncology transitions into the new TG-100 QA
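
    A toy illustration of the RPN bookkeeping described above, with invented severity/occurrence/detectability scores and the same high-risk rule (RPN > 100 or severity 9-10).

```python
# Hedged sketch: risk priority numbers for a few illustrative SRS
# failure modes; descriptions and scores are invented, not the
# practice's actual data.
failure_modes = [
    # (description, severity, occurrence, detectability), each 1-10
    ("wrong isocenter coordinates transferred", 9, 3, 5),
    ("incorrect collimator/cone mounted",        8, 2, 3),
    ("image registration not reviewed",          7, 4, 4),
]

for desc, sev, occ, det in failure_modes:
    rpn = sev * occ * det
    flag = " <-- high risk" if (rpn > 100 or sev >= 9) else ""
    print(f"RPN={rpn:4d}  {desc}{flag}")
```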

  9. Physicochemical characterization and failure analysis of military coating systems

    Science.gov (United States)

    Keene, Lionel Thomas

    Modern military coating systems, as fielded by all branches of the U.S. military, generally consist of a diverse array of organic and inorganic components that can complicate their physicochemical analysis. These coating systems consist of a VOC-solvent/waterborne automotive-grade polyurethane matrix containing a variety of inorganic pigments and flattening agents. The research presented here was designed to overcome the practical difficulties regarding the study of such systems through the combined application of several cross-disciplinary techniques, including vibrational spectroscopy, electron microscopy, microtomy, ultra-fast laser ablation and optical interferometry. The goal of this research has been to determine the degree and spatial progression of weathering-induced alteration of military coating systems as a whole, as well as to determine the failure modes involved and characterize the impact of these failures on the physical barrier performance of the coatings. Transmission-mode Fourier Transform Infrared (FTIR) spectroscopy has been applied to cross-sections of both baseline and artificially weathered samples to elucidate weathering-induced spatial gradients in the baseline chemistry of the coatings. A large discrepancy in physical durability (as indicated by the spatial progression of these gradients) has been found between older- and newer-generation coatings. Data will be shown implicating silica fillers (previously considered inert) as the probable cause for this behavioral divergence. A case study is presented wherein the application of the aforementioned FTIR technique fails to predict the durability of the coating system as a whole. The exploitation of the ultra-fast optical phenomenon of femtosecond (10⁻¹⁵ s) laser ablation is studied as a potential tool to facilitate spectroscopic depth profiling of composite materials. Finally, the interferometric technique of phase shifting was evaluated as a potential high-sensitivity technique applied to the

  10. Data analysis using the Binomial Failure Rate common cause model

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1983-09-01

    This report explains how to use the Binomial Failure Rate (BFR) method to estimate common cause failure rates. The entire method is described, beginning with the conceptual model, and covering practical issues of data preparation, treatment of variation in the failure rates, Bayesian estimation of the quantities of interest, checking the model assumptions for lack of fit to the data, and the ultimate application of the answers
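
    A highly simplified sketch of the BFR idea with invented shock data: each shock fails each of m components independently with probability p, so shock multiplicities are binomial and p can be estimated from the observed multiplicities. The full method's treatment of lethal shocks and unobserved zero-failure shocks is ignored here.

```python
import numpy as np
from scipy.stats import binom

# Simplified sketch of the Binomial Failure Rate idea: common-cause
# shocks hit an m-component group, and every component fails
# independently with probability p, so shock multiplicities are
# Binomial(m, p). Multiplicity data below are invented.
m = 4                                                 # components in group
multiplicities = np.array([1, 1, 2, 1, 3, 1, 2, 1])   # failures per shock

# Naive MLE of p: total component failures / total component-exposures
p_hat = multiplicities.sum() / (m * multiplicities.size)
print(f"p_hat = {p_hat:.3f}")

# Implied distribution of failure multiplicity per shock
for k in range(1, m + 1):
    print(f"P({k} of {m} fail | shock) = {binom.pmf(k, m, p_hat):.3f}")
```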

  11. Analysis and prevention of human failure in nuclear power plants

    International Nuclear Information System (INIS)

    Liu Xinshuan

    2001-01-01

    Based on the performance of the Daya Bay Nuclear Power Plant and common experience from the world nuclear industry, the features and usual kinds of human failures in nuclear power plants are highlighted, and the prominent personal, external and decision-related factors that might cause human failures are analyzed. Effective preventive measures are proposed for each. Some successful human-failure-prevention practices applied at the Daya Bay Nuclear Power Plant are illustrated specifically.

  12. SAS and R data management, statistical analysis, and graphics

    CERN Document Server

    Kleinman, Ken

    2009-01-01

    An All-in-One Resource for Using SAS and R to Carry out Common TasksProvides a path between languages that is easier than reading complete documentationSAS and R: Data Management, Statistical Analysis, and Graphics presents an easy way to learn how to perform an analytical task in both SAS and R, without having to navigate through the extensive, idiosyncratic, and sometimes unwieldy software documentation. The book covers many common tasks, such as data management, descriptive summaries, inferential procedures, regression analysis, and the creation of graphics, along with more complex applicat

  13. Neutron activation and statistical analysis of pottery from Thera, Greece

    International Nuclear Information System (INIS)

    Kilikoglou, V.; Grimanis, A.P.; Karayannis, M.I.

    1990-01-01

    Neutron activation analysis, in combination with multivariate analysis of the generated data, was used for the chemical characterization of prehistoric pottery from the Greek islands of Thera, Melos (islands with similar geology) and Crete. The statistical procedure which proved that Theran pottery could be distinguished from Melian is described. This discrimination, attained for the first time, was mainly based on the concentrations of the trace elements Sm, Yb, Lu and Cr. Also, Cretan imports to both Thera and Melos were clearly separable from local products. (author) 22 refs.; 1 fig.; 4 tabs

  14. Statistical Analysis of Hypercalcaemia Data related to Transferability

    DEFF Research Database (Denmark)

    Frølich, Anne; Nielsen, Bo Friis

    2005-01-01

    In this report we describe statistical analyses related to a study of hypercalcaemia carried out in the Copenhagen area in the ten-year period from 1984 to 1994. Results from the study have previously been published in a number of papers [3, 4, 5, 6, 7, 8, 9] and in various abstracts and posters at conferences during the late eighties and early nineties. In this report we give a more detailed description of many of the analyses and provide some new results, primarily from simultaneous studies of several databases.

  15. Analysis of Preference Data Using Intermediate Test Statistic Abstract

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-06-01

    West African Journal of Industrial and Academic Research, Vol. 7, No. 1, June 2013. Keywords: preference data, Friedman statistic, multinomial test statistic, intermediate test statistic.

  16. Common-Cause Failure Analysis in Event Assessment

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Kelly, D.L.

    2008-01-01

    This paper reviews the basic concepts of modeling common-cause failures (CCFs) in reliability and risk studies and then applies these concepts to the treatment of CCF in event assessment. The cases of a failed component (with and without shared CCF potential) and a component being unavailable due to preventive maintenance or testing are addressed. The treatment of two related failure modes (e.g. failure to start and failure to run) is a new feature of this paper, as is the treatment of asymmetry within a common-cause component group

  17. Statistical Analysis of 30 Years Rainfall Data: A Case Study

    Science.gov (United States)

    Arvind, G.; Ashok Kumar, P.; Girish Karthi, S.; Suribabu, C. R.

    2017-07-01

    Rainfall is a prime input for various engineering designs such as hydraulic structures, bridges and culverts, canals, storm water sewers and road drainage systems. A detailed statistical analysis of each region is essential to estimate the relevant input values for the design and analysis of engineering structures and also for crop planning. A rain gauge station located in Trichy district, where agriculture is the prime occupation, is selected for statistical analysis. Daily rainfall data for a period of 30 years are used to understand the normal, deficit, excess and seasonal rainfall of the selected circle headquarters. Further, the various plotting-position formulae available are used to evaluate the return periods of monthly, seasonal and annual rainfall. This analysis will provide useful information for water resources planners, farmers and urban engineers to assess the availability of water and create storage accordingly. The mean, standard deviation and coefficient of variation of monthly and annual rainfall were calculated to check the rainfall variability. From the calculated results, the rainfall pattern is found to be erratic. The best-fit probability distribution was identified based on the minimum deviation between actual and estimated values. The results and analysis paved the way to determining the proper onset and withdrawal of the monsoon, information used for land preparation and sowing.
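
    A sketch of the return-period calculation using the Weibull plotting-position formula T = (n + 1) / m on a synthetic 30-year annual series; the rainfall values are invented.

```python
import numpy as np

# Hedged sketch: return periods of annual rainfall totals via the
# Weibull plotting position, with m the rank in descending order.
rng = np.random.default_rng(8)
annual = rng.gamma(shape=20.0, scale=45.0, size=30)   # synthetic mm totals

order = np.argsort(annual)[::-1]              # largest first
ranks = np.empty(annual.size, dtype=int)
ranks[order] = np.arange(1, annual.size + 1)
return_period = (annual.size + 1) / ranks

cv = annual.std(ddof=1) / annual.mean()       # variability check
print(f"mean = {annual.mean():.0f} mm, CV = {cv:.2f}")
for m in (1, 2, 5):
    idx = order[m - 1]
    print(f"rank {m}: {annual[idx]:.0f} mm, T = {return_period[idx]:.1f} yr")
```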

  18. Statistical investigations of the failure behaviour of components in the AVR experimental nuclear power plant. Vol. 2

    International Nuclear Information System (INIS)

    Meyna, A.; Mock, R.; Tietze, A.; Hennings, W.

    1989-08-01

    From operational records of the years 1977 to 1986, service life distributions of helium valves in gas circuits of the AVR were determined. The results are constant failure rates in the range of 3 to 6×10⁻⁶/h and, for some populations, indications of time-dependent failure rates. Nonparametric methods showed only limited efficiency. For a Bayesian approach, the necessary prior information was missing. Furthermore, the main failure causes could be determined. (orig./HP) [de

  19. Validation of statistical models for creep rupture by parametric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bolton, J., E-mail: john.bolton@uwclub.net [65, Fisher Ave., Rugby, Warks CV22 5HW (United Kingdom)

    2012-01-15

    Statistical analysis is an efficient method for the optimisation of any candidate mathematical model of creep rupture data, and for the comparative ranking of competing models. However, when a series of candidate models has been examined and the best of the series has been identified, there is no statistical criterion to determine whether a yet more accurate model might be devised. Hence there remains some uncertainty that the best of any series examined is sufficiently accurate to be considered reliable as a basis for extrapolation. This paper proposes that models should be validated primarily by parametric graphical comparison to rupture data and rupture gradient data. It proposes that no mathematical model should be considered reliable for extrapolation unless the visible divergence between model and data is so small as to leave no apparent scope for further reduction. This study is based on the data for a 12% Cr alloy steel used in BS PD6605:1998 to exemplify its recommended statistical analysis procedure. The models considered in this paper include a) a relatively simple model, b) the PD6605 recommended model and c) a more accurate model of somewhat greater complexity. Highlights: • The paper discusses the validation of creep rupture models derived from statistical analysis. • It demonstrates that models can be satisfactorily validated by a visual-graphic comparison of models to data. • The method proposed utilises test data both as conventional rupture stress and as rupture stress gradient. • The approach is shown to be more reliable than a well-established and widely used method (BS PD6605).

  20. Models and analysis for multivariate failure time data

    Science.gov (United States)

    Shih, Joanna Huang

    The goal of this research is to develop and investigate models and analytic methods for multivariate failure time data. We compare models in terms of direct modeling of the margins, flexibility of dependency structure, local vs. global measures of association, and ease of implementation. In particular, we study copula models, and models produced by right neutral cumulative hazard functions and right neutral hazard functions. We examine the changes of association over time for families of bivariate distributions induced from these models by displaying their density contour plots, conditional density plots, correlation curves of Doksum et al., and local cross ratios of Oakes. We know that bivariate distributions with the same margins might exhibit quite different dependency structures. In addition to modeling, we study estimation procedures. For copula models, we investigate three estimation procedures. The first procedure is full maximum likelihood. The second procedure is two-stage maximum likelihood: at stage 1, we estimate the parameters in the margins by maximizing the marginal likelihood; at stage 2, we estimate the dependency structure with the margins fixed at the estimated ones. The third procedure is two-stage partially parametric maximum likelihood. It is similar to the second procedure, but we estimate the margins by the Kaplan-Meier estimate. We derive asymptotic properties for these three estimation procedures and compare their efficiency by Monte-Carlo simulations and direct computations. For models produced by right neutral cumulative hazards and right neutral hazards, we derive the likelihood and investigate the properties of the maximum likelihood estimates. Finally, we develop goodness of fit tests for the dependency structure in the copula models. We derive a test statistic and its asymptotic properties based on the test of homogeneity of Zelterman and Chen (1988), and a graphical diagnostic procedure based on the empirical Bayes approach. We study the

  1. A NEW APPROACH TO DETECT CONGESTIVE HEART FAILURE USING DETRENDED FLUCTUATION ANALYSIS OF ELECTROCARDIOGRAM SIGNALS

    Directory of Open Access Journals (Sweden)

    CHANDRAKAR KAMATH

    2015-02-01

    Full Text Available The aim of this study is to evaluate how far the detrended fluctuation analysis (DFA) approach helps to characterize the short-term and intermediate-term fractal correlations in raw electrocardiogram (ECG) signals and thereby discriminate between normal and congestive heart failure (CHF) subjects. The DFA-1 calculations were performed on normal and CHF short-term ECG segments of the order of 20 seconds duration. Differences were found in the short-term and intermediate-term correlation properties and the corresponding scaling exponents of the two groups (normal and CHF). The statistical analyses show that the short-term fractal scaling exponent alone is sufficient to distinguish between normal and CHF subjects. The receiver operating characteristic (ROC) curve analysis confirms the robustness of this new approach and exhibits an average accuracy that exceeds 98.2%, average sensitivity of about 98.4%, positive predictivity of 98.00%, and average specificity of 98.00%.
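
    A compact DFA-1 sketch: integrate the mean-removed series, linearly detrend within windows of each size, and read the scaling exponent off the log-log slope of the fluctuation function. The signal below is synthetic noise, not ECG data.

```python
import numpy as np

def dfa1(x, scales):
    """Hedged sketch of first-order detrended fluctuation analysis:
    returns the fluctuation function F(n) for each window size n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        rms = []
        for i in range(n_win):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)     # linear detrend (DFA-1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        F.append(np.mean(rms))
    return np.array(F)

# Synthetic stand-in for a short ECG segment; alpha is the slope of
# log F(n) versus log n over the chosen scale range.
rng = np.random.default_rng(9)
signal = np.cumsum(rng.normal(size=5000))    # Brownian noise, alpha ~ 1.5
scales = np.unique(np.logspace(0.6, 2.3, 12).astype(int))
F = dfa1(signal, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"scaling exponent alpha = {alpha:.2f}")
```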

  2. Statistical analysis about corrosion in nuclear power plants

    International Nuclear Information System (INIS)

    Naquid G, C.; Medina F, A.; Zamora R, L.

    1999-01-01

    Nowadays, investigations are being carried out on the degradation mechanisms of structures, systems and components in nuclear power plants, since many of the processes involved determine their reliability, the integrity of their components, safety aspects and more. This work presents statistics of the studies related to materials corrosion, in its wide variety of specific mechanisms, that exist at the world level for PWR, BWR and WWER reactors, analysing the AIRS (Advanced Incident Reporting System) records for the period 1993-1998 for the first two plant types and for the period 1982-1995 for the WWER. The identification of factors allows them to be characterized: those which apply are events that occurred through the presence of some corrosion mechanism, while those which do not apply are incidents due to natural factors, mechanical failures and human errors. Finally, the total number of cases analysed corresponds to the sum of the cases which apply and do not apply. (Author)

  3. [Failure mode and effects analysis on computerized drug prescriptions].

    Science.gov (United States)

    Paredes-Atenciano, J A; Roldán-Aviña, J P; González-García, Mercedes; Blanco-Sánchez, M C; Pinto-Melero, M A; Pérez-Ramírez, C; Calvo Rubio-Burgos, Miguel; Osuna-Navarro, F J; Jurado-Carmona, A M

    2015-01-01

    To identify and analyze errors in the drug prescriptions of patients treated in a "high resolution" hospital by applying a failure mode and effects analysis (FMEA). Material and methods: A multidisciplinary group of medical specialties and nursing analyzed medical records in which drug prescriptions were held in free-text format. An FMEA was developed in which the risk priority index (RPI) was obtained from a cross-sectional observational study using an audit of the medical records, carried out in two phases: (1) pre-intervention testing, and (2) evaluation of improvement actions after the first analysis. An audit sample size of 679 medical records from a total of 2,096 patients was calculated using stratified sampling and random selection of clinical events. Prescription errors decreased by 22.2% in the second phase. FMEA showed a greater RPI for "unspecified route of administration" and "dosage unspecified", with no significant decreases observed in the second phase, although it did detect "incorrect dosing time", "contraindication due to drug allergy", "wrong patient" and "duplicate prescription", which resulted in the improvement of prescriptions. Drug prescription errors have been identified and analyzed by the FMEA methodology, improving the clinical safety of these prescriptions. This tool allows updates of electronic prescribing to be monitored. Avoiding such errors would require the mandatory completion of all sections of a prescription. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  4. Analysis Method of Common Cause Failure on Non-safety Digital Control System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yun Goo; Oh, Eun Gse [KHNP, Daejeon (Korea, Republic of)

    2014-08-15

    The effects of common cause failure on safety digital instrumentation and control systems have been considered in defense-in-depth analysis using safety analysis methods. However, the effects of common cause failure on non-safety digital instrumentation and control systems should also be evaluated, since common cause failure can be included among the credible failures of a non-safety system. In the I and C architecture of a nuclear power plant, many design features have been applied to preserve the functional integrity of the control system. One of these is segmentation, which limits the propagation of faults in the I and C architecture; some of the effects of common cause failure can likewise be limited by segmentation. Therefore, this paper considers two failure types: failures within one segmented control group, and failures across multiple control groups, since segmentation cannot defend against all effects of common cause failure. For each type, the worst failure scenario needs to be determined, and an analysis method for doing so is proposed in this paper. The evaluation can be qualitative when there is sufficient justification that the effects are bounded by the previous safety analysis. When they are not so bounded, additional analysis should be done, either with the conservative assumptions of the previous safety analysis method or with a best-estimate method using realistic assumptions.

  5. Multivariate statistical analysis a high-dimensional approach

    CERN Document Server

    Serdobolskii, V

    2000-01-01

    In the last few decades the accumulation of large amounts of information in numerous applications has stimulated an increased interest in multivariate analysis. Computer technologies allow one to use multi-dimensional and multi-parametric models successfully. At the same time, an interest arose in statistical analysis with a deficiency of sample data. Nevertheless, it is difficult to describe the recent state of affairs in applied multivariate methods as satisfactory. Unimprovable (dominating) statistical procedures are still unknown except for a few specific cases. The simplest problem of estimating the mean vector with minimum quadratic risk is unsolved, even for normal distributions. Commonly used standard linear multivariate procedures based on the inversion of sample covariance matrices can lead to unstable results or provide no solution in dependence of data. Programs included in standard statistical packages cannot process 'multi-collinear data' and there are no theoretical recommen...

  6. Statistical wind analysis for near-space applications

    Science.gov (United States)

    Roney, Jason A.

    2007-09-01

    Statistical wind models were developed based on the existing observational wind data for near-space altitudes between 60 000 and 100 000 ft (18–30 km) above ground level (AGL) at two locations, Akron, OH, USA, and White Sands, NM, USA. These two sites are envisioned as playing a crucial role in the first flights of high-altitude airships. The analysis shown in this paper has not previously been applied to this region of the stratosphere for such an application. Standard statistics such as the mean, median, maximum wind speed, and standard deviation were compiled for these data, and the data were modeled with Weibull distributions. These statistics indicated that, on a yearly average, there is a lull or a “knee” in the wind between 65 000 and 72 000 ft AGL (20–22 km). From the standard statistics, trends at both locations indicated substantial seasonal variation in the mean wind speed at these heights. The yearly and monthly statistical modeling indicated that Weibull distributions were a reasonable model for the data. Forecasts and hindcasts were done by using a Weibull model based on 2004 data and comparing the model with the 2003 and 2005 data. The 2004 distribution was also a reasonable model for these years. Lastly, the Weibull distribution and cumulative function were used to predict the 50%, 95%, and 99% winds, which are directly related to the expected power requirements of a near-space station-keeping airship. These values indicated that using only the standard deviation of the mean may underestimate the operational conditions.
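
    A sketch of the Weibull modelling step on synthetic wind speeds: fit the distribution and read off the 50%, 95% and 99% speeds discussed above. The data and parameters are invented, not the Akron or White Sands observations.

```python
import numpy as np
from scipy import stats

# Hedged sketch: fit a Weibull distribution to wind-speed samples and
# extract the percentile speeds relevant to station-keeping power.
rng = np.random.default_rng(10)
wind = stats.weibull_min.rvs(c=1.8, scale=12.0, size=1000, random_state=rng)

c, loc, scale = stats.weibull_min.fit(wind, floc=0)   # shape, loc, scale
for q in (0.50, 0.95, 0.99):
    v = stats.weibull_min.ppf(q, c, loc=loc, scale=scale)
    print(f"{int(q * 100)}% wind speed: {v:.1f} m/s")
```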

  7. Bayesian Sensitivity Analysis of Statistical Models with Missing Data.

    Science.gov (United States)

    Zhu, Hongtu; Ibrahim, Joseph G; Tang, Niansheng

    2014-04-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures.

  8. A big data analysis approach for rail failure risk assessment

    NARCIS (Netherlands)

    Jamshidi, A.; Faghih Roohi, S.; Hajizadeh, S.; Nunez Vicencio, Alfredo; Babuska, R.; Dollevoet, R.P.B.J.; Li, Z.; De Schutter, B.H.K.

    2017-01-01

    Railway infrastructure monitoring is a vital task to ensure rail transportation safety. A rail failure could result in not only a considerable impact on train delays and maintenance costs, but also on safety of passengers. In this article, the aim is to assess the risk of a rail failure by

  9. Analysis of transient fuel failure mechanisms: selected ANL programs

    International Nuclear Information System (INIS)

    Deitrich, L.W.

    1975-01-01

    Analytical programs at Argonne National Laboratory related to fuel pin failure mechanisms in fast-reactor accident transients are described. The studies include transient fuel pin mechanics, mechanics of unclad fuel, and mechanical effects concerning potential fuel failure propagation. (U.S.).

  10. Letter report seismic shutdown system failure mode and effect analysis

    International Nuclear Information System (INIS)

    KECK, R.D.

    1999-01-01

    The Supply Ventilation System Seismic Shutdown ensures that the 234-52 building supply fans, the dry air process fans and the vertical development calciner are shut down following a seismic event. This report evaluates the failure modes of the seismic shutdown system and determines their effects.

  11. STATISTICAL ANALYSIS OF SPORT MOVEMENT OBSERVATIONS: THE CASE OF ORIENTEERING

    Directory of Open Access Journals (Sweden)

    K. Amouzandeh

    2017-09-01

    Full Text Available Study of movement observations is becoming more popular in several applications. Particularly, analyzing sport movement time series has been considered as a demanding area. However, most of the attempts made on analyzing movement sport data have focused on spatial aspects of movement to extract some movement characteristics, such as spatial patterns and similarities. This paper proposes statistical analysis of sport movement observations, which refers to analyzing changes in the spatial movement attributes (e.g. distance, altitude and slope) and non-spatial movement attributes (e.g. speed and heart rate) of athletes. As the case study, an example dataset of movement observations acquired during the “orienteering” sport is presented and statistically analyzed.

  12. Noise removing in encrypted color images by statistical analysis

    Science.gov (United States)

    Islam, N.; Puech, W.

    2012-03-01

    Cryptographic techniques are used to secure confidential data from unauthorized access, but these techniques are very sensitive to noise: a single bit change in encrypted data can have a catastrophic impact on the decrypted data. This paper addresses the problem of removing bit errors in visual data encrypted using the AES algorithm in CBC mode. In order to remove the noise, a method is proposed that is based on statistical analysis of each block during decryption. The proposed method exploits local statistics of the visual data and the confusion/diffusion properties of the encryption algorithm to remove the errors. Experimental results show that the proposed method can be used at the receiving end as a possible solution for noise removal from visual data in the encrypted domain.

  13. Multivariate statistical pattern recognition system for reactor noise analysis

    International Nuclear Information System (INIS)

    Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.

    1976-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system

  14. Reactor noise analysis by statistical pattern recognition methods

    International Nuclear Information System (INIS)

    Howington, L.C.; Gonzalez, R.C.

    1976-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis is presented. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, updating, and data compacting capabilities. System design emphasizes control of the false-alarm rate. Its abilities to learn normal patterns, to recognize deviations from these patterns, and to reduce the dimensionality of data with minimum error were evaluated by experiments at the Oak Ridge National Laboratory (ORNL) High-Flux Isotope Reactor. Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the pattern recognition system.
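
    On controlling the false-alarm rate: one standard way to do this for a learned multivariate-normal pattern is to threshold the squared Mahalanobis distance of a new observation at a chi-squared quantile. A sketch under that Gaussian assumption (simulated data, not the ORNL system):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      normal_data = rng.normal(size=(5000, 5))  # stand-in for learned normal patterns

      mu = normal_data.mean(axis=0)
      cov_inv = np.linalg.inv(np.cov(normal_data, rowvar=False))

      # Under the Gaussian assumption, the squared Mahalanobis distance is
      # chi-squared with 5 dof; pick the quantile matching the target rate.
      threshold = stats.chi2.ppf(1 - 1e-3, df=5)  # 0.1% false-alarm rate

      def is_anomalous(x):
          d = x - mu
          return d @ cov_inv @ d > threshold

      alarms = sum(is_anomalous(x) for x in normal_data)
      print(f"observed false-alarm rate: {alarms / len(normal_data):.4f}")  # ~0.001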

  15. Multivariate statistical pattern recognition system for reactor noise analysis

    International Nuclear Information System (INIS)

    Gonzalez, R.C.; Howington, L.C.; Sides, W.H. Jr.; Kryter, R.C.

    1975-01-01

    A multivariate statistical pattern recognition system for reactor noise analysis was developed. The basis of the system is a transformation for decoupling correlated variables and algorithms for inferring probability density functions. The system is adaptable to a variety of statistical properties of the data, and it has learning, tracking, and updating capabilities. System design emphasizes control of the false-alarm rate. The ability of the system to learn normal patterns of reactor behavior and to recognize deviations from these patterns was evaluated by experiments at the ORNL High-Flux Isotope Reactor (HFIR). Power perturbations of less than 0.1 percent of the mean value in selected frequency ranges were detected by the system.

  16. Statistical Analysis of Sport Movement Observations: the Case of Orienteering

    Science.gov (United States)

    Amouzandeh, K.; Karimipour, F.

    2017-09-01

    Study of movement observations is becoming more popular in several applications. In particular, analyzing sport movement time series is considered a demanding area. However, most attempts at analyzing sport movement data have focused on spatial aspects of movement to extract movement characteristics such as spatial patterns and similarities. This paper proposes statistical analysis of sport movement observations, which refers to analyzing changes in the spatial movement attributes (e.g. distance, altitude and slope) and non-spatial movement attributes (e.g. speed and heart rate) of athletes. As the case study, an example dataset of movement observations acquired during the "orienteering" sport is presented and statistically analyzed.

  17. Statistical Mechanics Analysis of ATP Binding to a Multisubunit Enzyme

    International Nuclear Information System (INIS)

    Zhang Yun-Xin

    2014-01-01

    Due to inter-subunit communication, multisubunit enzymes usually hydrolyze ATP in a concerted fashion; however, the principle of this process remains poorly understood. In this study, a simple model is presented from the viewpoint of statistical mechanics. In this model, we assume that the binding of ATP changes the potential of the corresponding enzyme subunit, and that the degree of this change depends on the state of its adjacent subunits. The probability of the enzyme being in a given state follows the Boltzmann distribution. Although simple, this model fits recent experimental data on the chaperonin TRiC/CCT well, and the dominant state of TRiC/CCT can be obtained from it. This study provides a new way to understand biophysical processes through statistical mechanics analysis.
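
    A toy version of such a model (the ring size matches TRiC/CCT's eight subunits per ring, but the energies are made-up parameters, not fitted values): enumerate the binding configurations of a ring, assign each an energy with a nearest-neighbour coupling term, and weight by the Boltzmann factor.

      from itertools import product
      import math

      N = 8       # subunits per ring, as in TRiC/CCT
      eps = -1.0  # assumed binding energy per occupied subunit (units of kT)
      J = -0.5    # assumed coupling between adjacent occupied subunits

      def energy(state):
          """state is a tuple of 0/1 occupancy flags on a ring."""
          e = eps * sum(state)
          e += J * sum(state[i] * state[(i + 1) % N] for i in range(N))
          return e

      states = list(product((0, 1), repeat=N))
      weights = [math.exp(-energy(s)) for s in states]
      Z = sum(weights)  # partition function

      for k in range(N + 1):  # probability that exactly k ATP are bound
          p = sum(w for s, w in zip(states, weights) if sum(s) == k) / Z
          print(f"P({k} bound) = {p:.3f}")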

  18. Statistical methods for data analysis in particle physics

    CERN Document Server

    AUTHOR|(CDS)2070643

    2015-01-01

    This concise set of course-based notes provides the reader with the main concepts and tools to perform statistical analysis of experimental data, in particular in the field of high-energy physics (HEP). First, an introduction to probability theory and basic statistics is given, mainly as a reminder of advanced undergraduate studies, but also to clearly distinguish the frequentist and Bayesian approaches and interpretations in subsequent applications. More advanced concepts and applications are gradually introduced, culminating in the chapter on upper limits: many applications in HEP concern hypothesis testing, where the main goal is often to set ever tighter limits so as to eventually distinguish between competing hypotheses or rule some of them out altogether. Many worked examples help newcomers to the field and graduate students understand the pitfalls of applying theoretical concepts to actual data.
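
    As one concrete example of the upper-limit machinery such notes cover: the frequentist limit for a background-free Poisson counting experiment solves P(N <= n_obs | s_up) = 1 - CL for the signal mean s_up. A sketch of this standard textbook construction (not code from the notes):

      from scipy import optimize, stats

      def poisson_upper_limit(n_obs: int, cl: float = 0.95) -> float:
          """Smallest signal mean excluded at the given CL, no background."""
          return optimize.brentq(
              lambda s: stats.poisson.cdf(n_obs, s) - (1.0 - cl),
              1e-9, 100.0,
          )

      for n in range(4):
          print(f"n_obs = {n}: 95% CL upper limit = {poisson_upper_limit(n):.2f}")
      # n_obs = 0 reproduces the familiar limit of about 3.0 events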

  19. Statistical analysis of subjective preferences for video enhancement

    Science.gov (United States)

    Woods, Russell L.; Satgunam, PremNandhini; Bronstad, P. Matthew; Peli, Eli

    2010-02-01

    Measuring preferences for moving video quality is harder than for static images due to the fleeting and variable nature of moving video. Subjective preferences for image quality can be tested by having observers indicate their preference for one image over another. Such pairwise comparisons can be analyzed using Thurstone scaling (Thurstone, 1927; Farrell, 1999), which is widely used in applied psychology, marketing, food tasting, and advertising research. Thurstone analysis constructs an arbitrary perceptual scale for the items that are compared (e.g. enhancement levels), but it does not determine the statistical significance of the differences between items on that perceptual scale. Recent papers have provided inferential statistical methods that produce an outcome similar to Thurstone scaling (Lipovetsky and Conklin, 2004). Here, we demonstrate that binary logistic regression can analyze preferences for enhanced video.
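
    Schematically, that analysis can be set up by coding each pairwise comparison as a binary outcome and regressing on the difference in enhancement level. The data below are simulated for illustration and are not the paper's measurements:

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      level_a = rng.integers(0, 4, size=200)  # enhancement level of first clip
      level_b = rng.integers(0, 4, size=200)  # enhancement level of second clip
      keep = level_a != level_b               # drop identical pairs
      level_a, level_b = level_a[keep], level_b[keep]

      # Simulate observers who tend to prefer higher enhancement levels
      p = 1 / (1 + np.exp(-0.8 * (level_a - level_b)))
      prefer_a = (rng.random(p.size) < p).astype(float)

      # Binary logistic regression on the difference in enhancement level
      X = sm.add_constant(level_a - level_b)
      fit = sm.Logit(prefer_a, X).fit(disp=0)
      print(fit.params)  # a positive slope means higher levels are preferred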

  20. Statistical Analysis of Radio Propagation Channel in Ruins Environment

    Directory of Open Access Journals (Sweden)

    Jiao He

    2015-01-01

    Cellphone-based localization systems for search and rescue in complex, high-density ruins have attracted great interest in recent years, and the radio channel characteristics are critical for the design and development of such systems. This paper presents a spatial smoothing variant of ESPRIT (estimation of signal parameters via rotational invariance techniques), SS-ESPRIT, for radio channel characterization of high-density ruins. The radio propagation at three typical mobile communication bands (0.9, 1.8, and 2 GHz) is investigated in two different scenarios. Channel parameters, such as arrival times, delays, and complex amplitudes, are statistically analyzed. Furthermore, a channel simulator is built from these statistics. A comparison of average excess delay and delay spread shows good agreement between the measurements and the channel modeling results.
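
    The two validation statistics, mean excess delay and RMS delay spread, are the first and second moments of the power delay profile. A generic sketch with invented profile values:

      import numpy as np

      # Hypothetical power delay profile: path delays (ns) and linear powers
      delays = np.array([0.0, 30.0, 70.0, 120.0, 200.0])
      powers = np.array([1.0, 0.5, 0.25, 0.1, 0.05])

      mean_excess = np.sum(powers * delays) / np.sum(powers)
      second = np.sum(powers * delays**2) / np.sum(powers)
      rms_spread = np.sqrt(second - mean_excess**2)

      print(f"mean excess delay = {mean_excess:.1f} ns")
      print(f"RMS delay spread  = {rms_spread:.1f} ns")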