Brasselet, Romain; Johansson, Roland S; Arleo, Angelo
We set forth an information-theoretical measure to quantify neurotransmission reliability while taking into full account the metrical properties of the spike train space. This parametric information analysis relies on similarity measures induced by the metrical relations between neural responses as spikes flow in. Thus, in order to assess the entropy, the conditional entropy, and the overall information transfer, this method does not require any a priori decoding algorithm to partition the space into equivalence classes. It therefore allows the optimal parameters of a class of distances to be determined with respect to information transmission. To validate the proposed information-theoretical approach, we study precise temporal decoding of human somatosensory signals recorded using microneurography experiments. For this analysis, we employ a similarity measure based on the Victor-Purpura spike train metrics. We show that with appropriate parameters of this distance, the relative spike times of the mechanoreceptors' responses convey enough information to perform optimal discrimination--defined as maximum metrical information and zero conditional entropy--of 81 distinct stimuli within 40 ms of the first afferent spike. The proposed information-theoretical measure proves to be a suitable generalization of Shannon mutual information in order to consider the metrics of temporal codes explicitly. It allows neurotransmission reliability to be assessed in the presence of large spike train spaces (e.g., neural population codes) with high temporal precision.
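For context, the Victor-Purpura metric referenced above is defined independently of this paper: it is the minimum cost of transforming one spike train into the other, with unit cost for inserting or deleting a spike and cost q·|Δt| for shifting a spike in time. A minimal dynamic-programming sketch (our own illustration, not the authors' code; names are ours):

```python
def victor_purpura_distance(s1, s2, q):
    """Victor-Purpura spike train distance.

    Cost 1 to insert or delete a spike, q*|dt| to shift a spike by dt.
    s1, s2: sorted lists of spike times; q: cost per unit time shift.
    """
    n, m = len(s1), len(s2)
    # G[i][j] = distance between the first i spikes of s1 and first j of s2
    G = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        G[i][0] = float(i)  # delete i spikes
    for j in range(1, m + 1):
        G[0][j] = float(j)  # insert j spikes
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            G[i][j] = min(
                G[i - 1][j] + 1.0,  # delete spike i of s1
                G[i][j - 1] + 1.0,  # insert spike j of s2
                G[i - 1][j - 1] + q * abs(s1[i - 1] - s2[j - 1]),  # shift
            )
    return G[n][m]
```

With q = 0 the metric reduces to a spike-count distance; as q grows, precise spike timing dominates, which is the parameter regime the paper optimizes over.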
Hendrickson, D.W. [ed.]
This document provides a technical report on the operability, reliability, and maintenance of a plasma melter for low-level waste vitrification, in support of the Hanford Tank Waste Remediation System (TWRS) Low-Level Waste (LLW) Vitrification Program. A process description is provided that minimizes maintenance and downtime and includes material and energy balances, equipment sizes and arrangement, startup/operation/maintenance/shutdown cycle descriptions, and the basis for scale-up to a 200 metric ton/day production facility. Operational requirements are provided, including utilities, feeds, labor, and maintenance. Equipment reliability estimates and maintenance requirements are provided, which include a list of failure modes, responses, and consequences.
Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)
More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised even for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis, which investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis, which defines the relations between the cognitive goal and task steps. The third is the cognitive function analysis module, which identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible, from macroscopic information on the tasks to microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and likelihood of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)
Support,” Proc. SPIE 8050, 2011.  Linkov I, Welle P, Loney D, Tkachuk A, Canis L, Kim JB, Bridges T., “Use of multicriteria decision analysis to...categories of reliability and credibility. Reliability has traditionally been assessed for physical machines to support failure analysis . Source reliability...discussions, we detail an analysis of credibility and reliability. Information fusion consumers comprise users and machines of which the man-machine
Mcinroy, John E.; Saridis, George N.
Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques are proposed for finding the reliability corresponding to alternative subsets of control and sensing strategies, such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.
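In the simplest one-dimensional case, the reliability of a strategy under Gaussian state uncertainty reduces to the probability that the error stays within specification. A hedged sketch of that reduction (our own illustration, not the paper's formulation):

```python
import math

def success_probability(mu_err, sigma_err, tol):
    """P(|error| <= tol) for a Gaussian error ~ N(mu_err, sigma_err^2).

    Uses the normal CDF via math.erf; tol is the specification tolerance.
    """
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return phi((tol - mu_err) / sigma_err) - phi((-tol - mu_err) / sigma_err)
```

For example, an unbiased strategy with unit error standard deviation meets a 1.96-unit tolerance about 95% of the time; comparing such probabilities across candidate strategies is the selection criterion sketched in the abstract.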
Gintautas, Tomas; Sørensen, John Dalsgaard
Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of a methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Description of quantitative and qualitative measures (indicators) that can be used to assess the reliability of innovations and new technologies.
... is an electronic catalog of human genes and genetic disorders. The Web site was developed by the National Center for Biotechnology Information (NCBI), and contains text and reference information. ...
Pinar Pérez, Jesús María; García Márquez, Fausto Pedro; Tobias, Andrew Mark; Papaelias, Mayorkinos
Against the background of steadily increasing wind power generation worldwide, wind turbine manufacturers are continuing to develop a range of configurations with different combinations of pitch control, rotor speeds, gearboxes, generators and converters. This paper categorizes the main designs, focusing on their reliability by bringing together and comparing data from a selection of major studies in the literature. These are not particularly consistent but plotting failure rates against hour...
Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring; Curtis L. Smith; Rachel B. Shirley
The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and the extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermohydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators, and how the reliability of human performance or lack thereof (i.e., human errors) can affect risk-margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard
In order to minimise the total expected life-cycle costs of a wind turbine it is important to estimate the reliability level for all components in the wind turbine. This paper deals with reliability analysis for the tower and blades of onshore wind turbines placed in a wind farm. The limit states... the reliability level for a wind turbine placed in a wind farm is considered, and wake effects from neighbouring wind turbines are taken into account. An illustrative example with calculation of the reliability for mudline bending of the tower is considered. In the example the design is determined according...
Martinho, Margarida Suzel Lopes; da Costa Santos, Cristina Maria Nogueira; Silva Carvalho, João Luís Mendonça; Bernardes, João Francisco Montenegro Andrade Lima
Inter-observer agreement and reliability in hysteroscopic image assessment remain uncertain, and the factors that may influence them have only been studied in relation to the experience of hysteroscopists. We aim to assess the effect of clinical information and previous exam execution on observer agreement and reliability in the analysis of hysteroscopic video-recordings. Ninety hysteroscopies were video-recorded and randomized into a group without (Group 1) and a group with clinical information (Group 2). The videos were independently analyzed by three hysteroscopists regarding lesion location, dimension, and type, as well as the decision to perform a biopsy. One of the hysteroscopists had executed all the exams before. Proportions of agreement (PA) and kappa statistics (κ) with 95% confidence intervals (95% CI) were used. In Group 2, there was a higher proportion of normal diagnoses; previous execution of the exams before the analysis of the video-recordings did not significantly affect the results. With clinical information, agreement and reliability in the overall analysis of hysteroscopic video-recordings may reach almost perfect results, and this was not significantly affected by the execution of the exams before the analysis. However, there is still uncertainty in the analysis of specific endometrial cavity abnormalities.
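The kappa statistic used in this study is standard Cohen's kappa: observed agreement between two raters corrected for the agreement expected by chance. A minimal sketch (illustrative only; variable names are ours, not the study's analysis code):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same n items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected from the raters' marginals.
    """
    n = len(ratings_a)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
    return (p_o - p_e) / (1.0 - p_e)
```

Kappa of 1 indicates perfect agreement and 0 indicates chance-level agreement; "almost perfect" in the abstract conventionally refers to kappa above roughly 0.8.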
Full Text Available The Internet is used by many patients to obtain relevant medical information. We assessed the impact of "Google" searches on the knowledge of parents whose child suffered from squint. In 21 consecutive patients, the "Google" search improved the mean score of correct answers from 47% to 62%. We found that the "Google" search was a useful and reliable source of information for patients with regard to the disease's etiopathogenesis and the problems caused by the disease. The internet-based information, however, was incomplete and unreliable with regard to the disease's treatment.
David Gertman; Julie Marble; Steven Novack
Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood state of the art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede’s (1991) cultural factors and Davis’ (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.
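At its simplest, the cultural adjustment described here amounts to scaling a nominal human error probability (HEP) by culture-derived performance shaping factors. The sketch below is our own hypothetical illustration of that idea; the factor names and values are invented for illustration and are not taken from CAM:

```python
def adjusted_hep(nominal_hep, cultural_psfs):
    """Hypothetical sketch: scale a nominal HEP by multiplicative
    culture-derived performance shaping factors, capped at 1.0.

    cultural_psfs: dict of factor name -> multiplier (>1 degrades
    performance, <1 improves it). Factor names here are illustrative.
    """
    hep = nominal_hep
    for factor in cultural_psfs.values():
        hep *= factor
    return min(hep, 1.0)
```

For example, a hypothetical high power-distance multiplier of 2.0 combined with a technology-acceptance multiplier of 0.5 would leave a 0.01 nominal HEP unchanged; the point is only that country-specific attributes enter as explicit modifiers of the base estimate.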
Brombacher, A.C.; de Boer, H.A.; van 't Loo, J.
The authors focus on the development of systems with online optimized reliability. They argue that, in the case of online analysis, reliability analysis should have the same importance as functional analysis currently has, and that reliability should be integrated into a design. Furthermore, the function of...
Tekinerdogan, B.; Sözer, Hasan; Aksit, Mehmet
We propose a Software Architecture Reliability Analysis (SARA) approach that benefits from both reliability engineering and scenario-based software architecture analysis to provide an early reliability analysis of the software architecture. SARA makes use of failure scenarios that are prioritized
The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national Data Ban...
This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author criticizes historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to system performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion and tempers itself with rational debate over the weight given to subjective and empirical probabilities.
Lenormand, Maxime; Barthelemy, Marc; Ramasco, José J
An increasing number of human activities are studied using data produced by individuals' ICT devices. In particular, when ICT data contain spatial information, they represent an invaluable source for analyzing urban dynamics. However, there have been relatively few contributions investigating the robustness of this type of results against fluctuations of data characteristics. Here, we present a stability analysis of higher-level information extracted from mobile phone data passively produced during an entire year by 9 million individuals in Senegal. We focus on two information-retrieval tasks: (a) the identification of land use in the region of Dakar from the temporal rhythms of the communication activity; (b) the identification of home and work locations of anonymized individuals, which enables the construction of Origin-Destination (OD) matrices of commuting flows. Our analysis reveals that the uncertainty of the results highly depends on the sample size, the scale and the period of the year at which the data were gathe...
N A Kovyazina
Full Text Available The goal of the study was to demonstrate a multilevel laboratory quality management system and to describe methods of estimating the reliability and increasing the information content of laboratory results (using a laboratory case as an example). Results. The article examines the stages of laboratory quality management, which helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented with their expanded uncertainty and an evaluation of their dynamics. Conclusion. Compliance with mandatory measures of a laboratory quality management system enables laboratories to obtain reliable results and to calculate parameters that increase the information content of laboratory tests in clinical decision making.
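Expanded uncertainty, as used above, follows the GUM convention: combine the component standard uncertainties in quadrature to get the combined standard uncertainty, then multiply by a coverage factor k (k = 2 gives roughly 95% coverage for a normal distribution). A minimal sketch (our illustration, not the laboratory's procedure):

```python
import math

def expanded_uncertainty(std_uncertainties, k=2.0):
    """GUM-style expanded uncertainty.

    u_c = root-sum-square of independent component standard
    uncertainties; U = k * u_c, with coverage factor k.
    """
    u_c = math.sqrt(sum(u * u for u in std_uncertainties))
    return k * u_c
```

A result would then be reported as value ± U, e.g. with components 3 and 4 (same units) and k = 2, U = 10.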
Delgadillo, Lucy M.; Bushman, Brittani S.
Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…
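Cronbach's alpha, the procedure mentioned above, measures internal consistency by comparing the sum of per-item score variances with the variance of the total score. A minimal sketch (our own, not the study's analysis code):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: one inner list per item, each holding that item's scores
    for the same respondents, in the same order.
    """
    k = len(items)

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    total = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1.0 - sum(pvar(it) for it in items) / pvar(total))
```

Alpha near 1 indicates items within a domain move together; values above roughly 0.7 are conventionally treated as acceptable reliability.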
Daghigh, M.; Hengst, S.; Vrouwenvelder, A.C.W.M.; Boonstra, H.
In this paper, the concepts used to compute reliabilities under stationary and ergodic conditions in the presence of time-invariant, non-ergodic parameters will first be reviewed. Focus will be on numerical techniques like FORM and numerical integration. The effect of correlation between the environmental
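For the simplest limit state g = R − S with independent normal resistance R and load S, the FORM reliability index has a closed form, which makes a compact illustration of what such analyses compute. A hedged sketch (our illustration, not the paper's implementation):

```python
import math

def form_reliability(mu_r, sigma_r, mu_s, sigma_s):
    """Reliability index beta and failure probability for g = R - S,
    with R and S independent normal variables.

    beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2); Pf = Phi(-beta).
    """
    beta = (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)
    pf = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))  # standard normal Phi(-beta)
    return beta, pf
```

Nonlinear limit states and non-normal variables require the iterative Hasofer-Lind search and transformations that general FORM implementations provide; the closed form above covers only the linear normal case.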
Full Text Available The following article describes diagnostic and reliability aspects of amateur radio communications. The intention of precisely measuring the quality and accuracy of every contact has been present in ham radio from the very beginning. Reports were sent and are still used today, in many variants. In time, tools to measure ionosphere state and propagation conditions emerged. These are fundamental for communication, especially over long distances. The article concentrates on a description of the digital amateur mode JT9. This mode is very effective, enables worldwide reach and requires simple tools.
Victoria María Antonieta Martín Granados
Full Text Available Financial information is the set of documents that the management of a legal entity issues to report its financial position. Financial information is useful and reliable for its users when it has been prepared under conditions of certainty. This certainty is provided by management when it establishes internal control policies and procedures, as well as oversight of compliance with that internal control. Such control bears directly on the financial information, since it is inherent to the operating flow, and results in information that is relevant, truthful and comparable. This is important for the users of financial information, because it allows them to make timely and objective decisions.
National Aeronautics and Space Administration — The purpose of this project is to extend current ground-based Human Reliability Analysis (HRA) techniques to a long-duration, space-based tool to more effectively...
STATIC PROGRAM ANALYSIS FOR RELIABLE, TRUSTED APPS. University of Washington, final technical report, February 2017; approved for public release (grant FA8750-12-2-0107, program element 61101E). ...devices. App stores also provide a tempting vector for an attacker. An attacker can take advantage of bugdoors (software defects that permit...
Dale, Crystal Buchanan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Klein, Steven Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
This document describes the reliability, maintainability, and availability (RMA) modeling of the Los Alamos National Laboratory (LANL) design for the Closed Loop Helium Cooling System (CLHCS) planned for the NorthStar accelerator-based 99Mo production facility. The current analysis incorporates a conceptual helium recovery system, beam diagnostics, and prototype control system into the reliability analysis. The results from the 1000 hr blower test are addressed.
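Steady-state availability in RMA modeling is commonly computed as MTBF/(MTBF + MTTR), and a series system is available only when every component is. A minimal sketch (our illustration; the figures used below are invented, not CLHCS data):

```python
def availability(mtbf, mttr):
    """Steady-state availability of one repairable component:
    mean time between failures over total cycle time."""
    return mtbf / (mtbf + mttr)

def series_availability(components):
    """Availability of a series system of independent repairable
    components; components: list of (mtbf, mttr) pairs."""
    a = 1.0
    for mtbf, mttr in components:
        a *= availability(mtbf, mttr)
    return a
```

For example, a component with a 900 h MTBF and a 100 h MTTR is 90% available, and two such components in series are about 81% available, which is why long-duration component tests (like the 1000 hr blower test mentioned) feed directly into the model's MTBF estimates.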
Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema
This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated on their support for synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected... For that reason, new methods are needed to assist in assessing robustness and reliability at early design stages. A specific taxonomy of robustness and reliability information in design could support classifying available design information to orient new techniques for assessing innovative designs.
Sharma, Naresh; Warsi, Naqueeb Ahmad
Information theory tells us that if the rate of sending information across a noisy channel were above the capacity of that channel, then the transmission would necessarily be unreliable. For classical information sent over classical or quantum channels, one could, under certain conditions, make a stronger statement that the reliability of the transmission shall decay exponentially to zero with the number of channel uses, and the proof of this statement typically relies on a certain fundamental bound on the reliability of the transmission. Such a statement or the bound has never been given for sending quantum information. We give this bound and then use it to give the first example where the reliability of sending quantum information at rates above the capacity decays exponentially to zero. We also show that our framework can be used for proving generalized bounds on the reliability.
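For classical channels, the "fundamental bound on the reliability" alluded to here is typically of the Gallager random-coding/sphere-packing type. As a reminder of the classical analogue only (not the quantum bound derived in this work), with channel $W$, input distribution $Q$ and rate $R$:

```latex
P_e(n, R) \le e^{-n E_r(R)}, \qquad
E_r(R) = \max_{0 \le \rho \le 1} \max_{Q} \left[ E_0(\rho, Q) - \rho R \right],
```

```latex
E_0(\rho, Q) = -\ln \sum_{y} \left( \sum_{x} Q(x)\, W(y|x)^{1/(1+\rho)} \right)^{1+\rho}.
```

Above capacity, the classical strong converse (Arimoto) shows the probability of correct decoding itself decays exponentially in the number of channel uses; the paper's contribution is the analogous bound and decay statement for quantum information.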
Bobić, Branko; Štajner, Tijana; Nikolić, Aleksandra; Klun, Ivana; Srbljanović, Jelena; Djurković-Djaković, Olgica
Health education of women of childbearing age has been shown to be an acceptable approach to the prevention of toxoplasmosis, the most frequent congenitally transmitted parasitic infection. The aim of this study was to evaluate the Internet as a source of health education on toxoplasmosis in pregnancy. A group of 100 pregnant women examined in the National Reference Laboratory for Toxoplasmosis was surveyed by a questionnaire on the source of their information on toxoplasmosis. We also analyzed information offered by websites in the Serbian and Croatian languages through the Google search engine, using "toxoplasmosis" as a keyword. The 23 top websites were evaluated for comprehensiveness and accuracy of information on the impact of toxoplasmosis on the course of pregnancy, diagnosis and prevention. Having knowledge on toxoplasmosis was confirmed by 64 (64.0%) examined women, 40.6% (26/64) of whom learned about toxoplasmosis through the Internet, 48.4% from physicians, and 10.9% from friends. Increase in the degree of education was found to be associated with the probability that pregnant women would be informed via the Internet (RR=3.15, 95% CI=1.27-7.82, p=0.013). Analysis of four interactive websites (allowing users to ask questions) showed that routes of infection were the most common concern, particularly the risk presented by pet cats and dogs, followed by the diagnosis of infection (who and when should be tested, and how should the results be interpreted). Of 20 sites containing educational articles, only seven were authorized and two listed sources. Evaluation confirmed that information relevant to pregnant women was significantly more accurate than comprehensive, but no site gave both comprehensive and completely accurate information. Only four sites (20%) were good sources of information for pregnant women. Internet has proved itself as an important source of information. However, despite numerous websites, only a few offer reliable information to the
Dalmer, Nicole K
This narrative review examines assessments of the reliability of online health information retrieved through social media to ascertain whether health information accessed or disseminated through social media should be evaluated differently than other online health information. Several medical, library and information science, and interdisciplinary databases were searched using terms relating to social media, reliability, and health information. While social media's increasing role in health information consumption is recognized, studies are dominated by investigations of traditional (i.e., non-social media) sites. To more richly assess constructions of reliability when using social media for health information, future research must focus on health consumers' unique contexts, virtual relationships, and degrees of trust within their social networks.
Steenbergen, H.M.G.M.; Lassing, B.L.; Vrouwenvelder, A.C.W.M.; Waarts, P.H.
In recent years an advanced program for the reliability analysis of flood defence systems has been under development. This paper describes the global data requirements for the application and the setup of the models. The analysis generates the probability of system failure and the contribution of
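For independent failure modes in a series system (the defence fails if any element fails), the system failure probability and the per-element contributions follow directly. A minimal sketch (our own illustration, not the program described in the abstract):

```python
def system_failure_probability(element_pfs):
    """Series-system failure probability for independent elements:
    P_sys = 1 - prod(1 - p_i)."""
    survival = 1.0
    for p in element_pfs:
        survival *= (1.0 - p)
    return 1.0 - survival

def contributions(element_pfs):
    """Relative contribution of each element, here approximated by
    its share of the summed element failure probabilities."""
    total = sum(element_pfs)
    return [p / total for p in element_pfs]
```

Real flood-defence models must also account for correlation between failure modes (shared loads such as water level), which pushes the system probability below the independent-elements value; the sketch covers only the independent case.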
Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rasmussen, Martin [Norwegian Univ. of Science and Technology, Trondheim (Norway). Social Research; Herberger, Sarah [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ulrich, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)
This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model the dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals: • Integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients • Consideration of a PRA context • Incorporation of a solid psychological basis for operator performance • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response. This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.
Schaefer, H Martin; Ruxton, G D
Although communication underpins many biological processes, its function and basic definition remain contentious. In particular, researchers have debated whether information should be an integral part of a definition of communication and how it remains reliable. So far the handicap principle, assuming signal costs to stabilize reliable communication, has been the predominant paradigm in the study of animal communication. The role of by-product information produced by mechanisms other than the communicative interaction has been neglected in the debate on signal reliability. We argue that by-product information is common and that it provides the starting point for ritualization as the process of the evolution of communication. Second, by-product information remains unchanged during ritualization and enforces reliable communication by restricting the options for manipulation and cheating. Third, this perspective changes the focus of research on communication from studying signal costs to studying the costs of cheating. It can thus explain the reliability of signalling in many communication systems that do not rely on handicaps. We emphasize that communication can often be informative but that the evolution of communication does not cause the evolution of information because by-product information often predates and stimulates the evolution of communication. Communication is thus a consequence but not a cause of reliability. Communication is the interplay of inadvertent, informative traits and evolved traits that increase the stimulation and perception of perceivers. Viewing communication as a complex of inadvertent and derived traits facilitates understanding of the selective pressures shaping communication and those shaping information and its reliability. This viewpoint further contributes to resolving the current controversy on the role of information in communication. © 2012 The Authors. Journal of Evolutionary Biology.
Schneider, Ronald; Thöns, Sebastian; Straub, Daniel
An efficient approach to reliability analysis of deteriorating structural systems is presented, which considers stochastic dependence among element deterioration. Information on a deteriorating structure obtained through inspection or monitoring is included in the reliability assessment through B...... is an efficient and robust sampling-based algorithm suitable for such analyses. The approach is demonstrated in two case studies considering a steel frame structure and a Daniels system subjected to high-cycle fatigue....
Sørensen, John Dalsgaard; Rackwitz, R.; Thoft-Christensen, Palle
For an offshore structure in the North Sea it is assumed that information from measurements and inspections is available. As illustrations measurements of the significant wave height and the marine growth and different inspection and repair results are considered. It is shown how the reliability...
Thoft-Christensen, Palle; Sørensen, John Dalsgaard
. Failure of this type of system is defined either as formation of a mechanism or by failure of a prescribed number of elements. In the first case failure is independent of the order in which the elements fail, but this is not so by the second definition. The reliability analysis consists of two parts...... are described and the two definitions of failure can be used by the first formulation, but only the failure definition based on formation of a mechanism by the second formulation. The second part of the reliability analysis is an estimate of the failure probability for the structure on the basis...
Marini, Vinicius Kaster
methods, and an industrial case to assess how the use of information about robustness, reliability and safety as practised by current methods influences concept development. Current methods cannot be used in early design phases due to their dependence on detailed design information for the identification...... of attributes of robustness, reliability and safety. The uncertainty and ambiguity that are inherent to concept development impede the evaluation and improvement of attributes of robustness, reliability and safety in early design. A taxonomy was therefore developed to assess the information about...... these attributes that current methods require, and to address the need for clarity about design issues that result in risks. The concept development phase fosters ambiguity on how to satisfy requirements of robustness, reliability and safety, which is exacerbated by complexity in the individual solution...
Aria, S. E. Hosseini; Menenti, M.; Gorte, B. G. H.
Reliability analysis is usually applied to evaluate classification procedures with different classes. In this research, we have applied the analysis to two different band sets to find out which one is more reliable. These band sets provide the most informative spectral regions covered by hyperspectral images. The informative regions are identified by minimizing two dependency measures between bands: the correlation coefficient and normalized mutual information. The implementations are done by a newly developed top-down method named Spectral Region Splitting (SRS), resulting in two sets of bands which are almost identical in critical spectral regions. A reliability analysis based on a thresholding technique was performed on the two sets of bands: a technique was applied to discard those pixels that are not correctly classified at the given confidence level. The results show that the band set selected by normalized mutual information was the more reliable.
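The two dependency measures named here are standard: the Pearson correlation coefficient, and mutual information normalized here by the geometric mean of the marginal entropies (one common normalization among several). A minimal sketch on discretized band values (our illustration, not the SRS implementation):

```python
import math
from collections import Counter

def correlation(x, y):
    """Pearson correlation coefficient between two bands (equal length)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def normalized_mutual_information(x, y):
    """NMI = I(X;Y) / sqrt(H(X) * H(Y)) on discretized band values."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))

    def entropy(counts):
        return -sum((c / n) * math.log(c / n) for c in counts.values())

    mi = entropy(px) + entropy(py) - entropy(pxy)
    return mi / math.sqrt(entropy(px) * entropy(py))
```

Both measures reach 1 for identical bands; band selection then seeks sets whose pairwise values are low, i.e. spectrally non-redundant regions.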
Meyer, J. F.
Research conducted under the program is reported; its aims are to refine the current notion of system reliability by identifying and investigating attributes of a system that are important to reliability considerations, and to develop techniques that facilitate analysis of system reliability. Reliability analysis and on-line fault diagnosis are discussed.
Introduction. Health education of women of childbearing age has been shown to be an acceptable approach to the prevention of toxoplasmosis, the most frequent congenitally transmitted parasitic infection. Objective. The aim of this study was to evaluate the Internet as a source of health education on toxoplasmosis in pregnancy. Methods. A group of 100 pregnant women examined in the National Reference Laboratory for Toxoplasmosis was surveyed by a questionnaire on the source of their information on toxoplasmosis. We also analyzed information offered by websites in the Serbian and Croatian languages through the Google search engine, using “toxoplasmosis” as a keyword. The 23 top websites were evaluated for comprehensiveness and accuracy of information on the impact of toxoplasmosis on the course of pregnancy, diagnosis and prevention. Results. Knowledge of toxoplasmosis was confirmed by 64 (64.0%) of the examined women, 40.6% (26/64) of whom had learned about toxoplasmosis through the Internet, 48.4% from physicians, and 10.9% from friends. A higher degree of education was found to be associated with the probability that pregnant women would be informed via the Internet (RR=3.15, 95% CI=1.27-7.82, p=0.013). Analysis of four interactive websites (allowing users to ask questions) showed that routes of infection were the most common concern, particularly the risk presented by pet cats and dogs, followed by the diagnosis of infection (who and when should be tested, and how the results should be interpreted). Of 20 sites containing educational articles, only seven were authorized and two listed sources. Evaluation confirmed that information relevant to pregnant women was significantly more accurate than comprehensive, but no site gave both comprehensive and completely accurate information. Only four sites (20%) were good sources of information for pregnant women. Conclusion. The Internet has proved itself an important source of information. However...
Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)
Human reliability is the probability that a person correctly performs some system-required action in a required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field during the last years, mainly in the nuclear area. Several first-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach has as its tool the ergonomic work analysis. It focuses on the study of operators' activities in both their physical and mental aspects, considering at the same time the observed characteristics of the operators and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of the operators of an industrial plant control room, using a framework that includes the approaches used by ATHEANA, THERP and ergonomic work analysis. (author)
Küçükdurmaz, Fatih; Gomez, Miguel M; Secrist, Eric; Parvizi, Javad
The Internet has become the most widely used source for patients seeking more information about their health, and many sites geared towards this audience have gained widespread use in recent years. Additionally, many healthcare institutions publish their own patient-education websites with information regarding common conditions. Little is known about how these resources impact patient health, though, as they have the potential both to inform and to misinform patients regarding their prognosis and possible treatments. In this study we investigated the reliability, readability and quality of information about femoroacetabular impingement, a condition which commonly affects young patients. The terms "hip impingement" and "femoroacetabular impingement" were searched in Google® in November 2013 and the first 30 results were analyzed. The LIDA scale was used to assess website accessibility, usability and reliability. The DISCERN scale was used to assess reliability and quality of information. The FRE score was used to assess readability. The patient-oriented sites performed significantly worse in LIDA reliability and DISCERN reliability. However, the FRE score was significantly higher in patient-oriented sites. According to our results, the websites intended to attract patients searching for information regarding femoroacetabular impingement provide a highly accessible, readable information source, but do not appear to apply rigor comparable to that of the scientific literature or healthcare-practitioner websites in matters such as citing sources, supplying methodology and including a publication date. This indicates that while these resources are easily accessed by patients, there is potential for them to be a source of misinformation.
Ronald L. Boring
There has been strong interest in the new and emerging field called resilience engineering. This field has been quick to align itself with many existing safety disciplines, but it has also distanced itself from the field of human reliability analysis. To date, the discussion has been somewhat one-sided, with much discussion about the new insights afforded by resilience engineering. This paper presents an attempt to address resilience engineering from the perspective of human reliability analysis (HRA). It is argued that HRA shares much in common with resilience engineering and that, in fact, it can help strengthen nascent ideas in resilience engineering. This paper seeks to clarify and ultimately refute the arguments that have served to divide HRA and resilience engineering.
Petersen, Lars; Esbensen, Kim Harry
regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data......) that fully cover all practical aspects of sampling and provides a handy “toolbox” for samplers, engineers, laboratory and scientific personnel....
Soldz, Stephen; Panas, Lee; Rodriguez-Howard, Mayra
State substance abuse management information systems increasingly are becoming important tools for research, program management, and policy formulation at federal and state levels. These systems are currently undergoing radical expansion, leading to the creation of statewide performance and outcome monitoring systems for publicly-funded substance abuse treatment. This expansion makes imperative increased knowledge of the psychometric properties of the data in these systems. This study develops a method for examining the reliability of such data and applies it to the Massachusetts Substance Abuse Management Information System (SAMIS). Cohen's kappa, intraclass correlations, and the techniques of Heise (1969) are used to assess the reliability of different types of variables. Results show that key variables on the SAMIS Admission Form exhibit moderate to high reliability, supporting the use of this data for aggregate analyses. At the same time, caution should be used in making judgments about individual patients. Copyright 2002 Wiley Periodicals, Inc.
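As an aside, the Cohen's kappa statistic used in the study above corrects observed rater agreement for agreement expected by chance. A minimal pure-Python sketch; the admission codes below are invented for illustration and are not SAMIS data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the two raters coded independently.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Toy example: two codings of the same 10 admission records.
a = ["opioid", "alcohol", "opioid", "other", "alcohol",
     "opioid", "alcohol", "other", "opioid", "alcohol"]
b = ["opioid", "alcohol", "opioid", "alcohol", "alcohol",
     "opioid", "alcohol", "other", "opioid", "other"]
print(round(cohens_kappa(a, b), 3))  # -> 0.688 (8/10 observed vs 0.36 chance)
```

Kappa near 1 indicates high reliability; values near 0 indicate agreement no better than chance.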
The study recommends improved design of the DHIS user interface (forms) and reports to replicate the paper-based forms in order to assure usability and reduce the incidences and impact of human errors in the keying-in of health data. Keywords: district, health information, software, reliability, usability, Tanzania.
Hartmann, Stephan; Bovens, L
We develop a probabilistic criterion for belief expansion that is sensitive to the degree of contextual fit of the new information to our belief set as well as to the reliability of our information source. We contrast our approach with the success postulate in AGM-style belief revision and show how the idealizations in our approach can be relaxed by invoking Bayesian-network models.
Tomlinson, J J; Elliott-Smith, W; Radosta, T
A chain of custody (COC) is required in many laboratories that handle forensics, drugs of abuse, environmental, clinical, and DNA testing, as well as other laboratories that want to assure reliability of reported results. Maintaining a dependable COC can be laborious, but with the recent establishment of the criteria for electronic records and signatures by US regulatory agencies, laboratory information management systems (LIMSs) are now being developed to fully automate COCs. The extent of automation and of data reliability can vary, and FDA- and EPA-compliant electronic signatures and system security are rare.
Tomlinson, J. J.; Elliott-Smith, W.; Radosta, T.
A chain of custody (COC) is required in many laboratories that handle forensics, drugs of abuse, environmental, clinical, and DNA testing, as well as other laboratories that want to assure reliability of reported results. Maintaining a dependable COC can be laborious, but with the recent establishment of the criteria for electronic records and signatures by US regulatory agencies, laboratory information management systems (LIMSs) are now being developed to fully automate COCs. The extent of automation and of data reliability can vary, and FDA- and EPA-compliant electronic signatures and system security are rare. PMID:17671623
Kimiaeifar, Amin; Toft, Henrik Stensgaard; Lund, Erik
A probabilistic model for the reliability analysis of adhesive bonded scarfed lap joints subjected to static loading is developed. It is representative of the main laminate in a wind turbine blade subjected to flapwise bending. The structural analysis is based on a three-dimensional (3D) finite...... the FEA model, and a sensitivity analysis on the influence of various geometrical parameters and material properties on the maximum stress is conducted. Because the yield behavior of many polymeric structural adhesives depends on both deviatoric and hydrostatic stress components, different ratios...... of the compressive to tensile adhesive yield stresses in the failure criterion are considered. It is shown that the chosen failure criterion, the scarf angle and the load are significant for the assessment of the probability of failure....
Ronald L. Boring; David I. Gertman
This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety-critical scenarios, the method also holds promise for commercial off-the-shelf software usability evaluations. The SPAR-H method shares task-analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis for heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
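The SPAR-H-style quantification described above multiplies a nominal error probability by performance-shaping-factor multipliers. A minimal sketch; the heuristic names and multiplier values below are invented for illustration, not taken from the paper or from SPAR-H tables:

```python
def usability_error_probability(nominal_hep, psf_multipliers):
    """Scale a nominal human error probability by PSF multipliers,
    capping at 1.0 since a probability cannot exceed certainty."""
    uep = nominal_hep
    for m in psf_multipliers.values():
        uep *= m
    return min(uep, 1.0)

# Hypothetical multipliers assigned to violated usability heuristics.
psfs = {
    "visibility_of_system_status": 2.0,  # heuristic mildly violated
    "match_with_real_world": 1.0,        # nominal, no degradation
    "error_prevention": 5.0,             # heuristic severely violated
}
uep = usability_error_probability(0.001, psfs)
print(round(uep, 6))  # 0.001 * 2 * 1 * 5
```

The resulting UEPs can then be ranked, or crossed with a consequence matrix, to prioritize usability issues.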
Butler, Ricky W.; White, Allan L.
The SURE program is a new reliability analysis tool for ultrareliable computer system architectures. The computational methods on which the program is based provide an efficient means for computing accurate upper and lower bounds for the death state probabilities of a large class of semi-Markov models. Once a semi-Markov model is described using a simple input language, the SURE program automatically computes the upper and lower bounds on the probability of system failure. A parameter of the model can be specified as a variable over a range of values directing the SURE program to perform a sensitivity analysis automatically. This feature, along with the speed of the program, makes it especially useful as a design tool.
The Human Aspects of Information Security Questionnaire (HAIS-Q) is designed to measure information security awareness. More specifically, the tool measures an individual's knowledge, attitude, and self-reported behaviour relating to information security in the workplace. This paper reports on the reliability of the HAIS-Q, including test-retest reliability and internal consistency. The paper also assesses the reliability of three preliminary over-claiming items, designed specifically to complement the HAIS-Q and identify those individuals who provide socially desirable responses. A total of 197 working Australians completed two iterations of the HAIS-Q and the over-claiming items, approximately 4 weeks apart. Results of the analysis showed that the HAIS-Q was externally reliable and internally consistent. Therefore, the HAIS-Q can be used to reliably measure information security awareness. Reliability testing on the preliminary over-claiming items was not as robust, and further development is required and recommended. The implication of these findings is that organisations can confidently use the HAIS-Q not only to measure the current state of employee information security awareness within their organisation, but also to measure the effectiveness and impact of training interventions, information security awareness programs and campaigns. The influence of cultural changes and the effect of security incidents can also be assessed.
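Test-retest reliability of the kind reported for the HAIS-Q is commonly estimated as the correlation between the two administrations. A minimal sketch using the Pearson coefficient; the scores below are invented, not HAIS-Q data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation -- a common test-retest reliability estimate."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented total scores for five respondents, 4 weeks apart.
time1 = [55, 62, 47, 70, 66]
time2 = [57, 60, 50, 72, 64]
print(round(pearson_r(time1, time2), 3))
```

Values close to 1 indicate that respondents' scores are stable over the retest interval.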
Failures of transformers in sub-transmission systems not only reduce the reliability of the power system but also have significant effects on power quality, since reliability is one of the important components of system quality. To enhance utility reliability, failure rates, failure origins and physical damage causes must be studied. This paper describes a case study of the reliability of sub-transmission transformers (63/20 kV) installed in Mazandaran province and operated in the sub-transmission system. The information was obtained from the Mazandaran Regional Electric Company. The results of the study and analysis of 60 substations, including more than 110 transformers installed in the sub-transmission system, show that the failure modes of the transformers can be represented by a Weibull distribution. Weibull statistics have been widely used and accepted as a successful mathematical method to predict the remaining lifetime of equipment. Useful conclusions are presented both for power system operators and manufacturers for improving the reliability of transformers.
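The Weibull model referred to above has the survivor (reliability) function R(t) = exp(-(t/η)^β) with shape β and scale η. A brief sketch; the parameter values are illustrative only, not fitted to the transformer data:

```python
import math

def weibull_reliability(t, beta, eta):
    """Probability that a unit survives beyond time t (same units as eta)."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Instantaneous failure rate; increases with t when beta > 1 (wear-out)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# Illustrative parameters: wear-out behaviour, ~30-year characteristic life.
beta, eta = 2.0, 30.0
print(round(weibull_reliability(20, beta, eta), 3))
```

With β > 1 the hazard rises over time, which is what motivates using the fitted distribution to predict remaining transformer life.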
The paper presents a detailed review of the state-of-the-art research activities on structural reliability analysis of wind turbines between the 1990s and 2017. We describe the reliability methods, including the first- and second-order reliability methods and simulation-based reliability methods, and show the procedure for and application areas of structural reliability analysis of wind turbines. Further, we critically review the various structural reliability studies on rotor blades, bottom-fixed support structures, floating systems, and mechanical and electrical components. Finally, future applications of structural reliability methods to wind turbine designs are discussed.
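In the simplest textbook case, the first-order reliability method mentioned in the review reduces, for a linear limit state g = R - S with independent normal resistance R and load S, to a reliability index β = (μ_R - μ_S)/√(σ_R² + σ_S²) and failure probability Φ(-β). A minimal sketch with invented numbers:

```python
import math

def form_linear_normal(mu_r, sigma_r, mu_s, sigma_s):
    """Reliability index and failure probability for g = R - S,
    with R and S independent normal (the linear FORM special case)."""
    beta = (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)
    # Standard normal CDF evaluated at -beta, via the stdlib error function.
    pf = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
    return beta, pf

# Illustrative capacity vs. load statistics (units arbitrary).
beta, pf = form_linear_normal(mu_r=10.0, sigma_r=1.5, mu_s=5.0, sigma_s=2.0)
print(round(beta, 3), f"{pf:.2e}")
```

For nonlinear limit states and non-normal variables, FORM instead linearizes at the design point in standard normal space; the index β retains the same interpretation.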
Tekinerdogan, B.; Sözer, Hasan; Aksit, Mehmet
With the increasing size and complexity of software in embedded systems, software has now become a primary threat to reliability. Several mature conventional reliability engineering techniques exist in the literature, but traditionally these have primarily addressed failures in hardware components
Thoft-Christensen, Palle; Sørensen, John Dalsgaard
Reliability analysis of single tubular joints and offshore platforms with tubular joints is presented. The failure modes considered are yielding, punching, buckling and fatigue failure. Element reliability as well as systems reliability approaches are used and illustrated by several examples....... Finally, optimal design of tubular joints with reliability constraints is discussed and illustrated by an example....
Sørensen, John Dalsgaard; Hoffmeyer, P.
types of redundancy and non-linearity are considered. The statistical characteristics of the load-bearing capacity are determined by reliability analysis. Next, more complex systems are considered, modelling the mechanical behaviour of timber roof elements / stressed skin panels made of timber. Using...... characteristics of the load-bearing capacity is estimated in the form of a characteristic value and a coefficient of variation. These two values are of primary importance for codes of practice based on the partial safety factor format, since the partial safety factor is closely related to the coefficient...... the above stochastic models, statistical characteristics (distribution function, 5% quantile and coefficient of variation) are determined. Generally, the results show that, taking the system effects into account, the characteristic load-bearing capacity can be increased and the partial safety factor decreased...
Ronald L. Boring; David I. Gertman; Katya Le Blanc
This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.
Ronald L. Boring; David I. Gertman
Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized to SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provide preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.
Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory
In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human-factors-driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.
Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory
Part of the U.S. Department of Energy's (DOE's) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.
Peters, Valerie A.; Ogilvie, Alistair; Veers, Paul S.
This report, written by Sandia National Laboratories, addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. It is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems for reliability analysis. The report provides: (1) a list of the data needed to support reliability and availability analysis; and (2) specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.
Gertsbakh, Ilya B
Unique in its approach, Models of Network Reliability: Analysis, Combinatorics, and Monte Carlo provides a brief introduction to Monte Carlo methods along with a concise exposition of reliability theory ideas. From there, the text investigates a collection of principal network reliability models, such as terminal connectivity for networks with unreliable edges and/or nodes, network lifetime distribution in the process of its destruction, network stationary behavior for renewable components, importance measures of network elements, reliability gradient, and network optimal reliability synthesis
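Terminal connectivity for networks with unreliable edges, one of the principal models listed above, can be estimated by crude Monte Carlo: sample each edge as up or down, then check whether the terminals remain connected. A minimal sketch; the bridge network and edge reliabilities below are invented for illustration:

```python
import random

def connected(n_nodes, up_edges, s, t):
    """Depth-first search: is t reachable from s using only up edges?"""
    adj = {i: [] for i in range(n_nodes)}
    for u, v in up_edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return False

def two_terminal_reliability(n_nodes, edges, s, t, trials=20000, seed=1):
    """Crude Monte Carlo estimate of P(s and t remain connected)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        up = [(u, v) for (u, v, p) in edges if rng.random() < p]
        hits += connected(n_nodes, up, s, t)
    return hits / trials

# A 4-node "bridge" network; each edge works with probability 0.9.
edges = [(0, 1, 0.9), (0, 2, 0.9), (1, 2, 0.9), (1, 3, 0.9), (2, 3, 0.9)]
print(round(two_terminal_reliability(4, edges, 0, 3), 3))
```

For highly reliable networks, variance-reduction schemes of the kind the book covers replace this crude estimator, since rare disconnections make it inefficient.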
Qin, Zhihong; Zhang, Juan; Wang, Junfeng
Satellite channels are generally characterized by high bit error rate (BER), long propagation delay, and large bandwidth-delay product (BDP). These features tend to make traditional TCP suffer serious performance degradation in satellite networks. Therefore, a TCP-compatible reliable transmission protocol (TCP-AX) for spatial information networks is proposed in this paper, and a bandwidth probing mechanism is designed to distinguish network congestion from link errors. Simulation results show that TCP-AX outperforms some popular enhanced TCP protocols.
Faber, Michael Havbro; Sørensen, John Dalsgaard; Rackwitz, R.
A jacket-type offshore structure from the North Sea is considered. The time-variant reliability is estimated for failure defined as brittle fracture and crack through the tubular member walls. The stochastic modelling is described. The hot spot stress spectral moments as functions of the stochastic...... variables are described using spline function response surfaces. A Laplace integral expansion is used to estimate the time-variant reliability. Parameter studies are performed for the reliability estimates, and the results of the time-variant and time-invariant reliability analyses are compared. (Authors)...
Sørensen, John Dalsgaard; Thoft-Christensen, Palle; Rackwitz, R.
A jacket-type offshore structure from the North Sea is considered. The time-variant reliability is estimated for failure defined as brittle fracture and crack through the tubular member walls. The stochastic modelling is described. The hot spot stress spectral moments as functions of the stochastic...... variables are described using spline function response surfaces. A Laplace integral expansion is used to estimate the time-variant reliability. Parameter studies are performed for the reliability estimates and the results of the time-variant and the time-invariant reliability analyses are compared....
Circuit reliability has become a growing concern in today's nanoelectronics, which has motivated strong research interest over the years in reliability analysis and reliability-oriented circuit design. While quite a few approaches for circuit reliability analysis have been reported, there is a lack of comparative studies on their pros and cons in terms of both accuracy and efficiency. This paper provides an overview of some typical methods for reliability analysis, with a focus on gate-level circuits, large or small, with or without reconvergent fanouts. It is intended to help readers gain an insight into reliability issues and their complexity, as well as optional solutions. Understanding reliability analysis is also a first step towards advanced circuit designs for improved reliability in future research.
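To illustrate why gate-level reliability analysis matters, consider the classic von Neumann model (not a method from the paper): each gate in a chain independently flips its output with probability ε, so the output is correct iff an even number of flips occurred, giving P = (1 + (1 - 2ε)^k)/2 for k gates. A minimal sketch:

```python
def chain_reliability(k, eps):
    """Probability that a chain of k gates, each independently flipping
    its output with probability eps, delivers the correct value.
    Correct output <=> an even number of flips occurred."""
    return 0.5 * (1.0 + (1.0 - 2.0 * eps) ** k)

# Reliability decays toward 0.5 (pure noise) as the chain deepens.
for k in (1, 10, 100):
    print(k, round(chain_reliability(k, 0.01), 4))
```

Real circuits have fan-out and reconvergence, which breaks this independence assumption; handling that correlation is exactly what distinguishes the analysis methods the paper compares.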
Sørensen, John Dalsgaard; Burcharth, Hans F.; Christiani, E.
Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of the most important failure modes, sliding failure, failure of the foundation and overturning failure, are described. Relevant design variables are identified...
Czuchry, Andrew J.; And Others
The reliability and maintainability (R&M) model described in this report represents an important portion of a larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. The R&M model is the first of three models that comprise a modeling system for use in LCC analysis of avionics systems. The total…
The District Health Information System (DHIS) software from the Health Information System Programme (HISP) based in South Africa is widely implemented in many developing countries as a health data analysis tool. Through the HISP Tanzania project, the DHIS was piloted in five districts in Tanzania. The objective of this ...
Evans, John W.; Gallo, Luis; Kaminsky, Mark
Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two-stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Selected distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two-stage Bayesian framework.
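The two-stage binomial updating described above can be illustrated compactly by replacing MCMC with the conjugate Beta-binomial form, which admits a closed-form posterior; all counts below are invented for illustration, not JWST data:

```python
def beta_update(alpha, beta, successes, failures):
    """Conjugate update: Beta(alpha, beta) prior + binomial data
    yields Beta(alpha + successes, beta + failures)."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Posterior mean estimate of the deployment success probability."""
    return alpha / (alpha + beta)

# Stage 1: non-informative Jeffreys prior Beta(0.5, 0.5) updated with
# hypothetical heritage data: 118 successful deployments in 120 attempts.
a, b = beta_update(0.5, 0.5, successes=118, failures=2)

# Stage 2: the stage-1 posterior becomes the prior for hypothetical
# ground-test data on scale-model articles: 24 successes in 25 tries.
a, b = beta_update(a, b, successes=24, failures=1)

print(round(beta_mean(a, b), 4))  # -> 0.976
```

Credibility limits follow from the same Beta posterior; MCMC becomes necessary only when, as in the study, the prior or likelihood is no longer conjugate.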
Jeffrey C. Joe; Ronald L. Boring
While human reliability analysis (HRA) methods include uncertainty in quantification, the nominal model of human error in HRA typically assumes that operator performance does not vary significantly when they are given the same initiating event, indicators, procedures, and training, and that any differences in operator performance are simply aleatory (i.e., random). While this assumption generally holds true when performing routine actions, variability in operator response has been observed in multiple studies, especially in complex situations that go beyond training and procedures. As such, complexity can lead to differences in operator performance (e.g., operator understanding and decision-making). Furthermore, psychological research has shown that there are a number of known antecedents (i.e., attributable causes) that consistently contribute to observable and systematically measurable (i.e., not random) differences in behavior. This paper reviews examples of individual differences taken from operational experience and the psychological literature. The impact of these differences in human behavior and their implications for HRA are then discussed. We propose that individual differences should not be treated as aleatory, but rather as epistemic. Ultimately, by understanding the sources of individual differences, it is possible to remove some epistemic uncertainty from analyses.
The purpose of the Space Mission Human Reliability Analysis (HRA) Project is to extend current ground-based HRA risk prediction techniques to a long-duration, space-based tool. Ground-based HRA methodology has been shown to be a reasonable tool for short-duration space missions, such as Space Shuttle and lunar fly-bys. However, longer-duration deep-space missions, such as asteroid and Mars missions, will require the crew to be in space for as long as 400 to 900 days, with periods of extended autonomy and self-sufficiency. Current indications are that higher risk due to fatigue, physiological effects of extended low-gravity environments, and other factors may impact HRA predictions. For this project, Safety & Mission Assurance (S&MA) will work with Human Health & Performance (HH&P) to establish what is currently used to assess human reliability for human space programs, identify human performance factors that may be sensitive to long-duration space flight, collect available historical data, and update current tools to account for performance shaping factors believed to be important to such missions. This effort will also contribute data to the Human Performance Data Repository and influence the Space Human Factors Engineering research risks and gaps (part of the HRP Program). An accurate risk predictor mitigates Loss of Crew (LOC) and Loss of Mission (LOM). The end result will be an updated HRA model that can effectively predict risk on long-duration missions.
Power distribution systems are basic parts of power systems, and the reliability of these systems is at present a key issue for power engineering development and requires special attention. Operation of distribution systems is accompanied by a number of random factors that produce a large number of unplanned interruptions. Research has shown that the predominant factors that have a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%), and unknown random factors (20.1%). The article studies the influence of random behavior and presents estimations of the reliability of predominantly rural electrical distribution systems.
Strawa, A. W.; Chatfield, R. B.; Legg, M.; Esswein, R.; Justice, E.
Air quality agencies use ground sites to monitor air quality, providing accurate information at particular points. Using measurements from satellite imagery has the potential to provide air quality information in a timely manner with better spatial resolution and at a lower cost, and can also be useful for model validation. While previous studies show acceptable correlations between Aerosol Optical Depth (AOD) derived from MODIS and surface Particulate Matter (PM) measurements in the eastern US, the data do not correlate well in the western US (Al-Saadi et al., 2005; Engel-Cox et al., 2004). This paper seeks to improve the AOD-PM correlations by using advanced statistical analysis techniques. Our study area is the San Joaquin Valley in California, because air quality in this region has failed to meet state and federal attainment standards for PM for the past several years. A previous investigation found good correlation of the AOD values between MODIS, MISR and AERONET, but poor correlations (R2 ~ 0.02) between satellite-based AOD and surface PM2.5 measurements. PM2.5 measurements correlated somewhat better (R2 ~ 0.18) with MODIS-derived AOD using the Deep Blue surface reflectance algorithm (Hsu et al., 2006) rather than the standard MODIS algorithm. This level of correlation is not adequate for reliable air quality measurements. Pelletier et al. (2007) used generalized additive models (GAMs) and meteorological data to improve the correlation between PM and AERONET AOD in western Europe. Additive models are more flexible than linear models, and the functional relationships can be plotted to give a sense of the relationship between the predictor and the response. In this paper we use GAMs to improve surface PM2.5 to MODIS-AOD correlations. For example, we achieve an R2 ~ 0.44 using a GAM that includes the Deep Blue AOD and day of year as parameters. Including NOx observations improves this to R2 ~ 0.64. Surprisingly, the Ångström exponent did not prove to be a significant
Bele, Irene Velsvik
This study focuses on speaking voice quality in male teachers (n = 35) and male actors (n = 36), who represent untrained and trained voice users, because we wanted to investigate normal and supranormal voices. In this study, both substantial and methodologic aspects were considered. It includes a method for perceptual voice evaluation, and a basic issue was rater reliability. A listening group of 10 listeners, 7 experienced speech-language therapists, and 3 speech-language therapist students evaluated the voices by 15 vocal characteristics using VA scales. Two sets of voice signals were investigated: text reading (2 loudness levels) and sustained vowel (3 levels). The results indicated a high interrater reliability for most perceptual characteristics. Connected speech was evaluated more reliably, especially at the normal level, but both types of voice signals were evaluated reliably, although the reliability for connected speech was somewhat higher than for vowels. Experienced listeners tended to be more consistent in their ratings than did the student raters. Some vocal characteristics achieved acceptable reliability even with a smaller panel of listeners. The perceptual characteristics grouped in 4 factors reflected perceptual dimensions.
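Interrater consistency of the kind reported here is often summarized with Cronbach's alpha, treating raters as items; a minimal sketch with invented rating data (not the study's 10-listener panel):

```python
def cronbach_alpha(ratings):
    """Cronbach's alpha treating each rater as an 'item'.
    ratings: list of raters, each a list of scores for the same voices.
    Values near 1.0 indicate highly consistent raters."""
    k = len(ratings)                      # number of raters
    n = len(ratings[0])                   # number of rated voices

    def var(xs):                          # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(r) for r in ratings)
    totals = [sum(r[i] for r in ratings) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# three hypothetical raters scoring five voices on a 0-100 visual analog scale
raters = [
    [62, 45, 80, 30, 71],
    [58, 50, 76, 35, 69],
    [65, 42, 84, 28, 75],
]
print(f"alpha = {cronbach_alpha(raters):.3f}")
```

Here the three raters rank the voices almost identically, so alpha comes out close to 1; disagreeing raters would drive it toward 0.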
Huang, Zhao-Feng; Fint, Jeffry A.; Kuck, Frederick M.
This paper addresses the in-flight reliability of a liquid propulsion engine system for a launch vehicle. We first establish a comprehensive list of system and sub-system reliability drivers for any liquid propulsion engine system. We then build a reliability model to parametrically analyze the impact of some reliability parameters. We present sensitivity analysis results for a selected subset of the key reliability drivers using the model. Reliability drivers identified include: number of engines for the liquid propulsion stage, single engine total reliability, engine operation duration, engine thrust size, reusability, engine de-rating or up-rating, engine-out design (including engine-out switching reliability, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction), propellant-specific hazards, engine start and cutoff transient hazards, engine combustion cycles, vehicle and engine interface and interaction hazards, engine health management system, engine modification, engine ground start hold down with launch commit criteria, engine altitude start (1 in. start), multiple altitude restart (less than 1 restart), component, subsystem and system design, manufacturing/ground operation support/pre- and post-flight checkouts and inspection, and extensiveness of the development program. We present some sensitivity analysis results for the following subset of the drivers: number of engines for the propulsion stage, single engine total reliability, engine operation duration, engine de-rating or up-rating requirements, engine-out design, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction, and engine health management system implementation (basic redlines and more advanced health management systems).
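The trade-off between engine count and engine-out capability can be illustrated with a toy binomial model (the per-engine reliability and catastrophic fraction below are assumed values, not the paper's, and burn-time extension after a shutdown is ignored):

```python
from math import comb

def stage_reliability(n, r, engine_out=True, catastrophic_fraction=0.2):
    """Mission reliability of an n-engine stage with per-engine
    reliability r. With engine-out capability the stage tolerates one
    benign shutdown; a catastrophic_fraction of engine failures is
    assumed to destroy the stage regardless of redundancy."""
    if not engine_out:
        return r ** n                      # every engine must work
    q = 1 - r
    benign = q * (1 - catastrophic_fraction)
    # all engines work, or exactly one fails benignly and the rest work
    return r ** n + comb(n, 1) * benign * r ** (n - 1)

for n in (2, 3, 5, 9):
    print(n, round(stage_reliability(n, 0.995), 6))
```

The printout shows the characteristic effect discussed above: without engine-out, adding engines only multiplies failure opportunities; with engine-out, redundancy partially offsets the extra exposure until the catastrophic fraction dominates.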
Huber, Catherine; Mesbah, Mounir
Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.
Mirzai, M.; Gholami, A.; Aminifar, F.
Failures of transformers in sub-transmission systems not only reduce the reliability of the power system but also have significant effects on power quality, since reliability is one of the important components of any system's quality. To enhance utility reliability, failure rates, failure origins, and physical causes of damage must be studied. This paper describes a case study of the reliability of sub-transmission transformers (63/20 kV) installed in Mazandaran province, oper...
Torres, Juliana V.S. [Federal University of Pernambuco (UFPE), Recife, PE (Brazil); Afonso, Silvana M. Bastos [Federal University of Pernambuco (UFPE), Recife, PE (Brazil). Dept. of Civil Engineering; Vaz, Luiz Eloy [Federal University of Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil). Dept. of Applied Mechanics and Structures
The most adequate procedure for measuring structural safety is the quantification of its failure probability. This value can be determined using the first-order reliability method (FORM), which leads to an optimization problem for solving the structural reliability problem. An alternative is to apply the Monte Carlo simulation method. The present work aims to present a methodology for safety verification and optimum design of pipelines with defects caused by corrosion. The methods used here to verify the safety of pipelines with corrosion defects use information from the deterministic analysis in their calculations. The choice of the deterministic method directly affects the calculation of the failure probability of the structure. To obtain the failure pressure load, the FEM is applied considering both physical and geometric nonlinearities; this is a costly simulation even for a single run. When a reliability analysis procedure using the FEM involves many random variables and gradient evaluations, computational time becomes critical and can even be prohibitive, depending on the case of study. Surrogate models are used here to overcome this problem. (author)
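For a linear limit state with Gaussian load and resistance, the FORM and Monte Carlo routes described above can be compared in closed form; a sketch with assumed distributions standing in for the FEM/surrogate burst-pressure model:

```python
import random, math

random.seed(0)
N = 500_000
failures = 0
for _ in range(N):
    resistance = random.gauss(15.0, 2.0)   # e.g. surrogate failure pressure, MPa
    load = random.gauss(10.0, 1.5)         # operating pressure, MPa
    if resistance <= load:
        failures += 1
pf_mc = failures / N

# FORM is exact for this linear-Gaussian limit state g = R - S:
beta = (15.0 - 10.0) / math.sqrt(2.0**2 + 1.5**2)   # reliability index
pf_form = 0.5 * math.erfc(beta / math.sqrt(2))       # Phi(-beta)
print(f"MC   Pf = {pf_mc:.4f}")
print(f"FORM Pf = {pf_form:.4f}  (beta = {beta:.2f})")
```

In the paper's setting each Monte Carlo sample would cost a nonlinear FEM run, which is exactly why a surrogate replaces the call inside the loop.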
Ronald Laurids Boring
This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.
Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune
In this paper, a fatigue reliability analysis of a Mono-tower platform is presented. The failure mode, fatigue failure in the butt welds, is investigated with two different models: one with the fatigue strength expressed through SN relations, the other with the fatigue strength expressed through linear-elastic fracture mechanics (LEFM). In determining the cumulative fatigue damage, Palmgren-Miner's rule is applied. Element reliability, as well as systems reliability, is estimated using first-order reliability methods (FORM). The sensitivity of the systems reliability to various parameters is investigated. The systems reliability index, estimated by using the fatigue elements with the fatigue strength expressed through SN relations, is found to be smaller than the systems reliability index estimated by using LEFM. It is shown that the systems reliability index is very sensitive to variations...
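Palmgren-Miner damage accumulation with an SN curve of the form N = C·S^(-m) can be sketched as follows (the SN constants and stress blocks are illustrative, not the platform's):

```python
def miner_damage(blocks, sn_C, sn_m):
    """Cumulative fatigue damage D = sum(n_i / N_i), where the SN curve
    gives the allowable cycles N = C * S**(-m) at stress range S.
    Failure is predicted when D reaches 1."""
    return sum(n / (sn_C * S ** (-sn_m)) for S, n in blocks)

# hypothetical stress blocks: (stress range in MPa, applied cycles per year)
blocks = [(80.0, 1.0e6), (120.0, 1.0e5), (160.0, 1.0e4)]
C, m = 1.0e13, 3.0        # assumed SN-curve constants

D_per_year = miner_damage(blocks, C, m)
print(f"annual damage: {D_per_year:.6f}")
print(f"predicted fatigue life: {1 / D_per_year:.1f} years")
```

In a reliability analysis such as the one above, C, m and the stress ranges would be random variables and D = 1 the limit state, rather than the fixed numbers used here.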
Williams, Poul Frederick; Nikolskaia, Macha; Rauzy, Antoine
In this note, we propose a Boolean Expression Diagram (BED)-based algorithm to compute the minimal p-cuts of Boolean reliability models such as fault trees. BEDs make it possible to bypass the Binary Decision Diagram (BDD) construction, which is the main cost of fault tree assessment.
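For very small trees, minimal cut sets can be found by brute-force enumeration, which makes the notion concrete (this is not the note's BED algorithm, just a reference check on a toy tree):

```python
from itertools import combinations

def minimal_cut_sets(top, events):
    """Brute-force minimal cut sets of a monotone fault tree, given as a
    function top(state) -> bool with state mapping event -> failed?
    Enumerates subsets by increasing size, keeping only minimal ones."""
    cuts = []
    for size in range(1, len(events) + 1):
        for combo in combinations(events, size):
            state = {e: e in combo for e in events}
            # keep combo only if it fails TOP and no smaller cut is inside it
            if top(state) and not any(set(c) <= set(combo) for c in cuts):
                cuts.append(combo)
    return cuts

# toy tree: TOP = A AND (B OR C)
events = ["A", "B", "C"]
top = lambda s: s["A"] and (s["B"] or s["C"])
print(minimal_cut_sets(top, events))   # [('A', 'B'), ('A', 'C')]
```

The exponential cost of this enumeration is precisely what BDD- and BED-based methods avoid on realistic fault trees.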
Gunn, J. M.
The TIGER algorithm, the inputs to the program and the output are described. TIGER is a computer program designed to simulate a system over a period of time to evaluate system reliability and availability. Results can be used in the Deep Space Network for initial spares provisioning and system evaluation.
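A TIGER-style availability estimate can be approximated with a simple up/down renewal simulation (exponential time-to-failure and repair durations are assumed; the MTBF/MTTR figures are invented):

```python
import random

def simulate_availability(mtbf, mttr, horizon, seed=0):
    """Monte Carlo simulation of alternating up/down cycles with
    exponential failure and repair times; returns the fraction of the
    horizon the system spends up."""
    rng = random.Random(seed)
    t, up_time = 0.0, 0.0
    while t < horizon:
        up = rng.expovariate(1.0 / mtbf)       # time to next failure
        up_time += min(up, horizon - t)        # clip at the horizon
        t += up
        if t >= horizon:
            break
        t += rng.expovariate(1.0 / mttr)       # repair duration (downtime)
    return up_time / horizon

A = simulate_availability(mtbf=1000.0, mttr=20.0, horizon=5e6)
print(f"simulated availability: {A:.4f}  (analytic: {1000/1020:.4f})")
```

The simulated value converges to the steady-state result MTBF/(MTBF+MTTR); simulation becomes worthwhile once spares, queues for repair crews, or mission-phase dependencies break the closed form.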
Physical-Mechanisms Based Reliability Analysis for Emerging Technologies; Ron Schrimpf, Vanderbilt University; report AFRL-AFOSR-VA-TR-2017-0095, grant FA9550-11-1-0307. ...in order to ensure technical superiority for US forces, but at the same time, levels of reliability exceeding those of commercial systems are required
Kirkegaard, Poul Henning; Enevoldsen, I.; Sørensen, John Dalsgaard
In this paper, a reliability analysis of a Mono-tower platform is presented. The failure modes considered are yielding in the tube cross sections and fatigue failure in the butt welds. The fatigue failure mode is investigated with a fatigue model, where the fatigue strength is expressed through SN relations. In determining the cumulative fatigue damage, Palmgren-Miner's rule is applied. Element reliability, as well as systems reliability, is estimated using first-order reliability methods (FORM). The sensitivity of the systems reliability to various parameters is investigated. It is shown...
Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.
Classical reliability assessment methods have predominantly focused on probability and statistical theories, which are insufficient in assessing the operational reliability of individual mechanical equipment with time-varying characteristics. A new approach to assess machinery operational reliability with normalized lifting wavelet entropy from condition monitoring information is proposed, which is different from classical reliability assessment methods depending on probability and statistics analysis. The machinery vibration signals with time-varying operational characteristics are firstly decomposed and reconstructed by means of a lifting wavelet package transform. The relative energy of every reconstructed signal is computed as an energy percentage of the reconstructed signal in the whole signal energy. Moreover, a normalized lifting wavelet entropy is defined by the relative energy to reveal the machinery operational uncertainty. Finally, operational reliability degree is defined by the quantitative value obtained by the normalized lifting wavelet entropy belonging to the range of [0, 1]. The proposed method is applied in the operational reliability assessment of the gearbox in an oxy-generator compressor to validate the effectiveness.
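The entropy-based reliability degree can be imitated with plain DFT band energies in place of the lifting wavelet packet transform; the mapping r = 1 − H/H_max below is an assumed stand-in for the paper's definition, and the signals are synthetic:

```python
import cmath, math, random

def band_energies(signal, n_bands=8):
    """Relative energy per frequency band (a DFT stand-in for the
    paper's lifting wavelet packet decomposition)."""
    N = len(signal)
    spec = [abs(sum(signal[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) ** 2 for k in range(N // 2)]
    width = len(spec) // n_bands
    bands = [sum(spec[i * width:(i + 1) * width]) for i in range(n_bands)]
    total = sum(bands)
    return [b / total for b in bands]

def reliability_degree(signal, n_bands=8):
    """1 minus the normalized Shannon entropy of the band energies:
    concentrated energy (healthy tone) -> near 1, spread energy -> near 0."""
    p = band_energies(signal, n_bands)
    H = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return 1.0 - H / math.log(n_bands)

rng = random.Random(3)
healthy = [math.sin(2 * math.pi * 5 * t / 128) for t in range(128)]   # one tone
degraded = [rng.gauss(0, 1) for _ in range(128)]                      # broadband
print(round(reliability_degree(healthy), 3), round(reliability_degree(degraded), 3))
```

A single dominant meshing tone keeps the entropy near zero, while broadband fault energy spreads it across bands, driving the degree toward zero.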
Guck, Dennis; Spel, Jip; Stoelinga, Mariëlle Ida Antoinette; Butler, Michael; Conchon, Sylvain; Zaïdi, Fatiha
Reliability, availability, maintenance and safety (RAMS) analysis is essential in the evaluation of safety critical systems like nuclear power plants and the railway infrastructure. A widely used methodology within RAMS analysis is fault trees, which represent failure propagation throughout a system.
Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
Xu, Z.; Yang, B.
In this project, the authors investigated the dynamics and reliability of a brake control system using a test bench which is a Lincoln Town Car brake system. The objectives of the project are to: 1) experimentally characterize the brake system; 2) obtain good nonlinear models of the brake system; 3) perform reliability analysis of the brake control system; and, 4) develop algorithms for brake malfunction detection and brake reliability enhancement. By using the brake test bench, the dynamic c...
Houghton, F.K.; Morzinski, J.
When performing a hazards analysis (HA) for a high consequence process, human error often plays a significant role in the hazards analysis. In order to integrate human error into the hazards analysis, a human reliability analysis (HRA) is performed. Human reliability is the probability that a person will correctly perform a system-required activity in a required time period and will perform no extraneous activity that will affect the correct performance. Even though human error is a very complex subject that can only approximately be addressed in risk assessment, an attempt must be made to estimate the effect of human errors. The HRA provides data that can be incorporated in the hazard analysis event. This paper will discuss the integration of HRA into a HA for the disassembly of a high explosive component. The process was designed to use a retaining fixture to hold the high explosive in place during a rotation of the component. This tool was designed as a redundant safety feature to help prevent a drop of the explosive. This paper will use the retaining fixture to demonstrate the following HRA methodology's phases. The first phase is to perform a task analysis. The second phase is the identification of the potential human functions, both cognitive and psychomotor, performed by the worker. During the last phase the human errors are quantified. In reality, the HRA process is an iterative process in which the stages overlap and information gathered in one stage may be used to refine a previous stage. The rationale for the decision to use or not use the retaining fixture and the role the HRA played in the decision will be discussed.
FDOT began design of a Surface Transportation Security and Reliability Information System Model Deployment in May 2003. This model deployment focuses on enhancing the security and reliability of the surface transportation system through the widesprea...
Varlataya, S. K.; Evdokimov, V. E.; Urzov, A. Y.
This article describes a process of calculating the reliability of a certain complex information security system (CISS), using the example of a technospheric security management model, as well as the ability to determine the frequency of its maintenance using the system reliability parameter, which allows one to assess man-made risks and to forecast natural and man-made emergencies. The relevance of this article is explained by the fact that CISS reliability is closely related to information security (IS) risks. Since reliability (or resiliency) is a probabilistic characteristic of the system showing the possibility of its failure (and, as a consequence, the emergence of threats to the protected information assets), it is seen as a component of the overall IS risk in the system. As is known, there is a certain acceptable level of IS risk assigned by experts for a particular information system; in case reliability is a risk-forming factor, maintaining an acceptable risk level should be carried out by routine analysis of the condition of the CISS and its elements and their timely service. The article presents a reliability parameter calculation for a CISS with a mixed type of element connection, and a formula for the dynamics of such a system's reliability is derived. The chart of CISS reliability change is an S-shaped curve that can be divided into 3 periods: an almost invariable high level of reliability, uniform reliability reduction, and an almost invariable low level of reliability. Setting the minimum acceptable level of reliability, the graph (or formula) can be used to determine the period of time during which the system would meet requirements. Ideally, this period should not be longer than the first period of the graph. Thus, the proposed method of calculating the CISS maintenance frequency helps to solve a voluminous and critical task of information assets risk management.
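Reading a maintenance deadline off such an S-shaped curve amounts to inverting a monotone function; a sketch with an assumed logistic decay (all constants are invented for illustration):

```python
import math

def reliability(t, R_hi=0.98, R_lo=0.30, t_mid=24.0, k=0.35):
    """Assumed S-shaped (logistic) reliability decay over t months:
    a high plateau, a uniform decline, then a low plateau."""
    return R_lo + (R_hi - R_lo) / (1.0 + math.exp(k * (t - t_mid)))

def service_interval(R_min, t_max=120.0, tol=1e-6):
    """Largest t with reliability(t) >= R_min, found by bisection
    (valid because reliability is monotone decreasing in t)."""
    lo, hi = 0.0, t_max
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if reliability(mid) >= R_min:
            lo = mid
        else:
            hi = mid
    return lo

t_service = service_interval(0.95)
print(f"service no later than {t_service:.1f} months")
```

With these constants the minimum acceptable reliability of 0.95 is crossed at roughly 15 months, i.e. still within the curve's first (high-plateau) period, matching the article's recommendation.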
Gardiner, M. [GL Industrial Services, Loughborough (United Kingdom); Mendes, Renato F.; Donato, Guilherme V.P. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)
Quantitative Risk Assessment (QRA) of pipelines requires two main components to be provided. These are models of the consequences that follow from some loss of containment incident, and models for the likelihood of such incidents occurring. This paper describes how PETROBRAS have used Structural Reliability Analysis for the second of these, to provide pipeline- and location-specific predictions of failure frequency for a number of pipeline assets. This paper presents an approach to estimating failure rates for liquid and gas pipelines, using Structural Reliability Analysis (SRA) to analyze the credible basic mechanisms of failure such as corrosion and mechanical damage. SRA is a probabilistic limit state method: for a given failure mechanism it quantifies the uncertainty in parameters to mathematical models of the load-resistance state of a structure and then evaluates the probability of load exceeding resistance. SRA can be used to benefit the pipeline risk management process by optimizing in-line inspection schedules, and as part of the design process for new construction in pipeline rights of way that already contain multiple lines. A case study is presented to show how the SRA approach has recently been used on PETROBRAS pipelines and the benefits obtained from it. (author)
This paper presents a method for Reliability Analysis of wind energy embedded in power generation system for Indian scenario. This is done by evaluating the reliability index, loss of load expectation, for the power generation system with and without integration of wind energy sources in the overall electric power system.
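Loss of load expectation for a small system can be computed exactly from a capacity outage probability table (the unit data below are illustrative; wind is crudely modeled as a single block with a high forced outage rate standing in for intermittency):

```python
from itertools import product

def lole(units, load_profile):
    """Loss of Load Expectation: expected number of periods in
    load_profile with available capacity below load.
    units: list of (capacity_MW, forced_outage_rate)."""
    total = 0.0
    for states in product([0, 1], repeat=len(units)):   # 1 = unit on outage
        prob, cap = 1.0, 0.0
        for (c, q), out in zip(units, states):
            prob *= q if out else (1 - q)
            cap += 0 if out else c
        total += prob * sum(1 for load in load_profile if cap < load)
    return total

# hypothetical system: 3 conventional units plus one wind block
conventional = [(200, 0.02), (200, 0.02), (150, 0.04)]
wind = (100, 0.60)
peak_loads = [480, 460, 440, 420, 400]      # daily peaks, MW

print("LOLE without wind:", round(lole(conventional, peak_loads), 6))
print("LOLE with wind:   ", round(lole(conventional + [wind], peak_loads), 6))
```

Even with a 60% effective outage rate, the wind block lowers the LOLE, which is the qualitative effect of embedding wind generation that the paper evaluates for the Indian system.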
Reliability analysis of the safety levels of the criteria for bending, shear, and deflection of singly reinforced concrete slabs has been carried out over the practical range of thicknesses 100 mm to 250 mm. The First Order Reliability Method was employed in the evaluation procedure for continuous slabs of equal spans as a ...
Diamond, P.; Payne, A. O.
The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.
DeMott, D. L.
Companies at risk of accidents caused by human error that result in catastrophic consequences include: airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies are used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element to developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how the human errors could occur such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk & reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determination of the primary risk factors contributing to the potential human error, a more detailed analysis method may be employed.
The output performance of the manufacturing system has a direct impact on mechanical product quality. To guarantee product quality and control production cost, many firms investigate the crucial issues of manufacturing-system reliability with small sample data, to evaluate whether the manufacturing system is capable or not. Existing reliability methods depend on a known probability distribution or vast test data. However, the population performance of complex systems becomes uncertain over processing time; namely, their probability distributions are unknown, so the existing methods are ineffective. This paper proposes a novel evaluation method based on poor information to settle the problem of assessing the reliability of the running state of a manufacturing system under small sample sizes with a known or unknown probability distribution. Via the grey bootstrap method, the maximum entropy principle, and a Poisson process, the experimental investigation of reliability evaluation for the running state of the manufacturing system shows that, at the best confidence level P = 0.95, if the reliability degree of achieving running quality is r > 0.65, the intersection area between the inspection data and the intrinsic data is A(T) > 0.3, and the variation probability of the inspection data is PB(T) ≤ 0.7, then the running state of the manufacturing system is reliable; otherwise, it is not reliable. A sensitivity analysis regarding sample size shows that the size of the samples has no effect on the evaluation results obtained by the method. The proposed evaluation method provides scientific decisions and suggestions for judging the running state of the manufacturing system reasonably, which is efficient, profitable, and organized.
National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...
"System reliability, availability and robustness are often not well understood by system architects, engineers and developers. They often don't understand what drives customer's availability expectations, how to frame verifiable availability/robustness requirements, how to manage and budget availability/robustness, how to methodically architect and design systems that meet robustness requirements, and so on. The book takes a very pragmatic approach of framing reliability and robustness as a functional aspect of a system so that architects, designers, developers and testers can address it as a concrete, functional attribute of a system, rather than an abstract, non-functional notion"--Provided by publisher.
Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to conventional human reliability analysis models and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper briefly describes the author's experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...
Background: According to the measurement literature, test reliability refers to the consistency of test results and shows whether the obtained score is a stable indication of the student's performance on a particular test. Reliability can be measured with different statistical formulas. Purpose: To determine the factors that influenced the reliability of 392 MCQ examinations. Methods: The correlation between the reliabilities of MCQ-based examinations and other characteristics of the tests, such as length, item difficulty, discrimination index, mean, standard deviation, and answering time, was calculated based on the data available in the examination center of Tehran University of Medical Sciences. Multivariate regression was used for data analysis. Results: Overall reliability of the teacher-made tests was at a satisfactory level in most cases. The mean reliability was 0.71 ± 0.15. Comparing earlier semesters with the latest series of examinations, some improvement was found over these years (P = 0.000 for the first semester, P = 0.002 for the second, P = 0.005 for the third, and P = 0.005 for the fourth semester). Keeping other variables fixed, the interaction of exam length with item difficulty showed a significant effect on test reliability. Compared with difficult and easy items, questions with a moderate difficulty index can increase reliability 8 times more than difficult items and 13 times more than easy items (P = 0.000). Conclusion: Our study showed that, with documentation of tests' metric features, analysis and evaluation of tests are within reach of the medical school. Key words: reliability, teacher-made test, reliability measurements
Brunett, Acacia; Bucknor, Matthew; Grabaskas, David; Sofu, Tanju; Grelle, Austin
The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.
Berzonskis, Arvydas; Sørensen, John Dalsgaard
One of the main challenges for the wind turbine industry currently is to reduce the cost of levelized energy, especially for offshore wind. Failures in the wind turbine drivetrain generally result in the second largest down times of the wind turbine, hence significantly increasing the cost of operation and maintenance. This paper considers the effect of initial defects in the volume of the casted ductile iron main shaft on the reliability of the component. The probabilistic reliability analysis conducted is based on fracture mechanics models. Additionally, the utilization of the probabilistic reliability for operation and maintenance planning and quality control is discussed.
A. V. Mamaev
The problem of protecting against deliberate information leaks is one of the most difficult. Integrated systems for protecting information against insiders have a serious drawback. Exploiting this weakness, an offender can steal information from a workstation without authorization.
Zhang, Xinzhou; Chen, Lan; Gan, Shuyuan; Wang, Yuan; Ren, Naifei; Zhou, Jian
The crankshaft, which is critical to performance and reliability, is a key component of a high-speed punch. In the manufacturing process, a high reliability value for the crankshaft contributes to increasing the reliability of the whole punch system. This study builds a reliability analysis model of the crankshaft for the punching process, treating the design parameters as random variables. The Monte Carlo method is employed to perform reliability analysis of the crankshaft based on ANSYS. The numerical results show that the failure probability of the crankshaft's strength and stiffness satisfies the use requirements. The impact of each input variable on reliability is obtained from a sensitivity analysis of the state function factors. The design variables D4 and PRXY have the greatest impact on the strength of the crankshaft, and the design variables YOUNG and L3 have the greatest impact on its stiffness; these results match the response surface graphs. The reliability analysis results provide useful information for improving the reliability of crankshafts for high-speed punches.
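As a rough illustration of the Monte-Carlo reliability approach this abstract describes, the sketch below samples random design variables and estimates failure probabilities for strength and stiffness limit states. All distributions, limit-state functions, and numbers are invented for illustration; they are not the study's ANSYS model or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative random design variables (assumed, not the study's):
# Young's modulus E [MPa] and applied load F [kN], both normal.
E = rng.normal(2.1e5, 1.0e4, n)
F = rng.normal(50.0, 5.0, n)

# Hypothetical limit-state functions; g(x) >= 0 means "safe".
g_strength = 400.0 - 6.0 * F            # allowable minus computed stress [MPa]
g_stiffness = 0.5 - 3.0e2 * F / E       # allowable minus computed deflection [mm]

pf_strength = np.mean(g_strength < 0)
pf_stiffness = np.mean(g_stiffness < 0)
print(f"P(failure, strength)  ~ {pf_strength:.5f}")
print(f"P(failure, stiffness) ~ {pf_stiffness:.5f}")

# Crude sensitivity measure: correlation of each input with the margin.
print("corr(E, stiffness margin):", np.corrcoef(E, g_stiffness)[0, 1])
```

The same sampling loop extends directly to more variables; ranking the input-to-margin correlations gives the kind of sensitivity ordering the abstract reports for D4, PRXY, YOUNG, and L3.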
Henriksen, Marius; Lund, Hans; Moe-Nilssen, R
The purpose of this study was to determine the test-retest reliability of a trunk accelerometric gait analysis in healthy subjects. Accelerations were measured during walking using a triaxial accelerometer mounted on the lumbar spine of the subjects. Six men and 14 women participated (mean age 35.2; range 18-...). The findings indicate that trunk accelerometry has a definite potential in clinical gait analysis.
Lassing, B.L.; Vrouwenvelder, A.C.W.M.; Waarts, P.H.
In recent years an advanced program for reliability analysis of dike systems has been under development in the Netherlands. This paper describes the global data requirements for application and the set-up of the models in the Netherlands. The analysis generates an estimate of the probability of...
Mirhaghi, Amir; Heydari, Abbas; Mazlom, Reza; Hasanzadeh, Farzaneh
Although triage systems based on the Emergency Severity Index (ESI) have many advantages in terms of simplicity and clarity, previous research has questioned their reliability in practice. Therefore, the aim of this meta-analysis was to determine the reliability of ESI triage scales. This meta-analysis was performed in March 2014. Electronic research databases were searched and articles conforming to the Guidelines for Reporting Reliability and Agreement Studies were selected. Two researchers independently examined selected abstracts. Data were extracted in the following categories: version of scale (latest/older), participants (adult/paediatric), raters (nurse, physician or expert), method of reliability (intra/inter-rater), reliability statistics (weighted/unweighted kappa) and the origin and publication year of the study. The effect size was obtained by the Z-transformation of reliability coefficients. Data were pooled with random-effects models and a meta-regression was performed based on the method of moments estimator. A total of 19 studies from six countries were included in the analysis. The pooled coefficient for the ESI triage scales was substantial at 0.791 (95% confidence interval: 0.787-0.795). Agreement was higher with the latest and adult versions of the scale and among expert raters, compared to agreement with older and paediatric versions of the scales and with other groups of raters, respectively. ESI triage scales showed an acceptable level of overall reliability. However, ESI scales require more development in order to see full agreement from all rater groups. Further studies concentrating on other aspects of reliability assessment are needed.
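The pooling procedure this abstract outlines (Z-transformation of reliability coefficients, random-effects model via the method of moments) can be sketched as follows. The kappa values and sample sizes are invented, and treating var(z) ≈ 1/(n−3) is a rough approximation borrowed from the correlation-coefficient case, not the meta-analysis's exact variance model.

```python
import math

# Illustrative per-study agreement coefficients (e.g. weighted kappa)
# and rating sample sizes; invented, not the meta-analysis's data.
kappas = [0.72, 0.81, 0.65, 0.90, 0.78]
n_obs  = [50, 120, 80, 200, 60]

# Fisher Z-transformation stabilizes the variance of correlation-like
# coefficients; here var(z) is approximated as 1/(n-3).
z = [math.atanh(k) for k in kappas]
w = [n - 3 for n in n_obs]

# Fixed-effect pooled estimate in z-space, back-transformed with tanh
# (the inverse Fisher transform).
z_fe = sum(wi * zi for wi, zi in zip(w, z)) / sum(w)
pooled_fe = math.tanh(z_fe)

# DerSimonian-Laird moment estimator of the between-study variance tau^2.
q = sum(wi * (zi - z_fe) ** 2 for wi, zi in zip(w, z))
c = sum(w) - sum(wi * wi for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(z) - 1)) / c)

# Random-effects weights fold tau^2 into each study's variance.
w_re = [1.0 / (1.0 / wi + tau2) for wi in w]
z_re = sum(wi * zi for wi, zi in zip(w_re, z)) / sum(w_re)
pooled_re = math.tanh(z_re)

print(f"fixed-effect pooled coefficient:   {pooled_fe:.3f}")
print(f"random-effects pooled coefficient: {pooled_re:.3f}")
```

The random-effects estimate down-weights large studies when between-study heterogeneity (tau²) is non-zero, which is why meta-analyses of rater agreement across scale versions and rater groups typically prefer it.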
Conclusion: According to our results, the websites intended to attract patients searching for information regarding femoroacetabular impingement provide a highly accessible, readable information source, but do not appear to apply an amount of rigor comparable to that of the scientific literature or healthcare practitioner websites in matters such as citing sources for information, supplying methodology, and including a publication date. This indicates that while these resources are easily accessed by patients, there is potential for them to be a source of misinformation.
Snijders, C.; van der Schaaf, T.; Klip, H.; van Lingen, R.A.; Fetter, W.P.F.; Molendijk, H.A.
Aims and objectives: In this study, the feasibility and reliability of the Prevention Recovery Information System for Monitoring and Analysis (PRISMA)-Medical method for systematic, specialty-based analysis and classification of incidents in the neonatal intensive care unit (NICU) were determined.
Spiliotopoulou, E.; Donohue, K.; Gurbuz, M.C.
Inventory decisions made at a centralized level often rely on demand forecast information passed from regional managers within a supply chain. Such managers often have unique insights into the demand patterns at their local sites that can help inform how much inventory to order for the system as a whole...
Thomson, Edward; Lamb, Kevin; Nicholas, Ceri
The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.
A new method for structural reliability analysis using an orthogonalizable power polynomial basis vector is presented. First, a power polynomial basis vector is adopted to express the initial series solution of the structural response, which is determined by a series of deterministic recursive equations based on the perturbation technique, and is then transformed into a set of orthogonalizable power polynomial basis vectors using an orthogonalization technique. By conducting a Galerkin projection, an accelerating factor vector of the orthogonalizable power polynomial expansion is determined by solving small-scale algebraic equations. Numerical results for the reliability analysis of a continuous bridge structure show that the proposed method achieves the accuracy of the direct Monte Carlo method while saving considerable computation time; it is both accurate and efficient and is very competitive for use in structural reliability analysis.
This paper investigates the reliability analysis of wireless sensor networks whose topology switches among possible connections governed by a Markovian chain. We give the quantitative relations between network topology, data acquisition rate, nodes' calculation ability, and network reliability. By applying the Lyapunov method, sufficient conditions for network reliability are proposed for such topology-switching networks with constant or varying data acquisition rates. When these conditions are satisfied, the quantity of data transported through a wireless network node will not exceed the node's capacity, so that reliability is ensured. Our theoretical work helps provide a deeper understanding of real-world wireless sensor networks and may find application in the fields of network design and topology control.
Umair, Rafia; Shahid, Kamal; Olsen, Rasmus Løvenstein
To govern the production from all ReGen plants, and thereby the voltages in the distribution grid, an effective control system is required. For this, control messages must be exchanged between the grid assets with reliable information to achieve optimum efficiency. This raises the challenge of assessing information reliability and evaluating the performance of a controller. Therefore, considering the dynamic nature of information, this paper analyzes information reliability in terms of correct and timely delivery of message signals for remote control of a WPP using IEC-61850 MMS in a smart grid scenario...
Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard
This paper presents a probabilistic approach to structural robustness assessment for a timber structure built a few years ago. The robustness analysis is based on a structural reliability framework for robustness and a simplified mechanical system model of a timber truss system. A complex timber structure with a large number of failure modes is modelled with only a few dominant failure modes. First, a component-based robustness analysis is performed based on the reliability indices of the remaining elements after the removal of selected critical elements. The robustness...
Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo
Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications, providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, and stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical...
Dunlap, Aimee S; Nielsen, Matthew E; Dornhaus, Anna; Papaj, Daniel R
Many animals, including insects, make decisions using both personally gathered information and social information derived from the behavior of other, usually conspecific, individuals. Moreover, animals adjust use of social versus personal information appropriately under a variety of experimental conditions [2-5]. An important factor in how information is used is the information's reliability, that is, how consistently the information is correlated with something of relevance in the environment. The reliability of information determines which signals should be attended to during communication [6-9], which types of stimuli animals should learn about, and even whether learning should evolve [10, 11]. Here, we show that bumble bees (Bombus impatiens) account for the reliability of personally acquired information (which flower color was previously associated with reward) and social information (which flowers are chosen by other bees) in making foraging decisions; however, the two types of information are not treated equally. Bees prefer to use social information if it predicts a reward at all, but if social information becomes entirely unreliable, flower color will be used instead. This greater sensitivity to the reliability of social information, and avoidance of conspecifics in some cases, may reflect the specific ecological circumstances of bee foraging. Overall, the bees' ability to make decisions based on both personally acquired and socially derived information, and the relative reliability of both, demonstrates a new level of sophistication and flexibility in animal, particularly insect, decision-making. Copyright © 2016 Elsevier Ltd. All rights reserved.
The quality of Vessel Traffic Management and Information Systems depends on the quality of all subsystems, in particular the quality of the control centers. The most commonly used quantitative indicators of control-center quality are reliability, availability, safety, and system failure. Therefore, a reliability block diagram and a reliability/availability model (Markov model) have been created in this paper, and a detailed analysis and calculation of the quantitative indicators of the critical components (servers) of the control center have been performed. High-quality functioning of the control centers will enable the gathering, processing, storing, and dissemination of timely, safe, and reliable data and information to the services in charge of monitoring and managing maritime traffic.
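A minimal sketch of the kind of Markov availability model mentioned above, for a single repairable server with constant failure and repair rates; the rates are illustrative assumptions, not the paper's figures for the actual control-center servers.

```python
import numpy as np

# Illustrative rates (per hour) for one server; assumed, not the paper's.
lam = 1 / 1000.0   # failure rate: one failure per 1000 h on average
mu = 1 / 8.0       # repair rate: 8 h mean time to repair

# Two-state continuous-time Markov chain: state 0 = up, state 1 = down.
# Generator matrix Q; steady-state probabilities pi solve pi @ Q = 0.
Q = np.array([[-lam, lam],
              [mu, -mu]])

# Solve pi @ Q = 0 together with the normalization sum(pi) = 1.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0]
print(f"steady-state availability: {availability:.6f}")

# Cross-check with the closed form A = mu/(lam+mu) = MTBF/(MTBF+MTTR).
print(f"closed form:               {mu / (lam + mu):.6f}")
```

Redundant server pairs or repair-crew limits enlarge the state space, but the same "solve pi Q = 0 with normalization" pattern carries over unchanged.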
Thiago Henrique de Lima
The objective of the present study was to identify and evaluate the content of information about dengue available on Brazilian websites. Thirty-two websites were selected for analysis. For the evaluation of the content of information about dengue, a form was prepared with 16 topics grouped into six information blocks: etiology/transmission, vector, control and prevention, disease/diagnosis, treatment, and epidemiology. The websites were also evaluated according to the following criteria: authorship, updating, language, interactivity, scientific basis, and graphic elements. The results showed a predominant lack of information on the topics analyzed in each information block. Regarding the technical quality of the websites, only 28.1% showed some indication of a scientific basis, and 34.3% contained the date of publication or of the last update. These results attest to the low reliability of the selected websites. Knowing that the internet is an efficient mechanism for disseminating information on health topics, we conclude that mechanisms for disseminating correct and comprehensive information about dengue are needed in order to apply this useful tool to the prevention and control of the disease in Brazil.
K. S. NG
The stone column is an effective ground improvement method for weak ground. This paper describes the implementation of reliability-based analysis of the consolidation behaviour of stone column reinforced ground. The Hasofer-Lind reliability index is computed involving uncorrelated normal random variables, which include stone column diameter, coefficient of volume compressibility, coefficient of consolidation, and stress concentration ratio. The sensitivity of these variables to the effect of consolidation settlement is investigated in this study. Results show the importance of considering spatial variability in the design and analysis of stone column reinforced ground. The probabilities of failure inferred from the reliability indices are compared with Monte Carlo simulation, where good agreement is obtained.
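For uncorrelated normal variables and a linear limit state, the Hasofer-Lind index described above has a closed form, which the sketch below computes and cross-checks against Monte Carlo simulation, mirroring the paper's comparison. The limit state (capacity minus demand) and all parameter values are hypothetical, not the paper's stone-column variables.

```python
import numpy as np
from math import erf, sqrt

# Hypothetical linear limit state g = R - S with uncorrelated normals
# (capacity R, demand S); values are illustrative only.
mu_R, sd_R = 120.0, 12.0   # e.g. allowable settlement margin
mu_S, sd_S = 80.0, 10.0    # e.g. predicted consolidation settlement

# For a linear limit state with normal variables, the Hasofer-Lind
# index is the mean margin divided by the margin's standard deviation.
beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)
pf = 0.5 * (1 - erf(beta / sqrt(2)))   # Pf = Phi(-beta)
print(f"beta = {beta:.3f}, Pf = {pf:.3e}")

# Monte Carlo cross-check.
rng = np.random.default_rng(1)
n = 1_000_000
g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
print(f"Pf (Monte Carlo) = {np.mean(g < 0):.3e}")
```

For the nonlinear consolidation limit state of the paper, beta would instead be found iteratively (e.g. by the HL-RF algorithm) as the minimum distance from the origin to the limit-state surface in standard normal space; the linear case above is the degenerate closed-form instance.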
Sørensen, John Dalsgaard
Reliability analysis and probabilistic models for wind turbines are considered, with special focus on structural components and application to reliability-based calibration of partial safety factors. The main design load cases to be considered in the design of wind turbine components are presented, including the effects of the control system and possible faults due to failure of electrical/mechanical components. Models for extreme and fatigue limit states are presented. Considerations are presented on the target reliability level for wind turbine structural components. Operation and maintenance planning often follows corrective and preventive strategies based on information from condition monitoring and structural health monitoring systems; a reliability- and risk-based approach is presented in which a life-cycle approach is used. Application is shown for reliability-based calibration of partial safety factors...
The amount of digital data we produce every day far surpasses our ability to process this data, and finding useful information in this constant flow of data has become one of the major challenges of the 21st century. Search engines are one way of accessing large data collections. Their algorithms...
Schmid, Benjamin; Karg, Katja; Perner, Josef; Tomasello, Michael
Social animals frequently rely on information from other individuals. This can be costly if the other individual is mistaken or even deceptive. Human infants below 4 years of age show proficiency in their reliance on differently reliable informants. They can infer the reliability of an informant from a few interactions and use that assessment in later interactions with the same informant in a different context. To explore whether great apes share that ability, we confronted great apes with a reliable or unreliable informant in an object choice task, to see whether that would affect their gaze following behaviour in response to the same informant in a subsequent task. In our study, prior reliability of the informant and habituation during the gaze following task affected both the great apes' automatic gaze following response and their more deliberate response of gaze following behind barriers. As habituation is very context specific, it is unlikely that habituation in the reliability task affected the gaze following task. Rather, it seems that apes employ a reliability-tracking strategy that results in a general avoidance of additional information from an unreliable informant.
...captured by a safety-factor-based approach due to the intricate nonlinear relationships between the system parameters and the natural frequencies. For these reasons a scientific and systematic approach is required to predict the probability of failure of a structure at the design stage. Probabilistic structural reliability analysis...
William, R. K.; Stillwell, A. S.
Urban environments continue to face the challenges of localized flooding and decreased water quality brought on by the increasing amount of impervious area in the built environment. Green infrastructure provides an alternative to conventional storm sewer design by using natural processes to filter and store stormwater at its source. However, there are currently few consistent standards available in North America to ensure that installed green infrastructure is performing as expected. This analysis offers a method for characterizing green roof failure using a visual aid commonly used in earthquake engineering: fragility curves. We adapted the concept of the fragility curve based on the efficiency in runoff reduction provided by a green roof compared to a conventional roof under different storm scenarios. We then used the 2D distributed surface water-groundwater coupled model MIKE SHE to model the impact that a real green roof might have on runoff in different storm events. We then employed a multiple regression analysis to generate an algebraic demand model that was input into the Matlab-based reliability analysis model FERUM, which was then used to calculate the probability of failure. The use of reliability analysis as a part of green infrastructure design code can provide insights into green roof weaknesses and areas for improvement. It also supports the design of code that is more resilient than current standards and is easily testable for failure. Finally, the understanding of reliability of a single green roof module under different scenarios can support holistic testing of system reliability.
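Fragility curves of the kind adapted above are commonly represented as lognormal CDFs of a demand intensity measure. The sketch below evaluates such a curve for a green roof, with "failure" meaning runoff reduction falling below target; the median capacity and dispersion are invented for illustration and are not outputs of the MIKE SHE/FERUM analysis.

```python
from math import log, erf, sqrt

def lognormal_fragility(x, median, beta):
    """P(failure | demand x) for a lognormal fragility curve with
    median capacity `median` and logarithmic dispersion `beta`."""
    if x <= 0:
        return 0.0
    z = (log(x) - log(median)) / beta
    return 0.5 * (1 + erf(z / sqrt(2)))

# Hypothetical green-roof fragility: median capacity at a 60 mm storm
# depth, dispersion 0.4 (both assumed).
for depth_mm in (20, 40, 60, 80, 100):
    p = lognormal_fragility(depth_mm, median=60.0, beta=0.4)
    print(f"storm depth {depth_mm:3d} mm -> P(failure) = {p:.3f}")
```

In practice, the median and dispersion are fitted from many simulated storm scenarios (the role of the regression-derived demand model and FERUM in the analysis above), after which the curve summarizes failure probability across the whole storm spectrum.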
Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Huang, Dongli [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gleicher, Frederick [Idaho National Lab. (INL), Idaho Falls, ID (United States); Wang, Bei [Idaho National Lab. (INL), Idaho Falls, ID (United States); Adbel-Khalik, Hany S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pascucci, Valerio [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)
This report collects the efforts performed to improve the reliability analysis capabilities of the RAVEN code and to explore new opportunities in the usage of surrogate models, by extending the current RAVEN capabilities to multi-physics surrogate models and the construction of surrogate models for high-dimensionality fields.
The focus of the NASA 'Space Flight Risk Data Collection and Analysis' project was to acquire and evaluate space flight data with the express purpose of establishing a database containing measurements of specific risk assessment - reliability - availability - maintainability - supportability (RRAMS) parameters. The developed comprehensive RRAMS database will support the performance of future NASA and aerospace industry risk and reliability studies. One of the primary goals has been to acquire unprocessed information relating to the reliability and availability of launch vehicles and the subsystems and components thereof from the 45th Space Wing (formerly Eastern Space and Missile Command -ESMC) at Patrick Air Force Base. After evaluating and analyzing this information, it was encoded in terms of parameters pertinent to ascertaining reliability and availability statistics, and then assembled into an appropriate database structure.
Nishihara, Takuo; Tomita, Seiji [Nippon Telegraph and Telephone Corp., Yokosuka, Kanagawa (Japan). Information and Communication Systems Labs.
In this paper, we propose a system configuration suitable for hard real-time systems in which the integrity and durability of information are important. On most hard real-time systems, where response-time constraints are critical, the data that programs access are volatile and may be lost if the system goes down. But for some real-time systems, e.g. value-added intelligent network (IN) systems, the integrity and durability of the stored data are very important. We propose a distributed system configuration for such hard real-time systems, comprised of service control modules and data management modules. The service control modules process transactions and responses based on deadline control, and the data management modules handle the stored data based on information recovery schemes well established in fault-tolerant real-time systems. (author)
The goal of the research presented in this report is to propose, develop and test an integrated reliability analysis to optimise the maintenance strategies of the railway industry. This integrated analysis applies traditional statistical theories as well as Bayesian statistics using Markov Chain Monte Carlo (MCMC) methodologies. Using Bayesian inference leads to greater flexibility because such analysis can simultaneously accommodate the following: • small sample data; • incomplete data sets...
With the increasing pervasiveness, prevalence and severity of cybercrimes, various metrics, measures and statistics have been developed and used to measure various aspects of this phenomenon. Cybercrime-related data, metrics, and information, however, pose important and difficult dilemmas regarding reliability, validity, comparability and practical utility. While many of the issues of the cybercrime economy are similar to those of other underground and underworld industries, this economy also has various unique aspects. For one thing, the industry suffers from a problem partly rooted in the incredibly broad definition of the term "cybercrime". This article seeks to provide insights and analysis into this phenomenon, which is expected to advance our understanding of cybercrime-related information.
Menicucci, David F. (Building Specialists, Inc., Albuquerque, NM)
Utilities are overseeing the installation of thousands of solar hot water (SHW) systems. Utility planners have begun to ask for quantitative measures of the expected lifetimes of these systems so that they can properly forecast their loads. This report, which augments a 2009 reliability analysis effort by Sandia National Laboratories (SNL), addresses this need. Additional reliability data have been collected, added to the existing database, and analyzed. The results are presented. Additionally, formal reliability theory is described, including the bathtub curve, which is the most common model for characterizing the lifetime reliability of systems and for predicting failures in the field. Reliability theory is used to assess the SNL reliability database. This assessment shows that the database is heavily weighted with data that describe the reliability of SHW systems early in their lives, during the warranty period, but contains few measured data describing the ends of SHW systems' lives. End-of-life data are the most critical for defining the reliability of SHW systems sufficiently to answer the questions that the utilities pose. Several ideas are presented for collecting the required data, including photometric analysis of aerial photographs of installed collectors, statistical and neural network analysis of energy bills from solar homes, and the development of simple algorithms to allow conventional SHW controllers to announce system failures and record the details of the event, similar to how aircraft black-box recorders perform. Some information is also presented about public expectations for the longevity of a SHW system, information that is useful in developing reliability goals.
Ronald L. Boring
This paper addresses the fact that existing human reliability analysis (HRA) methods do not provide guidance on digital human-machine interfaces (HMIs). Digital HMIs are becoming ubiquitous in nuclear power operations, whether through control room modernization or new-build control rooms. Legacy analog technologies like instrumentation and control (I&C) systems are costly to support, and vendors no longer develop or support analog technology, which is considered technologically obsolete. Yet, despite the inevitability of digital HMI, no current HRA method provides guidance on how to treat human reliability considerations for digital technologies.
Why a book on fault-tolerant search algorithms? Searching is one of the fundamental problems in computer science. Time and again algorithmic and combinatorial issues originally studied in the context of search find application in the most diverse areas of computer science and discrete mathematics. On the other hand, fault-tolerance is a necessary ingredient of computing. Due to their inherent complexity, information systems are naturally prone to errors, which may appear at any level - as imprecisions in the data, bugs in the software, or transient or permanent hardware failures. This book pr
A.A.J. (Jos) van Helvoort; Frank Huysmans; Saskia Brand-Gruwel; Ellen Sjoer
Purpose: The main purpose of the research was to measure the reliability and validity of the Scoring Rubric for Information Literacy (Van Helvoort, 2010). Design/methodology/approach: Percentages of agreement and Intraclass Correlation were used to describe interrater reliability. For the determination...
This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.
Berzonskis, Arvydas; Sørensen, John Dalsgaard
One of the main challenges for the wind turbine industry currently is to reduce the cost of levelized energy, especially for offshore wind. Failures in the wind turbine drivetrain generally result in the second largest down times of the wind turbine, hence significantly increasing the cost of operation and maintenance. The manufacturing of casted drivetrain components, like the main shaft of the wind turbine, commonly results in many smaller defects through the volume of the component, with sizes that depend on the manufacturing method. This paper considers the effect of the initial defects present in the volume of the casted ductile iron main shaft on the reliability of the component. The probabilistic reliability analysis conducted is based on fracture mechanics models. Additionally, the utilization of the probabilistic reliability for operation and maintenance planning and quality control is discussed.
Current formulas for calculating scour depth below a free overfall are mostly deterministic in nature and do not adequately consider the uncertainties of the various scouring parameters. A reliability-based assessment of scour, taking into account the uncertainties of the parameters and coefficients involved, should therefore be performed. This paper studies the reliability of a dam foundation under the threat of scour. A model for calculating the reliability of scour and estimating the probability of failure of the dam foundation subjected to scour is presented. The Maximum Entropy Method is applied to construct the probability density function (PDF) of the performance function subject to the moment constraints. Monte Carlo simulation (MCS) is applied for uncertainty analysis. An example is considered, the reliability of its scour is computed, and the influence of various random variables on the probability of failure is analyzed.
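The Monte Carlo step described above can be sketched as follows. The scour-depth formula, the parameter distributions, and the allowable limit below are illustrative assumptions, not the paper's actual model; the Maximum Entropy PDF construction itself is beyond this sketch.

```python
import random

def scour_depth(q, h, d50, c):
    """Hypothetical scour-depth formula: depth grows with unit discharge q
    and drop height h and decreases with median grain size d50 (illustrative)."""
    return c * (q ** 0.54) * (h ** 0.225) / (d50 ** 0.42)

def failure_probability(n=20_000, limit=55.0, seed=1):
    """Monte Carlo estimate of P(scour depth exceeds the allowable limit)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        q = rng.gauss(10.0, 1.5)                 # unit discharge, assumed normal
        h = rng.gauss(5.0, 0.5)                  # drop height
        d50 = max(rng.gauss(0.02, 0.003), 1e-4)  # median grain size, clipped > 0
        c = rng.gauss(1.9, 0.1)                  # empirical coefficient
        if scour_depth(q, h, d50, c) > limit:
            failures += 1
    return failures / n
```

`failure_probability()` returns the fraction of sampled parameter sets whose computed scour depth exceeds the limit, i.e. the estimated probability of failure of the foundation under this assumed limit state.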
Mohit Kumar Kakkar
The aim of this work is to present a reliability and profit analysis of a two-dissimilar-unit parallel system under the assumptions that the operative unit cannot fail after post-repair inspection and replacement and that there is only one repair facility. Failure and repair times of each unit are assumed to be uncorrelated. Using the regenerative point technique, various reliability characteristics are obtained which are useful to system designers and industrial managers. The graphical behaviour of the mean time to system failure (MTSF) and the profit function has also been studied. In this paper, some important reliability characteristics of a two-non-identical-unit standby system model with repair, inspection, and post-repair are obtained using the regenerative point technique.
Before starting and also during the exploitation of various systems, it is very important to know how the system and its parts will behave during operation regarding breakdowns, i.e. failures. It is possible to predict the service behaviour of a system by determining the functions of reliability, as well as the frequency and intensity of failures. The paper considers the theoretical basics of the functions of reliability, frequency and intensity of failures for two main approaches. One includes 6 equal intervals and the other 13 unequal intervals for a concrete case taken from practice. The reliability of the "alternator - alternator belt" system installed in buses has been analysed, according to the empirical data on failures. The empirical data on failures provide the empirical functions of reliability and of the frequency and intensity of failures, which are presented in tables and graphically. The first analysis, performed by dividing the mean time between failures into 6 equal time intervals, has given forms of the empirical functions of failure frequency and intensity that approximately correspond to the typical functions. By dividing the failure phase into 13 unequal intervals with two failures in each interval, these functions indicate explicit transitions from the early failure interval into the random failure interval, i.e. into the ageing interval. The functions thus obtained are more accurate and represent a better solution for the given case. In order to estimate the reliability of these systems with greater accuracy, a greater number of failures needs to be analysed.
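The empirical functions described above can be sketched as follows. The grouping into k equal intervals mirrors the paper's first approach; the function name and the sample data are hypothetical, and the three per-interval quantities are the failure frequency density, the reliability at the interval end, and the failure intensity (hazard).

```python
def empirical_reliability(times, k=6):
    """Group observed failure times into k equal intervals and compute, per
    interval: failure frequency density f(t), reliability R(t) at the interval
    end, and failure intensity (hazard) lambda(t)."""
    n = len(times)
    width = max(times) / k
    counts = [0] * k
    for t in times:
        counts[min(int(t / width), k - 1)] += 1  # last interval catches t_max
    results = []
    survivors = n
    for i in range(k):
        f = counts[i] / (n * width)                                   # f(t)
        lam = counts[i] / (survivors * width) if survivors else 0.0   # lambda(t)
        survivors -= counts[i]
        R = survivors / n                                             # R(t)
        results.append((round(f, 4), round(R, 4), round(lam, 4)))
    return results
```

With real failure data, plotting lambda(t) over the intervals is what reveals the early-failure, random-failure, and ageing phases discussed in the paper.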
Utkin, V. L.
Tests with cardioleader to control the physical, technical and tactical preparedness of athletes in cyclic types of sports are discussed. Ways of increasing the reliability and information content of the tests were studied.
Travel time reliability information includes static data about traffic speeds or trip times that capture historic variations from day to day, and it can help individuals understand the level of variation in traffic. Unlike real-time travel time infor...
Analysis of the impact of scholarly artifacts is constrained by current unreliable practices in cross-referencing, citation discovering, and citation indexing and analysis, which have not kept pace with the technological advances that are occurring in several areas like knowledge management and security. Because citation analysis has become the primary component in scholarly impact factor calculation, and considering the relevance of this metric within both the scholarly publishing value chain and (especially important) the professional curriculum evaluation of scholarly professionals, we argue that current practices need to be revised. This paper describes a reference architecture that aims to provide openness and reliability to the citation-tracking lifecycle. The solution relies on the use of digitally signed semantic metadata in the different stages of the scholarly publishing workflow, in such a manner that authors, publishers, repositories, and citation-analysis systems will have access to independent, reliable evidence that is resistant to forgery, impersonation, and repudiation. As far as we know, this is the first paper to combine Semantic Web technologies and public-key cryptography to achieve reliable citation analysis in scholarly publishing.
Piqueras, Jose A; Martín-Vivar, María; Sandin, Bonifacio; San Luis, Concepción; Pineda, David
Anxiety and depression are among the most common mental disorders during childhood and adolescence. Among the instruments for the brief screening assessment of symptoms of anxiety and depression, the Revised Child Anxiety and Depression Scale (RCADS) is one of the more widely used. Previous studies have demonstrated the reliability of the RCADS for different assessment settings and different versions. The aims of this study were to examine the mean reliability of the RCADS and the influence of the moderators on the RCADS reliability. We searched in EBSCO, PsycINFO, Google Scholar, Web of Science, and NCBI databases and other articles manually from lists of references of extracted articles. A total of 146 studies were included in our meta-analysis. The RCADS showed robust internal consistency reliability in different assessment settings, countries, and languages. We only found that reliability of the RCADS was significantly moderated by the version of RCADS. However, these differences in reliability between different versions of the RCADS were slight and can be due to the number of items. We did not examine factor structure, factorial invariance across gender, age, or country, and test-retest reliability of the RCADS. The RCADS is a reliable instrument for cross-cultural use, with the advantage of providing more information with a low number of items in the assessment of both anxiety and depression symptoms in children and adolescents. Copyright © 2017. Published by Elsevier B.V.
Kumar, Naveen; Komal; Lather, J. S.
In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, then the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique. With this technique, fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for the system modeling, the lambda-tau method is utilized to formulate mathematical expressions for the failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with those of the existing technique. The components of the robotic system are assumed to follow an exponential distribution, i.e., constant failure rates. A sensitivity analysis is also performed, and the impact on the system mean time between failures (MTBF) is addressed by varying the other reliability parameters. Based on the analysis, some influential suggestions are given to improve the system performance.
Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)]
The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, is gradually increased by the effects of human errors on system safety. HRA requires a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of the task analysis inconsistent and unreliable. To remedy this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting the performance of HRA using the SIA method. Additionally, through applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. If more analyses are performed and more data are accumulated in the CASIA's database, HRA analyzers can freely share and smoothly spread their analysis experiences, and thereby the quality of the HRA analysis will be improved. 35 refs., 38 figs., 25 tabs. (Author)
Complex system performance reliability prediction is one of the means to understand the reliability level of complex systems, make maintenance decisions, and guarantee the safety of operation. Using complex system condition monitoring information and a support vector machine, the paper aims to provide an evaluation of the degradation of complex system performance. With the degradation assessment results as input variables, the prediction model of reliability is established as a Wiener random process. Taking an aircraft engine as an example, the effectiveness of the proposed method is verified in the paper.
Vallee, F.; Lobry, J.; Deblecker, O. [Faculte Polytechnique de Mons, Mons (Belgium)
An analysis of the future trends in the reliability of the electricity supply in Belgium was presented. In an effort to meet the Kyoto obligations to reduce carbon dioxide emissions, the Belgian government is promoting the use of renewable energy in its electricity supply mix. Among its policies, the government plans to end its nuclear program by 2030 and develop natural gas based thermal plants. It is expected that 6 per cent of the electrical production in Belgium will come from wind. However, that energy source is highly variable, and the risk of fluctuating active power at the output of a wind generator could threaten the reliability of the electrical supply. For that reason, this study applied a reliability index to a modified Roy Billinton Test System (RBTS). This study also demonstrated the absolute need for nuclear alternatives and classical supply, such as natural gas based thermal plants, to maintain the reliability of the electricity supply in Belgium. In addition to these changes in the electricity supply mix, projections have indicated an increase in electricity consumption. The main purpose of this study was to point out the impacts of those expected modifications on the reliability indexes of the Belgian electrical network. The proposed results were obtained by MATLAB simulations performed on the RBTS network, which was modified to meet Belgian production plans. It was estimated that even with the 1 GW wind potential planned for 2015, an additional 4.5 GW of capacity is needed in cogeneration and combined cycle gas units in order to maintain reliability of the power supply. 8 refs., 5 tabs., 11 figs.
Peltier, Thomas R
Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index
The problem of large amounts of carbon emissions causes wide concern across the world, and it has become a serious threat to the sustainable development of the manufacturing industry. Intensive research into technologies and methodologies for green product design has significant theoretical meaning and practical value in reducing the emissions of the manufacturing industry. Therefore, a low-carbon-oriented product reliability optimal design model is proposed in this paper: (1) the related expert evaluation information was prepared in interval numbers; (2) an improved product failure analysis considering the uncertain carbon emissions of the subsystems was performed to obtain the subsystem weights taking the carbon emissions into consideration, and an interval grey correlation analysis was conducted to obtain the subsystem weights taking the uncertain correlations inside the product into consideration; using these two kinds of subsystem weights and different caution indicators of the decision maker, a series of product reliability design schemes is available; (3) interval-valued intuitionistic fuzzy sets (IVIFSs) were employed to select the optimal reliability design scheme based on three attributes, namely low carbon, correlations and functions, and economic cost. The case study of a vertical CNC lathe proves the superiority and rationality of the proposed method.
Abdenov, A. Zh; Trushin, V. A.; Abdenova, G. A.
The paper considers the question of populating the relevant SIEM nodes based on calculations of objective assessments in order to improve the reliability of subjective expert assessments. The proposed methodology is necessary for the most accurate security risk assessment of information systems. This technique is also intended for establishing real-time operational information protection in enterprise information systems. Risk calculations are based on objective estimates of the probabilities of adverse event occurrence and predictions of the magnitude of damage from information security violations. Calculations of objective assessments are necessary to increase the reliability of the proposed expert assessments.
Faisal Karim Shaikh
Delivering reliable services in service-oriented architectures entails the underlying basis of having communication network models and well-structured systems. With the rapid proliferation of the ad-hoc mode of communication, the reliable delivery of services increasingly encounters new communication and network perturbations. Empirically, the core of service delivery in WSNs (Wireless Sensor Networks) is information transport from the sensor nodes to the sink node where the service resides. In this work we provide reliable information transport for enhanced service delivery by using spatio-temporal correlation in WSNs. A classification of the different types of information required by the services is also presented. To overcome dynamic network conditions and evolving service requirements, an adaptive retransmission mechanism based on spatial correlation is utilized. Simulation results show that the proposed solutions provide service-specific reliability, save expensive retransmissions, and thus provide an energy-efficient solution.
Li Yejun; Huang Bin
A new method for structural reliability analysis using an orthogonalizable power polynomial basis vector is presented. Firstly, a power polynomial basis vector is adopted to express the initial series solution of the structural response, which is determined by a series of deterministic recursive equations based on the perturbation technique, and is then transferred to a set of orthogonalizable power polynomial basis vectors using the orthogonalization technique. By conducting a Galerkin projection, an accel...
Kainz, Hans; Graham, David; Edwards, Julie; Walsh, Henry P J; Maine, Sheanna; Boyd, Roslyn N; Lloyd, David G; Modenese, Luca; Carty, Christopher P
Three-dimensional gait analysis (3DGA) has become a common clinical tool for treatment planning in children with cerebral palsy (CP). Many clinical gait laboratories use the conventional gait analysis model (e.g. the Plug-in-Gait model), which uses Direct Kinematics (DK) for joint kinematic calculations, whereas musculoskeletal models, mainly used for research, use Inverse Kinematics (IK). Musculoskeletal IK models have the advantage of enabling additional analyses which might improve clinical decision-making in children with CP. Before any new model can be used in a clinical setting, its reliability has to be evaluated and compared to a commonly used clinical gait model (e.g. the Plug-in-Gait model), which was the purpose of this study. Two testers performed 3DGA in eleven CP and seven typically developing participants on two occasions. Intra- and inter-tester standard deviations (SD) and standard error of measurement (SEM) were used to compare the reliability of two DK models (Plug-in-Gait and a six degrees-of-freedom model solved using Vicon software) and two IK models (two modifications of 'gait2392' solved using OpenSim). All models showed good reliability (mean SEM of 3.0° over all analysed models and joint angles). Variations in joint kinetics were smaller in typically developing than in CP participants. The modified 'gait2392' model, which included all the joint rotations commonly reported in clinical 3DGA, showed reasonably reliable joint kinematic and kinetic estimates, allows additional musculoskeletal analysis of surgically adjustable parameters, e.g. muscle-tendon lengths, and is, therefore, a suitable model for clinical gait analysis. Copyright © 2017. Published by Elsevier B.V.
Owhor, Sampson Chisa; Abdul Alim Ibrahim Gambo; Ojo, Victor Kayode; Dan’azumi Daniel
In reliability analysis of car maintenance forecasting and performance, researchers have mostly dealt with problems either without maintenance or with deterministic maintenance in which no failure can occur. This can be unrealistic in practical settings. In this work, a statistical model is developed to evaluate the effect of predictive and preventive maintenance schemes on car performance in the presence of system failure, where the forecasting objective is to minimize schedule duration. It was s...
Clifford, Courtenay B.; Hengel, John E.
The parachutes on the Space Transportation System (STS) Solid Rocket Booster (SRB) are the means for decelerating the SRB and allowing it to impact the water at a nominal vertical velocity of 75 feet per second. Each SRB has one pilot, one drogue, and three main parachutes. About four minutes after SRB separation, the SRB nose cap is jettisoned, deploying the pilot parachute. The pilot chute then deploys the drogue parachute. The drogue chute provides initial deceleration and proper SRB orientation prior to frustum separation. At frustum separation, the drogue pulls the frustum from the SRB and allows the main parachutes that are mounted in the frustum to unpack and inflate. These chutes are retrieved, inspected, cleaned, repaired as needed, and returned to the flight inventory and reused. Over the course of the Shuttle Program, several improvements have been introduced to the SRB main parachutes. A major change was the replacement of the small (115 ft. diameter) main parachutes with the larger (136 ft. diameter) main parachutes. Other modifications were made to the main parachutes, main parachute support structure, and SRB frustum to eliminate failure mechanisms, improve damage tolerance, and improve deployment and inflation characteristics. This reliability analysis is limited to the examination of the SRB Large Main Parachute (LMP) and drogue parachute failure history to assess the reliability of these chutes. From the inventory analysis, 68 Large Main Parachutes were used in 651 deployments, and 7 chute failures occurred in the 651 deployments. Logistic regression was used to analyze the LMP failure history, and it showed that reliability growth has occurred over the period of use resulting in a current chute reliability of R = .9983. This result was then used to determine the reliability of the 3 LMPs on the SRB, when all must function. There are 29 drogue parachutes that were used in 244 deployments, and no in-flight failures have occurred. Since there are no
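The arithmetic behind the quoted figures can be sketched as follows. The function names are ours; `chute_reliability` is a naive whole-history binomial estimate, which is lower than the R = .9983 reported above because the paper's logistic regression accounts for reliability growth over the period of use.

```python
def chute_reliability(failures, deployments):
    """Naive binomial point estimate of single-chute reliability
    (ignores reliability growth over the usage history)."""
    return 1.0 - failures / deployments

def system_reliability(r_single, n_required=3):
    """All n main parachutes must function: a series system, so the
    single-chute reliability is raised to the n-th power."""
    return r_single ** n_required

# Whole-history estimate for the Large Main Parachute: 7 failures in 651
# deployments. Using the reported growth-adjusted figure R = .9983 instead,
# the 3-chute system reliability is .9983 ** 3.
r_naive = chute_reliability(7, 651)
r_system = system_reliability(0.9983, 3)
```

The gap between `r_naive` and .9983 is exactly what a reliability growth analysis is meant to capture: early failures weigh down a pooled estimate even after the failure mechanisms have been designed out.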
Kyyrä, Tomi; Wilke, Ralf
The retrospectively recalled calendar of activities in the European Community Household Panel is a prime resource for cross-country analysis of unemployment experience. We investigate the reliability of these data and find that 26 % of unemployed respondents misreported retrospectively...
Tasooji, Amaneh; Ghaffarian, Reza; Rinaldi, Antonio
Area Array microelectronic packages with small pitch and large I/O counts are now widely used in microelectronics packaging. The impact of various package design and materials/process parameters on reliability has been studied through extensive literature review. The reliability of Ceramic Column Grid Array (CCGA) package assemblies has been evaluated using JPL thermal cycle test results (-50(deg)/75(deg)C, -55(deg)/100(deg)C, and -55(deg)/125(deg)C), as well as those reported by other investigators. A sensitivity analysis has been performed using the literature data to study the impact of design parameters and global/local stress conditions on assembly reliability. The applicability of various life-prediction models for the CCGA design has been investigated by comparing the models' predictions with the experimental thermal cycling data. Finite Element Method (FEM) analysis has been conducted to assess the state of the stress/strain in the CCGA assembly under different thermal cycling conditions, and to explain the different failure modes and locations observed in the JPL test assemblies.
In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. To support the development of highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms. Exception handling mechanisms are intended to help developers build robust programs. Given a program with exception handling constructs, for effective testing we must detect whether all possible exceptions are raised and caught or not. However, complex exception handling constructs make it tedious to trace which exceptions are handled where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to developing reliable object-oriented programs. We have applied a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The test suite so obtained is applied to the mutant programs, measuring the mutation score and hence verifying whether the mutant programs are effective or not. We have tested our approach with a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
A probabilistic study of a circular tunnel excavated in a soil mass using the response surface methodology (RSM) is presented. A deterministic model based on two-dimensional numerical simulations of a transversal section is used, and the serviceability limit state (SLS) is considered in the analysis. The model permits the surface settlement curve and the bending moment on the tunnel lining to be obtained. Only the soil parameters are considered as random variables. The first-order reliability method (FORM) and the response surface methodology (RSM) are utilized for the assessment of the Hasofer-Lind reliability index (βHL), optimized by the use of a genetic algorithm (GA). Two assumptions (normal and non-normal distributions) were used for the random variables. The comparison analysis considering a correlation between the friction angle and the cohesion indicates that the results are conservative if a negative correlation among strength parameters is not taken into account. The assumption of a non-normal distribution for the random variables has an important effect on the reliability index for the practical range of values of surface settlements.
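The Hasofer-Lind index mentioned above is the shortest distance from the mean point to the limit state surface g(x) = 0, measured in standard deviation units. A crude random-direction search for that minimum distance can be sketched as below; this is only a stand-in for the genetic algorithm used in the paper, and the linear limit state in the example is hypothetical.

```python
import math
import random

def beta_hl_random_search(g, means, stds, iters=5000, seed=0):
    """Crude random-direction approximation of the Hasofer-Lind index:
    sample ray directions from the mean point, bisect each ray for the
    g(x) = 0 crossing, and keep the shortest crossing distance."""
    rng = random.Random(seed)
    g0 = g(list(means))                      # sign of g at the mean point
    best = float("inf")
    for _ in range(iters):
        u = [rng.gauss(0.0, 1.0) for _ in means]
        norm = math.sqrt(sum(v * v for v in u)) or 1.0
        d = [v / norm for v in u]            # random unit direction
        lo, hi = 0.0, 10.0
        x_hi = [m + s * hi * di for m, s, di in zip(means, stds, d)]
        if g0 * g(x_hi) > 0:
            continue                          # no sign change along this ray
        for _ in range(50):                   # bisection for the crossing point
            mid = 0.5 * (lo + hi)
            x = [m + s * mid * di for m, s, di in zip(means, stds, d)]
            if g0 * g(x) > 0:
                lo = mid
            else:
                hi = mid
        best = min(best, 0.5 * (lo + hi))
    return best

# Linear limit state g = 10 - x1 - x2 with x1, x2 ~ N(3, 1):
# the exact index is (10 - 6) / sqrt(2), about 2.83.
beta = beta_hl_random_search(lambda x: 10 - x[0] - x[1], [3.0, 3.0], [1.0, 1.0])
```

A GA, as used in the paper, searches the same space far more efficiently; the random-direction sketch merely makes the geometric definition of βHL concrete.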
Umair, Rafia; Shahid, Kamal; Olsen, Rasmus Løvenstein
The trend of producing energy from Renewable Generation (ReGen) plants is greatly increasing. This leads to the objective of building a future power generation system entirely based on renewable sources. Since the power output of ReGen plants, such as wind power plants (WPP), varies continuously... information reliability and evaluate the performance of a controller. Therefore, considering the dynamic nature of information, this paper analyzes the information reliability in terms of correct and timely delivery of message signals, for remote control of a WPP using IEC-61850 MMS in a smart grid scenario...
Döhler, Michael; Thöns, Sebastian
Damage detection systems and algorithms (DDS and DDA) provide information on the structural system integrity, in contrast to e.g. local information from inspections or non-destructive testing techniques. However, the potential of utilizing DDS information for the structural integrity assessment... and prognosis is hardly exploited or treated in the scientific literature up to now. In order to utilize the information provided by DDS for the structural performance, usually high computational efforts for the pre-determination of DDS reliability are required. In this paper, an approach for the DDS performance... modelling is introduced, building upon the non-destructive testing reliability which applies to structural systems and DDS, containing a strategy to overcome the high computational efforts for the pre-determination of the DDS reliability. This approach takes its basis in the subspace-based damage detection method...
Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)
There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.
The Soil Water Characteristic Curve (SWCC), also known as the soil water-retention curve, is an important part of any constitutive relationship for unsaturated soils. Deterministic assessment of the SWCC has received considerable attention in the past few years. However, the uncertainties of the parameters which affect the SWCC dictate that the problem is of a probabilistic rather than a deterministic nature. In this research, a Gene Expression Programming (GEP)-based SWCC model is employed to assess the reliability of the SWCC. For this purpose, the Jointly Distributed Random Variables (JDRV) method is used as an analytical method for reliability analysis. All input parameters of the model, which are the initial void ratio, initial water content, and silt and clay contents, are set to be stochastic and are modelled using truncated normal probability density functions. The results are compared with those of Monte Carlo (MC) simulation. It is shown that the initial water content is the most effective parameter in the SWCC.
Despite its good physical properties, the glassy carbon material is not widely used, especially for structural applications. Nevertheless, its transparency to particles and temperature resistance are interesting properties for applications to vacuum chambers and components in high energy physics. For example, it has been proposed for a fast shutter valve in a particle accelerator. The mechanical properties have to be carefully determined to assess the reliability of structures made of such a material. In this paper, mechanical tests have been carried out to determine the elastic parameters, the strength and the toughness of commercial grades. A statistical approach, based on the Weibull distribution, is used to characterize the material both in tension and compression. The results are compared to the literature and the difference in properties between these two loading cases is shown. Based on a Finite Element analysis, a statistical approach is applied to define the reliability of a structural component in gl...
Takegami, Yasuhiko; Seki, Taisuke; Amano, Takafumi; Higuchi, Yoshitoshi; Komatsu, Daigo; Nishida, Yoshihiro; Ishiguro, Naoki
Although many patients use the internet to access health-related information, the quality and the reliability of the information is highly inconsistent. Periacetabular osteotomy (PAO) is one of the surgical procedures for hip dysplasia. However, medical information on PAO is limited on the internet. This study aims to evaluate the quality and reliability of information available on PAO on the internet in Japan. A web search was conducted on two search engines for the following terms: "hip osteotomy," "pelvic osteotomy," and "osteotomy for hip preservation" in Japanese. In total, we found 120 websites. To determine the quality and reliability of information on each website, we used the Health on the Net Foundation (HON) score, the Brief DISCERN score, and an osteotomy-specific content (OSC) score. After eliminating duplicate websites, we reviewed 49 unique websites. Only three websites (6.1%) had good reliability, as indicated by their HON scores. Twelve websites (24.4%) had good-quality information, as measured by their Brief DISCERN scores. As evaluated by their OSC scores, physician websites were found to be biased toward etiology and surgical indication and did not provide information on the complications of procedures. Non-physician websites were generally insufficient. The information about PAO on the internet is, therefore, unreliable and of poor-quality for Japanese patients.
Zhou, Z.-G.; Tang, P.; Zhou, M.
Normally, the status of land cover is inherently dynamic, changing continuously over time. However, disturbances or abnormal changes of land cover, caused by events such as forest fires, floods, deforestation, and plant diseases, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most existing methods only label the detection results as "change/no change," and few focus on estimating the reliability (or confidence level) of the detected disturbances. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the data. The method consists of three main steps: (1) segmenting and modelling historical time series data with Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance through statistical analysis based on confidence intervals (CI) and confidence levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood around the border of Russia and China. Results demonstrate that the method can estimate the reliability of disturbances detected in satellite images with an estimation error of less than 5% and an overall accuracy of up to 90%.
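The confidence-level idea in step (3) can be illustrated with a minimal sketch. It assumes a stationary history summarized by a normal model of residuals, whereas the paper forecasts with a full BFAST decomposition; the function name is illustrative.

```python
import math
from statistics import mean, stdev

def disturbance_confidence(history, new_value):
    """Illustrative sketch of reliability estimation for a detected
    disturbance: the two-sided normal confidence level that the new
    observation lies outside the historical prediction interval.
    Assumes a stationary history; the paper uses BFAST forecasts."""
    mu, sigma = mean(history), stdev(history)
    z = (new_value - mu) / sigma          # standardized deviation
    cl = math.erf(abs(z) / math.sqrt(2.0))  # P(|N(0,1)| < |z|)
    return z, cl
```

A large `cl` (say above 0.95) means the deviation from the forecast is very unlikely to be chance variation, so the detected disturbance is considered reliable.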
Fokoue, Ernest; Gunduz, Necla
We propose an information-theoretic alternative to the popular Cronbach alpha coefficient of reliability. Particularly suitable for contexts in which instruments are scored on a strictly nonnumeric scale, our proposed index is based on functions of the entropy of the distributions defined on the sample space of responses. Our reliability index tracks the Cronbach alpha coefficient uniformly while offering several other advantages discussed in great detail in this paper.
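For contrast, here is the classical Cronbach alpha next to a hypothetical entropy-based index. The `entropy_index` below is a normalized-entropy stand-in of my own, NOT the exact index defined in the paper; it only illustrates the kind of quantity an entropy-based reliability measure works with.

```python
import math
from statistics import pvariance

def cronbach_alpha(items):
    """Classical Cronbach alpha; `items` is a list of per-item score
    lists over the same respondents, in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(pvariance(i) for i in items)
                          / pvariance(totals))

def entropy(counts):
    """Shannon entropy (bits) of a response-count vector."""
    n = sum(counts)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def entropy_index(item_counts, levels):
    """Hypothetical stand-in (not the paper's formula): one minus the
    mean normalized response entropy across items. Needs no numeric
    scoring of the response categories."""
    hs = [entropy(c) / math.log2(levels) for c in item_counts]
    return 1.0 - sum(hs) / len(hs)
```

Note that `entropy_index` consumes only category counts, which is what makes an entropy-based approach attractive for strictly nonnumeric scales.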
For the new in-house developed CERN Radiation Monitoring Electronic System (CROME), a reliability analysis is necessary to ensure compliance with the statutory requirements regarding the Safety Integrity Level. The Safety Integrity Level required by the IEC 60532 standard is SIL 2 (for the Safety Integrated Functions Measurement, Alarm Triggering, and Interlock Triggering). The first step of the reliability analysis was a system and functional analysis, which served as the basis for the implementation of the CROME system in the software Isograph. In the "Prediction" module of Isograph, the failure rates of all components were calculated. Failure rates for passive components were calculated per Military Standard 217, and failure rates for active components were obtained from lifetime tests by the manufacturers. The FMEA was carried out together with the board designers and implemented in the "FMECA" module of Isograph. The FMEA served as the basis for the fault tree analysis and the detection of weak points...
Robinson, D.G. [Sandia National Labs., Albuquerque, NM (United States)]
This paper discusses preliminary research at Sandia National Laboratories into the application of artificial neural networks for reliability and risk analysis. The goal of this effort is to develop a reliability based methodology that captures the complex relationship between uncertainty in material properties and manufacturing processes and the resulting uncertainty in life prediction estimates. The inputs to the neural network model are probability density functions describing system characteristics and the output is a statistical description of system performance. The most recent application of this methodology involves the comparison of various low-residue, lead-free soldering processes with the desire to minimize the associated waste streams with no reduction in product reliability. Model inputs include statistical descriptions of various material properties such as the coefficients of thermal expansion of solder and substrate. Consideration is also given to stochastic variation in the operational environment to which the electronic components might be exposed. Model output includes a probabilistic characterization of the fatigue life of the surface mounted component.
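The core idea of mapping input probability density functions to a statistical description of output performance can be sketched with plain Monte Carlo propagation. This is a simple stand-in for the paper's neural-network surrogate, with illustrative names; the samplers and model are assumptions, not the authors' data.

```python
import random
from statistics import mean, stdev

def propagate(model, samplers, n=20000, seed=0):
    """Monte Carlo uncertainty propagation: sample each uncertain input
    from its distribution, push the samples through the model, and
    summarize the resulting output distribution."""
    rng = random.Random(seed)
    outs = [model(*(draw(rng) for draw in samplers)) for _ in range(n)]
    return mean(outs), stdev(outs)
```

Usage: the samplers might draw coefficients of thermal expansion of solder and substrate, and `model` might be a fatigue-life function; the returned mean and spread give the probabilistic fatigue-life characterization the abstract describes.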
Valentine, Mark; Boyer, Roger; Cross, Bob; Hamlin, Teri; Roelant, Henk; Stewart, Mike; Bigler, Mark; Winter, Scott; Reistle, Bruce; Heydorn, Dick
The Johnson Space Center (JSC) Safety & Mission Assurance (S&MA) Directorate's Risk and Reliability Analysis Group provides both mathematical and engineering analysis expertise in the areas of Probabilistic Risk Assessment (PRA), Reliability and Maintainability (R&M) analysis, and data collection and analysis. The fundamental goal of this group is to provide National Aeronautics and Space Administration (NASA) decision makers with the necessary information to make informed decisions when evaluating personnel, flight hardware, and public safety concerns associated with current operating systems as well as with any future systems. The Analysis Group includes a staff of statistical and reliability experts with valuable backgrounds in the statistical, reliability, and engineering fields. This group includes JSC S&MA Analysis Branch personnel as well as S&MA support services contractors, such as Science Applications International Corporation (SAIC) and SoHaR. The Analysis Group's experience base includes nuclear power (both commercial and navy), manufacturing, Department of Defense, chemical, and shipping industries, as well as significant aerospace experience specifically in the Shuttle, International Space Station (ISS), and Constellation Programs. The Analysis Group partners with project and program offices, other NASA centers, NASA contractors, and universities to provide additional resources or information to the group when performing various analysis tasks. The JSC S&MA Analysis Group is recognized as a leader in risk and reliability analysis within the NASA community. Therefore, the Analysis Group is in high demand to help the Space Shuttle Program (SSP) continue to fly safely, assist in designing the next generation spacecraft for the Constellation Program (CxP), and promote advanced analytical techniques. The Analysis Section's tasks include teaching classes and instituting personnel qualification processes to enhance the professional abilities of our analysts
Ronald L. Boring
In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the best-known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament to its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain's pioneering work.
Kenya's water abstraction must meet the projected growth in municipal and irrigation demand by the end of 2030 in order to achieve the country's industrial and economic development plan. The Masinga dam, on the Tana River, is the key to meeting this goal to satisfy the growing demands whilst also continuing to provide hydroelectric power generation. This study quantitatively assesses the reliability and robustness of the Masinga dam system under uncertain future supply and demand using probabilistic climate and population projections, and examines how long-term planning may improve the longevity of the dam. River flow and demand projections are used alongside each other as inputs to the dam system simulation model linked to an optimisation engine to maximise water availability. Water availability after demand satisfaction is assessed for future years, and the projected reliability of the system is calculated for selected years. The analysis shows that maximising power generation on a short-term year-by-year basis achieves 80%, 50% and 1% reliability by 2020, 2025 and 2030 onwards, respectively. Longer-term optimal planning, however, increases system reliability to up to 95% in 2020, 80% in 2025, and more than 40% in 2030 onwards. In addition, increasing the capacity of the reservoir by around 25% can significantly improve the robustness of the system for all future time periods. This study provides a platform for analysing the implications of different planning and management of the Masinga dam and suggests that careful consideration should be given to account for growing municipal needs and irrigation schemes in both the immediate and the associated Tana River basin.
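The reliability metric used above (the fraction of periods in which demand is fully met) can be illustrated with a toy mass-balance reservoir simulation. The study's model is far richer (optimisation, probabilistic projections); this sketch and its names are illustrative only.

```python
def supply_reliability(inflows, demand, capacity, storage0=0.0):
    """Toy mass-balance reservoir simulation: fraction of periods in
    which demand is fully met. Inflow above capacity is spilled."""
    storage, met = storage0, 0
    for inflow in inflows:
        storage = min(storage + inflow, capacity)  # spill above capacity
        release = min(demand, storage)             # release what we can
        storage -= release
        met += release >= demand                   # count satisfied periods
    return met / len(inflows)
```

Raising `capacity` lets the reservoir carry surplus from wet periods into dry ones, which is exactly the mechanism behind the robustness gain from the proposed 25% capacity increase.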
The article analyses scientists' views on the phenomenon of "reliability". The author pays special attention to the fact that scientists consider the concept of "reliability" to be a systemic characteristic, defined by a specific set of professional, psychological, and physiological qualities and functions at different levels of a person's activity, which together provide stable and reliable work. Scientists regard quality as the most important property that gives definiteness to any phenomenon. Personal qualities that affect the maturity of emotional and volitional reliability are established, and a scientific interpretation of the concept of "reliability" and the main features of performing reliability is given. The analysis of pedagogical and psychological literature shows that emotional qualities are formed throughout life, according to a person's environmental and genetic conditions. Emotional irritability, emotional stability, emotional tone, and emotional reactions are qualities which depend upon the type of a person's higher nervous activity. A person's activity, especially musical activity, cannot exist without emotions and feelings. Music plays an important role: due to it, emotions become conscious processes, and thanks to music a person's higher emotions (moral, intellectual, esthetic) are formed. It is noticed that individual differences in the demonstration of emotions depend on a person's volitional qualities. Volition is considered to be a person's psychological activity that determines the purposefulness of actions. The author concludes that the maturity level of emotional and volitional reliability depends on the direction and stability of socially significant motives (needs, interests, values, attitudes) and on the personality's psychophysical traits (abilities, capacities), which provide the required level and effectiveness
Tobgay, Sonam; Olsen, Rasmus Løvenstein; Prasad, Ramjee
This paper examines the effect of different information access strategies on power consumption and information reliability, considering a wireless sensor network as the source of information. The paper explores three different access strategies, namely reactive, periodic, and hybrid ... and computes the power consumption and mismatch probability of each of these access strategies. Based on our study, we make recommendations as to when, where, and which access strategy is suitable, depending upon the application's requirements and network behavior. It also provides the model implementation...
Kirkegaard, Poul Henning; Enevoldsen, I.; Sørensen, John Dalsgaard
In this paper a reliability analysis of a Mono-tower platform is presented. The failure modes considered are yielding in the tube cross-sections and fatigue failure in the butt welds. The fatigue failure mode is investigated with a fatigue model, where the fatigue strength is expressed through SN ... for the fatigue limit state is a significant failure mode for the Mono-tower platform. Further, it is shown for the fatigue failure mode that the largest contributions to the overall uncertainty are due to the damping ratio, the inertia coefficient, the stress concentration factor, and the model uncertainties...
A reliability analysis of structures subjected to random service loads and periodic proof tests treats gust loads and maneuver loads as random processes. Crack initiation, crack propagation, and strength degradation are treated as the fatigue process. The time to fatigue crack initiation and ultimate strength are random variables. Residual strength decreases during crack propagation, so that failure rate increases with time. When a structure fails under periodic proof testing, a new structure is built and proof-tested. The probability of structural failure in service is derived from treatment of all the random variables, strength degradations, service loads, proof tests, and the renewal of failed structures. Some numerical examples are worked out.
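The renewal-and-proof-test scheme described above lends itself to a toy Monte Carlo sketch. All distributions and parameter values below are assumptions for illustration, not the paper's worked examples.

```python
import random

def new_structure(rng, proof_load):
    """Draw a new-built strength, rejecting builds that fail the proof
    test (the test screens out weak structures)."""
    while True:
        s = rng.gauss(1.5, 0.2)   # assumed as-built strength distribution
        if s >= proof_load:
            return s

def service_failure_prob(n=4000, years=20, proof_load=1.2,
                         degrade=0.02, seed=1):
    """Toy Monte Carlo estimate of the per structure-year failure
    probability: residual strength degrades each year, a random service
    load is applied, and a failed structure is renewed by a new,
    proof-tested one."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        strength = new_structure(rng, proof_load)
        for _ in range(years):
            strength *= 1.0 - degrade       # residual strength decreases
            load = rng.gauss(0.7, 0.15)     # random gust/maneuver load
            if load > strength:
                failures += 1
                strength = new_structure(rng, proof_load)  # renewal
    return failures / (n * years)
```

Because strength only degrades between renewals, the simulated failure rate rises with structure age, reproducing the qualitative behavior the abstract describes.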
In this paper, a portable axle-load meter for a dynamic weighing system is modeled using ADAMS. By controlling a single variable in the simulated weighing process, simulation weighing data are obtained for different speeds and weights; at the same time, a portable weighing system with the same parameters is used to obtain actual measurements. Comparative analysis of the simulation and measured results under the same conditions shows that, at 30 km/h or less, they do not differ by more than 5%. This not only verifies the reliability of the dynamic weighing model, but also makes it possible to improve the efficiency of algorithm studies by using the dynamic weighing model in simulation.
G. W. Parry; J. A. Forester; V. N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
The work describes the impact of reliability on information quality (IQ) for information and communication systems. One of the components of IQ is reliability, with properties such as relativity, accuracy, timeliness, completeness, consistency, adequacy, accessibility, credibility, and congruence. Each of these components of IQ is independent, and to properly estimate the value of IQ, one of the methods of modeling uncertainty must be used. In this article, we used a hybrid method developed jointly by one of the authors. This method is based on the mathematical theory of evidence known as Dempster-Shafer (DS) theory and on serial links of the dependent hybrid named IQ (hyb).
Butler, R. W.
SARA, the SURE/ASSIST Reliability Analysis Workstation, is a bundle of programs used to solve reliability problems. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. The Systems Validation Methods group at NASA Langley Research Center has created a set of four software packages that form the basis for a reliability analysis workstation, including three for use in analyzing reconfigurable, fault-tolerant systems and one for analyzing non-reconfigurable systems. The SARA bundle includes the three for reconfigurable, fault-tolerant systems: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), and PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920). As indicated by the program numbers in parentheses, each of these three packages is also available separately in two machine versions. The fourth package, which is only available separately, is FTC, the Fault Tree Compiler (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree which describes a non-reconfigurable system. PAWS/STEM and SURE are analysis programs which utilize different solution methods, but have a common input language, the SURE language. ASSIST is a preprocessor that generates SURE language from a more abstract definition. ASSIST, SURE, and PAWS/STEM are described briefly in the following paragraphs. For additional details about the individual packages, including pricing, please refer to their respective abstracts. ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, allows a reliability engineer to describe the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. A one-page ASSIST-language description may result in a semi-Markov model with thousands of states and
Griffith, Candice D. [Vanderbilt University, Nashville, TN (United States); Mahadevan, Sankaran, E-mail: email@example.com [Vanderbilt University, Nashville, TN (United States)
The effect of fatigue on human performance has been observed to be an important factor in many industrial accidents. However, defining and measuring fatigue is not easily accomplished. This creates difficulties in including fatigue effects in probabilistic risk assessments (PRA) of complex engineering systems that seek to include human reliability analysis (HRA). Thus the objectives of this paper are to discuss (1) the importance of the effects of fatigue on performance, (2) the difficulties associated with defining and measuring fatigue, (3) the current status of inclusion of fatigue in HRA methods, and (4) the future directions and challenges for the inclusion of fatigue, specifically sleep deprivation, in HRA. Highlights: • We highlight the need for fatigue and sleep deprivation effects on performance to be included in human reliability analysis (HRA) methods; current methods do not explicitly include sleep deprivation effects. • We discuss the difficulties in defining and measuring fatigue. • We review sleep deprivation research, and discuss the limitations and future needs of the current HRA methods.
Tsarouhas, Panagiotis H; Arvanitoyannis, Ioannis S
A statistical analysis of failure and repair data for a bread production line, at both machine and line levels, is presented. The data cover a period of twenty-five months. The best fit of the failure data among the common theoretical distributions was found and its parameters were computed, and the reliability and hazard rate models for all machines and for the entire production line were calculated. The models could prove a useful tool to assess current conditions and to predict reliability for upgrading the maintenance policy of the production line. It was pointed out that (a) the availability of the bread production line is 90.74%, and it drops to 86.76% because equipment failures cause an additional production gap in the line; (b) 53.5% of all failures occurred at the bread machine, cooling tower machine, and volumetric-divider machine; and (c) the machines of the bread production line that displayed increasing hazard rate functions were identified. This analysis will be very useful for identifying occurring and latent problems in the bread manufacturing process and improving it.
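Once a distribution has been fitted to the failure data, the reliability and hazard rate models mentioned above follow from standard formulas; the two-parameter Weibull case and the steady-state availability relation are sketched below with illustrative parameters.

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta) for a fitted two-parameter Weibull
    (beta: shape, eta: scale)."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """h(t): increasing for beta > 1 (wear-out), constant for beta = 1
    (random failures), decreasing for beta < 1 (infant mortality)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def availability(mtbf, mttr):
    """Steady-state availability of a machine or of the whole line from
    mean time between failures and mean time to repair."""
    return mtbf / (mtbf + mttr)
```

An increasing hazard rate function (beta > 1) for a machine is the signal, noted in the abstract, that preventive rather than purely corrective maintenance is warranted.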
Rusu, Oana; Petcu, Ana Elena; Drăgan, Eliza; Haba, Danisia; Moscalu, Mihaela; Zetu, Irina Nicoleta
The aim of this investigation was to determine, compare, and evaluate three different computerized tracing programs in which the lateral cephalograms were digitized on screen. Thirty-nine randomly selected cephalometric radiographs were used in the present study. Three programs were evaluated: Planmeca Romexis® (Romexis 3.2.0., Helsinki, Finland), Orthalis (France), and AxCeph (A.C 22.214.171.124, Ljubljana, Slovenia). Twelve skeletal, 9 dental, and 3 soft tissue parameters were measured, consisting of 11 linear and 13 angular measurements. Statistical analysis was carried out using multivariate analysis of variance (MANOVA), the Levene test, the Tukey Honestly Significant Difference (HSD) test, and the Kruskal-Wallis test. The measurements obtained with the cephalometric analysis programs used in the study were reliable.
Zeleníková, Renáta; Homzová, Pavlína; Homza, Miroslav; Bužgová, Radka
The purpose of this study was to validate the Czech version of the Amsterdam Preoperative Anxiety and Information Scale (APAIS) in adult patients undergoing elective surgery, using a cross-sectional design. Data were collected from July 2012 to January 2013. For reliability and validity testing, two instruments measuring preoperative anxiety were administered to the participants on the same occasion: the APAIS and the Spielberger State Anxiety Inventory (STAI-S). The sample consisted of 344 patients undergoing elective surgery. The reliability of the APAIS anxiety subscale, measured by Cronbach's alpha, was 0.91; the reliability of the APAIS information subscale was 0.78. The APAIS anxiety subscale correlated significantly with the STAI-S (0.69). Women scored significantly higher on the anxiety scales than men. The APAIS may be a useful tool to measure preoperative anxiety in Czech patients undergoing elective surgery.
Bozoyan, Christiane; Vogt, Sonja
Economic exchange between strangers happens extremely frequently due to the growing number of internet transactions. In trust situations like online transactions, a trustor usually does not know whether she encounters a trustworthy trustee. However, the trustor might form beliefs about the trustee's trustworthiness by relying on third-party information. Different kinds of third-party information can vary dramatically in their importance to the trustor. We ran a factorial design to study how the different characteristics of third-party information affect the trustor's decision to trust. We systematically varied unregulated third-party information regarding the source (friend or a stranger), the reliability (gossip or experiences), and the valence (positive or negative) of the information. The results show that negative information is more salient for withholding trust than positive information is for placing trust. If third-party information is positive, experience of a friend has the strongest effect on trusting followed by friend's gossip. Positive information from a stranger does not matter to the trustor. With respect to negative information, the data show that even the slightest hint of an untrustworthy trustee leads to significantly less placed trust irrespective of the source or the reliability of the information.
Boyer, Célia; Appel, Ron D; Ball, Marion J; van Bemmel, Jan H; Bergmans, Jean-Paul; Carpentier, Michel; Hochstrasser, Denis; Lindberg, Donald; Miller, Randolph; Peterschmitt, Jean-Claude; Safran, Charlie; Thonnet, Michèle; Geissbühler, Antoine
The Health On the Net Foundation (HON) was born in 1996, during the beginning of the World Wide Web, from a collective decision by health specialists, led by the late Jean-Raoul Scherrer, who anticipated the need for online trustworthy health information. Because the Internet is a free space that everyone shares, a search for quality information is like a shot in the dark: neither will reliably hit their target. Thus, HON was created to promote deployment of useful and reliable online health information, and to enable its appropriate and efficient use. Two decades on, HON is the oldest and most valued quality marker for online health information. The organization has maintained its reputation through dynamic measures, innovative endeavors and dedication to upholding key values and goals. This paper provides an overview of the HON Foundation, and its activities, challenges, and achievements over the years.
Irina P. Kurochkina
The article explores the possibility of using foreign innovative methods to assess the reliability of information in the consolidated financial statements of Russian companies. Recommendations are made for their adaptation and application in commercial organizations. Beneish model indicators are implemented in one of the world's largest vertically integrated steel and mining companies. Audit firms are advised to use these methods of assessing the reliability of information in the practical application of ISA (International Standards on Auditing).
Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia
The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools.
Garetto, Anthony; Rademacher, Thomas; Schulz, Kristian
The decreasing size and increasing complexity of photomask features, driven by the push to ever smaller technology nodes, places more and more challenges on the mask house, particularly in terms of yield management and cost reduction. Especially challenging for mask shops is the inspection, repair and review cycle, which requires more time and skill from operators due to the higher number of masks required per technology node and larger nuisance defect counts. While the measurement throughput of the AIMS™ platform has been improved in order to keep pace with these trends, the analysis of aerial images has seen little advancement and remains largely a manual process. This manual analysis of aerial images is time consuming, dependent on the skill level of the operator, and contributes significantly to the overall mask manufacturing process flow. AutoAnalysis, the first application available for the FAVOR® platform, offers a solution to these problems by providing fully automated analysis of AIMS™ aerial images. Direct communication with the AIMS™ system allows automated data transfer and analysis parallel to the measurements. User-defined report templates allow the relevant data to be output in a manner that can be tailored to various internal needs and support the requests of your customers. Productivity is significantly improved due to the fast analysis, operator time is saved and made available for other tasks, and reliability is no longer a concern as the most defective region is always and consistently captured. In this paper the concept and approach of AutoAnalysis will be presented as well as an update to the status of the project. The benefits arising from the use of AutoAnalysis will be discussed in more detail and a study will be performed in order to demonstrate.
Sharapov, V. I.; Orlov, M. E.; Kunin, M. V.
Technologies that improve the reliability and efficiency of combined district heating systems in urban areas are considered. A method for calculating the reliability of combined heat and power (CHP) district heating systems is proposed, and a comparative estimate of the reliability of traditional and combined district heating systems is performed.
Clayson, Peter E; Miller, Gregory A
Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue.
This paper addresses the reliability and availability issues to be faced in deploying and operating the klystron-modulator assemblies proposed for the Next Linear Collider (NLC). The rf power sources are a major system of the NLC and require high uptime in order to reach the goal of 0.85 availability. Since the NLC is made up of several systems, not just klystron-modulator assemblies, the availability goal for the assemblies must be higher than 0.85; currently this goal is at least 0.95. This short paper summarizes the analysis currently under way to determine whether the design of the rf power system will meet the design availability goal.
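The reason the assembly goal must exceed the machine goal is the series-availability product: if several independent systems must all be up for the collider to run, each needs a tighter availability budget than the whole. A small illustrative calculation; the subsystem count used here is hypothetical, not taken from the NLC design:

```python
def required_subsystem_availability(system_goal, n_subsystems):
    """Per-subsystem availability needed when n independent subsystems
    in series must jointly meet the system availability goal."""
    return system_goal ** (1.0 / n_subsystems)

# With, say, three major systems in series (hypothetical count),
# each must reach roughly 0.95 for the whole to hit 0.85:
a = required_subsystem_availability(0.85, 3)
assert a > 0.85              # tighter than the overall goal
assert round(a, 2) == 0.95
```

This is the generic budgeting argument behind the 0.95 figure quoted for the rf power system; the actual allocation in the NLC analysis may of course weight subsystems unequally.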
Ellingwood, B.R. [Johns Hopkins Univ., Baltimore, MD (United States)]
Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process.
Rafsanjani, Hesam Mirzaei; Sørensen, John Dalsgaard; Fæster, Søren
The fatigue life of wind turbine cast components, such as the main shaft in a drivetrain, is generally determined by defects from the casting process. These defects may reduce the fatigue life and they are generally distributed randomly in components. The foundries, cutting facilities and test facilities can affect the verification of properties by testing. Hence, it is important to have a tool to identify which foundry, cutting and/or test facility produces components which, based on the relevant uncertainties, have the largest expected fatigue life or, alternatively, have the largest reliability (...) and to quantify the relevant uncertainties using available fatigue tests. Illustrative results are presented as obtained by statistical analysis of a large set of fatigue data for casted test components typically used for wind turbines. Furthermore, the SN curves (fatigue life curves based on applied stress ...)
Li, Lin; Huang, Yanzhao; Xiao, Yi
In many protein-protein docking algorithms, binding site information is used to help predict protein complex structures. Using correct and accurate binding site information can increase the protein-protein docking success rate significantly. On the other hand, using wrong binding site information can lead to a failed prediction or, at least, decrease the success rate. Recently, various successful theoretical methods have been proposed to predict the binding sites of proteins. However, the predicted binding site information is not always reliable; sometimes wrong binding site information is given. Hence there is a high risk in using predicted binding site information in current docking algorithms. In this paper, a softly restricting method (SRM) is developed to solve this problem. By utilizing predicted binding site information in a proper way, the SRM algorithm is sensitive to correct binding site information but insensitive to wrong information, which decreases the risk of using predicted binding site information. The SRM is tested on benchmark 3.0 using purely predicted binding site information. The results show that when the predicted information is correct, SRM increases the success rate significantly; however, even if the predicted information is completely wrong, SRM decreases the success rate only slightly, which indicates that the SRM is suitable for utilizing predicted binding site information.
Chetvertakova, E. S.; Chimitova, E. V.
In this paper, we consider the application of statistical degradation models for reliability analysis in non-destructive testing. Such models make it possible to estimate the reliability function (the dependence of non-failure probability on time) for a fixed critical level using information from the degradation paths of tested items. The most widely used models are the gamma and Wiener degradation models, in which the gamma or normal distribution, respectively, is assumed for the degradation increments. Using computer simulation, we have analysed the accuracy of the reliability estimates obtained for the considered models. The number of increments can be enlarged by increasing the sample size (the number of tested items) or by increasing the frequency of measuring degradation. It has been shown that the sample size has a greater influence on the accuracy of the reliability estimates than the measuring frequency. Moreover, another important factor influencing the accuracy of reliability estimation is the duration of observing the degradation process.
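For the Wiener model, the reliability function can be estimated by Monte-Carlo simulation of degradation paths against a critical level, mirroring the kind of computer simulation study described above. A sketch with invented parameters (not the paper's study design):

```python
import random

def wiener_reliability(mu, sigma, critical, t_max, dt=1.0,
                       n_paths=2000, seed=1):
    """Monte-Carlo estimate of the reliability function R(t) = P(no
    failure by t) for a Wiener degradation process with drift mu and
    diffusion sigma; failure = the path first crossing `critical`."""
    rng = random.Random(seed)
    n_steps = int(t_max / dt)
    survivors = [0] * (n_steps + 1)
    for _ in range(n_paths):
        x, failed = 0.0, False
        survivors[0] += 1
        for k in range(1, n_steps + 1):
            # Degradation increment ~ Normal(mu*dt, sigma^2*dt).
            x += mu * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
            if x >= critical:
                failed = True        # absorbed: stays failed afterwards
            if not failed:
                survivors[k] += 1
    return [s / n_paths for s in survivors]

R = wiener_reliability(mu=0.5, sigma=1.0, critical=20.0, t_max=60.0)
assert R[0] == 1.0 and R[-1] < R[0]   # reliability decays over time
```

Enlarging `n_paths` plays the role of the sample size in the study above; shrinking `dt` plays the role of the measuring frequency.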
Sembiring, N.; Ginting, E.; Darnello, T.
Among the problems that appear in a company producing refined sugar, the production floor has not reached the required level of critical machine availability because the machines often suffer damage (breakdowns). This results in sudden losses of production time and production opportunities. This problem can be addressed by the Reliability Engineering method, in which a statistical approach to historical damage data is used to determine the distribution pattern. The method can provide the reliability, damage rate, and availability level of a machine over the scheduled maintenance interval. The distribution test on the time-between-failures (MTTF) data gives a lognormal distribution for the flexible hose component and a Weibull distribution for the teflon cone lifting component, while the distribution test on the mean time to repair (MTTR) data gives an exponential distribution for the flexible hose component and a Weibull distribution for the teflon cone lifting component. For the flexible hose component on the replacement schedule of every 720 hours, the actual reliability obtained is 0.2451 and the availability 0.9960, while for the critical teflon cone lifting component on the replacement schedule of every 1944 hours, the actual reliability obtained is 0.4083 and the availability 0.9927.
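The reliability and availability figures of this kind come from standard lifetime-distribution formulas: for a Weibull fit, R(t) = exp(-(t/eta)^beta), and steady-state availability is MTTF/(MTTF + MTTR). A sketch with hypothetical parameters; the paper's actual fitted distribution parameters are not reproduced here:

```python
import math

def weibull_reliability(t, eta, beta):
    """R(t) = exp(-(t/eta)^beta): probability of surviving past t
    under a Weibull time-to-failure model (eta = scale, beta = shape)."""
    return math.exp(-((t / eta) ** beta))

def availability(mttf, mttr):
    """Steady-state availability from mean time to failure and mean
    time to repair."""
    return mttf / (mttf + mttr)

# Illustrative values only (not the paper's fits):
r_720 = weibull_reliability(720.0, eta=600.0, beta=1.5)
a = availability(mttf=720.0, mttr=3.0)

assert 0.0 < r_720 < 1.0
assert 0.99 < a < 1.0     # short repairs keep availability near 1
```

The pattern in the abstract (low reliability at the replacement interval yet availability near 1) is exactly what these formulas produce when repairs are short relative to the time between failures.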
Melchior Jacobsen, Rasmus; Popovski, Petar
Stationary collectors reading wireless, battery-powered smart meters often operate in harsh channel conditions to cut network installation cost to a minimum, challenging the individual link to each meter. The desired performance measure is reliable reception of at least some data from as many meters as possible, rather than maximizing the number of received packets from one meter. We consider a method for improving the reliable reception in a metering system that operates under the constraints of the popular Wireless M-Bus protocol. We develop a framework for reliable reception in which we use the deterministic protocol structure to obtain side information and group the packets from the same meter. We derive the probability of falsely pairing packets from different senders in the simple case of no channel errors, and show through simulation and data from an experimental deployment the probability (...)
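The idea of exploiting the deterministic protocol structure to group packets from the same meter can be caricatured as clustering arrival times modulo the transmission period. A deliberately simplified sketch of that idea, not the paper's actual estimator (period, tolerance, and arrival times are all invented):

```python
def group_by_schedule(arrivals, period, tolerance):
    """Group packet arrival times that are (roughly) an integer number
    of protocol periods apart, as a proxy for 'same meter' under a
    deterministic transmission schedule."""
    groups = []
    for t in sorted(arrivals):
        for g in groups:
            offset = (t - g[0]) % period
            # Distance to the nearest multiple of the period.
            if min(offset, period - offset) <= tolerance:
                g.append(t)
                break
        else:
            groups.append([t])
    return groups

# Two meters with period 10 s, phase offsets 0.0 and 3.0, small jitter:
arrivals = [0.0, 3.1, 10.1, 12.9, 19.9, 23.0]
groups = group_by_schedule(arrivals, period=10.0, tolerance=0.5)
assert len(groups) == 2   # packets separate cleanly into two meters
```

With channel errors or overlapping phases, two senders can of course land within the same tolerance window; quantifying that false-pairing probability is precisely what the paper derives.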
Samimi, Parnia; Ravana, Sri Devi
Test collections are used to evaluate information retrieval systems in laboratory-based evaluation experiments. In a classic setting, generating relevance judgments involves human assessors and is a costly and time-consuming task. Researchers and practitioners are still challenged to perform reliable and low-cost evaluation of retrieval systems. Crowdsourcing, as a novel method of data acquisition, is broadly used in many research fields. It has been shown that crowdsourcing is an inexpensive and quick solution as well as a reliable alternative for creating relevance judgments. One of the applications of crowdsourcing in IR is judging the relevance of query-document pairs. For a crowdsourcing experiment to be successful, the relevance judgment tasks should be designed precisely, with an emphasis on quality control. This paper explores the factors that influence the accuracy of relevance judgments produced by workers and how to improve the reliability of judgments in a crowdsourcing experiment.
Lee, David; Sanders, Thomas J.
This paper addresses the integrated circuit industry's needs for non-isothermal simulation in device reliability analysis and initial input factor sensitivity analysis, and their software implementation. The key reliability issues are hot-electron induced oxide damage and electrostatic discharge (ESD) damage. The main purpose of this work is to provide a design aid tool to improve device reliability and performance. The reliability simulator developed in this work not only predicts designed device reliability, but also provides information about the effect of manufacturing variations on reliability. This is accomplished by combining statistical methodology with existing technology computer aided design (TCAD) tools. The design of experiments (DoE) technique can be successfully employed to analyze the effect of manufacturing variations on SOI device reliability. As an example, the reliability analysis and the statistical analysis have been performed on SOI MOS devices (partially depleted and fully depleted SOI) and submicron bulk-Si MOSFETs to verify the applied modeling method.
Lee, Kenneth; Hoti, Kreshnik; Hughes, Jeffery D; Emmerton, Lynne M
Health information on the Internet is ubiquitous, and its use by health consumers prevalent. Finding and understanding relevant online health information, and determining content reliability, pose real challenges for many health consumers. To identify the types of interventions that have been implemented to assist health consumers to find reliable online health information, and where possible, describe and compare the types of outcomes studied. PubMed, PsycINFO, CINAHL Plus and Cochrane Library databases; WorldCat and Scirus 'gray literature' search engines; and manual review of reference lists of selected publications. Publications were selected by firstly screening title, abstract, and then full text. Seven publications met the inclusion criteria, and were summarized in a data extraction form. The form incorporated the PICOS (Population Intervention Comparators Outcomes and Study Design) Model. Two eligible gray literature papers were also reported. Relevant data from included studies were tabulated to enable descriptive comparison. A brief critique of each study was included in the tables. This review was unable to follow systematic review methods due to the paucity of research and humanistic interventions reported. While extensive, the gray literature search may have had limited reach in some countries. The paucity of research on this topic limits conclusions that may be drawn. The few eligible studies predominantly adopted a didactic approach to assisting health consumers, whereby consumers were either taught how to find credible websites, or how to use the Internet. Common types of outcomes studied include knowledge and skills pertaining to Internet use and searching for reliable health information. These outcomes were predominantly self-assessed by participants. There is potential for further research to explore other avenues for assisting health consumers to find reliable online health information, and to assess outcomes via objective measures.
In this work, a FORTRAN-based computer program was developed to aid the design of reinforced concrete in accordance with Eurocode 2 (EC 2). ... Haldar, A. and Mahadevan, S., Reliability Assessment Using Stochastic Finite ...
St. Germain, S.; Boring, R.; Banaseanu, G.; Akl, Y.; Chatri, H.
This paper uses the insights from the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) methodology to help identify human actions currently modeled in the single unit PSA that may need to be modified to account for additional challenges imposed by a multi-unit accident as well as identify possible new human actions that might be modeled to more accurately characterize multi-unit risk. In identifying these potential human action impacts, the use of the SPAR-H strategy to include both errors in diagnosis and errors in action is considered as well as identifying characteristics of a multi-unit accident scenario that may impact the selection of the performance shaping factors (PSFs) used in SPAR-H. The lessons learned from the Fukushima Daiichi reactor accident will be addressed to further help identify areas where improved modeling may be required. While these multi-unit impacts may require modifications to a Level 1 PSA model, it is expected to have much more importance for Level 2 modeling. There is little currently written specifically about multi-unit HRA issues. A review of related published research will be presented. While this paper cannot answer all issues related to multi-unit HRA, it will hopefully serve as a starting point to generate discussion and spark additional ideas towards the proper treatment of HRA in a multi-unit PSA.
Li Changyou; Liu Haiyang; Guo Song; Zhang Yimin; Li Zhenyuan
Many mechanical parts fail as a result of deterioration. Usually, preventive maintenance is performed to ensure safety and reliability. Therefore, it is very important to study the gradual reliability design of a mechanical part in order to improve the gradual reliability of the mechanical system when preventive maintenance is taken into account. The Beta distribution is employed to describe the randomness of the mechanical part's state after preventive maintenance....
Brunnekreef, J.J.; Uden, C. van; Moorsel, S. van; Kooloos, J.G.M.
BACKGROUND: In clinical practice, visual gait observation is often used to determine gait disorders and to evaluate treatment. Several reliability studies on observational gait analysis have been described in the literature and generally showed moderate reliability. However, patients with orthopedic
Tolstrup, Terkel Kristian; Nielson, Flemming; Nielson, Hanne Riis
We describe a fragment of the hardware description language VHDL that is suitable for implementing the Advanced Encryption Standard algorithm. We then define an Information Flow analysis as required by the international standard Common Criteria. The goal of the analysis is to identify the entire ...
Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)
Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.
Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.
Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparable components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability and maintainability analysis, and present findings and observations based on analysis leading to the Ground Systems Preliminary Design Review milestone.
Lendering, K.T.; Jonkman, S.N.; Kok, M.
During flood events emergency measures are used to prevent breaches in flood defences. However, there is still limited insight in their reliability and effectiveness. The objective of this paper is to develop a method to determine the reliability and effectiveness of emergency measures for flood
Van Eekelen, A.J.
To select a method for analyzing structural reliability problems, including optimization under reliability constraints, a literature survey was performed. In this review the most frequently used and most generally applicable methods are described. An extensive list of references is included.
Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)
The US wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capabilities of wind turbines have also experienced remarkable growth. As the market continues to expand, and as wind generation continues to gain a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability, that is, the failures experienced by systems and components. Reliability tools are demonstrated, data needed to understand and catalog failure events is described, and practical wind turbine reliability models are illustrated, including preliminary results. This report also presents a continuing process of how to proceed with controlling industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and to improve the operable reliability of wind turbine installations.
Woolliams, J A
This note analytically derives the impact that wrong and missing sire information (WSI and MSI, respectively) has on the reliability of predicting merit and gain compared with perfect information. In particular, for small WSI and MSI, WSI was shown to have twice the impact of MSI for both reliability and gain, and the impact of both WSI and MSI increased as the reliability of predicting merit with perfect information decreased. The overall impact on the efficiency of gain for small WSI and MSI was half the overall impact on reliability.
Menicucci, David F. (Building Specialists, Inc., Albuquerque, NM)
Solar hot water (SHW) systems have been installed commercially for over 30 years, yet few quantitative details are known about their reliability. This report describes a comprehensive analysis of all of the known major previous research and data regarding the reliability of SHW systems and components. Some important conclusions emerged. First, based on a detailed inspection of ten-year-old systems in Florida, about half of active systems can be expected to fail within a ten-year period. Second, valves were identified as the probable cause of a majority of active SHW failures. Third, passive integral and thermosiphon SHW systems have much lower failure rates than active ones, probably due to their simple design that employs few mechanical parts. Fourth, it is probable that the existing data about reliability do not reveal the full extent of fielded system failures because most of the data were based on trouble calls. Often an SHW system owner is not aware of a failure because the backup system silently continues to produce hot water. Thus, a repair event may not be generated in a timely manner, if at all. This final report for the project provides all of the pertinent details about this study, including the source of the data, the techniques to assure their quality before analysis, the organization of the data into perhaps the most comprehensive reliability database in existence, a detailed statistical analysis, and a list of recommendations for additional critical work. Important recommendations include the inclusion of an alarm on SHW systems to identify a failed system, the need for a scientifically designed study to collect high-quality reliability data that will lead to design improvements and lower costs, and accelerated testing of components that are identified as highly problematic.
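The headline finding (about half of active systems failing within ten years) implies, under a constant-failure-rate exponential assumption, a failure rate of ln(2)/10 per system-year. That modeling assumption is mine, not the report's; a quick check of the arithmetic:

```python
import math

# "Half of active systems fail within 10 years" treated as the
# half-life of an exponential (constant-rate) failure process:
half_life_years = 10.0
lam = math.log(2.0) / half_life_years   # ~0.069 failures per system-year

def surviving_fraction(t_years):
    """Expected fraction of systems still working after t years."""
    return math.exp(-lam * t_years)

assert abs(surviving_fraction(10.0) - 0.5) < 1e-12
assert surviving_fraction(5.0) > 0.7    # roughly 71% still working at 5 years
```

Because the report notes that valve failures dominate and that many failures go unreported (the backup silently takes over), the true rate is plausibly higher and not constant in time; the exponential form is only a first-order summary.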
The study was motivated by the information related problems commonly observed in the administration of Nigerian tertiary institutions. The study investigated the levels of information acquisition, information management capacity and decision-making effectiveness of administrators in 14 tertiary institutions in three out of six ...
Sözer, Hasan; Tekinerdogan, B.; Aksit, Mehmet; de Lemos, Rogerio; Gacek, Cristina
Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.
Bell, B.J.; Swain, A.D.
This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study.
Cuschieri, Alfred; Tang, B
This review explains the nature of human reliability analysis (HRA) methods developed and used for predicting safety in high-risk human activities. HRA techniques have evolved over the years and have become less subjective as a result of inclusion of (i) cognitive factors in the man-machine interface and (ii) high and low dependency levels between human failure events (HFEs). All however remain probabilistic in the assessment of safety. In the translation of these techniques, developed for assessment of safety of high-risk industries (nuclear, aerospace etc.) where catastrophic failures from the man-machine complex interface are fortunately rare, to the clinical operative surgery (with its high incidence of human errors), the system loses subjectivity since the documentation of HFEs can be assessed and studied prospectively on the basis of an objective data capture of errors enacted during a defined clinical activity. The observational clinical-HRA (OC-HRA) was developed specifically for this purpose, initially for laparoscopic general surgery. It has however been used by other surgical specialties. OC-HRA has the additional merit of objective determination of the proficiency of a surgeon in executing specific interventions and is adaptable to the evaluation of safety and proficiency in clinical activities within the preoperative and postoperative periods.
The relevance of symptom analysis during the hydrogen breath test (HBT) for establishing a clinical diagnosis of sugar intolerance is reviewed. Evaluation of symptoms developed in response to the ingestion of 50 g lactose could represent a simple screening test to select patients for lactose intolerance testing. Patients who do not develop symptoms do not need to be referred for HBT. In addition, symptoms reported by patients during a negative HBT cannot always be attributed to a false-negative test; instead, a 'nocebo' effect is likely to be implicated. On the other hand, in a double-blind randomized study, a dose of 25 g fructose was suggested as the most appropriate for testing individuals with suspected fructose malabsorption, whereas the reliability of symptoms for diagnosing fructose intolerance was poor. Whereas the development of symptoms after a positive HBT may indicate sugar intolerance, it is still not clear whether the absence of symptoms after sugar malabsorption gives any indication as to the role of that sugar in the genesis of the patient's complaints. Further studies should evaluate whether the disappearance of symptoms with a sugar-restricted diet after a positive HBT is a better diagnostic criterion of sugar intolerance than the development of symptoms.
V. V. Nadurak
Purpose of the research is a critical analysis of the reliability of intuitive moral decisions. Methodology. The work is based on the methodological attitude of empirical ethics, involving the use of findings from empirical research in ethical reflection and decision making. Originality. The main kinds of intuitive moral decisions are identified: (1) intuitively emotional decisions, i.e. decisions made under the influence of the emotions that accompany the process of moral decision making; (2) decisions made under the influence of morally risky psychological aptitudes, i.e. unconscious human tendencies that make us think in a certain way and make decisions that are unacceptable from the logical and ethical point of view; (3) intuitively normative decisions, i.e. decisions made under the influence of socially learned norms that cause the evaluative feeling «good-bad» without conscious reasoning. It was found that all of these kinds of intuitive moral decisions can lead to mistakes in moral life. Conclusions. Considering the fact that intuition systematically leads to erroneous moral decisions, intuitive reactions cannot be the only source for making such decisions. Conscious rational reasoning can compensate for the weaknesses of intuition. In this case, there is a need for a theoretical model that would structure the knowledge about the interactions between intuitive and rational factors in moral decision making and become the basis for suggestions that would help us make the right moral decision.
O. O. Matusevych
The author proposes numerous methods for solving the multi-criterion task of increasing the reliability of a control system on the basis of expert information. The information that allows a well-founded choice of the method of reliability improvement for an electric transport control system is considered.
Digital terrain models (DTMs) represent segments of spatial databases related to the presentation of terrain features and landforms. Square-grid elevation models (DEMs) have emerged as the most widely used structure during the past decade because of their simplicity and simple computer implementation. They have become an important segment of Topographic Information Systems (TIS), storing natural and artificial landscape in the form of digital models. This kind of data structure is especially suitable for morphometric terrain evaluation and analysis, which is very important in environmental and urban planning and Earth surface modeling applications. One of the most often used functionalities of Geographical Information Systems software packages is intervisibility, or viewshed, analysis of terrain. Intervisibility determination from analog topographic maps may be very exhausting because of the large number of profiles that have to be extracted and compared. Terrain representation in the form of DEM databases facilitates this task. This paper describes a simple algorithm for terrain viewshed analysis using DEM database structures, taking into consideration the influence of the uncertainties of such data on the results obtained. The concept of probability maps is introduced as a means of evaluating the results and is presented as a thematic display.
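The viewshed computation described above reduces, per target cell, to a line-of-sight test along the profile between observer and target. A minimal grid-based sketch of that test; real GIS implementations interpolate elevations between cells and account for Earth curvature and refraction, and the DEM below is invented:

```python
def line_of_sight(dem, observer, target, observer_height=1.8):
    """Check whether `target` is visible from `observer` over a
    square-grid DEM (list of lists of elevations), by sampling
    cell elevations along the straight line between them."""
    (r0, c0), (r1, c1) = observer, target
    z0 = dem[r0][c0] + observer_height     # eye level above the ground
    z1 = dem[r1][c1]
    steps = max(abs(r1 - r0), abs(c1 - c0))
    for k in range(1, steps):
        f = k / steps
        r = round(r0 + f * (r1 - r0))
        c = round(c0 + f * (c1 - c0))
        sight_z = z0 + f * (z1 - z0)       # height of the sight line here
        if dem[r][c] > sight_z:            # terrain blocks the line
            return False
    return True

dem = [
    [10, 10, 10, 10],
    [10, 10, 30, 10],   # a ridge cell in the middle row
    [10, 10, 10, 10],
]
assert line_of_sight(dem, (0, 0), (2, 0)) is True    # flat path: visible
assert line_of_sight(dem, (1, 0), (1, 3)) is False   # ridge blocks the view
```

A full viewshed simply repeats this test from one observer to every cell; the probability maps the paper proposes would attach to each cell the chance that this boolean outcome flips under DEM elevation uncertainty.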
Gutiérrez, Nicolás L.; Valencia, Sarah R.; Branch, Trevor A.; Agnew, David J.; Baum, Julia K.; Bianchi, Patricia L.; Cornejo-Donoso, Jorge; Costello, Christopher; Defeo, Omar; Essington, Timothy E.; Hilborn, Ray; Hoggarth, Daniel D.; Larsen, Ashley E.; Ninnes, Chris; Sainsbury, Keith; Selden, Rebecca L.; Sistla, Seeta; Smith, Anthony D. M.; Stern-Pirlot, Amanda; Teck, Sarah J.; Thorson, James T.; Williams, Nicholas E.
Concerns over fishing impacts on marine populations and ecosystems have intensified the need to improve ocean management. One increasingly popular market-based instrument for ecological stewardship is the use of certification and eco-labeling programs to highlight sustainable fisheries with low environmental impacts. The Marine Stewardship Council (MSC) is the most prominent of these programs. Despite widespread discussions about the rigor of the MSC standards, no comprehensive analysis of the performance of MSC-certified fish stocks has yet been conducted. We compared status and abundance trends of 45 certified stocks with those of 179 uncertified stocks, finding that 74% of certified fisheries were above biomass levels that would produce maximum sustainable yield, compared with only 44% of uncertified fisheries. On average, the biomass of certified stocks increased by 46% over the past 10 years, whereas uncertified fisheries increased by just 9%. As part of the MSC process, fisheries initially go through a confidential pre-assessment process. When certified fisheries are compared with those that decline to pursue full certification after pre-assessment, certified stocks had much lower mean exploitation rates (67% of the rate producing maximum sustainable yield vs. 92% for those declining to pursue certification), allowing for more sustainable harvesting and in many cases biomass rebuilding. From a consumer’s point of view this means that MSC-certified seafood is 3–5 times less likely to be subject to harmful fishing than uncertified seafood. Thus, MSC-certification accurately identifies healthy fish stocks and conveys reliable information on stock status to seafood consumers. PMID:22928029
Swain, A D; Guttmann, H E
The primary purpose of the Handbook is to present methods, models, and estimated human error probabilities (HEPs) to enable qualified analysts to make quantitative or qualitative assessments of occurrences of human errors in nuclear power plants (NPPs) that affect the availability or operational reliability of engineered safety features and components. The Handbook is intended to provide much of the modeling and information necessary for the performance of human reliability analysis (HRA) as a part of probabilistic risk assessment (PRA) of NPPs. Although not a design guide, a second purpose of the Handbook is to enable the user to recognize error-likely equipment design, plant policies and practices, written procedures, and other human factors problems so that improvements can be considered. The Handbook provides the methodology to identify and quantify the potential for human error in NPP tasks.
Travel-time reliability is a key performance measure in any transportation system. It is a measure of the quality of travel time experienced by transportation system users and reflects the efficiency of the transportation system to serve citizens, bu...
environmental conditions at the time of the reported failure as well as the exact nature of the failure. The diskette format (FMDR-21A) contains ... based upon the reliability and maintainability standards and tasks delineated in NAC R&M-STD-ROO010 (Reliability Program Requirements Selection). These ... characteristics, environmental conditions at the time of the reported failure, and the exact nature of the failure, which has been categorized as follows
Wei Zhao; Yi-Min Zhang
The vibration transmission path systems are generally composed of the vibration source, the vibration transfer path, and the vibration receiving structure. The transfer path is the medium of the vibration transmission. Moreover, the randomness of the transfer path greatly influences the transfer reliability. In this paper, based on matrix calculus, the generalized second moment technique, and stochastic finite element theory, an effective approach for the transfer reliability of vibration transfer path systems was provided.
Botella, Juan; Suero, Manuel; Gambara, Hilda
A meta-analysis of the reliability of the scores from a specific test, also called reliability generalization, allows the quantitative synthesis of its properties from a set of studies. It is usually assumed that part of the variation in the reliability coefficients is due to some unknown and implicit mechanism that restricts and biases the…
Full Text Available Within urban areas, green spaces play a critically important role in the quality of life. They have remarkable impact on the local microclimate and the regional climate of the city. Quantifying the 'greenness' of urban areas allows comparing urban areas at several levels, as well as monitoring the evolution of green spaces in urban areas, thus serving as a tool for urban and developmental planning. Different categories of vegetation have different impacts on recreation potential and microclimate, as well as on the individual perception of green spaces. However, when quantifying the 'greenness' of urban areas, the reliability of the underlying information is important in order to qualify analysis results. The reliability of geo-information derived from remote sensing data is usually assessed by ground truth validation or by comparison with other reference data. When applying methods of object based image analysis (OBIA) and fuzzy classification, the degrees of fuzzy membership per object in general describe to what degree an object fits (prototypical) class descriptions. Thus, analyzing the fuzzy membership degrees can contribute to the estimation of reliability and stability of classification results, even when no reference data are available. This paper presents an object based method using fuzzy class assignments to outline and classify three different classes of vegetation from GeoEye imagery. The classification result, its reliability and stability are evaluated using the reference-free parameters Best Classification Result and Classification Stability as introduced by Benz et al. in 2004 and implemented in the software package eCognition (www.ecognition.com). To demonstrate the application potentials of results a scenario for quantifying urban 'greenness' is presented.
Szatmary, S. A.
The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES/PC performs statistical analysis of data obtained from the fracture of simple, uniaxial tensile or flexural specimens and estimates the Weibull and Batdorf material parameters from these data. CARES/PC is a subset of the program CARES (COSMIC program number LEW-15168) which calculates the fast-fracture reliability or failure probability of ceramic components utilizing the Batdorf and Weibull models to describe the effects of multi-axial stress states on material strength. CARES additionally requires that the ceramic structure be modeled by a finite element program such as MSC/NASTRAN or ANSYS. The more limited CARES/PC does not perform fast-fracture reliability estimation of components. CARES/PC estimates ceramic material properties from uniaxial tensile or from three- and four-point bend bar data. In general, the parameters are obtained from the fracture stresses of many specimens (30 or more are recommended) whose geometry and loading configurations are held constant. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests measure the accuracy of the hypothesis that the fracture data comes from a population with a distribution specified by the estimated Weibull parameters. Ninety-percent confidence intervals on the Weibull parameters and the unbiased value of the shape parameter for complete samples are provided.
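As a sketch of the maximum-likelihood estimation step that CARES/PC performs, the two-parameter Weibull shape and scale can be recovered from a set of fracture stresses by solving the profile likelihood equation for the shape by bisection. This is plain NumPy on synthetic data, not the CARES implementation; the sample size and true parameters are invented.

```python
import numpy as np

def weibull_mle(x, iters=200):
    """Two-parameter Weibull MLE from fracture stresses (a sketch).

    Solves the profile likelihood equation for the shape m by bisection
    (the equation is scale-invariant), then recovers the scale in closed
    form from the fitted shape.
    """
    x = np.asarray(x, dtype=float)
    y = x / x.max()                      # rescale so y**m cannot overflow
    logy = np.log(y)
    lo, hi = 0.01, 100.0
    for _ in range(iters):
        m = 0.5 * (lo + hi)
        ym = y ** m
        # f(m) is increasing in m and crosses zero at the MLE
        f = (ym * logy).sum() / ym.sum() - 1.0 / m - logy.mean()
        if f < 0.0:
            lo = m
        else:
            hi = m
    scale = x.max() * ym.mean() ** (1.0 / m)
    return m, scale

# Synthetic fracture data: 200 specimens, true shape 10, scale 500 MPa
rng = np.random.default_rng(0)
stresses = 500.0 * rng.weibull(10.0, size=200)
shape_hat, scale_hat = weibull_mle(stresses)
```

With 30+ specimens, as the abstract recommends, the estimates land close to the true parameters; the goodness-of-fit tests mentioned (Kolmogorov-Smirnov, Anderson-Darling) would then be run against the fitted distribution.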
Murthy, Pappu L. N.; Mital, Subodh K.; Gyekenyesi, John Z.; Gyekenyesi, John P.
High temperature ceramic matrix composites (CMC) are being explored as viable candidate materials for hot section gas turbine components. These advanced composites can potentially lead to reduced weight and enable higher operating temperatures requiring less cooling; thus leading to increased engine efficiencies. There is a need for convenient design tools that can accommodate various loading conditions and material data with their associated uncertainties to estimate the minimum predicted life as well as the failure probabilities of a structural component. This paper presents a review of the life prediction and probabilistic analyses performed for a CMC turbine stator vane. A computer code, NASALife, is used to predict the life of a 2-D woven silicon carbide fiber reinforced silicon carbide matrix (SiC/SiC) turbine stator vane due to a mission cycle which induces low cycle fatigue and creep. The output from this program includes damage from creep loading, damage due to cyclic loading and the combined damage due to the given loading cycle. Results indicate that the trends predicted by NASALife are as expected for the loading conditions used for this study. In addition, a combination of woven composite micromechanics, finite element structural analysis and Fast Probability Integration (FPI) techniques has been used to evaluate the maximum stress and its probabilistic distribution in a CMC turbine stator vane. Input variables causing scatter are identified and ranked based upon their sensitivity magnitude. Results indicate that reducing the scatter in proportional limit strength of the vane material has the greatest effect in improving the overall reliability of the CMC vane.
Toro A, Richard; Campos, Claudia; Molina, Carolina; Morales S, Raul G E; Leiva-Guzmán, Manuel A
A critical analysis of Chile's National Air Quality Information System (NAQIS) is presented, focusing on particulate matter (PM) measurement. This paper examines the complexity, availability and reliability of monitoring station information, the implementation of control systems, the quality assurance protocols of the monitoring station data and the reliability of the measurement systems in areas highly polluted by particulate matter. From information available on the NAQIS website, it is possible to confirm that the PM2.5 (PM10) data available on the site correspond to 30.8% (69.2%) of the total information available from the monitoring stations. There is a lack of information regarding the measurement systems used to quantify air pollutants, most of the available data registers contain gaps, almost all of the information is categorized as "preliminary information" and neither standard operating procedures (operational and validation) nor assurance audits or quality control of the measurements are reported. In contrast, events that cause saturation of the monitoring detectors located in northern and southern Chile have been observed using beta attenuation monitoring. In these cases, it can only be concluded that the PM content is equal to or greater than the saturation concentration registered by the monitors and that the air quality indexes obtained from these measurements are underestimated. This occurrence has been observed in 12 (20) public and private stations where PM2.5 (PM10) is measured. The shortcomings of the NAQIS data have important repercussions for the conclusions obtained from the data and for how the data are used. However, these issues represent opportunities for improving the system to widen its use, incorporate comparison protocols between equipment, install new stations and standardize the control system and quality assurance.
Mastroianni, Patrícia C; Noto, Ana Regina; Galduróz, José Carlos F
According to the World Health Organization, medicinal drug promotion should be reliable, accurate, truthful, informative, balanced, up-to-date and capable of substantiation. The objective of the present study was to review psychoactive drug advertisements directed at physicians for consistency of the information with the cited references and for accessibility of those references. Data were collected in the city of Araraquara, Southeastern Brazil, in 2005. A total of 152 drug advertisements, containing 304 references, were collected and reviewed. References were requested directly from the pharmaceutical companies' customer services and searched in the UNESP (Ibict, Athenas) and BIREME (SciELO, PubMed, free-access indexed journals) library networks and CAPES journals. Advertisement statements were checked against the references using content analysis. Of all references cited in the advertisements studied, 66.7% were accessed. Of 639 promotional statements identified, 346 (54%) were analyzed. The analysis showed that 67.7% of the promotional statements in the advertisements were consistent with their references, while the remainder were either partially consistent or inconsistent. In the material analyzed, an average of 2.5 references (range 1-28) was cited per advertisement. In the text body, 639 pieces of information clearly associated with at least one cited reference were identified (an average of 3.5 pieces of information per advertisement). The study results evidenced difficult access to the references. Messages on efficacy, safety and cost, among others, are not always supported by scientific studies. There is a need for regulation changes and effective monitoring of drug promotional materials.
Alba-Ruiz, Ruben; Bermúdez-Tamayo, Clara; Pernett, Jaime Jiménez; Garcia-Gutierrez, Jose Francisco; Cózar-Olmo, José Manuel; Valero-Aguilera, Beatriz
People who use the Internet to research health topics do not usually find all the information they need and do not trust what they read. This study was designed to assess the reliability, accessibility, readability, and popularity of cancer Web sites in Spanish and to analyze the suitability of Web site content in accordance with the specific information needs of cancer patients. This was a two-phase, cross-sectional, descriptive study. The first phase involved data gathering through online searches and direct observation. The second phase involved individual structured interviews with 169 patients with breast, prostate, bladder, and kidney cancer. Spearman rank correlations were calculated between variables. Most sites belonged to nonprofit organizations, followed by universities or medical centers (14%). Thirty-one percent of the Web sites had quality seals, 59% provided details of authorship, 62% provided references to bibliographic sources, 38% identified their funding sources, and 54% showed the date of their last update. Twenty-one percent of the Web sites did not meet the minimum accessibility criteria. With regard to readability, 24% of the texts were considered to be "quite difficult." Patients' information needs vary depending on the type of cancer they have, although all patients want to know about the likelihood of a cure, survival rates, the side effects, and risks of treatment. The health information on cancer available on the Internet in Spanish is not very reliable, accessible, or readable and is not necessarily the information that breast, kidney, prostate, and bladder cancer patients require. The content of cancer Web sites needs to be assessed according to the information needs of patients.
Full Text Available Corrosion is recognized as one of the most important degradation mechanisms that affect the long-term reliability and integrity of metallic structures. Studying structural reliability under pitting corrosion damage is useful for risk control and safe operation of the corroded structure. This paper proposes a structural corrosion reliability analysis approach based on a physics-based failure model of pitting corrosion, in which the stages of pit growth, pit-to-crack transition, and crack propagation are included in the failure model. Different probabilistic analysis methods such as Monte Carlo Simulation (MCS), the First-Order Reliability Method (FORM), the Second-Order Reliability Method (SORM), and the response surface method are then employed to calculate the reliability. Finally, an example is presented to demonstrate the capability of the proposed structural reliability model and calculation methods for structural corrosion failure analysis.
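A minimal Monte Carlo version of this kind of pitting-corrosion reliability calculation can be sketched as follows. The pit-growth law d(t) = C·sqrt(t) and every distribution and parameter value below are assumed purely for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000                                  # Monte Carlo samples

# Assumed pit-growth law d(t) = C * sqrt(t); C and d_crit are random.
C = rng.lognormal(mean=np.log(0.10), sigma=0.2, size=n)  # mm / sqrt(yr)
d_crit = rng.normal(loc=2.0, scale=0.2, size=n)          # critical depth, mm
t = 100.0                                                # service time, yr

# Limit state g = d_crit - d(t); failure occurs when g < 0
g = d_crit - C * np.sqrt(t)
pf = np.mean(g < 0.0)                          # MCS estimate of Pf
```

The FORM/SORM methods named in the abstract approximate the same probability analytically from the limit-state function instead of sampling it, which matters when Pf is too small for plain MCS to resolve.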
Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan
As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.
Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan
As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.
Vaniachine, A; The ATLAS collaboration; Karpenko, D
During three years of LHC data taking, the ATLAS collaboration completed three petascale data reprocessing campaigns on the Grid, with up to 2 PB of data being reprocessed every year. In reprocessing on the Grid, failures can occur for a variety of reasons, while Grid heterogeneity makes failures hard to diagnose and repair quickly. As a result, Big Data processing on the Grid must tolerate a continuous stream of failures, errors and faults. While ATLAS fault-tolerance mechanisms improve the reliability of Big Data processing in the Grid, their benefits come at costs and result in delays making the performance prediction difficult. Reliability Engineering provides a framework for fundamental understanding of the Big Data processing on the Grid, which is not a desirable enhancement but a necessary requirement. In ATLAS, cost monitoring and performance prediction became critical for the success of the reprocessing campaigns conducted in preparation for the major physics conferences. In addition, our Reliability...
Full Text Available The vibration transmission path systems are generally composed of the vibration source, the vibration transfer path, and the vibration receiving structure. The transfer path is the medium of the vibration transmission. Moreover, the randomness of the transfer path greatly influences the transfer reliability. In this paper, based on matrix calculus, the generalized second moment technique, and stochastic finite element theory, an effective approach for the transfer reliability of vibration transfer path systems was provided. The transfer reliability of a vibration transfer path system with uncertain path parameters, including path mass and path stiffness, was analyzed theoretically and computed numerically, and the corresponding mathematical expressions were derived. Thus, it provides the theoretical foundation for the dynamic design of vibration systems in practical projects, so that random path parameters can be considered when solving random problems for vibration transfer path systems, which can help avoid system resonance failure.
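The second-moment idea can be illustrated on a deliberately simplified single-path example: a static response u = F/k with a random path stiffness k, where the reliability index follows from first-order propagation of the stiffness uncertainty. All numbers are invented; the paper's stochastic finite element formulation handles full mass/stiffness matrices instead of this scalar case.

```python
import math

# Hypothetical single transfer path: response amplitude u = F / k_path.
F, u_allow = 1000.0, 0.5          # load (N) and allowable response (mm)
mu_k, sigma_k = 2500.0, 250.0     # path stiffness (N/mm): mean and std

# Limit state g(k) = u_allow - F/k; failure when g < 0.
g_mu = u_allow - F / mu_k                      # first-order mean of g
dg_dk = F / mu_k**2                            # |dg/dk| evaluated at the mean
g_sigma = dg_dk * sigma_k                      # second-moment std of g
beta = g_mu / g_sigma                          # reliability index
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))    # Pf = Phi(-beta)
```

The generalized second moment technique named in the abstract is this propagation carried out with matrix calculus over all uncertain path parameters at once.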
Karki, Rajesh; Verma, Ajit Kumar
The volume presents the research work in understanding, modeling and quantifying the risks associated with different ways of implementing smart grid technology in power systems in order to plan and operate a modern power system with an acceptable level of reliability. Power systems throughout the world are undergoing significant changes, creating new challenges for system planning and operation in order to provide reliable and efficient use of electrical energy. The appropriate use of smart grid technology is an important driver in mitigating these problems and requires considerable research acti
Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte
Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned Factor influences the assessment, and why and when they should be included in the model....
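Sequential Monte Carlo simulation of a single turbine's availability can be sketched with a two-state (up/down) model using exponential times to failure and repair. The MTTF/MTTR values below are assumptions for illustration, not data from the paper; a windfarm model would run many such units plus wind-speed and wake factors.

```python
import numpy as np

rng = np.random.default_rng(1)
mttf, mttr = 1900.0, 80.0       # assumed mean time to failure / repair, hours
horizon = 1_000_000.0           # total simulated hours

t, up_time, is_up = 0.0, 0.0, True
while t < horizon:
    # draw the sojourn time in the current state, clip at the horizon
    dwell = min(rng.exponential(mttf if is_up else mttr), horizon - t)
    if is_up:
        up_time += dwell
    t += dwell
    is_up = not is_up           # alternate between up and down states

availability = up_time / horizon   # compare with mttf / (mttf + mttr)
```

Because the simulation walks through time sequentially, correlated effects (wind persistence, maintenance windows) can be layered onto the same event loop, which is the reason sequential MCS is preferred over state sampling for windfarm studies.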
El Hami, Abdelkhalak
In operation, mechatronic embedded systems are stressed by loads of different origins: climatic (temperature, humidity), vibrational, electrical and electromagnetic. The failure mechanisms that these stresses induce in components should be identified and modeled for better control. AUDACE is a collaborative project of the Mov'eo cluster that addresses issues specific to the reliability of embedded mechatronic systems. AUDACE stands for analyzing the causes of failure of components of onboard mechatronic systems. The goal of the project is to optimize the design of mechatronic devices for reliability. The projec
Sørensen, John Dalsgaard
Wind turbines are exposed to highly dynamic loads that cause fatigue and extreme load effects which are subject to significant uncertainties. Further, reduction of the cost of energy for wind turbines is very important in order to make wind energy competitive compared to other energy sources......, substructure and foundation, considering especially fatigue loads. The function of a wind turbine is highly dependent on many electrical and mechanical components as well as a control system; reliability aspects of these components are also discussed, and it is described how their reliability influences...
Kristjanson Linda J
Full Text Available Abstract Background It is difficult to determine the most effective approach to patient education, or to tailor education interventions for patients in radiotherapy, without tools that assess patients' specific radiation therapy information needs and concerns. Therefore, the aim of this study was to develop psychometrically sound tools to adequately determine the concerns and information needs of cancer patients during radiation therapy. Patients and Methods Two tools were developed to (1) determine patients' concerns about radiation therapy (RT Concerns Scale) and (2) ascertain patients' information needs at different time points during their radiation therapy (RT Information Needs Scale). The tools were based on previous research by the authors, published literature on breast cancer and radiation therapy, and information behaviour research. Thirty-one breast cancer patients completed the questionnaire on one occasion, and thirty participants completed the questionnaire on a second occasion to facilitate test-retest reliability. One participant's responses were removed from the analysis. Results were analysed for content validity, internal consistency and stability over time. Results Both tools demonstrated high internal consistency and adequate stability over time. The nine items in the RT Concerns Scale were retained because they met all pre-set psychometric criteria. Two items were deleted from the RT Information Needs Scale because they did not meet content validity criteria and did not achieve pre-specified criteria for internal consistency. This tool now contains 22 items. Conclusion This paper provides preliminary data suggesting that the two tools presented are reliable and valid and would be suitable for use in trials or in the clinical setting.
Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
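The RELIABILITY procedure referenced above computes, among other statistics, Cronbach's alpha, which needs only an item covariance matrix as input. A short illustration in Python, using an ordinary sample covariance from synthetic complete data (the respondent count, item count and loadings are invented); in the workflow the abstract describes, the EM covariance matrix produced via MVA would take its place.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic complete data: 200 respondents, 5 items sharing a common factor
factor = rng.normal(size=(200, 1))
items = factor + rng.normal(scale=0.8, size=(200, 5))

# Ordinary sample covariance here; the EM covariance matrix from MVA
# would be substituted when the raw data contain missing values.
cov = np.cov(items, rowvar=False)

k = cov.shape[0]
alpha = (k / (k - 1)) * (1.0 - np.trace(cov) / cov.sum())  # Cronbach's alpha
```

Because alpha depends only on the covariance matrix, feeding RELIABILITY an EM covariance matrix (rather than listwise-deleted data) changes only the input estimate, not the formula.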
Background: Computer programmes have been introduced to electrocardiography (ECG), with most physicians in Africa depending on computer interpretation of the ECG. This study was undertaken to evaluate the reliability of computer interpretation of the 12-lead ECG in the Black race. Methodology: Using the SCHILLER ...
The defence against flooding by storm surges and large river discharges is provided by complex systems of dikes, dunes, retaining walls, higher grounds, barriers, locks and so on. Spatial correlations inside and between the various components play an important role in the reliability and risk
This paper presents the structural reliability assessment of a two-span timber floor of strength class ... buildings use some form of wood-based panel products ... [fragment of a random-variables table: ... grain (N/mm²), Lognormal, 0.15; 4. Imposed load (N/mm), Gumbel, 0.30; 5. B.]
1Department of Mathematics, Faculty of Science, Usmanu Danfodiyo University, Sokoto, Nigeria ... INTRODUCTION. Reliability is vital for the proper utilization and maintenance of any system. It involves techniques for increasing system effectiveness through reducing .... Let P(t) denote the probability row vector at time t, the ...
Waters, Robert D. [Vanderbilt Univ., Nashville, TN (United States)
Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
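The relationship between a deterministic safety factor and a probabilistic reliability level, which the study examines across the five treatment processes, can be illustrated with a generic Monte Carlo sketch. This is not the authors' model: capacity and demand are simply assumed lognormal with invented coefficients of variation, and the safety factor is defined as the ratio of their medians.

```python
import math
import random

def reliability(safety_factor, n=20000, cov_c=0.3, cov_d=0.3, seed=7):
    """Monte Carlo estimate of P(capacity >= demand) for lognormal C and D."""
    rng = random.Random(seed)
    sc = math.sqrt(math.log(1.0 + cov_c ** 2))   # lognormal sigma for capacity
    sd = math.sqrt(math.log(1.0 + cov_d ** 2))   # lognormal sigma for demand
    mu_c = math.log(safety_factor)               # median demand fixed at 1.0
    ok = 0
    for _ in range(n):
        c = math.exp(rng.gauss(mu_c, sc))
        d = math.exp(rng.gauss(0.0, sd))
        ok += c >= d
    return ok / n

r_sf1 = reliability(1.0)   # no safety margin: roughly a coin flip
r_sf2 = reliability(2.0)
r_sf3 = reliability(3.0)
```

With the same random seed the comparison uses common random numbers, so reliability increases monotonically with the safety factor, which is the qualitative relationship the probabilistic analysis quantifies.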
Valentin, Stephanie; Yeates, Tobey DeMott; Licka, Theresia; Elliott, James
Inter-rater reliability of generalised lumbar extensor muscle CSA has been identified, however, more detailed reliability metrics of individual trunk muscles are lacking. To report muscle volume and muscle fatty infiltrate (MFI) inter-rater reliability of individual trunk muscles between two novice assessors. Lumbar axial MRI scans from 10 healthy male participants were analysed. The muscles erector spinae (ES), multifidus (M), rectus abdominis (RA), and psoas (PS) were manually traced, region of interest quantified and muscle volume and MFI determined by both assessors. Agreement between the assessors was determined using intraclass correlation coefficients (3,1), Bland-Altman plots and Lin's concordance coefficient. Good to excellent agreement was found for volume (ICC 0.77-0.96) and MFI (0.84-0.96) for all muscles on first evaluation, except for M volume, which required a second evaluation. Best agreement for muscle volume and MFI was found for ES (ICC 0.96). First evaluation of muscle volume and MFI yields high to excellent inter-rater agreement, except for M, where further training and/or experience is required to achieve acceptable reliability outcomes. This may have clinical implications due to the relevance of M atrophy reported in patients with low back pain.
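The intraclass correlation coefficient (3,1) used above can be computed from a two-way ANOVA decomposition of the ratings. The sketch below is a generic stdlib-Python implementation of the ICC(3,1) consistency formula with made-up ratings; it is not tied to the study's MRI data.

```python
def icc_3_1(ratings):
    """ICC(3,1), two-way mixed, consistency. ratings: one list of rater scores per subject."""
    n = len(ratings)
    k = len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between-subject
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between-rater
    ss_err = ss_total - ss_rows - ss_cols                    # residual
    msr = ss_rows / (n - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)

# invented ratings: rater 2 scores a constant 0.5 above rater 1
icc_offset = icc_3_1([[5, 5.5], [7, 7.5], [3, 3.5], [9, 9.5]])  # consistency ICC is 1.0
icc_noisy = icc_3_1([[5, 6], [7, 7], [3, 2], [9, 10], [4, 4]])
```

A systematic offset between raters does not reduce the consistency ICC, which is one reason studies like this one also report Bland-Altman plots and Lin's concordance coefficient.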
Background: Brain fag is an indigenous psychopathology or culture-bound syndrome formally documented in Nigeria in the 1960s by Raymond Prince. Objective: To conduct a factorial examination of the scale to ensure factorial validity and to examine the reliability of this screening scale. Methods: Two hundred thirty ...
Rahmani, Cobra Mariam
In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…
In this paper, probabilistic models for a system with different stage deteriorations have been developed to analyze and compare some reliability characteristics. Three configurations are studied under the assumption that each state working at reduced capacity is minimally repaired and the system is replaced at failure.
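Reliability characteristics of such multi-state deterioration models are often read off the steady-state distribution of a Markov chain. The sketch below is a generic three-state discrete-time illustration (full capacity, reduced capacity with minimal repair, failure with replacement); the transition probabilities are invented for illustration and are not taken from the paper.

```python
def steady_state(P, iters=2000):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# states: 0 full capacity, 1 reduced capacity, 2 failed
# minimal repair returns some mass from state 1 to state 0;
# replacement at failure restores the system to state 0 (invented numbers)
P = [
    [0.95, 0.04, 0.01],
    [0.10, 0.85, 0.05],
    [1.00, 0.00, 0.00],
]
pi = steady_state(P)
availability = pi[0] + pi[1]   # fraction of time in a working state
```

Comparing configurations then amounts to comparing such availabilities (or expected capacities) under the corresponding transition matrices.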
The human reliability analysis (HRA) is a highly subjective evaluation of human performance and an input to probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysi...
Saleh, Joseph Homer; Geng, Fan; Ku, Michelle; Walker, Mitchell L. R.
With a few hundred spacecraft launched to date with electric propulsion (EP), it is possible to conduct an epidemiological study of EP's on orbit reliability. The first objective of the present work was to undertake such a study and analyze EP's track record of on orbit anomalies and failures by different covariates. The second objective was to provide a comparative analysis of EP's failure rates with those of chemical propulsion. Satellite operators, manufacturers, and insurers will make reliability- and risk-informed decisions regarding the adoption and promotion of EP on board spacecraft. This work provides evidence-based support for such decisions. After a thorough data collection, 162 EP-equipped satellites launched between January 1997 and December 2015 were included in our dataset for analysis. Several statistical analyses were conducted, at the aggregate level and then with the data stratified by severity of the anomaly, by orbit type, and by EP technology. Mean Time To Anomaly (MTTA) and the distribution of the time to (minor/major) anomaly were investigated, as well as anomaly rates. The important findings in this work include the following: (1) Post-2005, EP's reliability has outperformed that of chemical propulsion; (2) Hall thrusters have robustly outperformed chemical propulsion, and they maintain a small but shrinking reliability advantage over gridded ion engines. Other results were also provided, for example the differentials in MTTA of minor and major anomalies for gridded ion engines and Hall thrusters. It was shown that: (3) Hall thrusters exhibit minor anomalies very early on orbit, which might be indicative of infant anomalies, and thus would benefit from better ground testing and acceptance procedures; (4) Strong evidence exists that EP anomalies (onset and likelihood) and orbit type are dependent, a dependence likely mediated by either the space environment or differences in thruster duty cycles; (5) Gridded ion thrusters exhibit both
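A standard way to estimate a Mean Time To Anomaly from on-orbit data, where many units are right-censored (still anomaly-free when observation ends), is the exponential maximum-likelihood estimator: number of events divided by total accumulated exposure. The sketch below uses a tiny set of invented satellite records, not the 162-satellite dataset of the study.

```python
def mtta_exponential(times, observed):
    """Exponential MLE of mean time to anomaly with right-censoring.

    times[i]:    years from launch to anomaly, or to end of observation if censored
    observed[i]: True if an anomaly occurred, False if the unit is right-censored
    """
    events = sum(1 for o in observed if o)
    exposure = sum(times)        # every unit contributes its full time on orbit
    return exposure / events     # 1 / lambda_hat

# invented example: two anomalies over 20 satellite-years of total exposure
times = [2.0, 4.0, 6.0, 8.0]
observed = [True, True, False, False]
mtta = mtta_exponential(times, observed)
```

Ignoring the censored units (instead of counting their exposure) would badly underestimate the MTTA, which is why censoring-aware estimators matter in this kind of epidemiological study.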
Sullivan, T Barrett; Anderson, Joshua T; Ahn, Uri M; Ahn, Nicholas U
appointment scheduling. Seven percent of sites were classified as excellent quality, 6% as high quality, 11% as moderate quality, 19% as poor quality, and 57% as unacceptable. Sixteen percent of sites were sponsored by academic institutions, 62% by private groups, 8% by biomedical device companies, and 14% were sponsored otherwise. Academic sites reported fewer risks of the procedure than private sites or other sites (p = 0.05 and p = 0.04), but reported more risks than industry sites (p = 0.007). Academic sites were more likely than sites classified as other to offer contact information for patient appointment scheduling (p = 0.004). Nine percent of sites evaluated were Health on the Net Foundation (HONCode) certified. No association with improved information quality was observed in these sites relative to noncertified sites (all p > 0.05). Internet information regarding vertebroplasty is not only inadequate for proper patient education, but also potentially misleading as sites are more likely to present benefits of the procedure than risks. Although academic sites might be expected to offer higher-quality information than private, industry, or other sites, our data would suggest that they do not. HONCode certification cannot be used reliably as a means of qualifying website information quality. Academic sites should be expected to set a high standard and alter their Internet presence with adequate information distribution. Certification bodies also should alter their standards to necessitate provision of complete information in addition to emphasizing accurate information. Treating physicians may want to counsel their patients regarding the limitations of information present on the Internet and the pitfalls of current certification systems. Level IV, economic and decision analyses. See the Instructions for Authors for a complete description of levels of evidence.
Meshkat, Leila; Grenander, Sven; Evensen, Ken
- In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes.
- The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions.
- Applicable human reliability metrics for performing these atomic-level tasks are available.
- The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated.
- The PRA models are executed using data from human reliability data banks.
- The Periodic Table is related to the PRA models via Fault Links.
Full Text Available Reliability-based maintenance policies allow qualitative and quantitative evaluation of system downtimes via revealing main causes of breakdowns and discussing required preventive activities against failures. Application of preventive maintenance is especially important for mining machineries since production is highly affected by machinery breakdowns. Overburden stripping operations are one of the integral parts of surface coal mine production. Draglines are extensively utilized in overburden stripping operations and they achieve earthmoving activities with bucket capacities up to 168 m3. The massive structure and operational severity of these machines increase the importance of performance awareness for individual working components. Research on draglines is rarely observed in the literature and maintenance studies for these earthmovers have been generally ignored. On this basis, this paper offers a comprehensive reliability assessment for two draglines currently operating in the Tunçbilek coal mine and discusses preventive replacement for wear-out components of the draglines considering cost factors.
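Preventive replacement of wear-out components of the kind discussed here is classically set by minimizing the long-run cost rate of an age-replacement policy, C(T) = [cp·R(T) + cf·F(T)] / integral of R(u) du from 0 to T, where cp and cf are preventive and failure replacement costs. The sketch below assumes a Weibull life distribution with invented parameters; it is a generic illustration, not the paper's dragline analysis.

```python
import math

def cost_rate(T, beta, eta, cp, cf, steps=400):
    """Long-run cost per unit time of replacing at age T (Weibull life)."""
    R = lambda t: math.exp(-((t / eta) ** beta))   # survival function
    h = T / steps
    # expected cycle length = integral of R from 0 to T (trapezoid rule)
    cycle = sum((R(i * h) + R((i + 1) * h)) / 2.0 * h for i in range(steps))
    return (cp * R(T) + cf * (1.0 - R(T))) / cycle

def best_interval(beta, eta, cp, cf):
    """Grid search for the cost-minimizing replacement age."""
    grid = [eta * k / 100.0 for k in range(5, 301)]
    return min(grid, key=lambda T: cost_rate(T, beta, eta, cp, cf))

# invented wear-out component: shape 3 (ageing), scale 1000 h, failure 10x costlier
T_opt = best_interval(3.0, 1000.0, 1.0, 10.0)
```

For a wear-out component (Weibull shape above 1) with failures costlier than planned replacements, the optimum is finite and well below the characteristic life, which is the economic argument for preventive replacement.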
Decisions. The SHORAD leadership (battery and platoon) and the division airspace management element (DAME) are concerned with the implementation of the ... test it was determined that the most significant feature of Reliable STING was its ability to pinpoint aircraft locations. Also, the rate of ... distance that an aircraft travels between radar detection and fire unit receipt of the track report is the most critical of the problems identified above
Gupta, N.; Tiwari, B. N.; Bellucci, S.
This paper presents the intrinsic geometric model for the solution of power system planning and its operation. This problem is large-scale and nonlinear, in general. Thus, we have developed the intrinsic geometric model for the network reliability and voltage stability, and examined it for the IEEE 5 bus system. The robustness of the proposed model is illustrated by introducing variations of the network parameters. Exact analytical results show the accuracy as well as the efficiency of the pr...
Megan E McCool
Full Text Available Patients actively seek information about how to cope with their health problems, but the quality of the information available varies. A number of instruments have been developed to assess the quality of patient information, though primarily in English. Little is known about the reliability of these instruments when applied to patient information in German. The objective of our study was to investigate and compare the reliability of two validated instruments, DISCERN and EQIP, in order to determine which of these instruments is better suited for a further study pertaining to the quality of information available to German patients with eczema. Two independent raters evaluated a random sample of 20 informational brochures in German. All the brochures addressed eczema as a disorder and/or therapy options and care. Intra-rater and inter-rater reliability were assessed by calculating intra-class correlation coefficients, agreement was tested with weighted kappas, and the correlation of the raters' scores for each instrument was measured with Pearson's correlation coefficient. DISCERN demonstrated substantial intra- and inter-rater reliability. It also showed slightly better agreement than EQIP. There was a strong correlation of the raters' scores for both instruments. The findings of this study support the reliability of both DISCERN and EQIP. However, based on the results of the inter-rater reliability, agreement and correlation analyses, we consider DISCERN to be the more precise tool for our project on patient information concerning the treatment and care of eczema.
Cates, Grant; Gelito, Justin; Stromgren, Chel; Cirillo, William; Goodliff, Kandyce
NASA's future human space exploration strategy includes single and multi-launch missions to various destinations including cis-lunar space, near Earth objects such as asteroids, and ultimately Mars. Each campaign is being defined by Design Reference Missions (DRMs). Many of these missions are complex, requiring multiple launches and assembly of vehicles in orbit. Certain missions also have constrained departure windows to the destination. These factors raise concerns regarding the reliability of launching and assembling all required elements in time to support planned departure. This paper describes an integrated methodology for analyzing launch and assembly reliability in any single DRM or set of DRMs starting with flight hardware manufacturing and ending with final departure to the destination. A discrete event simulation is built for each DRM that includes the pertinent risk factors including, but not limited to: manufacturing completion; ground transportation; ground processing; launch countdown; ascent; rendezvous and docking, assembly, and orbital operations leading up to trans-destination-injection. Each reliability factor can be selectively activated or deactivated so that the most critical risk factors can be identified. This enables NASA to prioritize mitigation actions so as to improve mission success.
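The discrete event simulation described above can be caricatured in a few lines: sample each campaign phase duration with a possible delay, sum the durations, and count how often the campaign fits within the departure window. Phase names, nominal durations, and delay probabilities below are invented placeholders, not NASA DRM data.

```python
import random

def p_meet_window(window_days, n=20000, seed=11):
    """Monte Carlo probability that all campaign phases finish within the window."""
    rng = random.Random(seed)
    # (phase, nominal days, delay probability, delay days) -- invented values
    phases = [("manufacturing completion", 120, 0.30, 30),
              ("ground processing",         45, 0.20, 15),
              ("launch countdown",           1, 0.15,  4),
              ("orbital assembly",          20, 0.10, 10)]
    hits = 0
    for _ in range(n):
        total = 0
        for _name, nominal, p_delay, delay in phases:
            total += nominal + (delay if rng.random() < p_delay else 0)
        hits += total <= window_days
    return hits / n

p_wide = p_meet_window(300)    # generous window: campaign always fits
p_tight = p_meet_window(186)   # window equal to the all-nominal schedule
```

Deactivating one risk factor at a time (setting its delay probability to zero) and re-running is the selective-activation idea the paper uses to rank risk factors for mitigation.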
Đukić Ljiljana C.
Full Text Available Introduction Today, there are many drugs for the treatment of a large number of indication areas. Significant financial resources are invested in research with the aim of introducing reliable therapeutics into therapy. Therefore, it is necessary to provide health care professionals with exact information about new therapies. The overall exchange of scientific data, ideas and information is possible through the numerous communications of modern IT tools. Methodology According to the Law, key information on a registered drug is included in the Summary of Product Characteristics (SPC) for health professionals, which is harmonized with EU directives and regulations (SmPC). The content and structure of the information provided in the SPC are determined in the guidelines of the EU; therefore, a unique set of data is established for all the drugs registered in Serbia. Topic This paper presents the key segments of the SPC, with special reference to the description of the regulations that are required for data related to indications, mechanism of action, dosage, contraindications, side effects, interactions and other important information regarding the profile of the drug, which are standardized and harmonized with the structure of identical documents in force at the EU level (EMEA). Conclusions The SPC is the regulatory determined technical document on medicinal products in the Republic of Serbia which lists scientifically proven clinical and pharmacological data and information on the profile of the drug that are essential for health professionals - doctors and pharmacists - in the implementation of pharmacotherapy in our society. This document is the starting point for the development of applied pharmacoinformatics; it includes a range of activities important for the development of appropriate manuals and makes available data and information for monitoring indicators of the national policy on drugs and modern effective drug treatment.
Conclusion: The explored sociocultural factors influence the human reliability both in qualitative and quantitative manners. The proposed model shows how reliability can be enhanced by some measures such as experience feedback based on, for example, safety improvements, training, and information. With that is added the continuous systems improvements to improve sociocultural reality and to reduce negative behaviors.
Nassiri, Mujtaba; Bruce-Brand, Robert A; O'Neill, Francis; Chenouri, Shojaeddin; Curtin, Paul
Research has shown that up to 89% of parents used the Internet to seek health information regarding their child's medical condition. Much of the information on the Internet is valuable; however, the quality of health information is variable and unregulated. The aim of this study was to evaluate the quality and content of information about Perthes disease on the Internet using recognized scoring systems and identification of quality markers, and to describe a novel specific score. We searched the top 3 search engines (Google, Yahoo!, and Bing) for the following keywords: "Perthes disease." Forty-five unique Web sites were identified. The Web sites were then categorized by type and assessed using the DISCERN score, the Journal of the American Medical Association (JAMA) benchmark criteria, and a novel Perthes-specific Content score. The presence of the Health On the Net (HON) code, a reported quality assurance marker, was noted. Of the Web sites analyzed, the majority were Governmental and Nonprofit Organizations (NPO) (37.8%), followed by commercial Web sites (22.2%). Only 6 of the Web sites were HONcode certified. The mean DISCERN score was 53.1 (SD=9.0). The Governmental and NPO Web sites had the highest overall DISCERN scores followed closely by Physician Web sites. The mean JAMA benchmark criteria score was 2.1 (SD=1.2). Nine Web sites had maximal scores and the Academic Web sites had the highest overall JAMA benchmark scores. DISCERN scores, JAMA benchmark scores, and Perthes-specific Content scores were all greater for Web sites that bore the HONcode seal. The information available online regarding Perthes disease is of variable quality. Governmental and NPO Web sites predominate and also provide higher quality content. The HONcode seal is a reliable indicator of Web site quality. Physicians should recommend the HONcode seal to their patients as a reliable indicator of Web site quality or, better yet, refer patients to sites they have personally reviewed
Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S.
The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.
Full Text Available Failure mode and effects analysis (FMEA) has been proven to be an effective methodology to improve system design reliability. However, the standard approach reveals some weaknesses when applied to wind turbine systems. The conventional criticality assessment method has been criticized as having many limitations, such as the weighting of severity and detection factors. In this paper, we aim to overcome these drawbacks and develop a hybrid cost-FMEA by integrating cost factors to assess the criticality; these costs vary from replacement costs to expected failure costs. Then, a quantitative comparative study is carried out to point out average failure rate, main cause of failure, expected failure costs and failure detection techniques. A special reliability analysis of the gearbox and rotor blades is presented.
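The cost-based criticality that replaces the conventional RPN weighting can be sketched as expected annual failure cost per subsystem: failure rate times expected cost per failure, then ranked. All rates and costs below are invented placeholders, not the paper's wind turbine data.

```python
# hypothetical wind-turbine subsystems: (failures per turbine-year, cost per failure, $)
components = {
    "gearbox":      (0.10, 230_000),
    "rotor blades": (0.23, 90_000),
    "generator":    (0.12, 60_000),
    "pitch system": (0.40, 14_000),
}

# criticality = expected failure cost per turbine-year
criticality = {name: rate * cost for name, (rate, cost) in components.items()}
ranking = sorted(criticality, key=criticality.get, reverse=True)
```

Note how the ranking can differ from a pure failure-rate ordering: a rarely failing but expensive subsystem can dominate a frequently failing cheap one, which is exactly the effect cost-FMEA is meant to capture.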
Hartzell, Allyson L; Shea, Herbert R
This book focuses on the reliability and manufacturability of MEMS at a fundamental level. It demonstrates how to design MEMS for reliability and provides detailed information on the different types of failure modes and how to avoid them.
Sousa, Hélder; Sørensen, John Dalsgaard; Kirkegaard, Poul Henning
The first part of this document presents, in chapter 2, a description of timber characteristics and commonly used NDT and MDT for timber elements. Stochastic models for timber properties and damage accumulation models are also described. According to timber's properties a framework is proposed...... for robustness are dealt with in chapter 5. The second part of this document begins in chapter 6, where a practical application of the premise definitions and methodologies is given through the implementation of upgraded models with NDT and MDT data. Structural life-cycle is, therefore, assessed and reliability...
Full Text Available Reliability optimization problems in electrical networks of different voltage classes are probabilistic in nature; they change discretely, depend on a number of factors both definite and indefinite, and matter for the selection of electrical equipment, the development plan of electrical networks, and voltage levels. Identifying the major factors that determine their value and the speed of their change makes it possible to elaborate methods for their optimization and effective methods for limiting their growth in electrical networks of different voltage classes.
Staff, P S B
As the PS Booster Synchrotron is a complex accelerator with four superposed rings and substantial additional equipment for beam splitting and recombination, doubts were expressed at the time of project authorization as to its likely operational reliability. For 1975 and 1976, the average down time was 3.2% (at least one ring off) or 1.5% (all four rings off). The items analysed are: operational record, design features, maintenance, spare parts policy, operating temperature, effects of thunderstorms, fault diagnostics, role of operations staff and action by experts. (15 refs).
Full Text Available Macroscopically heterogeneous materials like concrete are generally sampled by too small, i.e., sub-representative elements that can be either of 2D (section images) or of 3D nature (specimens). Based on scientific notions, like stochastic heterogeneity and structure-sensitivity, which are at the very heart of materials science and stereology, the paper demonstrates biases in obtained information to be generally inevitable when derived from such sub-representative designs. Reliable comparison studies can only be performed under the condition that the linear size of samples and of minimum structural dimensions (resulting from observation resolution) are maintained as fixed proportions of the relevant representative area and/or volume elements. This is demonstrated by three case studies.
Rakul Bharatwaj Ramesh
Full Text Available This study proposes an algorithm to solve inverse reliability problems with a single unknown parameter. The proposed algorithm is based on an existing algorithm, the inverse first-order reliability method (inverse-FORM), which uses the Hasofer-Lind-Rackwitz-Fiessler (HLRF) algorithm. The initial algorithm analyzed in this study was developed by modifying the HLRF algorithm in inverse-FORM to use the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update formula throughout. Based on numerical experiments, this modification was found to be more efficient than inverse-FORM when applied to most of the limit state functions considered in this study, as it requires a comparatively smaller number of iterations to arrive at the solution. However, to achieve this higher computational efficiency, the modified algorithm sometimes compromised the accuracy of the final solution. To overcome this drawback, a hybrid method using both algorithms, the original HLRF algorithm and the modified algorithm with the BFGS update formula, is proposed. This hybrid algorithm achieves better computational efficiency, compared to inverse-FORM, without compromising the accuracy of the final solution. Comparative numerical examples are provided to demonstrate the improved performance of this hybrid algorithm over that of inverse-FORM in terms of accuracy and efficiency.
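For reference, the plain HLRF iteration that both inverse-FORM and the BFGS-modified variant build on can be written in a few lines for a limit state already expressed in standard normal space. The sketch below solves the forward problem (finding the reliability index beta) for a simple linear limit state whose answer is known in closed form; it is a generic illustration, not the authors' hybrid algorithm.

```python
import math

def hlrf(g, grad, u0, iters=50):
    """HLRF iteration: find the most probable failure point u* of g(u) = 0."""
    u = list(u0)
    for _ in range(iters):
        gu = g(u)
        gr = grad(u)
        norm2 = sum(c * c for c in gr)
        # project onto the linearized limit state surface
        lam = (sum(c * ui for c, ui in zip(gr, u)) - gu) / norm2
        u = [lam * c for c in gr]
    beta = math.sqrt(sum(c * c for c in u))
    return u, beta

# linear limit state g(U) = 3 - U1 - U2 in standard normal space:
# the exact reliability index is 3 / sqrt(2)
g = lambda u: 3.0 - u[0] - u[1]
grad = lambda u: [-1.0, -1.0]
u_star, beta = hlrf(g, grad, [0.0, 0.0])
```

The inverse problem of the study runs in the opposite direction: beta (or a target failure probability) is fixed and an unknown parameter of g is sought, but the same iteration is the computational core.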
This paper proposes a new method to evaluate informative hypotheses for meta-analysis of Cronbach's coefficient alpha using a Bayesian approach. The coefficient alpha is one of the most widely used reliability indices. In meta-analyses of reliability, researchers typically form specific informative hypotheses beforehand, such as "alpha of…
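Coefficient alpha itself, the quantity being meta-analyzed, has a short closed form: alpha = k/(k-1) · (1 − sum of item variances / variance of total scores). A stdlib-Python sketch with made-up item scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. items: one list per item, scores aligned by respondent."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# two perfectly parallel items give alpha = 1; weaker covariation lowers it
alpha_parallel = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]])
alpha_mixed = cronbach_alpha([[1, 2, 3, 4], [2, 2, 4, 4]])
```

In a meta-analysis these alphas (one per study) become the data, and the informative hypotheses of the paper are order constraints among them evaluated with Bayesian methods.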
Malcov, Mira; Gold, Veronica; Peleg, Sagit; Frumkin, Tsvia; Azem, Foad; Amit, Ami; Ben-Yosef, Dalit; Yaron, Yuval; Reches, Adi; Barda, Shimi; Kleiman, Sandra E; Yogev, Leah; Hauser, Ron
The study aims to describe a novel strategy that increases the accuracy and reliability of PGD in patients using sperm donation by pre-selecting the donor whose haplotype does not overlap the carrier's. A panel of 4-9 informative polymorphic markers, flanking the mutation in carriers of autosomal dominant/X-linked disorders, was tested in the DNA of sperm donors before PGD. Whenever the lengths of donors' repeats overlapped those of the women, additional donors' DNA samples were analyzed. The donor that demonstrated the minimal overlap with the patient was selected for IVF. In 8 out of 17 carriers the markers of the initially chosen donors overlapped the patients' alleles, and 2-8 additional sperm donors for each patient were haplotyped. The selection of additional sperm donors increased the number of informative markers and reduced the misdiagnosis risk from 6.00% ± 7.48 to 0.48% ± 0.68. The PGD results were confirmed and no misdiagnosis was detected. Our study demonstrates that pre-selecting a sperm donor whose haplotype has minimal overlap with the female's haplotype is critical for reducing the misdiagnosis risk and ensuring a reliable PGD. This strategy may contribute to preventing the transmission of affected IVF-PGD embryos using a simple and economical procedure. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. DNA testing of donors was approved by the institutional Helsinki committee (registration number 319-08TLV, 2008). The present study was approved by the institutional Helsinki committee (registration number 0385-13TLV, 2013).
Chaoyang Xie; Hong-Zhong Huang
Corrosion is recognized as one of the most important degradation mechanisms that affect the long-term reliability and integrity of metallic structures. Studying the structural reliability under pitting corrosion damage is useful for risk control and safe operation of the corroded structure. This paper proposes a structural corrosion reliability analysis approach based on the physics-based failure model of pitting corrosion, where the states of pitting growth, pit-to-crack, and cracking propa...
The second Strategic Highway Research Program (SHRP 2) Reliability program aims to improve trip time reliability by reducing the frequency and effects of events that cause travel times to fluctuate unpredictably. Congestion caused by unreliable, or n...
The design and analysis of transport protocols for reliable communications constitutes the topic of this dissertation. These transport protocols guarantee the sequenced and complete delivery of user data over networks which may lose, duplicate and reorder packets. Reliable transport services are
Gait impairment is one of the primary symptoms of cervical spondylotic myelopathy (CSM). Detailed assessment is possible using three-dimensional gait analysis (3DGA), however the reliability of 3DGA for this population has not been established. The aim of this study was to evaluate the test-retest reliability of temporal-spatial, kinematic and kinetic parameters in a CSM population.
Burcharth, H. F.; Sørensen, John Dalsgaard; Christiani, E.
Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of some of the most important failure modes are described. The failures are sliding and slip surface failure of a rubble mound and a clay foundation. Relevant design vari...
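A crude Monte Carlo version of the sliding failure mode can illustrate how such probabilistic models are evaluated: the safety margin is f·(W − U) − H, with friction coefficient f, effective caisson weight W, wave-induced uplift U, and horizontal wave force H all treated as random. The distributions and numbers below are invented for illustration and are not calibrated breakwater data.

```python
import random

def sliding_failure_prob(h_mean=2500.0, n=50000, seed=3):
    """Monte Carlo probability that the caisson slides: f*(W - U) - H < 0."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        f = rng.gauss(0.64, 0.06)      # caisson/rubble friction coefficient
        W = rng.gauss(8000.0, 400.0)   # effective caisson weight (kN per m)
        U = rng.gauss(1200.0, 250.0)   # wave-induced uplift force (kN per m)
        H = rng.gauss(h_mean, 600.0)   # horizontal wave force (kN per m)
        fails += f * (W - U) - H < 0.0
    return fails / n

pf = sliding_failure_prob()
pf_heavier_waves = sliding_failure_prob(h_mean=3500.0)
```

Reliability-based design then adjusts the deterministic design variables (caisson width, weight) until such failure probabilities meet a target for each failure mode.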
Tan, Chung Huat J.; Gillies, Duncan F.
Recently considerable research has been undertaken into estimating the quality of information (QoI) delivered by military sensor networks. QoI essentially estimates the probability that the information available from the network is correct. Knowledge of the QoI would clearly be of great use to decision makers using a network. An important class of sensors that provide inputs to networks in real life are concerned with target tracking. Assessing the tracking performance of these sensors is an essential component in estimating the QoI of the whole network. We have investigated three potential QoI metrics for estimating the dynamic target tracking performance of systems based on state estimation algorithms. We have tested them on different scenarios with varying degrees of tracking difficulty. We performed experiments on simulated data so that we have a ground truth against which to assess the performance of each metric. Our measure of ground truth is the Euclidean distance between the estimated position and the true position. Recently researchers have suggested using the entropy of the covariance matrix as a metric of QoI. Two of our metrics were based on this approach, the first being the entropy of the covariance matrix relative to an ideal distribution, and the second the information gain at each update of the covariance matrix. The third metric was calculated by smoothing the residual likelihood value at each new measurement point, similar to the model update likelihood function in an IMM filter. Our experimental results show that reliable QoI metrics cannot be formulated using solely the covariance matrices. In other words, it is possible for a covariance matrix to have high information content while the position estimate is wrong. On the other hand, the smoothed residual likelihood does correlate well with tracking performance, and can be measured without knowledge of the true target position.
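The covariance-entropy metric discussed above is just the differential entropy of a Gaussian, H = ½ ln((2πe)^k det Σ). The sketch below computes it for 2×2 covariance matrices; as the abstract warns, a small entropy only means a confident estimate, not a correct one.

```python
import math

def gaussian_entropy_2d(cov):
    """Differential entropy of a bivariate Gaussian with covariance matrix cov."""
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    return 0.5 * math.log((2.0 * math.pi * math.e) ** 2 * det)

tight = [[1.0, 0.0], [0.0, 1.0]]     # confident tracker (small uncertainty)
loose = [[25.0, 0.0], [0.0, 25.0]]   # uncertain tracker (5x position std dev)
h_tight = gaussian_entropy_2d(tight)
h_loose = gaussian_entropy_2d(loose)
```

A filter whose model diverges from the true target motion can keep reporting a tight covariance (low entropy) around a wrong position, which is why the paper finds the residual-likelihood metric the more trustworthy QoI signal.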
Scheuerer, Michael; Webb, Robert S.; Hamill, Thomas M.
Many reservoirs operated by the U.S. Army Corps of Engineers (Corps) in California provide flood control as well as water supply, recreation and stream flow regulation. Operations for flood control follow seasonally specified elevations for an upper volume of reservoir storage, with unused storage capacity designated for flood risk management and thus not available for water supply storage. In the flood control operation of these reservoirs, runoff is captured during rain events and then released soon after at rates that do not result in downstream flooding (typically over a 5 to 8 day period), resulting in evacuated storage space to capture runoff from the next potential storm. As part of the Forecast-Informed Reservoir Operations (FIRO) partnership to more effectively balance flood and drought risks, we developed an experimental California medium-range precipitation forecast system based on NCEP GEFS reforecasts and the Climatology-Calibrated Precipitation Analysis (CCPA). We have applied this experimental forecast system to predict the probability of day 5-10 precipitation accumulations at each CCPA grid point within California exceeding certain pre-specified thresholds. Discussions with flood and water supply managers indicate that forecast guidance for the very low risk of extreme precipitation for watersheds above reservoirs can be valuable for decision making. In this study, we assess the skill and reliability of this experimental forecast system in predicting low probabilities of extreme precipitation events for select watersheds during recent winter precipitation seasons. Our analysis indicates there may be sufficient reliability in forecast guidance for low probabilities of heavy precipitation events to inform decision making in reservoir management in select California river basins to manage flood risk while increasing water supply for consumptive use and ecosystem services.
This book presents a unique collection of contributions from some of the foremost scholars in the field of risk and reliability analysis. Combining the most advanced analysis techniques with practical applications, it is one of the most comprehensive and up-to-date books available on risk-based engineering. All the fundamental concepts needed to conduct risk and reliability assessments are covered in detail, providing readers with a sound understanding of the field and making the book a powerful tool for students and researchers alike. This book was prepared in honor of Professor Armen Der Kiureghian, one of the fathers of modern risk and reliability analysis.
Yun, Lifen; Wang, Xifu; Fan, Hongqiang; Li, Xiaopeng
This paper proposes a reliable facility location design model under imperfect information with site-dependent disruptions; i.e., each facility is subject to a unique disruption probability that varies across the space. In the imperfect information contexts, customers adopt a realistic "trial-and-error" strategy to visit facilities; i.e., they visit a number of pre-assigned facilities sequentially until they arrive at the first operational facility or give up looking for the service. This proposed model aims to balance initial facility investment and expected long-term operational cost by finding the optimal facility locations. A nonlinear integer programming model is proposed to describe this problem. We apply a linearization technique to reduce the difficulty of solving the proposed model. A number of problem instances are studied to illustrate the performance of the proposed model. The results indicate that our proposed model can reveal a number of interesting insights into the facility location design with site-dependent disruptions, including the benefit of backup facilities and system robustness against variation of the loss-of-service penalty.
Marco Antonio Figueiredo Milani Filho
Benford's Law (BL) is a logarithmic distribution which is useful to detect abnormal patterns of digits in number sets. It is often used as a primary data auditing method for detecting traces of errors, illegal practices or undesired occurrences, such as fraud and earnings management. In this descriptive study, I analyzed the financial information (revenue and expenditure) of the registered charitable hospitals located in Ontario and Quebec, which hold the majority (71.4%) of these organizations within Canada. The aim of this study was to verify the reliability of the financial data of the respective hospitals, using the probability distribution predicted by Benford's Law as a proxy of reliability. The sample was composed of 1,334 observations related to 339 entities operating in the tax year 2009 and 328 entities in 2010, gathered from the Canada Revenue Agency's database. To analyze the discrepancies between the actual and expected frequencies of the significant digits, two statistics were calculated: the Z-test and Pearson's chi-square test. The results show that, with a confidence level of 95%, the data sets of the organizations located in Ontario and Quebec have a distribution similar to BL, suggesting that, in a preliminary analysis, their financial data are free from bias.
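The first-digit comparison described above can be sketched as follows; this is a generic implementation of the Benford chi-square check, not the authors' exact procedure.

```python
import math
from collections import Counter

def benford_chi_square(values):
    """Chi-square statistic comparing observed leading-digit counts
    to the counts expected under Benford's Law, P(d) = log10(1 + 1/d)."""
    digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values if v != 0]
    n = len(digits)
    observed = Counter(digits)
    chi2 = 0.0
    for d in range(1, 10):
        expected = n * math.log10(1 + 1 / d)  # Benford expected count for digit d
        chi2 += (observed.get(d, 0) - expected) ** 2 / expected
    return chi2
```

A data set conforming to BL yields a small statistic; uniformly distributed leading digits (a classic sign of fabricated figures) yield a large one, which is the comparison the Z-test and chi-square test in the study formalize.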
Oh, Byung Hwan; Choi, Seong Cheol; Shin, Ho Sang; Yang, In Hwan; Kim, Yi Sung; Yu, Young; Kim, Se Hun [Seoul National Univ., Seoul (Korea, Republic of)]
Nuclear power plant structures may be exposed to aggressive environmental effects that may cause their strength and stiffness to decrease over their service life. Although the physics of these damage mechanisms are reasonably well understood and quantitative evaluation of their effects on time-dependent structural behavior is possible in some instances, such evaluations are generally very difficult and remain novel. The assessment of existing steel containments in nuclear power plants for continued service must provide quantitative evidence that they are able to withstand future extreme loads during a service period with an acceptable level of reliability. Rational methodologies to perform the reliability assessment can be developed from mechanistic models of structural deterioration, using time-dependent structural reliability analysis to take loading and strength uncertainties into account. The final goal of this study is to develop an analysis method for the reliability of containment structures. The cause and mechanism of corrosion are first clarified and the reliability assessment method is established. By introducing the equivalent normal distribution, a reliability analysis procedure that can determine the failure probabilities has been established. The influence of design variables on reliability and the relation between reliability and service life will be studied in the continued second-year research.
Reer, Bernhard; Dang, V.N.; Hirschberg, Stefan [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Meyer, Patrick
In the review guidelines recently developed for the Swiss Federal Nuclear Inspectorate, the Human Reliability Analysis (HRA) is reviewed in two stages. The preliminary review is aimed at identifying major shortcomings and potential issues to be examined in the detailed review. The detailed review comprehensively addresses the overall adequacy and transparency of the HRA. For the two review stages, 97 indicators are defined in terms of questions focusing on verifiable features of the methodology, implementation and results. The guidelines provide steps for information gathering and present examples of acceptable practices as well as of potential deficiencies. Both review stages may result in requests for clarification, additional documentation or analyses. The first applications of the guidelines consist of the preliminary reviews of two HRAs. (author)
Reer, B.; Dang, V.N.; Hirschberg, S.; Meyer, P
PSI was commissioned to develop Guidelines for the Regulatory Review of the Human Reliability Analysis (HRA) within Probabilistic Safety Assessments (PSAs) for nuclear power plants. In the Guidelines, HRA quality is addressed in terms of 97 indicators. Each indicator is formulated as a question, described as a specific feature of the analysis, and then explained in detail. Two analysis stages are distinguished: the selection of the human errors to be modelled, and their quantification to determine their impact on the core damage frequency. Review findings are grouped under two headings: transparency and adequacy. An analysis is 'transparent' if an externally qualified person is able to reproduce the analysis results, and 'adequate' if such results reflect the plant-specific conditions related to safety. To allocate resources efficiently, the review is structured in two phases: (1) The Quick Review, which clarifies whether the HRA has a fundamental deficiency and, furthermore, if it points to information needs and areas of emphasis for the detailed review, and (2) The Detailed Review, which results in well-grounded findings, based on extended examinations and close-plant contacts. (authors)
Bornmann, Lutz; Mutz, Rüdiger; Daniel, Hans-Dieter
Background: This paper presents the first meta-analysis for the inter-rater reliability (IRR) of journal peer reviews. IRR is defined as the extent to which two or more independent reviews of the same scientific document agree. Methodology/Principal Findings: Altogether, 70 reliability coefficients (Cohen's Kappa, intra-class correlation [ICC], and Pearson product-moment correlation [r]) from 48 studies were taken into account in the meta-analysis. The studies were based on a total of 19,443 manuscripts; on average, each study had a sample size of 311 manuscripts (minimum: 28, maximum: 1983). The results of the meta-analysis confirmed the findings of the narrative literature reviews published to date: the level of IRR (mean ICC/r² = .34, mean Cohen's Kappa = .17) was low. To explain the study-to-study variation of the IRR coefficients, meta-regression analyses were calculated using seven covariates. Two covariates emerged in the meta-regression analyses as statistically significant for reaching an approximate homogeneity of the intra-class correlations: firstly, the more manuscripts a study is based on, the smaller the reported IRR coefficients are; secondly, if the information of the rating system for reviewers was reported in a study, this was associated with a smaller IRR coefficient than if the information was not conveyed. Conclusions/Significance: Studies that report a high level of IRR are to be considered less credible than those with a low level of IRR. According to our meta-analysis, the IRR of peer assessments is quite limited and needs improvement (e.g., reader system). PMID:21179459
Kubicka, Katarzyna; Radoń, Urszula; Szaniec, Waldemar; Pawlak, Urszula
The paper concerns the reliability analysis of steel structures subjected to high temperatures of fire gases. Two types of spatial structures were analysed, namely with pinned and with rigid nodes. The fire analysis was carried out according to the prescriptions of Eurocode. The static-strength analysis was conducted using the finite element method (FEM). The MES3D program, developed by Szaniec (Kielce University of Technology, Poland), was used for this purpose. The results received from MES3D made it possible to carry out the reliability analysis using the Numpress Explore program, developed at the Institute of Fundamental Technological Research of the Polish Academy of Sciences. The measure of structural reliability is the Hasofer-Lind reliability index (β). The reliability analysis was carried out according to approximation (FORM, SORM) and simulation (Importance Sampling, Monte Carlo) methods. As the fire progresses, the value of the reliability index decreases. The analysis conducted for the study made it possible to evaluate the impact of node types on those changes. In real structures, it is often difficult to correctly define the types of nodes, so some simplifications are made. The presented analysis contributes to the recognition of the consequences of such assumptions for the safety of structures subjected to fire.
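For the simplest case of a linear limit state g = R − S with independent normal resistance R and load effect S, the Hasofer-Lind index mentioned above has a closed form; the sketch below is that textbook special case with hypothetical numbers, not the FORM/SORM machinery of Numpress Explore, which handles nonlinear limit states and non-normal, correlated variables.

```python
import math

def hasofer_lind_beta(mu_r, sigma_r, mu_s, sigma_s):
    """Reliability index beta for the linear limit state g = R - S
    with independent normal R (resistance) and S (load effect)."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

def failure_probability(beta):
    """First-order failure probability Pf = Phi(-beta),
    using the standard normal CDF via erfc."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))
```

As fire heats the structure, the mean resistance drops, so beta decreases and Pf rises, which is the trend the abstract reports.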
Conclusion: Computer-assisted lower limb alignment analysis is reliable whether using a graphics editing program or specialized planning software. However, slightly higher variability can be expected for angles away from the knee joint.
.... We show in our analysis that each bearing should be redesigned. In our research, we analyzed and established a historical bearing failure data baseline of current reliability and maintenance costs...
Robert W Youngblood
The Risk-Informed Safety Margin Characterization (RISMC) pathway is a set of activities defined under the U.S. Department of Energy (DOE) Light Water Reactor Sustainability Program. The overarching objective of RISMC is to support plant life-extension decision-making by providing a state-of-knowledge characterization of safety margins in key systems, structures, and components (SSCs). A technical challenge at the core of this effort is to establish the conceptual and technical feasibility of analyzing safety margin in a risk-informed way, which, unlike conventionally defined deterministic margin analysis, is founded on probabilistic characterizations of SSC performance.
Nikulin, M; Mesbah, M; Limnios, N
Parametric and semiparametric models are tools with a wide range of applications to reliability, survival analysis, and quality of life. This self-contained volume examines these tools in survey articles written by experts currently working on the development and evaluation of models and methods. While a number of chapters deal with general theory, several explore more specific connections and recent results in "real-world" reliability theory, survival analysis, and related fields.
This study aims to investigate the reliability and sensitivity of cast iron water pipes for agricultural food irrigation. The Monte Carlo simulation method is used for fracture assessment and reliability analysis of cast iron pipes for agricultural food irrigation. Fracture toughness is considered as the limit state function for corrosion-affected cast iron pipes. The influence of the failure mode on the probability of pipe failure is then discussed. Sensitivity analysis is also carried out t...
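A minimal Monte Carlo sketch of a fracture limit state of the kind described above, g = K_Ic − K_applied, is shown below; the normal distributions and their parameters are assumptions for illustration only, not the pipe data of the study.

```python
import random

def monte_carlo_failure_probability(n_samples, seed=42):
    """Estimate Pf = P(g <= 0) for g = K_Ic - K_applied by sampling.
    Distributions and parameters are hypothetical (units MPa*sqrt(m))."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        k_ic = rng.gauss(40.0, 4.0)        # fracture toughness (assumed)
        k_applied = rng.gauss(25.0, 5.0)   # stress intensity at corrosion defect (assumed)
        if k_ic - k_applied <= 0.0:
            failures += 1
    return failures / n_samples
```

Sensitivity analysis then repeats this estimate while perturbing each input distribution in turn, ranking the variables by their effect on Pf.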
Vahdatirad, Mohammad Javad; Griffiths, D. V.; Andersen, Lars Vabbersgaard
Deterministic code-based designs proposed for wind turbine foundations are typically biased on the conservative side and overestimate the probability of failure, which can lead to higher-than-necessary construction cost. In this study, reliability analysis of a gravity-based foundation concerning...... technique to perform the reliability analysis. The calibrated code-based design approach leads to savings of up to 20% in the concrete foundation volume, depending on the target annual reliability level. The study can form the basis for future optimization of deterministic-based designs for wind turbine...... foundations....
Padilla Zarate, Gerardo
The objective of this research is to produce an early reliability assessment of a sequential assembly of software components using limited component execution-related information and considering the expected assembly use. Accomplishing this objective provides quantitative means to support design decisions and to improve the component selection process. The execution-related information, called execution traces, is gathered during the component testing process.
Despite Failure Mode and Effect Analysis (FMEA) being a strategic technique for the creation of error-free service operation, detailed survey studies and the development of an opportunity roadmap for FMEA application in service operation are limited in the literature. We presented a preliminary literature survey between 1994 and 2010 that ...
Ang, Rebecca P.; Huan, Vivien S.
This article describes the development and initial validation of obtained scores from the Academic Expectations Stress Inventory (AESI), which measures expectations as a source of academic stress in middle and high school Asian students. In the first study, exploratory factor analysis results from 721 adolescents suggested a nine-item scale with…
Mørk, Kim Jørgensen
During the last 30 years response analysis of structures under random excitation has been studied in detail. These studies are motivated by the fact that most of natures excitations, such as earthquakes, wind and wave loads exhibit randomly fluctuating characters. For safety reasons this randomne...
Shahid, Kamal; Saeed, Aamir; Kristensen, Thomas le Fevre
The question is addressed by analyzing the performance of UDP and TCP over imperfect network conditions to show how the selection of transport layer protocol can dramatically affect the controller's performance. This analysis is based on a quality metric called mismatch probability that considers occurrence...... of events at grid assets as well as the information update strategy in one single metric, which otherwise is not very intuitive and difficult to allow a similar useful comparison. Further, the analysis is concluded by providing a clear guide on the selection of the transport protocol to meet application...
Patel, E; Cicatiello, P; Deininger, L; Clench, M R; Marino, G; Giardina, P; Langenburg, G; West, A; Marshall, P; Sears, V; Francese, S
Blood evidence is frequently encountered at the scene of violent crimes and can provide valuable intelligence in the forensic investigation of serious offences. Because many of the current enhancement methods used by crime scene investigators are presumptive, the visualisation of blood is not always reliable, nor does it bear additional information. In the work presented here, two methods employing a shotgun bottom-up proteomic approach for the detection of blood are reported; the developed protocols employ both an in-solution digestion method and a recently proposed procedure involving immobilization of trypsin on a hydrophobin Vmh2-coated MALDI sample plate. The methods are complementary: whilst one yields more identifiable proteins (as biomolecular signatures), the other is extremely rapid (5 minutes). Additionally, data demonstrate the opportunity to discriminate blood provenance even when two different blood sources are present in a mixture. This approach is also suitable for old bloodstains which have been previously chemically enhanced, as experiments conducted on a 9-year-old bloodstain deposited on a ceramic tile demonstrate.
Chittenden Thomas W
Abstract Background: In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results: The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Comparison of expert manual placement of the leading edge with the automated result shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion: Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.
Zhi-Ling XU; Pei-Pei Shen
In this paper, a portable axle-load meter of a dynamic weighing system is modeled using ADAMS, the weighing process is simulated while controlling a single variable, and simulation weighing data are obtained under different speeds and weights; simultaneously, a portable weighing system with the same parameters is used to carry out the actual measurements. Comparative analysis of the simulation and measured results under the same conditions shows that, at 30 km/h or less, the simulation value and the measured value do not differ by more ...
Lobko, P I; Kovaleva, D V; Kovalchuk, I E; Pivchenko, P G; Rudenok, V V; Davydova, L A
Information parameters (entropy and redundancy) of cervical and thoracic spinal ganglia of albino rat foetuses, mature animals (cat and dog) and human subjects were analysed. Information characteristics of spinal ganglia were shown to be level-specific and to depend on their functional peculiarities. Information parameters of thoracic spinal ganglia of man and different animals are species-specific and may be used in the assessment of morphological structures as information systems.
Interruptions in cardiopulmonary resuscitation (CPR) compromise defibrillation success. However, CPR must be interrupted to analyze the rhythm because, although current methods for rhythm analysis during CPR have high sensitivity for shockable rhythms, the specificity for nonshockable rhythms is still too low. This paper introduces a new approach to rhythm analysis during CPR that combines two strategies: a state-of-the-art CPR artifact suppression filter and a shock advice algorithm (SAA) designed to optimally classify the filtered signal. Emphasis is on designing an algorithm with high specificity. The SAA includes a detector for low electrical activity rhythms to increase the specificity, and a shock/no-shock decision algorithm based on a support vector machine classifier using slope and frequency features. For this study, 1185 shockable and 6482 nonshockable 9-s segments corrupted by CPR artifacts were obtained from 247 patients suffering out-of-hospital cardiac arrest. The segments were split into a training and a test set. For the test set, the sensitivity and specificity for rhythm analysis during CPR were 91.0% and 96.6%, respectively. This new approach shows an important increase in specificity without compromising the sensitivity when compared to previous studies.
This book compiles and critically discusses modern engineering system degradation models and their impact on engineering decisions. In particular, the authors focus on modeling the uncertain nature of degradation considering both conceptual discussions and formal mathematical formulations. It also describes the basics concepts and the various modeling aspects of life-cycle analysis (LCA). It highlights the role of degradation in LCA and defines optimum design and operation parameters. Given the relationship between operational decisions and the performance of the system’s condition over time, maintenance models are also discussed. The concepts and models presented have applications in a large variety of engineering fields such as Civil, Environmental, Industrial, Electrical and Mechanical engineering. However, special emphasis is given to problems related to large infrastructure systems. The book is intended to be used both as a reference resource for researchers and practitioners and as an academic text ...
Reer, Bernhard [Paul Scherrer Institute (PSI), 5232 Villigen PSI (Switzerland)], E-mail: firstname.lastname@example.org
In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 1 is presented in this article. Emerging HRA methods addressing the problem of EOC identification are: A Technique for Human Event Analysis (ATHEANA), the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the Misdiagnosis Tree Analysis (MDTA) method, and the Commission Errors Search and Assessment (CESA) method. Most of the EOCs referred to in predictive studies comprise the stop of running or the inhibition of anticipated functions; a few comprise the start of a function. The CESA search scheme-which proceeds from possible operator actions to the affected systems to scenarios and uses procedures and importance measures as key sources of input information-provides a formalized way for identifying relatively important scenarios with EOC opportunities. In the implementation however, attention should be paid regarding EOCs associated with familiar but non-procedural actions and EOCs leading to failures of manually initiated safety functions.
C. L. Smith; W. J. Galyean; S. T. Beck
The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. Herein information is provided on the principles used in the construction and operation of Version 6.0 and 7.0 of the SAPHIRE system. This report summarizes the fundamental mathematical concepts of sets and logic, fault trees, and probability. This volume then describes the algorithms used to construct a fault tree and to obtain the minimal cut sets. It gives the formulas used to obtain the probability of the top event from the minimal cut sets, and the formulas for probabilities that apply for various assumptions concerning reparability and mission time. It defines the measures of basic event importance that SAPHIRE can calculate. This volume gives an overview of uncertainty analysis using simple Monte Carlo sampling or Latin Hypercube sampling, and states the algorithms used by this program to generate random basic event probabilities from various distributions. Also covered are enhanced capabilities such as seismic analysis, cut set "recovery," end state manipulation, and use of "compound events."
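The step from minimal cut sets to top-event probability mentioned above is commonly approximated by the min-cut-set upper bound; the sketch below shows that standard formula under an independence assumption, and is not SAPHIRE's actual implementation.

```python
from math import prod

def cut_set_probability(cut_set, p):
    """Probability of a minimal cut set: product of its basic-event
    probabilities (basic events assumed independent)."""
    return prod(p[e] for e in cut_set)

def top_event_upper_bound(cut_sets, p):
    """Min-cut-set upper bound on the top-event probability:
    P(top) <= 1 - prod_i (1 - P(C_i))."""
    bound = 1.0
    for cs in cut_sets:
        bound *= 1.0 - cut_set_probability(cs, p)
    return 1.0 - bound
```

For small basic-event probabilities this bound is close to the exact inclusion-exclusion result, which is why it is the usual quantification shortcut in fault tree analysis.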
Zhang, Ying-Zhi; Liu, Jin-Tong; Shen, Gui-Xiang; Long, Zhe; Sun, Shu-Guang
In order to rectify the problems that the component reliability model exhibits deviation and the evaluation result is low because failure propagation is overlooked in the traditional reliability evaluation of machine center components, a new reliability evaluation method based on cascading failure analysis and failure influenced degree assessment is proposed. A directed graph model of cascading failures among components is established according to cascading failure mechanism analysis and graph theory. The failure influenced degrees of the system components are assessed by the adjacency matrix and its transposition, combined with the PageRank algorithm. Based on the comprehensive failure probability function and the total probability formula, the inherent failure probability function is determined to realize the reliability evaluation of the system components. Finally, the method is applied to a machine center, showing the following: 1) The reliability evaluation values of the proposed method are at least 2.5% higher than those of the traditional method; 2) The difference between the comprehensive and inherent reliability of a system component presents a positive correlation with the failure influenced degree of that component, which provides a theoretical basis for reliability allocation of the machine center system.
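The PageRank step over the component adjacency matrix can be sketched as plain power iteration, as below; the damping factor `d` and the dangling-node handling are generic PageRank conventions, assumed here rather than taken from the paper.

```python
def pagerank(adj, d=0.85, iters=100):
    """Power-iteration PageRank over an adjacency matrix (list of 0/1 rows).
    An edge i -> j means a failure of component i propagates to component j;
    a high rank marks a component strongly influenced by cascading failures."""
    n = len(adj)
    rank = [1.0 / n] * n
    out_deg = [sum(row) for row in adj]
    for _ in range(iters):
        new = [(1 - d) / n] * n
        for i in range(n):
            if out_deg[i] == 0:
                for j in range(n):            # dangling node: spread uniformly
                    new[j] += d * rank[i] / n
            else:
                for j in range(n):
                    if adj[i][j]:
                        new[j] += d * rank[i] / out_deg[i]
        rank = new
    return rank
```

Running the same iteration on the transposed matrix ranks components by how strongly they *cause* cascades, which matches the abstract's use of both the adjacency matrix and its transposition.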
Bao, Ke; Zhang, Zhong; Cao, Yuan-fu; Chen, Yi-jie
Spectrometric oil analysis is of great importance for wear condition monitoring of gearboxes. In this context, the contents of the main element compositions in the bench test of a heavy vehicle gearbox are first obtained by atomic emission spectrometric oil analysis. Then correlation analysis of the test data and wearing mechanism analysis are carried out to find the metal element that can be used to describe the wearing and failure of the gearbox. The spectrometric data after filling/changing oil are corrected, and the laws of the contents of the main element compositions during tests are expressed as linear functions. After that, the reliability assessment is executed considering the degradation law and discreteness of the test data, in which the mean and standard deviation of the normal distribution of the spectrometric oil data at each time point are adopted. Finally, the influences of the threshold are discussed. It has been proved that the content of the metal element Cu, obtained by spectrometric oil analysis of different samples, can be used to assess the reliability of a heavy vehicle gearbox. The reason is that the metal element Cu is closely related to the general wear state of the gearbox and is easy to measure. When the threshold of Cu content is treated as a constant, a bigger threshold means higher reliability at the same time, and the mean value of the threshold has a significant impact on the reliability assessment results for R > 0.9. When the threshold is treated as a random variable, bigger dispersion of the threshold means a smaller slope of reliability against time, and also lower reliability of the gearbox for R > 0.9 at the same time. In this study, spectrometric oil analysis and probability statistics are used together for the reliability assessment of the gearbox, which extends the application range of spectrometric analysis.
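With a constant threshold, the reliability at a given time point described above reduces to a normal CDF evaluation of the measured content against the threshold; a minimal sketch with hypothetical numbers:

```python
import math

def reliability_from_degradation(mean_t, std_t, threshold):
    """R(t) = P(content < threshold) when the spectrometric content at
    time t is normally distributed with mean mean_t and std std_t."""
    z = (threshold - mean_t) / std_t
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Since the mean content grows (approximately linearly, per the abstract) as wear proceeds, z shrinks over time and R(t) falls; raising the threshold raises R at every time point, which is the threshold sensitivity the study discusses.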
Zakimi, Ken; Watanabe, Hiroyuki; Ishida, Hideki; Take, Toshio; Kato, Mitsuyoshi; Iwai, Tsugunori; Nitta, Masaru; Kato, Kyouichi; Nakazawa, Yasuo
We analyzed a number of cases of Linac troubles in our hospital and examined the effect of preventive maintenance with Weibull analysis and the exponential distribution from April 2001 to March 2012. The total number of failures disabling irradiation was 1,192. (1) The medical linear accelerator (MLC) system accounted for 24.0%, (2) the radiation dosimetry system for 13.1%, and (3) the cooling-water system for 26.5%; together these account for 63.6% of the total number of failures. The values of the shape parameter m and the expected failure period of parts μ were (1) 1.21, 1.46/3.9, 3.8 years; 3.7, 3.6 years; (2) 2.84, 1.59/6.6, 4.3 years; 6.7, 5.9 years; (3) 5.12, 4.16/6.1, 8.5 years; 6.1, 8.5 years. Each shape parameter was m>1, so these systems are believed to be in the wear-out failure period. To prevent failure, MLC performance should be overhauled once every 3 years and the cooling unit once every 7 years. Preventive maintenance is useful in assessing the failure of radiation therapy equipment. For the radiation dosimetry part, a preemptive replacement of the monitor dosimeter board can be made before failure, using repair parts stockpiled every 6 months for maintenance.
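The bathtub-curve reading of the Weibull shape parameter used above (m > 1 indicating the wear-out period) and the mean life implied by shape and scale can be sketched as follows; the scale value used in the test is hypothetical, as the abstract reports only shapes and failure-period estimates.

```python
import math

def weibull_mean_life(shape_m, scale_eta):
    """Mean time to failure of a Weibull(shape m, scale eta):
    MTTF = eta * Gamma(1 + 1/m)."""
    return scale_eta * math.gamma(1.0 + 1.0 / shape_m)

def failure_regime(shape_m):
    """Bathtub-curve interpretation of the Weibull shape parameter m."""
    if shape_m < 1.0:
        return "early (infant-mortality) failures"
    if shape_m == 1.0:
        return "random failures (constant hazard)"
    return "wear-out failures"
```

With m = 1 the Weibull reduces to the exponential distribution also used in the study, so MTTF equals the scale parameter.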
There are complicated correlations in mechanical systems. Exploiting the advantages of copula functions for such correlation issues, this paper proposes a mechanical system reliability model based on copula functions, studies serial and parallel mechanical system models in detail, and derives their respective reliability functions. Finally, application research is carried out for the serial mechanical system reliability model to prove its validity by example. Using copula theory for mechanical system reliability modeling, and studying the distributions of the random variables (the marginal distributions of the mechanical products' lives) and the associated structure of the variables separately, can reduce the difficulty of multivariate probabilistic modeling and analysis and make the modeling and analysis process clearer.
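As one concrete instance of this approach, two-component series and parallel reliability functions can be written with a Clayton survival copula over the component reliabilities; the abstract does not name the copula family, so Clayton and all numeric values here are assumptions for illustration.

```python
def series_reliability_clayton(r1, r2, theta):
    """Series system works only if both components survive:
    R_sys = C_hat(r1, r2), here a Clayton survival copula with
    dependence parameter theta >= 0 (theta = 0 is independence)."""
    if theta == 0:
        return r1 * r2
    return (r1**-theta + r2**-theta - 1.0) ** (-1.0 / theta)

def parallel_reliability_clayton(r1, r2, theta):
    """Parallel system fails only if both fail; by inclusion-exclusion,
    R_sys = r1 + r2 - C_hat(r1, r2)."""
    return r1 + r2 - series_reliability_clayton(r1, r2, theta)
```

Positive dependence (larger theta) raises series reliability above the independent product r1*r2 while keeping it below min(r1, r2), illustrating why ignoring correlation biases the system estimate.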
Kimiaeifar, Amin; Sørensen, John Dalsgaard; Lund, Erik
This paper describes a probabilistic approach to calculate the reliability of adhesive bonded composite stepped lap joints loaded in fatigue using three- dimensional finite element analysis (FEA). A method for progressive damage modelling is used to assess fatigue damage accumulation and residual...... by the wind turbine standard IEC 61400-1. Finally, an approach for the assessment of the reliability of adhesive bonded composite stepped lap joints loaded in fatigue is presented. The introduced methodology can be applied in the same way to calculate the reliability level of wind turbine blade components...
The operational reliability of the electrical and electronic equipment of Mercedes-Benz Actros 1844 LS and Volvo FH 1242 trucks engaged in international cargo transportation is analyzed. The equipment is found to be reliable and to meet modern requirements, although operability violations do occur; the causes of repair work are both design and operational factors. Distributions of operability and of the overall operational reliability indicators are obtained, and the items that fail most frequently are identified. Common factors impairing the operability of the trucks in service, which differ largely across warranty mileage stages, are also determined.
Polcin, Douglas L; Galloway, Gantt P; Bond, Jason; Korcha, Rachael; Greenfield, Thomas K
The addiction field lacks an accepted definition and reliable measure of confrontation. The Alcohol and Drug Confrontation Scale (ADCS) defines confrontation as warnings about the potential consequences of substance use. To assess its psychometric properties, 323 individuals entering recovery houses in U.S. urban and suburban areas were interviewed between 2003 and 2005 (20% women, 68% white). Analyses included test-retest reliability, confirmatory factor analysis, and measures of internal consistency. Findings support the ADCS as a reliable way of assessing two factors: Internal Support and External Intensity. Confrontation was experienced as supportive, accurate, and helpful. Additional studies should assess confrontation in different contexts.
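Internal consistency of the kind assessed here is typically summarized by Cronbach's alpha. A minimal computation, with invented item scores rather than the ADCS data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    sum_item_vars = items.var(axis=0, ddof=1).sum()  # variance of each item, summed
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the scale total
    return (k / (k - 1.0)) * (1.0 - sum_item_vars / total_var)

# Hypothetical 5 respondents x 3 items on a small Likert-type subscale.
scores = np.array([[4, 5, 4],
                   [2, 2, 3],
                   [5, 4, 5],
                   [3, 3, 3],
                   [1, 2, 1]], dtype=float)
print(round(cronbach_alpha(scores), 3))  # high alpha: the items move together
```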
Cauquil, J. M.; Seguineau, C.; Martin, J.-Y.; Benschop, T.
Cooled IR detectors are used in a wide range of applications, and most of the time the cryocooler is one of the components that determines the lifetime of the system. Cooler reliability is thus one of its most important parameters, and it must increase to meet market needs. To do this, the data identifying the weakest element limiting cooler reliability has to be collected; yet field-based data collection is hardly usable due to lack of information. A method for identifying reliability improvements therefore has to be set up that can be used even without field returns. This paper describes the method followed by Thales Cryogénie SAS to reach such a result. First, a database was built from extensive expert analyses of RM2 failures occurring in accelerated ageing. Failure modes were then identified and corrective actions taken. In addition, the functions of the cooler were ranked with regard to their potential to increase its efficiency, and specific changes were introduced in the functions most likely to impact efficiency. The link between efficiency and reliability is described in this paper. The work on these two axes, weak spots for cooler reliability and efficiency, permitted us to increase the MTTF of the RM2 cooler drastically. The large improvements in RM2 reliability are proven by both field returns and reliability monitoring; these figures are discussed in the paper.
Lifetime, reliability and risk analysis methods and applications for structural systems and components of power plants are discussed in this thesis. These analyses involve many fields of science, such as structural mechanics, fracture mechanics, probability mathematics, material science and fluid mechanics. An overview of power plant environments and a description of the various degradation mechanisms damaging power plant systems and components are presented first. This is followed by a description of deterministic structural analysis methods, covering e.g. structural mechanics and fracture mechanics based analysis methods as well as the disadvantages of the deterministic analysis approach. Often, physical probabilistic methods are based on deterministic analysis methods with the modification that one or more of the model parameters are considered as probabilistically distributed. Several probabilistic analysis procedures are presented, e.g. Monte Carlo simulation (MCS) and importance sampling. The description of probabilistic analysis methods covers both physical and statistical approaches. When system/component failure probabilities are combined with knowledge of failure consequences, it is possible to assess system/component risks. Several risk analysis methods are presented, as well as some of their limitations and shortcomings. Modelling methods for various degradation (or ageing) mechanisms are presented; these methods are needed in the lifetime analyses of structural systems and components of power plants. In general, the lifetime analyses in question necessitate a thorough knowledge of structural properties, loads, the relevant degradation mechanisms and prevailing environmental conditions. The degradation models of structural systems/components can be deterministic, probabilistic or a combination of these two types, and degradation models of all these kinds are presented here. Some important risk analysis applications are also described.
Choi, Dong-Hee; Shin, Jin-Chul; Park, Hong-Seong
Profibus is an open industrial communication system for a wide range of applications in manufacturing and process automation. Profibus FDL services are used by hard real-time systems, which require data reliability, stability, and real-time behavior, and Profibus is used in many industrial fields because it supports real-time industrial communication; we therefore analyze data reliability and stability in Profibus networks. For a station communicating via FDL with a given communication period (e.g. 10 ms), we analyze the sources of transmission delay and the data-transfer error ratio, and from this analysis we verify whether HR-SDN communication modules can guarantee transmission reliability and data stability. We further analyze the transmission delay required to satisfy data reliability and stability in a specific system with real-time requirements, as well as the system reconfiguration time and data delay caused by data/token packet loss; packet errors occur in the Profibus physical layer. Based on this analysis, we propose a method to enhance reliability in systems that require reliability and stability, and we confirm the proposed method.
Vladimir S. Utkin
The article describes the general problem of safe operation of buildings and structures under the dynamics of permafrost in Russia and other countries. Global warming will lead to large-scale disasters such as failures of buildings and structures, the main reason being a reduction of the bearing capacity and reliability of foundations. To prevent such accidents and reduce their negative consequences, it is necessary to organize observations (monitoring) of the process of decreasing foundation bearing capacity and to develop preventive measures and operational methods for pile reliability analysis. The main load-bearing elements of the foundation are reinforced concrete piles and frozen ground. Reinforced concrete piles tend to lose bearing capacity and reliability both in the upper (above-ground) part and in the part embedded in the soil. The article discusses the reliability analysis of existing reinforced concrete piles in their upper part in permafrost regions, where the piles degrade in the contact zone of seasonal thawing and freezing soil. The evaluation of the probability of failure is important in itself, but also for the reliability of the foundation as a whole, consisting of piles and frozen soil. The authors offer methods for reliability analysis of the upper part of reinforced concrete piles in the contact zone with seasonally thawed soil under different numbers of random variables (fuzzy variables) in the design mathematical model of a limit state by the strength criterion.
Czuchry, Andrew J.; And Others
This report provides a complete guide to the stand alone mode operation of the reliability and maintenance (R&M) model, which was developed to facilitate the performance of design versus cost trade-offs within the digital avionics information system (DAIS) acquisition process. The features and structure of the model, its input data…
Kimiaeifar, Amin; Lund, Erik; Thomsen, Ole Thybo
Reliability analysis coupled with finite element analysis (FEA) of composite structures is computationally very demanding and requires a large number of simulations to achieve an accurate prediction of the probability of failure with a small standard error. In this paper Asymptotic Sampling, which … is a promising and time efficient tool to calculate the probability of failure, is utilized, and a probabilistic model for the reliability analysis of adhesive bonded stepped lap composite joints, representative for the main laminate in a wind turbine blade subjected to static flapwise bending load, is presented. … Three-dimensional (3D) FEA is used for the structural analysis together with a design equation that is associated with a deterministic code-based design equation where reliability is secured by partial safety factors. The Tsai-Wu and the maximum principal stress failure criteria are used to predict …
Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra
In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements contributing to the stability of the tunnel structure are identified in order to address the various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping project cost and time within the limited budget and planned schedule. This paper introduces an approach by which decision-makers can determine the overall reliability of the tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to the field, and much effort has been made to use it in tunneling while investigating the reliability of the lining support system for the tunnel structure. Reliability analysis for evaluating the tunnel support performance is therefore the main idea of this research. Decomposition approaches are used to produce the system block diagram and determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Assuming a linear correlation between safety factors and reliability parameters, isolated reliabilities are determined for the different structural components of the tunnel support system. To determine the individual safety factors, finite element modeling is employed for the different structural subsystems, and the results of the numerical analyses are obtained in …
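Once a system block diagram is available, the failure probability of the whole system follows from standard series/parallel reduction. A toy sketch, with invented component reliabilities rather than values from the case studies:

```python
from math import prod

def series(rels):
    """Series blocks: the system survives only if every block survives."""
    return prod(rels)

def parallel(rels):
    """Parallel (redundant) blocks: the system survives if any block survives."""
    return 1.0 - prod(1.0 - r for r in rels)

# Hypothetical lining: rock bolts and shotcrete in series with a
# redundant pair of drainage paths.
r_drain = parallel([0.90, 0.90])        # redundancy lifts 0.90 to 0.99
r_system = series([0.97, 0.99, r_drain])
print(round(r_system, 4))
```

Nested calls of these two functions reduce any series-parallel block diagram to a single system reliability figure.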
The paper deals with problems of using the dependability terms defined in the current standard STN IEC 50(191): International Electrotechnical Vocabulary, Chapter 191: Dependability and quality of service (1993), in technical systems dependability analysis. The goal of the paper is to find a relation between the terms introduced in the mentioned standard and used in technical systems dependability analysis, and the rules and practices used in the system analysis of systems theory. A description of the part of the system life cycle related to reliability is used as a starting point. This part of the system life cycle is described by a state diagram, and the reliability-relevant terms are assigned to it.
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Because these networks already exist, their state in time is also important: if networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies have been developed analytically for deteriorating networks subjected to seismicity, i.e. earthquakes. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis, as they are relevant for specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems; the method handles many kinds of system and component problems with single or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to the network size. Special network topologies may …
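The adaptive importance sampling scheme mentioned above is more elaborate than can be shown here, but the core trick, sampling near the design point and reweighting by the likelihood ratio, fits in a few lines. Everything below (the one-dimensional limit state, β = 4, the sample sizes) is an illustrative assumption, not the dissertation's method:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def crude_mcs(beta: float, n: int) -> float:
    """Crude Monte Carlo: rarely sees any failures when beta is large."""
    return float(np.mean(rng.standard_normal(n) > beta))

def importance_sampling(beta: float, n: int) -> float:
    """Sample from N(beta, 1), centered on the design point, and reweight
    each sample by the likelihood ratio phi(x) / phi(x - beta)."""
    x = rng.standard_normal(n) + beta
    w = norm.pdf(x) / norm.pdf(x - beta)
    return float(np.mean((x > beta) * w))

beta = 4.0
print("exact Pf :", norm.cdf(-beta))
print("crude MC :", crude_mcs(beta, 100_000))    # usually 0 failures observed
print("IS       :", importance_sampling(beta, 100_000))
```

An adaptive scheme would additionally relocate (and reshape) the sampling density as failure samples accumulate, which is what lets it handle multiple important regions.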
April M. Whaley; Stacey M. L. Hendrickson; Ronald L. Boring; Jing Xing
This is the second of two papers that discuss the literature review conducted as part of the U.S. Nuclear Regulatory Commission (NRC) effort to develop a hybrid human reliability analysis (HRA) method in response to Staff Requirements Memorandum (SRM) SRM-M061020. This review was conducted with the goal of strengthening the technical basis within psychology, cognitive science and human factors for the hybrid HRA method being proposed. An overview of the literature review approach and high-level structure is provided in the first paper, whereas this paper presents the results of the review. The psychological literature review encompassed research spanning the entirety of human cognition and performance, and consequently produced an extensive list of psychological processes, mechanisms, and factors that contribute to human performance. To make sense of this large amount of information, the results of the literature review were organized into a cognitive framework that identifies causes of failure of macrocognition in humans, and connects those proximate causes to psychological mechanisms and performance influencing factors (PIFs) that can lead to the failure. This cognitive framework can serve as a tool to inform HRA. Beyond this, however, the cognitive framework has the potential to also support addressing human performance issues identified in Human Factors applications.
Data was collected in order to further NASA Langley Research Center's Geographic Information System (GIS). Information on LaRC's communication, electrical, and facility configurations was collected. Existing data was corrected through verification, resulting in more accurate databases. In addition, Global Positioning System (GPS) points were used in order to accurately impose buildings on digitized images. Overall, this project will help the Imaging and CADD Technology Team (ICTT) prove GIS to be a valuable resource for LaRC.
Dorner, Daniel G; Calvert, Philip J
If you want to provide an information service that truly fulfils your users' needs, this book is essential reading. The book supports practitioners in developing an information needs analysis strategy and offers the necessary professional skills and techniques to do so.
Objective. To design a bidimensional facial movement measuring tool and study its reliability. Methods. We utilized the free video-analysis software Kinovea, which can track preselected points during movements and measure two-point distances off-line. Three raters positioned facial markers on 10 healthy individuals and video-taped them during maximal bilateral contractions of frontalis, corrugator, orbicularis oculi, zygomaticus, orbicularis oris, and buccinator, on two occasions. Each rater also analyzed the first video twice, one week apart. For each muscle, intrarater reliability was measured by percent agreements (PA) and intraclass correlation coefficients (ICC) between two assessments of the same video one week apart and between assessments of two videos collected one week apart. Interrater reliability was measured by PA, ICC, and coefficients of variation (CV) between assessments of the first video-recording by the three raters. Results. Intrarater and interrater reliabilities were good to excellent for frontalis (PA and ICC > 70%; CV < 15%), moderate for orbicularis oculi, zygomaticus, and orbicularis oris, and poor for corrugator and buccinator. Discussion. Without formal prior training, the proposed method was reliable for frontalis in healthy subjects. Improved marker selection, training sessions, and testing reliability in patients with facial paresis may enhance reliability for orbicularis oculi, zygomaticus, and orbicularis oris.
Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling. Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques. This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
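In the spirit of the book, a crude Monte Carlo estimate of a small system's failure probability, together with its standard error, takes only a few lines. The 2-out-of-3 system and the component failure probability are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def mcs_system_failure(n: int = 100_000, p_comp: float = 0.05):
    """Crude Monte Carlo for a system that fails when at least 2 of its
    3 independent components fail; returns (estimate, standard error)."""
    component_fails = rng.random((n, 3)) < p_comp
    system_fails = component_fails.sum(axis=1) >= 2
    pf = float(system_fails.mean())
    se = float(np.sqrt(pf * (1.0 - pf) / n))  # binomial standard error
    return pf, se

pf, se = mcs_system_failure()
print(f"Pf ~ {pf:.5f} +/- {se:.5f}  (exact: 3*0.05^2*0.95 + 0.05^3 = 0.00725)")
```

The standard error shrinks only as 1/sqrt(n), which is why variance-reduction techniques such as importance sampling become essential for rare failures.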
Sørensen, John Dalsgaard
… for extreme and fatigue limit states are presented. Operation & Maintenance planning often follows corrective and preventive strategies based on information from condition monitoring and structural health monitoring systems. A reliability- and risk-based approach is presented where a life-cycle approach …
Vaegler, Sven; Sauer, Otto [Wuerzburg Univ. (Germany). Dept. of Radiation Oncology; Stsepankou, Dzmitry; Hesser, Juergen [University Medical Center Mannheim, Mannheim (Germany). Dept. of Experimental Radiation Oncology
The reduction of dose in cone beam computed tomography (CBCT) arises from decreasing the tube current for each projection as well as from reducing the number of projections. In order to maintain good image quality, sophisticated image reconstruction techniques are required. Prior Image Constrained Compressed Sensing (PICCS) incorporates prior images into the reconstruction algorithm and outperforms the widely used Feldkamp-Davis-Kress (FDK) algorithm when the number of projections is reduced. However, prior images that contain major variations have so far not been appropriately considered in PICCS. We therefore propose the partial-PICCS (pPICCS) algorithm. This framework is a problem-specific extension of PICCS that additionally enables the incorporation of the reliability of the prior images. We assumed that the prior images are composed of areas with large and small deviations; accordingly, a weighting matrix accounts for the assigned areas in the objective function. We applied our algorithm to the problem of image reconstruction from few views, using simulations with a computer phantom as well as clinical CBCT projections from a head-and-neck case. All prior images contained large local variations. The reconstructed images were compared to the reconstruction results of the FDK algorithm, of Compressed Sensing (CS) and of PICCS. To show the gain in image quality we compared image details with the reference image and used quantitative metrics (root-mean-square error (RMSE), contrast-to-noise ratio (CNR)). The pPICCS reconstruction framework yields images with substantially improved quality even when the number of projections is very small. The images contained less streaking, less blurring and fewer inaccurately reconstructed structures compared to the images reconstructed by FDK, CS and conventional PICCS. The increased image quality is also reflected in large RMSE differences. We proposed a modification of the original PICCS algorithm. The pPICCS algorithm …
Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.
While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
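For a linear limit state in normal variables, the FORM reliability index is available in closed form, which makes the FORM-versus-MCS comparison discussed above easy to reproduce on a toy problem. All means and standard deviations below are invented:

```python
import numpy as np
from scipy.stats import norm

# Limit state g = R - S with independent normal resistance R and load S.
mu_R, sd_R = 300.0, 30.0   # hypothetical resistance
mu_S, sd_S = 200.0, 40.0   # hypothetical load effect

# FORM: for a linear g in normal variables the reliability index is exact.
beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)
pf_form = norm.cdf(-beta)

# Monte Carlo simulation check.
rng = np.random.default_rng(7)
n = 1_000_000
g = rng.normal(mu_R, sd_R, n) - rng.normal(mu_S, sd_S, n)
pf_mcs = float(np.mean(g < 0.0))

print(f"beta = {beta:.3f}, Pf(FORM) = {pf_form:.2e}, Pf(MCS) = {pf_mcs:.2e}")
```

On nonlinear performance functions, such as a response surface fitted from numerical simulations, FORM linearizes at the design point; that linearization is one place where a discrepancy like the reported 24% error can enter.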
A.J. Moniz (Andy)
Traditionally, equity investors have relied upon the information reported in firms' financial accounts to make their investment decisions. Due to the conservative nature of accounting standards, firms cannot value their intangible assets such as corporate culture, brand value and …
This thesis deals with ultimate strength and reliability analysis of offshore production ships, accounting for stochastic load combinations, using a typical North Sea production ship for reference. A review of methods for structural reliability analysis is presented. Probabilistic methods are established for the still water and vertical wave bending moments. Linear stress analysis of a midships transverse frame is carried out, and four different finite element models are assessed. Upon verification of the general finite element code ABAQUS with a typical ship transverse girder example, for which test results are available, ultimate strength analysis of the reference transverse frame is made to obtain the ultimate load factors associated with the specified pressure loads in Det norske Veritas Classification rules for ships and rules for production vessels. Reliability analysis is performed to develop appropriate design criteria for the transverse structure. It is found that the transverse frame failure mode does not seem to contribute to the system collapse. Ultimate strength analysis of the longitudinally stiffened panels is performed, accounting for the combined biaxial and lateral loading. Reliability based design of the longitudinally stiffened bottom and deck panels is accomplished regarding the collapse mode under combined biaxial and lateral loads. 107 refs., 76 figs., 37 tabs.
Boker, Abdulaziz; Brownell, Laurence; Donen, Neil
To compare three anxiety scales, the anxiety visual analogue scale (VAS), the anxiety component of the Amsterdam preoperative anxiety and information scale (APAIS), and the state portion of the Spielberger state-trait anxiety inventory (STAI), for the assessment of preoperative anxiety levels in same-day admission patients. Patients completed the three anxiety assessment scales both before and after seeing the anesthesiologist preoperatively. The scales used were the STAI, the six-question APAIS, and the VAS. The APAIS was further subdivided to assess anxiety about anesthesia (sum A), anxiety about surgery (sum S) and a combined anxiety total (sum C = sum A + sum S). These scales were compared to one another. Pearson's correlation (pair-wise deletion) was used for validity testing. Cronbach's alpha analysis was used to test the internal validity of the various components of the APAIS scale. A correlation coefficient (r) ≥ 0.6 and P … The scale sets were completed by 197 patients. There was a significant and positive correlation between VAS and STAI (r = 0.64, P …); the anxiety components of the APAIS (sum C) and desire for information were 0.84 and 0.77, respectively. In addition to the VAS, the anxiety component of the APAIS (sum C) is a promising new practical tool to assess preoperative patient anxiety levels.
Mosneron-Dupin, F.; Reer, B.; Heslinga, G.; Straeter, O.; Gerdes, V.; Saliou, G.; Ullwer, W
As an informal working group of researchers from France, Germany and The Netherlands created in 1993, the EARTH association is investigating significant subjects in the field of human reliability analysis (HRA). Our initial review of cases from nuclear operating experience showed that decision-based unrequired actions (DUA) contribute to risk significantly on the one hand. On the other hand, our evaluation of current HRA methods showed that these methods do not cover such actions adequately. Especially, practice-oriented guidelines for their predictive identification are lacking. We assumed that a basic cause for such difficulties was that these methods actually use a limited representation of the stimulus-organism-response (SOR) paradigm. We proposed a human-centered model, which better highlights the active role of the operators and the importance of their culture, attitudes and goals. This orientation was encouraged by our review of current HRA research activities. We therefore decided to envisage progress by identifying cognitive tendencies in the context of operating and simulator experience. For this purpose, advanced approaches for retrospective event analysis were discussed. Some orientations for improvements were proposed. By analyzing cases, various cognitive tendencies were identified, together with useful information about their context. Some of them match psychological findings already published in the literature, some of them are not covered adequately by the literature that we reviewed. Finally, this exploratory study shows that contextual and case-illustrated findings about cognitive tendencies provide useful help for the predictive identification of DUA in HRA. More research should be carried out to complement our findings and elaborate more detailed and systematic guidelines for using them in HRA studies.
Deep neural networks with random Gaussian weights: A universal classification strategy?, IEEE Trans. Signal Processing, 2016, to appear; M. Tepper, … The work produced new applications, new camera designs, and numerous DoD and industrial technology transfers. Subject terms: information acquisition, integration. … Results transitioned to the Department of Defense (ONR, ARO, NGA), to the NIH, and to industry (Adobe, LSS, etc.), and resulted in numerous awards.
Wong, Arnold Y L; Parent, Eric C; Kawchuk, Greg N
Reliability study. To compare the within- and between-day intrarater reliability of rehabilitative ultrasound imaging (RUSI) using static images (static RUSI) and video clips (video RUSI) to quantify multifidus muscle thickness at rest and while contracted. Secondary objectives were to compare the measurement precision of averaging multiple measures and to estimate reliability in individuals with and without low back pain (LBP). Although the intrarater reliability of static RUSI in measuring multifidus thickness has been established, using video RUSI may improve reliability estimates, as it allows examiners to select the optimal image from a video clip. Further, multiple measurements and LBP status may affect RUSI reliability estimates. Static RUSI and video RUSI were used to quantify multifidus muscle thickness at rest and during contraction, and percent thickness change, in 27 volunteers (13 without LBP and 14 with LBP). Three static RUSI images and 3 video RUSI video clips were collected in each of 2 sessions 1 to 4 days apart. Reliability and precision were assessed using intraclass correlation coefficients, standard error of measurement, minimal detectable change, bias, and 95% limits of agreement. Using an average of 2 measures yielded optimal measurement precision for static RUSI and video RUSI. Based on the average of 2 measures obtained under the same circumstances, there was no significant difference in the reliability estimates between static RUSI and video RUSI across all testing conditions. Reliability point estimates (intraclass correlation coefficient model 3,2) of multifidus thickness were 0.99 for within-day comparisons and ranged from 0.93 to 0.98 for between-day comparisons. The within- and between-day intraclass correlation coefficients (model 3,2) of percent thickness change ranged from 0.97 to 0.99 and from 0.80 to 0.90, respectively. The exploratory analysis showed no significant difference in the reliability estimates between asymptomatic and LBP …
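The reliability statistics used in studies like this (ICC model 3,k, standard error of measurement, minimal detectable change) can be reproduced from first principles. A sketch with invented thickness data, not the study's measurements:

```python
import numpy as np

def icc_3k(ratings: np.ndarray) -> float:
    """ICC(3,k): two-way mixed effects, consistency, average of k measures.
    ratings has shape (n_subjects, k_repeated_measures)."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()   # between sessions
    ss_total = ((ratings - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return float((ms_rows - ms_err) / ms_rows)

# Hypothetical resting multifidus thickness (mm): 6 subjects, day 1 vs day 2.
thickness = np.array([[30.1, 30.4],
                      [27.8, 28.2],
                      [33.0, 32.5],
                      [25.6, 25.9],
                      [29.4, 29.1],
                      [31.2, 31.6]])
icc = icc_3k(thickness)
sem = thickness.std(ddof=1) * np.sqrt(1.0 - icc)  # standard error of measurement
mdc = 1.96 * np.sqrt(2.0) * sem                   # minimal detectable change
print(f"ICC(3,2) = {icc:.3f}, SEM = {sem:.2f} mm, MDC = {mdc:.2f} mm")
```

The MDC is the smallest between-day change that exceeds measurement noise at the 95% level, which is why high ICCs translate into clinically usable precision.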
Nikabdullah, N. [Department of Mechanical and Materials Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia and Institute of Space Science (ANGKASA), Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); Singh, S. S. K.; Alebrahim, R.; Azizi, M. A. [Department of Mechanical and Materials Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); K, Elwaleed A. [Institute of Space Science (ANGKASA), Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); Noorani, M. S. M. [School of Mathematical Sciences, Faculty of Science and Technology, Universiti Kebangsaan Malaysia (Malaysia)
The aim of this paper is to present the reliability analysis and prediction of mixed-mode loading using a simple two-state Markov chain model for an automotive crankshaft. Reliability analysis and prediction for any automotive component or structure is important for analyzing and measuring failures in order to increase the design life and to eliminate or reduce the likelihood of failures and safety risks. The mechanical failures of the crankshaft are due to high bending and torsional stress concentrations arising from high- and low-cycle rotating bending and torsional stress. The Markov chain was used to model the two states based on the probability of failure due to bending and torsion stress. Most investigations reveal that bending stress is much more severe than torsional stress, so the probability criterion for the bending state is higher than for the torsion state. A statistical comparison between the developed Markov chain model and field data was made to observe the percentage of error. The reliability analysis and prediction derived from the Markov chain model are illustrated through the Weibull probability and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov chain model generates data close to the field data with a minimal percentage of error, and for practical application the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed-mode loading.
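A chain of this kind can be sketched with one operating state and two absorbing failure states, so that the bending and torsion failure modes are tracked separately. The per-cycle probabilities below are arbitrary illustrations, not the paper's fitted values:

```python
import numpy as np

# Hypothetical per-cycle failure probabilities; bending is the severer mode.
p_bend, p_tors = 1.0e-4, 2.0e-5

# States: 0 = operating, 1 = failed by bending, 2 = failed by torsion.
P = np.array([[1.0 - p_bend - p_tors, p_bend, p_tors],
              [0.0,                   1.0,    0.0],
              [0.0,                   0.0,    1.0]])

def state_probs(n_cycles: int) -> np.ndarray:
    """Distribution over the three states after n load cycles,
    starting from the operating state."""
    return np.linalg.matrix_power(P, n_cycles)[0]

for n in (1_000, 10_000, 100_000):
    ok, fb, ft = state_probs(n)
    print(f"n = {n:>7}: R(n) = {ok:.4f}, P(bending) = {fb:.4f}, P(torsion) = {ft:.4f}")
```

With constant per-cycle probabilities the chain yields a geometric (constant-hazard) reliability curve; cycle-dependent probabilities would be needed to reproduce the wear-out portion of a bathtub curve.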
Petersen, Kurt Erling
Risk and reliability analysis is increasingly used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during operation, with the purpose of improving safety or reliability. Due to plant complexity and safety and availability requirements, sophisticated tools that are flexible and efficient are needed. Such tools have been developed over the last 20 years and must be continuously refined to meet growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for calculating the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used, especially in the analysis of very...
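Structural reliability calculations of the kind mentioned above are typically phrased as estimating a failure probability P[g(X) < 0] for a limit-state function g. A minimal Monte Carlo sketch, with assumed normal distributions for resistance and load (this is not the numerical-integration program described in the abstract), is:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Limit state g(R, S) = R - S: failure when the load effect S exceeds the
# resistance R. The distribution parameters below are assumptions.
R = rng.normal(loc=10.0, scale=1.0, size=n)  # resistance
S = rng.normal(loc=6.0, scale=1.5, size=n)   # load effect

pf = np.mean(R - S < 0.0)                    # Monte Carlo failure probability
beta = 4.0 / np.sqrt(1.0**2 + 1.5**2)        # analytic reliability index
```

Here R - S is itself normal with mean 4 and standard deviation sqrt(3.25), so the Monte Carlo estimate should approach Phi(-beta), roughly 1.3% for these parameters.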
Human reliability guidance - How to increase the synergies between human reliability, human factors, and system design and engineering. Phase 2: The American Point of View - Insights of how the US nuclear industry works with human reliability analysis
Oxstrand, J. (Vattenfall Ringhals AB, Stockholm (Sweden))
The main goal of this Nordic Nuclear Safety Research Council (NKS) project is to produce guidance for how to use human reliability analysis (HRA) to strengthen overall safety. The project consists of two substudies: The Nordic Point of View - A User Needs Analysis, and The American Point of View - Insights of How the US Nuclear Industry Works with HRA. The purpose of the Nordic Point of View study was a user needs analysis that aimed to survey current HRA practices in the Nordic nuclear industry, with the main focus being to connect HRA to system design. In this study, 26 Nordic (Swedish and Finnish) nuclear power plant specialists with research, practitioner, and regulatory expertise in HRA, PRA, HSI, and human performance were interviewed. This study was completed in 2009. This study concludes that HRA is an important tool when dealing with human factors in control room design or modernizations. The Nordic Point of View study showed areas where the use of HRA in the Nordic nuclear industry could be improved. To gain more knowledge about how these improvements could be made, and what improvements to focus on, the second study was conducted. The second study is focused on the American nuclear industry, which has many more years of experience with risk assessment and human reliability than the Nordic nuclear industry. Interviews were conducted to collect information to help the author understand the similarities and differences between the American and the Nordic nuclear industries, and to find data regarding the findings from the first study. The main focus of this report is to identify potential HRA improvements based on the data collected in the American Point of View survey. (Author)
Background: Breast cancer is the most common cause of cancer-related deaths among women worldwide. The aim of the present study was to determine and compare knowledge, behavior and attitudes concerning breast self-examination (BSE) among female nurses and teachers. Methods: Two hundred and eighty-nine women working in Aydin, Turkey (125 nurses and 164 teachers) were included in the study. The data were collected using a questionnaire designed to measure the knowledge, attitudes and behavior of the groups. Analysis involved percentiles, χ2 tests, t tests and factor analysis. Results: The knowledge of nurses about BSE was higher than that of teachers (81.5% versus 45.1%), and skills in performing self-examination were also higher in nurses. Conclusion: We conclude that nurses and teachers should be supported with information enabling them to accomplish their roles in the community. To improve BSE practice, it is crucial to coordinate continuous and planned education.
Singh, Salvinder; Abdullah, Shahrum; Nik Mohamed, Nik Abdullah; Mohd Noorani, Mohd Salmi
The reliability assessment for an automobile crankshaft provides an important understanding in dealing with the design life of the component in order to eliminate or reduce the likelihood of failure and safety risks. Failure of the crankshaft is considered catastrophic, as it leads to severe failure of the engine block and its connecting subcomponents. The reliability of an automotive crankshaft under mixed-mode loading is studied using the Markov Chain Model. The Markov Chain is modelled with a two-state condition to represent the bending and torsion loads that occur on the crankshaft. The automotive crankshaft represents a good case study of a component under mixed-mode loading due to the rotating bending and torsion stresses. An estimate of the Weibull shape parameter is used to obtain the probability density function, cumulative distribution function, hazard and reliability rate functions, the bathtub curve, and the mean time to failure. It is shown how the various values of the shape parameter can be used to model the failure characteristics through the bathtub curve. Likewise, an understanding of the patterns posed by the hazard rate can be used to improve the design and increase the life cycle based on the reliability and dependability of the component. The proposed reliability assessment provides an accurate, efficient, fast and cost-effective reliability analysis in contrast to costly and lengthy experimental techniques.
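The Weibull quantities the abstract lists (reliability, hazard rate and mean time to failure, driven by a shape parameter beta and scale parameter eta) can be written down directly. This is a generic two-parameter Weibull sketch, not the authors' fitted model:

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta) for a two-parameter Weibull model."""
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """h(t) = (beta/eta) * (t/eta)^(beta-1); beta < 1, = 1 and > 1 give the
    decreasing, constant and increasing legs of the bathtub curve."""
    return (beta / eta) * (t / eta) ** (beta - 1)

def weibull_mttf(beta, eta):
    """Mean time to failure: MTTF = eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)
```

With beta = 1 the model reduces to the exponential case (constant hazard 1/eta, MTTF = eta), which is the flat floor of the bathtub curve between the infant-mortality and wear-out regions.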
Haruna, Hussein; Tshuma, Ndumiso; Hu, Xiao
Understanding health information needs and health-seeking behavior is a prerequisite for developing an electronic health information literacy (EHIL) or eHealth literacy program for nondegree health sciences students. At present, interest in researching health information needs and reliable-source paradigms has gained momentum in many countries. However, most studies focus on health professionals and students in higher education institutions. The present study aimed to provide new insight and fill the existing gap by examining health information needs and the reliability of sources among nondegree health sciences students in Tanzania. A cross-sectional study was conducted in 15 conveniently selected health training institutions, in which 403 health sciences students participated. Thirty health sciences students were both purposely and conveniently chosen from each health training institution. The selected students were pursuing nursing and midwifery, clinical medicine, dentistry, environmental health sciences, pharmacy, and medical laboratory sciences courses, and were in their first, second, or third year of study. Health sciences students' health information needs center on their educational requirements, clinical practice, and personal information. They use print, human, and electronic health information sources. They lack eHealth research skills for navigating health information resources and face insufficient facilities for accessing eHealth information, a lack of specialists in health information, high costs for subscription electronic information, and unawareness of freely available Internet and other online health-related databases. This study found that nondegree health sciences students have limited EHIL skills. Thus, designing and incorporating EHIL skills programs into the curriculum of nondegree health sciences students is vital. EHIL is a requirement common to all health settings, learning environments, and
I. S. Shumilov
The paper deals with design requirements for an aviation fuel system (AFS): basic AFS design requirements, reliability, and design precautions to avoid AFS failure. It compares the reliability and fail-safety of the AFS and the aircraft hydraulic system (AHS), considers promising alternative ways to raise the reliability of fuel systems, and elaborates recommendations to improve the reliability of pipeline system components and pipeline systems in general, based on the selection of design solutions. It is extremely advisable to design the AFS and AHS in accordance with Aviation Regulations AP25 and the Accident Prevention Guidelines of ICAO (International Civil Aviation Organization), which will reduce the risk of emergency situations and in some cases even avoid heavy disasters. AFS and AHS designs should be based on uniform principles to ensure the highest reliability and safety. Currently, however, this principle is not fully observed, and the AFS loses in reliability and fail-safety as compared with the AHS. For the examined failures (single failures and their combinations), the guidelines to ensure AFS efficiency should be the same as those adopted in Regulations AP25 for the AHS. This will significantly increase the reliability and fail-safety of fuel systems and aircraft flights in general, despite a slight increase in AFS mass. The proposed improvements, through redundancy of fuel system components, will greatly raise the reliability of the fuel system of a passenger aircraft, which will withstand up to two failures without serious consequences for the flight; its reliability and fail-safety will then be similar to those of the AHS, although the above measures lead to a slightly increased total mass of the fuel system. It is advisable to set a second pump on the engine in parallel with the first one, to run in case the first fails for some reason. The second pump, like the first, can be driven from the
Authen, S.; Larsson, J. (Risk Pilot AB, Stockholm (Sweden)); Bjoerkman, K.; Holmberg, J.-E. (VTT, Helsingfors (Finland))
Digital protection and control systems are appearing as upgrades in older nuclear power plants (NPPs) and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital system upgrades, quantitative reliability models are needed for digital systems. Due to the many unique attributes of these systems, challenges exist in systems analysis, modeling, and data collection. Currently there is no consensus on reliability analysis approaches. Traditional methods have clear limitations, but more dynamic approaches are still at the trial stage and can be difficult to apply in full-scale probabilistic safety assessments (PSA). Few PSAs worldwide include reliability models of digital I&C systems. A comparison of Nordic experiences and a literature review of the main international references have been performed in this pre-study project. The study shows a wide range of approaches, and also indicates that no state of the art currently exists. The study shows areas where the different PSAs agree and gives the basis for development of a common taxonomy for reliability analysis of digital systems. It is still an open matter whether software reliability needs to be explicitly modelled in the PSA. The most important issue concerning software reliability is a proper description of the impact that software-based systems have on the dependence between the safety functions and the structure of accident sequences. In general, the conventional fault tree approach seems to be sufficient for modelling functions of the reactor-protection-system type. The following focus areas have been identified for further activities: 1. A common taxonomy of hardware and software failure modes of digital components for common use. 2. Guidelines regarding the level of detail in system analysis and the screening of components, failure modes and dependencies. 3. An approach for modelling CCF between components (including software). (Author)
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.
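For releases exhibiting reliability growth, fitting an SRGM means estimating a mean value function from cumulative defect data. A minimal sketch with the Goel-Okumoto NHPP model and a crude grid search is below; the defect counts are synthetic, and the paper itself fits Log-Logistic and S-Shaped NHPP models with proper estimation procedures:

```python
import math

# Synthetic cumulative defect counts per test week (illustrative data,
# not from the mission described in the paper).
weeks = [1, 2, 3, 4, 5, 6, 7, 8]
defects = [12, 21, 28, 33, 37, 40, 42, 43]

def go_mean_value(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a * (1 - exp(-b t)),
    where a is the expected total defect content and b the detection rate."""
    return a * (1.0 - math.exp(-b * t))

def sse(a, b):
    """Sum of squared errors of the model against the observed counts."""
    return sum((go_mean_value(t, a, b) - d) ** 2 for t, d in zip(weeks, defects))

# Crude grid search standing in for a proper nonlinear least-squares fit.
a_hat, b_hat = min(
    ((a, b) for a in range(40, 61) for b in [i / 100 for i in range(10, 60)]),
    key=lambda p: sse(*p),
)

# Predicted failure intensity at the end of testing: m'(t) = a*b*exp(-b t).
residual_rate = a_hat * b_hat * math.exp(-b_hat * weeks[-1])
```

A flattening intensity (small `residual_rate`) is the signature of reliability growth; releases with decay, as the paper notes, would not fit such a concave mean value function and call for a different distribution.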
Alvarengga, Marco Antonio Bayout; Fonseca, Renato Alves da, E-mail: email@example.com, E-mail: firstname.lastname@example.org [Comissao Nacional de Energia Nuclear (CNEN) Rio de Janeiro, RJ (Brazil); Melo, Paulo Fernando Frutuoso e, E-mail: email@example.com [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear
Human errors in human reliability analysis can be classified generically as errors of omission and errors of commission. Omission errors are related to the omission of a human action that should have been performed but does not occur. Errors of commission are related to human actions that should not be performed but in fact are performed. Both involve specific types of cognitive error mechanisms; however, errors of commission are more difficult to model because they are characterized by non-anticipated actions that are performed instead of others that are omitted (omission errors), or that enter an operational task without being part of its normal sequence. The identification of actions that are not supposed to occur depends on the operational context, which will influence or facilitate certain unsafe actions of the operator depending on the performance of its operational parameters and variables. The survey of operational contexts and associated unsafe actions is a characteristic of second-generation models, unlike first-generation models. This paper discusses how first-generation models can treat errors of commission in the steps of detection, diagnosis, decision-making and implementation in human information processing, particularly with the use of THERP error quantification tables. (author)
The study focused on yield-reliability analysis and operating rules for optimum scheduling of run-of-river (ROR) abstractions for typical rural water supply schemes using Siloam Village, Limpopo Province, South Africa, as a case study. Efficient operation of water supply systems requires operating rules as decision support ...
The reliability of a method using ^1^H NMR analysis for assessment of oil oxidation at a frying temperature was examined. During heating and frying at 180 °C, changes of soybean oil signals in the ^1^H NMR spectrum including olefinic (5.16-5.30 ppm), bisallylic (2.70-2.88 ppm), and allylic (1.94-2.1...
Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri
The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics...
Jonkman, S.N.; Schweckendiek, T.
This paper presents an overview of advances in flood risk and levee reliability analysis in the Netherlands. It describes how new safety standards – in the form of a target failure probability – have been derived on the basis of nationwide flood risk assessments which take into account both
Boudali, H.; Dugan, J.B.
We present a continuous-time Bayesian network (CTBN) framework for dynamic systems reliability modeling and analysis. Dynamic systems exhibit complex behaviors and interactions between their components; where not only the combination of failure events matters, but so does the sequence ordering of
Yu, Taeho; Richardson, Jennifer C.
The purpose of this study was to develop an effective instrument to measure student readiness in online learning with reliable predictors of online learning success factors such as learning outcomes and learner satisfaction. The validity and reliability of the Student Online Learning Readiness (SOLR) instrument were tested using exploratory factor…
Ivanov, A I; Lapa, V V; Davydov, V V; Riabinin, V A; Golosov, S Iu
Results of tachistoscopic experiments on the reliability of symbol recognition on an LCD panel as a function of screen resolution (640 x 480, 800 x 600 and 1024 x 768 pixels), angular size of a picture element (10, 15, 20 and 30 arc minutes) and luminance contrast (LC) with the background (0.2 to 1.4 standard units) are presented. The obtained quantitative relations indicate the significance of these parameters for recognition reliability. Symbols with a size of 30 arc minutes and an LC of 0.5 were recognizable irrespective of screen resolution. Recognition of symbols 20 and 15 arc minutes in size depended strongly on screen resolution and symbol LC. For symbols of size 10 arc minutes and LC ≥ 1.0, the recognition probability did not exceed 0.59-0.7.
Boyer, Célia; Gaudinat, Arnaud; Hanbury, Allan; Appel, Ron D; Ball, Marion J; Carpentier, Michel; van Bemmel, Jan H; Bergmans, Jean-Paul; Hochstrasser, Denis; Lindberg, Donald; Miller, Randolph; Peterschmitt, Jean-Claude; Safran, Charles; Thonnet, Michèle; Geissbühler, Antoine
Accessing online health content of high quality and reliability presents challenges. Laypersons cannot easily differentiate trustworthy content from misinformed or manipulated content. This article describes complementary approaches for members of the general public and health professionals to find trustworthy content with as little bias as possible. These include the Khresmoi health search engine (K4E), the Health On the Net Code of Conduct (HONcode) and health trust indicator Web browser extensions.
Jones, Corinne A.; Hoffman, Matthew R.; Geng, Zhixian; Abdelhalim, Suzan M.; Jiang, Jack J.; McCulloch, Timothy M.
Purpose: The purpose of this study was to investigate inter- and intrarater reliability among expert users, novice users, and speech-language pathologists with a semiautomated high-resolution manometry analysis program. We hypothesized that all users would have high intrarater reliability and high interrater reliability. Method: Three expert…
We present a framework for the estimation of transfer entropy (TE) under the conditions typical of physiological system analysis, featuring short multivariate time series and the presence of instantaneous causality (IC). The framework is based on recognizing that TE can be interpreted as the difference between two conditional entropy (CE) terms, and builds on an efficient CE estimator that compensates for the bias occurring for high-dimensional conditioning vectors and follows a sequential embedding procedure whereby the conditioning vectors are formed progressively according to a criterion for CE minimization. The issue of IC is faced by accounting for zero-lag interactions according to two alternative empirical strategies: if IC is deemed physiologically meaningful, zero-lag effects are assimilated to lagged effects to make them causally relevant; if not, zero-lag effects are incorporated in both CE terms to obtain a compensation. The resulting compensated TE (cTE) estimator is tested on simulated time series, showing that its utilization improves sensitivity (from 61% to 96%) and specificity (from 5/6 to 0/6 false positives) in the detection of information transfer when instantaneous effects are, respectively, causally meaningful and non-meaningful. Then, it is evaluated on examples of cardiovascular and neurological time series, supporting the feasibility of the proposed framework for the investigation of physiological mechanisms.
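The core identity the framework builds on, TE as a difference of two conditional entropy terms, can be demonstrated with a naive plug-in estimator on discrete series. This sketch omits the paper's bias compensation and sequential embedding, and uses an embedding length of 1:

```python
import math
import random
from collections import Counter

def entropy(samples):
    """Plug-in Shannon entropy (bits) of a sequence of hashable symbols."""
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in Counter(samples).values())

def conditional_entropy(target, cond):
    """H(target | cond) = H(target, cond) - H(cond)."""
    return entropy(list(zip(target, cond))) - entropy(list(cond))

def transfer_entropy(x, y):
    """TE_{X->Y} written as the difference of two CE terms:
    H(y_n | y_{n-1}) - H(y_n | y_{n-1}, x_{n-1})."""
    y_now, y_past, x_past = y[1:], y[:-1], x[:-1]
    return (conditional_entropy(y_now, y_past)
            - conditional_entropy(y_now, list(zip(y_past, x_past))))

# Demo: y copies x with a lag of one sample, so information flows x -> y only.
random.seed(1)
x = [random.randint(0, 1) for _ in range(2000)]
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)  # close to H(X) = 1 bit
te_yx = transfer_entropy(y, x)  # close to 0
```

Because the plug-in CE is computed from the same empirical distribution in both terms, the difference is always non-negative; the bias that grows with the dimension of the conditioning vector is exactly what the paper's compensated estimator addresses.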
Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.
This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.
Nelson, Eve-Lynn; Miller, Edward Alan; Larson, Kiley A
This study's purpose was to adapt the Roter Interaction Analysis System (RIAS) for telemedicine clinics and to investigate the adapted measure's reliability. The study also sought to better understand the volume of technology-related utterance in established telemedicine clinics and the feasibility of using the measure within the telemedicine setting. This initial evaluation is a first step before broadly using the adapted measure across technologies and raters. An expert panel adapted the RIAS for the telemedicine context. This involved accounting for all consultation participants (patient, provider, presenter, family) and adding technology-specific subcategories. Ten new and 36 follow-up telemedicine encounters were videotaped and double coded using the adapted RIAS. These consisted primarily of follow-up visits (78.0%) involving patients, providers, presenters, and other parties. Reliability was calculated for those categories with 15 or more utterances. Traditional RIAS categories related to socioemotional and task-focused clusters had fair to excellent levels of reliability in the telemedicine setting. Although there were too few utterances to calculate the reliability of the specific technology-related subcategories, the summary technology-related category proved reliable for patients, providers, and presenters. Overall patterns seen in traditional patient-provider interactions were observed, with the number of provider utterances far exceeding patient, presenter, and family utterances, and few technology-specific utterances. The traditional RIAS is reliable when applied across multiple participants in the telemedicine context. Reliability of technology-related subcategories could not be evaluated; however, the aggregate technology-related cluster was found to be reliable and may be especially relevant in understanding communication patterns with patients new to the telemedicine setting. Use of the RIAS instrument is encouraged to facilitate comparison
Akhmetova, I. G.; Chichirova, N. D.
Heat supply is the most energy-consuming sector of the economy: approximately 30% of all primary fuel and energy resources used is spent on municipal heat supply needs. One of the key indicators of the activity of heat supply organizations is the reliability of an energy facility, and the reliability index of a heat supply organization is of interest to potential investors for assessing risks when investing in projects. The reliability indices established by federal legislation actually reduce to a single numerical factor, which depends on the number of heat supply outages caused by disturbances in the operation of heat networks and the volume of their resource recovery in the calculation year. This factor is rather subjective and may change over a wide range within a few years. A technique is proposed for evaluating the reliability of heat supply organizations using the simple additive weighting (SAW) method. The technique for determining the integrated index satisfies the following conditions: the reliability level of the evaluated heat supply system is represented as fully and objectively as possible, and the information used to evaluate the reliability index is easily available (it is published on the Internet in accordance with data disclosure standards). For the reliability estimation of heat supply organizations, the following indicators were selected: the wear of equipment of thermal energy sources; the wear of heat networks; the number of outages of thermal energy (heat carrier) supply due to technological disturbances on heat networks, per 1 km of heat networks; the number of outages of thermal energy (heat carrier) supply due to technological disturbances on thermal energy sources, per 1 Gcal/h of installed power; the share of expenditures in the cost of thermal energy aimed at resource recovery (renewal of fixed assets); the coefficient of renewal of fixed assets; and the coefficient of fixed asset retirement. A versatile program is developed
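A SAW composite index is simply a weighted sum of normalized indicator scores, with "cost" indicators (where lower is better, such as wear) inverted so that higher always means more reliable. The sketch below follows the indicator list in the abstract in spirit, but the names, values and weights are illustrative assumptions:

```python
# Hypothetical indicator values in [0, 1]; the boolean flags mark whether a
# higher value is better (benefit) or worse (cost, e.g. equipment wear).
indicators = {
    "network_wear":   (0.60, False),
    "source_wear":    (0.45, False),
    "outages_per_km": (0.20, False),
    "renewal_share":  (0.30, True),
}
# Illustrative weights, summing to 1.
weights = {
    "network_wear":   0.35,
    "source_wear":    0.25,
    "outages_per_km": 0.25,
    "renewal_share":  0.15,
}

def saw_index(indicators, weights):
    """Weighted sum of normalized scores; benefit indicators count as-is,
    cost indicators are inverted so that 1.0 is always best."""
    score = 0.0
    for name, (value, higher_is_better) in indicators.items():
        norm = value if higher_is_better else 1.0 - value
        score += weights[name] * norm
    return score

score = saw_index(indicators, weights)
```

Publicly disclosed figures (wear percentages, outage counts, renewal shares) would feed the indicator values directly, which is the data-availability condition the technique imposes.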
Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.
With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is standardized, but the statistical methods used in analyzing these data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possibly dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risks and clustering of failure-repair events. These ideas have been implemented in an analysis tool for component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs.
Varlamov, Vladimir; Ishkhanov, Boris; Orlin, Vadim; Peskov, Nikolai; Stepanov, Mikhail
The majority of the photonuclear reaction cross sections important for many fields of science and technology, and of the various data files (EXFOR, RIPL, ENDF, etc.) supported by the IAEA, were obtained in experiments using quasimonoenergetic annihilation photons. There are well-known systematic discrepancies between the partial photoneutron reactions (γ, 1n), (γ, 2n), and (γ, 3n). For analysis of data reliability, objective physical criteria were proposed. It was found that the experimental data for many nuclei are not reliable because of large systematic uncertainties of the neutron multiplicity sorting method used. An experimental-theoretical method was proposed for evaluating reaction cross section data that satisfy the reliability criteria. The partial and total reaction cross sections were evaluated for many nuclei. In many cases the evaluated data differ noticeably from both the experimental data and the data evaluated earlier for the IAEA Photonuclear Data Library. It has therefore become evident that the IAEA Library needs to be revised and updated.
April M. Whaley; Stacey M. L. Hendrickson; Ronald L. Boring; Jeffrey C. Joe; Katya L. Le Blanc; Jing Xing
In response to Staff Requirements Memorandum (SRM) SRM-M061020, the U.S. Nuclear Regulatory Commission (NRC) is sponsoring work to update the technical basis underlying human reliability analysis (HRA) in an effort to improve the robustness of HRA. The ultimate goal of this work is to develop a hybrid of existing methods addressing limitations of current HRA models and in particular issues related to intra- and inter-method variabilities and results. This hybrid method is now known as the Integrated Decision-tree Human Event Analysis System (IDHEAS). Existing HRA methods have looked at elements of the psychological literature, but there has not previously been a systematic attempt to translate the complete span of cognition from perception to action into mechanisms that can inform HRA. Therefore, a first step of this effort was to perform a literature search of psychology, cognition, behavioral science, teamwork, and operating performance to incorporate current understanding of human performance in operating environments, thus affording an improved technical foundation for HRA. However, this literature review went one step further by mining the literature findings to establish causal relationships and explicit links between the different types of human failures, performance drivers and associated performance measures ultimately used for quantification. This is the first of two papers that detail the literature review (paper 1) and its product (paper 2). This paper describes the literature review and the high-level architecture used to organize the literature review, and the second paper (Whaley, Hendrickson, Boring, & Xing, these proceedings) describes the resultant cognitive framework.
Basu, Asit P; Basu, Sujit K
This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiments, and mixtures of Weibull
Seong, Hwal-Gyeong; Choi, Kyung K.
A numerical method for obtaining accurate shape design sensitivity information for built-up structures is developed and demonstrated through analysis of examples. The basic character of the finite element method, which gives more accurate domain information than boundary information, is utilized for shape design sensitivity improvement. A domain approach for shape design sensitivity analysis of built-up structures is derived using the material derivative idea of structural mechanics and the adjoint variable method of design sensitivity analysis. Velocity elements and B-spline curves are introduced to alleviate difficulties in generating domain velocity fields. The regularity requirements of the design velocity field are studied.
Zhong, Yan; Xu, Tianqiu; Dong, Ruijuan; Lyu, Jing; Liu, Bo; Chen, Xueqing
The aim of this study was to investigate the reliability and validity of the Infant-Toddler Meaningful Auditory Integration Scale (IT-MAIS), the Meaningful Auditory Integration Scale (MAIS), and the Meaningful Use of Speech Scale (MUSS). The IT-MAIS, MAIS and MUSS were each divided into 3 subdimensions. 300 children with cochlear implants (CI) were included in the investigation. To assess the test-retest reliability of these questionnaires, 30 children were selected randomly and evaluated at a two-week interval; there were no significant changes between test and retest. Furthermore, a random test analysis by different evaluators was also administered to 30 users. Test-retest reliability of the three scales proved satisfactory: all domains had correlation coefficients exceeding 0.750. The IT-MAIS, MAIS and MUSS scales have good reliability and validity and can be used to measure hearing and speech outcomes for children with cochlear implants. Copyright © 2017 Elsevier B.V. All rights reserved.
Wan, Lipeng [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wang, Feiyi [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Oral, H. Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cao, Qing [Univ. of Tennessee, Knoxville, TN (United States)
High-performance computing (HPC) storage systems provide data availability and reliability using various hardware and software fault tolerance techniques. Usually, reliability and availability are calculated at the subsystem or component level using limited metrics such as mean time to failure (MTTF) or mean time to data loss (MTTDL). This often means settling on simple and disconnected failure models (such as an exponential failure rate) to achieve tractable, closed-form solutions. However, such models have been shown to be insufficient for assessing end-to-end storage system reliability and availability. We propose a generic simulation framework aimed at analyzing the reliability and availability of storage systems at scale and investigating what-if scenarios. The framework is designed for an end-to-end storage system, accommodating the various components and subsystems, their interconnections, and failure patterns and propagation, and it performs dependency analysis to capture a wide range of failure cases. We evaluate the framework against a large-scale storage system that is in production and analyze its failure projections toward and beyond the end of its lifecycle. We also examine the potential operational impact by studying how different types of components affect the overall system reliability and availability, and we present the preliminary results.
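The contrast the abstract draws between simple closed-form MTTDL metrics and end-to-end simulation can be sketched with a minimal Monte Carlo model. The function name, the two-way-mirror topology, and all parameter values below are illustrative assumptions, not part of the proposed framework:

```python
import random

def mttdl_mirrored_pair(mttf, mttr, trials=1000, seed=1):
    """Monte Carlo mean time to data loss for a two-way mirror with
    exponential failure and repair times (illustrative model only)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        t = 0.0
        while True:
            t += rng.expovariate(2.0 / mttf)      # first replica fails
            repair = rng.expovariate(1.0 / mttr)  # candidate repair time
            second = rng.expovariate(1.0 / mttf)  # candidate second failure
            if second < repair:                   # data lost before repair
                t += second
                break
            t += repair                           # repaired, both up again
        total += t
    return total / trials
```

For MTTF = 1000 h and MTTR = 50 h, the classic approximation MTTF²/(2·MTTR) gives 10,000 h, while the exact Markov-chain mean is 11,500 h; the simulation lands near the latter, which illustrates the abstract's point that closed-form shortcuts can misestimate system-level reliability.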
Wang, Shan; Fan, Wenjie; Yu, Wanqi; Li, Jian; Xu, Dange; Cao, Hongyan; Xi, Ying; Li, Xiuyang
To evaluate the reliability and validity of the SF-36 scale in urban residents, and to provide a reference for the selection of suitable health measurement tools for urban residents. Multi-stage cluster stratified sampling was conducted to select residents aged ≥18 years in three urbanized communities of Hangzhou. The SF-36 scale was used to measure quality of life, and Spearman-Brown and Cronbach's α coefficients were used to evaluate split-half reliability and internal consistency reliability. The convergent and discriminant validity were evaluated using the success rate of experiments, and the criterion-related validity was evaluated with correlation analysis and a non-parametric test. Structural equation modeling was used in the evaluation of construct validity. The SF-36 scale had good split-half reliability (R=0.94) and internal-consistency reliability (except for bodily pain and vitality; Cronbach's α range: 0.70-0.91). The convergent validity (88.57%), discriminant validity (success rate 90.61%) and criterion-related validity (rs=0.56; the score was consistent with the self-reported health status) were good. The second-order confirmatory factor analysis model was not well fitted (GFI=0.721, AGFI=0.682, CFI=0.731, RMR=0.084, RMSEA=0.098), indicating that the construct validity was poor. The reliability, convergent validity, discriminant validity and criterion-related validity of the SF-36 scale were good, while the construct validity was poor. Improvement is needed when the scale is used for urban residents.
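The split-half and internal-consistency statistics the study reports can be computed as follows. This is a toy sketch with invented scores, not the SF-36 data:

```python
from statistics import variance

def spearman_brown(r_half):
    """Step a split-half correlation up to full-test reliability."""
    return 2.0 * r_half / (1.0 + r_half)

def cronbach_alpha(items):
    """Cronbach's alpha from per-item score lists; each inner list holds
    one item's scores for the same respondents, in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # per-respondent totals
    item_variance = sum(variance(scores) for scores in items)
    return (k / (k - 1)) * (1.0 - item_variance / variance(totals))
```

With perfectly correlated toy items such as `[[1, 2, 3], [2, 4, 6]]`, alpha approaches its upper range, mirroring the 0.70-0.91 band the study reports for most domains.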
Osaka, Hiroshi; Shinkoda, Koichi; Watanabe, Susumu; Fujita, Daisuke; Kobara, Kenichi; Yoshimura, Yosuke; Ito, Tomotaka
The purposes of this study were to construct a real-time acceleration gait analysis system equipped with software to analyse real-time trunk acceleration during walking, and to examine the intra-rater and inter-rater reliabilities of this system. The system comprises an accelerometer, an acceleration amplifier, a transmitter, two foot switches, a receiver and a personal computer installed with the real-time acceleration analysis software. The acceleration signals received were analysed using this software, and gait parameters were calculated. The subjects were 20 healthy individuals and two raters. The intra-rater and inter-rater reliabilities of the measurement results obtained from the system were examined using intraclass correlation coefficients (ICC) and Bland-Altman analysis. The intra-rater and inter-rater ICCs ranged from 0.61 to 0.92 across the gait parameters. In the Bland-Altman analysis, neither fixed nor proportional bias was found in any of the gait parameters. The ICC and Bland-Altman results demonstrate that the intra-rater and inter-rater measurements had good reproducibility. With this system, the clinical efficiency of gait analysis and gait training for physiotherapy can be improved. Implication for Rehabilitation: This study focused on the advantages of a gait analysis method using an accelerometer and constructed a gait analysis system that calculates real-time gait parameters from trunk acceleration measurements during walking. The gait analysis using this system has good intra-rater and inter-rater reliabilities, and using this system can improve the clinical efficiency of gait analysis and gait training.
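The Bland-Altman step used in the study — checking for bias between two raters' measurements of the same gait parameter — can be sketched in a few lines. The data here are invented, not the study's:

```python
from statistics import mean, stdev

def bland_altman_limits(rater_a, rater_b):
    """Bland-Altman bias and 95% limits of agreement between paired
    measurements from two raters (toy illustration)."""
    diffs = [a - b for a, b in zip(rater_a, rater_b)]
    bias = mean(diffs)                 # fixed bias if far from zero
    spread = 1.96 * stdev(diffs)       # half-width of limits of agreement
    return bias, bias - spread, bias + spread
```

When zero lies inside the limits and the bias is near zero, no fixed bias is indicated, which is the criterion the abstract reports for all gait parameters.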
Phoenix, S. Leigh; Kezirian, Michael T.; Murthy, Pappu L. N.
Composite Overwrapped Pressure Vessels (COPVs) that have survived a long service time under pressure generally must be recertified before service is extended. Flight certification depends on the reliability analysis to quantify the risk of stress rupture failure in existing flight vessels. Full certification of this reliability model would require a statistically significant number of lifetime tests to be performed, which is impractical given the cost and limited flight hardware for certification testing purposes. One approach to confirm the reliability model is to perform a stress rupture test on a flight COPV. Currently, testing of such a Kevlar 49 (DuPont)/epoxy COPV is nearing completion. The present paper focuses on a Bayesian statistical approach to analyze the possible failure-time results of this test and to assess the implications in choosing between possible model parameter values that in the past have had significant uncertainty. The key uncertain parameters in this case are the actual fiber stress ratio at operating pressure and the Weibull shape parameter for lifetime; the former has been uncertain due to ambiguities in interpreting the original and a duplicate burst test, the latter due to major differences between COPVs in the database and the actual COPVs in service. Any information obtained that clarifies and eliminates uncertainty in these parameters will have a major effect on the predicted reliability of the service COPVs going forward. The key result is that the longer the vessel survives, the more likely the more optimistic stress ratio model is correct. At the time of writing, the resulting effect on predicted future reliability is dramatic, increasing it by about one "nine," that is, reducing the predicted probability of failure by an order of magnitude. However, testing one vessel does not change the uncertainty on the Weibull shape parameter for lifetime, since testing several vessels would be necessary.
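The Bayesian update described — survival of the test vessel shifting posterior weight toward the optimistic stress-ratio model — can be sketched for two candidate Weibull lifetime models. The shape and scale values here are invented stand-ins, not the study's estimates:

```python
import math

def weibull_survival(t, shape, scale):
    """S(t) = exp(-(t/scale)^shape) for a Weibull lifetime model."""
    return math.exp(-((t / scale) ** shape))

def posterior_optimistic(t, prior=0.5, shape=1.2,
                         scale_opt=50.0, scale_pess=5.0):
    """Posterior probability of the optimistic model given that the test
    vessel has survived to time t (all parameters are illustrative)."""
    num = prior * weibull_survival(t, shape, scale_opt)
    den = num + (1.0 - prior) * weibull_survival(t, shape, scale_pess)
    return num / den
```

Because the survival-function ratio grows with time, the posterior for the optimistic model rises monotonically as the vessel keeps surviving, which is exactly the qualitative behavior the abstract reports.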
Simpao, Allan F; Pruitt, Eric Y; Cook-Sather, Scott D; Gurnaney, Harshad G; Rehman, Mohamed A
Manual incident reports significantly under-report adverse clinical events when compared with automated recordings of intraoperative data. Our goal was to determine the reliability of anesthesia information management system (AIMS) and continuous quality improvement (CQI) reports of adverse clinical events that had been witnessed and recorded by research assistants. The AIMS and CQI records of 995 patients aged 2-12 years were analyzed to determine whether anesthesia providers had properly documented the emesis events that were observed and recorded by research assistants who were present in the operating room at the time of induction. Research assistants recorded eight cases of emesis during induction that were confirmed with the attending anesthesiologist at the time of induction. AIMS yielded a sensitivity of 38 % (95 % confidence interval [CI] 8.5-75.5 %), while the sensitivity of CQI reporting was 13 % (95 % CI 0.3-52.7 %). The low sensitivities of the AIMS and CQI reports suggest that user-reported AIMS and CQI data do not reliably include significant clinical events.
Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael
The aim of the present research is to perform an information analysis for internal audit through the application of a complex information system based on fuzzy logic. The approach has been applied to internal audit, integrating the accounting field with the information systems field. Technological advancement can improve the work performed by internal audit. We therefore aim to find, in complex information systems, priorities for the internal audit work of a major private institution of higher education. The method applied is quali-quantitative: by defining strategic linguistic variables, it was possible to transform them into quantitative variables through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the institution's strategic planning, it was possible to infer which points must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we can conclude that, starting from these information systems, audit can identify priorities for its work program. Together with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward the attainment of the organization's objectives.
Brenda L Connors
The unique yield of collecting observational data on human movement has received increasing attention in a number of domains, including the study of decision-making style. As such, interest has grown in the nuances of core methodological issues, including the best ways of assessing inter-rater reliability. In this paper we focus on one key topic – the distinction between establishing reliability for the patterning of behaviors as opposed to the computation of raw counts – and suggest that reliability for each be compared empirically rather than determined a priori. We illustrate by assessing inter-rater reliability for key outcome measures derived from Movement Pattern Analysis (MPA), an observational methodology that records body movements as indicators of decision-making style with demonstrated predictive validity. While reliability ranged from moderate to good for raw counts of behaviors reflecting each of two Overall Factors generated within MPA (Assertion and Perspective), inter-rater reliability for patterning (proportional indicators of each factor) was significantly higher and excellent (ICC = .89). Furthermore, patterning, as compared to raw counts, provided better prediction of observable decision-making process assessed in the laboratory. These analyses support the utility of an empirical approach when deciding between discrete behavioral counts and the patterning of behaviors in determining inter-rater reliability of observable behavior. They also speak to the substantial reliability that may be achieved via application of theoretically grounded observational systems such as MPA that reveal thinking and action motivations via visible movement patterns.
The primary objective of this research is to develop a Drainage Information Analysis and Mapping System (DIAMS), with online inspection data submission, which will comply with the necessary requirements mandated by both the Governmental Accounting...
Oswald, Fred B.; Savage, Michael; Zaretsky, Erwin V.
The U.S. Space Shuttle fleet was originally intended to have a life of 100 flights for each vehicle, lasting over a 10-year period, with minimal scheduled maintenance or inspection. The first space shuttle flight was that of the Space Shuttle Columbia (OV-102), launched April 12, 1981. The disaster that destroyed Columbia occurred on its 28th flight, February 1, 2003, nearly 22 years after its first launch. In order to minimize risk of losing another Space Shuttle, a probabilistic life and reliability analysis was conducted for the Space Shuttle rudder/speed brake actuators to determine the number of flights the actuators could sustain. A life and reliability assessment of the actuator gears was performed in two stages: a contact stress fatigue model and a gear tooth bending fatigue model. For the contact stress analysis, the Lundberg-Palmgren bearing life theory was expanded to include gear-surface pitting for the actuator as a system. The mission spectrum of the Space Shuttle rudder/speed brake actuator was combined into equivalent effective hinge moment loads including an actuator input preload for the contact stress fatigue and tooth bending fatigue models. Gear system reliabilities are reported for both models and their combination. Reliability of the actuator bearings was analyzed separately, based on data provided by the actuator manufacturer. As a result of the analysis, the reliability of one half of a single actuator was calculated to be 98.6 percent for 12 flights. Accordingly, each actuator was subsequently limited to 12 flights before removal from service in the Space Shuttle.
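The weakest-link combination of component lives into a system reliability, and the derivation of a flight limit from a reliability target, can be sketched with a series Weibull model. The component shape and characteristic-life values below are invented for illustration, not the actuator's:

```python
import math

def weibull_reliability(t, shape, char_life):
    """Single-component Weibull reliability R(t)."""
    return math.exp(-((t / char_life) ** shape))

def system_reliability(t, components):
    """Series (weakest-link) system: product of component reliabilities."""
    r = 1.0
    for shape, char_life in components:
        r *= weibull_reliability(t, shape, char_life)
    return r

def max_flights(target, components):
    """Largest whole number of flights n with system reliability >= target."""
    n = 0
    while system_reliability(n + 1, components) >= target:
        n += 1
    return n
```

With a 0.986 target, as in the abstract, the flight limit falls directly out of the point where the declining system reliability curve crosses the target.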
Operation safety of complex industrial systems. Forward-looking analysis and reliability databases; Surete de fonctionnement des systemes industriels complexes. Analyse previsionnelle et bases de donnees de fiabilite
The forward-looking analysis of system failures consists in identifying the conditions that may lead to failures and foreseeing their consequences on the reliability, maintainability, availability and safety of systems at the design stage or at the operation stage. It is performed from various information, whose selection and analysis allow a system model to be designed. The essential information is: a description of the real system (physical and functional structures), the characteristics of the system components and of the interactions between them (failure modes and their consequences), the relations between the system and its environment, and the consideration of human errors at the operation stage. Content: 1 - steps of an operation safety analysis; 2 - functional analysis methods: FAST, RELIASEP, SADT, IDEFO, APTE and other methods; 3 - forward-looking analysis methods: qualitative methods, mixed and quantitative methods, human factors; 4 - reliability databases. (J.S.)
Nemeth, Noel N.
Brittle materials are being used, or considered, for a wide variety of high-tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing components to sustain repeated load without fracturing, while using the minimum amount of material, requires a probabilistic design methodology. The CARES/Life code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. In this presentation an overview of the CARES/Life program will be provided. Emphasis will be placed on describing the latest enhancements to the code for reliability analysis with time-varying loads and temperatures (fully transient reliability analysis). Also described are early efforts in investigating the validity of using Weibull statistics, the basis of the CARES/Life program, to characterize the strength of MEMS structures, as well as the version of CARES/Life for MEMS (CARES/MEMS) being prepared, which incorporates single-crystal and edge-flaw reliability analysis capability. It is hoped this talk will open a dialog for potential collaboration in the area of MEMS testing and life prediction.
Galkina Elena Vladislavovna
In the article, reliability analysis methods for bidders and their tender offers for the implementation of construction works are offered. Special attention is focused on the complexity of these processes and the necessity of participation of serious, professional and responsible executors. Application of the described methods leads to a decrease of the risks related to the selection of a participant of a construction project. The article defines the main stages of the implementation procedure, which allows considering the economic state of applicants and both the economic and technical indicators of tender offers' reliability. The main characteristics to be considered at each stage are revealed. The author concludes that the reliability of bidders is determined by comparing their economic state with their capacity to implement orders with the specified characteristics. In the terminology of the article, the reliability of an applicant's offer is the ability to execute orders on the bidder's own conditions. In addition, the author states that determining reliability is based on comparing tender offers with the contender's characteristics of objects. Rational methods to compare economic indicators are offered. At the same time, it was found that at present the method of comparing the technical indicators of analogous projects with the indicators of a bidder's object is not formalized, which limits its application. Finally, it was concluded that developing the methods applied to technical indicators would provide a coherent system for evaluating the reliability of construction bidders and their offers, creating the basis for an appropriate automated system that can be used both for the selection of competitive organizations and for the preparation of offers by applicants.
Mathematical Analysis of Evolution, Information, and Complexity deals with the analysis of evolution, information and complexity. The time evolution of systems or processes is a central question in science; this text covers a broad range of problems including diffusion processes, neuronal networks, quantum theory and cosmology. Bringing together a wide collection of research in mathematics, information theory, physics and other scientific and technical areas, this new title offers elementary and thus easily accessible introductions to the various fields of research addressed in the book.
Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L. [Pacific Northwest Lab., Richland, WA (United States)
This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all of the 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed, and it found a statistically significant factor-of-two bias on the average.
The parameters of a milling system are generally random, which influences the stability of the milling process. This paper uses a neural network to comprehensively analyze the influence of random factors in milling and proposes a method for reliability analysis of regenerative chatter stability in milling. A dynamic model of milling regenerative chatter is established, and the stability lobe diagram is obtained by the full-discretization method (FDM). The neural network is applied to approximate the functional relationship of the limit axial cutting depth; the reliability is then computed with the Monte Carlo simulation method (MCSM) and the moment method (MM), respectively. Finally, the results of an example are used to demonstrate the efficiency and accuracy of the proposed method.
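The Monte Carlo step of the proposed method — estimating the probability that a chosen axial depth of cut stays below the random stability limit — can be sketched as follows. The surrogate limit function and the parameter distributions are invented stand-ins for the paper's trained neural network and its inputs:

```python
import random

def stability_reliability(a_p, limit_fn, dists, trials=20000, seed=7):
    """Monte Carlo estimate of P(limit depth > chosen depth a_p).
    dists maps parameter names to (mean, std) of normal distributions;
    limit_fn maps a sampled parameter dict to a limit cutting depth."""
    rng = random.Random(seed)
    stable = 0
    for _ in range(trials):
        sample = {k: rng.gauss(mu, sd) for k, (mu, sd) in dists.items()}
        if limit_fn(sample) > a_p:
            stable += 1
    return stable / trials

# Toy surrogate: limit depth grows with damping ratio and stiffness.
toy_limit = lambda p: p["zeta"] * p["k"] / 1.0e4
toy_dists = {"zeta": (0.05, 0.005), "k": (2.0e6, 1.0e5)}
```

Sweeping `a_p` through this estimator reproduces the qualitative picture of the stability lobe diagram: reliability falls sharply once the chosen depth approaches the mean limit depth.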
Bender, Niels Christian; Pedersen, Henrik Clemmensen; Plöckinger, Andreas
This paper presents an analysis of a hydraulic on/off valve from a reliability point of view. The objective is to clarify the potential pitfalls of the current valve design, while identifying the component(s) exerting the most significant risk of failure during the lifetime of the valve. Specifically, the mechanical topology of Fast Switching hydraulic Valves (FSVs) is of interest, since these undergo operating cycles in the gigacycle regime during their functional lifetime. Application of these FSVs is relevant in e.g. digital displacement units, which for the specific design considered...
Wagner, S; Schmidt, R; Zerlauth, M; Vergara-Fernandez, A
In the design of interlock loops for the signal exchange in machine protection systems, the choice of the hardware architecture impacts on machine safety and availability. The reliable performance of a machine stop (leaving the machine in a safe state) in case of an emergency, is an inherent requirement. The constraints in terms of machine availability on the other hand may differ from one facility to another. Spurious machine stops, lowering machine availability, may to a certain extent be tolerated in facilities where they do not cause undue equipment wearout. In order to compare various interlock loop architectures in terms of safety and availability, the occurrence frequencies of related scenarios have been calculated in a reliability analysis, using a generic analytical model. This paper presents the results and illustrates the potential of the analysis method for supporting the choice of interlock system architectures.
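The safety/availability trade-off between interlock loop architectures can be illustrated with a two-channel model. The 1oo2/2oo2 labels and the channel-independence assumption are a textbook simplification, not the paper's generic analytical model:

```python
def loop_architecture_rates(p_dangerous, p_spurious):
    """Per-demand probabilities for two redundant-channel interlock
    architectures, assuming independent channels (illustrative model).
    '1oo2': a machine stop is executed if either channel demands it.
    '2oo2': a machine stop is executed only if both channels agree."""
    return {
        "1oo2": {
            "missed_stop": p_dangerous ** 2,            # both must fail to act
            "spurious_stop": 1 - (1 - p_spurious) ** 2,  # either false trigger
        },
        "2oo2": {
            "missed_stop": 1 - (1 - p_dangerous) ** 2,   # either failure blocks
            "spurious_stop": p_spurious ** 2,            # both must false-trigger
        },
    }
```

The numbers make the paper's point concrete: 1oo2 maximizes safety (fewest missed emergency stops) at the cost of more spurious stops, while 2oo2 does the reverse, so the right choice depends on how much spurious downtime a facility tolerates.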
This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.
Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali
In this paper, we present a method of crime analysis from open source information. We employed unsupervised data mining methods to explore the facts regarding the crimes of an area of interest. The analysis is based on well-known clustering and association techniques. The results show...
Structured Analysis for the Logistic... [record garbled in extraction; the recoverable content concerns requirements documents that relate directly to operating and support costs, with reliability parameters, appropriate for contracting purposes, stated so as to quantify readiness, maintenance actions, and the cost or quantity of parts and the demand for maintenance resources.]
Osvaldo Tadeu da Silva Junior
Abstract Aims: The behavior of laboratory animals has been studied through displacement, with different objectives, by researchers. Methods: Although different methods have already been used for tracking laboratory animals, manual-tracking-mode videogrammetry for 2D analysis of displacement has not been observed in animal studies. The aim of this study was to verify the accuracy and reliability of determining the displacement of Wistar rats by means of the videogrammetry software Dvideo. The accuracy (between the known distance and the traced distance) was determined by three different evaluators, twice consecutively, by videoing a course of 10 meters in the enriched environment together with further analysis of the displacement of the midpoint marked on an apparatus. To calculate reliability (accuracy of the measurement system) and reproducibility (precision of the evaluators), and to obtain the ratio of precision to tolerance (P/T), eight animals were filmed for 10 minutes in the enriched environment, and the distance covered by one of the animals was analyzed by three different reviewers, thrice consecutively. Results: The results obtained over the 10 meters of known distance demonstrated accuracy of 0.10 m, precision of 0.05 m, and bias of 0.07 m. In the reliability test during the 10 minutes of displacement (m) of the animal, a ratio of precision to tolerance of P/T = 0.1 m was found between the three different evaluators, demonstrating adequate capacity of the measure. Conclusion: The manual tracking mode of Dvideo presented high reliability and can be employed for displacement analysis in studies with rat experimental models.
Reliability of Missile Materiel Program: Missile Hydraulic and Pneumatic Systems Actuator Analysis. Joe C. Mitchell. Approved by: Donald R. Earles. ... amplifier. Section 4, Actuator Classification: actuators have been classified in accordance with mechanism and type (Figure 4-1). ... definition of the data already on hand. More detailed identification of those units classified only by their generic names should be attempted.
Jensen, Jacob Skibsted; Werge, Hans Henrik Malmborg; Egebo, Max
A reported analytical method for tannin quantification relies on selective precipitation of tannins with bovine serum albumin. The reliability of tannin analysis by protein precipitation in wines having variable tannin levels was evaluated by measuring the tannin concentration of various dilutions of five commercial red wines. Tannin concentrations of both very diluted and very concentrated samples were systematically underestimated, which could be explained by a precipitation threshold and insufficient protein for precipitation, respectively. Based on these findings, we have defined a valid range...
C. L. Smith; K. J. Kvarfordt; S. T. Wood
The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with
networks can be applied to better understand informal trade in developing countries, with a particular focus on Africa. The paper starts by discussing some of the fundamental concepts developed by social network analysis. Through a number of case studies, we show how social network analysis can illuminate the relevant causes of social patterns, the impact of social ties on economic performance, the diffusion of resources and information, and the exercise of power. The paper then examines some of the methodological challenges of social network analysis and how it can be combined with other...
Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kurtz, Nolan Scot [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
The majority of current societal and economic needs world-wide are met by the existing networked, civil infrastructure. Because the cost of managing such infrastructure is high and increases with time, risk-informed decision making is essential for those with management responsibilities for these systems. To address such concerns, a methodology that accounts for new information, deterioration, component models, component importance, group importance, network reliability, hierarchical structure organization, and efficiency concerns has been developed. This methodology analyzes the use of new information through the lens of adaptive Importance Sampling for structural reliability problems. Deterioration, multi-scale bridge models, and time-variant component importance are investigated for a specific network. Furthermore, both bridge and pipeline networks are studied for group and component importance, as well as for hierarchical structures in the context of specific networks. Efficiency is the primary driver throughout this study. With this risk-informed approach, those responsible for management can address deteriorating infrastructure networks in an organized manner.
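The adaptive importance-sampling idea mentioned above can be sketched in its simplest, non-adaptive form: sample from a normal density shifted toward the failure region and reweight by the exact likelihood ratio. The limit-state function, shift point, and standard-normal input space below are illustrative assumptions, not details taken from the report.

```python
import numpy as np

def importance_sampling_pf(g, center, n=50_000, seed=0):
    """Estimate P(g(U) <= 0) for standard normal U by sampling from
    N(center, I) and reweighting with the likelihood ratio
    phi(u) / phi(u - center) = exp(-u.center + |center|^2 / 2)."""
    center = np.asarray(center, dtype=float)
    rng = np.random.default_rng(seed)
    u = rng.normal(size=(n, len(center))) + center   # shifted samples
    w = np.exp(-u @ center + 0.5 * center @ center)  # importance weights
    fails = np.array([g(ui) <= 0.0 for ui in u])     # failure indicator
    return float(np.mean(fails * w))
```

Centering the sampling density near the design point makes rare failures frequent in the sample while the weights restore the correct probability; adaptive schemes refine `center` iteratively as new information arrives.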
de Marco Ario
Structural characterization of proteins used in biological experiments is largely neglected. In most publications, the information available is totally insufficient to judge the functionality of the proteins used and, therefore, the significance of identified protein-protein interactions (was the interaction specific or due to unspecific binding of misfolded protein regions?) or the reliability of kinetic and thermodynamic data (how much protein was in its native form?). As a consequence, the results of single experiments might not only become questionable, but the whole reliability of systems biology, built on these foundations, would be weakened. The introduction of Minimal Information concerning purified proteins, added as metadata to the main body of a manuscript, would make it straightforward to assess their functional and structural qualities and, consequently, the results obtained using these proteins. Furthermore, accepted standards for protein annotation would simplify data comparison and exchange. This article has been envisaged as a proposal for aggregating scientists who share the opinion that the scientific community needs a platform for Minimum Information for Protein Functionality Evaluation (MIPFE).
Flood defence assets such as earth embankments comprise a vital part of linear flood defences in many countries, including the UK, and protect inland areas from flooding. The risks of flooding are likely to increase in the future due to increasing pressure on land use, more frequent extreme rainfall events, and rising sea levels caused by climate change, which also affect aging flood defence assets. Therefore, it is important that flood defence assets are maintained at a high level of safety and serviceability. The high costs associated with preserving these deteriorating flood defence assets and the limited funds available for their maintenance require the development of systematic approaches to ensure a sustainable flood-risk management system. The integration of realistic deterioration measurement and reliability-based performance assessment techniques has tremendous potential for the structural safety and economic feasibility of flood defence assets. Therefore, the need for reliability-based performance assessment is evident. However, investigations on time-dependent reliability analysis of flood defence assets are limited. This paper presents a novel approach for time-dependent reliability analysis of flood defence assets. In the analysis, a time-dependent fragility curve is developed by using a state-based stochastic deterioration model. The applicability of the proposed approach is then demonstrated with a case study.
Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, since the failure probability can then be obtained by integration over the failure domain. However, efficiently estimating the PDF is still an open problem. The existing fractional-moment-based maximum entropy method provides a very advanced approach to PDF estimation, but its main shortcoming is that it limits the reliability analysis to structures with independent inputs. In fact, structures with correlated inputs are common in engineering. This paper therefore improves the maximum entropy method and applies the Unscented Transformation (UT) technique to compute the fractional moments of the performance function for structures with correlated inputs; UT is a very efficient moment estimation method for models with any type of input. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlations. Moreover, the number of function evaluations required by the proposed method in reliability analysis, which is determined by UT, is very small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
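The UT moment-estimation step described above can be sketched for correlated Gaussian inputs: propagate 2n+1 sigma points through the performance function and form weighted sums. The Gaussian-input assumption, the kappa scaling, and the function names are illustrative, not the paper's exact formulation.

```python
import numpy as np

def ut_fractional_moments(g, mu, cov, alphas, kappa=1.0):
    """Estimate fractional moments E[|g(X)|^a] of a performance function g
    for correlated inputs X ~ N(mu, cov) via the unscented transformation:
    2n+1 sigma points built from a Cholesky factor of (n+kappa)*cov."""
    mu = np.asarray(mu, dtype=float)
    n = len(mu)
    L = np.linalg.cholesky((n + kappa) * np.asarray(cov, dtype=float))
    pts = [mu] + [mu + L[:, i] for i in range(n)] + [mu - L[:, i] for i in range(n)]
    w = np.r_[kappa / (n + kappa), np.full(2 * n, 0.5 / (n + kappa))]
    vals = np.array([g(p) for p in pts])          # one g-call per sigma point
    return {a: float(np.sum(w * np.abs(vals) ** a)) for a in alphas}
```

Only 2n+1 evaluations of `g` are needed, which is where the small function-evaluation count cited in the abstract comes from.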
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)
Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
Jin, Young Ho; Choi, Sun Yeong; Yang, Joon Eon [Korea Atomic Energy Research Institute, Taejon (Korea)
The EDG (Emergency Diesel Generator) of a nuclear power plant is one of the most important pieces of equipment for mitigating accidents. The FT (Fault Tree) method is widely used to assess the reliability of safety systems such as an EDG in a nuclear power plant. This method, however, has limitations in exactly modeling the dynamic features of safety systems. We have therefore developed a Markov model to represent the stochastic process of dynamic systems whose states change over time. The Markov model enables us to develop a dynamic reliability model of the EDG. This model can represent all possible states of the EDG, in contrast to the FRANTIC code developed by the U.S. NRC for the reliability analysis of standby systems. To assess the regulation policy for the test interval, we performed two simulations based on generic data and on plant-specific data of YGN 3, respectively, using the developed model. We also estimate the effects of various repair rates and of the fraction of starting failures caused by demand shock on the reliability of the EDG. Finally, the aging effect is analyzed. (author). 23 refs., 19 figs., 9 tabs.
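A minimal two-state Markov availability model illustrates the stochastic-process idea: the state probabilities evolve as p'(t) = p(t)Q with a generator matrix Q. The rates and the forward-Euler solver are illustrative assumptions; the paper's EDG model has many more states (standby, test, repair, demand shock).

```python
import numpy as np

def edg_availability(lam, mu, t, steps=20_000):
    """Transient availability of a repairable component: state 0 =
    available, state 1 = failed; lam = failure rate, mu = repair rate.
    Integrates p'(t) = p(t) Q with forward Euler from p(0) = (1, 0)."""
    Q = np.array([[-lam, lam],
                  [mu, -mu]])
    p = np.array([1.0, 0.0])
    dt = t / steps
    for _ in range(steps):
        p = p + dt * (p @ Q)
    return p[0]  # probability the component is available at time t
```

For this two-state case a closed form exists, A(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu) t), which is useful for checking the numerical solve.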
Murilo Leite Alcantara
The sanitizing industry has had a growing importance in the Brazilian industrial sector. One of the most important steps for most sanitizing industries is the bottle production process. These bottles are usually produced through polymer processing methods such as extrusion and blow molding. In order to increase industrial production safely, it is necessary to use methods that reduce the occurrence of failures. The study of production system reliability can be used as a tool to understand and predict the failure behavior of industrial units. This paper studies the reliability behavior of a sanitizing industry with a focus on bottle production. The study was based on three modeling methodologies: Global Life Distribution (GLD), Composite Life Distribution (CLD), and Optimum Composite Life Distribution (OCLD). The distributions used were: exponential, Weibull, normal, lognormal, q-exponential, and q-Weibull. The reliability behaviors found with these three methodologies were compared with the reliability obtained experimentally. The OCLD methodology overcomes the limitations of both the GLD and CLD methodologies while inheriting their strengths, and it resulted in a good representation of the subsystems as well as of the overall system behavior. This paper also analyzes the contribution of the subsystems to the overall probability of system failure. The outgoing profit was calculated, and it is a potential indicator of the financial benefit that could be obtained with future developments of this study.
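For reference, two of the life distributions named above have simple closed-form reliability functions. The q-exponential parameterization shown (Tsallis form) is one common convention and is an assumption here, not necessarily the exact form used in the paper.

```python
import math

def weibull_reliability(t, eta, beta):
    """Weibull reliability: R(t) = exp(-(t/eta)**beta),
    with scale eta and shape beta."""
    return math.exp(-(t / eta) ** beta)

def q_exponential_reliability(t, q, eta):
    """q-exponential reliability in the Tsallis parameterization:
    R(t) = [1 - (1-q)*t/eta]**(1/(1-q)), recovering exp(-t/eta) as q -> 1."""
    base = 1.0 - (1.0 - q) * t / eta
    return max(base, 0.0) ** (1.0 / (1.0 - q))
```

The q-family generalizes the exponential: q < 1 gives a bounded support and q > 1 a heavy tail, which is why such distributions can fit failure data that a plain exponential cannot.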
Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián
The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and to explore its usefulness for quality improvement. An observational cross-sectional study was conducted at the National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimal variance. Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
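The G coefficient reported above follows directly from variance components. A simplified person-by-station sketch (ignoring the site and version facets, which had minimal variance) shows the calculation; the variance values in the test are invented for illustration only.

```python
def g_coefficient(var_person, var_residual, n_stations):
    """Generalizability (G) coefficient for a person-by-station design,
    keeping only the person and residual variance components:
    G = var_person / (var_person + var_residual / n_stations)."""
    relative_error = var_residual / n_stations
    return var_person / (var_person + relative_error)
```

Because relative error shrinks with the number of stations, decision makers can use this formula in reverse to pick the station count needed for a target reliability.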
Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack
Canonical correlation analysis (CCA) is an established multivariate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear combinations with the information-theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second-order statistics. While CCA is ideal for Gaussian data, CIA facilitates analysis of variables with different genesis and therefore different statistical distributions and different modalities. As a proof of concept we give a toy example. We also give an example with one (weather radar based) variable in the one set and eight spectral bands of optical satellite data...
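The substitution of mutual information for correlation can be illustrated with a basic histogram MI estimator; the binning scheme here is a simplification, and the authors' estimator may differ.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based estimate of I(X;Y) in nats:
    sum over cells of p(x,y) * log(p(x,y) / (p(x) p(y)))."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)  # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)  # marginal of Y, shape (1, bins)
    nz = pxy > 0                         # skip empty cells (0 log 0 = 0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))
```

Unlike correlation, this quantity is nonzero for any statistical dependence, linear or not, which is what lets CIA handle variables of different genesis.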
Spiers, Harry; Amin, Nikul; Lakhani, Raj; Martin, Andrew J; Patel, Parag M
The aim of this study is to objectively assess the quality and readability of websites related to vestibular schwannomas. Patients are increasingly seeking information on confirmed or suspected diagnoses through the Internet. Clinicians are often concerned regarding the accuracy, quality, and readability of web-based sites. Online information relating to vestibular schwannoma was searched using the three most popular search engines. The terms "acoustic neuroma" and "vestibular schwannoma" were used. The top 50 results from each site were assessed for readability using the Flesch-Kincaid Grade Level, Flesch Reading Ease Score, and the Gunning-Fog Index. Quality of website information was scored using the DISCERN tool. Of 300 search results analyzed, 58 separate appropriate websites were identified. The mean readability score using the Flesch-Kincaid Grade Level was 10.27 (95% confidence interval [CI] 9.84-10.70). The mean Flesch Reading Ease Score was 48.75 (95% CI 46.57-50.92). The mean Gunning-Fog Index was 13.40 (95% CI 12.92-13.89). These scores equate to the reading level of someone finishing secondary school or in the first year of university. DISCERN scores consistently demonstrated great variability in the quality of information. Online patient information on vestibular schwannoma is highly variable in quality. Although a wide range of websites on the condition and its treatment options is easily available to patients, the information is written at a difficult level that may exceed the reading ability of many patients.
Current evidence reveals a continuing upward trend in the misuse of illicit drugs in Slovenia. However, the science of estimating the prevalence of drug abuse and related problems is still undeveloped. Because of current data gathering practices, the data that are available are often of poor quality. In this paper the author describes two methods for estimating the prevalence of heroin abuse, the key informant approach and the nomination technique, which were used because there were no other reliable sources of information. These methods produced estimates and brought to light a number of problems that researchers would have to solve in their pursuit of more reliable, relevant and useful data. However, speculating about the extent of illicit drug use in the country is still problematic. Basic data collection and analyses at the national level must be improved. It is of vital importance to develop strategies and methods for obtaining estimates and thus more adequate information on which to base demand reduction strategies, to increase the number of epidemiologists and to establish a central information unit in the country.
Tretjakovs, S; Paramonovs, J
The scientific article addresses the dependence of aircraft fleet safety on the human factor. The article demonstrates the significance of information exchange concerning the open fatigue cracks, which is necessary to bring a new type of aircraft into operation. The article provides numerical examples obtained by means of the Monte Carlo method and considers the dependences of failure probability on various factors.
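The Monte Carlo approach mentioned above can be sketched as a crude load-versus-strength simulation; the distributions and parameters below are invented for illustration and are not the article's fatigue-crack model.

```python
import numpy as np

def failure_probability(n=200_000, seed=0):
    """Crude Monte Carlo estimate of P(load > strength): draw paired
    samples of a hypothetical residual strength (lognormal) and an
    extreme load (Gumbel) and count exceedances."""
    rng = np.random.default_rng(seed)
    strength = rng.lognormal(mean=5.0, sigma=0.1, size=n)  # ~148 median
    load = rng.gumbel(loc=100.0, scale=10.0, size=n)
    return float(np.mean(load > strength))
```

Repeating such runs while varying one input at a time is how the dependence of failure probability on individual factors (e.g., inspection intervals or crack-information exchange) can be mapped out.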
This book gives a practical guide for designers and users in the Information and Communication Technology (ICT) context. In the first section, the definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3; the aim is to evaluate the performance of components and systems and reliability growth. Chapter 4, by introducing laboratory tests, highlights the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be
Margolis, Katherine A; Bernat, Jennifer K; Keely O'Brien, Erin; Delahanty, Janine C
Tobacco products and smoke contain more than 7000 chemicals (ie, constituents). Research shows that consumers have poor understanding of tobacco constituents and find communication about them to be confusing. The current content analysis describes how information about tobacco constituents is communicated online in terms of source, target audience, and message. A search was conducted in September 2015 using tobacco constituent and tobacco terms and identified 226 relevant Web sites for coding. Web sites were coded for type, target audience, reading level, constituent information, type of tobacco product, health effects, and emotional valence by two coders who independently coded half of the sample. There was a 20% overlap to assess interrater reliability, which was high (κ = .83, p < .001). The mean reading grade level of information online was 8.2 (SD = 2.8), with 81.7% of Web sites above the sixth grade reading level. Nearly all Web sites presented information in a qualitative narrative format (93%), and almost half (48.2%) presented information in a quantitative format. Nicotine (59.3%) and nitrosamines (28.8%) were the most frequently mentioned tobacco constituents. Cancer was the most frequently mentioned health effect (51.3%). Nearly a quarter (23%) of the Web sites did not explicitly state that tobacco constituents or tobacco products are associated with health effects. Large gaps exist in online information about tobacco constituents, including incomplete information about tobacco constituent-related health effects and limited information about tobacco products other than cigarettes and smokeless tobacco. This study highlights opportunities to improve the content and presentation of information related to tobacco constituents. The US Food and Drug Administration (FDA) is required to publicly display a list of tobacco constituents in tobacco products and tobacco smoke by brand. However, little is known about tobacco constituent information available to the
Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist
The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...... to analyze the uncertainty of model predictions. This allows judging the fitness of the model to the purpose under uncertainty. Hence we recommend uncertainty analysis as a proactive solution when faced with model uncertainty, which is the case for biofuel process development research....
Fuller-Tyszkiewicz, Matthew; Hartley-Clark, Linda; Cummins, Robert A; Tomyn, Adrian J; Weinberg, Melissa K; Richardson, Ben
The past 2 decades have seen increasing use of experience sampling methods (ESMs) to gain insights into the daily experience of affective states (e.g., its variability, as well as antecedents and consequences of temporary shifts in affect). Much less attention has been given to methodological challenges, such as how to ensure reliability of test scores obtained using ESM. The present study demonstrates the use of dynamic factor analysis (DFA) to quantify reliability of test scores in ESM contexts, evaluates the potential impact of unreliable test scores, and seeks to identify characteristics of individuals that may account for their unreliable test scores. One hundred twenty-seven participants completed baseline measures (demographics and personality traits), followed by a 7-day ESM phase in which positive and negative state affect were measured up to 6 times per day. Analyses showed that although at the sample level, scores on these affect measures exhibited adequate levels of reliability, up to one third of participants failed to meet conventional standards of reliability. While these low reliability estimates were not significantly associated with personality factors, they could, in some cases, be explained by model misspecification where a meaningful alternative structure was available. Despite these potential differences in factor structure across participants, subsequent modeling with and without these "unreliable" cases showed similar substantive results. Hence, the present findings suggest typical analyses based on ESM data may be robust to individual differences in data structure and/or quality. Ways to augment the DFA approach to better understand unreliable cases are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Fonseca, Renato Alves; Alvarenga, Marco Antonio Bayout; Gibelli, Sonia Maria Orlando [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)]. E-mails: firstname.lastname@example.org; email@example.com; firstname.lastname@example.org; Alvim, Antonio Carlos Marques; Frutuoso e Melo, Paulo Fernando Ferreira [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil)]. E-mails: Alvim@con.ufrj.br; email@example.com
The main purpose of this work is to perform a human reliability analysis using the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Event Analysis) methodologies, as well as their application to the development of qualitative and quantitative analysis of a nuclear power plant accident. The accident selected was the one that occurred at the Three Mile Island (TMI) Unit 2 Pressurized Water Reactor (PWR) nuclear power plant. The accident analysis has revealed a series of unsafe actions that resulted in permanent loss of the unit. This study also aims at enhancing the understanding of the THERP and ATHEANA methodologies and their possible interactions in practical applications. The TMI accident analysis has pointed out the possibility of integrating the THERP and ATHEANA methodologies. In this work, the integration between both methodologies is developed in a way that allows a better understanding of the influence of operational context on human errors. (author)
Brezgin, V. I.; Brodov, Yu M.; Kultishev, A. Yu
The report reviews methods for improving the design and operation of steam turbine units based on the application of modern information technologies. In accordance with the life-cycle support methodology, a conceptual model of the information support system for the main life-cycle (LC) stages of a steam turbine unit is suggested. A classification system is proposed that ensures the creation of sustainable information links between the engineering team (manufacturer's plant) and customer organizations (power plants). Within the report, the principle of extending parameterization beyond geometric constructions in the design and improvement of steam turbine unit equipment is proposed, studied, and justified. The report presents a design methodology for steam turbine unit equipment based on a new oil-cooler design system that has been developed and implemented by the authors. This design system combines a construction subsystem, characterized by extensive use of family tables and templates, and a computation subsystem, which includes a methodology for thermal-hydraulic zone-by-zone design calculations of oil coolers. The report also presents data on the developed software for operational monitoring and assessment of equipment parameters, as well as its implementation at five power plants.
AVRAM (BOITOŞ), CAMELIA
Since lending occupies the primary role in any banking institution, its study and analysis must follow a number of steps at both the microeconomic and the macroeconomic level. This theme, which aims at a thorough study of clients' reliability in order to highlight the risks existing in banking activity in general, and in lending in particular, becomes all the more interesting as its treatment across the member states of the European Union tends towards a common approach. Starting from these premises, we have considered it extremely important to select a research theme that addresses the risks in lending, emphasizing the analysis of clients' reliability both in the credit decision, in order to assume the credit risk, and in determining financial and banking performance. Given the complexity of the risks existing in banking activity in general, and of credit risk in particular, clients' reliability represents an extremely important field of research and application, bearing in mind both the current stage of development of the Romanian banking system and the attempts to align it with the requirements imposed by the European Union.
Lannoy, A.; Nitoi, M.; Backstrom, O.; Burgazzi, L.; Couallier, V.; Nikulin, M.; Derode, A.; Rodionov, A.; Atwood, C.; Fradet, F.; Antonov, A.; Berezhnoy, A.; Choi, S.Y.; Starr, F.; Dawson, J.; Palmen, H.; Clerjaud, L
The purpose of the workshop was to present the experience of practical application of time-dependent reliability models. The program of the workshop comprised the following sessions: aging management and aging PSA (Probabilistic Safety Assessment); modeling; operating experience; and accelerated aging tests. In order to introduce the time-aging effect of a particular component into the PSA model, it has been proposed to use constant unavailability values over a short period of time (one year, for example) calculated on the basis of age-dependent reliability models. As for modeling, it appears that the problem with overly detailed statistical models is the lack of data for the required parameters. As for operating experience, several methods of operating-experience analysis were presented (algorithms for reliability data elaboration and statistical identification of aging trends). As for accelerated aging tests, it is demonstrated that a combination of operating-experience analysis with the results of accelerated aging tests of naturally aged equipment could provide a good basis for the continued operation of instrumentation and control systems.
Montani, S. [Dipartimento di Informatica, Universita del Piemonte Orientale, Via Bellini 25g, 15100 Alessandria (Italy)], E-mail: firstname.lastname@example.org; Portinale, L. [Dipartimento di Informatica, Universita del Piemonte Orientale, Via Bellini 25g, 15100 Alessandria (Italy)], E-mail: email@example.com; Bobbio, A. [Dipartimento di Informatica, Universita del Piemonte Orientale, Via Bellini 25g, 15100 Alessandria (Italy)], E-mail: firstname.lastname@example.org; Codetta-Raiteri, D. [Dipartimento di Informatica, Universita del Piemonte Orientale, Via Bellini 25g, 15100 Alessandria (Italy)], E-mail: email@example.com
In this paper, we present RADYBAN (Reliability Analysis with DYnamic BAyesian Networks), a software tool which allows a dynamic fault tree to be analyzed by converting it into a dynamic Bayesian network. The tool implements a modular algorithm for automatically translating a dynamic fault tree into the corresponding dynamic Bayesian network and exploits classical inference algorithms on dynamic Bayesian networks in order to compute reliability measures. After describing the basic features of the tool, we show how it operates on a real-world example, and we compare the unreliability results it generates with those returned by other methodologies, in order to verify the correctness and consistency of the results obtained.
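As a toy illustration of time-slice evaluation (not RADYBAN's actual algorithm), the cumulative unreliability of a static AND gate over two identical, independent components can be stepped slice by slice, the way a two-slice temporal model advances its state distribution:

```python
def and_gate_unreliability(p_step, steps):
    """Discrete-time sketch: q is the cumulative failure probability
    of one component, advanced one time slice at a time; the AND gate
    requires both (independent, identical) components to have failed."""
    q = 0.0
    for _ in range(steps):
        q = q + (1.0 - q) * p_step  # P(failed by next slice)
    return q * q                    # AND of two independent components
```

The recursion reproduces the closed form 1 - (1 - p)^steps per component; dynamic gates (spares, functional dependencies) are what force the full DBN machinery.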
Frier, Christian; Sørensen, John Dalsgaard
For many reinforced concrete structures, corrosion of the reinforcement is an important problem, since it can result in maintenance and repair actions. Further, a reduction of the load-bearing capacity can occur. In the present paper the Finite Element Reliability Method (FERM) is employed for obtaining the probability of exceeding a critical chloride concentration level at the reinforcement bars, both using Monte Carlo Simulation (MCS) and the First Order Reliability Method (FORM). The chloride ingress is modelled by the Finite Element Method (FEM), and the diffusion coefficient, surface chloride concentration, and reinforcement cover depth are modelled by stochastic fields, which are discretized using the Expansion Optimum Linear Estimation (EOLE) approach. The response gradients needed for FORM analysis are derived analytically using the Direct Differentiation Method (DDM). As an example, a bridge pier...
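The limit state in such analyses is typically built on Fick's second law of diffusion; a closed-form sketch of the one-dimensional chloride profile (the paper's FEM model with stochastic fields generalizes this) is:

```python
import math

def chloride_concentration(cs, D, x, t):
    """One-dimensional Fick's-second-law solution for chloride ingress:
    C(x, t) = Cs * (1 - erf(x / (2 * sqrt(D * t)))), with surface
    concentration Cs, diffusion coefficient D, depth x, and time t."""
    return cs * (1.0 - math.erf(x / (2.0 * math.sqrt(D * t))))
```

The limit-state function is then g = C_crit - C(cover, t); randomizing Cs, D, and the cover depth and evaluating P(g <= 0) is what MCS and FORM do in the paper's setting.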
This paper proposes a data-dependent reliability evaluation methodology for digital systems described at Register Transfer Level (RTL). It uses a hybrid hierarchical approach, combining the accuracy provided by Gate Level (GL) Simulated Fault Injection (SFI) and the low simulation overhead required by RTL fault injection. The methodology comprises the following steps: correct simulation of the RTL system according to a set of input vectors, hierarchical decomposition of the system into basic RTL blocks, logic synthesis of the basic RTL blocks, data-dependent SFI for the GL netlists, and RTL SFI. The proposed methodology has been validated in terms of accuracy on a medium-sized circuit: the parallel comparator used in the Check Node Unit (CNU) of Low-Density Parity-Check (LDPC) decoders. The methodology has been applied to the reliability analysis of a 128-bit Advanced Encryption Standard (AES) crypto-core, for which GL simulation was prohibitive in terms of required computational resources.
Dimitrov, Nikolay Krasimirov; Bitsche, Robert; Blasques, José Pedro Albergaria Amaral
This paper presents a methodology for structural reliability analysis of wind turbine blades. The study introduces several novel elements by taking into account loading direction using a multiaxial probabilistic load model, considering random material strength, spatial correlation between material properties, progressive material failure, and system reliability effects. An example analysis of reliability against material failure is demonstrated for a blade cross section. Based on the study we discuss the implications of using a system reliability approach, the effect of spatial correlation length, type of material degradation algorithm, and reliability methods on the system failure probability, as well as the main factors that have an influence on the reliability. (C) 2017 Elsevier Ltd. All rights reserved.
Brink, Yolandi; Louw, Quinette; Grimmer, Karen; Schreve, Kristiaan; van der Westhuizen, Gareth; Jordaan, Esmè
The lack of a clear understanding of the association between sitting posture and adolescent musculoskeletal pain might reflect invalid and/or unreliable posture measurement instruments. The psychometric properties of any new measurement instrument should be demonstrated prior to its use for research or clinical purposes. This paper describes psychometric testing of a new three-dimensional (3D), portable, non-invasive posture analysis tool (3D-PAT), from sequential studies using a mannequin and high school students. The first study compared the 3D (X-, Y- and Z-) coordinates of reflective markers placed on a mannequin using the 3D-PAT and the Vicon motion analysis system. This study also tested the reliability of taking repeated measures of the 3D coordinates of the reflective markers. The second study determined the concurrent validity and test-retest reliability of the 3D-PAT measurements of nine sitting postural angles of high school students undertaking a standard computing task. In both studies, concordance correlation coefficients and intraclass correlation coefficients described test-retest reliability, whilst Pearson product-moment correlation coefficients and Bland-Altman plots demonstrated concurrent validity. The 3D-PAT provides reliable and valid 3D measurements of five of the nine postural angles (head flexion, neck flexion, cranio-cervical angle, trunk flexion and head lateral bending) in adolescents undertaking a standard task. The 3D-PAT is therefore appropriate for research and clinical settings to measure five upper-quadrant postural angles in three dimensions. As a measurement instrument it can provide further understanding of the relationship between sitting posture, changes to sitting posture and adolescent musculoskeletal pain.
We generalize the point information gain (PIG) and derived quantities, i.e., the point information gain entropy (PIE) and the point information gain entropy density (PIED), to the case of the Rényi entropy, and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for real data using several images as examples, and discuss further possible applications in other fields of data processing.
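The core quantity can be sketched in a few lines: the Rényi entropy of an empirical distribution, and the entropy change caused by discarding one occurrence of a point. The sign convention below is an assumption for illustration, not necessarily the paper's:

```python
import math
from collections import Counter

def renyi_entropy(counts, alpha):
    """Rényi entropy (bits) of the empirical distribution given by counts."""
    n = sum(counts)
    if alpha == 1.0:  # Shannon limit
        return -sum(c / n * math.log2(c / n) for c in counts if c)
    return math.log2(sum((c / n) ** alpha for c in counts if c)) / (1 - alpha)

def point_information_gain(data, x, alpha=2.0):
    """Entropy change from discarding one occurrence of x (sign convention assumed)."""
    full = Counter(data)
    reduced = full.copy()
    reduced[x] -= 1
    return (renyi_entropy([v for v in reduced.values() if v], alpha)
            - renyi_entropy(list(full.values()), alpha))

data = list("aaaabbbc")
# removing a rare point changes the entropy differently than removing a frequent one
print(point_information_gain(data, "c"), point_information_gain(data, "a"))
```

Evaluating this for every point of a dataset (or every pixel of an image) yields the PIG spectrum from which PIE and PIED are aggregated.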
Barbaranelli, Claudio; Lee, Christopher S.; Vellone, Ercole; Riegel, Barbara
The Self-Care of Heart Failure Index (SCHFI) is used widely, but issues with reliability have been evident. Cronbach alpha coefficient is usually used to assess reliability, but this approach assumes a unidimensional scale. The purpose of this article is to address the dimensionality and internal consistency reliability of the SCHFI. This was a secondary analysis of data from 629 adults with heart failure enrolled in three separate studies conducted in the northeastern and northwestern United States. Following testing for scale dimensionality using confirmatory factor analysis, reliability was tested using coefficient alpha and alternative options. Confirmatory factor analysis demonstrated that: a) the self-care maintenance scale has a multidimensional 4-factor structure; b) the self-care management scale has a 2-factor structure, but the primary factors loaded on a common higher-order factor; and c) the self-care confidence scale is unidimensional. Reliability estimates for the three scales, obtained with methods compatible with each scale’s dimensionality, were adequate or high. The results of the analysis demonstrate that issues of dimensionality and reliability cannot be separated. Appropriate estimates of reliability that are consistent with the dimensionality of the scale must be used. In the case of the SCHFI, coefficient alpha should not be used to assess reliability of the self-care maintenance and the self-care management scales, due to their multidimensionality. We recommend testing dimensionality before assessing reliability, as well using multiple indices of reliability, such as model-based internal consistency, composite reliability, and omega and maximal reliability coefficients.
… design issues that required detailed analysis of analog effects by circuit simulations that modeled parasitic resistances and capacitances. While large … for the configuration memory and appropriate spacing and shielding for the interconnect lines in the AFPGA. Extensive SPICE simulations predict that … adopted by the ASIC industry. As a result, this tool has broad support for commercial hardware description languages such as Verilog and VHDL.
In the conventional probabilistic safety assessment (PSA), a conservative approach was used to estimate the success criteria time windows of operator actions. The current PSA standard recommends the use of best-estimate codes. The purpose of the study was to estimate the operator action success criteria time windows in scenarios in which the human actions supplement safety system actuations, as needed for an updated human reliability analysis (HRA). For the calculations, the RELAP5/MOD3.3 best-estimate thermal-hydraulic computer code and a qualified RELAP5 input model representing a two-loop Westinghouse pressurized water reactor were used. The results of the deterministic safety analysis were examined to determine the latest time at which the operator action can be performed while still satisfying the safety criteria. The results showed that, for human reliability analysis, uncertainty analysis of the realistic calculation is in general not needed when additional time is available and/or the event is not a significant contributor to the risk.
The main objective of the paper is to develop a methodology, named vague Lambda-Tau, for the reliability analysis of repairable systems. A Petri net tool is applied to represent the asynchronous and concurrent processing of the system instead of fault tree analysis. To enhance the relevance of the reliability study, vague set theory is used for representing the failure rates and repair times instead of classical (crisp) or fuzzy set theory, because vague sets are characterized by a truth-membership function and a false-membership (non-membership) function such that the sum of both values is less than 1. The proposed methodology involves qualitative modeling using PN and quantitative analysis using the Lambda-Tau method of solution, with the basic events represented by intuitionistic fuzzy numbers with triangular membership functions. Sensitivity analysis has also been performed and the effects on system MTBF are addressed. The methodology improves upon the shortcomings of the existing probabilistic approaches and gives a better understanding of the system behavior through its graphical representation. The washing unit of a paper mill situated in a northern part of India, producing approximately 200 tons of paper per day, has been considered to demonstrate the proposed approach. The results may be helpful for plant personnel in analyzing the system's behavior and improving its performance by adopting suitable maintenance strategies. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
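The flavor of the quantitative step can be sketched with the classical Lambda-Tau OR-gate expressions, λ_OR = λ1 + λ2 and τ_OR = (λ1·τ1 + λ2·τ2)/(λ1 + λ2), evaluated on α-cuts of triangular fuzzy numbers (a, m, b). All numbers are illustrative, and the sketch uses ordinary fuzzy numbers only: the paper's vague (intuitionistic) sets carry a separate non-membership function that is omitted here:

```python
from itertools import product

def alpha_cut(tri, alpha):
    """Interval [left, right] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def or_gate(lam1, tau1, lam2, tau2, alpha):
    """Alpha-cut intervals of lambda_OR and tau_OR via endpoint (vertex) evaluation."""
    L1, T1, L2, T2 = (alpha_cut(x, alpha) for x in (lam1, tau1, lam2, tau2))
    lam_vals, tau_vals = [], []
    for l1, t1, l2, t2 in product(L1, T1, L2, T2):
        lam_vals.append(l1 + l2)
        tau_vals.append((l1 * t1 + l2 * t2) / (l1 + l2))
    return (min(lam_vals), max(lam_vals)), (min(tau_vals), max(tau_vals))

# triangular failure rates [1/h] and repair times [h], illustrative numbers
lam_cut, tau_cut = or_gate((1e-3, 2e-3, 3e-3), (1.0, 2.0, 3.0),
                           (2e-3, 3e-3, 4e-3), (2.0, 4.0, 6.0), alpha=0.5)
print(lam_cut, tau_cut)
```

Endpoint evaluation is exact here because both expressions are monotone in each variable; sweeping α from 0 to 1 reconstructs the fuzzy output membership functions.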
Rainwater harvesting (RWH) may be an effective alternative water supply solution in regions affected by water scarcity. It has recently become a particularly important option in arid and semi-arid areas (like Mediterranean basins), mostly because of its many benefits and affordable costs. This study analyzes the reliability of using a rainwater harvesting system to supply water for toilet flushing and garden irrigation, with reference to a single-family home in a residential area of Sicily (Southern Italy). A flushing water demand pattern was evaluated using water consumption data collected from a sample of residential customers during an extended measurement campaign. A daily water balance simulation of the rainwater storage tank was performed, and the yield-after-spillage algorithm was used to define the tank release rule. The model's performance was evaluated using rainfall data from more than 100 sites located throughout the Sicilian territory. This regional analysis provided annual reliability curves for the system as a function of mean annual precipitation, which have practical applications in this area of study. The uncertainty related to the regional model predictions was also assessed. A cost-benefit analysis highlighted that the implementation of a rainwater harvesting system in Sicily can provide environmental and economic advantages over traditional water supply methods. In particular, the regional analysis identified areas where the application of this system would be most effective.
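The yield-after-spillage (YAS) release rule can be sketched in a few lines; the inflow series, demand, and tank capacity below are invented, whereas the study uses measured demand patterns and long rainfall records:

```python
# Daily water-balance simulation of a rainwater tank under the
# yield-after-spillage (YAS) operating rule:
#   yield   Y_t = min(D_t, V_{t-1})
#   storage V_t = min(V_{t-1} + Q_t, S) - Y_t
def simulate_yas(inflows, demand, capacity):
    v, supplied, demanded = 0.0, 0.0, 0.0
    for q in inflows:
        y = min(demand, v)             # today's yield comes from yesterday's storage
        v = min(v + q, capacity) - y   # spill first, then draw the yield
        supplied += y
        demanded += demand
    return supplied / demanded         # volumetric reliability

inflow = [0.0, 120.0, 0.0, 0.0, 80.0, 0.0, 0.0, 0.0]   # litres/day, invented
print(f"volumetric reliability: {simulate_yas(inflow, demand=30.0, capacity=100.0):.2f}")
```

Running such a balance over a long daily rainfall record, for a grid of tank sizes and sites, is what produces the regional reliability curves described above.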
To better understand information about human health contained in databases, we analyzed three datasets collected for different purposes in Canada: a biomedical database of older adults, a large population survey across all adult ages, and vital statistics. Redundancy in the variables was established, and this led us to derive a generalized (macroscopic) state variable, a fitness/frailty index that reflects both individual and group health status. Evaluation of the relationship between fitness/frailty and the mortality rate revealed that the latter could be expressed in terms of variables generally available from any cross-sectional database. In practical terms, this means that the risk of mortality might readily be assessed from standard biomedical appraisals collected for other purposes.
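A deficit-accumulation index of this kind is commonly computed as the fraction of measured health variables on which a person shows a deficit; a minimal sketch with invented variables and codings (real indices typically use 30 or more variables):

```python
# A deficit-accumulation fitness/frailty index: the fraction of measured
# health variables on which an individual shows a deficit.
def frailty_index(deficits):
    """deficits: mapping of variable -> 0 (no deficit) .. 1 (full deficit)."""
    return sum(deficits.values()) / len(deficits)

# invented example individual
person = {"needs_help_dressing": 1, "impaired_vision": 0.5, "diabetes": 1,
          "low_mood": 0, "slow_gait": 1, "memory_complaint": 0}
print(f"frailty index: {frailty_index(person):.2f}")   # 3.5 deficits / 6 variables
```

Because the index only needs the proportion of deficits, not any particular variable, it can be computed from almost any cross-sectional health database, which is what makes the mortality relationship described above portable.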
Xavier-Elsas, Pedro; Bastos, Sandra Epifânio; Gaspar-Elsas, Maria Ignez C
To determine whether online diffusion of the "Ten Warning Signs of Primary Immunodeficiency Diseases (PID)" adheres to accepted scientific standards, we analyzed the reproducibility of the online diffusion of this unique instrument, created by the Jeffrey Modell Foundation (JMF), through Google-assisted searches among highly visited sites from professional, academic and scientific organizations; governmental agencies; and patient support/advocacy organizations. We examined the diffusion, consistency of use and adequate referencing of the instrument. Where applicable, variant versions of the instrument were examined for changes in factual content that would have a practical impact on physicians or on patients and their families. Among the first 100 sites identified by Google search, 85 faithfully reproduced the JMF model and correctly referenced its source. By contrast, the other 15 also referenced the JMF source but presented one or more changes in content relative to their purported model, and therefore represent uncontrolled variants of unknown origin. Discrepancies identified in the latter included changes (C) in the factual content of the original JMF list, as well as removal (R) and introduction (I) of novel signs (Table 2), all made without reference to any scientific publications that might account for the drastic changes in factual content. Factual changes include changes in the number of infectious episodes considered necessary to raise suspicion of PID, as well as the inclusion of various medical conditions not mentioned in the original. Together, these changes will affect the way physicians use the instrument to consult or to inform patients, and the way patients and families think about the need for specialist consultation in view of a possible PID diagnosis. The retrieved adaptations and variants, which significantly depart from the original instrument, raise concerns about standards for scientific information provided online to
Perge, János A; Zhang, Shaomin; Malik, Wasim Q; Homer, Mark L; Cash, Sydney; Friehs, Gerhard; Eskandar, Emad N; Donoghue, John P; Hochberg, Leigh R
Action potentials and local field potentials (LFPs) recorded in primary motor cortex contain information about the direction of movement. LFPs are assumed to be more robust to signal instabilities than action potentials, which makes LFPs, along with action potentials, a promising signal source for brain-computer interface applications. Still, relatively little research has directly compared the utility of LFPs to action potentials in decoding movement direction in human motor cortex. We conducted intracortical multi-electrode recordings in motor cortex of two persons (T2 and [S3]) as they performed a motor imagery task. We then compared the offline decoding performance of LFPs and spiking extracted from the same data recorded across a one-year period in each participant. We obtained offline prediction accuracy of movement direction and endpoint velocity in multiple LFP bands, with the best performance in the highest (200-400 Hz) LFP frequency band, presumably also containing low-pass filtered action potentials. Cross-frequency correlations of preferred directions and directional modulation index showed high similarity of directional information between action potential firing rates (spiking) and high frequency LFPs (70-400 Hz), and increasing disparity with lower frequency bands (0-7, 10-40 and 50-65 Hz). Spikes predicted the direction of intended movement more accurately than any individual LFP band, however combined decoding of all LFPs was statistically indistinguishable from spike-based performance. As the quality of spiking signals (i.e. signal amplitude) and the number of significantly modulated spiking units decreased, the offline decoding performance decreased 3.6[5.65]%/month (for T2 and [S3] respectively). The decrease in the number of significantly modulated LFP signals and their decoding accuracy followed a similar trend (2.4[2.85]%/month, ANCOVA, p = 0.27[0.03]). Field potentials provided comparable offline decoding performance to unsorted spikes. Thus
Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
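Block-entropy estimation, one of the methods surveyed, can be sketched as the Shannon entropy of the empirical k-mer distribution of a sequence; the sequences below are toy examples:

```python
import math
from collections import Counter

# Block-entropy estimation for alignment-free sequence analysis: the Shannon
# entropy (bits) of the empirical k-mer distribution of a sequence.
def block_entropy(seq, k):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

repetitive = "ATATATATATATATATATAT"
mixed = "ATGCGTACGTTAGCCGATCA"
for k in (1, 2, 3):
    print(k, block_entropy(repetitive, k), block_entropy(mixed, k))
```

The growth of block entropy with k is a simple, alignment-free signature: it saturates quickly for repetitive sequences and keeps rising for complex ones, which is the intuition behind the entropic profiles and linguistic-complexity measures mentioned above.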
Ryan, T.G.; Haney, L.N.; Ostrom, L.T.
This paper addresses one major cause of large uncertainties in human reliability analysis (HRA) results: the absence of detailed function, task, timeline, link and human vulnerability analyses. All too often this crucial step in the HRA process is done in a cursory fashion, using word of mouth or written procedures which may themselves incompletely or inaccurately represent the human action sequences and human error vulnerabilities being analyzed. The paper examines the potential contributions these detailed analyses can make in achieving quantitative and qualitative HRA results which are: (1) credible, that is, minimizing uncertainty; (2) auditable, that is, systematically linking quantitative results to the qualitative information from which they are derived; (3) capable of supporting root cause analyses on human reliability factors determined to be major contributors to risk; and (4) capable of repeated measures and of being combined with similar results from other analyses to examine HRA issues transcending individual systems and facilities. Based on experience analyzing test and commercial nuclear reactors and medical applications of nuclear technology, an iterative process is suggested for doing detailed function, task, timeline, link and human vulnerability analyses using documentation reviews, open-ended and structured interviews, direct observations, and group techniques. Finally, the paper concludes that detailed analyses done in this manner by knowledgeable human factors practitioners can contribute significantly to the credibility, auditability, causal factor analysis, and combining goals of HRA.
Caruso, J C
The unreliability of difference scores is a well-documented phenomenon in the social sciences and has led researchers and practitioners to interpret differences cautiously, if at all. In the case of the Kaufman Adolescent and Adult Intelligence Test (KAIT), the unreliability of the difference between the Fluid IQ and the Crystallized IQ is due to the high correlation between the two scales. The consequences of the lack of precision with which differences are identified are wide confidence intervals and low-power significance tests (i.e., large differences are required to be declared statistically significant). Reliable component analysis (RCA) was performed on the subtests of the KAIT in order to address these problems. RCA is a new data reduction technique that results in uncorrelated component scores with maximum proportions of reliable variance. Results indicate that the scores defined by RCA have discriminant and convergent validity (with respect to the equally weighted scores) and that differences between the scores, derived from a single testing session, were more reliable than differences derived from equal weighting for each age group (11-14 years, 15-34 years, 35-85+ years). This reliability advantage results in narrower confidence intervals around difference scores and smaller differences required for statistical significance.
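The underlying problem is visible in the classical-test-theory formula for the reliability of a difference between two equally weighted standardized scores, r_D = ((r_xx + r_yy)/2 − r_xy)/(1 − r_xy); the numeric values below are illustrative, not the KAIT's actual coefficients:

```python
# Reliability of a difference score between two standardized scales with
# reliabilities r_xx, r_yy and inter-scale correlation r_xy.
def difference_score_reliability(r_xx, r_yy, r_xy):
    return ((r_xx + r_yy) / 2 - r_xy) / (1 - r_xy)

# two highly reliable scales: the difference is unreliable when they correlate highly
print(difference_score_reliability(0.95, 0.95, 0.85))
print(difference_score_reliability(0.95, 0.95, 0.30))
```

Even with scale reliabilities of .95, an inter-scale correlation of .85 drives the difference-score reliability down to about .67, which is exactly the situation RCA's uncorrelated components are designed to avoid.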
Brandon, Raphael; Howatson, Glyn; Hunter, Angus
An analysis system for barbell weightlifting exercises is proposed to record reliable performance and neuromuscular responses. The system consists of surface electromyography (sEMG) synchronized with electrogoniometry and a barbell position transducer. The purpose of this study was to establish the reliability of the three components of the system. Nine males (age 28.9 ± 4.8 years, mass 85.7 ± 15.1 kg) performed squat exercise at three loads on three separate trial days. A data acquisition and software system processed maximal knee angle (flexion), mean power for the concentric phase of the squat, and the normalized root mean square of the vastus lateralis sEMG. Inter-trial coefficients of variation for each variable were 5.3%, 7.8%, and 7.5%, respectively. In addition, knee joint motion and barbell displacement were significantly related to each other (bar displacement (m) = 1.39 - 0.0057 × knee angle (degrees), with goodness of fit r² = 0.817), suggesting knee goniometry alone can represent the kinematics of a multi-joint squat exercise. The proven reliability of the three components of this system allows for real-time monitoring of resistance exercise using the preferred training methods of athletes, which could be valuable in understanding the neuromuscular response to elite strength training methods.
Koo, Kevin; Shee, Kevin; Yap, Ronald L
Despite the prevalence of overactive bladder (OAB) and the widespread accessibility of patient education information on the Internet, the readability of this information and its potential impact on patient decision-making are not known. This study evaluates the readability of OAB material online in the context of website ownership and the Health on the Net standard for information reliability. Three Internet search platforms were queried daily with OAB-related keywords for 30 days. Readability analysis was performed using the SMOG test, Dale-Chall readability formula, and Fry readability graph. Websites were stratified by ownership type and Health on the Net certification to compare readability metrics. After 270 total searches, 57 websites were analyzed. Mean SMOG reading grade was 10.7 (SD = 1.6) and 10.1 in an adjusted calculation to reduce overestimation from medical jargon. Mean Dale-Chall score was 9.2 (SD = 0.9), or grade 13-15. Mean Fry graph coordinates (177 syllables, 5.9 sentences) corresponded to grade 15. Only seven sites (12%) were predicted to be readable by the average adult with an eighth-grade reading level. Mean reading grades were not significantly different between academic versus commercial sites and Health on the Net-certified versus non-certified sites. A large majority of online information about OAB treatment exceeds the reading ability of most adults. Neither websites sponsored by academic institutions nor those certified by the Health on the Net standard have easier readability. The readability of health information online may be distinct from reliability in the context of urological literacy. © 2017 Wiley Periodicals, Inc.
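The SMOG grade used in the analysis follows a standard formula based on counts of polysyllabic (3+ syllable) words and sentences; the counts in the example are invented, and a real implementation must also tokenize text and count syllables:

```python
import math

# SMOG reading grade from counts of polysyllabic words and sentences:
#   grade = 1.0430 * sqrt(polysyllables * 30 / sentences) + 3.1291
def smog_grade(polysyllables, sentences):
    return 1.0430 * math.sqrt(polysyllables * 30.0 / sentences) + 3.1291

# e.g. a page with 45 polysyllabic words across 25 sentences
print(f"SMOG reading grade: {smog_grade(45, 25):.1f}")
```

The 30/sentences factor normalizes to the 30-sentence sample the original SMOG procedure prescribes; counts in that ratio land near the grade-10 to grade-11 scores reported above.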
Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian
Reliability analysis of fiber-reinforced composite structures is a relatively unexplored field, and it is therefore expected that engineers and researchers trying to apply such an approach will meet certain challenges until more knowledge is accumulated. While doing the analyses included in the p...... to be a fast and efficient way to calculate the reliability index of a complex composite structure....
Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian
This paper presents a reliability analysis of a composite blade profile. The so-called Model Correction Factor technique is applied as an effective alternative to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which...
Duncan, Laura; Comeau, Jinette; Wang, Li; Vitoroulis, Irene; Boyle, Michael H; Bennett, Kathryn
A better understanding of factors contributing to the observed variability in estimates of test-retest reliability in published studies on standardized diagnostic interviews (SDIs) is needed. The objectives of this systematic review and meta-analysis were to estimate the pooled test-retest reliability for parent and youth assessments of seven common disorders, and to examine sources of between-study heterogeneity in reliability. Following a systematic review of the literature, multilevel random-effects meta-analyses were used to analyse 202 reliability estimates (Cohen's kappa = ҡ) from 31 eligible studies and 5,369 assessments of 3,344 children and youth. Pooled reliability was moderate at ҡ = .58 (CI 95% 0.53-0.63) and between-study heterogeneity was substantial (Q = 2,063, df = 201, p …). Reliability varied across informants for specific types of psychiatric disorder (ҡ = .53-.69 for parent vs. ҡ = .39-.68 for youth), with estimates significantly higher for parents on attention deficit hyperactivity disorder, oppositional defiant disorder and the broad groupings of externalizing and any disorder. Reliability was also significantly higher in studies with indicators of poor or fair study methodology quality (sample size …) … reliability of SDIs and the usefulness of these tools in both clinical and research contexts. Potential remedies include the introduction of standardized study and reporting requirements for reliability studies, and exploration of other approaches to assessing and classifying child and adolescent psychiatric disorder. © 2018 Association for Child and Adolescent Mental Health.
Kinilakodi, Harisha; Grayson, R Larry
The scrutiny of underground coal mine safety was heightened because of the disasters that occurred in 2006-2007 and, more recently, in 2010. In the aftermath of the 2006 incidents, the U.S. Congress passed the Mine Improvement and New Emergency Response Act of 2006 (MINER Act), which strengthened existing regulations and mandated new laws addressing various issues related to emergency preparedness and response, escape from an emergency situation, and protection of miners. The National Mining Association-sponsored Mine Safety Technology and Training Commission study highlighted the role of risk management in identifying and controlling major hazards, the elements that could come together and cause a mine disaster. In 2007 MSHA revised its approach to the "Pattern of Violations" (POV) process in order to target unsafe mines and force them to remediate conditions. The POV approach has certain limitations that make it difficult to enforce. One very understandable way to focus on removing threats from major-hazard conditions is to use citation-related reliability analysis. The citation reliability approach, which focuses on the probability of not receiving a citation on a given inspector day, is an analogue of the maintenance reliability approach, which many mine operators understand and use. In this study, the citation reliability approach was applied to a stratified random sample of 31 underground coal mines to examine its potential for broader application. The results clearly distinguish the best-performing and worst-performing mines with respect to compliance with mine safety standards, and they highlight differences among mine sizes. Copyright © 2010 Elsevier Ltd. All rights reserved.
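One simple way to formalize "the probability of not getting a citation on a given inspector day" is a constant-rate Poisson model, the analogue of the constant-failure-rate model used in maintenance reliability. This is our assumption for illustration; the paper's exact estimator may differ, and the citation counts below are invented:

```python
import math

# Poisson sketch of citation reliability: with `citations` issued over
# `inspector_days`, the probability of a citation-free inspector day is
# exp(-rate), mirroring constant-failure-rate maintenance reliability.
def citation_reliability(citations, inspector_days):
    rate = citations / inspector_days
    return math.exp(-rate)

print(citation_reliability(12, 200))   # a well-performing mine (invented counts)
print(citation_reliability(90, 200))   # a poorly performing mine (invented counts)
```

Ranking mines by this quantity is what separates best- and worst-performing operations in the compliance sense described above.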
Lynall, Robert C; Zukowski, Lisa A; Plummer, Prudence; Mihalik, Jason P
Our purpose was to determine the validity and test-retest reliability of the ProtoKinetics Movement Analysis Software (PKMAS) in measuring center of pressure (COP) during walking, as compared to a force plate gold standard. Twenty-five healthy participants (14 females, 11 males; age 20.0 ± 1.5 years) completed 2 testing sessions approximately 5 days apart (mean = 5.5 ± 1.1 days). In each session, participants completed 16 total trials across a 6 m walkway: 8 trials walking on a ProtoKinetics Zeno Walkway using PKMAS and 8 trials walking over 2 force plates arranged in an offset tandem pattern. COP path length (cm) and speed (cm/s) were calculated from data averaged across the 8 trials on a given device for a given foot. Intraclass correlation coefficients (ICC 2,k) were computed to determine between-session reliability. Pearson correlation coefficients (r) and Bland-Altman plots were produced between the PKMAS and force plate outcomes for session 1 to determine validity. The PKMAS demonstrated excellent reliability (ICC 2,k ≥ 0.962) for all COP measures. Pearson correlation coefficients between PKMAS and force plates were ≥ 0.75 for all outcome variables. Bland-Altman plots and 95% levels of agreement revealed a bias where the PKMAS appeared to underestimate COP path length and speed by approximately 4 cm and 6 cm/s, respectively. After correcting for bias, our findings suggest the PKMAS is a reliable tool to measure COP in healthy people during gait. Using the PKMAS with the ProtoKinetics Zeno Walkway may allow for more efficient investigation of dynamic balance variables during functional movement tasks. Copyright © 2016 Elsevier B.V. All rights reserved.
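Bland-Altman bias and 95% limits of agreement, as used in the validity analysis, can be sketched as follows; the paired values are invented and merely mimic the direction of the reported underestimation:

```python
import statistics

# Bland-Altman analysis: bias (mean difference) and 95% limits of agreement
# between two devices measuring the same quantity on the same subjects.
def bland_altman(a, b):
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

pkmas_like = [96.2, 101.5, 98.8, 103.0, 99.4]    # COP path length [cm], invented
forceplate = [100.1, 105.9, 102.6, 107.2, 103.8]  # gold standard, invented
bias, (lo, hi) = bland_altman(pkmas_like, forceplate)
print(f"bias = {bias:.2f}, limits of agreement = ({lo:.2f}, {hi:.2f})")
```

A consistent negative bias of this kind is exactly what permits the post-hoc bias correction described above: subtracting the bias re-centers one device's readings on the other's.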
Ogah, Imhokhai; Wassersug, Richard J
Prostate cancer patients, as well as their caregivers and healthcare providers, often search the Internet for information about treatment options. We aimed to assess how accurate and up-to-date information about prostate cancer treatments is on websites owned and managed by health-related organizations that most patients and health care providers would consider to be the most trustworthy, based on the reputations of the site providers. We reviewed 43 noncommercial and easily found websites that offered extensive information on treatment options for prostate cancer patients. To assess how comprehensive the sites were, we focused on the information they provided on alternative hormonal therapies to commonly prescribed luteinizing hormone-releasing hormone (LHRH) agonists, namely GnRH antagonists and parenteral estradiol. Only 14 of the 43 websites presented GnRH antagonists as a therapy option for prostate cancer. Sixteen of the 43 websites presented estrogen as a possible treatment option, but only 1 of the 43 contained current information on parenteral estrogen treatments. Less than half of the sites provided time stamps indicating when they were last updated. Furthermore, most sites with time stamps were not in fact up-to-date based on the information posted on the site. Few seemingly reputable Internet sources for medical information provide viewers with the detailed and up-to-date information that they may expect from such sites when searching for alternatives to standard treatment for androgen suppression. Strategies for keeping such websites up-to-date and reliable are discussed. Sites may improve their credibility and usefulness if they (1) present all evidence-based treatment options, (2) regularly update and time stamp their information, (3) acknowledge that their recommendations on treatments may become out-of-date quickly, and (4) direct viewers to information on relevant, active clinical trials. Maintaining high quality sites may ultimately depend
The objective of the present study was to examine the internal structure of the Spanish version of the Test Anxiety Inventory-State (TAI-State). A sample of 125 college students from Lima (84.8% female), between 18 and 31 years old (M = 22.51), was evaluated. The internal structure was analyzed by confirmatory factor analysis, evaluating three models: oblique, bifactor, and unidimensional. The results indicate that the instrument is constituted by a single dimension, with reliability coefficients of high magnitude. In conclusion, the version studied shows favorable psychometric properties that support its use in Lima.
Sørensen, John Dalsgaard
Redundancy is important to include in the design and analysis of structural systems. In most codes of practice redundancy is not directly taken into account. In the paper various definitions of a deterministic and reliability-based redundancy measure are reviewed. It is described how redundancy can be included in the safety system and how partial safety factors can be calibrated. An example is presented illustrating how redundancy is taken into account in the safety system in e.g. the Danish codes. The example shows how partial safety factors can be calibrated to comply with the safety level.
CCTV systems are widely used across a plethora of industrial areas, including transport, where their function is to support transport telematics systems; among other things, they are used to ensure travel safety. This paper presents a reliability and maintenance analysis of CCTV. It led to building a relationships graph, after which a Chapman–Kolmogorov system of equations was derived to describe it. Drawing on those equations, relationships were derived for calculating the probability of the system staying in the state of full ability (SPZ), the state of impendency over safety (SZB1), and the state of unreliability of safety (SB).
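The state probabilities described above can be illustrated with a minimal sketch: a three-state Markov availability model (full ability SPZ, safety impendency SZB1, safety unreliability SB) whose stationary Chapman–Kolmogorov balance equations are solved in closed form. The birth-death structure SPZ ↔ SZB1 ↔ SB and all rate values are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical three-state availability chain: SPZ <-> SZB1 <-> SB,
# with failure rates l1, l2 and repair rates m1, m2 (all illustrative).
def steady_state(l1, m1, l2, m2):
    """Solve the stationary balance equations of the chain:
    pi_SZB1 = pi_SPZ * l1/m1, pi_SB = pi_SZB1 * l2/m2, then normalize."""
    r1 = l1 / m1
    r2 = l2 / m2
    z = 1.0 + r1 + r1 * r2  # normalization constant
    return {"SPZ": 1.0 / z, "SZB1": r1 / z, "SB": r1 * r2 / z}

# toy rates: rare failures, fast repair
probs = steady_state(l1=0.01, m1=0.5, l2=0.002, m2=0.25)
```

With rates in this regime the system spends almost all of its time in the full-ability state, which matches the intent of such CCTV availability analyses.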
Wild, Christian; Eckhardt, Dave
The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.
C. L. Smith; R. Nims; K. J. Kvarfordt; C. Wharton
The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Versions 6 and 7, what constitutes its parts, and the limitations of those processes.
[Garbled OCR of a Reliability Analysis Center data table covering bipolar operational-type TTL devices, with columns for manufacturer, package, date, test stress, and specification; the numeric entries are unrecoverable.]
This paper presents small-signal modeling, using the state-space averaging technique, and reliability analysis of a three-phase z-source ac-ac converter. By controlling the shoot-through duty ratio, the converter can operate in buck-boost mode and maintain the desired output voltage during voltage sag and surge conditions. It has a faster dynamic response and higher efficiency than a traditional voltage regulator. Small-signal analysis yields the control transfer functions, which lead to the design of a suitable controller for the closed-loop system under supply-voltage variation. The closed-loop system with a PID controller eliminates transients in the output voltage and provides a steady-state regulated output. The proposed model was designed in RT-LAB and executed on a field-programmable gate array (FPGA)-based real-time digital simulator at a fixed time step of 10 μs and a constant switching frequency of 10 kHz. The simulator was developed using the very-high-speed integrated circuit hardware description language (VHDL), making it versatile and portable. Hardware-in-the-loop (HIL) simulation results are presented to corroborate the MATLAB simulation results for the three-phase z-source ac-ac converter under supply-voltage variation. A reliability analysis was applied to the converter to find the failure rates of its components.
Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R., E-mail: firstname.lastname@example.org, E-mail: email@example.com, E-mail: firstname.lastname@example.org [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)
This paper reviews the status of research on the application of human reliability analysis methods in the nuclear industry and their evolution over the years. Human reliability analysis (HRA) is one of the elements used in Probabilistic Safety Analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, covering errors of omission and errors of commission. Although HRA may be used in many areas, the focus of this paper is to review the applicability of HRA methods in the nuclear industry over the years, especially in nuclear power plants (NPPs). An electronic search of the CAPES Portal of Journals (a bibliographic database) was performed. This literature review covers original papers from the first generation of HRA methods through those published in March 2017. A total of 94 papers were retrieved by the initial search, and 13 were selected for full review and data extraction after applying inclusion and exclusion criteria and evaluating quality and suitability for the nuclear industry. Results point out that first-generation methods are more used in practice than second-generation methods. This occurs because they concentrate on quantification, in terms of success or failure of human action, which makes them useful for quantitative risk assessment in PSA. Although second-generation methods consider context and errors of commission in human error prediction, they are not as widely used in practice in the nuclear industry for PSA. (author)
Kim, I. S.; Kim, T. K.; Kim, M. C.; Kim, B. S.; Hwang, S. W.; Ryu, K. C. [Hanyang Univ., Seoul (Korea, Republic of)
Of the many items that should be checked during the review stage of the licensing application for the I and C system of Ulchin units 5 and 6, this report relates to a suitability review of the reliability analysis of the Digital Plant Protection System (DPPS) and the Digital Engineered Safety Features Actuation System (DESFAS). In the reliability analysis performed by the system designer, ABB-CE, fault tree analysis was used as the main method along with Failure Modes and Effects Analysis (FMEA). However, the present regulatory technique does not allow the system reliability analysis and its results to be appropriately evaluated. Hence, this study was carried out focusing on the following four items: development of general review items by which to check the validity of a reliability analysis, and the subsequent review of suitability of the reliability analysis for the Ulchin 5 and 6 DPPS and DESFAS; development of detailed review items by which to check the validity of an FMEA, and the subsequent review of suitability of the FMEA for the Ulchin 5 and 6 DPPS and DESFAS; development of detailed review items by which to check the validity of a fault tree analysis, and the subsequent review of suitability of the fault tree for the Ulchin 5 and 6 DPPS and DESFAS; and an integrated review of the safety and reliability of the Ulchin 5 and 6 DPPS and DESFAS based on the results of the various reviews above, together with a reliability comparison between the digital systems and the comparable analog systems, i.e., an analog Plant Protection System (PPS) and an analog Engineered Safety Features Actuation System (ESFAS). According to the review, the reliability analysis of the Ulchin 5 and 6 DPPS and DESFAS generally satisfies the review requirements. However, some shortcomings of the analysis were identified in our review, such that the assumed test periods for several pieces of equipment were not properly incorporated in the analysis, and failures of some equipment were not included in the
The paper deals with reliability analysis of a square footing on soil with strength anisotropy. The strength of the soil is described with an identified anisotropic strength criterion dedicated to geomaterials with layered microstructure. The analysis assumes the dip angle α and azimuth angle β, which define the direction of lamination of the structure, to be random variables with given probability density functions. The bearing capacity, being a function of these variables, is approximated from the results of deterministic simulations obtained for a variety of orientations. The weighted regression method of Kaymaz and McMahon, within the framework of the Response Surface Method, is used for the approximation. As a result of the analysis, the global factor of safety corresponding to an assumed probability of failure is determined. The value of the safety factor denotes the ratio between the design load and the mean bearing capacity that is needed to reduce the probability of failure to an acceptable level. The procedure for calculating the factor is presented for two cases. In the first case, no information about the lamination direction of the soil is provided, and thus all orientations are assumed to be equally probable (uniform distribution). In the second case, statistical information including the mean, variance, and an assumed probability distribution for both the α and β angles is known. For the latter case, using results obtained for a few different mean values of the angle α, the influence of strength anisotropy on the value of the global factor of safety is also shown.
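The uniform-distribution case above can be sketched with a toy Monte Carlo estimate: a hypothetical response surface for bearing capacity as a function of the dip angle, a random orientation drawn uniformly, and the failure probability as the fraction of draws where capacity falls below the design load. The surrogate polynomial and all numbers are illustrative assumptions, not the paper's identified criterion or fitted surface.

```python
import math
import random

# Hypothetical response surface: capacity (kPa) dips for intermediate
# lamination orientations; fitted coefficients are made up for illustration.
def q_surrogate(alpha_deg):
    return 900.0 - 250.0 * math.sin(math.radians(2.0 * alpha_deg)) ** 2

def prob_failure(design_load, n=50_000, seed=1):
    """Monte Carlo failure probability with dip angle alpha ~ U(0, 90)."""
    rng = random.Random(seed)
    fails = sum(q_surrogate(rng.uniform(0.0, 90.0)) < design_load
                for _ in range(n))
    return fails / n

# global factor of safety compares mean capacity against the design load
mean_q = sum(q_surrogate(a) for a in range(0, 91)) / 91
pf = prob_failure(design_load=660.0)
```

In a real analysis the surrogate would come from the weighted-regression response surface fitted to deterministic simulations, and the design load would be backed out from a target failure probability.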
Blanche, Erna Imperatore; Bodison, Stefanie; Chang, Megan C.; Reinoso, Gustavo
OBJECTIVE We developed an observational tool, the Comprehensive Observations of Proprioception (COP), for identifying proprioceptive processing issues in children with developmental disabilities. METHOD Development of the COP underwent three phases. First, we developed items representing proprioceptive functions on the basis of an extensive literature review and consultation with occupational therapists. We then established interrater reliability and content, construct, and criterion validity. Finally, we completed a factor analysis of COP ratings of 130 children with known developmental disabilities. RESULTS Adequate validity and reliability were established. Factor analysis revealed a four-factor model that explained the underlying structure of the measure as it was hypothesized. CONCLUSION The COP is a valid criterion-referenced short observational tool that structures the clinician’s observations by linking a child’s behaviors to areas identified in the literature as relevant to proprioceptive processing. It takes 15 min to administer and can be used in a variety of contexts, such as the home, clinic, and school. PMID:23106989
Kato, Kyoichi; Sato, Hisaya; Abe, Yoshihisa; Ishimori, Yoshiyuki; Hirano, Hiroshi; Higashimura, Kyoji; Amauchi, Hiroshi; Yanakita, Takashi; Kikuchi, Kei; Nakazawa, Yasuo
We verified whether maintenance checks of medical systems, including the start-of-work check and the end-of-day check, were effective for preventive maintenance and safety improvement. In this research, data on device failures in multiple facilities were collected, and the trouble-repair records were analyzed using reliability-engineering techniques. Data on the systems used in eight hospitals (8 general radiography systems, 6 angiography systems, 11 CT systems, 8 MRI systems, 8 RI systems, and 9 radiation therapy systems) were analyzed. The data collection period was the nine months from April to December 2008. The analyzed items included: (1) mean time between failures (MTBF); (2) mean time to repair (MTTR); (3) mean down time (MDT); (4) number of failures found by the morning check; and (5) failure occurrence time by modality. By introducing reliability engineering, the classification of breakdowns per device, their incidence, and their tendency could be understood. Analysis, evaluation, and feedback on the failure history are useful to keep downtime to a minimum and to ensure safety.
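The first three quantities above can be computed directly from a repair log; a minimal sketch follows, with hours of uptime between failures and repair durations as illustrative values rather than the hospitals' data.

```python
# MTBF, MTTR, and availability from a simple repair log (hours).
def summarize(uptimes, downtimes):
    mtbf = sum(uptimes) / len(uptimes)       # mean time between failures
    mttr = sum(downtimes) / len(downtimes)   # mean time to repair
    availability = mtbf / (mtbf + mttr)      # steady-state availability
    return mtbf, mttr, availability

# toy log: three failure/repair cycles for one modality
mtbf, mttr, avail = summarize([400.0, 520.0, 610.0], [4.0, 2.0, 6.0])
```

Tracking these per modality, as the study does, makes it easy to see which device classes drive downtime.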
Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejon (Korea, Republic of); Park, Jinkyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Jong Hyun [KEPCO, Ulsan (Korea, Republic of)
The purpose of HRA implementation is (1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that will minimize personnel errors and (2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. In performing HRA, the conditions that influence human performance have been represented via several context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It is widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanism of human errors. In performing HRA, the effect of PSFs that may increase or decrease human error should be investigated. However, the effects of PSFs have so far been estimated by expert judgment. Accordingly, in order to estimate the effects of PSFs objectively, a quantitative framework for estimating PSFs by using PSF profiles is introduced in this paper.
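The PSF-weighting idea used by methods like SPAR-H can be sketched as a nominal human error probability (NHEP) scaled by a product of PSF multipliers; the multiplier values below are illustrative placeholders, not SPAR-H's published tables.

```python
# Hedged sketch: HEP = NHEP x product of PSF multipliers, clamped to 1.
def adjusted_hep(nhep, psf_multipliers):
    composite = 1.0
    for m in psf_multipliers.values():
        composite *= m
    return min(nhep * composite, 1.0)  # a probability cannot exceed 1

# toy PSF assignments: high stress doubles error likelihood, barely
# adequate time multiplies it by 10, high experience halves it
hep = adjusted_hep(0.001, {"stress": 2.0,
                           "time_available": 10.0,
                           "experience": 0.5})
```

Replacing the expert-judgment multipliers with values derived from PSF profiles, as the paper proposes, would change only where the numbers come from, not this arithmetic.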
Boring, Ronald Laurids [Idaho National Laboratory
ABSTRACT: Human reliability analysis (HRA), as currently used in risk assessments, largely derives its methods and guidance from application in the nuclear energy domain. While there are many similarities between nuclear energy and other safety-critical domains such as oil and gas, there remain clear differences. This paper provides an overview of the HRA state of practice in nuclear energy and then describes areas where refinements to the methods may be necessary to capture the operational context of oil and gas. Many key distinctions important to nuclear energy HRA, such as Level 1 vs. Level 2 analysis, may prove insignificant for oil and gas applications. On the other hand, existing HRA methods may not be sensitive enough to factors like the extensive use of digital controls in oil and gas. This paper provides an overview of these considerations to assist in the adaptation of existing nuclear-centered HRA methods to the petroleum sector.
The study introduced a finite element model of the metal structure of a DQ75t-28m bridge crane and performed a static finite element analysis to obtain the stress response at the dangerous point of the metal structure under the most extreme condition. Simulated samples of the random variables and of the stress at the dangerous point were obtained through orthogonal design. A BP neural network's nonlinear mapping capability was then used, after training, to obtain an explicit expression for the stress response as a function of the random variables. Combined with random perturbation theory and the first-order second-moment (FOSM) method, the study analyzed the reliability of the metal structure and its sensitivity. In conclusion, we established a novel method for accurate quantitative analysis and design of bridge crane metal structures.
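The FOSM step mentioned above reduces, in its simplest form, to computing a reliability index from the first two moments of the limit-state margin. A minimal sketch with a capacity-minus-stress limit state and independent normal variables follows; the means and standard deviations are illustrative, not the crane study's values.

```python
import math

# FOSM sketch: limit state g = R - S (capacity minus stress at the
# dangerous point), R and S independent normal (toy moments).
def fosm(mu_r, sd_r, mu_s, sd_s):
    mu_g = mu_r - mu_s
    sd_g = math.sqrt(sd_r ** 2 + sd_s ** 2)
    beta = mu_g / sd_g                          # reliability index
    pf = 0.5 * math.erfc(beta / math.sqrt(2))   # standard normal tail
    return beta, pf

beta, pf = fosm(mu_r=355.0, sd_r=20.0, mu_s=280.0, sd_s=15.0)
```

In the paper, the explicit stress expression from the trained neural network plays the role of S, which is what makes the moment computation tractable.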
The objective of the present study is to evaluate the time-dependent reliability of dynamic mechanics with insufficient time-varying uncertainty information. In this paper, the nonprobabilistic convex process model, which contains autocorrelation and cross-correlation, is first employed for quantitative assessment of the time-variant uncertainty in structural performance characteristics. By combining the set-theory method and a regularization treatment, the time-varying properties of the structural limit state are determined and a standard convex process with autocorrelation describing the limit state is formulated. By virtue of the classical first-passage method of random process theory, a new nonprobabilistic measure index of time-dependent reliability is proposed and its solution strategy is mathematically conducted. Furthermore, the Monte Carlo simulation method is also discussed to illustrate the feasibility and accuracy of the developed approach. Three engineering cases clearly demonstrate that the proposed method may provide a reasonable and more efficient way to estimate structural safety than Monte Carlo simulation throughout a product life-cycle.
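The first-passage idea can be illustrated with a toy Monte Carlo: time-dependent reliability is the probability that the limit-state margin stays positive over the whole interval, not just at one instant. The autocorrelated AR(1) margin and all parameters below are illustrative assumptions standing in for the paper's convex process model.

```python
import random

# Monte Carlo first-passage sketch: count paths whose safety margin
# never crosses zero during the service life.
def survival_probability(n_paths=5_000, n_steps=50, seed=7):
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_paths):
        g = 3.0                      # initial safety margin (toy value)
        ok = True
        for _ in range(n_steps):
            # autocorrelated margin: AR(1) with mean 3, illustrative noise
            g = 0.9 * g + 0.3 + rng.gauss(0.0, 0.4)
            if g <= 0.0:             # first passage below the limit state
                ok = False
                break
        survived += ok
    return survived / n_paths

r_t = survival_probability()
```

The paper's contribution is precisely to avoid this kind of sampling by bounding the first-passage measure with convex-set arithmetic; the simulation here is the reference it is compared against.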
Curlee, T.R.; Das, S.; Lee, R.; Trumble, D.
This report presents the findings of a study to identify the types of information and analysis that are needed for advanced materials. The project was sponsored by the US Bureau of Mines (BOM). It includes a conceptual description of information needs for advanced materials and the development and implementation of a questionnaire on the same subject. This report identifies twelve fundamental differences between advanced and traditional materials and discusses the implications of these differences for data and analysis needs. Advanced and traditional materials differ significantly in terms of physical and chemical properties. Advanced material properties can be customized more easily. The production of advanced materials may differ from traditional materials in terms of inputs, the importance of by-products, the importance of different processing steps (especially fabrication), and scale economies. The potential for change in advanced materials characteristics and markets is greater and is derived from the marriage of radically different materials and processes. In addition to the conceptual study, a questionnaire was developed and implemented to assess the opinions of people who are likely users of BOM information on advanced materials. The results of the questionnaire, which was sent to about 1000 people, generally confirm the propositions set forth in the conceptual part of the study. The results also provide data on the categories of advanced materials and the types of information that are of greatest interest to potential users. 32 refs., 1 fig., 12 tabs.
Motwani, Deepak; Madan, Madan Lal
This paper concerns big data analysis, the cognitive operation of probing huge amounts of information in an attempt to uncover unseen patterns. Through big data analytics applications, public- and private-sector organizations have made the strategic determination to turn big data into competitive advantage. The primary task of extracting value from big data gives rise to a process for pulling information from multiple different sources; this process is known as extract, transform, and load (ETL). The approach in this paper extracts information from log files and research papers, reducing the effort needed for pattern finding and document summarization across several sources. The work helps one better understand basic Hadoop concepts and improves the user experience for research. We propose a Hadoop-based approach for analyzing log files to find concise information, which is useful and time-saving. The proposed approach will be applied to different research papers in a specific domain to obtain summarized content for further improvement and creation of new content.
Ahmad, Mansur; Hollender, Lars; Odont; Anderson, Quentin; Kartha, Krishnan; Ohrbach, Richard K.; Truelove, Edmond L.; John, Mike T.; Schiffman, Eric L.
Introduction As a part of a multi-site RDC/TMD Validation Project, comprehensive TMJ diagnostic criteria were developed for image analysis using panoramic radiography, magnetic resonance imaging (MRI), and computed tomography (CT). Methods Inter-examiner reliability was estimated using the kappa (k) statistic, and agreement between rater pairs was characterized by overall, positive, and negative percent agreement. CT was the reference standard for assessing validity of other imaging modalities for detecting osteoarthritis (OA). Results For the radiological diagnosis of OA, reliability of the three examiners was poor for panoramic radiography (k = 0.16), fair for MRI (k = 0.46), and close to the threshold for excellent for CT (k = 0.71). Using MRI, reliability was excellent for diagnosing disc displacements (DD) with reduction (k = 0.78) and for DD without reduction (k = 0.94), and was good for effusion (k = 0.64). Overall percent agreement for pair-wise ratings was ≥ 82% for all conditions. Positive percent agreement for diagnosing OA was 19% for panoramic radiography, 59% for MRI, and 84% for CT. Using MRI, positive percent agreement for diagnoses of any DD was 95% and for effusion was 81%. Negative percent agreement was ≥ 88% for all conditions. Compared to CT, panoramic radiography and MRI had poor to marginal sensitivity, respectively, but excellent specificity, in detecting OA. Conclusion Comprehensive image analysis criteria for RDC/TMD Validation Project were developed, which can reliably be employed for assessing OA using CT, and for disc position and effusion using MRI. PMID:19464658
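The agreement statistics reported above can be sketched in a few lines: Cohen's kappa corrects the observed percent agreement for agreement expected by chance. The two raters' binary OA calls below are toy ratings, not the Validation Project's data.

```python
# Cohen's kappa and overall percent agreement for two raters.
def kappa_and_agreement(r1, r2):
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n           # observed agreement
    labels = set(r1) | set(r2)
    # chance agreement from each rater's marginal label frequencies
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in labels)
    return (po - pe) / (1 - pe), po

# toy binary osteoarthritis calls (1 = OA present) from two examiners
k, po = kappa_and_agreement([1, 1, 0, 0, 1, 0, 1, 0],
                            [1, 1, 0, 0, 1, 0, 0, 0])
```

The positive and negative percent agreement figures in the abstract condition the same counts on positive-only and negative-only calls, which is why they can diverge sharply from kappa when a finding is rare.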
hours were used. Usually, two methods are used for machine reliability modeling. The first is Pareto analysis and the second is statistical modeling of failure distributions (Barabadi and Kumar, 2007). For failure-distribution modeling it must be determined whether the data are independent and identically distributed (iid). For this, trend tests and serial correlation tests are used. If the data show a trend, they are not iid and the parameters are computed from the power law process. For data without a trend, a serial correlation test is performed. If the correlation coefficient is less than 0.05 the data are not iid, and the parameters are obtained via a branching Poisson process or other similar methods; if the correlation coefficient is more than 0.05, the data are iid and classical statistical methods are used for reliability modeling. Trend test results are compared with the statistical parameter. The test for serial correlation was done by plotting the ith TBF against the (i-1)th TBF, i = 1, 2, ..., n. If the plotted points are randomly scattered without any pattern, it can be interpreted that there is no correlation among the TBF data and the data are independent. Next, the best-fit distribution for the TBF data must be chosen. Tests for the best-fit distribution include the chi-squared test and the Kolmogorov-Smirnov (K-S) test. The chi-squared test is not valid when there are fewer than 50 data points, so when there are fewer than 50 TBF data the K-S test must be used; hence the K-S test can be applied for any number of TBF data. Once the failure distribution has been determined, the reliability model may be computed by equation (2). Results and discussion: Results of trend analysis for the TBF data of the sugarcane harvester machines showed that the calculated statistic U for all machines was greater than the chi-squared value extracted from the chi-squared table with 2(n-1) degrees of freedom at the 5 percent level of significance. Hence
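The trend statistic U compared against a chi-square value with 2(n-1) degrees of freedom can be sketched in the MIL-HDBK-189 style: U = 2·Σ ln(T/tᵢ) over the cumulative failure times tᵢ up to the last failure T. The TBF values below are illustrative, not the harvester data.

```python
import math

# Trend test sketch: decreasing TBFs (deteriorating machine) inflate U.
def trend_statistic(tbf_hours):
    times = []
    total = 0.0
    for tbf in tbf_hours:            # cumulative failure times t_i
        total += tbf
        times.append(total)
    T = times[-1]                    # failure-truncated observation end
    # sum over all failures except the truncating one
    return 2.0 * sum(math.log(T / t) for t in times[:-1])

# toy times-between-failures (hours), steadily shrinking
u = trend_statistic([40.0, 35.0, 30.0, 24.0, 20.0, 15.0])
```

Here n = 6, so U would be compared against the chi-square quantile with 2(n-1) = 10 degrees of freedom; a U in the lower tail indicates deterioration (a trend), meaning the data are not iid.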
Miyagi, Willian Eiji; de Poli, Rodrigo de Araujo Bonetti; Papoti, Marcelo; Bertuzzi, Romulo; Zagatto, Alessandro Moura
The aim was to verify the validity (study A) and reliability (study B) of the alternative maximal accumulated oxygen deficit determined using only a supramaximal effort (MAODALT) to estimate anaerobic capacity [i.e., as estimated by the gold-standard maximal accumulated oxygen deficit (MAOD) method] during cycling. In study A, the effects of supramaximal intensities on MAODALT and the comparison with MAOD were investigated in fourteen active subjects (26 ± 6 years). In study B, test-retest reliability was investigated: fourteen male amateur cyclists (29 ± 5 years) performed the MAODALT twice at 115% of the intensity associated with maximal oxygen uptake. MAODALT determined at 130 and 150% of this intensity was lower than MAOD (p ≤ 0.048), but there were no differences between MAODALT determined at 100, 105, 110, 115, 120, and 140% (3.58 ± 0.53 L; 3.58 ± 0.59 L; 3.53 ± 0.52 L; 3.48 ± 0.72 L; 3.52 ± 0.61 L; and 3.46 ± 0.69 L, respectively) and MAOD (3.99 ± 0.64 L). The MAODALT determined at intensities between 110 and 120% presented the best agreement and concordance with MAOD. In the test-retest, MAODALT was not different (p > 0.05), showed high reproducibility when expressed in absolute values (ICC = 0.96, p < 0.01), and showed a good level of agreement in the Bland-Altman plot analysis (mean difference ± 95% CI: -0.16 ± 0.53 L). Thus, the MAODALT seems to be valid and reliable for assessing anaerobic capacity in cycling.
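The Bland-Altman agreement analysis used in study B reduces to the mean difference (bias) and 95% limits of agreement between test and retest values; a minimal sketch follows with toy MAODALT values in liters, not the study's data.

```python
import statistics as st

# Bland-Altman sketch: bias and 95% limits of agreement (bias ± 1.96 SD).
def bland_altman(test, retest):
    diffs = [a - b for a, b in zip(test, retest)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

bias, (lo, hi) = bland_altman([3.5, 3.8, 3.2, 3.6, 3.4],
                              [3.4, 3.9, 3.1, 3.7, 3.3])
```

Limits of agreement that straddle zero and are narrow relative to the measured quantity, as reported in the abstract, are what justify calling the measure reliable.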
Stamp, Jason E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jensen, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Munoz-Ramos, Karina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
Microgrids are a focus of localized energy production that support resiliency, security, local control, and increased access to renewable resources (among other potential benefits). The Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) Joint Capability Technology Demonstration (JCTD) program between the Department of Defense (DOD), Department of Energy (DOE), and Department of Homeland Security (DHS) resulted in the preliminary design and deployment of three microgrids at military installations. This paper is focused on the analysis process and supporting software used to determine optimal designs for energy surety microgrids (ESMs) in the SPIDERS project. There are two key pieces of software: an existing software application developed by Sandia National Laboratories (SNL) called Technology Management Optimization (TMO) and a new simulation developed for SPIDERS called the performance reliability model (PRM). TMO is a decision support tool that performs multi-objective optimization over a mixed discrete/continuous search space for which the performance measures are unrestricted in form. The PRM is able to statistically quantify the performance and reliability of a microgrid operating in islanded mode (disconnected from any utility power source). Together, these two software applications were used as part of the ESM process to generate the preliminary designs presented by the SNL-led DOE team to the DOD. Acknowledgements Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project: * Mike Hightower, who has been the key driving force for Energy Surety Microgrids * Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations * Merrill Smith, U.S. Department of Energy SPIDERS Program Manager * Ross Roley and Rich Trundy from U.S. Pacific Command * Bill Waugaman and Bill Beary from U.S. Northern Command * Tarek Abdallah, Melanie
Tsai, Ming-Yen; Chen, Shih-Yu; Lin, Chung-Chun
The Meridian Energy Analysis Device is currently a popular tool in the scientific research of meridian electrophysiology. In this field, it is generally believed that measuring the electrical conductivity of meridians provides information about the balance of bioenergy, or Qi-blood, in the body. This communication draws on original articles in the PubMed database from 1956 to 2014 and the author's clinical experience. We provide clinical examples of Meridian Energy Analysis Device application, especially in the field of traditional Chinese medicine, discuss the reliability of the measurements, and put the values obtained into context by considering items of considerable variability and by estimating sample size. The Meridian Energy Analysis Device is making a valuable contribution to the diagnosis of Qi-blood dysfunction. It can be assessed from short-term and long-term meridian bioenergy recordings. It is one of the few methods that allow outpatient traditional Chinese medicine diagnosis, monitoring of progress, assessment of therapeutic effect, and evaluation of patient prognosis. The holistic approaches underlying the practice of traditional Chinese medicine and new trends in modern medicine toward the use of objective instruments require in-depth knowledge of the mechanisms of meridian energy, and the Meridian Energy Analysis Device can feasibly be used for understanding and interpreting traditional Chinese medicine theory, especially in view of its expansion in Western countries.
Jeffrey C. Joe; Ronald L. Boring
Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are (1) distinct from individual human errors and (2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research whose objective is to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.
Christman, Matthew S; Kraft, Kate H; Tasian, Gregory E; Zderic, Stephen A; Kolon, Thomas F
There are few normative data on semen analyses in youths at risk for but not presenting with infertility. Standard practice among infertility specialists includes evaluation of 2 separate semen samples, given the degree of within subject variability. We hypothesized that males transitioning from pediatric to adult care who are at risk for infertility would similarly have this variability. We retrospectively reviewed patients with a history of cryptorchidism or varicocele who submitted 2 semen samples for evaluation of fertility potential. The within subject coefficient of variation and intraclass correlation coefficient were calculated for each semen parameter to evaluate reproducibility and reliability, respectively. A total of 79 subjects were studied. Mean ± SD age was 18.8 ± 1.2 years (range 17.8 to 24.7). The within subject coefficient of variation was high for each semen parameter, ranging from 36% for volume and motility to 82% for total motile count. Intraclass correlation coefficient for a single semen analysis ranged from 0.55 for motility to 0.88 for total count. Intraclass correlation coefficient for total motile count was 0.78 (95% CI 0.67-0.85), consistent with substantial reliability. Although we observed within patient variability of individual semen analysis parameters, overall there was substantial agreement between consecutive semen analyses in this population at risk for infertility, particularly regarding total motile count, which is the most important determinant of fertility from a semen analysis. Therefore, it is possible to appropriately classify some young men based on the result of a single measurement as they transition from pediatric to adult care. Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
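The two reliability statistics reported above, the within-subject coefficient of variation (reproducibility) and the intraclass correlation coefficient (reliability), can be sketched for paired measurements. The CV is taken here as the root-mean-square of per-subject CVs and the ICC as the one-way ICC(1,1); the paired values are toy data, not the cohort's semen parameters.

```python
import statistics as st

# Within-subject CV: root-mean-square of each subject's SD/mean.
def within_subject_cv(pairs):
    cvs = [st.stdev(p) / st.mean(p) for p in pairs]
    return (sum(c * c for c in cvs) / len(cvs)) ** 0.5

# One-way ICC(1,1) from between- and within-subject mean squares.
def icc_oneway(pairs):
    k = 2                                     # two samples per subject
    n = len(pairs)
    grand = sum(sum(p) for p in pairs) / (n * k)
    msb = k * sum((st.mean(p) - grand) ** 2 for p in pairs) / (n - 1)
    msw = sum((x - st.mean(p)) ** 2 for p in pairs for x in p) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# toy paired total motile counts (millions) for four subjects
pairs = [(40.0, 60.0), (10.0, 14.0), (80.0, 70.0), (25.0, 35.0)]
cv = within_subject_cv(pairs)
icc = icc_oneway(pairs)
```

As in the abstract, a measurement can show a high within-subject CV while the ICC remains substantial, because the ICC compares within-subject scatter to the much larger spread between subjects.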
Nascimento, C.S. do, E-mail: email@example.com [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Av. Professor Lineu Prestes 2468, 05508-000 Sao Paulo, SP (Brazil); Mesquita, R.N. de, E-mail: firstname.lastname@example.org [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN - SP), Av. Professor Lineu Prestes 2242, 05508-000 Sao Paulo, SP (Brazil)
Highlights: • Human Error Probability estimates from operators' reactions to emergency situations. • Human Reliability Analysis input data obtained through fuzzy logic inference. • Evaluation of the level of influence of Performance Shaping Factors on operator actions. - Abstract: Human error has been recognized as an important factor in the occurrence of many industrial and nuclear accidents. Human error data are scarcely available for several reasons, among which lapses in historical database registry methodology is an important one. Human Reliability Analysis (HRA) is a common tool employed to estimate the probability that an operator will correctly perform a required system task in the required time without degrading the system. This analysis requires specific Human Error Probability estimates for most of its procedure. This work obtains Human Error Probability (HEP) estimates from operators' actions in response to hypothesized emergency situations at the IEA-R1 Research Reactor at IPEN, Brazil. With the proposed methodology, HRA can be performed even with a shortage of related human error statistical data. An evaluation of Performance Shaping Factors (PSFs) was also carried out in order to classify and estimate their level of influence on operator actions and to determine their actual state in the plant. Both the HEP estimation and the PSF evaluation were based on expert judgment elicited through interviews and questionnaires. The expert group was established from selected IEA-R1 operators, and their evaluations were entered into a knowledge representation system that used linguistic variables and group evaluation values obtained through Fuzzy Logic and Fuzzy Set theory. The obtained HEP values show good agreement with published data, corroborating the proposed methodology as a good alternative for HRA.
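The fuzzy inference step this abstract describes can be sketched minimally. This is not the paper's actual rule base: the linguistic terms, the triangular membership functions, and the HEP value anchored to each term below are hypothetical placeholders for the expert-elicited values the study obtained from interviews and questionnaires.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical linguistic terms for a PSF rating on a 0-10 scale
terms = {"low": (0, 0, 5), "moderate": (2, 5, 8), "high": (5, 10, 10)}
# Hypothetical HEP anchored to each term (expert judgment in the paper)
hep_of = {"low": 1e-3, "moderate": 1e-2, "high": 1e-1}

def infer_hep(rating):
    """Weighted-average (Sugeno-style) defuzzification of the HEP."""
    w = {t: tri(rating, *abc) for t, abc in terms.items()}
    total = sum(w.values())
    return sum(w[t] * hep_of[t] for t in w) / total
```

A rating between two terms blends their HEPs in proportion to its membership in each, so the inferred HEP rises smoothly as the PSF rating worsens, rather than jumping between discrete table entries.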
Yokobayashi, Masao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Tamura, Kazuo
Many kinds of human reliability analysis (HRA) methods have been developed. However, users must be skilled to apply them, and the methods require laborious work such as drawing event trees (ETs) and calculating uncertainty bounds. Moreover, no single method is complete enough on its own to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to carry out HRA practically and efficiently. The system offers two approaches: a simple method and a detailed one. The former uses ASEP, a simplified THERP technique, while the latter combines OAT with HRA-ET/DeBDA. Users can select the method suited to their purpose. Human error probability (HEP) data were collected, and a database of them was built for use with the support system. This paper describes the outline of the HRA methods, the support functions, and the user's guide of the system. (author)
Ghaffarian, Reza; Evans, John W.
For five decades, the semiconductor industry has distinguished itself by the rapid pace of miniaturization in electronics products: Moore's Law. Now scaling has hit a brick wall, forcing a paradigm shift. Industry roadmaps recognize the scaling limitation and project that packaging technologies will meet further miniaturization needs, a.k.a. "More than Moore". This paper presents packaging technology trends and the accelerated reliability testing methods currently practiced. It then presents industry status on key advanced electronic packages, factors affecting the accelerated solder joint reliability of area array packages, and IPC/JEDEC/Mil specifications for characterizing assemblies under accelerated thermal and mechanical loading. Finally, it presents an example demonstrating how accelerated testing and analysis have been effectively employed in the development of complex spacecraft, thereby reducing risk. Quantitative assessments necessarily involve the mathematics of probability and statistics. In addition, accelerated tests need to be designed with the desired risk posture and schedule of the particular project in mind. Such assessments relieve risk without imposing additional costs and constraints that are not value-added for a particular mission. Furthermore, in the course of developing complex systems, variances and defects will inevitably present themselves and require a decision concerning their disposition, necessitating quantitative assessments. In summary, this paper presents a comprehensive viewpoint, from technology to systems, including the benefits and impact of accelerated testing in offsetting risk.
Ronald L. Boring
It has been argued that human reliability analysis (HRA) has expended considerable energy on creating detailed representations of human performance through an increasingly long list of performance shaping factors (PSFs). It is not clear, however, to what extent this refinement and expansion of PSFs has enhanced the quality of HRA. Indeed, there is considerable range in the number of PSFs provided by individual HRA methods, from single-factor models such as time-reliability curves up to 50 or more PSFs in some current HRA models. The US Nuclear Regulatory Commission advocates 15 PSFs in its HRA Good Practices (NUREG-1792), while its SPAR-H method (NUREG/CR-6883) espouses the use of eight PSFs and its ATHEANA method (NUREG-1624) features an open-ended number of PSFs. The apparent differences in the optimal number of PSFs can be explained in terms of the diverse functions of PSFs in HRA. The purpose of this paper is to explore the role of PSFs across the different stages of HRA, including identification of potential human errors, modeling of these errors within an overall probabilistic risk assessment, quantifying errors, and preventing errors.
Safie, Fayssal M.; Ring, Robert W.; Cole, Stuart K.
This paper discusses a Reliability, Availability, and Maintainability (RAM) independent assessment conducted to support the refurbishment of the Compressor Station at the NASA Langley Research Center (LaRC). The paper discusses the methodologies used by the assessment team to derive the repair by replacement (RR) strategies to improve the reliability and availability of the Compressor Station (Ref. 1). This includes a RAPTOR simulation model that was used to generate the statistical data analysis needed to derive a 15-year investment plan to support the refurbishment of the facility. To summarize, study results clearly indicate that the air compressors are well past their design life. The major failures of the compressors indicate that significant latent failure causes are present. Given the occurrence of these high-cost failures following compressor overhauls, future major failures should be anticipated if the compressors are not replaced. Given the results from the RR analysis, the study team recommended a compressor replacement strategy. Based on the data analysis, the RR strategy will lead to sustainable operations through significant improvements in reliability, availability, and the probability of meeting the air demand with acceptable investment cost that should translate, in the long run, into major cost savings. For example, the probability of meeting air demand improved from 79.7 percent for the Base Case to 97.3 percent. Expressed in terms of a reduction in the probability of failing to meet demand (1 in 5 days to 1 in 37 days), the improvement is about 700 percent. Similarly, compressor replacement improved the operational availability of the facility from 97.5 percent to 99.8 percent. Expressed in terms of a reduction in system unavailability (1 in 40 to 1 in 500), the improvement is better than 1000 percent (an order of magnitude improvement). It is worth noting that the methodologies, tools, and techniques used in the LaRC study can be used to evaluate
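The headline ratios in this abstract follow from simple probability arithmetic, and can be checked directly from the quoted figures (the percentages below are taken from the abstract itself; the rounding is mine):

```python
# Probability of failing to meet air demand, before and after replacement
p_fail_base = 1 - 0.797   # Base Case: 79.7% probability of meeting demand
p_fail_rr   = 1 - 0.973   # repair-by-replacement strategy: 97.3%

days_between_base = 1 / p_fail_base   # ~ one shortfall day in 5
days_between_rr   = 1 / p_fail_rr     # ~ one shortfall day in 37
demand_improvement = p_fail_base / p_fail_rr   # ~7.5x, i.e. roughly 700%

# Same arithmetic for operational availability
unavail_base = 1 - 0.975   # 97.5% available -> down 1 day in 40
unavail_rr   = 1 - 0.998   # 99.8% available -> down 1 day in 500
avail_improvement = unavail_base / unavail_rr  # ~12.5x, an order of magnitude
```

This is why the abstract can quote "about 700 percent" and "better than 1000 percent": both are ratios of the residual failure (or unavailability) probabilities, not of the headline percentages themselves.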
In this paper, the general reliability design process for the cross-sectional dimensions of the support ring used for cylinder sealing is introduced. Then, taking a support ring with a particular section shape as an example, each dimensional parameter of the section is determined from the viewpoint of reliability design. Finally, the static strength and reliability of the support ring are analyzed to verify the correctness of the reliability design result.
I have performed a reliability & maintainability analysis for the Amine Swingbed payload system. The Amine Swingbed is a carbon dioxide removal technology that has gone through 2,400 hours of International Space Station on-orbit use between 2013 and 2016. While the Amine Swingbed is currently an experimental payload system, it may be converted to system hardware. If the Amine Swingbed becomes system hardware, it will supplement the Carbon Dioxide Removal Assembly (CDRA) as the primary CO2 removal technology on the International Space Station. NASA is also considering using the Amine Swingbed as the primary carbon dioxide removal technology for future extravehicular mobility units and for the Orion, which will be used for the Asteroid Redirect and Journey to Mars missions. The qualitative component of the reliability and maintainability analysis is a Failure Modes and Effects Analysis (FMEA). In the FMEA, I have investigated how individual components in the Amine Swingbed may fail, and what the worst-case scenario is should a failure occur. The significant failure effects are the loss of ability to remove carbon dioxide, the formation of ammonia due to chemical degradation of the amine, and loss of atmosphere because the Amine Swingbed uses the vacuum of space to regenerate itself. In the quantitative component of the reliability and maintainability analysis, I have assumed a constant failure rate for both electronic and nonelectronic parts. Using these data, I have created a Poisson distribution to predict the failure rate of the Amine Swingbed as a whole. I have determined a mean time to failure for the Amine Swingbed to be approximately 1,400 hours. The observed mean time to failure for the system is between 600 and 1,200 hours. This range includes initial testing of the Amine Swingbed, as well as software faults that are understood to be non-critical. If many of the commercial parts were switched to military-grade parts, the expected
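The constant-failure-rate model described in this analysis is the standard exponential/Poisson one: for a series system the part failure rates add, the reciprocal of the total rate is the system MTTF, and the count of failures in a fixed interval is Poisson. A minimal sketch; the part names and rates below are invented solely to reproduce the quoted ~1,400-hour MTTF, since the actual Amine Swingbed parts data are not given in the source:

```python
import math

# Hypothetical part failure rates (failures per hour) -- placeholders,
# not the Amine Swingbed's actual parts list.
rates = {"valve": 2.0e-4, "blower": 3.0e-4, "controller": 1.5e-4, "sensor": 6.4e-5}

lam = sum(rates.values())   # series system: component rates add
mttf = 1 / lam              # mean time to failure, hours (~1,400 here)

def reliability(t_hours):
    """Probability of zero failures in t hours (exponential model)."""
    return math.exp(-lam * t_hours)

def p_k_failures(k, t_hours):
    """Poisson probability of exactly k system failures in t hours."""
    mu = lam * t_hours
    return math.exp(-mu) * mu**k / math.factorial(k)
```

Note that `p_k_failures(0, t)` and `reliability(t)` coincide by construction: the exponential zero-failure probability is just the k = 0 term of the Poisson distribution with mean λt.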
... Number NHTSA-2012-0168] Fatality Analysis Reporting System Information Collection AGENCY: National... comments on the following proposed collections of information: (1) Title: Fatal Analysis Reporting System... system that acquires national fatality information directly from existing State files and documents...
R. L. Boring
To date, there has been considerable work on dynamic event trees and other areas related to dynamic probabilistic safety assessment (PSA). The counterpart to these efforts in human reliability analysis (HRA) has centered on the development of specific methods to account for the dynamic nature of human performance. In this paper, the author posits that the key to dynamic HRA is not in the development of specific methods but in the utilization of cognitive modeling and simulation to produce a framework of data that may be used in quantifying the likelihood of human error. This paper provides an overview of simulation approaches to HRA; reviews differences between first, second, and dynamic generation HRA; and outlines potential benefits and challenges of this approach.
Pedersen, Thomas Espelund
This report describes a method for analysing failure and maintenance data for a population of complex repairable systems with the aim of improving maintenance efficiency. It is part of a Ph.D. study, titled "Maintenance and replacement strategies for complex systems", the objective of which is to analyze failure and maintenance data using mathematical and statistical models in order to improve maintenance procedures in the Danish Defence. The first part of the report introduces the maintenance planning problem and presents an overview of models for reliability, failure processes, and maintenance planning. This overview is structured to highlight the process of choosing a proper model for a given data set, focusing on different measures of time and the data requirements for the different models. The second part of the report describes the analysis of two data sets from the Danish Defence. The data...
Sidorov, Michael S; Deck, Gina M; Dolatshahi, Marjan; Thibert, Ronald L; Bird, Lynne M; Chu, Catherine J; Philpot, Benjamin D
Clinicians have qualitatively described rhythmic delta activity as a prominent EEG abnormality in individuals with Angelman syndrome, but this phenotype has yet to be rigorously quantified in the clinical population or validated in a preclinical model. Here, we sought to quantitatively measure delta rhythmicity and evaluate its fidelity as a biomarker. We quantified delta oscillations in mouse and human using parallel spectral analysis methods and measured regional, state-specific, and developmental changes in delta rhythms in a patient population. Delta power was broadly increased and more dynamic in both the Angelman syndrome mouse model, relative to wild-type littermates, and in children with Angelman syndrome, relative to age-matched neurotypical controls. Enhanced delta oscillations in children with Angelman syndrome were present during wakefulness and sleep, were generalized across the neocortex, and were more pronounced at earlier ages. Delta rhythmicity phenotypes can serve as reliable biomarkers for Angelman syndrome in both preclinical and clinical settings.
Zhao, Yan-tao; Zhang, Yu-mei; Hou, Shu-xun; Kong, Liang; Lin, Jun; Zhao, Yi-min; Huo, Na
A variety of restorative materials are widely used in dentistry. The aim of this study was to explore the influence of different dental restorative materials on bond interface reliability. A two-dimensional finite element analysis method was adopted to simulate the shear-bond efficacy test. The influences of elastic modulus and Poisson's ratio were investigated separately. Several dental restorative materials, including resins, metals, and ceramics, were analyzed in this study. The deformation and peak equivalent stress level of the dentin-adhesive interface rose sharply following a decrease in the elasticity of restorative materials, especially those in the low elastic modulus range. The influence of Poisson's ratio was not significant. Ceramics and gold alloy were preferred to resin composite in restorations bearing extensive shear load during service. Restorative materials with an elastic modulus similar to that of teeth are not always the best clinical choice. This research provides a helpful guide for the application of different restorative materials in clinical practice.